Monday, July 13, 2009

Breaking backward compatibility - oh, please!

Most of our SQL Server applications are developed in a way that lets them work with any version of SQL Server. However, users are encouraged to use SQL Server 2005, simply because this particular version has been thoroughly tested.

As one can expect, a few users install and use SQL Server 2008, and usually they do not observe any major issues.

At least not until yesterday.

It turns out that one of our scripts, intended to allow users to select a SQL Server account and assign it to an employee row in an employee table, is based on the sp_helpuser stored procedure.

And guess what? Someone from Microsoft has decided to change the names of columns returned by sp_helpuser!

The change is described here

http://msdn.microsoft.com/en-us/library/ms143179.aspx

among a few other, much more subtle changes.

Does the rename clear up any confusion or add any new value to the procedure's results?

No!

It's just "GroupName" has been renamed to "RoleName" and three other columns named with "group..." have been renamed to "role...".

Does such a change break your scripts?

YES!

It's just a beautiful BOOM, and the application does not work as expected!
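
If you need a script to survive both versions, about the only defensive option is to stop hard-coding the column name. A minimal sketch of such a defensive read (this is not our actual script, and the connection string is just a placeholder):

using System;
using System.Data;
using System.Data.SqlClient;

class SpHelpUserExample
{
    static void Main()
    {
        // Placeholder connection string - adjust to your environment
        using ( SqlConnection cn = new SqlConnection( "Data Source=.;Integrated Security=SSPI" ) )
        using ( SqlDataAdapter da = new SqlDataAdapter( "exec sp_helpuser", cn ) )
        {
            DataTable users = new DataTable();
            da.Fill( users );

            // SQL Server 2005 returns "GroupName", SQL Server 2008 returns "RoleName"
            string roleColumn =
                users.Columns.Contains( "RoleName" ) ? "RoleName" : "GroupName";

            foreach ( DataRow row in users.Rows )
                Console.WriteLine( "{0} -> {1}", row["UserName"], row[roleColumn] );
        }
    }
}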

I've been talking to a few developers from Microsoft, and what they always emphasise very firmly is that backward compatibility for the client code written by us, developers, is very, very important to them. They are not allowed to just rename "CreateWindow" to "CreateNewWindow" or "sp_helpuser" to "sp_userdata", simply because thousands of people probably use the previously released artifacts in their own code.

Apparently, this time someone has decided that there really are some legitimate reasons to consider RoleName more correct than GroupName, so that breaking backward compatibility is somehow justified.

I really, really wonder what these reasons would be....

Wednesday, July 1, 2009

ASP.NET WebServices two-way (Response and Request) compression - a general solution

Objective

I will present a general solution to the issue of two-way compression of WebServices. The issue can be critical in a complicated SOA solution where a lot of data is passed not only from the server to the client but also from the client to the server. The solution is a part of a Data Service component I am currently working on.

Motivation

On the one hand, the application server can compress the HTTP data sent to the client "out of the box": just enable "HTTP Compression". While such a solution seems attractive, it does not provide two-way compression. It's nice to have the server's response compressed; however, uploading a huge amount of data is still a problem.

On the other hand, a generic solution already exists and consists of writing a custom SOAP extension. The solution by Saurabh Nandu dates back to 2002 and is described here. There are, however, two issues with that approach. A small issue is that you have to apply a custom attribute to both the server and the client code. While applying a custom attribute to a server method is not a problem, the client proxy class is regenerated each time you update the web reference, which means that you have to remember to manually edit the proxy class after every update. Unfortunately, there's also a big issue: we've observed that the solution causes random OutOfMemory exceptions in a production server environment! The randomness was a complete disaster for our proprietary software!
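
For readers who have not seen that approach: the wiring looks roughly like the skeleton below. CompressionExtension and CompressionExtensionAttribute are hypothetical names, and the actual compression logic from the 2002 article is omitted; the point is only that the attribute has to sit on the server method and on the generated client proxy method.

using System;
using System.Web.Services.Protocols;

// Skeleton only - the real compression work would happen in ProcessMessage
public class CompressionExtension : SoapExtension
{
    public override object GetInitializer( Type serviceType ) { return null; }
    public override object GetInitializer( LogicalMethodInfo methodInfo,
                                           SoapExtensionAttribute attribute ) { return null; }
    public override void Initialize( object initializer ) { }

    public override void ProcessMessage( SoapMessage message )
    {
        // compression/decompression of the SOAP stream would go here
    }
}

[AttributeUsage( AttributeTargets.Method )]
public class CompressionExtensionAttribute : SoapExtensionAttribute
{
    private int priority;

    public override Type ExtensionType { get { return typeof( CompressionExtension ); } }
    public override int Priority { get { return priority; } set { priority = value; } }
}

// Server side: the attribute sits next to [WebMethod] and survives rebuilds.
// Client side: the SAME attribute must be applied to the generated proxy method,
//              and that manual edit is lost every time the web reference is updated.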

Towards the solution

Let's start with the basic code we'll enhance during this tutorial. Open Visual Studio and create a new solution with two projects: a console application and an ASP.NET Web Service application.

Implement a method on the WebService.

using System;
using System.Collections.Generic;
using System.Linq;
using System.Web;
using System.Web.Services;
 
namespace WebService1
{
    [WebService( Namespace = "http://tempuri.org/" )]
    [WebServiceBinding( ConformsTo = WsiProfiles.BasicProfile1_1 )]
    [System.ComponentModel.ToolboxItem( false )]
    public class Service1 : System.Web.Services.WebService
    {
 
        [WebMethod]
        public string HelloWorld( string Request )
        {
            return Request;
        }
    }
}

Note that the method has an input parameter and just returns its value to the client. We are going to pass really long strings here, and we'll see how much data is actually transferred to and from the server.


In the console application, add a web reference to the service and the following code:



using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
 
namespace ConsoleApplication60
{
    class Program
    {
        static void Main( string[] args )
        {
            MyService.Service1 s = new ConsoleApplication60.MyService.Service1();
            s.Url = "http://localhost.:3112/Service1.asmx";
 
            StringBuilder sb = new StringBuilder();
 
            for ( int i=0; i < 10000; i++ )
                sb.Append( "qwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnm" );
 
            Console.WriteLine( s.HelloWorld( sb.ToString() ) );
 
            Console.ReadLine();
        }
    }
}

Note that I manually set the Url of the WebService and added a dot after "localhost". This is a common trick that lets the traffic be captured by an HTTP sniffer, Fiddler: requests addressed to plain "localhost" bypass the proxy and so would not show up in Fiddler.
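
An alternative, if you prefer to leave the URL untouched, is to route the proxy class's traffic through Fiddler explicitly (Fiddler listens on 127.0.0.1:8888 unless you changed its default port). This fragment replaces the Url assignment in Main:

// Alternative to the "localhost." trick: point the proxy class at Fiddler
s.Url = "http://localhost:3112/Service1.asmx";
s.Proxy = new System.Net.WebProxy( "127.0.0.1", 8888 );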


Run the application and inspect the HTTP session in Fiddler. Take a look at the "Content-Length" headers for both the request and the response: the content length of the request is 520318 bytes, and the content length of the response is 520352 bytes.



One-way compression (compression of the response)

The OutOfMemory exceptions caused by the solution based on custom SOAP extensions made us search for another approach and led us to the common and well-known pattern of "HttpCompressionModule on the server - HttpWebResponseDecompressed on the client proxy". This solution solves only half of the compression issue: only the data sent from the server to the client is compressed.


Start with the HttpCompressionModule: add an HttpCompressionModule.cs file to the WebService project:



using System;
using System.IO.Compression;
using System.Web;
 
// HTTP module that compresses every response produced by an .asmx endpoint,
// provided the client declares gzip or deflate support in Accept-Encoding
public class HttpCompressionModule : IHttpModule
{
    private bool _isDisposed = false;
 
    public void Init( HttpApplication context )
    {
        context.BeginRequest += new EventHandler( context_BeginRequest );
    }
 
    void context_BeginRequest( object sender, EventArgs e )
    {
        HttpApplication app = sender as HttpApplication;
        HttpContext ctx = app.Context;
 
        // Compress web service calls only
        if ( !ctx.Request.Url.PathAndQuery.ToLower().Contains( ".asmx" ) )
            return;
 
        if ( IsEncodingAccepted( "gzip" ) )
        {
            // Chain a compressing stream onto the response filter
            app.Response.Filter = new GZipStream( app.Response.Filter,
                CompressionMode.Compress );
            SetEncoding( "gzip" );
        }
        else if ( IsEncodingAccepted( "deflate" ) )
        {
            app.Response.Filter = new DeflateStream( app.Response.Filter,
                CompressionMode.Compress );
            SetEncoding( "deflate" );
        }
    }
    // Does the client accept the given encoding?
    private bool IsEncodingAccepted( string encoding )
    {
        return HttpContext.Current.Request.Headers["Accept-encoding"] != null &&
          HttpContext.Current.Request.Headers["Accept-encoding"].Contains( encoding );
    }
    // Announce the encoding of the response to the client
    private void SetEncoding( string encoding )
    {
        HttpContext.Current.Response.AppendHeader( "Content-encoding", encoding );
    }
    private void Dispose( bool dispose )
    {
        _isDisposed = dispose;
    }
    ~HttpCompressionModule()
    {
        Dispose( false );
    }
    public void Dispose()
    {
        Dispose( true );
    }
}

Now, make the module active by adding



<httpModules>
    ...
    <add name="HttpCompressionModule" type="HttpCompressionModule" />
</httpModules>

to your web.config.


Now go back to the client console application and add two methods to the partial proxy class. Assuming that the web reference is named MyService (so the generated proxy class is Service1 in the MyService namespace), add a MyService.cs file:



using System;
using System.Net;
 
namespace ConsoleApplication60.MyService
{
    // The partial class must live in the same namespace as the generated proxy
    // so that the two parts are merged at compile time
    public partial class Service1
    {
        #region WebResponseCompress
 
        protected override WebRequest GetWebRequest( Uri uri )
        {
            HttpWebRequest request = (HttpWebRequest)base.GetWebRequest( uri );
 
            // Tell the server we can handle compressed responses
            request.Headers.Add( "Accept-Encoding", "gzip, deflate" );
 
            return request;
        }
 
        protected override WebResponse GetWebResponse( WebRequest request )
        {
            // Wrap the response so that it is transparently decompressed
            return new HttpWebResponseDecompressed( request );
        }
 
        #endregion
    }
}

Note that what this does is make sure that the correct header is sent to the server and that the response is actually decompressed on the client side. The HttpWebResponseDecompressed class is commonly used, but it's not in the Base Class Library, so here it is:



using System.IO;
using System.IO.Compression;
using System.Net;
 
// Wraps an HttpWebResponse and decompresses its stream according to the
// Content-Encoding header sent by the server
public class HttpWebResponseDecompressed : System.Net.WebResponse
{
    private HttpWebResponse response;
 
    public HttpWebResponseDecompressed( WebRequest request )
    {
        try
        {
            response = (HttpWebResponse)request.GetResponse();
        }
        catch ( WebException ex )
        {
            // A SOAP fault still carries an HTTP response (status 500)
            response = (HttpWebResponse)ex.Response;
        }
    }
    public override void Close()
    {
        response.Close();
    }
    public override Stream GetResponseStream()
    {
        if ( response.ContentEncoding == "gzip" )
        {
            return new GZipStream( response.GetResponseStream(), 
                CompressionMode.Decompress );
        }
        else if ( response.ContentEncoding == "deflate" )
        {
            return new DeflateStream( response.GetResponseStream(), 
                CompressionMode.Decompress );
        }
        else
        {
            // If the server returned an error without announcing an encoding,
            // assume the response still went through the gzip filter
            if ( response.StatusCode == 
                    HttpStatusCode.InternalServerError )
                return new GZipStream( response.GetResponseStream(), 
                    CompressionMode.Decompress );
 
            return response.GetResponseStream();
        }
    }
    // The remaining members just delegate to the wrapped response
    public override long ContentLength
    {
        get { return response.ContentLength; }
    }
    public override string ContentType
    {
        get { return response.ContentType; }
    }
    public override System.Net.WebHeaderCollection Headers
    {
        get { return response.Headers; }
    }
    public override System.Uri ResponseUri
    {
        get { return response.ResponseUri; }
    }
}
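
A side note before we move on: if all you ever need is response-side compression and you target .NET 2.0 or later, the HttpWebRequest.AutomaticDecompression property can stand in for the wrapper above (the request-side compression developed in the next section still needs the custom classes). A possible drop-in alternative for the GetWebRequest override:

protected override WebRequest GetWebRequest( Uri uri )
{
    HttpWebRequest request = (HttpWebRequest)base.GetWebRequest( uri );

    // Adds the Accept-Encoding header and transparently decompresses
    // gzip/deflate responses, so GetWebResponse needs no override
    request.AutomaticDecompression =
        DecompressionMethods.GZip | DecompressionMethods.Deflate;

    return request;
}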

Now run the solution and inspect it with Fiddler:



Note that the content length of the server's response is now only 5771 bytes! (Note also that the 100:1 compression ratio is rather unusual and is caused by the repetitive data I send to the server from the console application!)
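
If you want to sanity-check that ratio without Fiddler, a quick standalone experiment (a sketch independent of the web service; it compresses only the raw test string, not the full SOAP envelope) gives the same order of magnitude:

using System;
using System.IO;
using System.IO.Compression;
using System.Text;

class CompressionRatioCheck
{
    static void Main()
    {
        StringBuilder sb = new StringBuilder();
        for ( int i = 0; i < 10000; i++ )
            sb.Append( "qwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnm" );

        byte[] raw = Encoding.UTF8.GetBytes( sb.ToString() );

        using ( MemoryStream compressed = new MemoryStream() )
        {
            // GZipStream flushes its remaining data when it is disposed
            using ( GZipStream gzip = new GZipStream( compressed, CompressionMode.Compress ) )
                gzip.Write( raw, 0, raw.Length );

            Console.WriteLine( "raw: {0} bytes, gzipped: {1} bytes",
                raw.Length, compressed.ToArray().Length );
        }
    }
}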


Two-way solution (compression of requests and responses)

We are now ready to enhance the partial solution and enable two-way compression (and this is my contribution to the issue).


First, go to the HttpCompressionModule and add a line:



...
 
void context_BeginRequest( object sender, EventArgs e )
{
    HttpApplication app = sender as HttpApplication;
    HttpContext ctx = app.Context;
 
    if ( !ctx.Request.Url.PathAndQuery.ToLower().Contains( ".asmx" ) )
        return;
 
    if ( IsEncodingAccepted( "gzip" ) )
    {
        /* INSERTED LINE HERE! 
           We add a filter to decompress incoming requests. Note that this
           assumes a client which accepts gzip also sends its request body
           gzip-compressed - which is exactly what the proxy built below does.
         */
 
        app.Request.Filter  = 
            new System.IO.Compression.GZipStream( 
                app.Request.Filter, CompressionMode.Decompress );
 
        app.Response.Filter = new GZipStream( app.Response.Filter,
            CompressionMode.Compress );
        SetEncoding( "gzip" );
    }
    else if ( IsEncodingAccepted( "deflate" ) )
    {
        app.Response.Filter = new DeflateStream( app.Response.Filter,
            CompressionMode.Compress );
        SetEncoding( "deflate" );
    }
}

Then, go back to the client application, and modify the partial class definition:



public partial class Service1
{
    #region WebResponseCompress
 
    protected override WebRequest GetWebRequest( Uri uri )
    {
        HttpWebRequest request = (HttpWebRequest)base.GetWebRequest( uri );
        request.Headers.Add( "Accept-Encoding", "gzip, deflate" );
 
        // create compressed request
        return new HttpWebRequestCompressed( request );
    }
 
    // THIS WILL NOT BE NEEDED ANYMORE
    //protected override WebResponse GetWebResponse( WebRequest request )
    //{
    //    return new HttpWebResponseDecompressed( request );
    //}
 
    #endregion
}

and add the HttpWebRequestCompressed class:



using System.IO;
using System.IO.Compression;
using System.Net;
 
// Wraps an HttpWebRequest so that the request body is gzip-compressed on the
// way out, and the response is decompressed (via HttpWebResponseDecompressed)
// on the way back
public class HttpWebRequestCompressed : System.Net.WebRequest
{
    private HttpWebRequest request;
 
    public HttpWebRequestCompressed( WebRequest request )
    {
        this.request = (HttpWebRequest)request;
    }
 
    public override WebResponse GetResponse()
    {
        // Reuse the decompressing wrapper from the one-way solution
        return new HttpWebResponseDecompressed( this.request );
    }
 
    public override Stream GetRequestStream()
    {
        // Everything the proxy writes into the request is compressed on the fly
        return new GZipStream( request.GetRequestStream(), CompressionMode.Compress );
    }
 
    // The remaining members just delegate to the wrapped request
    public override string Method
    {
        get { return request.Method; }
        set { request.Method = value; }
    }
 
    public override WebHeaderCollection Headers
    {
        get { return this.request.Headers; }
        set { this.request.Headers = value; }
    }
 
    public override string ContentType
    {
        get { return this.request.ContentType; }
        set { this.request.ContentType = value; }
    }
}

And that's it! Note that HttpWebRequestCompressed uses HttpWebResponseDecompressed when getting the response back. This is why we no longer need to override GetWebResponse in the proxy class.


Now inspect the application with Fiddler:



Note that this time the content length of the request is 6130 bytes (and the request is unreadable in Fiddler since it's compressed) and the response's length is still 5771 bytes.


Source code

Please download the source code for this tutorial here. If you find any bugs or have any other related comments, please feel free to comment on this post.