This is a migrated thread and some comments may be shown as answers.

HttpCompression in .NET 2.0

Shane Milton asked on 15 Dec 2005, 07:31 AM
As much as we all love Telerik controls, we also dread the fact that a 50kB page using textboxes and dropdown lists can easily grow past 500kB once you convert some of those into radInput boxes and radComboboxes, and that is very hard on users who don't have broadband.  I'm working on a data-input page that is approaching 1MB in size (just the HTML) under some scenarios, and we finally spent a few hours doing something about it.

With .NET 2.0 we get System.IO.Compression, which lets us easily implement HTTP compression on a per-application basis, and if we want to get creative with our C#, we can customize it even more!
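For anyone who hasn't played with the new namespace yet, here's a minimal sketch of the core idea (the file names are just placeholders): wrap any writable stream in a GZipStream, and everything you write through it comes out compressed.

using System.IO;
using System.IO.Compression;

class GZipDemo
{
    static void Main()
    {
        // Bytes written through the GZipStream wrapper are
        // compressed before they reach the underlying file.
        using (FileStream output = File.Create("page.html.gz"))
        using (GZipStream gzip = new GZipStream(output, CompressionMode.Compress))
        {
            byte[] data = File.ReadAllBytes("page.html");
            gzip.Write(data, 0, data.Length);
        }
    }
}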

If you take a look here, you will see a tutorial on how to get this up and working.  Once you implement it, it works great, almost...

Once you follow all of their instructions and run a few tests, you will find that your pages are now using HTTP compression, saving you mucho bandwidth as well as making large pages download much faster for your users!  HOWEVER, you will also notice that images begin to break and occasionally IE will crash!  All is not well!

As I'm sure many of you know, IE (and, I believe, some versions of Mozilla-based browsers, though I could be wrong) has bugs that cause it to mishandle images and other content (PDFs don't get properly passed to Adobe, etc.) when they are compressed with HTTP compression.

I've taken the above-linked demo and modified the logic to let you control what does and does not get compressed.  I basically take the URI being requested and search it for ".htm" or ".aspx"; if I find one, I compress the response, otherwise I leave it alone.  This is not a fool-proof system: it would compress foo.htm.jpg even though it's an image, and it would not (at least I don't think it would) compress "http://www.domain.com/", since that URI doesn't name an actual page.  Keep those two "loopholes" in mind, but it's good enough for me.  :)
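If you ever want to close the first loophole, one option (just a sketch, not what I'm actually running) is to check only the final extension of the request path instead of searching the whole URI:

using System;
using System.IO;

static class CompressionRules
{
    // Only the last extension of the path is examined, so
    // "foo.htm.jpg" is correctly treated as an image.
    public static bool ShouldCompress(Uri url)
    {
        string extension = Path.GetExtension(url.AbsolutePath).ToLower();

        // Treat a missing extension ("/") as a page request, since the
        // default document is usually an .aspx or .htm page anyway.
        return extension == ".aspx" || extension == ".htm"
            || extension == ".html" || extension == "";
    }
}

That would also handle the second loophole, since a bare "http://www.domain.com/" has no extension at all.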


Here's the code:



using System;
using System.IO;
using System.IO.Compression;
using System.Web;

namespace MSDN.Demo.Whidbey.Compression
{
    public class HttpCompressionModule : IHttpModule
    {
        void IHttpModule.Dispose()
        {
        }

        void IHttpModule.Init(HttpApplication context)
        {
            context.BeginRequest += new EventHandler(context_BeginRequest);
        }

        void context_BeginRequest(object sender, EventArgs e)
        {
            // Get the application object to gain access to the request's details
            HttpApplication app = (HttpApplication)sender;

            // Encodings the browser says it accepts
            string encodings = app.Request.Headers.Get("Accept-Encoding");

            // No accepted encodings; stop the HTTP module processing
            if (encodings == null)
                return;

            if (ShouldCompress(app.Request.Url.ToString()))
            {
                // Current response stream
                Stream baseStream = app.Response.Filter;

                // Find out the requested encoding, preferring gzip over deflate
                encodings = encodings.ToLower();
                if (encodings.Contains("gzip"))
                {
                    app.Response.Filter = new GZipStream(baseStream, CompressionMode.Compress);
                    app.Response.AppendHeader("Content-Encoding", "gzip");
#if DEBUG
                    app.Context.Trace.Warn("GZIP compression on");
#endif
                }
                else if (encodings.Contains("deflate"))
                {
                    app.Response.Filter = new DeflateStream(baseStream, CompressionMode.Compress);
                    app.Response.AppendHeader("Content-Encoding", "deflate");
#if DEBUG
                    app.Context.Trace.Warn("Deflate compression on");
#endif
                }
            }
        }

        // Compress only responses whose URI looks like a page, so images
        // and other binary content are left alone.
        private bool ShouldCompress(string s)
        {
            string str = s.ToLower();
            return str.Contains(".aspx") || str.Contains(".htm");
        }
    }
}
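Don't forget that the module does nothing until it is registered in web.config.  Something like this, where the assembly name is a placeholder you'd adjust to your own project:

<configuration>
  <system.web>
    <httpModules>
      <!-- type is "Namespace.ClassName, AssemblyName" -->
      <add name="HttpCompressionModule"
           type="MSDN.Demo.Whidbey.Compression.HttpCompressionModule, MyCompressionAssembly" />
    </httpModules>
  </system.web>
</configuration>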



Enjoy!

3 Answers, 1 is accepted

Rumen (Telerik team) answered on 15 Dec 2005, 08:42 AM
Hello Shane,

Very interesting read. By the way, as far as I know "gzip" HTML compression can only be handled by IE - is it already supported by Firefox / Opera-based browsers?

As far as combobox render size is concerned, a possible workaround is to send empty comboboxes to the client on the initial load (Page.IsPostBack == false) and then use callbacks to load the data.

Rumen Stankov
PM, telerik
Shane Milton answered on 15 Dec 2005, 03:00 PM
Doing some quick Googling, it appears that Opera 5.12 and newer support GZIP (so says http://www.http-compression.com/), and Firefox appears to support it as well (I was not able to find good details, but it supports compression of some kind for sure).  In any case, this code handles whether or not the browser supports a given kind of compression.  It looks at the Accept-Encoding portion of the request and makes sure "gzip" is in it before compressing with that; if it's not there, it checks for "deflate" and uses that instead, and if neither is there it stops trying and sends the response uncompressed.  I believe all modern browsers support GZIP (I don't know about Safari, as we have no Mac test machines on site to test with).
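To make the negotiation concrete, the exchange on the wire looks roughly like this (illustrative headers only):

GET /page.aspx HTTP/1.1
Host: www.example.com
Accept-Encoding: gzip, deflate

HTTP/1.1 200 OK
Content-Type: text/html
Content-Encoding: gzip

The module only ever emits a Content-Encoding that appeared in the browser's Accept-Encoding list, so a browser that doesn't advertise compression gets a plain response.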

As for the comboboxes, that's what we do.  It's just that we have around 80-100 comboboxes on a single page and 40ish radInput boxes on there as well, not to mention a tabstrip and a menu, and we'll be adding window and perhaps upload.  Sounds impressive, but it's mostly a boring data-entry page; just imagine filling out your tax forms all on one page, and this has the potential to be about that complex.  Unfortunately, we have decided this is our best avenue because we will end up having to run dynamically generated JavaScript on any number of fields immediately after any given field has been edited, so this is becoming quite the beast of a page and will just continue to grow.  Luckily most of our users have broadband of some sort.  :)

-Shane
Josef Rogovsky answered on 16 Dec 2005, 05:39 AM
I haven't yet looked into the implications for .NET 2.0, but we've had pretty good success with ZipEnable from port80software.com for enabling HTTP compression on our IIS 6.0 web servers.

It has built-in browser compatibility checking and will only serve compressed pages to browsers that support compression.

I've also never had any of the problems with PDF files that Shane mentioned were possible.