Best Practice No 4:- Improve bandwidth performance of ASP.NET sites using IIS compression

Introduction

Bandwidth performance is one of the critical requirements for every website. Today the major cost of running a website is not hard disk space but bandwidth, so transferring the maximum amount of data over the available bandwidth becomes very critical. In this article we will see how we can use IIS compression to increase bandwidth performance.

Please feel free to download my free 500 question and answer videos which cover Design Patterns, UML, Function Points, Enterprise Application Blocks, OOP'S, SDLC, .NET, ASP.NET, SQL Server, WCF, WPF, WWF, SharePoint, LINQ, Silverlight and .NET Best Practices from http://www.questpond.com/

1.jpg

My other .NET best practices articles
 
.NET best practice 1:- In this article we discuss how we can find high memory consumption areas in .NET. You can read about the same at
http://www.c-sharpcorner.com/UploadFile/shivprasadk/452069230108152009163244PM/4520692301.aspx?ArticleID=50bdd822-23d0-4baa-ab0a-21314b94d9e5
           
.NET best practice 2:- In this article we discuss how we can improve performance using the finalize/dispose pattern.
http://www.c-sharpcorner.com/UploadFile/shivprasadk/657567608232009132704PM/6575676.aspx?ArticleID=594e945a-ba48-4fee-9ecd-7b182efd076b
     

.NET best practice 3:- In this article we discuss how we can use performance counters to gather performance data from .NET applications.
http://www.c-sharpcorner.com/UploadFile/shivprasadk/3253453409022009023755AM/32534534.aspx?ArticleID=9dbdcda3-53e9-46c3-9d09-2b46b9d5f5a2

Thanks, Thanks and Thanks
 
Every bit of inspiration for this article has come from Scott Forsyth's article on IIS compression. You can say I have just created a new version with more details.

http://weblogs.asp.net/owscott/archive/2009/02/22/iis-7-compression-good-bad-how-much.aspx 

Thanks to Jimmie for suggesting the performance counters and other details of IIS compression.

I also picked up some bits from this Microsoft link: http://technet.microsoft.com/hi-in/library/bb742379(en-us).aspx .

How does IIS compression work?

Note:- All examples shown in this article use IIS 6.0. The only reason we have used IIS 6.0 is that 7.0 is still not that common.

Before we move ahead and talk about how IIS compression works, let's try to understand how IIS normally works. Let's say the user requests a 'Home.html' page which is 100 KB in size. IIS serves this request by passing the 100 KB HTML page over the wire to the end user's browser.


2.jpg

When compression is enabled on IIS the sequence of events changes as follows:-

The user requests a page from the IIS server. While requesting the page, the browser also tells the server which compression types it supports. Below is a simple request sent by the browser which says it supports 'gzip' and 'deflate' (see the 'Accept-Encoding' header). We used Fiddler (http://www.fiddler2.com/fiddler2/version.asp ) to capture the request data.

GET /questpond/index.html HTTP/1.1
Accept: image/gif, image/x-xbitmap, image/jpeg, image/pjpeg, application/x-shockwave-flash, application/vnd.ms-excel, */*
Accept-Language: en-us
Accept-Encoding: gzip, deflate
User-Agent: Mozilla/4.0
Host: www.questpond.com
Connection: Keep-Alive

Depending on the compression types supported by the browser, IIS compresses the data and sends it over the wire to the browser.


3.jpg
The browser then decompresses the data and displays it to the user.
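To make the mechanics concrete, below is a minimal sketch (my own illustration, not the IIS setup discussed in this article) of how the same negotiation can be done by hand in an ASP.NET page using the .NET GZipStream class: check the 'Accept-Encoding' request header, compress the response through a response filter and tell the browser which encoding was used via the 'Content-Encoding' header. IIS does essentially the same work for us, outside our application code.

protected void Page_Load(object sender, EventArgs e)
{
    // Read the compression types the browser says it supports.
    string acceptEncoding = Request.Headers["Accept-Encoding"] ?? "";

    if (acceptEncoding.Contains("gzip"))
    {
        // Compress everything written to the response from here on.
        Response.Filter = new System.IO.Compression.GZipStream(
            Response.Filter, System.IO.Compression.CompressionMode.Compress);

        // Tell the browser how to decompress the payload.
        Response.AppendHeader("Content-Encoding", "gzip");
    }

    // Some repetitive output; compresses extremely well.
    for (int i = 0; i < 1000; i++)
    {
        Response.Write("Sending huge data" + "<br>");
    }
}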

Compression fundamentals: - Gzip and deflate

IIS supports two kinds of compression: gzip and deflate. Both are more or less the same; gzip is an extension over deflate. Deflate is a compression algorithm which combines LZ77 and Huffman coding. In case you are interested in reading more about LZ77 and Huffman coding, you can do so at http://www.zlib.net/feldspar.html .
 

4.jpg
Gzip is based on the deflate algorithm, with extra headers added to the deflate payload.

5.jpg

Below are the details of the headers added to the deflate payload. Gzip starts with a 10-byte header which has a version number and a timestamp, followed by optional headers such as the original file name. At the end it has the actual deflate-compressed payload and an 8-byte trailer (checksum and size) to ensure data is not corrupted in transmission.

6.jpg

Google, Yahoo and Amazon use gzip, so we can safely assume that it's supported by most browsers.
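The header/trailer overhead is easy to see with the .NET GZipStream and DeflateStream classes, which implement the two formats discussed above. The small console sketch below (my own illustration, not part of the IIS setup) compresses the same text with both and prints the sizes; the gzip output is the same deflate payload plus a few extra bytes of header and trailer.

using System;
using System.IO;
using System.IO.Compression;
using System.Text;

class GzipVsDeflate
{
    static void Main()
    {
        // Build some repetitive sample text, similar to typical HTML output.
        StringBuilder builder = new StringBuilder();
        for (int i = 0; i < 1000; i++)
        {
            builder.Append("Sending huge data<br>");
        }
        byte[] data = Encoding.UTF8.GetBytes(builder.ToString());

        byte[] gzip = Compress(data, true);
        byte[] deflate = Compress(data, false);

        // Gzip is slightly larger: same deflate payload plus the gzip
        // header and the 8-byte checksum/size trailer.
        Console.WriteLine("Original: {0} bytes", data.Length);
        Console.WriteLine("Deflate : {0} bytes", deflate.Length);
        Console.WriteLine("Gzip    : {0} bytes", gzip.Length);
    }

    static byte[] Compress(byte[] data, bool useGzip)
    {
        using (MemoryStream output = new MemoryStream())
        {
            Stream compressor = useGzip
                ? (Stream)new GZipStream(output, CompressionMode.Compress)
                : new DeflateStream(output, CompressionMode.Compress);
            using (compressor)
            {
                compressor.Write(data, 0, data.Length);
            }
            return output.ToArray();
        }
    }
}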

Enabling IIS compression
 
So far we have covered enough theory to understand IIS compression. Now let's get our hands dirty and see how we can actually enable it.

Step 1:- Enable compression

The first step is to enable compression on IIS. Right-click on 'Web Sites', select 'Properties' and click on the 'Service' tab. To enable compression we need to tick the two checkboxes on the 'Service' tab of the IIS website properties. The below figure shows the location of both checkboxes.


7.jpg

Step 2:- Enable metabase.xml edit

Metadata for IIS comes from 'Metabase.xml', which is located at "%windir%\system32\inetsrv\". For compression to work properly we need to make some changes to this XML file, and to make those changes we need to tell IIS to give us edit rights. So right-click on the IIS server root, go to 'Properties' and tick the 'Enable Direct Metabase Edit' checkbox as shown in the below figure.

8.jpg

Step 3:- Set the compression level and extension types

The next step is to set the compression level and the extension types. The compression level can be set between 0 and 10, where 0 specifies the mildest compression and 10 specifies the highest level of compression. This value is specified using the 'HcDynamicCompressionLevel' property. There are two compression schemes, 'deflate' and 'gzip', and the property needs to be set for both of them as shown in the below figures.

9.jpg

10.jpg

We also need to specify which file types should be compressed. The 'HcScriptFileExtensions' property helps us specify the same. For the current scenario we specified that ASPX output should be compressed before it is sent to the browser.
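Instead of editing 'Metabase.xml' by hand, the same two properties can also be set through code. Below is a rough sketch using the System.DirectoryServices ADSI provider; it assumes IIS 6.0 with the IIS ADSI provider available and administrative rights, and the extension list ('aspx', 'asmx') is just an example. Treat it as an illustration rather than a ready-made tool.

using System;
using System.DirectoryServices;

class SetIisCompression
{
    static void Main()
    {
        // The gzip and deflate schemes live under W3SVC/Filters/Compression.
        string[] schemes = { "gzip", "deflate" };

        foreach (string scheme in schemes)
        {
            using (DirectoryEntry entry = new DirectoryEntry(
                "IIS://localhost/W3SVC/Filters/Compression/" + scheme))
            {
                // 0 = mildest compression, 10 = highest compression.
                entry.Properties["HcDynamicCompressionLevel"].Value = 4;

                // Extensions whose dynamic output should be compressed
                // (example list; adjust for your application).
                entry.Properties["HcScriptFileExtensions"].Value =
                    new object[] { "aspx", "asmx" };

                entry.CommitChanges();
            }
        }
    }
}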

Step 4:- Does it really work?

Once you are done with the above three steps, it's time to see if the compression really works. So we will create a simple C# ASP.NET page which loops 10,000 times and sends some output to the browser.

        protected void Page_Load(object sender, EventArgs e)
        {
            // Write a large, repetitive response so the effect of
            // compression is easy to see in Fiddler.
            for (int i = 0; i < 10000; i++)
            {
                Response.Write("Sending huge data" + "<br>");
            }
        }
In order to see the difference before and after compression we will run the Fiddler tool while we request our ASP.NET loop page. You can download Fiddler from http://www.fiddler2.com/fiddler2/version.asp .

The below screen shows the data captured by Fiddler without compression and with compression. Without compression the data is 80,501 bytes, and with compression it comes down to 629 bytes. I am sure you will agree that's a great improvement from a bandwidth point of view.

11.jpg
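If you do not want to keep Fiddler open, a quick check can also be scripted. The console sketch below is an illustration only (replace the URL, which is a placeholder, with your own page): it sends an 'Accept-Encoding: gzip, deflate' header, leaves automatic decompression off so the raw bytes on the wire are what gets counted, and prints the 'Content-Encoding' header together with the number of body bytes received.

using System;
using System.IO;
using System.Net;

class CompressionCheck
{
    static void Main()
    {
        // Hypothetical URL of the ASP.NET loop page created above.
        string url = "http://localhost/questpond/LoopPage.aspx";

        HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
        request.Headers[HttpRequestHeader.AcceptEncoding] = "gzip, deflate";

        using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
        using (Stream body = response.GetResponseStream())
        {
            // Count the raw (possibly compressed) bytes coming over the wire.
            byte[] buffer = new byte[8192];
            long total = 0;
            int read;
            while ((read = body.Read(buffer, 0, buffer.Length)) > 0)
            {
                total += read;
            }

            Console.WriteLine("Content-Encoding: " + response.ContentEncoding);
            Console.WriteLine("Body bytes      : " + total);
        }
    }
}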

0, 1, 2, 3, 4 ... 10 IIS compression levels

In the previous section we set 'HcDynamicCompressionLevel' to the value '4'. The higher the compression level, the smaller the data sent over the wire; the downside of increasing the compression level is higher CPU utilization. One of the big challenges is to figure out the optimum compression level, and this depends on many things: the type of data, the load, etc. In the coming sections we will try to derive the best compression level for different scenarios.

3 point consideration for IIS compression optimization

Many developers just enable IIS compression with the default values shown below. But the default values do not hold good for every environment. The right settings depend on many factors, such as what content types your site is serving. If your site has only static HTML pages the compression levels will be different compared to a site serving mostly dynamic pages.

Compression option        File type   Default configuration
File types compressed     Static      .txt, .htm, and .html
                          Dynamic     -
Compression schemes       Static      Both gzip and deflate
                          Dynamic     Both gzip and deflate
Compression level         Static      10
                          Dynamic     0

The above table is taken from http://www.microsoft.com/technet/prodtechnol/WindowsServer2003/Library/IIS/25d2170b-09c0-45fd-8da4-898cf9a7d568.mspx?mfr=true 

If your site is only serving already-compressed data such as JPEG and PDF files, it's probably not advisable to enable compression at all, as CPU utilization increases considerably for small compression gains. On the other hand we also need to balance compression with CPU utilization: the more we increase the compression level, the more CPU resources are used.

Different data types need different IIS compression levels for optimum results. In the coming sections we will take different data types, analyze them at different compression levels and see how CPU utilization is affected. The below figure shows the different data types with some examples of file types.

12.jpg

Static data compression

Let's start with the easiest one: static content types like HTML and HTM. If a user requests a static page from an IIS server that has compression enabled, IIS compresses the file and puts it in the
'%windir%\IIS Temporary Compressed Files' directory.

Below is a simple screen which shows a snapshot of the compressed files folder. Compression happens only the first time; on subsequent calls for the same file, the compressed data is picked up from the compressed files directory.


13.jpg
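A quick way to confirm that static compression has kicked in is to look inside that cache directory. The small console sketch below is just an illustration (the path shown is the IIS 6.0 default and may differ on your server); it lists the cached compressed files together with their sizes.

using System;
using System.IO;

class ListCompressedCache
{
    static void Main()
    {
        // Default IIS 6.0 static compression cache directory.
        string cacheDir = Environment.ExpandEnvironmentVariables(
            @"%windir%\IIS Temporary Compressed Files");

        DirectoryInfo info = new DirectoryInfo(cacheDir);
        foreach (FileInfo file in info.GetFiles("*", SearchOption.AllDirectories))
        {
            Console.WriteLine("{0,10:N0} bytes  {1}", file.Length, file.Name);
        }
    }
}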

Below are some sample readings we took for HTML files ranging in size from 100 KB to 2048 KB, with the compression level set to '0'.

Actual size (KB)   Compressed size (KB)
100                24
200                25
300                27
1024               32
2048               41

(Compression level set to '0')

You can easily see that even with the lowest compression level enabled the compression is almost 5 times, and far more for the bigger files.

14.jpg

Because the compression happens only the first time, we can happily set the compression level to '10'. The first request will cause a spike in CPU utilization, but on subsequent calls the CPU cost is small compared to the compression gains.

Dynamic data compression

Dynamic data compression is a bit different from static compression. Dynamic compression happens every time a page is requested, so we need to balance CPU utilization against the compression level.

In order to find the optimum compression level, we did a small experiment as shown below. We took 5 files ranging from 100 KB to 2 MB and changed the compression level from 0 to 10 for every file size to check how much the data was compressed. Below are the compressed data readings in bytes.

Compressed output in bytes per compression level and file size

Compression level   100 KB    200 KB    300 KB    1 MB      2 MB
0                   32,774    35,496    37,699    52,787    109,382
1                   30,224    32,300    34,104    46,328    92,813
2                   29,160    31,004    32,673    43,887    87,033
3                   28,234    29,944    31,628    42,229    83,831
4                   26,404    27,655    29,044    34,632    44,155
5                   25,727    26,993    28,488    33,678    42,395
6                   25,372    26,620    28,488    33,448    41,726
7                   25,340    26,571    28,242    33,432    41,678
8                   25,326    26,557    28,235    33,434    41,489
9                   24,826    26,557    28,235    33,426    41,490
10                  24,552    25,764    27,397    32,711    42,610

The above readings by themselves do not show anything specific; they look a bit messy. So we plotted the below graph using the above data and hit the sweet spot: you can see that even after increasing the compression level from 4 to 10 the compressed size hardly changes. We repeated this on two to three different environments and the sweet spot always came out at '4'.

So the conclusion we draw from this is that setting the compression level to '4' for dynamic pages is an optimized setting.

15.jpg

Compressed files and compression
 
Compressed files are files that are already compressed, for example JPEG and PDF files. We did a small test by taking already-compressed JPEG files, and below are our readings. After applying IIS compression the files did not change much in size.

Actual file size (KB)   Size after IIS compression (KB)
100                     102
220                     210
300                     250
1024                    980
2048                    1987

When we plot a graph you can see that the compression benefits are very small. We may end up using more CPU resources and gain next to nothing in terms of compression.

16.jpg

So the conclusion we can draw for compressed files is that we can disable compression for already-compressed file types like JPEG and PDF.

CPU usage, dynamic compression and load testing
 
One of the important points to remember for dynamic data is to optimize between CPU utilization, compression levels and load on the server.

17.jpg

We used WCAT to stress the server with 100 concurrent users. For every file size from 100 KB to 2 MB we recorded CPU utilization at every compression level. We recorded the processor time of the w3wp.exe worker process using a performance counter: in the performance monitor, choose the 'Process' category, select the '% Processor Time' counter and pick 'w3wp' from the instance list.
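The same counter can also be sampled from code while the WCAT run is in progress. Below is a minimal console sketch using System.Diagnostics.PerformanceCounter; it assumes a single worker process whose instance name is 'w3wp' (with multiple application pools the instances are named 'w3wp#1', 'w3wp#2' and so on).

using System;
using System.Diagnostics;
using System.Threading;

class W3wpCpuSampler
{
    static void Main()
    {
        // "% Processor Time" for the IIS worker process instance "w3wp".
        PerformanceCounter cpu = new PerformanceCounter(
            "Process", "% Processor Time", "w3wp");

        // The first sample is always 0, so take it and throw it away.
        cpu.NextValue();

        for (int i = 0; i < 10; i++)
        {
            Thread.Sleep(1000);

            // Divide by the number of processors to normalize to 0-100%.
            float usage = cpu.NextValue() / Environment.ProcessorCount;
            Console.WriteLine("w3wp CPU: {0:F2}%", usage);
        }
    }
}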

 

% Processor Time of w3wp.exe per compression level and file size

Compression level   100 KB   200 KB   300 KB   1 MB     2 MB
0                   1.56     1.56     1.56     4.69     4.00
1                   1.56     1.56     1.56     4.69     4.00
2                   1.56     1.56     1.56     4.69     4.30
3                   1.56     1.56     1.56     4.69     4.63
4                   1.56     1.56     1.56     4.69     6.25
5                   3.00     2.00     1.56     4.69     7.81
6                   3.45     2.40     3.13     4.69     9.38
7                   4.00     3.00     5.00     6.25     43.75
8                   5.60     3.50     8.00     15.62    68.75
9                   6.00     5.00     9.00     25.00    87.50
10                  7.81     6.25     10.94    37.50    98.43


If we plot a graph using the above data we hit a sweet spot at level 6: until the compression level crosses 6, CPU utilization is not really affected.

18.jpg

TTFB and Compression levels

TTFB, also termed time to first byte, is the number of milliseconds that pass before the first byte of the response is received. We performed a small experiment on 1 MB and 2 MB dynamic pages with different compression levels and measured the TTFB for every combination of compression level and file size. WCAT was used to measure TTFB.
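WCAT reports TTFB for us, but if you want a rough standalone check, the sketch below (an illustration only; the URL is a placeholder) times how long it takes from sending the request until the first byte of the response body can be read, using HttpWebRequest and Stopwatch.

using System;
using System.Diagnostics;
using System.IO;
using System.Net;

class TtfbCheck
{
    static void Main()
    {
        // Hypothetical URL of the dynamic page being measured.
        string url = "http://localhost/questpond/LoopPage.aspx";

        Stopwatch watch = Stopwatch.StartNew();

        HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
        request.Headers[HttpRequestHeader.AcceptEncoding] = "gzip, deflate";

        using (WebResponse response = request.GetResponse())
        using (Stream body = response.GetResponseStream())
        {
            // Reading the first byte approximates the time to first byte.
            body.ReadByte();
            watch.Stop();
        }

        Console.WriteLine("Approximate TTFB: {0} ms", watch.ElapsedMilliseconds);
    }
}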

 

TTFB in milliseconds per compression level and file size

Compression level   1 MB     2 MB
0                   8.00     9.00
1                   8.00     9.00
2                   8.00     9.00
3                   8.00     9.00
4                   8.00     9.00
5                   9.00     9.00
6                   12.00    17.00
7                   16.00    18.00
8                   19.00    19.00
9                   22.00    37.00
10                  29.00    47.00

When we plot the above data we get the value '5' as the sweet spot: until the compression level crosses '5', TTFB remains more or less constant.

19.jpg 

Screenshot of WCAT output for the TTFB measurement.

20.jpg

IIS 7.0 and CPU roll off
 
All the above experiments and conclusions were done on IIS 6.0. IIS 7.0 has a very important feature, CPU roll-off. CPU roll-off acts like a cut-off gate so that CPU resources are not consumed without limit.

When CPU usage gets beyond a certain level IIS stops compressing pages, and when it drops below a different level it starts compressing again. This is controlled using attributes such as 'staticCompressionEnableCpuUsage' and 'dynamicCompressionDisableCpuUsage'. It's like a safety valve so that CPU usage does not take you by surprise.
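For readers already on IIS 7.0, these thresholds live in the httpCompression section of applicationHost.config and can be changed through the Microsoft.Web.Administration API. The sketch below is an illustration only; the threshold values shown (50 and 90) are examples, not recommendations, so test against your own workload.

using System;
using Microsoft.Web.Administration;

class SetCpuRollOff
{
    static void Main()
    {
        using (ServerManager manager = new ServerManager())
        {
            Configuration config = manager.GetApplicationHostConfiguration();
            ConfigurationSection compression =
                config.GetSection("system.webServer/httpCompression");

            // Stop dynamic compression when CPU goes above 90%,
            // resume it when CPU drops below 50% (example values).
            compression["dynamicCompressionDisableCpuUsage"] = 90;
            compression["dynamicCompressionEnableCpuUsage"] = 50;

            manager.CommitChanges();
        }
    }
}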

Conclusion

  • If the files are already compressed, do not enable compression on those files. We can safely disable compression for EXE, JPEG, PDF, etc.
  • For static pages the compression level can be set to 10, as the compression happens only once.
  • For dynamic pages the compression level can range from '4' to '6' depending on the server environment and configuration. The best way to judge which compression level suits you best is to run the TTFB, CPU utilization and compression tests explained in this article.

    In case you want to do a sanity check, please refer to this article: http://weblogs.asp.net/owscott/archive/2009/02/22/iis-7-compression-good-bad-how-much.aspx . I agree my results do not match Scott's exactly, but I think we are very much on the same page.

Some known issues on IIS compression
 
Below are some known issues with IIS compression:
http://support.microsoft.com/kb/837251 

http://support.microsoft.com/kb/823386 

Links for further reading

http://www.zlib.net/feldspar.html 

http://www.15seconds.com/Issue/020314.htm 

http://www.fiddler2.com/fiddler2/version.asp 

http://technet.microsoft.com/hi-in/library/bb742379(en-us).aspx 

http://weblogs.asp.net/owscott/archive/2009/02/22/iis-7-compression-good-bad-how-much.aspx 

http://www.codinghorror.com/blog/archives/000059.html 

http://blogs.microsoft.co.il/blogs/yevgenif/archive/2009/02/08/web-performance-enable-data-compression-on-iis-6-0-server.aspx 
