Sunday 27 March 2011

More Web-Optimization

Ok, so we've covered the basics of making the page size as small as possible.

Now on to the more obscure time saving methods!

CSS Placement
Always ensure your CSS styles, be they inline or external references, are placed inside the HEAD tag of your HTML page. Browsers want the full set of styles before they render: if a stylesheet turns up lower down the page, the browser either delays rendering until it has parsed it, or renders the content and then has to re-draw it once the styles arrive. If you've referenced your styles inside the HEAD tag, the browser will have the required information available up front; if you haven't, it won't.
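As a quick sketch, this is what that looks like (file and class names here are just illustrative):

```html
<head>
    <title>Example Page</title>
    <!-- External stylesheet referenced in the HEAD, so it's available before rendering -->
    <link rel="stylesheet" type="text/css" href="styles/site.css" />
    <style type="text/css">
        /* Inline styles also belong here, not in the BODY */
        .highlight { background-color: yellow; }
    </style>
</head>
```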

CSS @Import 
This statement in CSS allows you to reference one stylesheet from another. While that sounds convenient, it unfortunately has the side effect (in Internet Explorer, at least) of behaving as if the stylesheet reference were placed at the bottom of your HTML page, which, as we've just covered, is a bad thing. Instead, just add another stylesheet reference directly within the HEAD of the HTML or, as we'll cover in a second, combine the two stylesheets.
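A quick sketch of the replacement (file names are illustrative): rather than having site.css contain `@import url("base.css");`, reference both files directly:

```html
<!-- Instead of @import inside site.css, link both stylesheets from the HEAD -->
<link rel="stylesheet" type="text/css" href="styles/base.css" />
<link rel="stylesheet" type="text/css" href="styles/site.css" />
```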

JavaScript Placement
Try to ensure all your JavaScript files are referenced at the bottom of your HTML page. Unfortunately, scripts block parallel downloads while they load, so placing them at the end of your HTML page, after everything else has been downloaded, stops them from holding up anything useful.
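In other words, aim for something like this (file names are illustrative):

```html
<body>
    <!-- Page content first... -->
    <div id="content">...</div>

    <!-- ...scripts last, just before the closing BODY tag, so they
         don't block the download of anything the user can see -->
    <script type="text/javascript" src="scripts/site.js"></script>
</body>
```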

Make CSS and JavaScript External
If you make your CSS and JavaScript external then the web browser can cache the relevant files so, during the next page load, it can read the file straight from disk rather than going off to the web server to fetch it. Not only will this lower the load on your web server, it'll also save time; loading from a local disk is a lot faster than fetching a file across the internet! Be warned though... some browsers are reluctant to cache files served over HTTPS, so double-check your caching headers if your site runs over SSL.

Reduce HTTP Requests
Every time you request a CSS file, a JavaScript file, an image, or just about anything that isn't within the plain HTML, an HTTP request has to be made for that file. Each request carries a performance overhead, so reducing the number of them should speed things up. Try combining all of your CSS files into one, and all of your JavaScript files into one. As for images, try using the Sprite and Image Optimization Framework produced by Microsoft. Essentially, it combines all of your images into one large image and then, using CSS, displays only portions of that large image, so each one still looks like a separate image to your users. Pretty fancy stuff and, again, it reduces the number of HTTP requests!
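The CSS half of a sprite looks something like this (the class names, file name and offsets are made up for illustration):

```css
/* One combined image; each class shows a different 16x16 region of it */
.icon {
    background-image: url("images/sprite.png");
    background-repeat: no-repeat;
    width: 16px;
    height: 16px;
}
.icon-home { background-position: 0 0; }        /* first tile */
.icon-mail { background-position: -16px 0; }    /* second tile, 16px along */
.icon-user { background-position: -32px 0; }    /* third tile */
```

One download serves every icon on the page, instead of one request per image.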

Reduce DNS Lookups
If your resources are spread across different servers then a DNS lookup needs to be performed for each distinct host name you fetch from. For those of you that don't know what that is, it's essentially the process of finding out the IP address of a given domain name (e.g. Microsoft.com -> 65.55.12.249). There's an overhead with this lookup, so reducing the number of lookups will again improve performance. With that said, however, a web browser can only download a certain number of files in parallel from a given server (in IE 7 this is limited to 2 files at any one time; in IE 8 it's been increased to 6), so putting resources on different servers will enable the user's web browser to download more files at a given time. Obviously there's a trade-off here: the more servers you spread your resources over, the more you can download at any given time, but the larger the DNS lookup time penalty.

Reduce 404 Errors
There's really no need to be getting a 404 error for any resource you may, or may not, require. It may not even break anything but a 404 means you've paid the cost of an HTTP request that achieves absolutely nothing and, as I covered earlier, the fewer HTTP requests, the better.

Turn Debugging Off
This is an ASP.NET specific performance improvement. Within your web.config file, there will be something like this line:

<compilation defaultLanguage="c#" debug="true">

Make sure debug="false". When set to true, several things happen: compiled code isn't optimized, and extra debug symbol (.pdb) files are produced for each compiled page, which will slow down your website. However, I've found that the bigger performance problem is the extra JavaScript debugging and validation code that gets served, especially if you're using the Microsoft AJAX framework (the debug, rather than release, versions of its scripts are used). In one instance, just by turning debugging off, a page that was taking 18+ seconds to load was reduced to 2.
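For reference, here's the corrected setting in context (a minimal web.config fragment):

```xml
<configuration>
  <system.web>
    <!-- debug="false" for production: optimized code, release AJAX scripts -->
    <compilation defaultLanguage="c#" debug="false" />
  </system.web>
</configuration>
```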

Ok, and that's about all I can think of for the time being. Website performance optimization is a huge subject with many a web page devoted to it. Personally, I find Yahoo's research on this invaluable so if I were you, I'd check out this guide that they've produced. It covers everything above and more. Yahoo also make some pretty awesome tools for helping with this, specifically, I've used the .NET port of their compressor which is one of the best I've come across. If you've got any other tips that aren't covered here or on Yahoo's guide, feel free to let me know, I'd love to hear them!

Sunday 13 March 2011

HTTP Compression

Ok, so in my last post I said that minimizing the amount of data sent across the wire is a sure way of speeding up performance.

Well, there's a very simple way of doing this which I haven't discussed yet and that's by enabling HTTP compression.

HTTP compression is a completely lossless way of making your data take up less space. There are two main forms of HTTP compression - GZip and Deflate. Both are supported by virtually all of the main browsers nowadays, so which one you choose to use is completely up to you but, from my research, GZip seems to be the more popular.

So, how do you enable HTTP compression? Well, there are two ways:
  1. You can do it within IIS (See here for instructions on how to do that: MSDN)
  2. If you don't have access to IIS then you can do it in code using our friend Response.Filter. To do this, just use the following code and place it within your Application_BeginRequest method within your global.asax class:


void Application_BeginRequest(object sender, EventArgs e)
{
    string acceptEncoding = Request.Headers["Accept-Encoding"];

    if (acceptEncoding == null)
        return; // client doesn't advertise compression support - send the response as-is

    if (acceptEncoding.Contains("gzip"))
    {
        // Wrap the output stream so everything we write is GZip-compressed
        Response.Filter = new System.IO.Compression.GZipStream(
            Response.Filter, System.IO.Compression.CompressionMode.Compress, true);
        Response.AppendHeader("Content-Encoding", "gzip");
    }
    else if (acceptEncoding.Contains("deflate"))
    {
        // Fall back to Deflate if GZip isn't supported
        Response.Filter = new System.IO.Compression.DeflateStream(
            Response.Filter, System.IO.Compression.CompressionMode.Compress, true);
        Response.AppendHeader("Content-Encoding", "deflate");
    }
}


So, what we're doing here is checking whether the web browser supports GZip compression and, if so, setting up a new GZipStream which compresses our output before it's sent to the client. If the browser doesn't support GZip then we check for Deflate support and fall back to that instead. If neither is supported then we just send the data back uncompressed.


All very simple so there's no excuse not to use it!

My next post will continue in the same web-optimizing vein, where I'll discuss other, lesser known methods of speeding up performance of web pages.

Sunday 6 March 2011

Optimizing Website Performance

If you've built any reasonably sized website, I can all but guarantee that someone will utter the immortal words "Can this work a bit quicker?". You'll then spend days/weeks/months doing just that so, over the coming weeks I'm going to write a series of blogs to help with this. Each blog will work on a different area and today's area is page size.

From experience, I've found that this is one of the biggest factors (well, at least in terms of client performance). The smaller the page, the faster it is to download, the faster it is to render, it's just plain faster!

So, how do you go about reducing page size? Here's a few options...

UpdatePanels / AJAX / PageMethods

Well, firstly, use AJAX calls or UpdatePanels whenever possible. Both cause only a small portion of your page to be sent to the client, rather than the whole page, which makes a dramatic difference. If you use UpdatePanels then you'll still have the overhead of the full page life cycle but, if you use AJAX calls or PageMethods (which are just AJAX calls under the hood), you'll avoid that too, so it'll be even quicker.
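As a rough sketch, a PageMethod is just a static method on your page class marked with the WebMethod attribute (the class, method and values here are made up for illustration):

```csharp
using System.Web.Services;

public partial class ProductPage : System.Web.UI.Page
{
    // Callable from client script as PageMethods.GetPrice(...);
    // only the return value travels back, not the whole rendered page.
    [WebMethod]
    public static string GetPrice(int productId)
    {
        // Hypothetical lookup - a real page would hit your data layer here.
        return productId == 1 ? "9.99" : "0.00";
    }
}
```

Note that the ScriptManager on the page needs EnablePageMethods="true" for the client-side PageMethods proxy to be generated.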

Remove ViewState

If you ever view the HTML that an ASP.NET WebForms page produces, you'll see a hidden field named __VIEWSTATE containing a huge string. This string is how WebForms maintains state but, by passing it to the client each time, the page becomes a lot bigger than it needs to be. So, we can do two things here. Firstly, disable ViewState wherever you can: every WebControl has an EnableViewState property, and setting it to false disables ViewState for that control. Secondly, we can store the ViewState on the server rather than sending it to the client; whenever I do this, I usually store it in the Session object. There are a few articles that will tell you how to do this but, personally, I'd suggest reading this one before you start coding anything: http://www.hanselman.com/blog/MovingViewStateToTheSessionObjectAndMoreWrongheadedness.aspx
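Disabling ViewState for a single control is just a property in the markup (the control ID here is illustrative):

```html
<%-- This control no longer contributes anything to __VIEWSTATE --%>
<asp:Label ID="lblStatus" runat="server" EnableViewState="false" />
```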

Client IDs

If you're using ASP.NET 4.0 then you have the option of changing the ClientIDMode. This new feature essentially lets you control the exact ID that server controls get when they're rendered on the client. In previous versions, if you had a control with an ID of "example" then, assuming it was the only control on the page, it'd be rendered with the ID "ctl00_example". Once you start nesting controls you'll get IDs like "ctl00_parentId_childId_example" and, as you can imagine, in large systems these IDs can get pretty long, pretty fast. In .NET 4.0, you can set the ClientIDMode property of the page to "Static". Once that's done, all controls are rendered on the client with exactly the ID that was set on the server. So, if we had a control with an ID of "example", then no matter where it was rendered, it'd always have the ID "example"; no extra characters to make it unique. Obviously, you have to be a bit more careful when deciding on the IDs of your controls (you don't want any conflicts) but, just by changing that property, you can save yourself a significant amount of space.
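For example (the control ID is illustrative):

```html
<%-- .NET 4.0: render server IDs on the client unchanged --%>
<%@ Page Language="C#" ClientIDMode="Static" %>

<%-- Renders as id="example", not ctl00_..._example --%>
<asp:TextBox ID="example" runat="server" />
```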

Disable EventValidation

I'm a little reluctant to suggest this one but I'll mention it anyway. On the Page object there's a property called "EnableEventValidation". When set to true (the default), it validates any postback and callback events for invalid data. So, for example, it'll ensure that the value sent back for a DropDownList is actually present within the list, and that you're not trying to post back a value for a control that isn't visible. To help it do this, it sends data to the client in a hidden field called "__EVENTVALIDATION"; if you check the HTML of your page, you should be able to see it. Obviously, this takes up extra bytes that aren't strictly needed so, if you disable it, the hidden field disappears and your page gets a little smaller but, if you do this, I seriously suggest you re-implement the validation on the server. For more information, check out the MSDN article about event validation.
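Disabling it for a single page is just a directive attribute:

```html
<%-- Turns off event validation for this page only - re-validate on the server! --%>
<%@ Page Language="C#" EnableEventValidation="false" %>
```

It can also be set site-wide via the pages element in web.config, though per-page is the safer choice.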

Minification

External JavaScript and CSS files are also sent across the wire and can affect how responsive your site seems. To help with this, most files can benefit from being "minified". By this I mean that you give a tool a script and it strips out all the white space, renames the local variables to one-letter equivalents and, essentially, gets rid of absolutely everything that isn't necessary, leaving you with the smallest possible file. There are plenty of tools out there that'll do this for you, for free; a quick Google search reveals a few, such as the Microsoft Ajax Minifier and Minify CSS.
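To give a feel for what minification does, here's a hand-minified sketch (a real tool's output will differ, but the idea is the same):

```javascript
// Original: readable, with whitespace and descriptive names
function calculateTotal(prices, taxRate) {
    var total = 0;
    for (var index = 0; index < prices.length; index++) {
        total += prices[index];
    }
    return total * (1 + taxRate);
}

// Minified: identical behaviour, far fewer bytes on the wire
function calculateTotalMin(a,b){for(var c=0,d=0;d<a.length;d++)c+=a[d];return c*(1+b)}
```

Same logic, roughly half the characters, and the saving compounds across a whole script file.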

Image Size

Images are far and away the biggest files that'll be requested by a client on any normal page load and, if you don't try to optimize them, it makes all the above points a little pointless. Most images are bloated: they contain a lot of information that simply isn't needed and can be removed with no loss whatsoever to quality. Yahoo's Smush It! is an excellent tool for image optimization and I strongly suggest you use it. You simply give it an image; it strips out all the unnecessary stuff and returns a smaller image, with no loss of quality. It's an excellent tool and should be a bookmark on every web developer's computer.

Well, that should get you started. If you follow the above, you should see a dramatic decrease in your page sizes and, hopefully, an increase in performance. In my next blog I'm going to talk about implementing lossless HTTP compression using GZip and Deflate, which will decrease your page size even further!