Best practice for hosting static content on high traffic websites?


Hi all,
I have an e-commerce website that is getting more and more traffic (~1,000 visits per day). I want to restructure things to serve static content (CSS, layout images etc.) from a different domain to help with site speed.

The site is running on a Linux server. I have full control over the server, the site and any DNS records.

My plan was to set up a new subdomain on the same server, http://static.domain.com, copy the relevant files to it and update the stylesheets etc. to use the new subdomain. However, I've seen a couple of sites mention setting up a CNAME DNS record pointing back to the same site and doing it that way.
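
Something like this is what I had in mind - the record, names and paths are just for illustration:

; BIND zone-file entry pointing the new subdomain back at the same box
static    IN    CNAME    www.domain.com.

# Matching Apache virtual host to serve the static files
<VirtualHost *:80>
    ServerName static.domain.com
    DocumentRoot /var/www/static
</VirtualHost>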

Does anyone have any suggestions as to the best practice for setting this up? Paying for a CDN isn't an option.

Cheers,
Matt
 
Moving to a subdomain on the same server wouldn't have any effect - you're still running from the same server. Look into lighttpd and LiteSpeed as great lightweight HTTP servers - lighttpd uses its own config, and LiteSpeed is a drop-in Apache replacement.
 
1000 visits per day shouldn't be stressful at all... a standard Apache configuration should be able to deal with that without a problem.

Typically, lighttpd servers get set up for sites doing hundreds of thousands or millions of requests per day. By adding an additional layer of complexity, you might introduce unwelcome instability or other issues that you hadn't foreseen.
 
Sorry, looks like I didn't phrase it that well. What I meant was, to make the site load quicker for the end user, I could take advantage of parallelized loading. As I understand it, most modern browsers should load from at least 2 different DNS names at the same time.

Re-reading my post, it looks like I typed something completely different to what my brain was wanting to type.

To take advantage of parallelized loading, will a separate subdomain work sufficiently, or is there a better way to do it?
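
To illustrate what I mean (the URLs are made up):

<!-- Layout images from the main domain, product shots from the static
     subdomain, so the browser opens connections to both hosts in parallel -->
<img src="http://www.domain.com/images/header.png" alt="Header">
<img src="http://static.domain.com/products/widget-large.jpg" alt="Widget">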
 
As I understand it, most modern browsers should load from at least 2 different DNS names at the same time.

If a site has multiple A records, most browsers these days will be handed them in rotating order (DNS round-robin). This allows for rudimentary load-balancing.

The bottleneck of a site loading though is typically the end-user's internet connection - so loading from two servers simultaneously isn't likely to improve performance. If you can, you might want to look at using gzip to compress your pages (php.ini or a .htaccess can usually do this) which can improve loading times considerably.
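
For example, assuming Apache with mod_deflate available, a .htaccess along these lines would do it (for PHP output alone, zlib.output_compression = On in php.ini also works):

<IfModule mod_deflate.c>
    # Compress text-based responses on the fly before sending them
    AddOutputFilterByType DEFLATE text/html text/css text/plain
    AddOutputFilterByType DEFLATE application/javascript application/x-javascript
</IfModule>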
 
If you can, you might want to look at using gzip to compress your pages (php.ini or a .htaccess can usually do this) which can improve loading times considerably.

Agreed, this had a big impact on our server, which hosts 4 sites, serving around 100k visits per day.
 
As long as you have spare server CPU power, gzip is good and should reduce loading times for most people on a modern computer. :)
 
Thanks guys - gzip and some caching already in place.

The problem I'm trying to solve is that the images for the layout load first, followed by the images for the products. Because of the complexity of the design and the number of product images on one page, I want to try to load them at the same time. Whilst this won't have any effect on server load, it should help the user's perception of the speed of the website. It seems quite a common technique - both Google and Yahoo mention it as something you should do, but neither goes into quite enough detail about it.

Matt
 
The problem I'm trying to solve is that the images for the layout load first, followed by the images for the products. Because of the complexity of the design and the number of product images on one page, I want to try to load them at the same time. Whilst this won't have any effect on server load, it should help the user's perception of the speed of the website. It seems quite a common technique - both Google and Yahoo mention it as something you should do, but neither goes into quite enough detail about it.

Matt

You might be right actually - I'm not sure if the concurrency values set in Firefox etc. are per site or per host. If they're per site, using a subdomain to load images from shouldn't improve loading speed... but perhaps if they're per host, you might get a benefit. I don't think it would hurt to try, although with these things it can be hard to do an empirical test. :o
 
There's a whole host of settings under network.http in about:config - all sorts of things :)

Perhaps try combining the layout images into one larger image, then using offsets to show the different parts - especially for hover images etc.?
 
What format are your image files in?

Convert them to PNG-8. You can drastically reduce the size of images compared to JPEG, with little or no loss of quality.

If your JPEGs were saved at maximum quality (say, in Photoshop) then they are actually bigger than they need to be - much bigger, in fact! 95% is ideal. I realise that 5% may not sound like much, but trust me, it is. Also, aim no lower than 50% - heavily compressed JPEGs look awful.
 
The problem I'm trying to solve is that the images for the layout load first, followed by the images for the products.

You could try using the CSS sprite technique. For example, this is the master image YouTube loads. Have a search around and you'll find different examples of it in use. Also worth a mention: http://spritegen.website-performance.org/

[attached image: YouTube's sprite sheet]
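
A minimal sketch of the idea - the file name, class names and offsets are made up:

/* icons.png is a single 32x96 image holding three 32x32 icons stacked
   vertically; one download serves all three */
.icon {
    width: 32px;
    height: 32px;
    background-image: url(icons.png);
    background-repeat: no-repeat;
}
.icon-home   { background-position: 0 0; }
.icon-cart   { background-position: 0 -32px; }
.icon-search { background-position: 0 -64px; }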
 
In addition to reducing your image quality slightly and using CSS sprites, look into compressing (minifying) your CSS and JavaScript. There are a few compressors out there; you might shave off about a third of their size.
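
As a sketch, using YUI Compressor (one popular option - the file names and version are illustrative):

# Minify a stylesheet and a script; the output is the same code with
# whitespace and comments stripped
java -jar yuicompressor-2.4.2.jar styles.css -o styles.min.css
java -jar yuicompressor-2.4.2.jar scripts.js -o scripts.min.js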

If you use any popular JavaScript library, use the free Google-hosted API service: http://code.google.com/apis/ajaxlibs/documentation/index.html#AjaxLibraries. Instead of a downloaded JS file hosted on your server, you simply use the URL from the link above in your script tag. It should be faster coming from their servers :)
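
For example, to pull jQuery from Google instead of your own server (the version number is just illustrative):

<!-- One cacheable request to Google's servers replaces a local copy -->
<script type="text/javascript" src="http://ajax.googleapis.com/ajax/libs/jquery/1.3.2/jquery.min.js"></script>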

You could also look into Amazon S3 for hosting your images. It's used by a lot of websites, small and large, although it will cost a wee bit - around 10p per GB of transfer.
 
Lots of good advice above. I'd add that you should also look at merging separate CSS and JS files together; this reduces the number of HTTP requests per page.
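
As a rough sketch (file names made up), even a simple concatenation step at deploy time does the trick:

# Three stylesheet requests become one; the same idea works for JS
cat reset.css layout.css product.css > all.css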

Other than that if you combine the above ideas with the extra domain for static content, you should be flying.

akakjs
 
Thanks guys. I'm already doing all of the above, except the sprite image thing. I didn't realise you could use it that effectively. I'll look further into it.
 
You could try using the CSS sprite technique.

What happens in the following scenario?

A user on a fairly slow connection visits the homepage of a website and the image requests for that page reference a file named big_sprite.png

Before big_sprite.png completes loading, the user clicks a link which takes him to another page on the same website. This page also references big_sprite.png

The incomplete big_sprite.png is presumably in the cache as subsequent pages start to load.

Does the browser :

a. Resume downloading big_sprite.png?
b. Discard the incomplete big_sprite.png and download afresh?
c. Serve up the incomplete big_sprite.png and leave it at that?

I've mostly observed 'a' when revisiting pages I left before large image files had completed downloading, but are there situations where the result would be the less desirable 'b' or the totally unacceptable 'c'?
 
eXor said:
a. Resume downloading big_sprite.png? b. Discard the incomplete big_sprite.png and download afresh? c. Serve up the incomplete big_sprite.png and leave it at that?
It mostly depends on the caching/expiration-related headers the web server sends.
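
For instance, assuming Apache with mod_expires enabled, something like this tells the browser to keep images for a month without re-asking:

<IfModule mod_expires.c>
    # A far-future expiry lets a cached sprite be reused across pages
    ExpiresActive On
    ExpiresByType image/png "access plus 1 month"
    ExpiresByType image/jpeg "access plus 1 month"
</IfModule>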
 
If you use any popular JavaScript library, use the free Google-hosted API service: http://code.google.com/apis/ajaxlibs/documentation/index.html#AjaxLibraries.

The big benefit with this is that if a client has already visited someone else's page that uses that specific library, they are likely to have the file cached locally, so the browser can reuse it (at most sending a conditional request to check it hasn't changed).

Sorry, all of this should have been combined into one post, but postcount++

Load your page up with the Firebug window or the developer tools in Chrome; they will give you a timeline of the page and the requests that comprise it. You'll find browsers queue requests: if you have 10 resources on the page, the browser will make, say, 4 concurrent requests, then start the next as each one completes. Numbers completely pulled out of my posterior, but the principle is right. Using subdomain(s) for static stuff helps alleviate this, as those connection pools are per host (whether it's per IP or per DNS name I don't know off the top of my head).

One of the optimised web servers for static content is a good choice to run alongside Apache; they typically have a much shorter internal path from request to response than Apache does.

Bah: Just realised this was a necro thread :) Advice still stands.
 
If you went down the separate server/domain route, how would that be handled over SSL? Would you get the annoying pop-up saying not all content is from the SSL domain?
 