"Best Practices" suggestions

Hello,

First of all, the "Best Practices for speeding up your website" rules are a really good collection for improving the user experience, and I agree with most of them, but I think some of them need a bit of discussion. Maybe you have other experiences or other ideas, but that's why I'm writing today.


Use GET for AJAX Requests

Using GET for Ajax coincides with my experience. If you send lots of data using GET, you eventually reach the 2K byte limit in Internet Explorer (as stated on the page). Maybe it is worth sharing the idea that parameters could be added to the URL in a compressed form (this should only be considered if the compression gain outweighs the overhead that e.g. base64 brings with it [133%]).
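
Roughly, such a check could look like this in PHP (the parameter name "z" and the /api endpoint are made up for illustration); the compressed form is only used when it actually ends up shorter than the raw query string:

    // hypothetical parameters; gzdeflate + base64 keeps the payload URL-safe
    $params  = array('q' => 'some rather long and repetitive query string');
    $payload = http_build_query($params);
    $packed  = base64_encode(gzdeflate($payload, 9));

    // only switch to the compressed form if it beats the raw query string
    $url = (strlen($packed) < strlen($payload))
         ? '/api?z=' . urlencode($packed)
         : '/api?' . $payload;

    // server side: $payload = gzinflate(base64_decode($_GET['z']));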


Flush the Buffer Early

The listed point makes clear that additional data can be loaded while calculations and queries are still running on the server. Normally, the server-side processing should be optimized so that it does not take 500ms to make the content available. The biggest problem should be the I/O of transferring the document to the client, not the I/O behind the scenes. If the latter really is the performance bottleneck, you know where you can tweak your site, but sending the header early does not match my experience. After a long time of hacking and benchmarking, I came to the following solution:

ob_start() -> write the entire page to the buffer -> [attention: the following steps are PHP hacks, but should also be possible with a default setup] write the buffer directly to a shared memory segment (tmpfs) -> send the data directly to the client's file descriptor using sendfile(). I will also publish some benchmarks on this topic, but I would like to mention that on most of my setups this approach performed better than the chunked option using flush(), which results in several writev() calls (on Linux 2.6).
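
A minimal sketch of that pipeline, assuming /dev/shm is a tmpfs mount; stock PHP does not expose sendfile(2), so readfile() stands in for it here, and render_page() is a hypothetical stand-in for the page generation:

    ob_start();
    render_page();               // hypothetical: emits the complete document
    $html = ob_get_contents();
    ob_end_clean();

    // stage the finished page in shared memory (tmpfs)
    $tmp = '/dev/shm/page_' . getmypid() . '.html';
    file_put_contents($tmp, $html);

    header('Content-Length: ' . strlen($html));
    readfile($tmp);              // a real sendfile(2) call needs a PHP extension
    unlink($tmp);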


Configure ETags

Maybe you should mention that one can generate the ETags oneself using a scripting language - or not put the focus on Apache as the webserver at all. With my recently published Lua script (link below) I show that this is easily possible and that the problems behind a cluster setup become obsolete.
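
The same idea as a short PHP sketch, assuming $html holds the finished document: deriving the ETag from the content itself instead of inode data keeps it identical on every node of a cluster:

    $etag = '"' . md5($html) . '"';   // content-based, stable across servers
    header('ETag: ' . $etag);

    if (isset($_SERVER['HTTP_IF_NONE_MATCH'])
            && trim($_SERVER['HTTP_IF_NONE_MATCH']) === $etag) {
        header('HTTP/1.1 304 Not Modified');
        exit;
    }
    echo $html;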


Avoid Redirects

Avoiding redirects is a very critical point, and even Yahoo does not follow its own rule in some situations. A reduction to "Avoid redirects for included files" would be more apt, because redirecting the site is common practice in search engine optimization and is also the neatest technical solution for letting the user agent know where the requested document can be found.

In my opinion, it should be allowed to redirect the main document up to three times without this behavior being classified as a performance alarm. Such redirects happen within the regular keep-alive connection, which reduces the problem again. Newer browsers also cache 301 redirects for the current session, because they are permanent, as the name suggests.
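
For reference, such a permanent redirect of the main document looks like this in PHP (the target URL is made up):

    header('HTTP/1.1 301 Moved Permanently');
    header('Location: http://www.example.com/new-location/');
    exit;  // newer browsers may cache this response for the session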


Minify JavaScript and CSS and Minimize HTTP Requests

Besides the YUI Compressor, Google's Closure Compiler could also be mentioned; it achieves really good compression results.

I think it would also be a good idea to give a hint on how it is possible to combine and minimize CSS and JavaScript files in one go. As I already indicated above, I've published an article about solving the problem of combining and minimizing static content using lighttpd. I also linked to some external resources describing how one could implement such a combining/minimizing script with PHP or Apache.
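
A rough PHP sketch of such a combining script (the file list is hypothetical, and the regex "minification" is deliberately naive - a real setup should hand the result to the YUI Compressor or the Closure Compiler):

    $files = array('reset.css', 'layout.css', 'theme.css');  // hypothetical
    $css = '';
    foreach ($files as $f) {
        $css .= file_get_contents($f) . "\n";
    }

    // naive minification: strip comments, collapse whitespace
    $css = preg_replace('!/\*.*?\*/!s', '', $css);
    $css = preg_replace('/\s+/', ' ', $css);

    header('Content-Type: text/css');
    header('ETag: "' . md5($css) . '"');  // ties in with the ETag section above
    echo $css;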

Regards,

Robert Eisele
