Nowadays, even regular Web surfers know some of the things to avoid when designing a Web site for fast performance: Cut the number of requests to the Web server. Shrink JPEG sizes. Employ a content delivery network vendor like Akamai Technologies Inc. or Limelight Networks Inc.
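Of those steps, cutting the number of requests is the simplest to automate. As a minimal sketch (the function name and file layout here are hypothetical, not from Souders' talk), several stylesheets can be concatenated into one bundle so the browser fetches a single file instead of many:

```python
from pathlib import Path

def bundle_css(sources, out_path):
    """Concatenate CSS files so a page loads one bundle, not many files.

    Each file fetched separately costs an HTTP request; serving one
    combined file trades a slightly larger download for fewer round trips.
    """
    bundle = "\n".join(Path(src).read_text() for src in sources)
    Path(out_path).write_text(bundle)
    return out_path
```

The same idea applies to JavaScript files and to image sprites: fewer, larger resources generally beat many small ones.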
The problem, according to Steve Souders, is that steps like these aimed at optimizing the Web server make only a small impact.
"We used to tear apart the Apache [Web server] code to figure out what Yahoo was doing," said Souders, who was Yahoo's chief performance engineer for several years before moving to Google in the same role.
But after performing a detailed analysis, Souders discovered something startling: Only 10 to 20 percent of the time it took to load a Web site could be attributed to the Web server.
The vast majority was the result of code executing inside the Web browser, said Souders at a talk on Tuesday at Microsoft's Tech Ed conference in Los Angeles (download PowerPoint here).
That may have made sense a decade ago, but in today's era of PCs powered by dual- and quad-core CPUs, it doesn't. And the cost of those delays can be high.
Google has found that a 500-millisecond delay results in a 20 percent decrease in Web traffic, while Amazon.com has found that a 100-millisecond delay cuts its sales by 1 percent, Souders said.
Better browsers, better performance
To fix these problems, Souders first recommends a free tool he created called YSlow, which analyzes a Web page and grades how well it is designed for maximum speed. Originally developed for Internet Explorer, YSlow 2.0 is an add-on for Firefox integrated with the Firebug Web development tool. It is downloadable here.