Can We Make Slow Websites Go Faster?

Intechnica recently hosted an event called Faster Websites, aimed at discussing with retailers the means and methods they can adopt to improve the performance of their online presence. In preparation for this event we evaluated the websites of the potential attendees, as well as those of the top 50 leading retail sites in the UK.

As would be expected, there was a wide range of results, from the very fast to the quite slow. So how could these slow websites become faster? Are there any quick fixes to be made?

I had a look into the performance of some of the slower sites to see if there were any quick wins I could propose to improve their speed. I ran a very limited investigation using WebPageTest, under what were (as far as I know) normal traffic conditions, and came up with the following observations.

Most follow general good practice – With very few exceptions, the sites were doing the obvious things (minifying JavaScript, compressing content, using a CDN, etc.). This suggested there was unlikely to be a simple, configuration-based fix for the slowness.

Slowness was caused by client-side, not server-side, issues – None of the sites spent more than 0.5 seconds waiting for a server response, indicating that the servers were not struggling to return content. This is as you would expect for a site homepage that is not under load.
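This split can be measured directly in the browser. A minimal sketch, using the field names of the W3C Navigation Timing API (in a browser you would pass `performance.timing`; the figures below are illustrative, not from the sites tested):

```javascript
// Sketch: split a page load into server wait (time to first byte) and the
// client-side remainder (parsing, rendering, fetching subresources).
function splitLoadTime(timing) {
  return {
    serverWaitMs: timing.responseStart - timing.requestStart, // time to first byte
    clientSideMs: timing.loadEventEnd - timing.responseStart, // everything after
  };
}

// 400 ms of server wait against a 7-second load points firmly at the client side.
const sample = splitLoadTime({ requestStart: 100, responseStart: 500, loadEventEnd: 7500 });
console.log(sample); // { serverWaitMs: 400, clientSideMs: 7000 }
```

In a real page the call would simply be `splitLoadTime(performance.timing)` after the load event has fired.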

Very large page weights, especially JavaScript – A large amount of the slowness came down to simple page weight. All of these sites were requesting well over 100 elements, with some requesting over 200. The largest chunk of this was images, as was to be expected. As these are retail sites, there is an argument that high-quality imagery is to be expected and is essential for business. However, one site was requesting close to 70 images totalling 3.5 MB of data. It would certainly be worth investigating whether these images could be compressed, loaded asynchronously or simply removed.

Of more concern to me, across all these slow-loading sites, was the general size and number of JavaScript files being requested. Sites were requesting over 40 distinct JavaScript files, and total file sizes of 300 KB+ were common, with one site topping 600 KB of JavaScript content. In most cases this JavaScript had already been minified and compressed. In all these cases the use of JavaScript should be fully investigated and rationalised. CSS and even HTML files were similarly large (50 KB+) and could equally be rationalised.
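A quick way to see where page weight is going is to group a waterfall export (e.g. from WebPageTest) by content type. A minimal sketch, assuming a simple `{ type, bytes }` shape per request rather than any real export format:

```javascript
// Sketch: total page weight per content type from a list of requests.
// The request shape ({ type, bytes }) is an assumption for illustration.
function weightByType(requests) {
  const totals = {};
  for (const { type, bytes } of requests) {
    totals[type] = (totals[type] || 0) + bytes;
  }
  return totals;
}

const waterfall = [
  { type: "image", bytes: 150000 },
  { type: "script", bytes: 90000 },
  { type: "image", bytes: 60000 },
  { type: "css", bytes: 50000 },
];
console.log(weightByType(waterfall));
// { image: 210000, script: 90000, css: 50000 }
```

A breakdown like this makes it obvious whether images, scripts or styles deserve attention first.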

Complex DOMs – Most of the slower sites had more complex DOMs, often topping 2,000 elements. This does not necessarily cause a problem in itself, but combined with complex JavaScript and content manipulation it can easily lead to slowdown.
In the examples I tested this showed up in how long the start-up event took for some pages – over 1.5 seconds in one case. A page like that is far too complex and needs rationalising.
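Gauging DOM size is cheap: in a browser, `document.querySelectorAll("*").length` gives the element count directly. As a self-contained sketch, the same count over a stub tree shape (`{ children: [...] }` standing in for real DOM nodes):

```javascript
// Sketch: count elements in a DOM-like tree to gauge page complexity.
// The { children: [...] } node shape is a stand-in for real DOM nodes.
function countElements(node) {
  return 1 + node.children.reduce((sum, child) => sum + countElements(child), 0);
}

// html > (head > title), (body > div, div)
const page = {
  children: [
    { children: [{ children: [] }] },                    // head > title
    { children: [{ children: [] }, { children: [] }] },  // body > two divs
  ],
};
console.log(countElements(page)); // 6
```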

3rd party scripts causing performance issues – A couple of sites had their load times slowed by waiting for third-party content (e.g. from Facebook). In one case this caused a 12-second delay.
As a site owner you really can’t let your performance rest in the hands of third-party content. Aim to make all these calls asynchronous, after page load if possible.
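A common pattern is to inject third-party scripts after the load event rather than referencing them directly in the page source. A minimal sketch (the `doc` parameter stands in for the browser’s `document` so the example is self-contained; the widget URL is made up):

```javascript
// Sketch: inject a third-party script asynchronously so a slow response
// (e.g. from a social widget) cannot block the initial page load.
function loadThirdPartyScript(doc, src) {
  const script = doc.createElement("script");
  script.src = src;
  script.async = true; // do not block HTML parsing while the script downloads
  doc.head.appendChild(script);
  return script;
}

// In a browser, defer the injection until the page itself has loaded:
// window.addEventListener("load", () =>
//   loadThirdPartyScript(document, "https://example.com/widget.js"));
```

With this approach a 12-second third-party delay still hurts the widget, but no longer holds up the rest of the page.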

Any performance assessment should include the performance of these client-side elements. Overall, my impression was that effort on these systems was still focussed mainly on server-side performance, while the client side was generally ignored beyond following standard good practice. A more considered approach could easily (days, not months, of effort) speed these pages up dramatically.

Andy Still (LinkedIn) has over a decade of experience in IT. He specialises in application architecture for cloud infrastructures and has a track record of developing highly complex applications for large volumes of users and data, including systems capable of processing over 100,000 transactions in under a minute. Andy has spoken at several events across the UK on topics ranging from Agile to Performance by Design, including the London Web Performance Group and BCS SIGiST. In 2006, Andy co-founded Intechnica, where he is Technical Director.

As part of the team at Intechnica, Andy works closely with customers on engineering applications for scalability and high performance. This article was previously published at Performance By Design and has been reproduced at Practical Performance Analyst with prior permission. PerformanceByDesign is an online initiative owned by Intechnica, a digital consultancy specialising in performance assurance, cloud services and the development of business-critical web and mobile applications. Intechnica provides services that help customers with Custom Application Development, Performance Engineering of Large Systems and Architecture/Design/Build of Cloud Solutions.
