Tech Tuesday: How To Optimise Your CSS and JavaScript For Fast Front-end Performance

11 April 2017 | Masimba Sagwete | About a 9 minute read
Tags: ANDtech, CSS, front-end, JavaScript, non-functional, performance, Product Analysts, speed, tech

Web performance is important because it affects how enjoyable your website is to use. ‘Performance’ refers to the speed at which web pages are downloaded and displayed in the user’s web browser. It is also now a very strong SEO ranking signal, and there is strong evidence that it can dramatically affect ecommerce conversions. As described on this Kissmetrics blog, in an experiment run by Marissa Mayer when she was a Google VP, delaying page load time by 0.5s reduced traffic by 20%.

Whereas performance used to be about back-end factors like the network, server response time and CPU usage, the advent of cloud services and broadband fibre has brought the battle to the front end. And as well as managing the objective indicators of speed, we also need to manage the perception of speed.

Starting with objective measures, there are 2 key tactics for achieving better performance on the web:

  • Reducing file size i.e. limiting how much the user has to download
  • Limiting the number of HTTP requests the browser has to send for the user to view your content

File size

The average web page is now around 2MB, which is roughly the amount of disk space taken up by a copy of the ground-breaking 1993 video game Doom. As the pipe (bandwidth) grows, we tend to fill it up by sending more through it. The current BBC homepage is a 1.2MB download: fetching it on fibre today takes about as long as the old, mostly-text version, which amounted to no more than a few tens of KB, took on 56k dial-up.


If you aren’t YouTube, the primary consumer of bandwidth on your website will be images. To see this in action, compare the AND Digital homepage, which at 2.1MB is over 5x smaller than the 11.1MB behemoth that is our Club Ada page, which has a photo for each of the 80 of us.

Club Ada Page and Homepage comparison

There are several things you can do to limit their size:

    1. Use the right file format i.e. don’t do what the BBC did and use the .jpg file extension for a 5MB uncompressed bitmap of their main website logo
    2. Use HTTP compression for all uncompressed file formats (TIFF, BMP)
    3. Get rid of useless file information including thumbnails and EXIF data, which can be done with tools like pngcrush, jpegtran, gif2png and gifsicle that automatically reduce the file size without any perceptible loss in quality
    4. Use the more efficient ‘deflate’ compression instead of LZW compression

Outside of images, serve all files that don’t have native compression using HTTP compression. This includes files such as:

  1. HTC (polyfills for older IE versions)
  2. RSS feeds – they’re just text!
  3. XML, JSON, robots.txt; also text!
  4. Fonts – SVG (a fallback font format, especially on older mobile browsers; SVGs are XML documents so also… text)
  5. ICO (the favicon, which is essentially a bitmap)

It may not seem like it would make a big difference, but compressing all of the HTML, JS and CSS needed for the AND Digital homepage means the user only has to download 300KB instead of 1.2MB, which could easily be the difference between a bounce and a conversion, especially on a shaky mobile network.

Number of requests

With bandwidth being less of a factor – especially on the desktop – latency has become more important. Browsers spend a lot of time waiting for responses to their requests: the delay from the West coast to the East coast of the US is around 80ms (light in fibre travels at about 66% of the speed of light in a vacuum), and this adds up serially as you increase the number of files, i.e. ten 10KB requests will always be slower to download than one 100KB request. Bigger files also contain more redundant data and are more compressible as a result. There are several ways to reduce the request count, including:
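The serial cost of latency can be shown with some back-of-envelope arithmetic. The 80ms round trip and 1ms-per-KB transfer rate below are illustrative assumptions, not measurements, but the shape of the result holds for any numbers: each extra request pays the latency penalty again.

```javascript
// Why ten small requests cost more than one big one when fetched serially.
const LATENCY_MS = 80;        // assumed round-trip delay per request
const TRANSFER_MS_PER_KB = 1; // assumed transfer cost once data is flowing

function serialFetchMs(requestCount, totalKB) {
  // Every request pays the latency penalty; the payload cost is shared.
  return requestCount * LATENCY_MS + totalKB * TRANSFER_MS_PER_KB;
}

console.log(serialFetchMs(10, 100)); // ten 10KB files: 900ms
console.log(serialFetchMs(1, 100));  // one 100KB file: 180ms
```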

  1. Batch requests so you can request one big file rather than multiple small ones e.g. aggregate your CSS and JS
  2. Concatenate scripts using automated build tools i.e. combine your JS into one file and your CSS into another
  3. Reduce redirect chains by using URL rewrites instead e.g. for http vs https, non-www to www and/or trailing slashes
  4. Use CSS sprites to combine all your background images into one big image, and detect unused background images

Set expiration headers on cacheable assets so the browser can reuse files without re-requesting them, and support conditional GET so that when the browser does check, an unchanged file costs only a lightweight 304 Not Modified response rather than a full download.


More requests means more waiting time

A lot of these problems will be solved when HTTP/2 goes mainstream because of 2 key features:

  1. Request multiplexing i.e. the browser will be able to send multiple requests over a single connection without having to wait for each one to be fulfilled
  2. Server push i.e. the server will be able to push assets before they are requested by the client

Perceived performance

Once you’ve tackled objective measures of performance, you need to tackle the perception of speed, or how quick your app ‘feels’ to a user. In his 1993 book ‘Usability Engineering’, Jakob Nielsen describes 3 response-time thresholds the user perceives:

  1. 100ms – which feels instantaneous
  2. 1s – which is acceptable but feels detectably slow
  3. 10s – beyond this, your application won’t keep the user’s attention

There are 3 simple things you can do to make your site feel faster without drastic re-engineering.

  1. Progressive rendering of images – let the user see a lower quality version of an image while the higher quality version is downloading, then swap in the full quality version when it’s available. Progressive JPEGs (PJPEGs) have been around for years but, according to Facebook, mobile apps have been slow on the uptake. PJPEGs encode multiple, progressively higher quality scans of an image which update as they stream in real time. Being able to see an image immediately, even at low quality, gives the user an impression of speed and responsiveness. Google goes one step further and pre-renders the image’s dominant colour before the image even loads…
  2. Animate transitions – Google’s material design guidelines describe animations as ‘distraction from what’s happening behind the scenes’. If you know your content will take longer than a second to load, animate the transition between states so the user focuses on the animation rather than the fact you’ve kept them waiting
  3. Show real progress – a loading screen with a spinner that runs indefinitely may cause the user to lose patience during long operations e.g. uploading files. A progress bar with real-time feedback on progress is a bit more complex to implement but far more reassuring for the user
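The progress-bar idea needs little more than a function mapping bytes loaded to a percentage. The sketch below is framework-free; the DOM usage in the comment is illustrative of how it might be wired up in the browser.

```javascript
// Map upload progress (bytes loaded / total) to a clamped percentage
// that a progress bar's width can track in real time.
function progressPercent(loaded, total) {
  if (!total) return 0; // unknown total: fall back to an indeterminate bar
  const pct = Math.round((loaded / total) * 100);
  return Math.min(100, Math.max(0, pct));
}

// In the browser this might drive, say:
//   bar.style.width = progressPercent(event.loaded, event.total) + '%';
console.log(progressPercent(512, 2048)); // 25
```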

For more of these, this Treehouse blog has comprehensive advice on areas where you can find improvements.

A disciplined approach to managing performance

While the transition from HTTP/1.1 is underway, there are several habits which will help you stay on top of performance, namely:

  • Set a performance budget with targets for key metrics including pageload time, time to first pixel and average page download size
  • Benchmark regularly to track how you’re doing against your budget and where possible, make running these tests part of your integration pipeline
  • Automate as much as you can, including concatenation tools for your CSS/JavaScript build
  • Use a CDN which optimises images on the fly and reduces latency by locating servers close to the user e.g. Cloudflare
  • Make sure you have uniform coverage across the site rather than only in key places like your homepage
  • When you’ve tackled these, make sure you also address issues with perception
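A budget check of this kind is easy to automate. The metric names and thresholds below are made-up examples; in a CI pipeline the numbers would come from whichever benchmarking tool you run against your pages.

```javascript
// Compare a benchmark run against the performance budget and list breaches.
const budget = { pageLoadMs: 2000, firstPixelMs: 500, pageSizeKB: 800 };

function breaches(metrics) {
  return Object.keys(budget).filter(key => metrics[key] > budget[key]);
}

const latestRun = { pageLoadMs: 2400, firstPixelMs: 420, pageSizeKB: 780 };
console.log(breaches(latestRun)); // a non-empty list should fail the build
```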

Overall, poor performance is software not behaving as intended, so it should be treated like any other bug i.e. test early, test often and triage so you know which issue to work on next. Performance shouldn’t be the last thing on the checklist before releasing to production; it should be baked into your development process.

Find out more about web performance in this great 60-minute YouTube video “Optimizing front end web performance like a rockstar” by Billy Hoffman. And for more on perceived performance, check out this highly-rated talk from Spotify’s Tobias Ahlin on animation.
