Serving Images at the Right Time

People try their best to keep users on their website with quality content and beautiful images, but beautiful images usually make the website slow, slow, and very SLOW. They take time to download, and we want the site to be fast enough that people actually see the pictures before they bounce. Here are some of the best practices I have used to tackle this problem.

Lazy Loading

Hopefully at this point you haven’t grown sick of this word from googling “Web Optimization”, but it is one of the most proven methods of keeping the user from downloading 5 MB before seeing anything. The basic idea is to load only the images that are actually in the viewport; once those are done, you can start preloading the images below them if you’d like. This makes sure network priority goes to the photos that matter to the user at the moment.

You can achieve this with a library called lazysizes. What it essentially does is detect whether an image is in the viewport, then download it when it is.

<img data-src="image.jpg" class="lazyload" />
<img data-srcset="responsive-image1.jpg 1x, responsive-image2.jpg 2x" class="lazyload" />

You can use it on an img tag, an img tag with srcset (so the browser downloads the best version of the image), or a picture tag with multiple sources.
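For the picture-tag case, a sketch looks like this (the file names are placeholders; lazysizes reads data-srcset and data-src instead of the native attributes, so nothing downloads until the element nears the viewport):

```html
<!-- lazysizes with a picture element: data-srcset on each source,
     and the lazyload class on the inner img -->
<picture>
  <source data-srcset="image.webp" type="image/webp" />
  <source data-srcset="image.jpg" type="image/jpeg" />
  <img data-src="image.jpg" class="lazyload" alt="A beautiful photo" />
</picture>
```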

Or you can use the platform. You can add the new loading attribute to your images so the browser decides when to load them. There are 3 values:

* lazy: check the viewport, then decide whether the image needs to load
* eager: just load everything right away
* auto: let the browser decide

Currently only Chrome supports this, but we may start seeing support in other browsers in the future.
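Using it is a one-attribute change (image.jpg is a placeholder file name):

```html
<!-- Defer loading until the image approaches the viewport -->
<img src="image.jpg" loading="lazy" alt="A beautiful photo" />
```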

An alternative to any of these is to write it yourself. You can utilize Intersection Observer to detect whether an image is in the viewport. The observer calls your callback when its criteria are met, such as 20% of the element being visible. You can write a generic image component in your code that implements this observer and controls the image. Remember to include a polyfill if you need to support older browsers.
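Here is a minimal hand-rolled sketch of that idea. IntersectionObserver is the standard browser API; loadImage is our own helper name, not a library function.

```javascript
// Copy the image's data-src into src, which triggers the actual download.
function loadImage(img) {
  if (img.dataset && img.dataset.src) {
    img.src = img.dataset.src;
    delete img.dataset.src; // make sure we only trigger the download once
  }
  return img;
}

// Browser wiring, guarded so the sketch stays self-contained elsewhere:
if (typeof IntersectionObserver !== "undefined") {
  const observer = new IntersectionObserver((entries, obs) => {
    for (const entry of entries) {
      if (entry.isIntersecting) {
        loadImage(entry.target);
        obs.unobserve(entry.target); // stop watching once it has loaded
      }
    }
  }, { threshold: 0.2 }); // fire when 20% of the element is visible

  document.querySelectorAll("img[data-src]").forEach((img) => observer.observe(img));
}
```

Your generic image component would wrap this pattern, observing each image it renders and unobserving on load.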

Serving WebP

WebP is a newer image format that compresses better while still maintaining most of the image quality. In testing, WebP images come out at least 25% smaller than their JPEG equivalents. You can use a service like imgix that lets you process your images on the fly with just a few parameters in the image URL, and that includes serving the WebP format.
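With an image CDN like imgix, switching formats is just a query parameter. In this sketch the domain and file name are placeholders, and fm and q are imgix's output-format and quality parameters:

```html
<picture>
  <source srcset="https://example.imgix.net/photo.jpg?fm=webp&q=75" type="image/webp" />
  <!-- Browsers without WebP support fall back to the JPEG -->
  <img src="https://example.imgix.net/photo.jpg?q=75" alt="A beautiful photo" />
</picture>
```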

One observation: colors start deteriorating in WebP when you turn the quality down too much, and this is especially visible on a big high-res monitor. So remember to play around with the quality and size to find the best parameters for your images.

Low Res to High Res

To make it look like the images load instantly and to preserve space, you can first download the image at a lower resolution, which is fast to load, and then download the bigger one later. The speed people feel, also known as perceived speed, will be fast even though the full images haven’t finished downloading yet. You can inline these low-res images as base64 strings so they arrive even faster than an actual image request would.
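A minimal sketch of the swap, assuming the low-res placeholder is already in src; swapWhenLoaded and createImage are our own names, not a library API:

```javascript
// Start with a low-res placeholder in `src`, then swap in the
// full-resolution file once the browser has finished downloading it.
function swapWhenLoaded(img, fullSrc, createImage) {
  const loader = createImage();                 // in the browser: () => new Image()
  loader.onload = () => { img.src = fullSrc; }; // swap only when the download is done
  loader.src = fullSrc;                         // kicks off the download
  return loader;
}

// Browser usage:
// swapWhenLoaded(document.querySelector("img"), "photo-full.jpg", () => new Image());
```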

Bonus: Keep the Space

One of the pain points I have with lazy loading is that the images aren’t in the DOM until JavaScript decides a particular image can be downloaded, so the designated space for that image isn’t reserved until it loads. This makes the DOM elements jump around, and that is not a pleasant experience for your end users.

If you somehow know the height, width, or aspect ratio beforehand, you can calculate how much space to reserve for these images. That way your elements won’t jump around, and libraries that rely on actual DOM dimensions won’t break. For example, if you have a slider component that displays a carousel of beautiful images but needs to lazy load them, preserving the space will keep the component from getting confused.
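One common way to reserve that space is the padding-top trick: a wrapper element with percentage padding keeps its height proportional to its width, matching the image's aspect ratio before it loads. aspectRatioPadding is our own helper name:

```javascript
// Given the image's intrinsic dimensions, compute the percentage
// padding-top that reserves space at the same aspect ratio.
function aspectRatioPadding(width, height) {
  return (height / width) * 100; // e.g. 16:9 -> 56.25
}

// Browser usage (assumes a wrapper div around each lazy-loaded image):
// wrapper.style.paddingTop = aspectRatioPadding(1600, 900) + "%"; // "56.25%"
```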

Secondly, you can utilize skeleton loading both to signal that the page is loading and to preserve space even without knowing the actual size. A skeleton gives a rough outline of what the final layout looks like, plus a loading state: for example, a placeholder avatar and placeholder post text, so that when the data finishes loading it feels like it was always there. Many tests have shown that people perceive a page with skeleton loading as faster than one showing a blank page, even though both load at the same speed. This still needs to be tested for your case, because it isn’t suitable for everybody. If your website is already well optimized and really fast, showing a loading state like this may give the impression that the site is slow, impacting perceived speed negatively.

If you liked what you read, please share it on Twitter and let me know what you think. Thank you!