Jared Tong ― Web Developer

Why do some pictures load top to bottom and others go from blurry to clear?

June 2020

Loading Images on the Web, explained

As a full-stack software engineer, I was inspired by this ELI5 Reddit thread that asked:

Why do some pictures online load top to bottom (line of pixels line by line) and others go from blurry to clear?

Let’s try and answer that question!

Technique 1: Progressive Image Encoding

The first reason you may observe this behaviour is that it's inherent to how the image is encoded.

This encoding determines the order in which image information is decoded and displayed by the web browser.

Progressive encoding starts with the equivalent of an image at very low quality, and with each progressive level of information downloaded, adds to the decoded image already displayed. To the user, this looks like a series of scans of increasing quality, like going from a blur to a clear image.
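To make the difference concrete, here's a minimal Node.js sketch that tells the two encodings apart by their JPEG markers: baseline files carry an SOF0 marker (bytes 0xFF 0xC0), while progressive files carry SOF2 (0xFF 0xC2). The file path is a placeholder, and a real parser would walk segment lengths rather than scan linearly:

// Naive scan of a JPEG's bytes for its start-of-frame marker.
// SOF0 (0xFF 0xC0) means baseline; SOF2 (0xFF 0xC2) means progressive.
const fs = require('fs')

function isProgressiveJpeg(path) {
  const buf = fs.readFileSync(path)
  for (let i = 0; i < buf.length - 1; i++) {
    if (buf[i] !== 0xff) continue
    if (buf[i + 1] === 0xc0) return false // SOF0: baseline DCT
    if (buf[i + 1] === 0xc2) return true  // SOF2: progressive DCT
  }
  return false
}

console.log(isProgressiveJpeg('photo.jpg') ? 'progressive' : 'baseline')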

Let’s observe this difference between two kinds of JPEGs, baseline (top to bottom) and progressive (series of scans of increasing quality):

One obvious advantage of this method is that users immediately get a preview of the final image. However, it's not always obvious when the image has fully loaded, which may lead users to think your images are low quality.

You may have noticed that the progressive image on the right still begins by loading from top to bottom. You can add another process, called interlacing, to ensure that an entire low-resolution layer of the image is loaded immediately. In brief, interlacing works by storing layers of pixels sampled at certain intervals, then filling in the rest of the image based on that information.
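To make "layers of pixels at certain intervals" concrete, here's a toy sketch (a one-dimensional simplification, not any real codec's exact pass pattern) that assigns image rows to four passes, so each pass spans the full image height and later passes just fill in the gaps:

// Toy interlacing sketch: pass 1 samples every 8th row, and each later
// pass samples rows halfway between the rows already covered.
function rowsForPass(pass, height) {
  const step = 8 >> (pass - 1)             // pass 1..4 → every 8th, 4th, 2nd, 1st row
  const offset = pass === 1 ? 0 : step     // later passes start between existing rows
  const stride = pass === 1 ? 8 : step * 2 // skip rows covered by earlier passes
  const rows = []
  for (let y = offset; y < height; y += stride) rows.push(y)
  return rows
}

// rowsForPass(1, 16) → [0, 8]          rowsForPass(2, 16) → [4, 12]
// rowsForPass(3, 16) → [2, 6, 10, 14]  rowsForPass(4, 16) → [1, 3, 5, ..., 15]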

How does an image get encoded as progressive?

  • Creators can export the image as a progressive JPEG.

    • Image editing programs like Photoshop and Sketch have options to export JPEGs as progressive. You can also do this in code, as shown in the sketch after this list.

  • CDNs and image hosting sites can optimize your images for you when you upload to them, for example by re-encoding your baseline JPEGs as progressive.

  • What about social media sites you upload to? You can bet your images are re-encoded there too.

    • Twitter, Yelp, and Facebook are just some of the companies that have publicly announced that they use progressive JPEGs.
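If you manage your own build pipeline, image libraries make the re-encoding a one-liner. Here's a minimal sketch using the sharp Node.js library; the file names and quality value are placeholders:

// Re-encode a baseline JPEG as progressive with the sharp library
// (npm install sharp). File names and quality here are placeholders.
const sharp = require('sharp')

sharp('photo-baseline.jpg')
  .jpeg({ progressive: true, quality: 80 })
  .toFile('photo-progressive.jpg')
  .then(info => console.log(`wrote ${info.size} bytes`))
  .catch(console.error)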


Technique 2: Low Quality Image Placeholder

AKA: Swapping a low resolution image for the full-size image once it's fully downloaded.

For web apps that really want to control the entire user experience, it is not sufficient to rely on progressive JPEGs and default browser behaviour. For instance, an app may want to delay loading a full-size thumbnail until the user scrolls to it, but still pre-fetch a low-resolution version so that something displays the moment the user gets there, for a snappier experience. Or, developers may have switched to WebP instead, which does not support progressive encoding but compresses better.
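Scroll-based loading like that is typically built on the browser's IntersectionObserver API. Here's a minimal sketch; the data-full-src attribute and selector are made-up conventions for illustration:

// Assume markup like: <img src="tiny-placeholder.jpg" data-full-src="full.jpg">
// The tiny placeholder loads immediately; the full-size image is only
// requested once the element actually scrolls into the viewport.
const lazyImages = document.querySelectorAll('img[data-full-src]')

const observer = new IntersectionObserver((entries, obs) => {
  for (const entry of entries) {
    if (!entry.isIntersecting) continue
    const img = entry.target
    img.src = img.dataset.fullSrc // kick off the full-size download
    obs.unobserve(img)            // each image only needs to load once
  }
})

lazyImages.forEach(img => observer.observe(img))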

So, a common best practice is to replicate the effect of progressive images with code. This means that the low-resolution image finishes downloading first and is displayed to the user. Only when the full image is downloaded will the code swap the high resolution image in.

Example of the blur-up technique. Image credit: Facebook Engineering

A benefit of this method, besides the obvious effect that the site looks fully loaded earlier, is that browsers learn the image dimensions from the placeholders early, so there is no layout recalculation when the full images arrive. Otherwise, content would jump around while full images load.
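One common way to reserve that space is an intrinsic-ratio wrapper, similar in spirit to what gatsby-image renders. Here's a simplified JSX sketch, where the component name and dimensions are illustrative:

// Reserve the image's space before it loads: pad a wrapper to the image's
// aspect ratio so late-arriving pixels never shift the layout around.
function RatioBox({ width, height, children }) {
  return (
    <div style={{ position: 'relative', paddingBottom: `${(height / width) * 100}%` }}>
      <div style={{ position: 'absolute', top: 0, left: 0, width: '100%', height: '100%' }}>
        {children}
      </div>
    </div>
  )
}

// Usage: <RatioBox width={1600} height={900}><img src="photo.jpg" /></RatioBox>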

How does a low-resolution image get generated and swapped in? Often, at build time or when your image is uploaded, a low-resolution version is created from the full-size image. At run time, an event listener waits for the full-size image to finish loading, then swaps the two. A CSS transition handles the smooth opacity fade.
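For example, the build-time half could look like this with the sharp library; the path, width, and quality are placeholders, and the resulting data URI is the kind of inline base64 placeholder mentioned below:

// Build time: shrink the original to a ~20px-wide thumbnail and inline it
// as a base64 data URI, so the placeholder costs no extra HTTP request.
const sharp = require('sharp')

async function makePlaceholder(inputPath) {
  const buf = await sharp(inputPath)
    .resize(20)               // tiny width; height follows the aspect ratio
    .jpeg({ quality: 50 })
    .toBuffer()
  return `data:image/jpeg;base64,${buf.toString('base64')}`
}

// e.g. set the placeholder <img src={...}> to this data URI at build time
makePlaceholder('photo.jpg').then(uri => console.log(uri.slice(0, 60) + '...'))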

There’s even a variant that generates an outline of your image as an SVG instead.

This technique introduces the cost of one additional request to load the low-res image, among other overheads. It's also more costly if you have to make a separate request to get the image URLs before you can load the images themselves.

  • Facebook optimized this away by encoding the placeholder in a form that could be returned by the existing API request that fetches the URL of the full-size image, reducing the potential number of API calls from 3 to 2.

  • The static site generator Gatsby, via gatsby-image, generates the low-res image as a base64 string, so there's no additional HTTP request.

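Putting it all together, here's a simplified React component, in the style of gatsby-image, that renders the inlined placeholder immediately and fades the full-size image in once it has loaded: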

import React from 'react'

class Image extends React.Component {
  constructor(props) {
    super(props)
    this.state = {
      imgLoaded: false
    }
    // Bind so the handler can call setState when passed to onLoad
    this.handleImageLoaded = this.handleImageLoaded.bind(this)
  }

  handleImageLoaded() {
    this.setState({ imgLoaded: true })
  }

  render() {
    const { image } = this.props
    // Stand-in for visibility detection, e.g. via IntersectionObserver
    const isVisible = checkIntersectionObserver()

    // The full-size image fades in once it has finished downloading
    const imageStyle = {
      opacity: this.state.imgLoaded ? 1 : 0,
      transition: this.state.imgLoaded ? `opacity 300ms` : `none`
    }
    // The placeholder fades out after the full-size image has faded in
    const imagePlaceholderStyle = {
      opacity: this.state.imgLoaded ? 0 : 1,
      transition: `opacity 300ms`,
      transitionDelay: `300ms`
    }

    return (
      <>
        <img style={imagePlaceholderStyle} src={image.base64} />
        {isVisible && (
          <picture>
            <img
              srcSet={image.srcSet}
              onLoad={this.handleImageLoaded}
              style={imageStyle}
            />
          </picture>
        )}
      </>
    )
  }
}

Conclusion

Both techniques work alongside each other to enhance the user experience. Even something as simple as an image blurring up has a lot of work behind it, and we haven't covered any of the underlying algorithms yet. Let me know if you're interested!