Front End Optimization

In the context of web services, the term “front end” refers to the interaction between your website and a visitor’s browser. Front end optimization (FEO), also known as content optimization, is the process of fine-tuning your website to make it more browser-friendly and quicker to load.

Broadly speaking, FEO focuses on reducing file sizes and minimizing the number of requests needed for a given page to load.

During the FEO process, web designers draw a distinction between the perceived and the actual page load time. Perceived load time is considered because of its impact on the overall user experience (UX), while the actual load time is often used as a performance benchmark metric.

Content delivery networks (CDNs) play an important role in front end optimization, as they are commonly used to streamline many of the more time-consuming optimization tasks. For example, a typical CDN offers automatic file compression and minification features, freeing you from having to manually tinker with individual website resources.


Fact: Front end delays account for up to 80% of your website’s response time.

Case in Point: Time to First Byte (TTFB)

Time to first byte, often used to measure a website’s response time, is one of the most important—as well as one of the most misunderstood—performance metrics.

From an actual load time perspective, TTFB is the time it takes for the first byte of data to arrive from the server at the requesting browser. From a perceived load time perspective, however, TTFB is the time it takes for the browser to begin parsing that first byte of the downloaded HTML file.

Only perceived TTFB impacts user experience, making it the more valuable of the two metrics.

Time to First Byte - TTFB

Reducing HTTP Requests

When loading a web page over HTTP/1.x, a browser typically has to open a separate TCP connection for each HTTP request it makes, and the number of requests equals the number of page elements it’s required to download.

The problem is that there is a limit to the number of concurrent connections a browser can open to a single host. This limit exists to protect a server from being overloaded with a high number of HTTP requests. However, it also serves as a potential bottleneck, often forcing the browser to start queuing connection requests.

As the maximum connection threshold is quickly reached, various FEO techniques are employed to minimize the number of individual page elements. One of the most common is resource consolidation—the practice of bundling together multiple smaller files.


For Example…

Say your website template consists of one HTML file, two CSS files and 16 images—including your logo and various menu backgrounds. In total, a browser needs to make 19 HTTP requests to load an empty page on your site.

A visitor using the Google Chrome browser can only open six TCP connections to your server at once, so the browser has to queue the remaining 13 requests.

However, if you consolidate all of the template images into a single sprite image, you can reduce the number of requests from 19 to just four.

Not only does this let Chrome parse the page in one “sitting,” but it reduces the number of round trips needed to load the page.
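To make the arithmetic concrete, here is a small Python sketch. It assumes the simplified model above, where downloads happen in fixed rounds of six parallel connections; real browsers reuse connections as they free up, so treat this as an illustration, not a measurement:

```python
import math

def connection_rounds(num_requests: int, max_connections: int = 6) -> int:
    """Rounds of parallel downloads a browser needs when it can open
    at most `max_connections` concurrent TCP connections to one host."""
    return math.ceil(num_requests / max_connections)

# Before consolidation: 1 HTML + 2 CSS + 16 images = 19 requests
print(connection_rounds(19))  # 4 rounds of downloads

# After consolidation: 1 HTML + 2 CSS + 1 sprite = 4 requests
print(connection_rounds(4))   # 1 round: the page loads in one "sitting"
```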

Example: Reduce HTTP Requests With Sprite Image

The CDN Factor

CDNs can further reduce server response time by pre-pooling connections and making certain that they remain open throughout a session.

While CDNs don’t reduce the number of requests per se, pre-pooling improves performance by eliminating delay times associated with closing and reopening TCP connections.

Note: The HTTP/2 protocol, still in the early adoption stage, introduces multiplexing—a connection method that permits multiple requests and responses to be sent via a single TCP connection.

In the near future, this may minimize the benefit of resource bundling.

File Compression

Every one of your website pages is built from a collection of HTML, JavaScript, CSS and (possibly) other code files. The more complex the page, the larger the code files and the longer the load time.

With file compression, these files can be shrunk to a fraction of their original size to improve site responsiveness. Preferred for its quick encoding/decoding times and high compression rates, gzip is the most popular file compression choice. It can shrink a code file by as much as 60 or even 80 percent.
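As an illustration, Python’s standard `gzip` module can demonstrate the effect. The sample stylesheet here is deliberately repetitive, so the exact ratio overstates what a typical real-world file achieves:

```python
import gzip

# A repetitive CSS-like payload; real code files also compress well
# because they contain many repeated keywords and patterns.
css = ".menu { color: #333; margin: 0; padding: 0; }\n" * 200

compressed = gzip.compress(css.encode("utf-8"))
ratio = 1 - len(compressed) / len(css)
print(f"{len(css)} -> {len(compressed)} bytes ({ratio:.0%} smaller)")
```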

Note: Gzipping is not effective at reducing image file sizes, as image formats are already compressed.

Gzip is often paired with the tar utility, which consolidates multiple files into a single archive (a.k.a. a tarball) before compression. The downside of this pairing is that individual files can’t be extracted without decompressing the entire archive. This isn’t an issue for web content, however, as it has to be decompressed anyway for the entire page to load properly.

Gzip File Compression

The CDN Factor

Nearly all CDNs provide automated file compression, seamlessly gzipping all compressible code files (e.g., CSS and JS files) before serving them to website visitors.

Cache Optimization

HTTP cache headers play an important role in the way browsers parse a website, for they determine which content items are cached and for how long.

Caching means storing your static files, which tend to be your largest ones, outside of your server, either on visitors’ local drives or on a nearby CDN PoP. This can vastly improve your website’s load speed.

The downside is that manual cache header management can be a tedious and inefficient task. Moreover, caching mechanisms often run into issues when handling dynamically generated content created on-the-fly as a page begins to load (e.g., AJAX objects and even dynamically generated HTML files).
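A minimal sketch of what per-file-type cache policy selection might look like on the server side; the extension map and max-age values are illustrative assumptions, not a standard:

```python
import os

# Illustrative policies: static code and images get long-lived caching,
# the HTML page itself is always revalidated.
CACHE_POLICIES = {
    ".css":  "public, max-age=604800",   # static code: cache for a week
    ".js":   "public, max-age=604800",
    ".png":  "public, max-age=5184000",  # images: cache for 60 days
    ".jpg":  "public, max-age=5184000",
    ".html": "no-cache",                 # revalidate the page on each visit
}

def cache_header(path: str) -> str:
    """Pick a Cache-Control value based on the requested file's extension."""
    ext = os.path.splitext(path)[1].lower()
    # Dynamic or unknown content defaults to not being stored at all
    return CACHE_POLICIES.get(ext, "no-store")

print(cache_header("/blog/logo.png"))  # public, max-age=5184000
print(cache_header("/api/search"))     # no-store
```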


The CDN Factor

Many CDNs offer cache control options, usually by way of a user-friendly dashboard. With it, they allow you to set site-wide policies, manage caching rules for individual items and even set policies for entire file groups based on criteria such as file type and location (e.g., always cache all images in the “/blog/” folder for 60 days).

CDNs have also begun integrating machine learning techniques. These follow content usage patterns to automatically optimize caching policies, thereby caching the typically “uncacheable” dynamic content. This relieves you of nearly all cache management tasks.

Code Minification

Minification is an FEO process that recognizes the difference between how developers write code and how machines read it.

The idea is that, while developers write code for easy reading comprehension, with spaces, line breaks and comments, machines can read it without any of these elements, making them nonessential characters.

Minification trims code to its barest essentials, often reducing it by half before compression.
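A toy minifier in Python illustrates the idea; production tools (e.g., UglifyJS or cssnano) apply many more transformations:

```python
import re

def minify_css(source: str) -> str:
    """Toy CSS minifier: strips comments, line breaks and extra spaces.
    Only sketches the idea of removing characters machines don't need."""
    source = re.sub(r"/\*.*?\*/", "", source, flags=re.S)  # drop comments
    source = re.sub(r"\s+", " ", source)                   # collapse whitespace
    source = re.sub(r"\s*([{};:,])\s*", r"\1", source)     # tighten punctuation
    return source.strip()

css = """
/* main navigation */
.menu {
    color: #333;
    margin: 0;
}
"""
print(minify_css(css))  # .menu{color:#333;margin:0;}
```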

Before and After Minification

Before minification: 201 characters. After minification: 137 characters, a file size decrease of over 30%.

The CDN Factor

CDNs have the capacity to completely automate code minification. As on-edge services that already serve much of a site’s content, CDNs can easily minify all JavaScript, HTML and CSS files on the fly, as the files are being sent to visitors’ browsers.

Gzip AND Minify:

While minifying and gzipping code may seem redundant, combining both methods offers the best results.

Minifying files before you gzip them shrinks the compressed output by an additional 5 to 10 percent.
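A quick sketch of why the combination helps: comments and formatting add content that gzip still has to encode, so stripping them first yields a smaller compressed file. The stylesheet and toy minifier below are illustrative, and exact savings vary by file:

```python
import gzip
import re

# Illustrative stylesheet: 200 rules, each with a comment and indentation.
css = "".join(
    f"/* navigation item {i}, revision {i * 7} */\n"
    f".item-{i} {{\n    color: #333;\n    margin: {i}px;\n}}\n"
    for i in range(200)
)

def minify(source: str) -> str:
    """Toy minifier: drop comments, then all whitespace."""
    source = re.sub(r"/\*.*?\*/", "", source, flags=re.S)
    return re.sub(r"\s+", "", source)

gz_raw = gzip.compress(css.encode())
gz_min = gzip.compress(minify(css).encode())
print(len(gz_raw), len(gz_min))  # minify-then-gzip produces the smaller file
```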

Image Optimization

Caching and compression are the two most common image optimization methods, with caching being the more effective of the two. This is because, unlike code files, common image formats are already compressed.

As a result, to further reduce an image’s file size, you have to tamper with the image’s data, either by removing some of its header information or by reducing its original quality. The latter is known as lossy compression.


Note: While discarding data and diminishing resolution is often inadvisable, lossy compression can be useful for some high-resolution images, because our eyes can’t naturally perceive the full range of visual information such images hold.

For example, lossy compression can remove color gradations and reduce pixel complexity without noticeably affecting our perception of the image.
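As a simplified stand-in for what lossy encoders actually do (JPEG, for instance, quantizes frequency coefficients rather than raw pixel values), the following sketch snaps a color channel to fewer levels, discarding gradations the eye rarely notices:

```python
def quantize(channel: int, levels: int = 16) -> int:
    """Toy lossy step: snap an 8-bit color channel value (0-255)
    to one of `levels` coarser values."""
    step = 256 // levels
    return (channel // step) * step

# A smooth 256-step gradient collapses to just 16 distinct values
gradient = list(range(256))
coarse = [quantize(c) for c in gradient]
print(len(set(gradient)), "->", len(set(coarse)))  # 256 -> 16
```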

The CDN Factor

CDNs, the go-to solution for image caching, are often purchased for that purpose alone. Furthermore, some CDNs also help automate the process of image compression, letting you choose between page load speed and image quality.

More advanced CDNs also offer a progressive rendering option, putting a twist on the original lossy compression concept. With progressive rendering, the CDN quickly loads a pixelated version of the image, then replaces it with a series of progressively better-looking variants until the full-quality image is ready.

Progressive rendering is useful for its ability to diminish perceived load time without sacrificing image quality.

Progressive Rendering in Progress

Vector and Raster Images

Another image optimization technique is to replace some of your regular (raster) images with their vector counterparts.

This technique is applicable to images composed of simple geometric shapes: lines, curves, polygons etc. A typical vector image is an icon or a diagram.

You should use vector images whenever you can, because:

  • They are very small in size, as they only need to hold data for a set of coordinates—not for each individual pixel.
  • Being resolution-independent, they can be scaled up and down indefinitely without any impact on quality. This makes them perfect for responsive designs.
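To illustrate why vector files are so small, here is a minimal SVG icon; the shape and coordinates are illustrative:

```xml
<!-- A simple arrow icon: a few coordinates describe the whole image,
     and it scales to any resolution without quality loss. -->
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 24 24" width="24" height="24">
  <polygon points="4,12 14,12 14,6 22,12 14,18 14,12" fill="#333"/>
</svg>
```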