Site speed – a guide for beginners

What if we looked at a site’s speed not in terms of points scored on a PageSpeed test, but in terms of what realistically happens to our site from the moment it’s put online until it’s displayed to the user?

I think that everything would then become more logical, and the elements that affect the speed of the page would cease to be black magic. And that’s what the following article is meant to be – a logical explanation of the various elements of site speed with beginners and intermediates in mind.

This guide was written with beginners in mind and uses the simplest possible vocabulary. Of necessity, it does not cover unusual situations or exotic programming environments.

For starters: a map

Let’s take a look at a map of the most important elements affecting page speed…

… And then let’s analyze it step by step.

Step 1: Prepare files

Let’s first take a look at the steps we need to take before the page loading process even begins – namely, properly preparing the files.

Graphics optimization

The first point is fairly obvious and I mention it here mainly for completeness.

Many people think that the preparation of graphics in page speed optimization is simply to compress them properly with an appropriate graphics program.

Of course, this is important – it is often possible to apply visually lossless compression (that is, compression that does not visibly affect the quality of the photo) that significantly reduces the file size.

Note, however, that it matters not only what size the image file will be, but also how you embed that image on the page.

First of all, serve graphics scaled for display.

If a 2080 x 1542 photo with our face is to serve as a tiny avatar, let’s shrink it to the right size. If we do not know what size these should be, because, for example, in the mobile version they will be small, and in the desktop version they will take up the entire screen, it is possible to prepare several versions of them and serve them, for example, with the srcset attribute.

If possible, it’s also a good idea to specify the dimensions of the image with each graphic; this way, the browser will be able to reserve space for it even before the image is loaded and won’t be forced to “rebuild” the appearance of the page after it is loaded.
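Putting both ideas together, a responsive image with explicit dimensions might look like this (the file names, sizes and breakpoints below are made up, just for illustration):

```html
<!-- Hypothetical file names and breakpoints, for illustration only. -->
<!-- The browser picks the smallest file that fits the display width; -->
<!-- width/height let it reserve space before the image even downloads. -->
<img src="avatar-320.jpg"
     srcset="avatar-320.jpg 320w, avatar-640.jpg 640w, avatar-1280.jpg 1280w"
     sizes="(max-width: 600px) 64px, 320px"
     width="320" height="237"
     alt="Author's avatar">
```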

How does optimizing graphics affect site speed?

This is one of the first and most important optimization measures, because it often reduces loading times the most and is relatively simple to implement in most cases.

Websites in which a small logo is actually a huge graphic of several megabytes are not uncommon.

Reduce the size of HTML, CSS and JS code

I have already described this subsection in the article:
Remove unused code – CSS and JS optimization in practice.

How will reducing the size of HTML, CSS and JS code affect the speed of the site?

This, of course, depends on how much unnecessary code you manage to remove. Roughly speaking, the improvement in loading speed grows with the amount of code removed; in addition, removing unused and often “heavy” JS scripts saves the browser the time it would otherwise waste executing them.

Limit the number and size of external files

The need to reduce to an absolute minimum the number and size of files (e.g. JS scripts) attached from outside is obvious. Each such file means another script to fetch, parse and execute.

In addition, each such file requires a connection to a server other than the one hosting our site, which prolongs the whole process. As if that weren’t enough, external JS files are virtually beyond our control. We cannot set a caching period for them, and a possible failure on the provider’s server causes problems on our site.

How will reducing the number of external files affect the speed of the site?

It’s very important to limit the number of external scripts. The problem is that it can’t always be done – it’s hard, for example, to run a site without Google Analytics. So let’s approach it sensibly. If, for example, a year ago we installed a Facebook Pixel or a heatmap script and have never actually used it, we should definitely get rid of it. The less that happens on the site, the better.

Minify HTML, Minify CSS, Minify JavaScript

Minification is actually a fairly simple process that involves removing unnecessary whitespace, line breaks, formatting, comments and other redundant characters from the code.

Of course, we don’t do it manually, because that would be practically unfeasible. For code minification, Google recommends several tools such as HTMLMinifier, cssnano or UglifyJS, a list of which you can find here.
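To see what minification actually does, here is a hypothetical CSS fragment before and after (the rule itself is invented for this example):

```css
/* Before minification: */
.header {
    color: #333333;   /* dark gray text */
    margin: 0 auto;
}

/* After minification – the same rule, fewer bytes: */
.header{color:#333;margin:0 auto}
```

On a single rule the savings look trivial, but across thousands of lines of CSS and JS they add up to a noticeably smaller download.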

How does code minification affect page speed?

Although it sounds unrealistic, cutting unnecessary junk in files in this way can really reduce their volume significantly, and thus speed up the browser’s download of the site.

However, there is a fundamental “but” here. Minification does not always work on a “set it and forget it” basis. Unfortunately, minified code needs to be tested to make sure it still works. It happens, for example, that JS code that was not exactly transparent before minification becomes not exactly functional after it.

Step 2: Server-side loading

The Internet user’s browser has sent a request to our server, so we can start playing.

Server response time (TTFB)

TTFB or Time to first byte is the time from when the browser makes the first request until it receives the first bytes of data. You can easily check this time in DevTools in Chrome.

Google recommends TTFB under 200 milliseconds, anything under 100 milliseconds is a very good result. Results above 600 milliseconds mean that something is probably wrong.

But what does that time really come from?

TTFB consists of 3 groups of factors:

  • request sent to the server – here delays can be due to a number of reasons, including a slow DNS response (e.g. when the server is physically far from the client), or complicated firewall rules
  • server-side processing – the server has to “work through” the query and generate a response – that is, for example, make queries to the database, execute server-side scripts and send the finished response to the world in the form of the first portion of data
  • response to the client – here the key element is the speed of data transfer by the server and the speed of the Internet connection at the recipient; they determine after what time the first byte of data sent by the server will reach the recipient
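Besides DevTools, TTFB can also be checked from the command line with curl (replace the URL with your own page):

```shell
# Prints the time (in seconds) from the request until the first
# byte of the response arrives.
curl -o /dev/null -s -w "TTFB: %{time_starttransfer}s\n" https://www.example.com/
```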

In practice, it is the second group of factors that tends to be the most troublesome – and where the biggest savings can be made. The fault does not always (or even most often) lie with a slow server.

If the site, for example, makes a large number of database queries or performs some complex operation every time it loads, even the best server will not be able to handle the traffic and will run noticeably slower. The answer could be server-side caching or the use of a CDN.

However, let’s always first make sure that our site is simply well-written and does not overload the server. After all, changing the server to a more efficient one may then prove pointless, as the situation will repeat itself when traffic increases.

How does reducing server response time affect page speed?

If your TTFB oscillates around or below 200 milliseconds, spare yourself further optimization until it becomes necessary. The further you go, the more each additional millisecond shaved off will require complex and difficult steps.

If the TTFB is much higher, the answer to how shortening it will affect speed is obvious. You then need to think about how your application is built.

If you measure TTFB for a simple, static test page written in pure HTML and it is significantly lower than on your other URLs, the problem almost certainly lies in the design of your site and/or database.

Step 3: Download process

Ok, the server has prepared the page for us – we start downloading the data.

Gzip compression

If you were born before the 1990s, there is a good chance you once tried to fit a large amount of data onto a 1.44 MB floppy disk. And there is a good chance you used a compression program at the time, such as WinZip or WinRAR.

File compression in this case is something similar to “packing” files with WinZip. HTML, CSS and JS files are automatically compressed into gzip format and downloaded by the browser in that form – so the browser has less data to download. Once downloaded, the browser automatically decompresses the files and reads them.

Compression generally does not cover image files – in most cases these formats are already compressed.

How does file compression affect page speed?

Compressing files over about 150 bytes (which is almost all of them) to gzip format will always make sense. Downloading and decompressing the compressed data takes less time than downloading the uncompressed version.

In short – regardless of the amount of time you save, compression is such a standard and simple enough thing to implement that it is worth doing in almost every case.

It is interesting to note that gzip compression can adversely affect TTFB. This is because the server does not send the first bytes of data as soon as it generates them – it has to wait for the whole file to be generated to compress it completely. However, the overall page load time will still be shorter because of this.

Redirection limitation

301 redirects are a way to automatically move a user from one URL to another. Google’s robots respect the redirect, also going to the indicated address. So if, for example, your site’s URL changes, Google’s robot has the correct address indicated thanks to the 301 redirect, and the user’s browser takes it there almost imperceptibly.

Well, that’s right: almost imperceptibly. The browser sends a request to retrieve the contents of the URL, receives a redirect message back, and then sends a request to the new address. In short, it has more work to do (it connects to the server more times).

The problem arises with a so-called redirect chain – that is, when a requested URL redirects to a URL that itself redirects to another URL. This can happen, for example, with redirects of the type:

domena.com -> www.domena.com -> www.domena.com/pl
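A chain like the one above can usually be collapsed into a single hop on the server side. A hypothetical Apache .htaccess sketch (assuming mod_rewrite is enabled; the domain is the made-up one from the example):

```apache
# Redirect both domena.com and www.domena.com straight to the final
# www.domena.com/pl/ address in one 301, instead of chaining hops.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?domena\.com$ [NC]
RewriteCond %{REQUEST_URI} !^/pl/
RewriteRule ^(.*)$ https://www.domena.com/pl/$1 [R=301,L]
```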

How does limiting 301 redirects affect page speed?

Here, of course, the effect depends on the scale of the problem. In extreme cases, when the redirect chain is long, occurs within multiple URLs, or even falls into a loop, fixing this problem can have a salutary effect on site speed.

Avoiding bad requests

A bad request (not to be confused with a 400 Bad Request error) is one that cannot be fulfilled and returns a 404 or 410 code. In short, if the code references a JS script at domain.pl/skrypt.js, and no such script exists at that address, we are dealing with a bad request.

How does avoiding bad requests affect page speed?

It’s simple – referring to non-existent resources is a waste of time. This is because the browser must send the request in vain; it receives no file in response.

CDN – Content Delivery (Distribution) Network

A CDN is, in a nutshell, a network of high-speed servers that hold a relatively “fresh” copy of your site (or just elements of it, such as image files) on their disks and serve it from whichever server is fastest for a given user. For some sites such a solution can be a lifesaver, for others completely unnecessary.

How will the CDN affect the speed of the site?

While the answer “it depends” fits almost every element of improving site speed, the use of CDNs raises the most questions. For some large sites, a CDN or similar technology is almost a necessity. Also, if, for example, we host a site in Poland, and the users of our site are, for example, mainly Americans, a CDN can also prove useful. The CDN can also improve security, protecting against DDoS attacks, for example.

However, a CDN is a rather demanding optimization: it needs proper configuration and is usually paid, being based on commercial third-party solutions. In most cases, if I recommend a CDN at all, it’s only as the last step in the optimization process – it’s worth implementing all the other recommendations first and checking their effect. If you run a simple WordPress site, with hosting and visitors in Poland, don’t stream video, don’t serve large graphics or downloads, and above all don’t have a huge amount of traffic – the chance that you need a CDN is negligible.

Step 4: Loading in the browser

We already have – or any moment now will have – the page downloaded from the server, now our browser takes care of rendering it – it thinks about how the page should look, what the various scripts do, and finally displays it to us.

Browser cache

Using the browser cache can significantly reduce page load time for repeat visitors. For each type of file (or for individual files) we can set an “Expires” or “max-age” header (or use another method, depending on the configuration) – that is, tell the browser how often it should download the latest version of the file from the server.

So if we load a large photo on our first visit, for example, the browser will be told how long it should keep it in its cache. If it is, for example, 30 days, then for the next month on subsequent visits to the site, the browser will not waste time downloading the same image again and will simply load it from the cache.

So if we are no longer experimenting with the design of our site, files such as CSS stylesheets, graphics or JS scripts can have their expiry headers set even for longer periods, such as a year.
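Such headers are set in the server configuration. An nginx sketch (the file types and lifetime here are just an example – pick values that match how often you change those files):

```nginx
# Tell browsers they may reuse static assets for up to a year
# without re-downloading them from the server.
location ~* \.(css|js|jpg|jpeg|png|gif|svg|woff2)$ {
    add_header Cache-Control "public, max-age=31536000";
    # an equivalent shortcut in nginx: expires 1y;
}
```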

How does the use of browser caching affect page speed?

The impact of browser caching can be very large, but keep in mind that the difference will only be noticeable for returning visitors.

In some cases, browser caching can also speed up site loading for first-time visitors to our site. This happens, for example, when downloading external scripts and files, which are required unchanged on other sites.

The best example is the fonts downloaded from Google Fonts. The same font used on two sites is downloaded only once, when visiting the second site the browser already uses the cache.

In the case of externally loaded files, however, we have no control over the “expires” header. And you can read about whether fonts – also given the described property – are better loaded from Google Fonts or locally in my article from a year ago.

Rendering and JavaScript

When the browser downloads our site’s code, it begins to analyze it and build the so-called DOM tree. Analyzing the code line by line, it adds more elements to the site’s structure even before displaying them to us. If it encounters JavaScript in the code, it must pause the analysis and execute the encountered script to know what effect it will have on the site’s structure.

This affects the speed of the site very negatively, because before the browser even displays anything to us, it will waste a lot of time executing scripts that are often not relevant and could wait.

It is even worse if the script is downloaded from an external server – here comes the need to make a connection to another server and download the file. Meanwhile, the user keeps waiting for the page content to be rendered.

So what should be done? First of all – avoid load-blocking scripts. If we have to use them, we are left with the defer and async attributes.

<script async src="skrypt.js"></script>
<script defer src="skrypt.js"></script>

The async attribute changes the default way the DOM tree is built. Scripts marked with it do not block rendering of the site and load in the background, so to speak. This also has its downsides – you cannot control the order in which such scripts execute, nor, for example, use document.write in them.

The defer attribute also downloads the script in the background, but postpones its execution until the HTML document has been fully parsed; unlike async, deferred scripts execute in the order in which they appear in the code.

Neither method is ideal, because some scripts need to be loaded early – for example, many JS frameworks will not work if you mark their libraries with the async or defer attribute.

How does limiting render-blocking JavaScript affect site speed?

A lot – especially if our site template uses plenty of JavaScript bells and whistles. The most important thing is to review all the scripts our site loads and get rid of as many unnecessary ones as possible – especially externally loaded scripts.

In addition, this removes the threat of, for example, the script provider’s server crashing, which can result in a wait of several tens of seconds for the script to load.

Step 5: Measure the effects

The process is over, let’s now measure its effects.

The Internet is teeming with “magicians” who provide you with optimization in order to score 100 out of 100 on the PageSpeed test. It’s great if your site achieves such a result, you’ll be able to brag about it to your mother-in-law at dinner, but it won’t do you much good beyond that.

Tests such as GTmetrix should be treated as a guideline. Yes, a high score will usually go hand in hand with a faster site, but the points cannot be an end in themselves.

I rarely praise Google’s actions, but one initiative of theirs is, in my opinion, fantastic and very right.

These are the new Core Web Vitals indicators, which, importantly, are about to become a direct ranking factor.

I wrote more about Core Web Vitals some time ago in this article. As a quick reminder – this is a list of indicators that actually and realistically affect page speed and user experience.

In a nutshell, these are:

  • First Input Delay – how long the browser takes to respond to the user’s first interaction with the page (e.g. clicking a link)
  • Cumulative Layout Shift – how far the content shifts from when it is first shown, as further elements of the site load
  • Largest Contentful Paint – how long it takes the page to render the largest content element visible in the browser window

We can measure Core Web Vitals in several ways, for example by running a Google PageSpeed test or in Search Console.
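They can also be measured directly on a live page with Google’s open-source web-vitals library. A sketch (the library version and CDN URL are assumptions and may change over time):

```html
<script type="module">
  // Logs the Core Web Vitals for the current visit to the console.
  import { onLCP, onCLS, onFID } from 'https://unpkg.com/web-vitals@3?module';
  onLCP(metric => console.log('LCP:', metric.value, 'ms'));
  onCLS(metric => console.log('CLS:', metric.value));
  onFID(metric => console.log('FID:', metric.value, 'ms'));
</script>
```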

Are there shortcuts?

Of course there are. For WordPress and other popular CMSs there are ready-made plugins. Even if you want to use one, it’s worth reading the write-ups above so you know what is being done and why. Remember that all such plugins require configuration. There is no setup that works for every site – you have to work out the settings for each one yourself, because, for example, a 30-day cache for the home page of a news site is not a good idea.

Another shortcut that can ultimately prove to be very bumpy is AMP implementation. I’ve already written a bit about AMP as well, but the key facts are: when it comes to optimizing page speed, AMP will do most of the work for you. But it will tie your hands on many other issues. Your decision.


And that’s about it. There are, of course, other topics, such as lazy loading, CSS sprites and many more, but implementing those discussed here should bring the greatest benefits. Whether you outsource your site’s speed optimization to a specialist or undertake it yourself, it’s important to always know what you’re doing and what benefits it can bring.

And if you need help with speed optimization or with any other SEO problem, I remind you that I help with minor issues free of charge, and a little more for a fee I do comprehensive SEO with a focus on the technical aspect.
