
Google PageSpeed Insights now shows you your scores divided by platform and split into two: 'Page Speed' and 'Optimization.'
The Speed Update, launched on July 9, 2018, resulted in these changes. Google now uses field data to measure speed rather than relying on lab data alone. By extracting information from the Chrome User Experience Report (CrUX) database, Google can calculate how fast your average user finds your site.
Basically, this means that your speed score, and possibly your website’s SEO ranking, could be impacted even if your website is blazing fast on your system because visitors using older smartphones might have a slower experience. You need to double down on optimising speed to avoid this.
The following are Google’s nine PageSpeed Insight Rules, their best-practice advice, and some advanced steps to help you optimise your site.
Redirects delay page rendering, which can make your mobile site experience feel substandard. Each redirect adds a Hypertext Transfer Protocol (HTTP) request-response round trip, and may trigger further round trips to perform the domain name system (DNS) lookup, transmission control protocol (TCP) handshake, and transport layer security (TLS) negotiation.
Whether your website is driving leads or online sales via eCommerce, your website design and development should be responsive, and there should be at most a single redirect from the original URL to your target landing page.
Redirects should be avoided wherever possible, but if you do end up having to use a redirect, use the correct one for your needs.
If you don’t have an alternate page to redirect users to, or when you delete old content and redirect to new content, use permanent redirects (301). When redirecting users to device-specific URLs or when making short-term changes, such as limited time offers, use temporary redirects (302). Your link equity will be maintained regardless.
HTTP redirects cause some latency on the server-side, while JavaScript-based redirects slow down the client-side as they need to download the page, then parse and execute the JavaScript before the redirect is triggered. Both types of redirects are supported by Googlebot.
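For example, a single server-side redirect in Nginx might look like the sketch below (the domain names are placeholders, not a recommendation for any particular setup):

```nginx
# Sketch: one permanent (301) redirect handled at the server,
# rather than a chain of hops or a JavaScript redirect on the client.
server {
    listen 80;
    server_name old.example.com;
    return 301 https://www.example.com$request_uri;
}
```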
Reducing your content size cuts download time and data usage for your visitors, which lets the page render faster and improves your SEO rankings.
Gzip all content that can be compressed. Sample configuration files for most servers can be found through the HTML5 Boilerplate project.
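As a sketch, gzip settings in an Nginx config might look like this (the values are illustrative, not tuned recommendations; see the HTML5 Boilerplate configs for production-ready versions):

```nginx
# Enable gzip for compressible text assets.
gzip on;
gzip_comp_level 5;      # balance CPU cost against compression ratio
gzip_min_length 256;    # skip tiny responses that won't benefit
gzip_proxied any;
gzip_types text/css application/javascript application/json image/svg+xml;
```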
Compression is an excellent tool, but it's even better not to send the resource at all. For best results, periodically review your site's resources and eliminate unnecessary data before compressing what remains.
Alternatives to Gzip include Brotli. A lossless compression algorithm, Brotli combines a modern variant of the LZ77 algorithm, Huffman coding, and second order context modelling. Most modern browsers support it, and its compression ratio is comparable to the best of currently available general-purpose compression methods. As Brotli compresses very slowly and decompresses fast, it is best to pre-compress static assets with Brotli+Gzip at the highest level, and dynamic HTML should be compressed with Brotli at level 1–4.
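If your server has a Brotli module available (the sketch below assumes the common ngx_brotli module for Nginx), that split between pre-compressed static assets and lightly compressed dynamic responses might look like:

```nginx
# Sketch assuming the ngx_brotli module is installed.
brotli_static on;      # serve pre-compressed .br files for static assets
brotli on;             # compress dynamic responses on the fly
brotli_comp_level 4;   # low level for dynamic HTML, per the advice above
brotli_types text/css application/javascript application/json;
```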
You can compress not only your page's HTML but also the other digital assets it requires. Different techniques and algorithms need to be applied to your web fonts, images, CSS, and so on to achieve optimal results. For example, if you're using HTTP/2, HPACK compression for HTTP response headers will reduce unnecessary overhead.
If a page doesn’t load within three seconds, 53 percent of mobile visitors will abandon it. Hence fast server response times are a must.
To avoid Central Processing Unit (CPU) starvation, slow application logic, slow database queries, slow routing, slow frameworks, and slow libraries, it is essential to have high-quality website development.
Server response time should be under 200ms.
WebPageTest.org, Pingdom, GTmetrix, and Chrome DevTools are tools that will pinpoint existing performance issues and help you identify the bottlenecks in your content delivery process. Keep in mind that even if your tests show a response time under 200ms, a user on a 3G network with an older-generation Android device might experience a 400ms round-trip time (RTT) and 400kbps transfer speed, which can drag your Site Speed score down. Your aim must be to improve the experience real users have on real networks and devices, not just the one you see in testing. While configuring your server, target the slowest links your measurements reveal first; the compression and caching rules covered here apply at this layer too.
Fetching resources over the network requires round trips, and more round trips mean more delays and higher data costs for your visitors. An effective caching policy mitigates this by helping the client figure out if and when it can reuse responses it has received in the past.
Have well-defined and explicit caching policies that answer the following: can the resource be cached, who can cache it, for how long, and how can it be efficiently revalidated when the caching policy expires?
Google recommends a minimum cache time of one week, and up to one year for static assets.
Cache-Control directives let the browser cache a response and reuse it without communicating with the server, by specifying how it may be cached (e.g., "no-cache" and "no-store") and for how long (e.g., "max-age", "max-stale" and "min-fresh").
An Entity tag (ETag) HTTP header communicates a validation token, so no data has to be transferred if a resource hasn't changed since it was last requested. This makes resource update checks far more efficient.
Google has developed a checklist and a flowchart to enable you to cache as many responses as possible for the longest possible period and provide validation tokens for each response.
To avoid revalidation, the rule of thumb is that mutable (likely to change) resources should be cached for a very short time, whereas immutable (static) resources should be cached indefinitely.
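Translated into an Nginx sketch (the header values follow this rule of thumb; the file extensions are illustrative):

```nginx
# Long-lived caching for static, fingerprinted assets...
location ~* \.(css|js|png|jpg|woff2)$ {
    add_header Cache-Control "public, max-age=31536000, immutable";
}
# ...and revalidation for mutable HTML, using the ETag validation token.
location / {
    add_header Cache-Control "no-cache";  # cache, but revalidate each time
    etag on;                              # Nginx sends ETags by default
}
```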
Minification eliminates redundant data from the resources delivered to your visitors, which can significantly improve your overall site speed and performance.
There should be no redundant data within your web assets (e.g., comments or space symbols in HTML code, repeated styles in CSS, or unnecessary image metadata).
Even though it sounds like compression, minification is far more granular. Compression algorithms can reduce page size, but most of them won't strip comments from CSS (/* … */), HTML (<!-- … -->), or JavaScript (// …), collapse cascading style sheets (CSS) rules, or carry out other content-specific optimisations.
Minification is not limited to text-based assets like hypertext markup language (HTML), CSS, and JavaScript. It can also be applied to images, video, and other types of content as appropriate. For example, if you're running a photo-sharing site, you might deliberately retain the metadata and extra payloads your images carry, where most other sites should strip them.
The load of minifying the thousands (if not millions) of different resources on your website can be reduced by using tools. PageSpeed Module by Google does this automatically, and it can be integrated with Apache or Nginx web servers. HTMLMinifier (for HTML), CSSNano or CSSO (for CSS), and UglifyJS (for JavaScript) are some of the alternate tools available.
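As a contrived before-and-after illustration of what such a tool strips (exact output depends on the tool and the options you enable):

```html
<!-- Before: comments and whitespace cost bytes on every request. -->
<div class="nav">
    <!-- main navigation -->
    <a href="/">Home</a>
</div>
<!-- After, roughly as a tool like HTMLMinifier might emit it: -->
<div class=nav><a href="/">Home</a></div>
```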
Images account for roughly 60% of a typical web page's size, and large images can significantly slow down your site. Optimising images reduces file size without significantly impacting visual quality.
It is essential that your website and images are responsive. Use the picture element when you want to specify different images depending on device characteristics, and use the srcset attribute with the x descriptor in the img element to tell browsers when to use specific images.
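A sketch of both techniques together (the file names are placeholders):

```html
<!-- srcset with x descriptors lets the browser pick a file matching the
     device's pixel density; <picture> swaps images by media query. -->
<picture>
  <source media="(max-width: 600px)" srcset="hero-small.jpg">
  <img src="hero.jpg" srcset="hero.jpg 1x, hero-2x.jpg 2x" alt="Hero image">
</picture>
```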
There's no single optimal solution when it comes to image optimisation. Many techniques can reduce image size, but finding the best fit for your images requires weighing format capabilities, the content of the encoded data, quality, pixel dimensions, and more. Google's guide to Optimising Images includes a checklist of the most common techniques.
When rendering a page, browsers typically follow these five steps:
1. Process the HTML markup and build the Document Object Model (DOM) tree.
2. Process the CSS and build the CSS Object Model (CSSOM) tree.
3. Combine the DOM and CSSOM into a render tree.
4. Run layout to compute the geometry of each node.
5. Paint the individual nodes to the screen.
In other words, a page must process CSS before it can render. When that CSS arrives via render-blocking external stylesheets, rendering often requires multiple round trips, which delays the time to first render.
Eliminate small external CSS resources by inlining their contents directly into the HTML document.
Bear in mind that while inlining small CSS can speed up the time it takes a browser to render the page, inlining large CSS files bloats your above-the-fold HTML and will actually increase render time.
Likewise, avoid inlining CSS attributes on individual HTML elements: it results in unnecessary code duplication, and such inline styles are blocked by default under a Content Security Policy.
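A minimal sketch of the right kind of inlining, a small critical stylesheet placed in the document head rather than fetched separately (the rules shown are placeholders):

```html
<head>
  <!-- Small, render-critical rules inlined: no extra round trip. -->
  <style>
    body { margin: 0; font-family: sans-serif; }
    .hero { font-size: 2rem; }
  </style>
</head>
```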
If your above-the-fold content exceeds the initial congestion window (typically 14.6kB compressed), loading and rendering it will require multiple round trips. Users on high-latency networks, especially mobile users, can experience significant delays in page loading.
The size of the above-the-fold content should be no more than 14kB (compressed).
Use all the tips and tricks detailed above to limit the data required to render your above-the-fold content.
HTML markup should be organised to render above-the-fold content immediately. Restructuring your markup can significantly improve how quickly that content loads and renders, though the required changes vary from page to page. For example, you may need to split your CSS into two parts, an inline part responsible for styling the above-the-fold portion of the content and a deferred stylesheet for the remainder, as shown below. Or you might need to reorder what loads first (e.g., main content before widgets).
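One common pattern for that split looks like this (a sketch: 'non-critical.css' is a placeholder, and the media/onload swap is a widely used deferral trick, not the only option):

```html
<head>
  <!-- Inline part: just enough CSS to style above-the-fold content. -->
  <style>.hero { font-size: 2rem; color: #222; }</style>
  <!-- Deferred part: loaded without blocking first render, then applied. -->
  <link rel="stylesheet" href="non-critical.css"
        media="print" onload="this.media='all'">
</head>
```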
As the rendering steps above show, a browser must parse the HTML and build the DOM before it can render your page. Every time the parser encounters JavaScript, it has to stop and execute the new script before it can continue building the DOM tree. For external scripts the delay is even more pronounced and can add tens to thousands of milliseconds to the rendering process.
All blocking JavaScript, especially external scripts, in above-the-fold content should be removed.
Marking your script tag as async instructs the browser not to block DOM construction while it waits for the script to be downloaded and executed. Bear in mind that this should only be done if you know you don't need to change anything in the DOM tree while it's being parsed and constructed.
To avoid extra network requests, inline the scripts that are necessary for rendering page content, and keep them as small as possible so they execute quickly. Make non-critical scripts asynchronous and defer them until after the first render, keeping in mind that asynchronous scripts are not guaranteed to execute in a specified order.
JavaScript libraries used to enhance interactivity or add animations and other effects (e.g., jQuery) don't need to run above the fold. Make these asynchronous and defer them down the page whenever possible, as in the sketch below.
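Putting those recommendations together might look like this (the script names are placeholders):

```html
<!-- Tiny render-critical logic inlined to avoid a network request. -->
<script>document.documentElement.className = 'js';</script>
<!-- Non-critical scripts load without blocking DOM construction;
     note that async scripts may execute in any order. -->
<script src="jquery.min.js" async></script>
<script src="widgets.js" defer></script>
```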
Two experiments, one before and one immediately after Google rolled out the update, were conducted to find out what impact the Speed Update actually had on search engine results page (SERP) positions.
It was discovered that the correlation between a mobile site’s position in the SERPs and its average optimisation score was already extremely high (0.97), but that a site’s first contentful paint (FCP) and DOM content loaded (DCL) metrics (now displayed on PageSpeed Insights beneath the Page Speed score) had little to no bearing on the position, even before the update.
One week after the update, no significant changes were noticed, which is understandable given that it takes time for an update to take full effect. The correlation between FCP/DCL and position remained low, while the correlation between optimisation score and position in mobile SERPs remained high.
The optimisation scores of sites ranking within the top 30 positions of mobile SERPs have all increased by an average of 0.83 points within the past three months. There seems to be an industry-wide improvement in the quality of websites.
What you should take from this is that you can’t afford to become complacent. The standards for what constitutes a fast, optimised site are evolving. Like SEO as a whole, improving speed is a process, and you have to keep tweaking and improving to remain relevant.