Since the start of 2020, Google has rolled out the Core Web Vitals report in Google Search Console (GSC), effectively replacing the previous Speed report.
Although the Speed report has been removed, many of its metrics have been carried over to this new report, which is powered by Google's new Web Vitals engine.
As with many Google releases, they’ve kept their cards close to their chest and have yet to reveal exactly what has changed. However, the report does introduce several new metrics that website owners need to be familiar with going forward, and it splits performance between mobile and desktop.
Over the last 18 months, Google has been pushing through its mobile-first update. It’s therefore not surprising to see that user experience on handheld devices is at the forefront of this new release.
With such a focus on user experience, including speed, mobile-friendliness algorithm shifts, and search improvements such as featured snippets, structured data and ‘zero search results’ pages (where only one definitive answer to a query is provided), it shouldn’t come as a surprise that Google is now shifting the limelight back to websites.
Google’s focus now seems to be on providing a top-quality user experience. Web Vitals’ aim is to provide guidelines and quality signals so website owners can ensure their sites meet the modern needs of users.
Of course, this is all common sense. By providing the best experience for your users on your website, you can reap the rewards of more visitors and actions on your site, whether that’s through sales, leads or enquiries.
The criticism in the past has been that many of the suggested metrics and changes that site owners should make are too technical and require too much development knowledge. Whilst that may still be the case, Core Web Vitals aims to break this down into more manageable sections.
The set of metrics for 2020 splits the user experience into three facets: loading, interactivity and visual stability. There is additionally a set of thresholds for acceptable levels of delay for each.
For each of the above metrics, a good threshold to measure is the 75th percentile of page loads, segmented across mobile and desktop devices; this ensures you’re hitting the recommended target for most of your users.
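To illustrate what a 75th-percentile check means in practice, here is a minimal JavaScript sketch. The sample LCP values are hypothetical, and the 2.5s/4.0s thresholds are the LCP bands discussed below:

```javascript
// Return the value at the given percentile (0-100) of a sample set.
function percentile(values, p) {
  const sorted = [...values].sort((a, b) => a - b);
  const index = Math.ceil((p / 100) * sorted.length) - 1;
  return sorted[Math.max(0, index)];
}

// Hypothetical LCP samples (in seconds) collected from real page loads.
const lcpSamples = [1.2, 1.8, 2.1, 2.4, 2.9, 1.5, 2.2, 3.8];

const p75 = percentile(lcpSamples, 75);
console.log(`75th percentile LCP: ${p75}s`);
console.log(p75 <= 2.5 ? 'Good' : p75 <= 4.0 ? 'Needs improvement' : 'Poor');
```

The idea is that a handful of very slow outlier loads won’t fail an otherwise healthy page, but a quarter of your users having a poor experience will.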
The easiest way of seeing your Core Web Vitals is to log in to your GSC account and navigate to your property.
Otherwise, you can use the newly released Web Vitals Chrome Extension to see your own website’s performance. Crucially, you can also see how you compare with your competitors’ websites and other sites.
Many of the optimisations and improvements that are suggested to be made on your site will vary depending on your platform, server capacity, hardware and many other variables. To help, here are some key optimisations to help you get started.
Unlike First Contentful Paint (FCP), which measures how long it takes for the first piece of content to render, Largest Contentful Paint (LCP) measures how long it takes for the largest content element on your webpage to become visible.
A good score would be an LCP of under 2.5 seconds, whilst anything over 4.0 seconds is poor.
The most common causes of a poor LCP score are:
A faster response time from your server improves LCP and just about every other metric. You should look to optimise your server where possible, which may include implementing a CDN to serve media, caching assets, serving HTML pages cache-first or establishing third-party connections (such as payment gateways or marketing integrations) early.
Whilst HTML provides the structure of your webpage, JavaScript and CSS make it easy on the eye. However, loading these resources too soon can block the webpage from rendering, causing unnecessary delay. There are several improvements you can make, including reducing JavaScript and CSS blocking time, minifying or deferring non-critical CSS, and inlining critical CSS styles to avoid repeat requests for the same styles.
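As a sketch of what this looks like in the page head (the file paths here are placeholders for your own assets):

```html
<head>
  <!-- Inline the small amount of CSS needed for above-the-fold content -->
  <style>
    /* critical styles only, e.g. header and hero layout */
    header { margin: 0; }
  </style>

  <!-- Load the full stylesheet without blocking the first render -->
  <link rel="preload" href="/css/main.css" as="style"
        onload="this.onload=null;this.rel='stylesheet'">
  <noscript><link rel="stylesheet" href="/css/main.css"></noscript>

  <!-- Defer non-critical JavaScript so it does not block HTML parsing -->
  <script src="/js/app.js" defer></script>
</head>
```

The `preload`/`onload` pattern for stylesheets is a common workaround for the lack of a native non-blocking CSS load; the `noscript` fallback keeps the page styled when JavaScript is disabled.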
Image and video elements, as well as other media, can affect load speed. Optimising, resizing and compressing your images is one of the most effective ways of improving performance, and one that marketers commonly overlook. You could also implement a CDN to serve media, preload important resources where possible, use modern image formats such as JPEG 2000 or WebP, or simply remove images that aren’t relevant to the content.
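For example, a modern format can be served with a fallback, the likely LCP element can be preloaded, and below-the-fold images lazy-loaded (the file names and dimensions here are placeholders):

```html
<!-- Preload the hero image, which is often the LCP element -->
<link rel="preload" as="image" href="/img/hero.webp">

<!-- Serve WebP where supported, with a JPEG fallback -->
<picture>
  <source srcset="/img/hero.webp" type="image/webp">
  <img src="/img/hero.jpg" alt="Hero banner" width="1200" height="600">
</picture>

<!-- Lazy-load images further down the page so they don't compete
     with critical resources on initial load -->
<img src="/img/gallery-1.jpg" alt="Gallery image"
     width="600" height="400" loading="lazy">
```

Note that `loading="lazy"` should not be applied to the LCP element itself, as delaying it would worsen the score.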
Some websites use JavaScript to load and render pages directly in the user’s browser. This should be approached carefully to avoid longer LCP wait times and the risk of users leaving your site before the full page has rendered. Minimise critical JavaScript where possible, and use server-side rendering or pre-rendering to help improve client-side performance.
First Input Delay (FID) measures a webpage’s responsiveness, from when a user first interacts with a page to when the browser is able to respond and serve the interaction. This could be clicking or tapping on a link or entering a string of text into a field.
A good time would be an FID of less than 100ms, whilst anything over 300ms would indicate a poor score.
Problems with FID occur when the browser’s main thread is busy parsing, compiling and executing scripts, meaning it cannot respond to the user’s input straight away.
There are several ways of improving FID, but they may require substantial scoping and development to rectify.
The easiest way of reducing the time it takes to execute JavaScript on your site is to reduce the number of elements that require it. If you’re not able to, then aim to minify and compress your JavaScript files and defer any unused scripts from loading.
A piece of code which blocks the main thread for over 50ms is considered to be a Long Task, which can hamper page load and cause unresponsive pages. By breaking up these tasks into shorter, more manageable requests you may be able to reduce the delay in load.
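One common pattern for breaking up long tasks is to process work in small batches and yield back to the event loop between them, so the main thread is free to handle user input. This is a sketch; `processItem` and the chunk size are placeholders for your own work:

```javascript
// Process a large list in small batches, yielding to the event loop
// between batches so the main thread never blocks for long.
async function processInChunks(items, processItem, chunkSize = 50) {
  const results = [];
  for (let i = 0; i < items.length; i += chunkSize) {
    for (const item of items.slice(i, i + chunkSize)) {
      results.push(processItem(item));
    }
    // Yield so pending user input can be handled between chunks.
    await new Promise(resolve => setTimeout(resolve, 0));
  }
  return results;
}
```

In the browser you could also schedule the lower-priority batches with `requestIdleCallback` where it is supported, so they only run when the main thread is otherwise idle.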
Both first- and third-party scripts can cause issues and delays with interaction readiness, including long execution times and ineffective fetches. Reduce the waterfall effect of fetches to improve latency and, where possible, lazy load third-party code until it is needed, so it doesn’t impact the load time of the critical elements your users expect to see first.
Layout shifts are the bane of web users on just about every device. Think about that time you were looking for a quick recipe, only to be thrown down the page whilst a huge image loads.
Not only are these inconvenient to users, they could also push them further away from committing an important action on site – i.e. if your Add to Cart or Subscribe button jumps away from your user at the last minute, it can cause all kinds of frustration.
Cumulative Layout Shift (CLS) aims to measure the visual stability of a website. It does this by measuring unexpected movement of page elements; shifts that occur within 500ms of a user input, such as a click or scroll, are excluded from the score.
A good score would be a CLS of less than 0.1, whilst anything over 0.25 would be an indicator of a poor score.
A poor CLS score is most commonly caused by on-page or dynamically loaded media content, which has a knock-on effect on other elements of the website. This causes the viewport to move around unexpectedly until the page is fully loaded and rendered.
This means not just ensuring that your media have height and width attributes in your code, but also managing the required aspect ratio within the CSS, so that sufficient space is reserved on initial load to serve the image without disrupting the viewport. The same applies to externally served content from iframes or embeds – think Google Maps, videos from YouTube and so on.
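As a sketch, space for an embed can be reserved with the CSS `aspect-ratio` property (the class name and video ID are placeholders):

```html
<style>
  /* Reserve a 16:9 box before the embed loads, so nothing shifts */
  .video-wrapper {
    aspect-ratio: 16 / 9;
    width: 100%;
  }
  .video-wrapper iframe {
    width: 100%;
    height: 100%;
    border: 0;
  }
</style>

<div class="video-wrapper">
  <iframe src="https://www.youtube.com/embed/VIDEO_ID" loading="lazy"></iframe>
</div>
```

Because the wrapper has its dimensions before the iframe arrives, the surrounding content never jumps when the embed finishes loading.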
To explain this simply, think of a GDPR consent pop-up. You land on a website, only to be shunted down the page by an intrusive pop-up that disrupts the rest of the page’s layout. Again, ensuring that space has been reserved for it within the CSS can reduce the risk of the page jumping around.
Web fonts are becoming increasingly popular, but if they don’t load correctly your users risk seeing invisible text whilst the font downloads and renders. The ‘font-display’ rule allows you to reduce the risk of this occurring, but ideally you should aim to preload your web font and serve it at first paint.
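Both techniques can be combined in the page head, as in this sketch (the font name and path are placeholders):

```html
<!-- Preload the font so it is available by first paint -->
<link rel="preload" href="/fonts/example.woff2"
      as="font" type="font/woff2" crossorigin>

<style>
  @font-face {
    font-family: 'ExampleFont'; /* placeholder name */
    src: url('/fonts/example.woff2') format('woff2');
    /* Show fallback text immediately; swap in the web font when it loads */
    font-display: swap;
  }
</style>
```

`font-display: swap` trades a brief flash of the fallback font for never showing invisible text; with the preload in place, most users won’t see the swap at all.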
It’s inevitable that more updates and tweaks to these signals will occur over the next few months. As more data is collected on website performance, there’s no telling how many metrics will be introduced into the Core Web Vitals pack in the future.
Similarly, it’s not yet been confirmed whether or not these metrics will become a significant ranking factor for websites, although, with such a focus on customer experience over the last few years, it surely is inevitable.
As such, by taking action now you can start to work through the most pressing issues that cause slow loading and on-site lag, and improve your overall user experience.
Not only will this help you produce a more stable, effective and pleasant web experience for your customers, which in turn should lead to more conversions, but you will be able to ensure that you are meeting some of the clearest performance criteria we’ve seen from Google in years.
Google Search Console – a free tool from Google, allowing you to report and measure various metrics involved with your website. This includes search performance, on-site issues and user experience.
PageSpeed Insights – another free tool from Google, PageSpeed Insights analyses the content of your website or a web page and provides suggestions on how to improve performance on mobile and desktop.
Dev Tools in Chrome – built directly into the browser, DevTools lets you inspect various elements of a webpage and analyse its performance in detail. The quickest way to access it is to right-click on a page and select Inspect. The Timings section of the Performance panel includes an LCP marker and shows you which element is associated with LCP when you hover over the Related Node field. The Experience section shows a CLS score with the affected elements highlighted, so you can see where your problem areas are.
GTMetrix – a free service with a premium option to analyse your site’s performance. It allows you to use multiple tools to create an overall view of what elements are causing load issues in the form of a waterfall report.
Although Core Web Vitals is a relatively new aspect of the GSC report, the signs are that something like this has been in the pipeline for a number of years.
It will be interesting to see how older websites, and sites such as local news services that rely on intrusive third-party ads to supplement their revenue, will fare.
Ultimately, this update comes down to providing web users with a quality experience; something that website owners should be striving for in the first place.
What’s different is that these new metrics come with clear thresholds for what Google deems good, acceptable and poor levels of performance. If you don’t comply, there’s every chance you may lose traction in search results.
If the everyday user becomes more aware of these changes, they will be more demanding of a quick, high quality online experience. This should raise everyone’s game and produce a lot more quality websites across the board.
If you’re unsure how your website is affected, do get in touch. We can provide an in-depth performance report with a clear audit of which variables you should address first. We can also help you prepare for any changes in Google’s search algorithms over the coming months as a result of this new tool.
Whilst a lot of the work required to improve your website’s speed may require development, the framework surrounding Core Web Vitals provides a checklist of must-have, quick-win and ongoing performance improvements that can provide instant results from your SEO strategy work.
Contact us today to find out how we can improve your website’s speed with a performance audit.
Drop us a line on 01785 279985
Send us an email hello@kanukadigital.com