Core Web Vitals Update: What Website Owners Need To Know

Over the last few months Google has rolled out the Core Web Vitals report into Google Search Console, effectively replacing the previous speed report.

Even though the speed report has been removed, many of its metrics have been carried over into this new report, which is powered by Google’s new Web Vitals engine.

What’s been replaced in the Core Web Vitals report?

As with most releases from Google, they’ve kept their cards close to their chest and have yet to reveal exactly what has changed.  However, the report does show various new metrics that website owners need to be familiar with going forward, split between mobile performance and desktop performance.

As we’ve seen over the last 18 months, Google has been pushing through its mobile-first update, so it’s not surprising to see that user experience on handheld devices is at the forefront of this new update.

What are Core Web Vitals?

With such a focus on user experience – speed, mobile-friendliness algorithm shifts, and improvements to the search experience such as featured snippets, structured data and ‘zero search results’ pages (where only one definitive answer to a query is provided) – it shouldn’t come as a surprise that Google is now shifting the limelight back to websites.

Google’s focus now seems to be on providing a top quality user experience, and Web Vitals is their attempt to provide a set of guidelines and quality signals to allow website owners to ensure their site meets the modern needs of users in 2020.

Of course, this is all common sense – by providing the best experience for your users on your website you can reap the rewards of more visitors and actions on your site, whether that’s through sales, leads or enquiries.

The criticism in the past has been that many of the suggested metrics and changes that web owners should make to their sites are too technical and require too much development knowledge.  Whilst that may still be the case, Core Web Vitals aims to break this down into more manageable sections.

The current set of metrics for 2020 splits out the user experience into three facets: loading, interactivity and visual stability – as well as a set of thresholds for acceptable levels of delay for each.

  • Largest Contentful Paint (LCP): measures loading performance. To provide a good user experience, LCP should occur within 2.5 seconds of when the page first starts loading.
  • First Input Delay (FID): measures interactivity. To provide a good user experience, pages should have a FID of less than 100 milliseconds.
  • Cumulative Layout Shift (CLS): measures visual stability. To provide a good user experience, pages should maintain a CLS of less than 0.1.

For each of the above metrics, to ensure you’re hitting the recommended target for most of your users, a good threshold to measure is the 75th percentile of page loads, segmented across mobile and desktop devices.
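To make the 75th percentile concrete, here is a minimal sketch of how it can be computed from a set of field measurements – the sample values, variable names and nearest-rank method below are illustrative, not Google’s exact methodology:

```javascript
// Nearest-rank percentile: the value below which p% of samples fall
function percentile(values, p) {
  const sorted = [...values].sort((a, b) => a - b);
  const rank = Math.ceil((p / 100) * sorted.length) - 1;
  return sorted[Math.max(0, rank)];
}

// Hypothetical LCP samples (in seconds) from real page loads
const lcpSamples = [1.2, 1.8, 2.1, 2.4, 2.6, 3.1, 1.5, 2.0];
const p75 = percentile(lcpSamples, 75);

// The page passes the LCP threshold if the 75th percentile is <= 2.5s
const passesLcp = p75 <= 2.5;
```

In other words, a page is judged by what most of its real visitors experience, not by a single lab test – a handful of slow outlier loads won’t fail the threshold on their own.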

How to measure Core Web Vitals

The easiest way of seeing your Core Web Vitals is to open your Google Search Console account and navigate to the property in question.

Otherwise, you can use the newly released Web Vitals Chrome Extension to see your own website’s performance and, crucially, see how you compare with your competitors’ websites and other sites.
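If you’d rather collect these metrics from your own visitors, Google also publishes an open-source `web-vitals` JavaScript library (installable via npm) that exposes a small callback API for each metric.  A browser-side sketch – where you send the values is up to you:

```javascript
import { getCLS, getFID, getLCP } from 'web-vitals';

// Each callback fires once the metric is available for the current page
getCLS(metric => console.log('CLS', metric.value));
getFID(metric => console.log('FID', metric.value));
getLCP(metric => console.log('LCP', metric.value));
```

In practice you would forward these values to your analytics tool rather than the console, building your own field data over time.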

A lot of the optimisations and improvements suggested for your site will vary depending on your platform, server capacity, hardware and many other variables, but we’ve cherry-picked some key optimisations below to help you get started.

How to optimise Largest Contentful Paint (LCP)

Unlike First Contentful Paint (FCP), which measures how long it takes for the first piece of content to render, Largest Contentful Paint (LCP) measures when the largest content element on your webpage becomes visible.

A good score would be an LCP of under 2.5 seconds, whilst anything over 4.0 seconds is considered poor.

The most common causes of a poor LCP score are:

  • Slow server response times

A faster response time from your server improves LCP and just about every other metric.  You should look to optimise your server where possible, which may include implementing a CDN to serve media, caching assets, serving HTML pages cache-first or establishing third-party connections early (such as payment gateways or marketing integrations).

  • Render-blocking JavaScript and CSS

Whilst HTML provides the structure to your webpage, JavaScript and CSS make it easy on the eye.  However, loading these elements too soon can block the webpage from rendering correctly, causing an unnecessary delay.  There are several improvements you can make, including reducing JavaScript and CSS blocking time, minifying or deferring non-critical CSS, and inlining important CSS styles to reduce repeat requests for the same styles.
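As a sketch of what this looks like in practice – the file paths below are illustrative – critical styles can be inlined in the document head, the full stylesheet loaded without blocking rendering, and scripts deferred:

```html
<head>
  <!-- Inline the small amount of CSS needed for above-the-fold content,
       so the first paint isn't blocked by a stylesheet download -->
  <style>
    header { background: #fff; min-height: 4rem; }
  </style>

  <!-- Load the full stylesheet without blocking rendering: fetch it as a
       preload, then switch it to a stylesheet once it has downloaded -->
  <link rel="preload" href="/css/main.css" as="style"
        onload="this.onload=null;this.rel='stylesheet'">
  <noscript><link rel="stylesheet" href="/css/main.css"></noscript>

  <!-- defer: download in parallel, execute only after HTML parsing -->
  <script src="/js/app.js" defer></script>
</head>
```

The preload-then-stylesheet pattern is a common workaround for render-blocking CSS; test it carefully, as it briefly renders the page with only the inlined styles.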

  • Slow resource load times

Image and video elements, as well as other media, can affect load speed.  Optimising, resizing and compressing your images is one of the most effective ways of improving performance, and one that is commonly overlooked.  You could also implement a CDN to serve media, preload important resources where possible, use modern image formats such as JPEG 2000 or WebP, or simply omit images that aren’t relevant to the content.
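For example, a hero image can be preloaded so the browser fetches it early, and served in WebP with a fallback for browsers that don’t support it (the file paths here are illustrative):

```html
<!-- Preload the likely LCP element so it is requested early -->
<link rel="preload" href="/img/hero.webp" as="image">

<!-- Serve WebP where supported, with a JPEG fallback -->
<picture>
  <source srcset="/img/hero.webp" type="image/webp">
  <img src="/img/hero.jpg" alt="Product hero" width="1200" height="600">
</picture>
```

Preloading is best reserved for the one or two resources that matter most – preloading everything defeats the purpose.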

  • Client-side rendering

Some websites use JavaScript to load and render pages directly in the user’s browser.  This approach needs extra care, as it can lengthen LCP wait times and risk your users leaving your site before the full page has rendered.  Minimise critical JavaScript where possible, and use server-side rendering or pre-rendering to help improve client-side performance.

How to optimise First Input Delay (FID) 

First Input Delay (FID) measures a webpage’s responsiveness: the time from when a user first interacts with a page to when the browser is actually able to respond to that interaction.  This could be clicking or tapping on a link or entering a string of text into a field.

A good time would be an FID of less than 100ms, whilst anything over 300ms would be an indicator of a poor score.

Problems with FID occur when the browser’s main thread is busy parsing, compiling and executing scripts, leaving it unable to process the user’s input straight away.

There are several ways of improving First Input Delay, but they may require substantial scoping and development work to rectify.

  • Reduce JavaScript execution time

The easiest way of reducing the time it takes to execute JavaScript on your site is to reduce the number of elements that require it.  If you’re not able to, then aim to minify and compress your JavaScript files, and defer any unused scripts from loading until they are needed.

  • Break up long tasks

A piece of code which blocks the main thread for over 50ms is considered to be a Long Task, which can hamper page load and cause unresponsive pages.  By breaking up these tasks into shorter, more manageable requests you may be able to reduce the delay in load.
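One common pattern – sketched below with illustrative names – is to split a large batch of work into small chunks and yield back to the main thread between them, so the browser can handle input in the gaps:

```javascript
// Split an array of work items into fixed-size batches
function chunk(items, size) {
  const batches = [];
  for (let i = 0; i < items.length; i += size) {
    batches.push(items.slice(i, i + size));
  }
  return batches;
}

// Process items batch by batch, yielding between batches so each
// slice of work stays well under the 50ms Long Task threshold
function processInChunks(items, processItem, batchSize = 50) {
  const batches = chunk(items, batchSize);
  function runNext() {
    const batch = batches.shift();
    if (!batch) return;
    batch.forEach(processItem);
    // setTimeout(…, 0) hands control back to the event loop,
    // letting pending input events run before the next batch
    setTimeout(runNext, 0);
  }
  runNext();
}
```

The right batch size depends on how expensive each item is to process; the goal is simply that no single batch blocks the thread for more than roughly 50ms.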

  • Optimise your page for interaction readiness

Both first- and third-party scripts can cause issues and delays with interaction readiness, including long execution times and inefficient fetches.  Reduce the waterfall effect of fetches to improve latency and, where possible, lazy load third-party code until it is needed, so it does not impact the load time of the critical elements your users expect to see first.

How to optimise Cumulative Layout Shift (CLS)

Layout shifts are the bane of web users on just about every device.  Think about the time you were looking for a quick recipe, only to be thrown down the page as a huge image loaded, or the time you were reading a news article only to be interrupted by an intrusive advert.

Not only are these inconvenient to users, they could also push them further away from committing an important action on site – for example, if your Add to Cart or Subscribe button jumps away from your user at the last minute, it can cause all kinds of frustration.

Cumulative Layout Shift (CLS) aims to measure the visual stability of a website by adding up every unexpected layout shift that occurs while the page is open.  Shifts that happen within 500ms of a user input – a click, tap or keypress – are treated as expected and excluded from the score.

A good score would be a CLS of less than 0.1, whilst anything over 0.25 would be an indicator of a poor score.
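Each individual shift is scored as the product of two fractions: the share of the viewport affected by the shift, and how far the content moved relative to the viewport.  A minimal sketch with illustrative values:

```javascript
// Layout shift score = impact fraction * distance fraction
// - impactFraction: share of the viewport affected by the shift
// - distanceFraction: how far the unstable elements moved, relative
//   to the viewport's largest dimension
function layoutShiftScore(impactFraction, distanceFraction) {
  return impactFraction * distanceFraction;
}

// e.g. a shift affecting 50% of the viewport, moving content by 20%
// of the viewport height: 0.5 * 0.2 = 0.1 - already at the "good" limit
const score = layoutShiftScore(0.5, 0.2);
```

CLS is the accumulation of these individual scores, which is why several small shifts can be just as damaging as one big one.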

A poor CLS score is most commonly caused by on-page or dynamic media content, which has a knock-on effect on other elements of the website, causing the viewport to shift around unexpectedly until the website is fully loaded and rendered.

  • Images and media without dimensions

This does not just mean ensuring that your media have width and height attributes in your code, but also managing the required aspect ratio in your CSS, so that sufficient space is reserved on the initial load to serve the image without disrupting the viewport.  The same applies to externally served content from iframes or embeds – think Google Maps, videos from YouTube, Twitter post embeds and so on.
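A minimal sketch of the idea (the file path and dimensions are illustrative): give the image explicit dimensions so the browser can reserve the space before the file downloads, even when CSS later resizes it responsively.

```html
<!-- width/height let the browser compute the aspect ratio and reserve
     the correct space in the layout before the image has downloaded -->
<img src="/img/product.jpg" alt="Product photo"
     width="800" height="600"
     style="max-width: 100%; height: auto;">
```

Modern browsers derive the aspect ratio from the width and height attributes, so the reserved box scales correctly on responsive layouts instead of collapsing to zero height.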

  • Dynamic, external content

To explain this element easily, think of the GDPR consent pop-up.  You land on a website, only to be shunted down the page by an intrusive pop-up which disrupts the rest of the page’s layout.  Again, ensuring that space has been reserved for it within the CSS can reduce the risk of the page jumping around.

  • Issues with Web Fonts

Web Fonts are becoming increasingly popular, but if your fonts are not loaded correctly your users risk seeing either unstyled text or invisible text whilst your font downloads and renders.  The CSS ‘font-display’ property allows you to reduce the risk of this occurring, but ideally you should aim to preload your web font so it can be served at first paint.
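Both techniques together look something like this (the font file and family name are illustrative):

```html
<!-- Fetch the font early so it is ideally ready by first paint;
     crossorigin is required even for same-origin font preloads -->
<link rel="preload" href="/fonts/brand.woff2" as="font"
      type="font/woff2" crossorigin>

<style>
  @font-face {
    font-family: 'Brand';
    src: url('/fonts/brand.woff2') format('woff2');
    /* swap: show fallback text immediately, switch in the web font
       once it has loaded - avoids invisible text */
    font-display: swap;
  }
  body { font-family: 'Brand', sans-serif; }
</style>
```

Note that `font-display: swap` trades invisible text for a brief flash of the fallback font; whether that swap is acceptable is a design decision.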

What happens if you don’t take action?

It’s inevitable that more updates and tweaks to these signals will occur over the next few months, and as more data is collected on website performance there’s no telling how many metrics will be added to the Core Web Vitals pack in future.

Similarly, it’s not yet been confirmed whether or not these metrics will become a significant ranking factor for websites, although with such a focus on customer experience over the last few years it is surely inevitable.

As such, by taking action now you can start to work through the most pressing issues that cause slow loading and on-site lag, and improve your overall user experience.

Not only will this help you produce a more stable, effective and pleasant web experience for your customers, which in turn should lead to more conversions, but you will be able to ensure that you are meeting some of the clearest performance criteria we’ve seen from Google in years.

Useful performance tools

Google Search Console – a free tool from Google which allows you to report and measure various metrics involved with your website, including search performance, on-site issues and user experience.

PageSpeed Insights – another free tool from Google, PageSpeed Insights analyses the content of your website or a web page and provides suggestions on how to improve performance on mobile and desktop.

Dev Tools in Chrome – built directly into the browser, DevTools lets you examine various elements of a webpage to analyse its performance more carefully.  The quickest way to access it is by right-clicking and selecting Inspect.  The Timings section of the Performance panel includes an LCP marker and shows you which element is associated with LCP when you hover over the Related Node field.  The Experience section shows a CLS score with affected elements highlighted, so you can see where your problem areas are.

GTmetrix – a free service with a premium option that analyses your site’s performance and allows you to use multiple tools to build an overall view of which elements are causing load issues, in the form of a waterfall report.

To wrap up…

Although Core Web Vitals is a relatively new aspect of the Google Search Console report, the signs suggest that something like this has been in the pipeline for a number of years.

It will be interesting to see how older websites with clunky UI, and websites such as local news services that rely on large, intrusive, third-party ads to supplement their revenue, will fare as the creases are ironed out over the next few months.

Ultimately, this update comes down to providing web users with a quality experience – something that website owners should be striving for in the first place.

What’s different is that these new metrics come with clear thresholds for what Google deems to be good, acceptable and poor levels of performance – and if you don’t comply, there’s every chance that you may lose traction in search results.

If the everyday user becomes more aware of these changes, even subconsciously, and more demanding of a quick, high-quality online experience, then this should raise everyone’s game and produce a lot more quality websites across the board.

Future-proofing your website

If you’re unsure how your website is affected, do get in touch.  We can provide an in-depth performance report with a clear audit on which variables should be addressed first and how you can prepare yourself should there be any changes in Google’s search algorithms over the next months as a result of this new tool.

Whilst a lot of the work required to improve your website’s speed may require development, the framework surrounding Core Web Vitals provides a checklist of must-have, quick-win and ongoing performance improvements that can provide instant results.

Contact us today to find out how we can improve your website’s speed with a performance audit.

To learn more about how we can help to future-proof your website, get in touch using the form below.