At Kanuka Digital, we know a thing or two about SEO, and people ask us all the time for a primer on SEO basics. We’ve compiled some of our top tips in one handy guide for you to reference.
By the time you reach the end of this SEO basics guide, you’ll have a basic understanding of what SEO is, why it’s important, how to do SEO and how to get great results in an ever-changing SEO environment.
What you’ll learn:
1. What is SEO & Why is it Important?
2. Keyword Research & Keyword Targeting Best Practices
3. On-Page Optimisation Best Practices
4. Common Technical SEO Issues & Best Practices
5. Link Building Best Practices
6. Additional SEO Considerations (International & Local SEO Best Practices)
You’ve likely heard of SEO, and you likely understand that SEO is the process of affecting the visibility of a web page in an unpaid search engine results page (SERP). However, a definition alone doesn’t really answer the important questions for your business and your website.
What’s likely more interesting to you is how SEO can help drive relevant traffic, leads, sales and revenue for your business. That’s what we’ll focus on in this guide.
Search engines work through three primary functions: crawling, indexing and ranking.
When search engines crawl your site, they send out a team of bots (known as crawlers or spiders) to find new and updated content.
These crawlers begin by fetching a few web pages, then they follow the links on those pages to find new URLs.
As a result of hopping along this path of links, the crawler is able to find new content to add to the search engine’s index. That content can later be retrieved when a searcher is seeking information it’s a good match for.
Once your site has been crawled, your valuable content is stored and organised in the index database. Once a page is in the index, it can potentially be displayed as a result for relevant search queries.
When a search is performed, search engines scour their index for the most relevant content. They then order that content to solve the searcher’s query. This ordering by relevance is referred to as ranking.
Generally, you can assume that the higher a website is ranked, the more relevant the search engine believes that site is to the query.
However, if your content can’t be crawled and indexed in the first two stages, your site will not rank in the SERPs.
Remember: Even if a page is crawlable and has been indexed, it won’t necessarily make it to page 1 as there are many factors and competitors also doing SEO!
When users visit your page through organic search, they are looking to answer a query. SEO allows you to create a site that searchers may find valuable or useful, improving rankings and conversions.
Unlike paid ads, SEO is also low cost: you don’t need an advertising budget to drive organic traffic to your site.
As well as this, people are more likely to visit sites that they can trust (and Google tends to rank these sites higher), so it’s a no-brainer that we should be optimising our content for search engines.
The first step in SEO is to determine what it is you’re optimising for. This involves understanding the search terms (known as keywords) that you want your website to rank for in search engines.
However, there are a few factors to consider when determining the keywords you want your pages to rank for.
Once you have some keywords in mind, you’ll want to get into the mindset of the user. You need to know what they’re searching for, why they’re searching for it and the words they use to describe it.
Once you’ve answered these questions, you’ll have an initial list of potential keywords. This will help you get additional keyword ideas.
Now that you have an idea of what users are searching for, you can use various tools and software to conduct relevant keyword research. You’ll find a list of our favourite keyword research tools in the free SEO tools section at the end of this guide.
Once you have your keyword list, you need to implement your keywords into your content. Each page on your site should target a core term and a handful of related terms.
Remember: it’s important to create content for a positive user experience, not solely to benefit your SEO.
Now let’s look at a few basic, but critical, on-page elements to understand how to drive search engine traffic to your website.
Related: Read our Beginner’s Guide to Writing Metadata
The title tag is not your page’s primary headline. The headline you see on the page is typically an H1 HTML element. The title tag is what you see at the very top of your browser tab.
Including the term that you want to rank for in your pages is valuable, and the most impactful place you can put your keyword is your page’s title tag.
Google truncates title tags based on pixel width rather than character count, but generally 55-60 characters is a good rule of thumb. Where possible, integrate your main keyword into your title tag in a natural, compelling and user-friendly way.
Title tags are the “headline” in organic search results, so think about how clickable your title tag is.
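For illustration, a page targeting a hypothetical keyword such as “women’s running shoes” might use a title tag like this (the keyword and brand are placeholders):

<head>
  <!-- Main keyword placed naturally at the start of the title -->
  <title>Women's Running Shoes | Free UK Delivery | Example Store</title>
</head>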
Although the title tag is your search listing’s headline, the meta description is effectively your site’s additional ad copy.
Google may not always display the meta description that you set, but if you have a compelling description of your page that would make searchers likely to click, you can greatly increase traffic.
Remember: showing up in search results is just the first step! You still need to get searchers to come to your site, and then actually take the action you want.
Make sure that every page on your site has its own unique meta description. Meta descriptions should be 50-160 characters long.
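In the page’s HTML, the meta description sits in the <head> and might look something like this (the copy is a placeholder):

<head>
  <!-- Unique, compelling summary of the page, roughly 50-160 characters -->
  <meta name="description" content="Shop our range of women's running shoes with free UK delivery and easy returns. Find the right fit for road, trail and track.">
</head>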
Different types of pages will have different roles. Cornerstone content that you want many people to link to should be different from your support content that you want people to find and get an answer from quickly.
That said, Google increasingly favours certain types of content, so there are a few things to keep in mind:
Thick & Unique Content – there is no specific number when it comes to word count on your site. However, if you have a large number of pages with 50-200 words or duplicated content, that could get you into trouble. If a large percentage of your site contains thin, duplicated and low value content, identify a way to “thicken” those pages. You can also check your analytics to see how much traffic they’re getting, and exclude them (using a noindex meta tag) to stop them from appearing on search results.
Engagement – over recent years, Google has put more weight on engagement and user experience metrics. Your content should answer the questions searchers are asking so that they stay on your page and engage with it. Ensure that your pages load quickly and are not designed in a way that would disengage searchers and send them away.
Shareability – in the same way you want to be careful about producing large quantities of thin content, you also need to consider who is likely to share and link to the new pages you’re creating. Pages that aren’t likely to be shared or linked to are less likely to rank well in the SERPs.
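As a quick sketch, a thin or low-value page can be kept out of the search results by adding a robots meta tag to its <head>, for example:

<head>
  <!-- Ask search engines not to index this page, but still follow its links -->
  <meta name="robots" content="noindex, follow">
</head>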
The alt attribute is an HTML attribute that allows you to provide alternative text for an image if a user is unable to view it.
Your images may break over time, so having a useful description of the image can be helpful from an overall usability perspective. This also gives you another opportunity to help search engines understand what your page is about.
The way you mark up your images can impact the way that search engines perceive your page, as well as how much search traffic from images your site generates.
Make sure you don’t stuff your core keyword and every possible variation of it into your alt attribute. As a rule of thumb, if it doesn’t fit naturally, don’t include your target keyword here.
By writing naturally about your topic, you’re avoiding over-optimisation filters, giving you a better chance to rank for valuable, long tail variations of your core topic.
Be sure not to skip the alt attribute though! Try to give a thorough, accurate description of the image (you’re essentially describing it to someone who can’t see it).
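For example, an image on a hypothetical product page might be marked up like this (the file name and description are placeholders):

<!-- The alt text describes the image for users and search engines -->
<img src="/images/brown-leather-satchel.jpg" alt="Brown leather satchel with brass buckles, shown from the front">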
Your site’s URL structure is important from a tracking perspective and a shareability standpoint.
Don’t cram in as many keywords as possible; create a short, descriptive URL.
Moreover, don’t change your URL if you don’t have to. Even if your URLs aren’t visually appealing, if you don’t feel as though they’re negatively impacting users and your business in general, don’t change them.
If you do have to change your URL structure, use a 301 (permanent) redirect to point each old URL to its new location. Failing to do this is a common mistake businesses make when they redesign their websites.
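How you set up a redirect depends on your platform; as a sketch, on an Apache server a single 301 redirect in the .htaccess file might look like this (both URLs are placeholders):

# Permanently redirect the old URL to its new location
Redirect 301 /old-page https://www.yourdomain.com/new-page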
Finally, once you have all of the standard on-page elements taken care of, you can consider going a step further and better helping search engines to understand your page.
Schema markup isn’t currently a ranking factor. However, it does give your listing some additional “real estate” in the search results, the way ad extensions do for Google Ads.
If no one else is using schema, you can gain an advantage in CTR because your site will show things like ratings while others won’t.
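For instance, a product page could describe itself to search engines with a JSON-LD snippet like the one below (the product name, rating and review count are placeholders):

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Product",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.7",
    "reviewCount": "89"
  }
}
</script>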
While the basics of SEO have changed in recent years, traditional SEO is still incredibly important when generating traffic from search engines.
Technical SEO is really its own discipline, but there are some common mistakes and issues that many sites face that you should be aware of.
If you haven’t set up Google Search Console (GSC) already, make sure you do! This free tool helps you identify any problems when indexing or crawling your site.
Go to the GSC main page and enter your domain. You’ll then see several options for verification. Select the most suitable option and verify your site.
Use the coverage report to check that there are no issues that are preventing your site from being crawled and indexed. You can do this by selecting the report from the sidebar menu on the left.
A sitemap tells crawlers which pages and files you think are important on your site.
For most websites, you’ll find that this sitemap sits at:
https://www.yourdomain.com/sitemap.xml
Note: This can differ between platforms
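A very simple XML sitemap might look something like this (the URL and date are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry for each page you want search engines to find -->
  <url>
    <loc>https://www.yourdomain.com/example-page/</loc>
    <lastmod>2021-01-15</lastmod>
  </url>
</urlset>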
If you need to create a sitemap, you can generate one (most platforms and SEO plugins can do this for you) and then submit it to Google Search Console via the Sitemaps report.
Follow the steps below to add a new XML sitemap in Magento 2:
1. Navigate to Marketing > SEO & Search > Site Map.
2. Select an existing sitemap to edit or, if you don’t have an XML sitemap yet, select the ‘Add Sitemap’ button.
3. Set the XML file name in the ‘Filename’ field, enter the ‘Path’ where it will be located, then ‘Save & Generate’.
Now you’ll see the list of sitemaps.
4. Select ‘Generate’ to generate the XML sitemap file.
As standard, a Magento 2 XML sitemap consists of links for products, product images, categories and static pages.
Search engines are placing an increasing emphasis on having fast-loading sites. The good news is this is not only beneficial for search engines, but it can lower bounce rates and increase conversion rates.
Google’s PageSpeed Insights tool gives you some specific suggestions on what to change on your site to reduce your load time.
Related: Why Site Speed Matters
HTTPS is the secure version of HTTP. HTTPS is encrypted to increase security of data transfer, which is particularly important for eCommerce sites.
Run a check to see if you can access your site using https:// rather than http://.
If you can, that’s great! There’s no further action needed. If you find that your site is still on HTTP, speak with your developer to migrate to HTTPS.
Thin and duplicate content is another area of emphasis.
By having the same or near-identical content on multiple pages, you’re diluting link equity between multiple pages as opposed to concentrating it on one page. As a result, you have less chance to rank for competitive phrases compared to sites that are consolidating their link equity into a single page.
In the eyes of search engines, having large quantities of duplicated content makes your site look cluttered with lower-quality content.
There are a number of free tools available that will check whether you have duplicate content on your site.
Your site’s robots.txt file can be found at https://www.yourdomain.com/robots.txt and will look something like this:
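(This is a minimal, illustrative example; the directives on your site will vary.)

# Allow all bots to crawl the whole site
User-agent: *
Disallow:

# Tell crawlers where to find your XML sitemap
Sitemap: https://www.yourdomain.com/sitemap.xml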
If you don’t have a robots.txt file, make sure you create one.
You may be asking: why do I need a robots.txt file? The answer is that robots.txt instructs bots on how to crawl your site. For example, if pages that bots shouldn’t crawl aren’t blocked, this can result in too many pages (either duplicates or low-value pages) being indexed.
Since many search engine algorithms are largely based on links, having high-quality links to your site is incredibly important when driving search traffic.
You can do all the work on your on-page and technical SEO, but if you don’t have links to your site, you’ll struggle to compete in the search results listings.
There are a number of ways to get links to your site, but many of them have become extremely risky. If you’re looking to leverage this channel, the riskier and more aggressive methods of getting links likely aren’t a good fit for your business.
Furthermore, trying to create links specifically to manipulate rankings doesn’t create any other value for your business in the event that the search engine algorithms shift and your rankings disappear.
Bad links can result in a site being penalised, causing a huge drop in organic traffic. You then need to clean up the link profile and wait for Google to process it, which can take a while.
A more sustainable way to develop links is to focus on broader marketing approaches. For example, you could create and promote useful content that targets the terms you want to rank for, and engage in traditional PR.
There are some search environments that require unique approaches. These include international SEO, for sites targeting audiences in different countries or languages, and local SEO, for businesses targeting searchers in a specific geographic area.
Once you have started writing great SEO content and putting all of the above steps into motion, how do you track your SEO success?
The answer is to track the amount of organic search traffic to your website. If you’re using Google Analytics, you can view your organic traffic by navigating to:
Acquisition > All Traffic > Channels > Organic Search
Important: One of the difficulties with SEO is that we don’t have keyword data in Google Analytics – the Query Report in Google Search Console is as close as it gets. We get organic clicks, organic impressions, organic CTR and organic rank for all the queries that had organic impressions. However, we can’t really tell that X keyword has brought X revenue.
Below are some of our favourite free SEO tools that you can use to help with your SEO success.
Keyword research tools
Google Keyword Planner – Helps you to find unique keyword ideas.
Keyword Generator – Generates hundreds of keyword ideas that you could use.
SERP Checker – Gives you the estimated search traffic potential for the term based on the top three results.
Keyword Difficulty Checker – Allows you to check the Keyword Difficulty score for the keyword.
On-page SEO tools
Yoast SEO – A WordPress plugin to add title tags and meta descriptions to posts and pages.
Ahrefs’ Webmaster Tools – Finds missing title tags, meta descriptions and alt attributes on your site.
Link building tools
Ahrefs’ Backlink Checker – See the top 100 backlinks to any website or page.
Gmail – No fancy outreach software needed; simply send outreach emails to link prospects from your inbox.
Technical SEO tools
Google Search Console – This highly useful tool discovers index coverage errors, page speed issues and more.
Ahrefs Webmaster Tools – Finds SEO issues on your website, including many technical issues.
Everything above should serve as a pretty thorough introduction to the basics of SEO. We understand that there’s a lot to take in, so feel free to save this article and work through the process of SEO step by step.
If you have any questions, don’t hesitate to get in touch!
Drop us a line on 01785 279985
Send us an email hello@kanukadigital.com