A Technical SEO Checklist (2021)

Sep 10, 2021

Although there are many moving parts to an SEO strategy, ticking off your technical SEO checklist will help you build a solid foundation for your site and ensure that it is optimized to be crawled and indexed. 

Google is now hyper-focused on the user experience, and there are several signals it uses to flag a poor one. Working through these technical issues will allow your site to be categorized and favoured by Google, and ultimately make your content more visible. 

Regardless of your industry, having sound technical SEO as the foundation of your marketing strategy will allow the content you are creating to be discovered. With that in mind, here are some ways you can ensure that your website is set up appropriately to be crawled by the bots. 

Find and Fix Crawl Errors

Google Search Console can help you identify any crawl errors. When checking the Coverage Report you will be able to identify both errors and excluded pages on your site. Take the time to explore why these are being excluded. Resolving any issues will improve your crawl rate and help search engines to better index your website. Ensure that you: 

  • Correctly implement all permanent redirects as 301s (a server-level sketch follows this list).
  • Go through any 4xx and 5xx error pages to identify potential redirection opportunities.
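
For example, on an Apache server a permanent redirect can be declared in the site's .htaccess file. This is a minimal sketch, assuming Apache with mod_alias enabled and using placeholder paths:

    # .htaccess — permanently redirect a removed page to its replacement
    # (both paths are placeholders; substitute your own URLs)
    Redirect 301 /old-page/ https://www.example.com/new-page/

Other servers have equivalents; nginx, for instance, uses a return 301 directive inside a location block.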

Improve Your Site Speed 

It’s proven that people will not wait around for your site to load: 53% of mobile users bounce if a page doesn’t appear within 3 seconds, so you will want to make sure that your pages load quickly. An ideal page load time is less than 2 seconds. Load times can be affected by large image files, redirect loops and chains, server capacity, and more. Use our free PageSpeed Insights Tool to crawl multiple URLs and identify areas of your site that may be slowing down your loading times.
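
Google also exposes PageSpeed Insights as a public API if you prefer to script these checks. A minimal query, assuming curl is installed and using a placeholder URL (an API key is optional for light use):

    curl "https://www.googleapis.com/pagespeedonline/v5/runPagespeed?url=https://www.example.com/&strategy=mobile"

The JSON response includes lab metrics such as First Contentful Paint that you can track over time.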

Fix Broken Links 

Broken links are another signal to Google of a poor user experience. A list of your site’s broken links can be found in your Site Audit Report. These can be reconciled easily by either replacing the broken link with a working one or removing it altogether.  
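
If you want to spot-check a single page yourself, a short script will do it. Here is a minimal sketch in Python, assuming the third-party requests and beautifulsoup4 packages are installed and using a placeholder URL:

    # Check every link on one page and report those that fail to load.
    import requests
    from bs4 import BeautifulSoup
    from urllib.parse import urljoin

    page = "https://www.example.com/"  # placeholder: the page to check
    html = requests.get(page, timeout=10).text
    for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
        url = urljoin(page, a["href"])  # resolve relative links
        try:
            # HEAD is lighter than GET, though a few servers reject it
            status = requests.head(url, allow_redirects=True, timeout=10).status_code
        except requests.RequestException:
            status = None  # could not connect at all
        if status is None or status >= 400:
            print(status, url)

This only covers one page; a site-wide audit tool remains the easier option for a full crawl.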

Secure Your Site 

Using a security certificate (SSL encryption) on your site has been a known ranking factor since 2014, when Google announced HTTPS as a ranking signal. Secure websites protect the integrity of user data collected on your website. This means your site can be trusted with sensitive information like email addresses, credit card numbers, etc. Regardless of the content on your website, Google recommends an SSL certificate on every page to ensure it is secure. These can be easily purchased through your hosting service provider. 
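
Once the certificate is installed, make sure all traffic actually uses it. A minimal sketch for an Apache server, assuming mod_rewrite is enabled:

    # .htaccess — send every HTTP request to the HTTPS version of the same URL
    RewriteEngine On
    RewriteCond %{HTTPS} off
    RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]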

Ensure that Your Website is Mobile Friendly 

Google has moved to mobile-first indexing for all websites. That means Google looks at the mobile version of your site before the desktop version when indexing. If you have not ensured that your site is mobile-friendly, it will directly affect your organic search visibility. You can test whether or not your site is mobile friendly by using Google’s Mobile-Friendly Test tool.
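
One common quick win is the responsive viewport declaration in each page's <head>; a minimal example:

    <!-- Tells mobile browsers to size the page to the device width
         instead of rendering a scaled-down desktop layout -->
    <meta name="viewport" content="width=device-width, initial-scale=1" />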

Set up a 301 Redirect to the Correct Version of Your Site 

Your site is typically reachable under multiple versions of its domain name, i.e.:

https://www.jonsmith.com
https://jonsmith.com
http://www.jonsmith.com
http://jonsmith.com

You will want to make it very clear to Google which version to consider for indexing, to avoid duplicate content and pages. This can be done by: 

  • Setting up 301 redirects to the primary version of the URL (see the sketch after this list).
  • Adding canonical tags on all pages.
  • Setting pages to ‘noindex’ if and when appropriate, such as category or login pages.
  • Setting up the preferred domain in Google Search Console.
  • Consolidating any duplicate content when possible.
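
For example, if https://www.jonsmith.com is the preferred version, the redirects can be handled at the server. A minimal sketch for Apache, assuming mod_rewrite is enabled:

    # .htaccess — collapse http:// and non-www variants into the preferred URL
    RewriteEngine On
    RewriteCond %{HTTPS} off [OR]
    RewriteCond %{HTTP_HOST} ^jonsmith\.com$ [NC]
    RewriteRule ^(.*)$ https://www.jonsmith.com/$1 [L,R=301]

Pair this with a canonical tag in each page's <head>, e.g. <link rel="canonical" href="https://www.jonsmith.com/some-page/" />, so every variant points at a single indexable URL.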

Ensure Your Website has an XML Sitemap

An XML sitemap clearly outlines the structure of your website, and as the name suggests, it’s a map for search engine bots to follow in order to properly understand and index your website. This is especially helpful if you have a website with many pages. It saves search engines from guessing the hierarchy of your site and tells them exactly which pages are the most important (and therefore should be crawled and indexed first); a minimal example follows the list below. Update your sitemap regularly by adding: 

  • Any new content, such as blogs, products, promotions, etc.
  • Only URLs with a 200 server status response
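
As a sketch, using the jonsmith.com example from above with placeholder URLs and dates, a sitemap.xml looks like this:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.jonsmith.com/</loc>
        <lastmod>2021-09-10</lastmod>
      </url>
      <url>
        <loc>https://www.jonsmith.com/blog/</loc>
        <lastmod>2021-09-01</lastmod>
      </url>
    </urlset>

Reference it from your robots.txt file or submit it in Google Search Console so crawlers can find it.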

Ensure Your Site has an Optimized Robots.txt File

A robots.txt file provides instructions to search engines on how to crawl your site. This will help optimize your crawl budget and ensure that your most important pages are being crawled and indexed. Additionally, disallow any pages that you don’t want crawled. Examples of assets to exclude in your robots.txt file (a sample file follows this list) are: 

  • Temporary files
  • Admin pages
  • Cart & checkout pages
  • Search-related pages
  • URLs that contain parameters
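
Here is a minimal robots.txt sketch mirroring the exclusions above, assuming hypothetical paths on jonsmith.com:

    # Applies to all crawlers
    User-agent: *
    Disallow: /tmp/
    Disallow: /admin/
    Disallow: /cart/
    Disallow: /checkout/
    Disallow: /search/
    Disallow: /*?*

    Sitemap: https://www.jonsmith.com/sitemap.xml

Note that Disallow only discourages crawling; a blocked page can still end up indexed if other sites link to it, so use a ‘noindex’ tag where that really matters.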

Great SEO starts with a solid foundation, and ensuring that all of these technical measures are taken is a great first step in allowing your content to be discovered by search engines. Want more SEO news? Check out weekly content on our blog or get in touch directly. We’re always happy to chat.  
