Search Engine Optimization (SEO)

7 Quick SEO Wins to Improve Your Keyword Rankings

10.27.2022 16 Minutes

Marketers often look for quick SEO wins to catapult their pages up the rankings for high-value keywords. However, they sometimes miss out on the “why” of these strategies and what contextualizes their usefulness for individual sites. By exploring the “why” and the “how” behind these SEO tactics, marketers can better implement strategies that will have a greater positive impact on their website rankings.

In a recent article, SEO specialist Jason Barnard proposes that search engines such as Google and Bing take the individual ranking factors of a page (topicality, quality, speed, freshness, etc.) and combine them into a “bid,” or combined score, that is then used to rank individual pages in search engine results pages (SERPs).

The higher the bid, the higher the chance that a particular page will show up on the first page in related keyword searches.

Search engines will calculate this final bid by multiplying the scores of each individual ranking factor together. This means that any one inhibiting score can severely handicap a page’s overall bid. Therefore, it is better to score “well” across multiple ranking factors than it is to score above average in one factor but below average in other areas.

Achieving an exceptionally low score for a single ranking factor — such as site speed — will disproportionately affect your site’s rankings by canceling out any positive ranking factors you may have in other areas.

For this reason, the fastest way to improve your site’s rankings is often to improve your website’s scores across the seven most common ranking factor categories:

  • Topicality — The relevance of the page to the user and how well the page satisfies user intent.
  • Quality — The overall quality of the page, including the accuracy of the information, experience of the author, and the trustworthiness of the domain, among other factors.
  • Speed — How quickly the user can begin to interact with the page after clicking on the link in the SERPs.
  • Entities — The correlation between any known entities in the search and any known or associated entities found on the page or its related domain. For example, “baseball gloves” as an entity is related to the ideas of “baseball,” “baseball equipment,” and “sporting equipment.” Thus, a domain that talks about these related ideas is assumed to also have some knowledge about the core entity as well.
  • RankBrain — A machine learning algorithm that Google uses to process and sort search results. RankBrain uses contextual factors such as data from a page’s prior sessions, the user’s location, personalization, and the specific words used in the query to interpret the user’s true intent and deliver more relevant results based on the implied context of the search.
  • Structured Data — A way of helping Google and other search engines understand the content and contexts of a page. Structured data provides explicit clues (such as the publisher, author, or purpose of a page) to Google to help refine and target specific types of user intent.
  • Freshness — The relative “newness” of a page as it relates to the page’s accuracy and content. Pages for user searches about recent events (such as “what was the final score of the Bulls game tonight”) or topics that affect health and welfare (such as topics relating to finance, legal issues, or medical decisions) inherently require additional freshness when compared to searches for general information or historical events.

While the tactics in this article may target more than one of these ranking factor categories (a page’s content, for example, could contribute to a page’s topicality, quality, and freshness scores), it’s important to keep in mind that the goal is to improve your site’s lowest common denominator. This approach will improve the overall quality of your site by removing the negative factors holding it back.

By choosing a quantity over quality approach (i.e. choosing to resolve many SEO problems quickly over trying to optimize for one particularly difficult issue), you will see the greatest improvement in your rankings in the short-term while also positioning yourself to tackle larger SEO issues in the future — once you’re more established in the SERPs.

1. Set up analytics and track key performance indicators (KPIs)

It’s impossible to know where to focus your resources unless you understand what content is driving traffic to your website. If you haven’t already, start by making an account in Google Analytics (or another comparable website analytics tool) and installing the tracking code on your website.
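
For reference, installing Google Analytics 4 usually just means pasting Google’s gtag.js snippet into the <head> of every page, either directly or through a plugin or tag manager. A minimal version looks like the following, where “G-XXXXXXXXXX” is a placeholder for your own measurement ID:

    <!-- Google Analytics 4 (gtag.js); replace G-XXXXXXXXXX with your measurement ID -->
    <script async src="https://www.googletagmanager.com/gtag/js?id=G-XXXXXXXXXX"></script>
    <script>
      window.dataLayer = window.dataLayer || [];
      function gtag(){dataLayer.push(arguments);}
      gtag('js', new Date());
      gtag('config', 'G-XXXXXXXXXX');
    </script>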

You can gain valuable insights into your website’s core pages (and problems) by simply tracking metrics such as bounce rate and events such as user clicks. Specifically, you can use this information to identify some of the problems outlined below and track your progress towards fixing them.

Similarly, you should also make an account on Google Search Console so you can track how your pages are performing in the SERPs. This is also a required step for several of the topics below (such as resolving indexation bugs) and is generally a best practice for all business websites.

2. Review your site’s indexation

The primary purpose of SEO is to make your pages and website more visible in search results. This means that the first SEO quick win should be to confirm (1) that crawlers such as Google can properly read and index your site, and (2) that search engines are indexing the right pages on your site.

Your first step, then, is to make sure your XML sitemap and robots.txt files are published and publicly visible on your website.

Your XML sitemap is the document that Google and other search engines will read to figure out which pages you deem important enough to show in search results. It also provides valuable information such as when each page was last updated and how many images it contains.
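
For reference, an entry in a standard XML sitemap follows the sitemaps.org format and looks something like this (the URL and date below are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://example.com/services/</loc>
        <lastmod>2022-10-27</lastmod>
      </url>
    </urlset>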

Your robots.txt file is how you can keep certain content such as tag pages or the WP Admin dashboard from being viewed by web crawlers. This file can even exclude certain crawlers from viewing your site — generally as a way of limiting server requests from non-essential web crawlers.

To check whether these are present on your site, go to your homepage and then add “/sitemap.xml” to the end of the URL in your browser (e.g. “example.com/sitemap.xml”). If a page opens that lists all of the public pages on your website, then you have a sitemap installed. If not, you should consider uploading a sitemap yourself. Or, if you’re using WordPress, you should consider installing a plugin such as Yoast or RankMath that can create and publish a sitemap for you.

Once you confirm that your sitemap is present, perform the same steps but for “/robots.txt” to make sure this file is also publicly visible on your site. Check the contents of this file to ensure that crawlers are blocked from reading anything in the /wp-admin/ directory except for functionality-related files such as admin-ajax.php. Note that this is the default setup for all new WordPress sites, so you may already have such a file on your site.
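
For reference, the default robots.txt that WordPress generates looks roughly like this; it blocks the admin area while still allowing admin-ajax.php, which many front-end features rely on. The sitemap line is optional, and example.com is a placeholder for your own domain:

    User-agent: *
    Disallow: /wp-admin/
    Allow: /wp-admin/admin-ajax.php

    # Optional: point crawlers at your sitemap
    Sitemap: https://example.com/sitemap.xml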

After you’ve confirmed both files are published on your website, return to Google Search Console and check that your sitemap is submitted under Index > Sitemaps. Google Search Console will then show which pages are being indexed for search results.

Moving forward, spend 5-10 minutes every month checking Google Search Console for crawling errors and other bugs that may harm your website’s presence in the SERPs.

3. Noindex low-value pages that do not satisfy search intent

Completing the previous task will naturally lead into the topic of noindex pages.

As a quick summary, Google and other search engines will “index” any page they come across unless specifically told otherwise. For site-wide applications we can use the sitemap to prioritize the indexation of certain pages. We can then use the robots.txt file to create site-wide rules for indexing, such as blocking access to certain directories or pages. This will naturally funnel Google to the pages you want to show in search results.

However, you should note that Google can still index pages even if you don’t include them in your sitemap. For example, Google may find links to these otherwise hidden pages on other domains. To specifically exclude pages from the search results we need two things to happen.

First, the page must not be blocked by robots.txt, meaning search engines can crawl the page and read its contents. Second, the page must include a noindex meta tag or header to tell the search engine to never index that particular page.
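
In practice, the noindex directive can be delivered either as a meta tag in the page’s <head>:

    <meta name="robots" content="noindex">

or as an HTTP response header sent with the page:

    X-Robots-Tag: noindex

Most WordPress SEO plugins expose this as a per-page setting, so you rarely need to add the tag or header by hand.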

The benefit of this strategy is that we are limiting the pages that are indexed by search engines to only the pages that satisfy user intent. If a page has little to no value to the average member of your audience, then indexing it will only result in a poor user experience.

It’s common for sites to noindex pages that have value to users who are already on the website but may not have value to users who are entering the website for the first time.

Commonly, this means adding noindex tags to author archives, certain custom post types (such as testimonials), duplicate or thin content (such as automatically generated pages), admin and login pages, internal search results, tag pages, category pages (depending on how you use them), or any other pages that you want to keep out of search results.

As is the common theme of this article, this will leave only your strongest results in the SERPs, leading to higher and more accurate average engagement metrics and a better experience for your users.

4. Optimize low-hanging fruit for Google’s Core Web Vitals

As mentioned above, page speed and user experience are two of the key factors that Google and other search engines will consider when ranking a page in the SERPs. Or, put more accurately, pages that have slower loading times and/or unintuitive layouts are more likely to lead to page bounces and dissatisfied users, and Google will correlate this information with the idea that a page or domain is of a lower quality.

For this reason, it is crucial that you ensure your page scores at least close to average in Google’s three Core Web Vitals:

  • Largest Contentful Paint (LCP) — The amount of time it takes to render the largest element in the user’s viewport, measured from the time the user requests the URL. This is typically a large image or video file.
  • First Input Delay (FID) — The time from when a user first interacts with your page to the time when the browser responds to that interaction. Basically, how quickly the page becomes interactive.
  • Cumulative Layout Shift (CLS) — The sum of all individual layout shifts across the entire page, where zero means no shifting and higher scores mean larger shifts are present. Having page elements shift while the user is trying to use the page is generally a bad user experience.

You can check your site’s historic web vitals in Google Search Console under Experience > Core Web Vitals. You can also manually review your page performance metrics using the web.dev measure tool.

It’s important to take steps to fix any metrics that score as “Poor” on this report, as they are actively harming the overall quality of your site. Often, you can resolve these issues by targeting and fixing the most common offenders that slow down sites: images, unessential or unminified JavaScript, and font loading issues.

Optimize your image sizes before uploading, and consider serving images in next-gen formats such as WebP, lazy loading images that appear further down the page, and/or compressing images using a tool such as Imagify. These strategies can help lower the overall weight of the images on your site, thus lowering your average LCP by reducing the size of the page when it first loads in the user’s browser.
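
As a simple illustration (the file names here are placeholders), an image further down the page might be served with a WebP alternative, native lazy loading, and explicit dimensions:

    <picture>
      <source srcset="team-photo.webp" type="image/webp">
      <img src="team-photo.jpg" alt="Our team at the annual retreat"
           width="800" height="533" loading="lazy">
    </picture>

The loading="lazy" attribute delays the download until the image is about to scroll into view, and the explicit width and height let the browser reserve space for it, which also helps your CLS score.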

In general, take advantage of resources that can identify issues on your website and provide recommendations, like the Core Web Vitals test. It’s also helpful to use optimization and caching plugins such as Imagify and WP Rocket on your website to help reduce the overall weight of your pages on load.

This improves user experience and indicates to search engines such as Google that your site is of a higher overall quality. Finding ways to minify, defer, or otherwise reduce the loading time and/or layout shift of code and fonts is another great way of reducing the time it takes for your pages to become interactive.

5. Improve your site’s security

Security vulnerabilities such as the lack of an SSL certificate, a weak or poorly installed firewall, outdated plugins and themes, and even data integrity issues such as weak passwords on your admin account are all risks that commonly lead to hacks, loss of user data, and other poor user experiences.

There are two quick SEO wins you can implement to improve your site’s security and show search engines that your site is safe for users.

First, you must implement an SSL certificate on your website and redirect all HTTP URLs to their HTTPS counterparts. SSL certificates are a way of ensuring that any data a user uploads or types into your website is encrypted as it travels from the user’s browser to the server where you store the data. Failing to protect user data in this manner is regarded as a negative ranking factor by search engines.
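
Many hosts and SSL plugins will set up the redirect for you. On an Apache server it can also be added by hand to the site’s .htaccess file with a rule along these lines (a sketch, assuming mod_rewrite is enabled):

    # Redirect all HTTP requests to their HTTPS equivalents
    RewriteEngine On
    RewriteCond %{HTTPS} off
    RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]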

Second, you should regularly update all themes, plugins, and other resources that you use on your site to improve your site’s overall security. Many of these updates include bug fixes and patches that address vulnerabilities that could leave your site open to hackers or bots. While such updates aren’t necessarily a ranking factor, needing to take your site down for several days to resolve the fallout of a security breach will most certainly influence your site’s rankings in the long-term. Regularly check your CMS and plugins to make sure they are up to date and running smoothly.

6. Implement structured data and schema

Imagine for a moment that you searched for a banana bread recipe on Google. Once the page loads, you notice that the entire site and even the recipe is written exclusively in French. You may be able to make out some of the words, and there may be clues on the page that can help you understand what’s going on, but it’s clear that you’ll never really know what the page is saying unless you learn how to read French.

Now, imagine you go through the same process again. However, this time the page has a box at the very top, in English, that describes who made the recipe, what ingredients you’ll need, and outlines the various steps you need to follow to make the banana bread. You still can’t understand the French, but the section in English gives enough contextual clues for you to piece together what the page is trying to communicate.

This example is, effectively, what structured data and schema try to do for web crawlers. No matter how intelligent a crawler might seem, it still can’t “read” basic English in the same way that you or I can. For this reason, web developers add machine-friendly content to their websites to help crawlers read the page and contextualize its content in a way that is helpful to users in the SERPs.

For example, on a recipe page such as the banana bread example from above, the “English box” equivalent for a web crawler would be a block of JSON-LD or another form of markup that explains (1) what kind of site the recipe is published on, (2) who wrote the content, and (3) how the search engine can best present the recipe to users.

So, the website might mark up their page in a format like this:

  • Page Type = Recipe
  • Description = Tasty banana bread made using classic ingredients and baking methods.
  • Name = My Grandma’s Best Banana Bread
  • Publisher = circle S studio
  • Author = Andrew Michael
  • Ingredients = Banana; flour; butter; eggs…
  • Instructions = Begin by peeling the bananas…
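
In actual markup, this information is most often expressed as JSON-LD using schema.org’s Recipe type. A sketch of the equivalent structured data (the values are illustrative) might look like this:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Recipe",
      "name": "My Grandma's Best Banana Bread",
      "description": "Tasty banana bread made using classic ingredients and baking methods.",
      "author": { "@type": "Person", "name": "Andrew Michael" },
      "publisher": { "@type": "Organization", "name": "circle S studio" },
      "recipeIngredient": ["Bananas", "Flour", "Butter", "Eggs"],
      "recipeInstructions": [
        { "@type": "HowToStep", "text": "Begin by peeling the bananas." }
      ]
    }
    </script>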

With this data, it’s not necessary to read the entire article to understand the recipe. Simply having access to this core information (author, ingredients, instructions) can give you enough clues to “read” the recipe’s core elements. By explicitly organizing all the important information in key-value pairs, you can provide numerous clues to search engines about the purpose and origin of a page.

Structured data for a company website might include information such as business type, email address, logo file, and social media profiles. When a crawler lands on your website, it will read this structured data section and gain a better understanding of your company. Google and other search engines can use this information to better direct users to your website (relevance + trustworthiness) while also improving your click-through rates through schema-specific rich results such as publication dates and review scores.
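
For a company website, the equivalent is typically Organization (or LocalBusiness) markup. A minimal sketch, with placeholder values, might look like this:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Organization",
      "name": "Example Company",
      "url": "https://example.com",
      "logo": "https://example.com/images/logo.png",
      "email": "hello@example.com",
      "sameAs": [
        "https://www.linkedin.com/company/example-company",
        "https://twitter.com/examplecompany"
      ]
    }
    </script>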

Best of all, implementing structured data on your site is exceptionally easy, especially if you’re using a WordPress SEO plugin such as Yoast or RankMath. These tools can generate structured data automatically, making it simple to include important brand information on all relevant pages of your site.

This can all improve the correlation between your business’s Entity in the SERPs and the specific information you’re presenting on your website, thus increasing your website’s overall quality, relevance, and trustworthiness in the eyes of search engines such as Google.

7. Review title tags and descriptions for duplicates and keyword opportunities

You can quickly and easily improve the overall quality and relevance of your website by aligning each of your core pages around one central user goal. Every page on your website should have a specific, unique purpose. Properly conveying this purpose to search engines can help improve the overall relevance score of each of your website pages.

If someone searches for information about your business, you want to direct them to your “about” or “home” pages. If someone wants to learn more about a service you’re offering, you want them to land on your services page. Optimizing this experience can naturally lead to more satisfied users, fewer bounces, and a longer average time on page.

Take a look at your website’s core pages, identify the primary purpose of each page, and make sure that purpose is accurately communicated in the page’s content. This includes editing title tags, meta descriptions, image alt tags, headers, and general body content. All this information should relate back to the core purpose of educating your audience about your company, brand, and experience.
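
As a small illustration (the company and page names are made up), a services page built around one clear purpose might carry a title tag, meta description, and image alt text that all reinforce that purpose:

    <title>Brand Strategy Services | Example Agency</title>
    <meta name="description" content="See how Example Agency helps professional services firms build brand strategies that win new business.">

    <!-- Further down the page, image alt text reinforces the same purpose -->
    <img src="brand-workshop.jpg" alt="Example Agency strategists leading a brand strategy workshop">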

It’s not necessary to make major changes to the copy to see improvements in your overall page metrics. Simply ensuring that the overall direction of your content is in line with user intent by making small changes to your titles, descriptions, and headers is generally enough to bring the core pages up to speed.

Devote regular time to improving your site

You can make real, holistic improvements to your website by devoting an hour every day, week, or month towards improving your SEO. Begin with key SEO problems that are common across all industries. After resolving these errors and implementing baseline best practices, start optimizing your site for specific keywords or audiences.

Remember, a site that is slightly above average in every factor will always rank above a site that only scores well in a single area. Taking the time to optimize for a broad range of ranking factors is a great way to improve your site’s overall quality and provide superior experiences to users.
