
HCU and PRU – These Recent Google Search Updates Can’t Be Gamed


There’s a trend emerging with the latest Google updates: it’s gotten to the point where you really can’t figure them out. Sure, there will always be SEO pundits claiming otherwise. But nothing could be further from the truth.

Here’s why.

Google has become so confident that they’ve gone from their old mantra of discrete updates and rollouts to this multilevel machine learning organism. Its unparalleled sophistication means it can get down to the query level, or the search result level on a particular device, with a completely different algorithm.

And so what you’re seeing over the last few years, and especially the last year, is Google being very confident. In effect they’re saying: here’s the problem, here’s the solution, and we’re launching it as an update.

Think of Core Web Vitals. Google says they don’t like bad websites and they give you metrics to monitor. But they don’t reveal how they implement it, just that they don’t like it and you should get going on fixing it now.

Same thing with the Product Reviews Update (PRU). Google sees affiliates getting rich off of really bad content. Google says it’s a big problem and that they’re creating a solution.

What happens?

They release an update that’s more granular than anything they’ve done before. Times are changing!

Helpful Content Update

Now we have the Helpful Content Update (HCU). Google says there’s a lot of low quality content out there. So they create a set of guidelines and this time they say that they’re starting to monitor and update it.

What’s interesting about this update is that, although it performs a page-level analysis, it rolls this up into a site-wide factor. They’re looking at every single page and building a composite score that relates to a page being some degree of helpful or not.

They’re actually looking at page-topic pairs: how many are good, how many are bad, and the percentage of good to bad.


There are some other factors involved, like how powerful the pages are. But basically they evaluate whether you have more good content than bad content. More importantly, they’re able to move that bar and say this is a good site or this is a bad site: the site-level weight.

So the easiest way to think about that is a hot air balloon floating in the sky where the Helpful Content Update can weigh down the entire hot air balloon. It could be a big weight or it could be a small weight based on the way they implement it. Very similar to Core Web Vitals in its implementation. But that is the current state of this update.
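If it helps to see the shape of that mechanism, here’s a minimal Python sketch of the roll-up. Everything in it is an assumption for illustration: the helpful/unhelpful labels, the page weights, and the bar are stand-ins, not Google’s actual signals or formula.

```python
from dataclasses import dataclass

@dataclass
class Page:
    url: str
    topic: str
    helpful: bool   # stand-in for a page-level helpfulness verdict
    weight: float   # stand-in for page importance (e.g., internal authority)

def site_level_drag(pages: list[Page], bar: float = 0.5) -> float:
    """Toy composite: weighted share of unhelpful page-topic pairs.

    Returns a 'drag' in [0, 1] applied site-wide, like a weight on the
    hot air balloon. The formula and the bar are illustrative guesses.
    """
    total = sum(p.weight for p in pages)
    unhelpful = sum(p.weight for p in pages if not p.helpful)
    share = unhelpful / total if total else 0.0
    # No drag until the unhelpful share crosses the bar; where Google
    # actually sets that bar (and how it moves) is not public.
    return share if share >= bar else 0.0

site = [
    Page("/guide", "crm software", True, 3.0),
    Page("/thin-1", "crm software", False, 0.5),
    Page("/thin-2", "random trivia", False, 0.5),
]
print(f"site-wide drag: {site_level_drag(site):.2f}")
```

The point is only the mechanism: page-level judgments, weighted, rolled into one site-level number that can weigh the whole balloon down.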

Now, while some people may think HCU has improved their rankings, that’s not quite true. What’s really happening is that a lot of other sites are going down, so theirs look better by comparison. Same effect but different cause: the devil is in the details.

Here’s the thing. Google’s not going to tell you where the bars are set. I know they’re evaluating different page types and different intents differently.

So with collections of sites that have various page types and various intents, they put those sites into boxes. There is documentation stating that they are absolutely doing this with page types and intents in mind. However, there isn’t anything else publicly available that would confirm it. Based on my analyses, it appears that they’ve set the initial bar very low.

The Impact of HCU

Internally, I refer to HCU, PRU, and Core Web Vitals as the boa constrictor updates because they can slowly choke a site or an industry, depending on how much influence Google wants to give them. And that’s what you’re starting to see.

So the sites impacted by this are heavily skewed towards what’s obviously garbage, where someone put out 50,000 pages on one section of their site and it’s all trash that provides no user value. Likewise with poor-quality sites that were about too many disparate things.

My perspective, and this is speculative, though I do have information supporting it, is that the intent is to ratchet this up slowly and combine it with the other boa constrictor updates until it meets the anti-spam team’s goal of simply improving the web.

There are elements of this from my research that do connect to whether your site has a clear and obvious editorial theme, but based on what I know, larger sites that have many themes can be okay too.

Topic authoritativeness, separate from the Helpful Content Update, is calculated at the topic-page and topic-site-section levels as well. Google can actually calculate all page-topic combinations and then all site-section-topic combinations to evaluate topic authoritativeness.

Here’s a great example that comes from Google literature: the Johns Hopkins website is very authoritative on healthcare, but their job board isn’t a place you should go to get medical advice. So Google has to, and always has had to, evaluate authoritativeness at the section level and not only at the site level.
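To make the section-level idea concrete, here’s a small Python sketch that rolls hypothetical page-topic scores up to section-topic authority. The sections, scores, and the plain mean used for aggregation are all illustrative assumptions, not anything Google has published.

```python
from collections import defaultdict

# Hypothetical per-page topic-authority scores, e.g. from a content model.
# Each row: (site_section, topic, page_score).
page_scores = [
    ("/conditions/", "healthcare", 0.92),
    ("/conditions/", "healthcare", 0.88),
    ("/jobs/",       "healthcare", 0.15),  # job board: weak on medical topics
    ("/jobs/",       "employment", 0.80),
]

section_topic = defaultdict(list)
for section, topic, score in page_scores:
    section_topic[(section, topic)].append(score)

# Roll page-topic scores up to section-topic authority (here: a plain mean).
for (section, topic), scores in sorted(section_topic.items()):
    print(f"{section:14s} {topic:12s} authority={sum(scores)/len(scores):.2f}")
```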

What makes Google’s Helpful Content Update so remarkable is that this is the first time they’ve said that one bad apple can spoil the whole bunch. Maybe not to that extreme, because we don’t know how many bad apples you can get away with and we don’t know the extent of the impact. That’s why I like the hot air balloon analogy — it’s as if you’re putting a drag on your whole site.

People often ask why Google keeps making these updates. It depends on what team you’re talking to inside Google, but overall, they want to get you to the end of your search as quickly as possible.

The internal teams I know truly and absolutely want the best user experience. But the teams don’t necessarily all work together. The anti-spam team is different from the query rewriter team, query parsing, or the Brain team. The team names aren’t advertised, but if you look hard enough, you can see all the teams broken down: where they live and why.

Now, what’s the benefit of getting you an answer quicker? You might expect they’d want someone to buy something instead.

If you land on a page and are tricked, if you receive bad medical advice from a site, if you receive bad financial advice from a site that you found on Google, over time that’s terrible for a search engine.

Yahoo is the living example. So are Microsoft Network and Bing. Microsoft Network once had organic results that were paid inclusion through Inktomi, and it almost destroyed their entire company. You could buy your way into Microsoft’s organic search results.

Google is able to broadcast their updates publicly without fear they’ll be reverse engineered, like it was back in the day when it was just links and spam. Back then they had to take a clandestine approach, but slowly you’re beginning to see these things happening in plain sight.

The reason?

Google’s algorithm isn’t an algorithm. It’s more like a multi-part organism — it’s complex and continues to evolve.

Topic Authority and Google’s Helpful Content Update

Let’s take, for example, topic authority. It used to be a lot more related to links, citations, and the like. But now, because of computational improvements, it’s more related to the content on the site. There’s a really good book if you’re interested in reading it, called The Beauty of Mathematics in Computer Science. It’s written by Jun Wu, a former staff research scientist at Google who created Google’s Chinese, Japanese, and Korean web search algorithms and was responsible for many of Google’s machine learning projects.

On a related note, there’s something that is likely to occur when they ratchet it up. I say this because it happened during the authority updates of last year that weren’t publicly announced. If you saw your pages go down and you were only looking at total traffic, that’s a mistake. You’ve got to look at them as a collection of each word-page combo and see if you can spot any trends in what dropped.

I’ll give you an example, although I can’t say who it is. You have a super powerful and authoritative page that was ranking for terms it doesn’t actually satisfy, and because you were so powerful, you were getting the invite to the party anyway. Say traffic from those terms (what we call an intent mismatch) starts to go down while the page still ranks number one for the thing it actually does. The thing it doesn’t do stopped or slowed down. That’s an indicator that you had an authority drop.

So that’s what you’re seeing from this update too. You’re having an authority drop generally, because you’re no longer allowed to rank for stuff you’re not covering. That is an indicator that, over time, you lost your invite to the cool kid party.

And that’s when you start seeing those trends. What you’re looking for is whether it’s happening across the board.

You had a page that was doing well for “construction management software,” “what is construction management software,” and “best construction management software.” It can only do one of those things. But you were so powerful that it got all three. And until you are the authority in the middle of the funnel or at the top of the funnel, you can’t have all of them with one page that is an affiliate page or a capture page.
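Here’s a quick Python sketch of how you might spot that pattern in your own data. The rows mimic a Search Console-style export of clicks per query-page combo across two periods; the numbers and the core-versus-mismatch framing are made up for illustration.

```python
# Toy export: (query, page, clicks_before, clicks_after).
rows = [
    ("construction management software",         "/landing", 900, 880),  # core intent
    ("what is construction management software", "/landing", 400,  60),  # mismatch
    ("best construction management software",    "/landing", 500,  70),  # mismatch
]

for query, page, before, after in rows:
    change = (after - before) / before
    print(f"{query:44s} {change:+.0%}")

# If mismatched intents collapse while the core query holds steady,
# that's the authority-drop pattern described above.
```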

HubSpot’s a great site to analyze, whether you want to look at a manipulation of this or just the fundamentals of how it can be done. But realize that they’ve got the invite to the party because they’re so powerful. They’re a great website to analyze because you can see what it looks like when a page ranks for something even though it doesn’t satisfy intent, simply because it’s so powerful. What I like to look at is landing pages.

Another example is a one-trick-pony site that only does one thing, unless they’ve declared that’s their niche. I used to manage a website called whatis.com, which is about awareness-phase definitions. If we didn’t have the infrastructure of the entire TechTarget network, we wouldn’t have been invited to the party to rank for the “what is” pages. But that site is so tuned for definitions that Google’s not expecting them to have the whole funnel covered.

But if you’re a SaaS company with 200 pages and 40% of your site is definitions, that’s super suspect and weird. That’s what you’re starting to see fail. It’s not because your definitions aren’t good. It’s because, going back to topic authority, it just doesn’t look right and it doesn’t make any sense.

What To Do if You’ve Been Hit by HCU

With the last authority update and this one, some people ask whether you need to complete the entire buyer journey for a given word. True, but that’s not to say there’s a formula. Each word is going to have its own requirements. So ask yourself whether your coverage is normal and whether the situation would make sense.

There are a number of things you can do if you believe you’ve been hit by the Helpful Content Update.

Most importantly, do a competitive cohort analysis and look at things that nobody’s covering in that cluster. Prepare for your competitors to die with this update. Go look at them and find their errors. Find where they’ve got weak spots that put them at risk. When Google ratchets it up, you want to be in the right place at the right time when the competition gets a bigger weight on their hot air balloon.


And those techniques are critical. Find the ways that they cover a topic. Where do their editors have blind spots? Go in those directions so you are there when they get dragged down.
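If it helps, here’s a toy Python version of that cohort gap analysis. The sites, subtopics, and coverage sets are placeholders; in practice they’d come from crawls or a content inventory.

```python
# Hypothetical subtopic coverage per site within one cluster.
coverage = {
    "yoursite.com":     {"pricing", "integrations", "reviews"},
    "competitor-a.com": {"pricing", "reviews", "templates"},
    "competitor-b.com": {"reviews", "templates"},
}

# Union of everything anyone in the cohort covers.
everything = set().union(*coverage.values())

# Expose each site's blind spots; subtopics nobody covers at all
# have to come from keyword or SERP research.
for site, topics in coverage.items():
    print(f"{site:18s} missing: {sorted(everything - topics)}")
```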

Same thing with product reviews. They’ve got the invite and they’re in authority mode. Now, Google has already stated that certain page types aren’t able to be sculpted.

I’ll give you a great example. If you have extraordinary authority, beautiful clusters, and strong link dynamics, you could, up until recently, sculpt your internal link structure (see MarketMuse Connect) and manipulate it such that your landing page would rank.
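As a rough way to see sculpting in the wild, here’s a minimal Python sketch that tallies inbound internal links per page from a crawl’s edge list. The edge list is a made-up placeholder; a real one would come from your crawler of choice.

```python
from collections import Counter

# Hypothetical internal-link edge list (source page -> target page),
# e.g. extracted from a site crawl.
edges = [
    ("/blog/post-1", "/landing"),
    ("/blog/post-2", "/landing"),
    ("/blog/post-3", "/landing"),
    ("/blog/post-1", "/about"),
]

inbound = Counter(target for _, target in edges)

# A landing page with far more inbound internal links than its peers
# is a hint that its link structure was deliberately sculpted.
for page, count in inbound.most_common():
    print(f"{page:10s} {count} internal links")
```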

It’s really hard to sculpt like that right now. So go through the Wayback Machine and historical search data, in SEMrush or whatever, and look for situations where HubSpot ranked for a head term with a short landing page. Those are SERPs you want to analyze. I’ve analyzed those SERPs exhaustively, because those weird fundamental exceptions to the rule are where you can really learn.

One thing I look for when analyzing a SERP is an exception to the rule. That’s a critical thing.

I alluded to this earlier in the article, but be careful if you have pages that are disparate sets of things that have no obvious correlation (e.g. best man’s speech topics, how to start a nail salon). Those are very risky pages to rely on because Google in many cases does not consider them to be semantically comprehensive.

They’re risky because the words on those pages don’t really hang together. Think about a long list of disparate content (e.g. ways to improve your life). Don’t analyze that page the same way you would analyze a “what is” page or a guide, because there’s no way to semantically analyze it. Just because you wrote “buy lots of crackers and you’re gonna enjoy your life more” doesn’t mean the page is about crackers.

Those are dangerous pages to rely on, and they’re actually really hard pages to manage. So that’s something to think about.
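One way to sanity-check this yourself is to measure how tightly a page’s items cluster semantically. Here’s a hedged Python sketch using the third-party sentence-transformers package; the model choice and the 0.3 cut-off are arbitrary assumptions, not a known Google threshold.

```python
from itertools import combinations
from sentence_transformers import SentenceTransformer, util

# Embed each list item and measure pairwise similarity.
model = SentenceTransformer("all-MiniLM-L6-v2")

items = [
    "buy lots of crackers",
    "wake up earlier",
    "learn a foreign language",
]

emb = model.encode(items, convert_to_tensor=True)
sims = [util.cos_sim(emb[i], emb[j]).item()
        for i, j in combinations(range(len(items)), 2)]
avg = sum(sims) / len(sims)

print(f"mean pairwise similarity: {avg:.2f}")
if avg < 0.3:  # illustrative cut-off, not a known threshold
    print("items are semantically disparate; treat this page as high-risk")
```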

Product Reviews Update

Google’s Product Reviews Update is a page-level analysis of pages professing to have analyzed a product or enabling the comparison of products. The question I often get is whether this has leaked into B2B at all.

Definitely! Even though it was originally very heavy B2C, you’re not absolved just because you’re in B2B.

With the Product Reviews Update there are two things you need to solve for:

  1. Make sure you have first-hand experience with the products you review.
  2. Ensure your website properly conveys that you have used the products you review.

One thing to keep in mind: there’s not a lot of information about the Product Reviews Update that you can validate publicly with Google. But we know that the language in product reviews is actually being analyzed at two different levels. Authority is evaluated at both the page and sentence level.

When the Product Reviews Update is calculated, they’re looking at real, in-depth, even word-by-word usage. Depending on the industry, words that signal doubt you actually reviewed the product can influence the composite score for the page-level Product Reviews Update. So be wary of wishy-washy text.
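As a starting point, here’s a crude sentence-level pass in Python that flags hedging language in a review. The phrase list is an assumption to adapt per industry, not a list Google has confirmed.

```python
import re

# Phrases that hedge on first-hand use (illustrative, not exhaustive).
HEDGES = [
    r"\baccording to the manufacturer\b",
    r"\breportedly\b",
    r"\bis said to\b",
    r"\bshould (?:be able to|work)\b",
    r"\bwe have(?:n't| not) tested\b",
]

def flag_hedged_sentences(text: str) -> list[str]:
    """Split text into sentences and return the ones that hedge."""
    sentences = re.split(r"(?<=[.!?])\s+", text)
    return [s for s in sentences
            if any(re.search(p, s, re.IGNORECASE) for p in HEDGES)]

review = ("The blender reportedly crushes ice well. "
          "In our tests it pureed frozen fruit in under 20 seconds.")
for s in flag_hedged_sentences(review):
    print("hedged:", s)
```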

Avoid Using Stock Images

Stock imagery is like the kiss of death when it comes to product reviews. You can include photography from the manufacturer, but it should not be used in isolation.

Why?

Because it sends the wrong signal. Anyone can snap a picture of a product they’re reviewing, unless of course, they’re not actually reviewing it. Think of original photography as another validation signal that you independently reviewed the product. Illustrations can also work and you can pair them with stock photography, but using only stock photos is a recipe for disaster.

One trick that works really well is to take pictures with multiple products. When doing a comparison review, this shows that you actually have those two or three products. Google is able to use computer vision in some cases and they’re able to look for different components. In fact, I’d be surprised if the computer vision teams weren’t involved in this.

Original video with proper schema markup is trending to be a good characteristic of these pages. You can take snippets from the video to use as images, thereby saving the cost of original photography.
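For reference, here’s a minimal Python sketch that builds VideoObject JSON-LD for such a video. The schema.org field names are real; the URLs and values are placeholders.

```python
import json

# Minimal VideoObject markup for an original review video.
video_ld = {
    "@context": "https://schema.org",
    "@type": "VideoObject",
    "name": "Acme X100 Toaster: Hands-On Review",
    "description": "Our hands-on test of the Acme X100, including burn tests.",
    "thumbnailUrl": "https://example.com/img/x100-thumb.jpg",
    "uploadDate": "2022-09-15",
    "contentUrl": "https://example.com/video/x100-review.mp4",
}

# Drop the output into a <script type="application/ld+json"> tag on the page.
print(json.dumps(video_ld, indent=2))
```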

And of course, make sure to identify the people behind these reviews. Showing that a product review was done by “cool toasters admin” is just plain unnecessary.

These recent updates should be a reminder to never take anything for granted. I had a client that had a million visits a month from a page, and for two years they wouldn’t touch it. And every month when I would talk to them, I would say, “Fix that damn page. It’s going to kill you!”

It was “the top 10 ‘X’ beverage products” and they didn’t touch it. They wouldn’t touch it. They’re like, “It’s our best page, we’re not touching it.”

And it died.

The second rollout of PRU killed it. So if you’ve got page structures that are stock blurb, affiliate link, stock blurb, affiliate link, stock blurb, affiliate link, there are only two choices:

  1. Light a match, dig a hole, and crawl in it.
  2. Fix it at the page level and then build out clusters.

The Need for Full Funnel Coverage

For a review site to perform, you need full funnel coverage. If you only do reviews, you can still have some longevity, but it’s a hard road to do solely reviews and perform well; you’ll get intent-pigeonholed. If you’re only doing reviews, don’t expect to rank for your “what is” terms with that review page. And if you start writing “what is” pages, you’re actually not a one-trick-pony site anymore. So it’s a catch-22.

While it’s true that you should have content throughout the buyer journey (awareness, consideration, purchase, and post purchase), remember that it’s a correlation. Yes, you need all that content. But it’s not a specific amount, it’s topic by topic.

So, writing some definitions is probably going to be valuable for a publisher. But the right number of definitions isn’t something you can predict at scale. More will be better in most cases if you can’t predict the percentage. It’s different for every topic, and we’ve analyzed that exhaustively.
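For instance, here’s a toy Python inventory check that flags funnel stages each topic is missing. The topics, stage labels, and inventory are placeholders; the point is that coverage gets judged topic by topic, not against a global quota.

```python
from collections import defaultdict

# Hypothetical content inventory: (topic, funnel_stage) per page.
inventory = [
    ("crm software", "awareness"),
    ("crm software", "consideration"),
    ("crm software", "purchase"),
    ("email tools",  "purchase"),     # reviews only: intent-pigeonholed
]

STAGES = ["awareness", "consideration", "purchase", "post purchase"]

by_topic = defaultdict(set)
for topic, stage in inventory:
    by_topic[topic].add(stage)

# Flag missing stages per topic; treat gaps as prompts for research,
# not as a fixed quota to hit.
for topic, stages in by_topic.items():
    missing = [s for s in STAGES if s not in stages]
    print(f"{topic:14s} missing stages: {missing}")
```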

And so it’s not a one-size-fits-all answer, even if the touts are starting to publish that it is. They’re the same ones that said you should build one page for one word and never do anything else, for fear of cannibalization. All those videos are gone now. Or they got updated, because it’s not true.

And the PRU update was when they started to pull all of those down. So take your favorite tout and compare their old videos on the web archive, and you’ll see. Be aware that’s coming.

You’re gonna start to see it. Remember that the only thing you can predict is that you do need it. As for how much of it, that’s topic by topic.

To maximize your content investment, especially for high-priced items, develop repurposing strategies so you can slice and dice it into a dozen or more pieces: individual benefit content, short video, audio, screencasts, and other visuals.

Takeaways

And always remember to run some competitive cohort profiling. It’s my favorite tactic, period.

If you have competitors, it doesn’t matter if they’re three-letter sites or four-letter sites. If they’ve got terrible product reviews, they’re a ticking time bomb. Start planning for those good days when they’ll have to pay the piper.

Is it possible you’ll still see garbage content rank? Possibly.

And that’s okay. It doesn’t mean that Google’s broken. It doesn’t mean that these exceptions are the rules. The exceptions to the rules are the most important part of doing what I do. And when you see those, it’s really important to keep an eye on them. They’re the most fun to watch.

