
Changes to the Quality Raters Guidelines






Don’t forget, you can keep up with the In Search SEO Podcast by subscribing on iTunes or by following the podcast on SoundCloud!

Summary of Episode 54: Analyzing Changes to the Quality Raters Guidelines


It’s 2020, y’all, and we’re kicking it off with a living legend. Jennifer Slegg chats it up with us about the Quality Raters Guidelines!

  • Why Google has changed its language around YMYL
  • The meaning of Google shuffling around its categorization of YMYL sites
  • The ever-controversial relationship between the Quality Raters Guidelines and Google’s algorithm

Plus, we jump into the crap people are saying about doing SEO in 2020!





The Top 5 SEO Trends Everyone Is Pumping in 2020 That Will Fall Flat [00:05:28 – 00:18:49]

Mordy loves this time of year not because of the holidays, but because it’s the time of year all sorts of people come out of the woodwork and start making all kinds of hilariously nonsensical predictions about SEO in the new year.

Mordy literally Googled top SEO tips for 2020 and saw all of the crap that came up (for the most part), which brings us to this new segment: Falling Flat in SEO. Because it’s the new year, Mordy felt he needed to do something New Year’s-ish, and since he’s a horribly cynical, sarcastic person, instead of doing the top 5 trends for 2020, here are the top 5 pieces of utter nonsense you’ll hear all year long and why they will fall flat.

In no particular order, let’s begin.

#5: Links & Authority Will Save Your SEO Soul: Links are important, but seeing everyone under the sun writing about how you need to build this link and that link and how links will save your site is a bit much for Mordy. The same goes for authority. People keep saying you need an author bio and you need your content reviewed by an expert and only then will you rank #1. It’s all a bit linear to Mordy at this point, especially in the age of the core update, where these sorts of one-dimensional outlooks are not enough. Authority is a concept, and a deep one at that, full of all sorts of latent meanings and implications. It’s not a link, it’s not an author bio, and it’s not content reviewed by an expert. Those are all practical manifestations of authority, but they’re not authority itself.

And this is part of a larger point. SEO was built to speak to machines. Now machines are starting to think like humans. Marketing speaks, and has always spoken, to humans. SEO is finally catching up to marketing: we should now start thinking of things like intent, targeting, and authority the way a human would, and authority to a human is not as linear as an author bio.

[Here’s more of Mordy talking about not seeing SEO so linearly in 2020]

#4: Voice Search Is a MUST If You’re Doing SEO: No, it’s not. Mordy knows voice search is real and knows some very well-respected folks, such as Barry Schwartz, are very *** on voice search. All Mordy asks is to be careful because, like every year, all of the experts are saying voice search is here for 2020! And it’s not, at least not yet. Mordy’s point is that he feels something pivotal needs to change in order for voice search to break through, because right now we’re using it to turn the lights off, look up the weather outside because pulling the shade is too labor-intensive, and show how smart we are by asking our devices questions they clearly can’t answer.

Voice search needs a moment to catapult it.

#3: FAQ Markup: If you’re all-in on this and going hog wild with adding FAQ markup, as everyone says you should be, don’t forget that it must be used where it actually applies. Google is not dumb, and the current FAQ party cannot possibly last. So only implement FAQ markup when the page is an actual FAQ-type page.
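For anyone wondering what “FAQ markup” actually looks like, here is a minimal sketch of schema.org FAQPage structured data generated with a bit of Python; the questions, answers, and the idea of generating it with a script are purely illustrative, not anything Google or Mordy prescribes.

```python
# A rough sketch only: the questions and answers below are placeholders.
import json

faqs = [
    ("How long does shipping take?", "Orders usually arrive within 3-5 business days."),
    ("Can I return an item?", "Yes, returns are accepted within 30 days of delivery."),
]

faq_markup = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [
        {
            "@type": "Question",
            "name": question,
            "acceptedAnswer": {"@type": "Answer", "text": answer},
        }
        for question, answer in faqs
    ],
}

# The resulting JSON would go inside a <script type="application/ld+json"> tag on the FAQ page.
print(json.dumps(faq_markup, indent=2))
```

Either way, the point stands: only pages whose visible content genuinely is a set of questions and answers should carry this markup.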

#2: Zero-Click Searches: We all know the data, but SEO is not dead and your site is not dead. Mordy thinks 2020 should be about coming to our senses. There is a lot of context around a zero-click SERP. A lot of it is vertical-dependent (the Local Pack is going to be an organic killer, but you’re still going to click through on a Featured Snippet that gives you step-by-step instructions for a complicated fix). It’s way more complicated and way more nuanced than it has been presented and discussed. Yes, for those sorts of high-search-volume keywords, like what’s playing at the movies, clicks are going to be hard to come by, but users do a lot of long-tail searches. Nuance is the name of the game here.

#1: BERT: We know what you’re thinking. BERT? That’s your number one? Are you nuts? No. Well, yes, but no. Mordy’s not saying that BERT won’t be effective or won’t play an important role, as Mordy is a huge supporter of focusing on natural language processing in 2020.

Mordy is listing BERT here because he feels the way it’s being talked about is a bit off. Mordy gets the sense that we think, or the “2020 trends” experts think, that Google flicked a switch and now BERT will turn everything on its head and change SEO as we know it in a single bound.

First, realize that BERT has been around for a bit already and that hasn’t happened. Think back to when RankBrain came out in 2015 and how the conversation around intent really got started only about a year, maybe a year and a half, after its release. Why? Because that’s how machine learning works. It learns, it improves, it makes changes, and this is a big part of the equation. It’s hard to predict exactly what those changes will look like. It’s hard to know how effective a machine learning property will be and where exactly it will make its impact.

So please, no more clichés about BERT and NLP turning the SERP on its head! BERT is big and will continue being big. It might even be bigger than Big Bird himself, but let’s see how this develops. Let’s analyze BERT as it grows and makes a deeper impact so that we can come up with a substantive analysis that provides real insight.

Analyzing Changes to Google’s Quality Raters Guidelines: A Conversation with Jennifer Slegg [00:18:49 – 00:56:30]

[This is a general summary of the interview and not a word for word transcript. You can listen to the podcast for the full interview.]

Mordy: Welcome to another In Search SEO podcast interview session. Today we have with us one of the best-known personalities in the world of Search. You may know her from interviewing Googlers of all varieties or from covering search marketing news. She is the one and only Jennifer Slegg!

Welcome!

Jennifer: Thank you for having me.

M: Before we start I have to ask a personal question. I know you’re a Canucks fan and one of my fondest memories as a child was watching the Rangers win the Stanley Cup in 1994. Do you have any bad blood still?

J: Yeah, we don’t talk about that.

M: Okay, no worries. Let’s quickly move on and jump into the Quality Raters Guidelines.

Just so that everyone listening is on the same page, what are Google’s Quality Raters Guidelines (QRG)?

J: The QRG is Google’s guide to the types of sites they want to rank highest in Google’s search results as well as the types of sites they want to see rank lowest. I see it as the closest thing to a how-to guide that Google will give SEOs and site owners on how they should strive to meet what Google wants.

I’m always asked about this, so I would like to clarify that quality raters do not have any impact on your site in the live search results. So even if your competitor gets in as a quality rater, they cannot do anything that will negatively impact your site. However, they are rating how good or not good sites are, so if a rater thinks your site sucks, Google will work on making sites like yours rank lower.

M: And who are these raters exactly?

J: These are people who, for the most part, don’t have an SEO background. They aren’t super-searchers; they target stay-at-home moms. I believe there are 10,000 quality raters, and they cover many languages and countries, so they’re not just US-based.

M: And do you know how one becomes a quality rater?

J: Yes, you find an ad online and you just apply. It’s run through a third-party contractor. If you’re lucky you will be chosen.

M: It sounds like a cushy job.

J: I don’t know how cushy it actually is because you’re basically looking at two sets of search results and determining which side is of higher quality. And the raters don’t even know which results are live and which are being tested; it can be randomized. They have to look at each site, review it, and research its reputation. If you like searching and are interested in digging around, you can find it interesting, but I’ve heard raters describe it as a rather boring job. And it’s not like they’re SEOs looking for the inside scoop the way we do.

M: Yeah, because for me it’s interesting to perhaps get a taste of what’s to come.

J: But I think day in and day out it’s not so interesting.

M: Fair enough. And I must say that I admire you for being able to remember all of these changes. You must have the world’s greatest memory!

J: Thanks, I do have a pretty good memory.

M: So the first change I want to talk about was the one in May 2019 when Google replaced the term “E-A-T” with “Page Quality.” Why do you think they did that?

J: I looked at that change as a quality-of-life change. They weren’t really changing things from our perspective; it was more for the raters, who were focusing on E-A-T rather than on overall quality, because not every page has to have E-A-T. If you’re looking for a recipe for tomato soup or how to sew a seam, you don’t need to rate based on E-A-T, but you do need to rate on quality, so it was more to simplify things on the raters’ end.

M: But they did keep E-A-T in a few spots.

J: Right, and E-A-T is still super important to the quality raters. It’s just that in a few instances they pulled back on E-A-T and talk about generic page quality instead.



M: While we’re talking about page quality, what do you think, at its core, goes into page quality/authority?

J: This is what everyone wants to know. Obviously, we know links are super important, as they give a lot of authority to a website; Google has confirmed this, and they need to be natural links. Slapping an author bio on your page doesn’t instantly give your site credibility with Google. Recently, someone came up with a rent-an-author scheme where people were paying authors specifically to put their bylines on bad content. That won’t help, as Google has said they’re not tracking authorship the way they did back when authorship was a thing, unless the author is an entity, like someone who writes for the New York Times or is extremely well known. If you’re an author with a Wikipedia page, you’re more likely to be considered an entity. Raters are looking at author bios to confirm E-A-T, but we know the algorithm isn’t specifically looking for the author bio.

The thing people get hung up on is that a lot of what the raters guidelines talk about is really good for users too. A searcher who lands on your page wants to know who is writing your content, what their background is, and whether they should trust the content. People go to the About Us page to learn more about the site. Many SEOs get caught up in what to do for Google, but a lot of the guidelines are not so much about Google as about how this will benefit users. And if it’s good for users, you should be doing it regardless.

On the authority part, there are lots of ways to increase your authority. If you’re a local business, you can get mentions in your local newspaper. Do interviews, do reviews, or write for other sites, but don’t approach people asking for three followed links in the content or ask to guest blog, as that scares people off. If you are writing for another site and your goal is to get links, chances are that if your content is really good quality they’re going to give you the link anyway. And obviously, don’t email a million people for guest blogging opportunities; you want to pick and choose, as you want your content to be on authoritative sites to show that you’re an authority as well.

M: How does that work, though? A case that comes to mind is Google recommending that health sites have a doctor review their content, but how can Google evaluate whether the information is accurate or not? If it could be any doctor, who knows what they’re recommending to users?

J: That is a good question. Again, Google is looking at those other signals. Is this doctor being linked to from other authoritative sites? Is their bio linking out to other places that lend them credibility? When looking at a medical site, the raters will look up whether the doctor is real and rate the site accordingly. Based on those ratings, Google will determine what to put higher and lower in the rankings.

M: How does Google take what the raters conclude and integrate that into what they’re doing algorithmically?

J: Google runs experiments with the raters. They will work on some part of an algorithm and then push it out to their raters. The raters are blind to this and don’t know what Google is looking for or testing; they just see two similar sets of search results. Last year I believe they ran 650,000 experiments that resulted in 3,200 algorithm updates. When people freak out and ask Google if there was an algorithm update, someone at Google will answer, “Probably.” They have thousands of updates being pushed out within a year. It’s not that easy to track which update was the root cause when 10 updates were released in one day.

So the raters rate the potential updated results against the live results. The Googlers get this data back and analyze it. Does this change improve the search results, make them worse, and are there any side effects? If everything looks good, they’re likely to push out whatever the update was, and if it didn’t work out, they won’t.

M: I’m assuming the raters receive extensive training. Is that correct?

J: Not really. It’s like your mom, dad, or sister doing a search. They look at search through a different lens than we do. They look at a site and see if it’s useful, while we look at how many backlinks it has or when it last updated its content. And we’re such a small subset of Google users. We’re not normal in any sense when it comes to our search behavior, so we’re not the market Google is reaching for when testing algorithm updates. They want to see the average Joe who isn’t as savvy as we are.

Now, they do have to study the QRG and take an exam. Also, if a rater’s ratings are vastly different from what most other raters are giving, Google will take notice and address it.

M: Got it.

Let’s jump back into the changes I want to analyze with you. Change #2 was the most recent update back in September, where Google split Shopping off from Finance, making it its own YMYL (Your Money or Your Life) category. Why do you think they did that?

J: I think the big reason is that many people didn’t consider Shopping to be YMYL, so it was to make it clearer that a site selling anything is considered YMYL. This shouldn’t come as a surprise, as these sites are collecting personal data from shoppers. But yes, in the broader picture, everyone is surprised that e-commerce sites are considered YMYL and are held to this higher standard.

M: The last change I want to go over is that News now sits at the top of the YMYL list, not Health.

J: I think it has more to do with the fact that they made significant changes to the News part of YMYL a couple of years ago. There was a lot of controversy, with fake news and conspiracy theories ranking well in the search results. It got a lot of bad publicity for Google, so they updated their guidelines and changed their algorithm to try and combat fake news.

I remember when Google was just starting out, back when it was the nobody search engine, and it broke through only because it really attacked the spam problem when spam became an issue for Yahoo, MSN, and Ask. If Google doesn’t keep its search quality at a high level, it will open the door for other search engines to step in. That’s the reason we left Yahoo and went to Google in the first place.

M: When I think of the problem of Google and news, I don’t think of Google having trouble figuring out what’s real or fake and needing the raters’ help. To me, it’s more on the algorithm side. For example, if you type “breaking news” into Google, you will not get great results in the Top Stories carousel. With these issues that Google is having, how will the raters help Google deal with that?

J: When Google released the algorithm and publicized it, they also released an update to the QRG at the same time. What they did behind the scenes was update the guidelines and send them to their raters. They then started testing algorithms to surface higher quality news and to demote conspiratorial/fake news. Once they had updated the algorithm enough that the raters were rating the updated version higher than the current live results, Google made the announcement along with the new guidelines.

M: With that, in their most recent guidelines update they said that original content should be considered more authoritative, but just the other day (as of this recording) Barry Schwartz reported Google saying to use rel=canonical to indicate the original source, while Glenn Gabe said it seems Google is ignoring it. Raters aside, it seems Google is having difficulty determining who the original source is.

J: Yeah, syndication has always been a tricky thing. Canonicals help, but Google says they’re only a hint; Google can still use other signals to determine which version has the higher quality or is the originating source. So syndication is a bit harder to do nowadays, although, in that regard, if a lot of sites really looked into it, they might not see much benefit from syndicated content anyway. It was really popular 10-15 years ago, but now people don’t want their content to appear on other sites that will outrank them, especially when the site syndicating your content has much more authority than you do. And if your site is being outranked by your syndicated content, there’s probably a reason for it, and you should look into your site quality or technical issues, because for some reason Google decided to ignore the canonical and show another result.
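[If you do syndicate, one practical spot-check is whether partner sites actually declare a rel=canonical back to your original article. Below is a minimal sketch of how such a check might look; it assumes the requests and beautifulsoup4 Python packages are installed, and the URLs are placeholders rather than anything from the episode.]

```python
# A rough sketch only: the URLs below are placeholders.
import requests
from bs4 import BeautifulSoup

ORIGINAL_URL = "https://example.com/original-article"
SYNDICATED_URLS = [
    "https://partner-site.example/republished-article",
]

for url in SYNDICATED_URLS:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    # rel=canonical is only a hint to Google, but it should still point at the original.
    canonical = soup.find("link", rel="canonical")
    href = canonical.get("href") if canonical else None
    status = "OK" if href == ORIGINAL_URL else "worth a closer look"
    print(f"{url}: canonical={href!r} -> {status}")
```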

I do like that Google is trying to elevate original sources in the search results, because it’s always a good thing when news sites do original reporting rather than everyone regurgitating the same thing. No one wants to hear the same story; we want to hear different perspectives. Diversity is always good.

M: Speaking of this, the “original” preference in the QRG update and Google’s announcement that it would algorithmically prefer original content came out within days of each other, and to me it reads like there’s a real intrinsic connection between the QRG and the algorithm. I know it’s not a 1:1 connection, as John Mueller says, but isn’t there a connection between the general thrust of where both are going?

J: I think there’s a pretty big connection, not to every algorithm, but we’ve seen in the past where specific changes made to the QRG coincided with various algorithm updates. I always break down the QRG to find exactly what changed, and a lot of it is a signal to me of what Google is thinking about and what they’re working on. It might not be live in the search results now, but it’s in there and the raters are actively checking for it.

M: And I think with the Medic update I saw flavors of the changes in the QRG reflected in the web pages that were demoted. Again, not that it’s a 1:1 match, but there’s that essence of the guideline updates within the algorithmic updates happening at the same time. Do you see that being manifested by the core updates, with their tendency to impact YMYL sites?

J: The core updates are interesting because Google says we can’t make changes based on a core update. Plus, given that there are 3,000 updates in a year, it’s possible that a core update will coincide with something else.

M: You mentioned you speak to Gary and John, and one of the things you’re known for is having access to all of these Googlers. What’s your take on what’s being said and what’s not being said (if anything)?

J: I think Googlers who speak publicly are in a tight spot. I think they want to be more open than they are, but if they were, spammers would use that information to their advantage and the search results would suffer. For any update, we wish they could be more specific about what was targeted so we could fix it rather than trying to analyze who improved and who didn’t. But there are so many other things that could be at play. For example, a few years ago when Wikipedia switched to HTTPS, everyone who saw its rankings tank thought that Wikipedia had been hit by an algorithm update, but we looked into it and found that Google was simply dropping the HTTP URLs and switching to the HTTPS versions, so of course there were ranking fluctuations.

We often get the generic “improve your site quality” phrase after any core update, and they say that if your site drops, other sites are doing it better, so see what you can improve.

For years we had to rely on leaked copies of the QRG. As I became known for covering these QRG updates, I had raters send them to me. So it’s good that Google is now publishing them officially, which is awesome because we see them in real time. Another nice thing about the QRG is that while a lot of SEOs talk about big-brand sites, a lot of the examples used in the QRG are tiny sites as well. It’s not just the big brands.

M: Why is that? So they can have a more diverse sample set or because there’s something unique?

J: I think it’s for a more diverse sample set, as otherwise we’d just be seeing the BBC, Macy’s, or Amazon in those examples.

M: Before we end today, what are some other changes to the QRG that you think are significant to focus on but haven’t gotten the industry’s attention?

J: There are three things I often tell people. The first is that people don’t realize that in the current raters guidelines you can click through to all of the screenshots. You can click through every example and see a blown-up picture. You can use them to look in detail at which sites Google thinks are good and which it thinks are not. I’ve even talked to people who found their sites listed as poor-quality examples, and Google never updates those screenshots, so your site is permanently labeled poor quality in the raters guidelines.

The second thing to mention is that Google actually advocates in the QRG for sites that run advertisements. They tell the raters that sites need ad revenue to support quality content and journalism. Google does say that if the ads are excessive (nothing but ads above the fold) or use those eye-catching, annoying images you always see, then the page is considered low quality as far as ads are concerned.

And the third thing is that a lot of people don’t realize Google made a recent change about clickbait. Google wants to make sure that your titles match the content and aren’t misleading. Gary recently said that if Google is rewriting your title tags (for anything other than just adding your site name), you should probably have a look at them, because for some reason Google is finding them lacking.

Optimize It or Disavow It



M: If you had to do one over the other, would you spend your time trying to recover from a core update by bolstering the author bios on your pages, or would you spend your time listing on your website all of the publications that have featured your work?

J: I will say that if you were negatively impacted by a core update, neither of these options is great, but for this game I’d probably focus on bolstering the author bios just to showcase a little more about who the authors are, why you should listen to them, where they have been published, whether they’ve won any awards, what their social media handles are, or just anything that will get people interested when reading the bio. You can go further and have a short bio on the article page that, when you click on the author, leads to much more detail along with a list of their previous posts.

M: Thank you so much, Jennifer, for coming on. I learned so much and I really appreciate you joining us!

J: You’re very welcome.

SEO NEWS [00:57:48 – 01:00:33 – 01:04:10]

Search Console Coverage Reporting Gap: There seems to be a 10-day backlog with the Search Console Coverage report. If you see a gap or a lag, there’s a good chance it’s not you, it’s Google.

Google Maps Search Carousel: Mordy spotted a new carousel that Google is testing where it shows Google Map lists in carousel form for local queries.

Fun SEO Send-Off Question

How did Google spend New Year’s Eve? 

Sapir kept it traditional by suggesting that Google watched the Times Square countdown. Mordy, being the pessimistic realist, thought Google celebrated like most people: drunk and passed out in a **** of their own vomit on the bathroom floor.

Tune in next Tuesday for a new episode of The In Search SEO Podcast.

About The Author

The In Search SEO Podcast

In Search is a weekly SEO podcast featuring some of the biggest names in the search marketing industry.

Tune in to hear pure SEO insights with a ton of personality!

New episodes are released each Tuesday!




