SEO News

Google’s Gary Illyes Wants Googlebot To Crawl Less

Gary Illyes from Google posted on LinkedIn that his mission this year is to “figure out how to crawl even less, and have fewer bytes on wire.” He added that Googlebot should “be more intelligent about caching and internal cache sharing among user agents, and we should have fewer bytes on wire.”

He added, “Decreasing crawling without sacrificing crawl-quality would benefit everyone.”
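Gary does not spell out what smarter caching would look like, but one well-established way crawlers already cut bytes on wire is HTTP conditional requests: the fetcher sends the ETag or Last-Modified value it stored from a previous visit, and an unchanged page comes back as a bodyless 304 Not Modified. Here is a minimal sketch in Python; the URL, stored validator, and user agent string are placeholders, not anything Google has described:

```python
import requests

# Hypothetical values a crawler might have stored from a previous fetch.
url = "https://www.example.com/page.html"
cached_etag = '"abc123"'
cached_body = "<html>...previously fetched copy...</html>"

# Conditional request: ask the server to send the body only if it changed.
resp = requests.get(
    url,
    headers={
        "If-None-Match": cached_etag,  # revalidate by ETag
        "User-Agent": "example-crawler/1.0",
    },
    timeout=10,
)

if resp.status_code == 304:
    # Nothing changed: the response has no body, so almost no bytes on wire.
    body = cached_body
else:
    # Page changed (or the server ignores validators): store the fresh copy.
    body = resp.text
    cached_etag = resp.headers.get("ETag", cached_etag)
```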

Today, Gary added that Google is crawling as much as it did before, despite some folks thinking Google is crawling less. He said, “In the grand scheme of things that’s just not the case; we’re crawling roughly as much as before.”

What Google is better at than before is scheduling. “However scheduling got more intelligent and we’re focusing more on URLs that more likely to deserve crawling,” he explained.
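Google has not shared how that scheduling works, but the general idea of spending a fixed crawl budget on the URLs most likely to deserve a fetch can be illustrated with a simple priority queue. This is purely an illustrative sketch; the scores and URLs are made up and this is not Google’s system:

```python
import heapq

def schedule_crawl(candidates, budget):
    """Pick the top-scoring URLs to crawl within a fixed budget.

    candidates: list of (score, url) tuples, where score stands in for
    whatever signals suggest a URL deserves a crawl (illustrative only).
    """
    # heapq.nlargest keeps only the best `budget` entries by score.
    return [url for score, url in heapq.nlargest(budget, candidates)]

# Hypothetical candidates: (score, url)
candidates = [
    (0.9, "https://www.example.com/"),             # frequently updated homepage
    (0.2, "https://www.example.com/old-page"),     # rarely changes
    (0.7, "https://www.example.com/new-article"),  # newly discovered
]

print(schedule_crawl(candidates, budget=2))
```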

It seems Microsoft Bing and Google, specifically Fabrice Canel from Microsoft and Gary Illyes from Google, have the same goal. Microsoft is tackling it by encouraging site owners to use IndexNow. Google said in November 2021 that it might consider adopting IndexNow, but that came and went…
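For reference, IndexNow lets a site push changed URLs to participating search engines instead of waiting to be recrawled. A rough sketch of a bulk submission in Python follows; the host, key, and URLs are placeholders, and the IndexNow documentation has the exact requirements:

```python
import requests

# Placeholder site details; the key must also be served as a text file on the
# site (e.g. https://www.example.com/<key>.txt) so engines can verify ownership.
payload = {
    "host": "www.example.com",
    "key": "your-indexnow-key",
    "keyLocation": "https://www.example.com/your-indexnow-key.txt",
    "urlList": [
        "https://www.example.com/updated-page",
        "https://www.example.com/new-page",
    ],
}

# api.indexnow.org forwards submissions to the participating search engines.
resp = requests.post(
    "https://api.indexnow.org/indexnow",
    json=payload,
    headers={"Content-Type": "application/json; charset=utf-8"},
    timeout=10,
)
print(resp.status_code)  # a 200-range status generally means the submission was accepted
```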

John Mueller from Google commented on the post, suggesting, “We could just crawl RSS feeds and create some sort of Reader.” A joke about Google Reader…

Anyway, we will see what Google ends up doing here. Here is Gary’s full post:

My mission this year is to figure out how to crawl even less, and have fewer bytes on wire.

A few days ago there was a post on a Reddit community about how, in the OC’s perception, Google is crawling less than previous years. In the grand scheme of things that’s just not the case; we’re crawling roughly as much as before, however scheduling got more intelligent and we’re focusing more on URLs that more likely to deserve crawling.

However, we should, in fact, crawl less. We should, for example, be more intelligent about caching and internal cache sharing among user agents, and we should have fewer bytes on wire.

If you’ve seen an interesting IETF (or other standards body) internet draft that could help with this effort, or an actual standard I might’ve missed, send it my way. Decreasing crawling without sacrificing crawl-quality would benefit everyone.

Forum discussion at LinkedIn.



Source link: Seroundtable.com
