
Why 100% indexing isn't possible, and why that's OK - Search Engine Land

Last updated Tuesday, June 14, 2022 06:00 ET, Source: NewsService

When it comes to topics like crawl budget, the rhetoric has historically been that it’s a problem reserved for large websites (classified by Google as those with 1 million-plus pages) and medium-sized websites with high content change frequency.

In recent months, however, crawling and indexing have become more common topics on the SEO forums and in questions posed to Googlers on Twitter.

From my own anecdotal experience, websites of varying size and change frequency have, since November, seen greater fluctuations in their Google Search Console reporting (both the crawl stats and coverage reports) than they have historically.
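For anyone who wants to spot-check index coverage programmatically rather than through the Search Console UI, Google’s URL Inspection API exposes a per-URL index status. The Python sketch below is illustrative rather than anything from this article: the property URL, OAuth access token and inspected URLs are placeholders, and the response field names should be verified against the current API reference.

```python
import requests

# Placeholders (assumptions): use your own verified Search Console property
# and an OAuth 2.0 access token with the webmasters.readonly scope.
SITE_URL = "https://www.example.com/"
ACCESS_TOKEN = "ya29.your-oauth-access-token"

def inspect_url(page_url: str) -> dict:
    """Ask the URL Inspection API how Google currently sees a single URL."""
    resp = requests.post(
        "https://searchconsole.googleapis.com/v1/urlInspection/index:inspect",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        json={"inspectionUrl": page_url, "siteUrl": SITE_URL},
        timeout=30,
    )
    resp.raise_for_status()
    # indexStatusResult includes fields such as coverageState and lastCrawlTime.
    return resp.json()["inspectionResult"]["indexStatusResult"]

if __name__ == "__main__":
    for url in ["https://www.example.com/", "https://www.example.com/blog/post-1"]:
        status = inspect_url(url)
        print(url, "->", status.get("coverageState"),
              "| last crawl:", status.get("lastCrawlTime"))
```

Running something like this against a fixed sample of key URLs over time is one way to put numbers on the kind of coverage fluctuation described above.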

A number of the major coverage changes I’ve witnessed have also correlated with unconfirmed Google updates and high volatility reported by the SERP sensors/watchers. Given that none of the websites have much in common in terms of stack, niche or even technical issues, is this an indication that 100% indexing (for most websites) is no longer possible, and that that’s OK?

This makes sense.

Google, in its own documentation, outlines that the web is expanding at a pace that far outstrips its capacity and means to crawl (and index) every URL.

In the same documentation, Google outlines a number of factors that impact its crawl capacity, as well as crawl demand, including:

  • The popularity of your URLs (and content).
  • Their staleness (how often the content changes).
  • How quickly the site responds (see the log-parsing sketch after this list).
  • Google’s knowledge (perceived inventory) of the URLs on your website.
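Because crawl capacity is partly a function of how quickly and reliably a site responds, one low-effort way to observe it is to count Googlebot requests, and the status codes served to them, in your own access logs. The sketch below is an assumption-laden illustration rather than the author’s method: it expects a combined-format Apache/nginx log at a placeholder path, and matching on the user-agent string alone can be spoofed, so treat the output as indicative only.

```python
import re
from collections import Counter

# Regex for a combined-format access log line (an assumption; adjust it to
# match your own log configuration).
LOG_LINE = re.compile(
    r'\S+ \S+ \S+ \[(?P<day>[^:]+):[^\]]+\] "(?P<method>\S+) (?P<path>\S+)[^"]*" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

def googlebot_crawl_stats(log_path: str):
    """Count Googlebot requests per day and the status codes served to them."""
    hits_per_day = Counter()
    status_counts = Counter()
    with open(log_path, encoding="utf-8", errors="replace") as fh:
        for line in fh:
            m = LOG_LINE.match(line)
            if not m or "Googlebot" not in m.group("agent"):
                continue
            hits_per_day[m.group("day")] += 1
            status_counts[m.group("status")] += 1
    return hits_per_day, status_counts

if __name__ == "__main__":
    days, statuses = googlebot_crawl_stats("access.log")  # placeholder path
    # Days are printed in log order, which is chronological for a typical access log.
    for day, count in days.items():
        print(day, count)
    print("Status codes served to Googlebot:", dict(statuses))
```

If your log format also records request duration (for example nginx’s $request_time), capturing that field as well gives a rough view of how quickly Googlebot is actually being served.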

From conversations with Google’s John Mueller on...



Read Full Story: https://searchengineland.com/100-percent-indexing-impossible-385773
