Google’s John Mueller explains why it’s impossible to crawl and discover every URL on the web.
In response to a question about why SEO tools don’t show all backlinks, Google’s Search Advocate John Mueller says it’s impossible to crawl the whole web.
Mueller made the statement in a comment on Reddit, in a thread started by a frustrated SEO professional.
The poster asked why the SEO tool they use isn't finding all of the links pointing to their site.
Which tool they're using isn't important, because, as Mueller explains, no tool can discover 100% of a website's inbound links.
Here’s why.
There’s No Way To Crawl The Web “Properly”
Mueller says there’s no objectively correct way to crawl the web because it has an infinite number of URLs.
No one has the resources to keep an endless number of URLs in a database, so web crawlers have to decide what's worth crawling.
As Mueller explains, that inevitably leads to URLs getting crawled infrequently or not at all.
“There’s no objective way to crawl the web properly.
It’s theoretically impossible to crawl it all, since the number of actual URLs is effectively infinite. Since nobody can afford to keep an infinite number of URLs in a database, all web crawlers make assumptions, simplifications, and guesses about what is realistically worth crawling.
And...
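To make the idea concrete, here is a minimal, hypothetical Python sketch of a bounded crawl frontier. The class name, the frontier limit, and the scoring heuristic are illustrative assumptions, not any real crawler's implementation. It shows why finite resources force a crawler to drop low-priority URLs, which is why no tool can promise complete link coverage.

```python
import heapq

# Illustrative sketch only -- not Google's or any SEO tool's actual crawler.
# Because the frontier is bounded, low-priority URLs get evicted and may
# be crawled infrequently or never, as Mueller describes.

FRONTIER_LIMIT = 1_000  # hypothetical budget; real systems use larger but still finite limits


class CrawlFrontier:
    def __init__(self, limit=FRONTIER_LIMIT):
        self.limit = limit
        self.heap = []       # min-heap of (score, url); lowest score is evicted first
        self.seen = set()

    def score(self, url, inbound_links=0):
        # Stand-in heuristic: prefer URLs with more known inbound links
        # and shallower paths. Real crawlers weigh many more signals.
        depth = url.count("/") - 2
        return inbound_links - depth

    def add(self, url, inbound_links=0):
        if url in self.seen:
            return
        self.seen.add(url)
        heapq.heappush(self.heap, (self.score(url, inbound_links), url))
        if len(self.heap) > self.limit:
            heapq.heappop(self.heap)  # evict the lowest-priority URL; it may never be crawled

    def next_batch(self, n=10):
        # Crawl the highest-priority URLs first.
        return [url for _, url in heapq.nlargest(n, self.heap)]
```

Under these assumptions, any URL that never scores high enough to stay in the frontier simply never gets fetched, which is the trade-off every crawler, including the ones behind SEO tools, has to make.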
Read Full Story: https://www.searchenginejournal.com/googles-john-mueller-its-impossible-to-crawl-the-whole-web/442683/