
Does Google Have A Problem With Big Robots.txt Files? - Search Engine Journal

Last updated Tuesday, January 18, 2022, 20:13 ET. Source: NewsService


Google addresses whether it’s a good SEO practice to keep robots.txt files within a reasonable size.

This topic is discussed by Google’s Search Advocate John Mueller during the Google Search Central SEO office-hours hangout recorded on January 14.

David Zieger, an SEO manager for a large news publisher in Germany, joins the livestream with concerns about a “huge” and “complex” robots.txt file.

How huge are we talking here?

Zieger says the file has over 1,500 lines, with a “multitude” of disallows that has kept growing over the years.

The disallows prevent Google from indexing HTML fragments and URLs where AJAX calls are used.

Zieger says it’s not possible to set a noindex directive, which would be another way to keep the fragments and URLs out of Google’s index, so he has resorted to filling the site’s robots.txt file with disallows.
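To illustrate the pattern Zieger describes, below is a minimal sketch using Python’s standard urllib.robotparser; the disallow rules and URLs are invented for illustration and are not the publisher’s actual file.

```python
from urllib import robotparser

# Hypothetical excerpt of the kind of rules described above: prefix disallows
# that keep crawlers away from HTML fragments and AJAX endpoints.
robots_txt = """\
User-agent: *
Disallow: /fragments/
Disallow: /ajax/
"""

parser = robotparser.RobotFileParser()
parser.parse(robots_txt.splitlines())

# Fragment and AJAX URLs are blocked for crawlers that honor robots.txt,
# while normal article URLs remain crawlable.
print(parser.can_fetch("Googlebot", "https://example.com/fragments/teaser.html"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/ajax/related-articles"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/article/12345"))          # True
```

Note that urllib.robotparser performs simple prefix matching, whereas Google’s parser also supports * and $ wildcards, so wildcard-based rules need a different tool to test.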

Are there any negative SEO effects that can result from a huge robots.txt file?

Here’s what Mueller says.

SEO Considerations For Large Robots.txt Files

A large robots.txt file will not directly cause any negative impact on a site’s SEO.

However, a large file is harder to maintain, which may lead to accidental issues down the road.

Mueller explains:

“No direct negative SEO issues with that, but it makes it a lot harder to maintain. And it makes it a lot easier to...
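The hangout doesn’t prescribe a maintenance process, but as a rough, hypothetical sketch of the kind of check Mueller’s concern suggests, a short script can flag Disallow paths that appear more than once as a file grows toward 1,500-plus lines.

```python
from collections import Counter

def duplicate_disallows(robots_txt: str) -> list[str]:
    """Return Disallow paths that are listed more than once."""
    paths = [
        line.split(":", 1)[1].strip()
        for line in robots_txt.splitlines()
        if line.lower().startswith("disallow:")
    ]
    return [path for path, count in Counter(paths).items() if count > 1]

# Example: the second "/ajax/" entry is reported as a duplicate.
print(duplicate_disallows(
    "User-agent: *\nDisallow: /ajax/\nDisallow: /fragments/\nDisallow: /ajax/"
))
```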



Read Full Story: https://www.searchenginejournal.com/does-google-have-a-problem-with-big-robots-txt-files/433932/
