Gary Illyes shared a nice little tidbit on LinkedIn about robots.txt files: only a tiny number of them are over 500 kilobytes. Most robots.txt files contain just a few lines of text, so this makes sense, but it is still a fun bit of knowledge.
Gary looked at over a billion robots.txt files that Google Search knows about and said only 7,188 of them were over 500 KiB. That is less than 0.000719%.
He wrote, "One would think that out of the billions (yes, with a B) of robots.txt files Google knows of more than 7188 would be larger in byte size than the 500kiB processing limit. Alas. No."
Yeah, the SEO point here is that Google can process up to 500 KiB of your robots.txt file, but most of those files don't even come close to that size.
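If you want to sanity-check your own file against that limit, a quick sketch in Python works. The 500 KiB figure comes from the post; the helper name and the sample robots.txt content are my own, not anything from Google.

```python
# Sketch: check whether a robots.txt file would exceed Google's 500 KiB
# processing limit, per the figure cited in the post. The function name
# is a hypothetical helper, not a Google API.

GOOGLE_ROBOTS_LIMIT_BYTES = 500 * 1024  # 500 KiB

def exceeds_google_limit(robots_txt: bytes) -> bool:
    """Return True if the file is larger than the 500 KiB limit."""
    return len(robots_txt) > GOOGLE_ROBOTS_LIMIT_BYTES

# A typical robots.txt is only a few lines, nowhere near the limit:
typical = b"User-agent: *\nDisallow: /private/\n"
print(exceeds_google_limit(typical))  # False

# Share of oversized files among one billion known robots.txt files:
print(f"{7188 / 1_000_000_000 * 100:.6f}%")  # 0.000719%
```

Reading the percentage off the numbers in the post, 7,188 out of one billion is about 0.000719%, which matches the figure Gary gave.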
Forum discussion at LinkedIn.