
Google On How To Reduce Duplication By A Factor Of 10X - Search Engine Roundtable

Last updated Friday, October 29, 2021 07:41 ET, Source: NewsService

I spotted an interesting comment from John Mueller of Google about URLs and reducing duplication. He said focusing on individual URLs here and there won't help you much; it is more about looking at where you can "reduce duplication by a factor of 10x." In the Reddit thread where he said this, he added that you should not focus on the "individual posts here and there" but rather look for duplication at scale.

He said, for example, "if you have 100k products and they all have 50 URLs each, changing that from 5M URLs to 500k URLs (5 URLs each) would be worth the effort." How does one product page end up with 50 URLs? Besides tracking parameters, there can be referral parameters, added product filters, and even bugs in your code that generate extra URLs. This is where technical SEOs shine: reducing this type of duplication at scale.
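As a rough illustration of the idea (this sketch is mine, not something from the article, and the parameter names like utm_source, ref, sessionid, sort and color are assumptions about what typical tracking, referral and filter parameters look like), here is how dozens of parameter-generated variants can be collapsed back to a single canonical product URL:

from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Illustrative list of parameters that do not change the page content.
NON_CANONICAL_PARAMS = {
    "utm_source", "utm_medium", "utm_campaign",  # tracking parameters
    "ref",                                       # referral parameter
    "sessionid",                                 # session/bug-generated parameter
    "sort", "color",                             # product filter parameters
}

def canonicalize(url: str) -> str:
    """Collapse parameter-generated duplicates of a product URL into one canonical URL."""
    scheme, netloc, path, query, _fragment = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(query) if k not in NON_CANONICAL_PARAMS]
    return urlunsplit((scheme, netloc, path.rstrip("/") or "/", urlencode(kept), ""))

# Variants like these all map back to the same canonical product URL:
print(canonicalize("https://example.com/product/widget?utm_source=news&ref=abc"))
print(canonicalize("https://example.com/product/widget?color=blue&sessionid=123"))
# both print: https://example.com/product/widget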

John added "that's usually also a clear technical thing, not something which depends on handwavy opinions."

Making these types of changes, where you go from 5 million URLs in Google's index to 500,000 URLs, can make a huge difference for your site in Google Search. You are not missing out on 4.5 million pages, because those URLs are just duplicates of the 500,000 canonical ones. It makes things cleaner and more consistent for Google, and it helps consolidate signals to the primary product or category page URL.

So when you find these URL duplication issues, talk to your development team about how you can just serve the canonical...
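One common way a development team might handle this (a minimal sketch, assuming a Python/Flask app; the route and the allow-listed "page" parameter are hypothetical, not taken from the article) is to 301-redirect parameterized requests to the clean URL and declare that URL in a rel="canonical" link:

from flask import Flask, redirect, request

app = Flask(__name__)

# Hypothetical allow-list: only parameters that actually change the page content.
CANONICAL_PARAMS = {"page"}

@app.route("/product/<slug>")
def product(slug):
    # If the request carries tracking, referral, or filter parameters,
    # 301 to the canonical URL instead of serving another duplicate page.
    extras = [k for k in request.args if k not in CANONICAL_PARAMS]
    if extras:
        return redirect(request.base_url, code=301)
    # Otherwise serve the page and declare the canonical URL in the markup.
    return f'<link rel="canonical" href="{request.base_url}"> ... product page for {slug}'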



Read Full Story: https://www.seroundtable.com/google-reduce-duplication-factor-32327.html
