Google says to look for where you can reduce duplicate URLs by a factor of 10

I spotted an interesting comment from Google’s John Mueller about URLs and duplicate reduction. He said that focusing on individual URLs here and there won’t do you much good; instead, look for where you can “reduce duplication by a factor of 10.” In the Reddit thread where he posted this, he added that one shouldn’t focus on the “individual posts here and there” but rather tackle the problem at scale.

For example, he said, “If you have 100,000 products and they all have 50 URLs, it would be worth the effort to change that from 5 million URLs to 500,000 URLs (5 URLs each).” How does a product page end up with 50 URLs? Aside from tracking parameters, there may be referral parameters, product filter parameters, and even bugs in your code that generate extra URLs. This is where technical SEOs shine, reducing this type of duplication at scale.
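To make that concrete, here is a minimal sketch of the kind of cleanup involved, assuming a hypothetical set of non-content parameters (utm_*, ref, and sessionid are placeholders; your own analytics and referral setup will use different names):

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical parameters that do not change page content; the real
# list depends on your own tracking, referral, and session handling.
NON_CANONICAL_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "ref", "sessionid"}

def canonicalize(url: str) -> str:
    """Strip tracking/referral parameters so duplicate product URLs collapse to one."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if k.lower() not in NON_CANONICAL_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(kept), ""))

urls = [
    "https://example.com/product/123?utm_source=newsletter",
    "https://example.com/product/123?ref=homepage&color=red",
    "https://example.com/product/123?color=red",
]
print({canonicalize(u) for u in urls})
# Three crawled URLs collapse to two: the bare product page and the color=red variant.
```

Run across a full crawl export, a helper like this lets you measure how many URL variants each product actually has before you decide where the factor-of-10 wins are.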

John added, “It’s usually a straight technical thing too, not something that depends on hand-waved opinions.”

These kinds of changes, going from 5 million URLs in the Google index down to 500,000, can make a huge difference to your website in Google Search. It’s not that you’re missing out on 4.5 million pages; those URLs were just duplicates of the same pages, now consolidated into the 500,000 URLs. It makes things cleaner and more consistent for Google and helps consolidate signals to the primary product or category page URL.

So if you are experiencing these URL duplication issues, speak to your development team about how to provide a canonical URL for users and Google. There are often better ways to track internal referrers, and there are likely bugs generating these dynamic URLs that can simply be fixed.
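As a sketch of what the page-level fix can look like (reusing the hypothetical canonicalize() helper from the earlier example), every parameterized variant of a product page can declare the same canonical URL in its head, via the standard rel="canonical" link element that Google supports:

```python
def canonical_link_tag(request_url: str) -> str:
    """Build a rel=canonical tag pointing at the clean version of the page.

    canonicalize() is the hypothetical helper sketched earlier; in practice
    this would live in your page template or response middleware."""
    return f'<link rel="canonical" href="{canonicalize(request_url)}">'

# Any tracked or referred variant of the product page emits the same tag:
print(canonical_link_tag("https://example.com/product/123?utm_source=newsletter"))
# -> <link rel="canonical" href="https://example.com/product/123">
```

The advantage of generating the tag from the request URL is that it keeps working even when a bug or a new tracking parameter spawns URL variants you didn’t anticipate.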

Forum discussion at Reddit.
