Yext at SMX East: The Truth About Duplicates

By Yext

Oct 8, 2014

1 min

Everyone in the local SEO community knows duplicate listings are bad news for both your business and your customers. They're also rampant: Yext recently studied 2,719 random business listings and uncovered over 44,000 duplicates, an average of 16.26 duplicates per location across the network.* Duplicates plague the ecosystem, yet they are nearly impossible to remove permanently.

Yext's EVP of Partnerships, Christian Ward, gave a talk at SMX East last week to explain how duplicates are created, why they are so problematic, and best practices for getting rid of them.

He emphasized that, contrary to popular belief, "fixing" duplicates at the source level isn't the solution; in fact, it can cause an unintended ripple effect, leading to even more duplicates down the road. Instead, marketers need to verify their business information at the publisher level to suppress duplicates once and for all.

Yext's API integrations with 50+ leading sites, maps, and apps scan for existing duplicates and let marketers flag them appropriately, so they're prevented from appearing at the most important level: the view on an actual publisher's site.
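
To picture the scan-and-flag workflow, here is a minimal sketch. It is a hypothetical illustration, not Yext's actual API: it assumes publisher listings arrive as simple dicts with name, address, and phone fields, uses a crude normalization to match records against the verified business data, and flags everything except the listing already claimed as canonical.

```python
# Hypothetical sketch only, not the Yext API: illustrates the general
# scan-and-flag pattern, assuming publisher listings are plain dicts
# with name, address, and phone fields.
import re

def normalize(listing):
    """Reduce name/address/phone to a rough match key
    (case, punctuation, and phone formatting removed)."""
    name = re.sub(r"\W+", "", listing["name"]).lower()
    address = re.sub(r"\W+", "", listing["address"]).lower()
    phone = re.sub(r"\D", "", listing["phone"])
    return (name, address, phone)

def flag_duplicates(verified, canonical_id, publisher_listings):
    """Flag every publisher listing that matches the verified business
    record, except the one already claimed as canonical."""
    key = normalize(verified)
    return [
        listing["id"]
        for listing in publisher_listings
        if normalize(listing) == key and listing["id"] != canonical_id
    ]

# Example: one location with two records on the same publisher
verified = {"name": "Joe's Pizza", "address": "12 Main St.", "phone": "(212) 555-0100"}
listings = [
    {"id": "pub-001", "name": "Joe's Pizza", "address": "12 Main St",  "phone": "212-555-0100"},
    {"id": "pub-417", "name": "Joes Pizza",  "address": "12 main st.", "phone": "(212) 555-0100"},
]
print(flag_duplicates(verified, canonical_id="pub-001", publisher_listings=listings))
# -> ['pub-417']
```

In practice the matching and suppression happen on the publisher's side through the integration; the point of the sketch is simply that duplicates are identified against verified data and flagged there, rather than "fixed" at the source level.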

To get an in-depth look at duplicate creation and best practices for effective suppression, download the whitepaper here or check out yext.com/resources.

*Source: Yext, September 2014.
