When Crowdsourced Answers Work Best — And When They Don’t

By Christian J. Ward

Sep 16, 2019

2 min

Amazon's new program, Alexa Answers, will now provide crowdsourced answers to questions Alexa is unable to answer — a move aimed at expanding the intelligent assistant's knowledge base.

The rollout reinforces a key shift in consumer behavior that has been underway for years — people want answers. Amazon's new solution is yet another demonstration of that fact. But what does this feature mean for Amazon users — and for which types of queries is it likely to be useful?

There have been many attempts at crowdsourcing answers in the past, across a variety of platforms. While some were helpful, the vast majority quickly ran into issues around moderation — checking the crowdsourced answers for accuracy and appropriateness. Amazon is at the forefront of natural language processing, and it will likely analyze every proposed answer from the crowd to drive higher accuracy and combat that problem.

That said, the solution could still face data compilation problems. If there are 20 different answers from the crowd for "where do I get a great gluten-free meal in Redbank, NJ?" then Amazon will need a way to structure and rank those answers — which means compilation, ranking, and accuracy benchmarking.
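To make the compilation and ranking problem concrete, here is a minimal, hypothetical Python sketch (not Amazon's actual pipeline) that groups repeated crowd answers and orders them by how many contributors agree. The restaurant names and the rank_crowd_answers helper are invented purely for illustration.

```python
# Hypothetical sketch of the compilation-and-ranking step: many crowd answers
# to one question must be grouped and ordered before one can be read back.
from collections import Counter

def rank_crowd_answers(answers: list[str], min_votes: int = 2) -> list[tuple[str, int]]:
    """Group identical answers (after light normalization) and rank them by vote count."""
    normalized = [a.strip().lower() for a in answers]
    counts = Counter(normalized)
    # Keep only answers the crowd agrees on at least `min_votes` times,
    # ordered from most to least supported.
    return [(answer, votes) for answer, votes in counts.most_common() if votes >= min_votes]

# Example: 20 crowd answers to "where do I get a great gluten-free meal in Redbank, NJ?"
crowd = (["The Corner Cafe"] * 9 + ["the corner cafe"] * 3
         + ["Main Street Bistro"] * 6 + ["My cousin's house"] * 2)
print(rank_crowd_answers(crowd))
# [('the corner cafe', 12), ('main street bistro', 6), ("my cousin's house", 2)]
```

Even this toy version shows why benchmarking matters: the most popular answer wins the ranking whether or not it is actually correct.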

And let's not forget: The crowd can be incredibly helpful, but the crowd can also be wrong.

What types of questions might the crowd answer well, and which might it not? Crowds are good at providing directional information, but they tend not to offer clear or accurate answers to questions that require a single response from an authoritative source. For this reason, the crowdsourced approach is extremely helpful for mining broad thoughts and opinions, but distinct, factual questions should still be answered from specific, accurate sources of truth.

Let's use an example: "What time does Redbank diner open in New Jersey?" is a question with one correct answer. People searching for this information need that right answer to determine whether, and when, to visit the diner. That type of question should have one authoritative source — the business itself.

On the other hand, "do people like to eat at Redbank diner in New Jersey?" is a great question for the crowd.

Essentially, the new Alexa feature is useful (and potentially a lot of fun) for matters of opinion or degree. But for questions where an errant answer could damage brand trust, you'll want brand-verified answers that ensure your customers get the right answer straight from the source of truth — your business.

Learn how Yext can help your business deliver brand-verified answers in search results.
