Our research evolved from an initial study, conducted by Yext, that revealed how responding to online reviews can affect business reputation. Businesses that respond to at least 50% of reviews see approximately a 0.35-star increase in their average rating.* This elevates a business's reputation online, especially in search results. Prior research has also shown that responding to 60-80% of reviews is optimal.** Depending on the review volume a business sees, this manual review and response-writing process can translate to multiple hours or even days of a full-time employee's workload. We set out to discover whether we could leverage the emerging technology of LLMs to automate and streamline the review response process for businesses through our Reviews monitoring platform.
A large language model (LLM) is an AI model trained on large data sets that can perform a variety of natural language processing tasks, such as generating text or answering questions in a conversational manner (you probably already know this if you've ever played around with the likes of Jasper or ChatGPT). LLMs are already used in other parts of the Yext platform, such as Chat and the Content Generation feature. We set out to explore whether AI and LLMs could be used for Reviews as well.
The foundation of our research lies in Yext Content, which is built on knowledge graph technology and stores brand-approved facts. Yext Content has four key features.
1. Maintains a flexible schema
Content allows for platform customizations to align with each individual business's needs and operational structure. For example, a healthcare system would need entity types that reflect healthcare professionals, hospital locations, medical specialties, and health articles, while a restaurant would have menu items, store locations, and special events. The knowledge graph (KG) schema can also change over time to adapt to evolving business needs and structures.
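As a rough illustration of what such a flexible schema looks like, here is a minimal Python sketch. The type and field names are hypothetical examples based on the verticals mentioned above, not Yext's actual schema API:

```python
# Hypothetical sketch of a flexible entity-type schema (not Yext's real API):
# each business registers only the entity types its vertical needs.
healthcare_schema = {
    "healthcareProfessional": {"fields": ["name", "specialty", "officeLocation"]},
    "hospitalLocation": {"fields": ["name", "address", "phone"]},
    "medicalSpecialty": {"fields": ["name", "description"]},
    "healthArticle": {"fields": ["title", "body", "reviewedBy"]},
}

restaurant_schema = {
    "menuItem": {"fields": ["name", "price", "calories"]},
    "storeLocation": {"fields": ["name", "address", "hours"]},
    "specialEvent": {"fields": ["name", "date", "description"]},
}

def add_entity_type(schema, type_name, fields):
    """Evolve the schema over time by registering a new entity type."""
    schema[type_name] = {"fields": list(fields)}

# The schema is not fixed: a restaurant can later add a new type.
add_entity_type(restaurant_schema, "cateringPackage", ["name", "price"])
```

The point is that the schema is data, not code: each business describes its own world, and the description can grow as the business does.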
2. Defines relationships between entities
Content also defines how entities are related to one another. For example, Doctor A works at the Union Square office. This additional information gives the LLM key context about how two objects are connected.
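One common way to represent such relationships, shown here as a hedged sketch rather than Yext's internal format, is as (subject, predicate, object) triples that can be gathered up as context for an LLM:

```python
# Hypothetical sketch: relationships stored as (subject, predicate, object)
# triples, the kind of factual context an LLM prompt can be grounded with.
relationships = [
    ("Doctor A", "works_at", "Union Square office"),
    ("Union Square office", "located_in", "Manhattan"),
]

def facts_for(entity, triples):
    """Collect every relationship mentioning an entity, e.g. to build prompt context."""
    return [t for t in triples if entity in (t[0], t[2])]
```

Calling `facts_for("Union Square office", relationships)` returns both triples above, giving a model everything the graph knows about that entity.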
3. Contains bi-directional relationship connections
Because it is built on graph technology, Content can make relationship connections bi-directional. For example, if Doctor A works at the Union Square office, then from the same relationship connection we can infer that the Union Square office has Doctor A working there. This additional context layer provides a wealth of information for drawing complex connections between entities.
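The bi-directional inference can be sketched as materializing an inverse edge for each stored relationship. This is an illustrative Python example with hypothetical predicate names, not Yext's implementation:

```python
# Hypothetical sketch of bi-directional relationship connections:
# storing one direction lets us derive the inverse automatically.
relationships = [
    ("Doctor A", "works_at", "Union Square office"),
]

# Map each predicate to its inverse (assumed names for illustration).
INVERSE = {"works_at": "employs"}

def with_inverses(triples):
    """Return the triples plus the inferred inverse edge for each one."""
    out = list(triples)
    for subject, predicate, obj in triples:
        if predicate in INVERSE:
            out.append((obj, INVERSE[predicate], subject))
    return out
```

With this, a single stored fact answers questions from either direction: "Where does Doctor A work?" and "Who works at the Union Square office?"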