Five9 has launched GenAI Studio, which allows organizations to take general-purpose, off-the-shelf generative artificial intelligence models, such as those from OpenAI, and customize them for the contact center in just a few clicks.
GenAI Studio will power all Five9 applications that use generative AI models, starting with Agent Assist AI Summaries. It will enable customers to create, deploy, and test generative AI prompts based on business-specific, contextual data. GenAI Studio also lets users control the data that generative AI can access and share.
With a low-code/no-code hub for managing generative AI models and prompts, GenAI Studio helps companies deliver personalized customer and employee experiences using contextually relevant customer interaction data.
Key capabilities of GenAI Studio include the following:
- Engine-agnostic layer that allows users to select from a range of generative AI models.
- Knowledge and personalization through integration with many forms of contextual data, enabling GenAI Studio to answer questions, summarize calls, or guide agents with information that is unique to the business and personalized to the customer (a generic sketch of this pattern follows the list).
- Testing, monitoring, and observability for quality management of generative AI models using real call transcripts and utterances to measure performance and make improvements.
- Prompt library, a repository of sample prompts.
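To make the underlying idea concrete, the sketch below shows, in generic Python, the kind of pattern these capabilities describe: business-specific context and a call transcript are merged into a reusable prompt template before being sent to an off-the-shelf model. The template, the `build_prompt` and `complete` functions, and the sample data are hypothetical stand-ins for illustration only; they are not Five9's API or GenAI Studio itself.

```python
# Generic illustration of injecting business-specific context into a reusable
# prompt template before sending it to an off-the-shelf generative AI model.
# All names and data here are hypothetical; this is not Five9's API.

# Reusable prompt template, analogous to an entry in a prompt library.
SUMMARY_PROMPT = """You are an assistant for {company}, a {industry} company.
Summarize the following customer call for the agent's records.
Customer tier: {customer_tier}

Call transcript:
{transcript}

Summary:"""


def build_prompt(transcript: str, business_context: dict) -> str:
    """Combine a raw call transcript with business-specific context."""
    return SUMMARY_PROMPT.format(transcript=transcript, **business_context)


def complete(prompt: str) -> str:
    """Stand-in for a call to any off-the-shelf generative AI model."""
    return f"[model output for a {len(prompt)}-character prompt]"


if __name__ == "__main__":
    context = {
        "company": "Example Telecom",
        "industry": "telecommunications",
        "customer_tier": "Gold",
    }
    transcript = "Agent: Thanks for calling. Customer: My bill looks wrong..."
    print(complete(build_prompt(transcript, context)))
```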
"Generative AI represents a revolutionary step forward in technology that can benefit the contact center. To harness it, we need the next generation of tools," said Jonathan Rosenberg, Five9's chief technology officer and head of AI, in a statement. "With GenAI Studio, customers can infuse off-the-shelf models with their unique data, making the generative AI model their own. Delivering this type of capability with both power and ease has been a challenge until now; it's in line with the core value proposition that Five9 has delivered and will continue to deliver in the future. GenAI Studio is the next big step in our journey to transform the contact center for the future."