Building AI-powered customer experiences using a modern communications hub
Customers today expect organizations to proactively meet their needs with personalized content delivered at the right time, place, and manner of their choosing. They seek dynamic, context-aware interactions with sophisticated conversations across all communication channels. This increasing demand puts pressure on organizations to transform their customer experience workflows to enhance loyalty and boost operational efficiency. While recent advances in Generative AI (GenAI), including hyper-personalization and Agentic AI, offer exciting possibilities, they also present new challenges. Organizations need a flexible, reusable architecture that allows them to incorporate GenAI into their existing customer engagement systems without requiring a complete overhaul of their current disparate solutions.
This blog post explores how to build an AI-powered modern communications hub using open-source GitHub samples that integrate SMS/MMS and WhatsApp services with GenAI capabilities. Organizations can create innovative AI-powered customer experiences with a quick proof-of-concept without disrupting existing systems.
In combination with Vector Databases and Retrieval Augmented Generation (RAG), GenAI makes it possible to consolidate knowledge into a single system and query it from a single user interface through natural language conversation with a chatbot or virtual assistant. Funneling customer communications through a multi-channel communications hub linked with GenAI capabilities unifies customer engagement mechanisms and streamlines the creation of rich customer experiences. Customers meet AI agents and Q&A bots on whichever communication channel is most convenient for self-serving their needs. Organizations can build communication-channel-agnostic customer experiences while collecting channel engagement events and conversational data in a centralized data store for real-time insights, ad-hoc queries, analytics, and ML training.
Solution overview
At the core of the solution is the Modern Communications Hub, which connects digital communication channels with key GenAI services, like HAQM Bedrock and HAQM Q, along with AWS ML, database, storage, and serverless computing services. AWS End User Messaging and HAQM SES provide API-level access to digital communication channels, offering secure, scalable, high-performance, and cost-effective services for enterprise applications to exchange SMS/MMS, WhatsApp, push and voice notifications, and email with customers.
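As an illustration of that API-level access, the following is a minimal sketch of sending an outbound SMS with the AWS SDK for JavaScript v3. The region and phone numbers are placeholders, and it assumes an origination identity has already been provisioned in AWS End User Messaging SMS.

```typescript
// Minimal sketch: send an outbound SMS through AWS End User Messaging SMS.
// Assumes an origination identity (e.g., a 10DLC number) is already provisioned;
// the region and phone numbers below are placeholders.
import {
  PinpointSMSVoiceV2Client,
  SendTextMessageCommand,
} from "@aws-sdk/client-pinpoint-sms-voice-v2";

const smsClient = new PinpointSMSVoiceV2Client({ region: "us-east-1" });

export async function sendSms(to: string, body: string): Promise<void> {
  await smsClient.send(
    new SendTextMessageCommand({
      DestinationPhoneNumber: to,          // customer's phone number in E.164 format
      OriginationIdentity: "+18885550100", // placeholder: your provisioned number or pool
      MessageBody: body,
      MessageType: "TRANSACTIONAL",
    })
  );
}
```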
A collection of open-source sample code, published in the AWS-samples GitHub repository, illustrates how to facilitate generative conversations on SMS/MMS and WhatsApp channels. This will be extended to include email services. Two key components form the foundation of the GenAI Integration Samples: the Multi-channel Chat with AI Agents and Q&A Bots and the Engagement Database and Analytics for End User Messaging and SES. We will simply refer to these as the Conversation Processor and Engagement Database in the solution diagram.
The Conversation Processor receives customer messages via AWS End User Messaging and HAQM Simple Email Service (SES), stores the conversation details, and invokes the relevant HAQM Bedrock Agent. HAQM Bedrock Agents use Large Language Models (LLMs) and knowledge bases to analyze tasks, break them into actionable steps, execute those steps or search the knowledge base, observe the outcomes, and iteratively refine their approach until the task is complete and a response is ready. Alternatively, the Conversation Processor can function as a Q&A bot, in which case it uses HAQM Bedrock Knowledge Bases with their RAG feature to generate an LLM answer and send it back on the same channel as the customer's message.
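A minimal sketch of the agent path follows: forward an inbound message to an HAQM Bedrock Agent and collect the streamed reply. The agent ID, alias ID, and region are placeholders; the published sample's actual code may differ.

```typescript
// Minimal sketch: invoke an HAQM Bedrock Agent with an inbound customer message
// and concatenate the streamed completion into a single reply.
import {
  BedrockAgentRuntimeClient,
  InvokeAgentCommand,
} from "@aws-sdk/client-bedrock-agent-runtime";

const agentClient = new BedrockAgentRuntimeClient({ region: "us-east-1" });

export async function askAgent(sessionId: string, inputText: string): Promise<string> {
  const response = await agentClient.send(
    new InvokeAgentCommand({
      agentId: "AGENT_ID",            // placeholder
      agentAliasId: "AGENT_ALIAS_ID", // placeholder
      sessionId,                      // reuse per conversation so the agent keeps context
      inputText,
    })
  );

  // The agent's answer is streamed back as chunks of bytes.
  let reply = "";
  for await (const event of response.completion ?? []) {
    if (event.chunk?.bytes) {
      reply += new TextDecoder().decode(event.chunk.bytes);
    }
  }
  return reply;
}
```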
The Engagement Database collects and combines customer engagement data and conversational logs from across communication channels, storing the information in a centralized data lake on HAQM S3. By converting the data into a common, canonical format, the solution simplifies querying and analysis of these inbound events. A Lambda Transformer function uses Apache Velocity Templates to transform the incoming JSON data into this canonical format, enabling real-time insights.
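To make the transformation step concrete, here is a minimal sketch of a Firehose transformation Lambda that maps channel-specific events to a canonical shape before they land in S3. The published sample uses Apache Velocity Templates; this TypeScript version, with illustrative field names, only shows the idea.

```typescript
// Minimal sketch: a Firehose transformation Lambda that converts channel-specific
// JSON events into a common, canonical format. Field names are illustrative only.
import {
  FirehoseTransformationEvent,
  FirehoseTransformationResult,
} from "aws-lambda";

export const handler = async (
  event: FirehoseTransformationEvent
): Promise<FirehoseTransformationResult> => {
  const records = event.records.map((record) => {
    const raw = JSON.parse(Buffer.from(record.data, "base64").toString("utf8"));

    // Hypothetical canonical event shape shared across SMS/MMS, WhatsApp, and email.
    const canonical = {
      channel: raw.channel ?? "SMS",
      direction: raw.direction ?? "INBOUND",
      sender: raw.originationNumber ?? raw.from,
      recipient: raw.destinationNumber ?? raw.to,
      body: raw.messageBody ?? raw.body,
      timestamp: raw.timestamp ?? new Date().toISOString(),
    };

    return {
      recordId: record.recordId,
      result: "Ok" as const,
      data: Buffer.from(JSON.stringify(canonical) + "\n").toString("base64"),
    };
  });

  return { records };
};
```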
The raw event data stored in the HAQM S3 data lake can then be fed into other AWS services for further processing. For example, the data can flow into HAQM Connect Customer Profiles or into HAQM SageMaker to support machine learning model training. Data analysts can use HAQM Athena to issue direct queries for detailed ad-hoc reporting, or send the data to HAQM QuickSight for advanced visualizations and natural language querying through HAQM Q in QuickSight.
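As an example of an ad-hoc query, the following sketch starts an Athena query over the canonical events. The database, table, column names, and results bucket are hypothetical and depend on how you catalog the S3 data.

```typescript
// Minimal sketch: run an ad-hoc Athena query over the canonical engagement events.
// Database, table, columns, and the results bucket are hypothetical placeholders.
import { AthenaClient, StartQueryExecutionCommand } from "@aws-sdk/client-athena";

const athena = new AthenaClient({ region: "us-east-1" });

export async function countMessagesByChannel(): Promise<string | undefined> {
  const result = await athena.send(
    new StartQueryExecutionCommand({
      QueryString: `
        SELECT channel, COUNT(*) AS messages
        FROM engagement_events
        GROUP BY channel
        ORDER BY messages DESC`,
      QueryExecutionContext: { Database: "engagement_db" },              // placeholder database
      ResultConfiguration: { OutputLocation: "s3://my-athena-results/" }, // placeholder bucket
    })
  );
  return result.QueryExecutionId; // poll this ID to fetch results when the query finishes
}
```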
NOTE: There is the potential for end users to send Personally Identifiable Information (PII) in messages. To protect customer privacy, consider using HAQM Comprehend to redact PII before storing messages in S3. The following blog post provides a good overview of how to use Comprehend to redact PII: Redact sensitive data from streaming data in near-real time using HAQM Comprehend and HAQM Kinesis Data Firehose.
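A minimal sketch of that redaction step is shown below, replacing each detected entity with its PII type before the message body is persisted. For streaming pipelines, the linked blog post shows how to do this inside a Firehose transformation Lambda.

```typescript
// Minimal sketch: redact PII in a message body with HAQM Comprehend before it
// is written to the S3 data lake.
import { ComprehendClient, DetectPiiEntitiesCommand } from "@aws-sdk/client-comprehend";

const comprehend = new ComprehendClient({ region: "us-east-1" });

export async function redactPii(text: string): Promise<string> {
  const { Entities = [] } = await comprehend.send(
    new DetectPiiEntitiesCommand({ Text: text, LanguageCode: "en" })
  );

  // Replace each detected entity with its type, working from the end of the
  // string so earlier offsets stay valid.
  let redacted = text;
  [...Entities]
    .sort((a, b) => (b.BeginOffset ?? 0) - (a.BeginOffset ?? 0))
    .forEach((entity) => {
      if (entity.BeginOffset != null && entity.EndOffset != null) {
        redacted =
          redacted.slice(0, entity.BeginOffset) +
          `[${entity.Type}]` +
          redacted.slice(entity.EndOffset);
      }
    });
  return redacted;
}
```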
HAQM Bedrock provides core GenAI capabilities, such as LLMs, Knowledge Bases, Retrieval Augmented Generation (RAG), AI agents, and Guardrails, to understand customer requests, determine what action to take, and decide what to communicate back. HAQM Bedrock Knowledge Bases provide organization-specific business knowledge and reasoning, while HAQM Bedrock Agents automate multistep tasks by seamlessly connecting with company systems, APIs, and data sources.
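For the Q&A bot path, the managed RAG API of HAQM Bedrock Knowledge Bases can generate a grounded answer in a single call. The sketch below assumes a placeholder knowledge base ID and uses Anthropic Claude 3 Sonnet as the generation model; your configuration may differ.

```typescript
// Minimal sketch: answer a question with HAQM Bedrock Knowledge Bases using the
// managed RAG (RetrieveAndGenerate) API. knowledgeBaseId is a placeholder.
import {
  BedrockAgentRuntimeClient,
  RetrieveAndGenerateCommand,
} from "@aws-sdk/client-bedrock-agent-runtime";

const runtime = new BedrockAgentRuntimeClient({ region: "us-east-1" });

export async function answerQuestion(question: string): Promise<string> {
  const response = await runtime.send(
    new RetrieveAndGenerateCommand({
      input: { text: question },
      retrieveAndGenerateConfiguration: {
        type: "KNOWLEDGE_BASE",
        knowledgeBaseConfiguration: {
          knowledgeBaseId: "KNOWLEDGE_BASE_ID", // placeholder
          modelArn:
            "arn:aws:bedrock:us-east-1::foundation-model/anthropic.claude-3-sonnet-20240229-v1:0",
        },
      },
    })
  );
  return response.output?.text ?? "";
}
```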
Prerequisites
The following prerequisites are necessary to build your modern communications hub:
- An AWS account. Sign up for an AWS account on the AWS website if you don’t have one.
- Appropriate AWS Identity and Access Management (IAM) roles and permissions for HAQM Bedrock, AWS End User Messaging, and HAQM S3. For more information, see Create a service role for model import.
- AWS End User Messaging configuration: You’ll need to configure the necessary origination identity in the AWS End User Messaging service to deliver messages via SMS or WhatsApp. If configuring SMS, a registered and active SMS origination phone number must be provisioned in AWS End User Messaging SMS (within the United States, use 10DLC or Toll-Free Numbers (TFNs)). If configuring WhatsApp, an active number that has been registered with Meta/WhatsApp should be provisioned in AWS End User Messaging Social. A quick way to check the status of your provisioned numbers is shown in the sketch after this list.
- HAQM Bedrock models: Anthropic Claude 3 Sonnet and Titan Text Embeddings V2 enabled in your region. Note that these are the default models used by the solution; however, you are free to experiment with different models.
- Docker installed and running – this is used locally to package resources for deployment.
- Node (> v18) and NPM (> v8.19) installed and configured on your computer.
- The AWS Command Line Interface (AWS CLI) installed and configured.
- AWS CDK (v2) installed and configured on your computer.
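The following minimal sketch lists the phone numbers provisioned in AWS End User Messaging SMS and their status, which is one way to confirm an active origination identity exists before deploying. The region is a placeholder.

```typescript
// Minimal sketch: list provisioned phone numbers and their status in
// AWS End User Messaging SMS to confirm an active origination identity.
import {
  PinpointSMSVoiceV2Client,
  DescribePhoneNumbersCommand,
} from "@aws-sdk/client-pinpoint-sms-voice-v2";

const client = new PinpointSMSVoiceV2Client({ region: "us-east-1" });

export async function listOriginationNumbers(): Promise<void> {
  const { PhoneNumbers = [] } = await client.send(new DescribePhoneNumbersCommand({}));
  for (const number of PhoneNumbers) {
    console.log(`${number.PhoneNumber} (${number.NumberType}): ${number.Status}`);
  }
}
```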
Deploy the Conversation Processor and Engagement Database
Deploy the following two solutions. While not required, it is best to deploy them in this order, as outputs from the Engagement Database can be used in the Multi-Channel Chat example:
- Engagement Database and Analytics for End User Messaging and SES
- Multi-channel Chat with AI Agents and Q&A Bots
Each solution contains detailed instructions to deploy the required services using the AWS Cloud Development Kit (CDK). The Engagement Database solution creates an HAQM Data Firehose stream that can be used as an input to the Multi-Channel Chat application, so that conversation data can be stored and queried in the Engagement Database, as sketched below.
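To show how the two solutions connect, here is a minimal sketch of forwarding a conversation event to the Firehose stream created by the Engagement Database stack. The stream name and region are placeholders taken from that stack's outputs.

```typescript
// Minimal sketch: forward a conversation event to the HAQM Data Firehose stream
// created by the Engagement Database stack. Stream name is a placeholder.
import { FirehoseClient, PutRecordCommand } from "@aws-sdk/client-firehose";

const firehose = new FirehoseClient({ region: "us-east-1" });

export async function logConversationEvent(event: Record<string, unknown>): Promise<void> {
  await firehose.send(
    new PutRecordCommand({
      DeliveryStreamName: "engagement-database-stream", // placeholder: use the stack output
      Record: { Data: Buffer.from(JSON.stringify(event) + "\n") },
    })
  );
}
```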
Multi-Channel Chat with AI Agents and Q&A Bot Data Sources
This solution demonstrates how users can interact with three different knowledge sources. You may not need all three; however, they should serve as good examples for building the right knowledge source for your particular use case:
- Build your Knowledge Base on HAQM Bedrock using HAQM S3. By default, the solution creates a Knowledge Base with an HAQM S3 bucket as its data source; upload documents to that bucket to populate the knowledge base (see the sketch after this list).
NOTE: The starter project creates an S3 bucket to store the documents used for the Bedrock Knowledge Base. Please consider using HAQM Macie to assist in the discovery of potentially sensitive data in S3 buckets. HAQM Macie can be enabled on a free trial for 30 days, up to 150GB per account.
- Build your Knowledge Base on HAQM Bedrock using a Web Crawler. Optionally configure your knowledge base to crawl one or more websites to populate it.
- HAQM Bedrock Agents: Optionally enable your users to chat with an HAQM Bedrock Agent. Agents have the added benefit of supporting knowledge bases for answering questions and walking users through collecting the information needed to automate a task, such as making a reservation. Sample agents are available in the HAQM Bedrock Agent Samples repository. Note that you will need to have an HAQM Bedrock Agent created in your region prior to deploying the solution.
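For the S3-backed option, the sketch below uploads a document to the bucket created by the solution and then triggers an ingestion job so the Knowledge Base re-indexes it. The bucket name, knowledge base ID, and data source ID are placeholders.

```typescript
// Minimal sketch: upload a document to the Knowledge Base's S3 data source and
// start an ingestion job to re-index it. Bucket and IDs are placeholders.
import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";
import { BedrockAgentClient, StartIngestionJobCommand } from "@aws-sdk/client-bedrock-agent";
import { readFile } from "node:fs/promises";

const s3 = new S3Client({ region: "us-east-1" });
const bedrockAgent = new BedrockAgentClient({ region: "us-east-1" });

export async function addDocument(localPath: string, key: string): Promise<void> {
  await s3.send(
    new PutObjectCommand({
      Bucket: "my-knowledge-base-documents", // placeholder: bucket created by the solution
      Key: key,
      Body: await readFile(localPath),
    })
  );

  await bedrockAgent.send(
    new StartIngestionJobCommand({
      knowledgeBaseId: "KNOWLEDGE_BASE_ID", // placeholder
      dataSourceId: "DATA_SOURCE_ID",       // placeholder
    })
  );
}
```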
Conclusion
A Modern Communications Hub, loosely coupled with core Generative AI services, establishes a composable foundation for building communication-channel-agnostic customer experiences. Build one by leveraging the GenAI Integration Samples, the Conversation Processor and the Engagement Database, and combining them with the secure, scalable, high-performance, and cost-effective digital communication services of AWS End User Messaging and HAQM SES. This provides a single point of conversational access to knowledge bases and agentic AI capabilities on HAQM Bedrock. Start experimenting with AI-powered customer experience innovations with a quick proof-of-concept that won’t interfere with your present customer engagement setup.