AWS Partner Network (APN) Blog
Category: Generative AI
Best Practices from Quantiphi for Unleashing Generative AI Functionality by Fine-Tuning LLMs
Fine-tuning large language models (LLMs) is crucial for leveraging their full potential across industries. Quantiphi explains how fine-tuning adapts general-purpose LLMs into domain-specific AI solutions. From personalized healthcare to more accurate financial predictions and streamlined legal reviews, fine-tuned models deliver transformative value and support customized, efficient, and responsible generative AI deployments.
Reimagining Vector Databases for the Generative AI Era with Pinecone Serverless on AWS
Pinecone has developed a novel serverless vector database architecture optimized for AI workloads like retrieval-augmented generation. Built on AWS, it decouples storage and compute and enables efficient intermittent querying of large datasets. This provides elasticity, fresher data, and major cost savings over traditional architectures. Pinecone serverless removes bottlenecks to building more knowledgeable AI applications economically at scale on AWS.
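As a rough illustration of the serverless model, the sketch below creates and queries a serverless index on AWS with the Pinecone Python client. The index name, dimension, region, and example vectors are placeholder assumptions, not details from the post.

```python
# A minimal sketch of creating and querying a Pinecone serverless index on AWS.
# Index name, dimension, region, and the example vectors are illustrative assumptions.
from pinecone import Pinecone, ServerlessSpec

pc = Pinecone(api_key="YOUR_API_KEY")

# Create a serverless index; storage and compute are decoupled and managed on AWS.
pc.create_index(
    name="rag-docs",
    dimension=1536,                      # must match your embedding model's output size
    metric="cosine",
    spec=ServerlessSpec(cloud="aws", region="us-east-1"),
)

index = pc.Index("rag-docs")

# Upsert a few embedded documents (vectors would come from an embedding model).
index.upsert(vectors=[
    {"id": "doc-1", "values": [0.1] * 1536, "metadata": {"title": "Q3 report"}},
    {"id": "doc-2", "values": [0.2] * 1536, "metadata": {"title": "Onboarding guide"}},
])

# Query intermittently; cost tracks reads and writes rather than provisioned capacity.
results = index.query(vector=[0.1] * 1536, top_k=3, include_metadata=True)
print(results)
```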
How Accenture’s CCE Solution Powered by AWS Generative AI Helps Improve Customer Experience
Contact centers can improve customer experiences using generative AI, which creates new content and conversations. Accenture’s Connected Customer Experience (CCE) solution incorporates AWS services to provide personalized human and AI interactions. It uses generative AI for agent assist, call summarization, and self-service FAQs. By leveraging generative AI on AWS, CCE aims to enhance agent productivity, reduce handle times, and deliver exceptional customer experiences.
Getting Started with Generative AI Using Hugging Face Platform on AWS
The Hugging Face Platform provides no-code and low-code solutions for deploying generative AI models on managed AWS infrastructure. Key features include Inference Endpoints for easy model deployment, Spaces for hosting machine learning apps, and AutoTrain for training state-of-the-art models without coding. Hugging Face is an AWS Generative AI Competency Partner whose mission is to democratize machine learning through open source, open science, and Hugging Face products and services.
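For readers who want to try the Inference Endpoints path, the hedged sketch below calls a deployed endpoint with the huggingface_hub client. The endpoint URL, token, and generation parameters are placeholders, not values from the post.

```python
# A minimal sketch of calling a Hugging Face Inference Endpoint deployed on AWS.
# The endpoint URL and token are placeholders; generation parameters are illustrative.
from huggingface_hub import InferenceClient

client = InferenceClient(
    model="https://<your-endpoint>.us-east-1.aws.endpoints.huggingface.cloud",  # endpoint URL
    token="hf_xxx",                                                             # HF access token
)

# Send a prompt to the hosted text-generation model.
response = client.text_generation(
    "Summarize the benefits of managed inference endpoints in two sentences.",
    max_new_tokens=120,
    temperature=0.7,
)
print(response)
```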
New Generative AI Insights for AWS Partners to Accelerate Your Customer Offerings
AWS embraces the “working backwards” approach to stay customer-focused. The Generative AI Center of Excellence (CoE) for AWS Partners applies this methodology and collects partner feedback to provide relevant insights, tools, and resources on leveraging generative AI. Recent updates to the CoE include customer research on generative AI adoption challenges, a usage maturity heatmap by industry, and five new use case deep dives covering telecom, automotive, intelligent document processing (IDP), contact centers, and financial analysts.
Revolutionize Your Business with AWS Generative AI Competency Partners
With generative AI’s ability to automate tasks, enhance productivity, and enable hyper-personalized customer experiences, businesses are seeking specialized expertise to build a successful generative AI strategy. To support this need, we’re excited to announce the AWS Generative AI Competency, an AWS Specialization that helps AWS customers more quickly adopt generative AI solutions and strategically position themselves for the future.
Generative AI Augments Marriott’s Cybersecurity Posture with AWS Partners Deloitte and Palo Alto Networks
Marriott’s CISO Arno Van Der Walt manages cybersecurity through a “human-centered, data-driven, technology-enabled” approach aimed at making security frictionless. Critical partnerships with AWS, Deloitte, and Palo Alto Networks leverage AI/ML to share threat data and make once-“impossible” autonomous security achievable. Together, their tri-party services provide an end-to-end platform that unifies business and security data to detect threats and enable rapid response.
How Shellkode Uses Amazon Bedrock to Convert Natural Language Queries to NoSQL Statements
Large language models available through Amazon Bedrock can generate MongoDB queries from natural language questions, transforming how users access NoSQL databases. This solution lets business users query MongoDB data in conversational English instead of code: it connects to MongoDB with PyMongo, generates queries with LangChain and Bedrock, runs them, and formats the results into natural-language answers.
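A minimal sketch of this pattern is shown below, using boto3’s Bedrock Converse API in place of LangChain for brevity. The collection, fields, model ID, and prompt wording are illustrative assumptions, not Shellkode’s actual implementation.

```python
# A rough sketch of the natural-language-to-MongoDB pattern described above.
# Collection name, fields, model ID, and prompts are assumptions for illustration only.
import json

import boto3
from pymongo import MongoClient

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")
collection = MongoClient("mongodb://localhost:27017")["sales"]["orders"]

MODEL_ID = "anthropic.claude-3-sonnet-20240229-v1:0"  # any Bedrock text model works here

def ask(prompt: str) -> str:
    """Send a single-turn prompt to a Bedrock model via the Converse API."""
    resp = bedrock.converse(
        modelId=MODEL_ID,
        messages=[{"role": "user", "content": [{"text": prompt}]}],
    )
    return resp["output"]["message"]["content"][0]["text"]

question = "How many orders over $500 were placed last month?"

# 1. Have the LLM translate the question into a MongoDB filter (JSON only).
query_json = ask(
    "Given a MongoDB collection 'orders' with fields total, status, created_at, "
    f"return ONLY a JSON filter document that answers: {question}"
)
mongo_filter = json.loads(query_json)

# 2. Run the generated filter with PyMongo.
docs = list(collection.find(mongo_filter).limit(20))

# 3. Have the LLM phrase the raw results as a natural-language answer.
answer = ask(f"Question: {question}\nQuery results: {docs}\nAnswer in plain English.")
print(answer)
```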
Transforming Customer Service with Rapyder’s Generative AI-Powered Call Agent Analyzer
Rapyder’s Call Agent Analyzer uses generative AI on AWS to revolutionize call center operations. It efficiently processes multilingual audio, summarizes calls, analyzes script adherence, and structures insights into actionable data. This solution helps businesses enhance customer satisfaction through data-driven call agent performance evaluation and training. As an AWS Partner, Rapyder provides cutting-edge cloud solutions that are reshaping industries like customer service.
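The simplified sketch below shows one way such a pipeline could be wired together on AWS, using Amazon Transcribe for multilingual transcription and an Amazon Bedrock model for summarization and script-adherence scoring. The bucket names, job name, model ID, and script checklist are placeholders, not Rapyder’s implementation.

```python
# A simplified sketch of a call-analysis pipeline: transcribe a recording,
# then summarize it and check script adherence with a Bedrock model.
# Buckets, job name, model ID, and the checklist are illustrative placeholders.
import json
import time

import boto3

transcribe = boto3.client("transcribe", region_name="us-east-1")
bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")
s3 = boto3.client("s3")

# 1. Transcribe the call; IdentifyLanguage handles multilingual audio.
transcribe.start_transcription_job(
    TranscriptionJobName="call-1234",
    Media={"MediaFileUri": "s3://call-recordings/call-1234.wav"},
    IdentifyLanguage=True,
    OutputBucketName="call-transcripts",
)
while True:
    job = transcribe.get_transcription_job(TranscriptionJobName="call-1234")
    if job["TranscriptionJob"]["TranscriptionJobStatus"] in ("COMPLETED", "FAILED"):
        break
    time.sleep(10)

body = s3.get_object(Bucket="call-transcripts", Key="call-1234.json")["Body"].read()
transcript = json.loads(body)["results"]["transcripts"][0]["transcript"]

# 2. Summarize the call and score script adherence with a Bedrock model.
prompt = (
    "Summarize this support call in 3 bullet points, then rate the agent's adherence "
    "to the script (greeting, identity verification, closing) from 1 to 5:\n" + transcript
)
resp = bedrock.converse(
    modelId="anthropic.claude-3-haiku-20240307-v1:0",
    messages=[{"role": "user", "content": [{"text": prompt}]}],
)
print(resp["output"]["message"]["content"][0]["text"])
```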
How Infosys is Reimagining Enterprise Solutions with Generative AI on AWS
Infosys is leveraging AWS generative AI capabilities like Amazon Bedrock and AWS Trainium to enhance its enterprise solutions. For example, Infosys Personalized Smart Video uses Amazon Bedrock to create rich, dynamic video content, while Infosys Cortex applies generative AI to analyze call transcripts and improve customer engagement. Overall, Infosys is rapidly adopting AWS’s flexible and scalable generative AI services to boost automation, productivity, and user experience across its portfolio.