AWS Machine Learning Blog
Category: Amazon Bedrock Prompt Management
Use Amazon Bedrock Intelligent Prompt Routing for cost and latency benefits
Today, we’re happy to announce the general availability of Amazon Bedrock Intelligent Prompt Routing. In this blog post, we share highlights from our internal testing, explain how to get started, and point out some caveats and best practices. We encourage you to incorporate Amazon Bedrock Intelligent Prompt Routing into your new and existing generative AI applications.
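As a rough sketch of what getting started might look like: with the Bedrock Converse API, a prompt router's ARN can be supplied where a model ID would normally go, and the router then selects a model for each request. The ARN, account ID, and helper names below are illustrative placeholders, not values from the post; calling the API requires a `boto3` Bedrock Runtime client and AWS credentials.

```python
# Hedged sketch, assuming a prompt router already exists in your account.
# The ARN below is a placeholder -- substitute your router's actual ARN.
ROUTER_ARN = "arn:aws:bedrock:us-east-1:111122223333:default-prompt-router/example"

def build_converse_request(user_text: str) -> dict:
    """Build a Converse API request that targets a prompt router rather than
    a fixed model, letting the router pick a model per request."""
    return {
        # A router ARN is accepted in place of a specific model ID.
        "modelId": ROUTER_ARN,
        "messages": [
            {"role": "user", "content": [{"text": user_text}]},
        ],
    }

def ask(client, user_text: str) -> str:
    """Send the request through the router and return the response text.

    `client` would be boto3.client("bedrock-runtime"); the call itself
    requires AWS credentials, so it is kept out of this sketch's test path.
    """
    response = client.converse(**build_converse_request(user_text))
    return response["output"]["message"]["content"][0]["text"]
```

Because the routing decision happens server-side, application code stays identical to a single-model integration; only the `modelId` changes.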
Generate training data and cost-effectively train categorical models with Amazon Bedrock
In this post, we explore how you can use Amazon Bedrock to generate high-quality categorical ground truth data, which is crucial for training machine learning (ML) models in a cost-sensitive environment. Generative AI solutions can play an invaluable role during the model development phase by simplifying training and test data creation for multiclass classification supervised learning use cases. We dive deep into the process of using XML tags to structure the prompt and guide Amazon Bedrock in generating a balanced, high-accuracy label dataset. We also showcase a real-world example of predicting the root cause category for support cases. This use case, solvable through ML, can enable support teams to better understand customer needs and optimize response strategies.
How Aetion is using generative AI and Amazon Bedrock to unlock hidden insights about patient populations
In this post, we review how Aetion’s Smart Subgroups Interpreter enables users to interact with Smart Subgroups using natural language queries. Powered by Amazon Bedrock and Anthropic’s Claude 3 large language models (LLMs), the interpreter responds to user questions expressed in conversational language about patient subgroups and provides insights to generate further hypotheses and evidence.