AWS announces HAQM Redshift integration with HAQM Bedrock for generative AI
AWS announces the integration of HAQM Redshift with HAQM Bedrock, a fully managed service offering high-performing foundation models (FMs), making it simpler and faster for you to build generative AI applications. This integration enables you to invoke large language models (LLMs) with simple SQL commands alongside your data in HAQM Redshift.
With this new feature, you can now easily perform generative AI tasks such as language translation, text generation, summarization, customer classification, and sentiment analysis on your Redshift data using popular FMs like Anthropic Claude, HAQM Titan, Llama 2, and Mistral AI. First, your Redshift administrator adds a policy that permits invoking HAQM Bedrock models to the IAM role attached to your Redshift Serverless namespace or provisioned cluster. Then, you can simply use the CREATE EXTERNAL MODEL command to point to an LLM in HAQM Bedrock, without requiring any model training or provisioning. You can invoke these models using familiar SQL commands, making it easier than ever to integrate generative AI capabilities into your data analytics workflows. You do not incur additional HAQM Redshift charges for using LLMs with HAQM Redshift ML beyond standard HAQM Bedrock pricing.
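For illustration, here is a minimal sketch of that flow, assuming a hypothetical customer_reviews table with review_id and review_text columns; the model name, function name, model ID, and prompt below are example values, and the full set of SETTINGS options is covered in the HAQM Redshift ML documentation:

    -- Point a Redshift ML external model at an HAQM Bedrock foundation model.
    -- No model training or provisioning occurs; this only registers a SQL function.
    CREATE EXTERNAL MODEL review_summarizer
    FUNCTION summarize_review
    IAM_ROLE default
    MODEL_TYPE BEDROCK
    SETTINGS (
        MODEL_ID 'anthropic.claude-v2:1',
        PROMPT 'Summarize the following customer review:');

    -- Invoke the model with familiar SQL against the hypothetical reviews table.
    SELECT review_id,
           summarize_review(review_text) AS summary
    FROM customer_reviews
    LIMIT 10;

The generated function behaves like any other scalar SQL function, so it can also appear in expressions such as a WHERE clause or a CREATE TABLE AS statement as part of a broader analytics workflow.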
HAQM Redshift integration with HAQM Bedrock is now generally available in all AWS Regions where HAQM Bedrock and HAQM Redshift ML are supported. To get started, visit the HAQM Redshift machine learning documentation and the HAQM Bedrock product page.