AWS Machine Learning Blog

Tag: Generative AI

Build an AI-powered document processing platform with open source NER model and LLM on Amazon SageMaker

In this post, we discuss how you can build an AI-powered document processing platform with an open source NER model and LLMs on SageMaker.

Optimizing Mixtral 8x7B on Amazon SageMaker with AWS Inferentia2

This post demonstrates how to deploy and serve the Mixtral 8x7B language model on AWS Inferentia2 instances for cost-effective, high-performance inference. We walk through model compilation with Hugging Face Optimum Neuron, which provides a set of tools for straightforward model loading, training, and inference, and through deployment with the Text Generation Inference (TGI) container, Hugging Face's toolkit for deploying and serving LLMs.
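For readers who want a feel for the compilation step, the following is a minimal sketch using Optimum Neuron's NeuronModelForCausalLM. The model ID, batch size, sequence length, core count, and precision are illustrative assumptions, not the exact configuration from the post.

```python
# Minimal compilation sketch with Hugging Face Optimum Neuron.
# Shapes, core count, and precision are assumptions for illustration.
from optimum.neuron import NeuronModelForCausalLM
from transformers import AutoTokenizer

model_id = "mistralai/Mixtral-8x7B-Instruct-v0.1"

# export=True triggers ahead-of-time compilation to a Neuron graph with
# static input shapes; inference must later use the same shapes.
model = NeuronModelForCausalLM.from_pretrained(
    model_id,
    export=True,
    batch_size=1,
    sequence_length=4096,
    num_cores=24,           # e.g., all Neuron cores on an inf2.48xlarge
    auto_cast_type="bf16",
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model.save_pretrained("mixtral-8x7b-neuron")      # serialized Neuron artifacts
tokenizer.save_pretrained("mixtral-8x7b-neuron")
```

The compiled artifacts can then be packaged behind the TGI container for serving; the static shapes chosen at compile time must match the shapes used at inference.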

Automating regulatory compliance: A multi-agent solution using Amazon Bedrock and CrewAI

In this post, we explore how AI agents can streamline compliance and fulfill regulatory requirements for financial institutions using Amazon Bedrock and CrewAI. We demonstrate how to build a multi-agent system that can automatically summarize new regulations, assess their impact on operations, and provide prescriptive technical guidance. You’ll learn how to use Amazon Bedrock Knowledge Bases and Amazon Bedrock Agents with CrewAI to create a comprehensive, automated compliance solution.
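As a rough illustration of the multi-agent pattern (not the post's exact code), here is a minimal CrewAI sketch that routes agent reasoning through an Amazon Bedrock-hosted model. It assumes a recent CrewAI version that exposes an LLM wrapper accepting LiteLLM-style model strings; the agent roles, task prompts, and model ID are placeholders.

```python
# Simplified two-agent compliance flow with CrewAI and an Amazon Bedrock model.
# Roles, prompts, and the model ID are illustrative placeholders.
from crewai import Agent, Task, Crew, LLM

bedrock_llm = LLM(model="bedrock/anthropic.claude-3-sonnet-20240229-v1:0")

summarizer = Agent(
    role="Regulation summarizer",
    goal="Summarize new regulatory publications in plain language",
    backstory="A compliance analyst specializing in financial regulation",
    llm=bedrock_llm,
)
assessor = Agent(
    role="Impact assessor",
    goal="Assess how a regulation affects operations and systems",
    backstory="A risk officer familiar with the institution's processes",
    llm=bedrock_llm,
)

summarize_task = Task(
    description="Summarize this regulation: {regulation_text}",
    expected_output="A concise summary of obligations and deadlines",
    agent=summarizer,
)
assess_task = Task(
    description="Assess the operational impact of the summarized regulation",
    expected_output="A prioritized list of impacted processes with guidance",
    agent=assessor,
)

crew = Crew(agents=[summarizer, assessor], tasks=[summarize_task, assess_task])
result = crew.kickoff(inputs={"regulation_text": "<regulation text here>"})
print(result)
```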

Build an enterprise synthetic data strategy using Amazon Bedrock

In this post, we explore how to use Amazon Bedrock for synthetic data generation, weighing the challenges against the potential benefits to develop effective strategies for applications across multiple industries, including AI and machine learning (ML).

Shaping the future: OMRON’s data-driven journey with AWS

OMRON Corporation is a leading technology provider in industrial automation, healthcare, and electronic components. In their Shaping the Future 2030 (SF2030) strategic plan, OMRON aims to address diverse social issues, drive sustainable business growth, transform business models and capabilities, and accelerate digital transformation. At the heart of this transformation is the OMRON Data & Analytics Platform (ODAP), an innovative initiative designed to revolutionize how the company harnesses its data assets. This post explores how OMRON Europe is using Amazon Web Services (AWS) to build its advanced ODAP and its progress toward harnessing the power of generative AI.

Generate training data and cost-effectively train categorical models with Amazon Bedrock

In this post, we explore how you can use Amazon Bedrock to generate high-quality categorical ground truth data, which is crucial for training machine learning (ML) models in a cost-sensitive environment. Generative AI solutions can play an invaluable role during the model development phase by simplifying training and test data creation for multiclass classification supervised learning use cases. We dive deep into how to use XML tags to structure the prompt and guide Amazon Bedrock in generating a balanced, high-accuracy label dataset. We also showcase a real-world example of predicting the root cause category for support cases. This use case, solvable through ML, can enable support teams to better understand customer needs and optimize response strategies.
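To make the XML-tag prompting pattern concrete, here is a hedged sketch that calls Amazon Bedrock through boto3's Converse API. The model ID, category names, and tag structure are illustrative assumptions rather than the post's exact prompt.

```python
# Hedged sketch: prompting a Bedrock model with XML tags to produce a
# labeled example for a multiclass classifier. Model ID, categories, and
# tag names are illustrative.
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

prompt = """You generate synthetic support cases for model training.
<categories>
  <category>billing</category>
  <category>connectivity</category>
  <category>account_access</category>
</categories>
<instructions>
Write one realistic support case description and assign exactly one category
from the list above. Return the description inside <case> tags and the
category inside <label> tags.
</instructions>"""

response = bedrock.converse(
    modelId="anthropic.claude-3-haiku-20240307-v1:0",
    messages=[{"role": "user", "content": [{"text": prompt}]}],
    inferenceConfig={"maxTokens": 512, "temperature": 0.8},
)

# The generated <case> and <label> pair can be parsed and appended to the
# training set.
print(response["output"]["message"]["content"][0]["text"])
```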


Build a generative AI enabled virtual IT troubleshooting assistant using Amazon Q Business

Discover how to build a generative AI-powered virtual IT troubleshooting assistant using Amazon Q Business. This solution integrates with popular ITSM tools like ServiceNow, Atlassian Jira, and Confluence to streamline information retrieval and enhance collaboration across your organization. By harnessing the power of generative AI, the assistant can significantly boost operational efficiency and provide 24/7 support tailored to individual needs. Learn how to set up, configure, and use this solution to transform your enterprise information management.
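Once an Amazon Q Business application is set up, the assistant can also be queried programmatically. The sketch below uses boto3's qbusiness ChatSync API; the application ID and the example question are hypothetical placeholders, and it assumes your AWS credentials map to an authorized user.

```python
# Hedged sketch: querying an existing Amazon Q Business application with the
# ChatSync API. The application ID and question are hypothetical placeholders.
import boto3

qbusiness = boto3.client("qbusiness", region_name="us-east-1")

response = qbusiness.chat_sync(
    applicationId="your-q-business-application-id",   # placeholder
    userMessage=(
        "My VPN keeps disconnecting after the latest laptop update. "
        "Is there a known ServiceNow KB article or Jira ticket for this?"
    ),
)

print(response["systemMessage"])                       # the assistant's answer
for source in response.get("sourceAttributions", []):  # cited ITSM documents
    print("-", source.get("title"), source.get("url"))
```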

Integrate generative AI capabilities into Microsoft Office using Amazon Bedrock

In this blog post, we showcase a solution that seamlessly integrates AWS generative AI capabilities, in the form of large language models (LLMs) accessed through Amazon Bedrock, into the Microsoft Office experience. By harnessing the latest advancements in generative AI, we empower employees to unlock new levels of efficiency and creativity within the tools they already use every day.