AWS Machine Learning Blog
GuardianGamer scales family-safe cloud gaming with AWS
In this post, we share how GuardianGamer uses AWS services including HAQM Nova and HAQM Bedrock to deliver a scalable and efficient supervision platform. The team uses HAQM Nova for intelligent narrative generation to provide parents with meaningful insights into their children’s gaming activities and social interactions, while maintaining a non-intrusive approach to monitoring.
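As a sketch of how such narrative generation might look, the snippet below calls a Nova model through the HAQM Bedrock Converse API to turn a gameplay summary into a short parent-facing recap. The model ID, prompt, and input text are illustrative assumptions, not details from GuardianGamer's implementation.

```python
import boto3

# Hypothetical sketch: summarize gameplay events into a parent-facing narrative.
# Model ID and prompts are assumptions, not GuardianGamer's actual configuration.
bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

gameplay_events = (
    "Played Minecraft for 45 minutes with two friends; built a castle together; "
    "one brief disagreement about resources, resolved amicably."
)

response = bedrock.converse(
    modelId="amazon.nova-lite-v1:0",  # assumed Nova model ID
    system=[{"text": "You write brief, reassuring recaps of children's gaming "
                     "sessions for parents."}],
    messages=[{"role": "user", "content": [{"text": gameplay_events}]}],
    inferenceConfig={"maxTokens": 300, "temperature": 0.3},
)

print(response["output"]["message"]["content"][0]["text"])
```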
Principal Financial Group increases Voice Virtual Assistant performance using Genesys, HAQM Lex, and HAQM QuickSight
In this post, we explore how Principal built an integrated voice virtual assistant (VA) reporting and analytics solution using an HAQM QuickSight dashboard.
Optimize query responses with user feedback using HAQM Bedrock embedding and few-shot prompting
This post demonstrates how HAQM Bedrock, combined with a user feedback dataset and few-shot prompting, can refine responses for higher user satisfaction. By using HAQM Titan Text Embeddings v2, we show a statistically significant improvement in response quality, making this approach a valuable tool for applications seeking accurate and personalized responses.
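A minimal sketch of the retrieval side of this pattern: embed the incoming query with Titan Text Embeddings v2, rank past queries from a feedback dataset by cosine similarity, and splice the best-rated answers into a few-shot prompt. The dataset fields and example queries are assumptions for illustration.

```python
import json
import boto3
import numpy as np

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

def embed(text: str) -> np.ndarray:
    # Titan Text Embeddings v2 via the Bedrock InvokeModel API.
    body = json.dumps({"inputText": text})
    resp = bedrock.invoke_model(modelId="amazon.titan-embed-text-v2:0", body=body)
    return np.array(json.loads(resp["body"].read())["embedding"])

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical feedback dataset: past queries with answers users rated highly.
# In practice these embeddings would be precomputed and stored.
feedback = [
    {"query": "How do I rotate AWS access keys?",
     "answer": "Create a second key, switch your apps over, then delete the old one."},
    {"query": "How do I reset my MFA device?",
     "answer": "Deactivate the device in IAM, then register a new one."},
]

query = "What's the safest way to rotate my access keys?"
q_vec = embed(query)
ranked = sorted(feedback, key=lambda ex: cosine(q_vec, embed(ex["query"])), reverse=True)

# Build a few-shot prompt from the top-ranked feedback examples.
shots = "\n\n".join(f"Q: {ex['query']}\nA: {ex['answer']}" for ex in ranked[:2])
prompt = f"{shots}\n\nQ: {query}\nA:"
print(prompt)
```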
Boosting team productivity with HAQM Q Business Microsoft 365 integrations for Outlook and Word
HAQM Q Business integration with Microsoft 365 applications offers powerful AI assistance directly within the tools that your team already uses daily. In this post, we explore how these integrations for Outlook and Word can transform your workflow.
Integrate HAQM Bedrock Agents with Slack
In this post, we present a solution to incorporate HAQM Bedrock Agents in your Slack workspace. We guide you through configuring a Slack workspace, deploying integration components in HAQM Web Services, and using this solution.
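The heart of such an integration is a handler that receives Slack events and forwards the message text to an agent. The sketch below, a simplified AWS Lambda handler, answers Slack's URL verification challenge and relays app mentions to HAQM Bedrock Agents via invoke_agent; the environment variables, session strategy, and omitted request-signature verification are assumptions for illustration.

```python
import json
import os
import urllib.request
import boto3

agents = boto3.client("bedrock-agent-runtime")

def handler(event, context):
    body = json.loads(event["body"])

    # Slack sends a one-time challenge when the events URL is registered.
    if body.get("type") == "url_verification":
        return {"statusCode": 200, "body": body["challenge"]}

    slack_event = body.get("event", {})
    if slack_event.get("type") == "app_mention":
        # Stream the agent's completion chunks into a single reply.
        resp = agents.invoke_agent(
            agentId=os.environ["AGENT_ID"],             # assumed env vars
            agentAliasId=os.environ["AGENT_ALIAS_ID"],
            sessionId=slack_event["channel"],           # one session per channel
            inputText=slack_event["text"],
        )
        answer = "".join(
            e["chunk"]["bytes"].decode() for e in resp["completion"] if "chunk" in e
        )

        # Post the reply back to the channel (signature verification omitted here).
        req = urllib.request.Request(
            "https://slack.com/api/chat.postMessage",
            data=json.dumps({"channel": slack_event["channel"], "text": answer}).encode(),
            headers={
                "Content-Type": "application/json",
                "Authorization": f"Bearer {os.environ['SLACK_BOT_TOKEN']}",
            },
        )
        urllib.request.urlopen(req)

    return {"statusCode": 200, "body": ""}
```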
Secure distributed logging in scalable multi-account deployments using HAQM Bedrock and LangChain
In this post, we present a solution for secure distributed logging in scalable multi-account deployments using HAQM Bedrock and LangChain.
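One plausible building block for such a solution is a LangChain callback handler that writes each model invocation to a central HAQM CloudWatch log group that a security account can read. The handler below is a minimal sketch; the log group and stream names and the fields captured are assumptions, not the post's exact design.

```python
import json
import time
import boto3
from langchain_core.callbacks import BaseCallbackHandler

class CentralAuditLogger(BaseCallbackHandler):
    """Ships LLM invocation metadata to a shared CloudWatch log stream."""

    def __init__(self, log_group: str, log_stream: str):
        self.logs = boto3.client("logs")
        self.group, self.stream = log_group, log_stream  # assumed to already exist

    def _emit(self, record: dict) -> None:
        self.logs.put_log_events(
            logGroupName=self.group,
            logStreamName=self.stream,
            logEvents=[{"timestamp": int(time.time() * 1000),
                        "message": json.dumps(record)}],
        )

    def on_llm_start(self, serialized, prompts, **kwargs):
        # Log prompt metadata only; avoid persisting raw sensitive content.
        self._emit({"event": "llm_start", "prompt_chars": sum(map(len, prompts))})

    def on_llm_end(self, response, **kwargs):
        self._emit({"event": "llm_end"})
```

The handler can then be passed to a LangChain chat model with `callbacks=[CentralAuditLogger(...)]`, so every invocation in every account lands in the same audit trail.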
Build a domain-aware data preprocessing pipeline: A multi-agent collaboration approach
In this post, we introduce a multi-agent collaboration pipeline for processing unstructured insurance data using HAQM Bedrock, featuring specialized agents for classification, conversion, and metadata extraction. We demonstrate how this domain-aware approach transforms diverse data formats like claims documents, videos, and audio files into metadata-rich outputs that enable fraud detection, customer 360-degree views, and advanced analytics.
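As an illustration of what the classification agent in such a pipeline might do, the sketch below asks a Bedrock model to label an insurance document and return structured metadata as JSON. The label taxonomy, prompt, and model ID are assumptions, not the post's exact implementation.

```python
import json
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

# Assumed document taxonomy for the classification agent.
LABELS = ["claim_form", "police_report", "medical_record", "correspondence"]

def classify_document(text: str) -> dict:
    """Classification agent: returns a label plus extracted metadata as JSON."""
    prompt = (
        f"Classify this insurance document as one of {LABELS} and extract any "
        "claim number and claimant name. Reply with JSON only, using the keys "
        f"'label', 'claim_number', 'claimant'.\n\nDocument:\n{text}"
    )
    resp = bedrock.converse(
        modelId="amazon.nova-lite-v1:0",  # assumed model choice
        messages=[{"role": "user", "content": [{"text": prompt}]}],
        inferenceConfig={"maxTokens": 200, "temperature": 0},
    )
    # A production agent would validate the model's JSON before use.
    return json.loads(resp["output"]["message"]["content"][0]["text"])

print(classify_document("Claim #A-1042 filed by J. Doe after a rear-end collision..."))
```

Downstream agents for conversion and metadata extraction would consume this label to route claims documents, videos, and audio files to format-specific handlers.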
Automating complex document processing: How Onity Group built an intelligent solution using HAQM Bedrock
In this post, we explore how Onity Group, a financial services company specializing in mortgage servicing and origination, transformed their document processing capabilities using HAQM Bedrock and other AWS services. The solution helped Onity achieve a 50% reduction in document extraction costs while improving overall accuracy by 20% compared to their previous OCR and AI/ML solution.
HERE Technologies boosts developer productivity with new generative AI-powered coding assistant
HERE Technologies collaborated with the AWS Generative AI Innovation Center (GenAIIC) to create an intelligent AI coding assistant that provides explanations and executable code solutions in response to users' natural language queries. The requirement was to build a scalable system that could translate natural language questions into HTML code with embedded JavaScript, ready for immediate rendering as an interactive map on screen.
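Conceptually, the generation step reduces to prompting a model for a self-contained HTML page and rendering its output. The sketch below shows that shape using the HAQM Bedrock Converse API; the system prompt, model ID, and file handling are illustrative assumptions rather than HERE's production pipeline.

```python
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

SYSTEM = (
    "You generate complete, self-contained HTML documents with embedded "
    "JavaScript that render interactive HERE maps. Output only the HTML."
)

question = "Show a map centered on Berlin with a marker at the Brandenburg Gate."

resp = bedrock.converse(
    modelId="amazon.nova-pro-v1:0",  # assumed model ID
    system=[{"text": SYSTEM}],
    messages=[{"role": "user", "content": [{"text": question}]}],
    inferenceConfig={"maxTokens": 2000, "temperature": 0.2},
)

html = resp["output"]["message"]["content"][0]["text"]
with open("map.html", "w") as f:  # open map.html in a browser to view the map
    f.write(html)
```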
Set up a custom plugin on HAQM Q Business and authenticate with HAQM Cognito to interact with backend systems
In this post, we demonstrate how to build a custom plugin with HAQM Q Business for backend integration. The plugin can connect existing systems, including third-party systems, with little to no custom development and can automate critical workflows in a matter of weeks. Additionally, we show how to safeguard the solution using HAQM Cognito and AWS IAM Identity Center, maintaining the safety and integrity of sensitive data and workflows.
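For the authentication piece, the plugin's calls to a backend API can be authorized with an OAuth token from HAQM Cognito. The sketch below fetches a token via the client-credentials flow against a Cognito domain's token endpoint; the domain, client credentials, and scope are hypothetical placeholders.

```python
import base64
import json
import urllib.parse
import urllib.request

# Hypothetical Cognito app client configured for the client-credentials grant.
TOKEN_URL = "https://example-domain.auth.us-east-1.amazoncognito.com/oauth2/token"
CLIENT_ID = "your-app-client-id"          # placeholder
CLIENT_SECRET = "your-app-client-secret"  # placeholder

def get_access_token() -> str:
    """Exchanges app client credentials for an OAuth access token."""
    creds = base64.b64encode(f"{CLIENT_ID}:{CLIENT_SECRET}".encode()).decode()
    data = urllib.parse.urlencode(
        {"grant_type": "client_credentials", "scope": "backend/invoke"}  # assumed scope
    ).encode()
    req = urllib.request.Request(
        TOKEN_URL,
        data=data,
        headers={
            "Authorization": f"Basic {creds}",
            "Content-Type": "application/x-www-form-urlencoded",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["access_token"]

# The plugin then sends this token as a Bearer header on backend API requests.
```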