AWS Machine Learning Blog
Category: HAQM SageMaker Data & AI Governance
Deploy HAQM SageMaker Projects with Terraform Cloud
In this post, you define, deploy, and provision a SageMaker Projects custom template purely in Terraform. With no dependencies on other IaC tools, you can now enable SageMaker Projects strictly within your Terraform Enterprise infrastructure.
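As a rough illustration of the approach, a SageMaker Project backed by a custom template can be declared with the Terraform AWS provider's `aws_sagemaker_project` resource, which points at a Service Catalog product. This is a minimal sketch, not the post's full solution; the product and provisioning-artifact IDs below are placeholders you would replace with the ones from your registered template.

```hcl
# Minimal sketch: provision a SageMaker Project from a custom template
# that has been registered as a Service Catalog product.
# The product_id and provisioning_artifact_id values are placeholders.
resource "aws_sagemaker_project" "example" {
  project_name        = "terraform-managed-project"
  project_description = "SageMaker Project provisioned via Terraform Cloud"

  service_catalog_provisioning_details {
    product_id               = "prod-xxxxxxxxxxxxx"
    provisioning_artifact_id = "pa-xxxxxxxxxxxxx"
  }
}
```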
Responsible AI in action: How Data Reply red teaming supports generative AI safety on AWS
In this post, we explore how AWS services can be seamlessly integrated with open source tools to help establish a robust red teaming mechanism within your organization. Specifically, we discuss Data Reply’s red teaming solution, a comprehensive blueprint to enhance AI safety and responsible AI practices.
How iFood built a platform to run hundreds of machine learning models with HAQM SageMaker Inference
In this post, we show how iFood uses SageMaker to revolutionize its ML operations. By harnessing the power of SageMaker, iFood streamlines the entire ML lifecycle, from model training to deployment. This integration not only simplifies complex processes but also automates critical tasks.
Unlock cost-effective AI inference using HAQM Bedrock serverless capabilities with an HAQM SageMaker trained model
HAQM Bedrock is a fully managed service that offers a choice of high-performing foundation models (FMs) from leading AI companies such as AI21 Labs, Anthropic, Cohere, Meta, Mistral AI, Stability AI, and HAQM through a single API, along with a broad set of capabilities to build generative AI applications with security, privacy, and responsible AI. In this post, I’ll show you how to use HAQM Bedrock—with its fully managed, on-demand API—with your HAQM SageMaker trained or fine-tuned model.
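To make the on-demand invocation pattern concrete, here is a hedged Python sketch using boto3's `bedrock-runtime` client. The `build_invoke_request` helper and the Llama-style payload schema are assumptions for illustration, since the exact request body depends on the model family you imported; the model ARN is a placeholder for your imported custom model.

```python
import json


def build_invoke_request(prompt: str, max_tokens: int = 256) -> dict:
    # Assumption: a Llama-style request schema. The payload shape
    # varies by model family, so adjust the keys for your model.
    return {
        "prompt": prompt,
        "max_gen_len": max_tokens,
        "temperature": 0.5,
    }


def invoke_imported_model(model_arn: str, prompt: str) -> str:
    # The bedrock-runtime InvokeModel API serves imported custom models
    # through the same on-demand interface as first-party models.
    import boto3  # requires AWS credentials at call time

    client = boto3.client("bedrock-runtime")
    response = client.invoke_model(
        modelId=model_arn,  # placeholder: ARN of your imported model
        body=json.dumps(build_invoke_request(prompt)),
        contentType="application/json",
        accept="application/json",
    )
    # Assumption: the model returns its text under a "generation" key.
    return json.loads(response["body"].read())["generation"]
```

Because Bedrock is serverless, you pay per invocation rather than for an always-on SageMaker endpoint, which is the cost advantage the post explores.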