AWS Machine Learning Blog
Category: Developer Tools
Build flexible and scalable distributed training architectures using Kubeflow on AWS and HAQM SageMaker
In this post, we demonstrate how Kubeflow on AWS (an AWS-specific distribution of Kubeflow) used with AWS Deep Learning Containers and HAQM Elastic File System (HAQM EFS) simplifies collaboration and provides flexibility in training deep learning models at scale on both HAQM Elastic Kubernetes Service (HAQM EKS) and HAQM SageMaker utilizing a hybrid architecture approach. […]
Introducing HAQM CodeWhisperer, the ML-powered coding companion
We are excited to announce HAQM CodeWhisperer, a machine learning (ML)-powered service that helps improve developer productivity by providing code recommendations based on developers’ natural language comments and prior code. With CodeWhisperer, developers can simply write a comment that outlines a specific task in plain English, such as “upload a file to S3.” Based on this, […]
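As a purely illustrative sketch (not CodeWhisperer’s actual output), the kind of code such a comment might lead to could look like the following, using boto3; the bucket, key, and file names are hypothetical:

```python
import boto3

# upload a file to S3
def upload_file_to_s3(local_path: str, bucket: str, key: str) -> None:
    """Upload a local file to the given S3 bucket and key."""
    s3 = boto3.client("s3")
    s3.upload_file(local_path, bucket, key)

# Hypothetical usage; bucket and paths are placeholders.
upload_file_to_s3("report.csv", "my-example-bucket", "reports/report.csv")
```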
Secure AWS CodeArtifact access for isolated HAQM SageMaker notebook instances
AWS CodeArtifact allows developers to connect internal code repositories to upstream code repositories like PyPI, Maven, or npm. AWS CodeArtifact is a powerful addition to CI/CD workflows on AWS, but it is just as effective for codebases developed in Jupyter notebooks. This is a common development paradigm for machine learning developers who build and train […]
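As a hedged sketch of how a notebook might fetch a CodeArtifact authorization token and point pip at a private repository (the domain, owner account, and repository names below are placeholders, not values from the post):

```python
import boto3

# Placeholders for your CodeArtifact domain, owning account, and repository.
DOMAIN = "my-domain"
DOMAIN_OWNER = "111122223333"
REPOSITORY = "my-pypi-repo"

client = boto3.client("codeartifact")

# Fetch a short-lived authorization token for the domain.
token = client.get_authorization_token(
    domain=DOMAIN, domainOwner=DOMAIN_OWNER, durationSeconds=3600
)["authorizationToken"]

# Look up the repository's PyPI endpoint and build a pip index URL from it.
endpoint = client.get_repository_endpoint(
    domain=DOMAIN, domainOwner=DOMAIN_OWNER, repository=REPOSITORY, format="pypi"
)["repositoryEndpoint"]

index_url = endpoint.replace("https://", f"https://aws:{token}@") + "simple/"
print(f"pip install --index-url {index_url} <package>")
```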
Improve your data science workflow with a multi-branch training MLOps pipeline using AWS
In this post, you will learn how to create a multi-branch training MLOps continuous integration and continuous delivery (CI/CD) pipeline using AWS CodePipeline and AWS CodeCommit, in addition to Jenkins and GitHub. I discuss the concept of experiment branches, where data scientists can work in parallel and eventually merge their experiment back into the main […]
Create a cross-account machine learning training and deployment environment with AWS CodePipeline
A continuous integration and continuous delivery (CI/CD) pipeline helps you automate steps in your machine learning (ML) applications such as data ingestion, data preparation, feature engineering, model training, and model deployment. A pipeline across multiple AWS accounts improves security, agility, and resilience because an AWS account provides a natural security and access boundary for your […]
How Intel Olympic Technology Group built a smart coaching SaaS application by deploying pose estimation models – Part 1
February 9, 2024: HAQM Kinesis Data Firehose has been renamed to HAQM Data Firehose. Read the AWS What’s New post to learn more. The Intel Olympic Technology Group (OTG), a division within Intel focused on bringing cutting-edge technology to Olympic athletes, collaborated with AWS Machine Learning Professional Services (MLPS) to build a smart coaching software […]
Build a CI/CD pipeline for deploying custom machine learning models using AWS services
HAQM SageMaker is a fully managed service that provides every developer and data scientist with the ability to build, train, and deploy machine learning (ML) models quickly. SageMaker removes the heavy lifting from each step of the ML process to make it easier to develop high-quality ML artifacts. AWS Serverless Application Model (AWS SAM) is […]
Private package installation in HAQM SageMaker running in internet-free mode
HAQM SageMaker Studio notebooks and HAQM SageMaker notebook instances are internet-enabled by default. However, many regulated industries, such as financial services, healthcare, and telecommunications, require that network traffic traverse their own HAQM Virtual Private Cloud (HAQM VPC) to restrict and control which traffic can go through the public internet. Although you can disable direct internet […]
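As a minimal sketch (not necessarily the approach taken in the post), a notebook instance can be created with direct internet access disabled so that all traffic flows through your VPC; the subnet, security group, and role ARN below are placeholders:

```python
import boto3

sagemaker = boto3.client("sagemaker")

# Create a notebook instance that routes all traffic through your VPC
# (no direct internet access). All identifiers below are placeholders.
sagemaker.create_notebook_instance(
    NotebookInstanceName="private-notebook",
    InstanceType="ml.t3.medium",
    RoleArn="arn:aws:iam::111122223333:role/MySageMakerRole",
    SubnetId="subnet-0123456789abcdef0",
    SecurityGroupIds=["sg-0123456789abcdef0"],
    DirectInternetAccess="Disabled",
)
```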
Training and serving H2O models using HAQM SageMaker
Model training and serving steps are two essential pieces of a successful end-to-end machine learning (ML) pipeline. These two steps often require different software and hardware setups to provide the best mix for a production environment. Model training is optimized for low cost, a feasible total run duration, scientific flexibility, and model interpretability, whereas model […]
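To illustrate that point generically (this is not the post’s H2O-specific setup), the SageMaker Python SDK lets you choose different hardware for training and for hosting; the container image URI, role ARN, and S3 paths are placeholders:

```python
import sagemaker
from sagemaker.estimator import Estimator

session = sagemaker.Session()
role = "arn:aws:iam::111122223333:role/MySageMakerRole"  # placeholder role ARN

# Train on a larger, compute-optimized instance using a (placeholder) custom container.
estimator = Estimator(
    image_uri="111122223333.dkr.ecr.us-east-1.amazonaws.com/my-training-image:latest",
    role=role,
    instance_count=1,
    instance_type="ml.m5.2xlarge",
    output_path="s3://my-example-bucket/model-artifacts/",
    sagemaker_session=session,
)
estimator.fit({"train": "s3://my-example-bucket/train/"})

# Serve on a smaller, cheaper instance type than the one used for training.
predictor = estimator.deploy(initial_instance_count=1, instance_type="ml.c5.large")
```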
Using the HAQM SageMaker Studio Image Build CLI to build container images from your Studio notebooks
The new HAQM SageMaker Studio Image Build convenience package allows data scientists and developers to easily build custom container images from their Studio notebooks via a new CLI. The CLI eliminates the need to manually set up and connect to Docker build environments for building container images in HAQM SageMaker Studio. HAQM SageMaker Studio […]
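A minimal sketch of the workflow from a Studio notebook cell, assuming a Dockerfile in the current directory and the required IAM permissions; the repository name and tag are illustrative:

```python
# Run from an HAQM SageMaker Studio notebook cell (IPython "!" shell syntax).
# Install the convenience package that provides the sm-docker CLI.
!pip install sagemaker-studio-image-build

# Build the Dockerfile in the current directory with AWS CodeBuild and push
# the image to HAQM ECR; the repository name and tag are placeholders.
!sm-docker build . --repository my-custom-image:latest
```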