Posted On: Oct 4, 2023
Amazon SageMaker Model Registry now allows you to register machine learning (ML) models that are stored in private Docker repositories. This capability enables you to track all your ML models across multiple private AWS and non-AWS model repositories in one central service, simplifying ML operations (MLOps) and ML governance at scale.
Amazon SageMaker Model Registry is a purpose-built metadata store for managing the entire lifecycle of ML models, from training to inference. Whether you store your model artifacts (model framework files, container image) in AWS (Amazon ECR) or outside of AWS in any third-party Docker repository, you can now track them all in Amazon SageMaker Model Registry. You also have the flexibility to register a model without read/write permissions to the associated container image. To track an ML model in a private repository, set the optional ‘SkipModelValidation’ parameter to ‘All’ at the time of registration. You can later deploy these models for inference in Amazon SageMaker; for details on deploying models from private repositories, refer to our developer guide. A minimal registration sketch follows below.
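The sketch below, using the AWS SDK for Python (boto3), shows how such a registration might look with the CreateModelPackage API and SkipModelValidation set to 'All'. The registry URL, model package group name, S3 path, and region are hypothetical placeholders, not values from this announcement.

```python
# Sketch: register a model whose container image lives in a private, non-AWS
# Docker registry. Placeholders: registry URL, group name, S3 path, region.
import boto3

sm_client = boto3.client("sagemaker", region_name="us-east-1")

response = sm_client.create_model_package(
    ModelPackageGroupName="my-model-group",  # assumed to already exist
    ModelPackageDescription="Model stored in a private Docker repository",
    InferenceSpecification={
        "Containers": [
            {
                # Image hosted outside Amazon ECR (hypothetical private registry)
                "Image": "registry.example.com/team/my-model:1.0",
                "ModelDataUrl": "s3://my-bucket/model-artifacts/model.tar.gz",
            }
        ],
        "SupportedContentTypes": ["application/json"],
        "SupportedResponseMIMETypes": ["application/json"],
    },
    ModelApprovalStatus="PendingManualApproval",
    # Skip image validation since SageMaker may not have read access to the registry
    SkipModelValidation="All",
)
print(response["ModelPackageArn"])
```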
Amazon SageMaker Model Registry is available in all AWS Regions, except the AWS GovCloud (US) Regions. To get started, register your private ML models via the Amazon SageMaker Studio UI or the Amazon SageMaker Python SDK. Visit the Amazon SageMaker developer guide for additional information.
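For the SDK path, a minimal sketch might look like the following, assuming the SageMaker Python SDK's Model.register() exposes a skip_model_validation option that passes through to the CreateModelPackage API's SkipModelValidation parameter. The image URI, S3 path, IAM role, and group name are hypothetical placeholders.

```python
# Sketch: register a private-registry model via the SageMaker Python SDK.
# All identifiers below are placeholders; skip_model_validation is assumed
# to map to the underlying SkipModelValidation API parameter.
import sagemaker
from sagemaker.model import Model

session = sagemaker.Session()

model = Model(
    image_uri="registry.example.com/team/my-model:1.0",  # private, non-ECR image
    model_data="s3://my-bucket/model-artifacts/model.tar.gz",
    role="arn:aws:iam::123456789012:role/SageMakerExecutionRole",
    sagemaker_session=session,
)

model_package = model.register(
    model_package_group_name="my-model-group",
    content_types=["application/json"],
    response_types=["application/json"],
    inference_instances=["ml.m5.xlarge"],
    transform_instances=["ml.m5.xlarge"],
    approval_status="PendingManualApproval",
    skip_model_validation="All",  # assumed SDK pass-through of SkipModelValidation
)
```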