AWS Open Source Blog

HAQM Managed Workflows for Apache Airflow unaffected by Airflow 1.10.12 vulnerability

HAQM Managed Workflows for Apache Airflow (MWAA) is not affected by the recently announced vulnerability in Apache Airflow 1.10.12. The default airflow.cfg file uses a temporary key that is the same for all installations. In Airflow 1.10.12 and earlier, there was no restriction on using that temporary key on the Airflow web server, meaning that a cookie signed with the default key on one Airflow web server could be used to access any other web server that still used that key. This was mitigated in Airflow 1.10.14 by preventing the Airflow web server from launching if it is configured with the default temporary key.

HAQM MWAA uses a randomly generated key for every Airflow environment. This means an attacker can't use this exploit to gain access to an HAQM MWAA web server instance. As a result, HAQM MWAA can continue to offer Airflow 1.10.12 safely.

More HAQM MWAA updates

Airflow 2.0 and backport providers

With the release of Airflow 2.0, the HAQM MWAA team is performing the testing needed to deliver it to customers. In the meantime, backport provider support is available, which allows you to write DAGs for Airflow 1.10.12 that will also work in Airflow 2.0.

For example, to run the HAQM EMR integration example from the HAQM MWAA documentation using the Airflow HAQM backport provider, you would replace:

from airflow.contrib.operators.emr_create_job_flow_operator import EmrCreateJobFlowOperator
from airflow.contrib.operators.emr_add_steps_operator import EmrAddStepsOperator
from airflow.contrib.sensors.emr_step_sensor import EmrStepSensor

with:

from airflow.providers.amazon.aws.operators.emr_create_job_flow import EmrCreateJobFlowOperator
from airflow.providers.amazon.aws.operators.emr_add_steps import EmrAddStepsOperator
from airflow.providers.amazon.aws.sensors.emr_step import EmrStepSensor

and add the following line to the HAQM MWAA requirements.txt file:

apache-airflow-backport-providers-amazon
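
To illustrate how those backport-provider imports fit into a working DAG, here is a minimal sketch. It is not the full HAQM MWAA documentation example; the cluster configuration, step definition, and connection IDs below are illustrative placeholders you would replace with your own values.

from airflow import DAG
from airflow.utils.dates import days_ago
from airflow.providers.amazon.aws.operators.emr_create_job_flow import EmrCreateJobFlowOperator
from airflow.providers.amazon.aws.operators.emr_add_steps import EmrAddStepsOperator
from airflow.providers.amazon.aws.sensors.emr_step import EmrStepSensor

# Placeholder cluster definition; adjust instance types, release label, and
# IAM roles to match your account.
JOB_FLOW_OVERRIDES = {
    "Name": "example-emr-job-flow",
    "ReleaseLabel": "emr-5.30.1",
    "Instances": {
        "InstanceGroups": [
            {
                "Name": "Primary node",
                "Market": "ON_DEMAND",
                "InstanceRole": "MASTER",
                "InstanceType": "m5.xlarge",
                "InstanceCount": 1,
            }
        ],
        "KeepJobFlowAliveWhenNoSteps": False,
        "TerminationProtected": False,
    },
    "JobFlowRole": "EMR_EC2_DefaultRole",
    "ServiceRole": "EMR_DefaultRole",
}

# Placeholder step that runs the SparkPi example on the cluster.
STEPS = [
    {
        "Name": "calculate_pi",
        "ActionOnFailure": "CONTINUE",
        "HadoopJarStep": {
            "Jar": "command-runner.jar",
            "Args": ["/usr/lib/spark/bin/run-example", "SparkPi", "10"],
        },
    }
]

with DAG(
    dag_id="emr_backport_provider_example",
    schedule_interval=None,
    start_date=days_ago(1),
) as dag:
    # Creates the EMR cluster and pushes its job flow ID to XCom.
    create_job_flow = EmrCreateJobFlowOperator(
        task_id="create_job_flow",
        job_flow_overrides=JOB_FLOW_OVERRIDES,
        aws_conn_id="aws_default",
        emr_conn_id="emr_default",
    )

    # Adds the step to the cluster created above.
    add_steps = EmrAddStepsOperator(
        task_id="add_steps",
        job_flow_id="{{ task_instance.xcom_pull(task_ids='create_job_flow', key='return_value') }}",
        steps=STEPS,
        aws_conn_id="aws_default",
    )

    # Waits for the step to finish before the DAG run completes.
    watch_step = EmrStepSensor(
        task_id="watch_step",
        job_flow_id="{{ task_instance.xcom_pull(task_ids='create_job_flow', key='return_value') }}",
        step_id="{{ task_instance.xcom_pull(task_ids='add_steps', key='return_value')[0] }}",
        aws_conn_id="aws_default",
    )

    create_job_flow >> add_steps >> watch_step

Because only the import paths change, the same DAG body works on Airflow 1.10.12 with the backport provider installed and on Airflow 2.0.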

Improved compatibility

The HAQM MWAA team has added additional runtime support for MySQL, SSH, and the packages that depend on them. For example, to support MySQL and SSH, you can add the following lines to the HAQM MWAA requirements.txt file:

apache-airflow[mysql]
paramiko
sshtunnel
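
As a rough sketch of what those requirements enable, the following DAG uses the stock MySqlOperator (which relies on the mysql extra) and the contrib SSHOperator (which relies on paramiko). The connection IDs my_mysql_conn and my_ssh_conn are placeholders for connections you would define in the Airflow UI.

from airflow import DAG
from airflow.utils.dates import days_ago
from airflow.operators.mysql_operator import MySqlOperator
from airflow.contrib.operators.ssh_operator import SSHOperator

with DAG(
    dag_id="mysql_ssh_example",
    schedule_interval=None,
    start_date=days_ago(1),
) as dag:
    # Runs a query against the MySQL connection; "my_mysql_conn" is a
    # placeholder connection ID.
    run_query = MySqlOperator(
        task_id="run_query",
        mysql_conn_id="my_mysql_conn",
        sql="SELECT 1",
    )

    # Runs a command on a remote host over SSH; "my_ssh_conn" is a
    # placeholder connection ID.
    run_remote_command = SSHOperator(
        task_id="run_remote_command",
        ssh_conn_id="my_ssh_conn",
        command="uptime",
    )

    run_query >> run_remote_command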

Container examples

The HAQM MWAA documentation team and AWS community are working on new examples and walkthroughs showing how to use the managed service with HAQM Elastic Kubernetes Service (HAQM EKS), HAQM Elastic Container Service (HAQM ECS), and more. Please visit the HAQM MWAA documentation for additional information.

John Jackson

John has over 20 years of software experience as a developer, systems architect, and product manager in both startups and large corporations. He is the AWS Principal Product Manager responsible for HAQM Managed Workflows for Apache Airflow (MWAA).