I want to create a DAG in Airflow that triggers an ECS task. This ECS task will then return a few parameters to Airflow, allowing the DAG to trigger the next dependent ECS task using those parameters. I’ve been looking for a way to pass these parameters between tasks, and from what I’ve understood, the best approach seems to be using XComs.
However, the EcsRunTaskOperator I’m using doesn’t appear to capture the task’s stdout output (please correct me if I’m wrong). By default, it only passes the ecs_task_arn to XCom. So, I’m wondering how I can achieve this.
I’ve come across an alternative solution where I could read the logs from CloudWatch, parse them, and then push the values to XCom. However, I’m wondering if there’s a better way to do this—something that lets me push values from ECS directly to XCom without involving other services (like S3, message queues, or CloudWatch).
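For context, the CloudWatch route I was considering would look roughly like this: fetch the task's log events with boto3 and parse the last message in a separate Python task, then push the result to XCom. This is only a sketch; the JSON-line output convention and the log group/stream names are assumptions about how the ECS task writes its results, not anything Airflow or ECS guarantees.

```python
import json


def parse_params_from_events(events):
    """Parse the last CloudWatch log event as JSON.

    `events` is shaped like the `events` list returned by boto3's
    `logs.get_log_events` (a list of dicts with a "message" key).
    Assumes the ECS task prints its output parameters as a single
    JSON line at the very end.
    """
    if not events:
        raise ValueError("no log events found")
    return json.loads(events[-1]["message"])


# Inside the DAG this would be fed from boto3, e.g. (not executed here,
# and the group/stream names are placeholders):
#   logs = boto3.client("logs", region_name="us-east-1")
#   resp = logs.get_log_events(logGroupName="...", logStreamName="...")
#   params = parse_params_from_events(resp["events"])

if __name__ == "__main__":
    sample = [
        {"timestamp": 1, "message": "starting job"},
        {"timestamp": 2, "message": '{"input_path": "s3://bucket/out", "batch_id": 42}'},
    ]
    print(parse_params_from_events(sample)["batch_id"])  # 42
```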
It would be very convenient if the operator could capture ECS stdout and pass the output to XCom, but it doesn’t seem feasible. Any ideas or suggestions?
Solution: An Airflow committer got back to me and clarified that the EcsRunTaskOperator actually captures the last log message and pushes it into XCom. However, this wasn’t mentioned in the documentation—they pointed me directly to the code instead.
I managed to solve my issue by setting `do_xcom_push=True` and configuring the parameters the operator needs to fetch the logs from CloudWatch: `awslogs_group`, `awslogs_stream_prefix`, and `awslogs_region`.
This allowed me to retrieve the last log message from ECS, which contained the arguments I needed to pass to the next job. With those arguments in XCom, I was able to trigger the next ECS job based on that information.
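For anyone hitting the same problem, the working setup looked roughly like this. All cluster, task-definition, log-group, and container names below are placeholders for my actual values, and the downstream task relies on `overrides` being a templated field so the `xcom_pull` expression is rendered at runtime:

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.amazon.aws.operators.ecs import EcsRunTaskOperator

with DAG(
    dag_id="ecs_chain",
    start_date=datetime(2024, 1, 1),
    schedule=None,
    catchup=False,
) as dag:
    first = EcsRunTaskOperator(
        task_id="first_ecs_task",
        cluster="my-cluster",                      # placeholder
        task_definition="my-task-def",             # placeholder
        launch_type="FARGATE",
        overrides={"containerOverrides": []},
        # These three make the operator tail CloudWatch; with
        # do_xcom_push=True the last log line the container prints
        # becomes the task's XCom value.
        awslogs_group="/ecs/my-task",              # placeholder
        awslogs_stream_prefix="ecs/my-container",  # placeholder
        awslogs_region="us-east-1",                # placeholder
        do_xcom_push=True,
    )

    second = EcsRunTaskOperator(
        task_id="second_ecs_task",
        cluster="my-cluster",
        task_definition="my-next-task-def",        # placeholder
        launch_type="FARGATE",
        overrides={
            "containerOverrides": [
                {
                    "name": "my-container",        # placeholder
                    # Pass the first task's last log line (the pushed
                    # XCom value) as a command-line argument.
                    "command": [
                        "--params",
                        "{{ ti.xcom_pull(task_ids='first_ecs_task') }}",
                    ],
                }
            ]
        },
    )

    first >> second
```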
**I created a PR in the Airflow repo, so the documentation should be updated soon as well:** https://airflow.apache.org/docs/apache-airflow-providers-amazon/stable/_api/airflow/providers/amazon/aws/operators/ecs/index.html#airflow.providers.amazon.aws.operators.ecs.EcsRunTaskOperator