🐛 Mapped task not triggered after mapped TaskGroup despite successful upstream
Summary
A mapped task (`@task` + `.expand(...)`) is not triggered after a mapped `@task_group.expand(...)`, even though every upstream task instance inside the group is marked `success`. This occurs when chaining two expansions in a "map -> map" pattern.

If I replace the second `.expand()` (on the `load_data` task) with a regular task call, the DAG runs fine — but with `.expand()`, the downstream task never runs and is marked `upstream_failed`.
🔁 Reproducible example
This DAG reproduces the issue using only standard Airflow decorators and task mapping.
```python
from datetime import datetime
import csv

from airflow.decorators import dag, task, task_group


@task
def extract_data():
    # Simulate a list of 2 "files"
    return ["doc1.pdf", "doc2.pdf"]


@task
def load_data(final_result: dict):
    print("✅ Received:", final_result)
    with open("/tmp/debug_results.csv", "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["filename", "meta"])
        writer.writerow(final_result)


@task_group
def transform_pipeline(input_path: str):
    @task
    def extract_meta(path: str):
        return {"meta": "dummy_metadata", "filename": path}

    return extract_meta(input_path)


@dag(start_date=datetime(2024, 1, 1), schedule_interval=None, catchup=False)
def mapped_taskgroup_bug():
    input_paths = extract_data()

    # The TaskGroup is mapped here
    transformed_results = transform_pipeline.expand(input_path=input_paths)

    # ❌ This expand fails to trigger (upstream_failed)
    load_data.expand(final_result=transformed_results)

    # ✅ If instead I insert a pass-through task, everything works:
    # @task
    # def debug(results): return results
    # load_data.expand(final_result=debug(transformed_results))


dag_instance = mapped_taskgroup_bug()
```
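As an aside on the `load_data` body: each mapped instance appends one CSV row via `csv.DictWriter`. A minimal pure-Python sketch of that same call, using an in-memory buffer instead of `/tmp/debug_results.csv`:

```python
import csv
import io

# Sample payloads shaped like extract_meta's return value
rows = [
    {"meta": "dummy_metadata", "filename": "doc1.pdf"},
    {"meta": "dummy_metadata", "filename": "doc2.pdf"},
]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["filename", "meta"])
for row in rows:
    # Same writerow call each mapped load_data instance makes with its own dict
    writer.writerow(row)

print(buf.getvalue())
```

Note that `DictWriter` reorders each dict to match `fieldnames`, so the key order in the payload does not matter.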
🧭 Observed behavior
- `transform_pipeline.expand(...)` runs properly and creates 2 mapped task groups.
- Each inner task instance (e.g. `transform_pipeline.extract_meta[0]` and `[1]`) finishes with `success`.
- But `load_data.expand(...)` is never scheduled; it is marked `upstream_failed`.
- There are no error logs for `load_data`, because it is never triggered.
- Replacing `.expand()` with a non-mapped task fixes it.
✅ Expected behavior
Since all upstream mapped instances succeed and return dictionaries, the mapped `load_data` task should:

- be expanded into 2 instances, and
- be executed with the corresponding input dictionaries.
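For intuition, the expected "map -> map" semantics amount to two chained list maps. A plain-Python analogy (no Airflow; function bodies copied from the DAG above, with `load_data` reduced to a stand-in that just echoes its input):

```python
def extract_data():
    return ["doc1.pdf", "doc2.pdf"]

def extract_meta(path):
    return {"meta": "dummy_metadata", "filename": path}

def load_data(final_result):
    # Stand-in for one mapped load_data task instance
    return "received " + final_result["filename"]

# transform_pipeline.expand(input_path=...): one extract_meta call per input path
transformed = [extract_meta(p) for p in extract_data()]

# load_data.expand(final_result=...): one call per upstream mapped result
loaded = [load_data(r) for r in transformed]
print(loaded)  # → ['received doc1.pdf', 'received doc2.pdf']
```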