Description

`catchup` and backfill provide good flexibility to run a templated query such as:

`SELECT * FROM table WHERE created_at BETWEEN {{ prev_data_interval_start_success }} AND {{ ts }}`

Use case/motivation

In case of an interruption at the scheduler level, such as a 1-day outage, `airflow backfill DAG --start-date=today --end-date=prev_week` would create 672 separate DAG runs. A simple feature that runs only 1 instance of the DAG covering the whole missed interval would avoid that.
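To illustrate the request, here is a hedged sketch of how the templated query behaves once the interval boundaries are substituted. This is a plain-Python stand-in for Airflow's Jinja rendering; the timestamps and the `render_query` helper are made up for illustration, not Airflow APIs.

```python
# Illustrative stand-in for Airflow's Jinja templating: shows how
# {{ prev_data_interval_start_success }} and {{ ts }} bound the query.
# Timestamps below are made-up examples.

QUERY_TEMPLATE = (
    "SELECT * FROM table "
    "WHERE created_at BETWEEN '{start}' AND '{end}'"
)

def render_query(start: str, end: str) -> str:
    """Stand-in for the per-run template rendering Airflow performs."""
    return QUERY_TEMPLATE.format(start=start, end=end)

# One run per data interval: an hourly run covers one hour...
hourly = render_query("2024-01-01T00:00:00", "2024-01-01T01:00:00")

# ...while the requested single "merged" catch-up run would cover the
# whole missed period with one query instead of hundreds of runs:
merged = render_query("2024-01-01T00:00:00", "2024-01-08T00:00:00")
```

The point of the request is that the `merged` form is just the same query with wider bounds, so one run could, in principle, replace many small ones.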
Replies: 1 comment
This is likely not going to happen. Airflow does not really understand what each task inside a DAG does, and what you are asking for is for Airflow to understand that instead of 96 or 672 separate runs of a DAG it can run only one covering a bigger period. This is impossible in the general case, even if it seems simple for this particular example.

This should be done in a different way. If you want to do something similar, it is not Airflow that should understand that the work can be done as a "batch of DagRuns"; it is the DAG author who should understand that and act appropriately.

In this case a much better approach will likely be to write a separate DAG doing this (with daily or weekly schedules, as per your example). If you do not want to repeat too much of the DAG code, you can already create multiple DAGs from a single DAG Python file following Dynamic DAG Generation. Then you can re-run such DAGs with either a daily or weekly data interval as you see fit. I don't think we will ever even approach implementing something like that at the Airflow level.
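The Dynamic DAG Generation pattern referred to above can be sketched roughly as follows. Note the hedges: `DAG` here is a minimal stand-in dataclass so the sketch runs without Airflow installed (a real DAG file would import `airflow.DAG`), and the DAG ids, schedules, and `make_backfill_dag` factory are illustrative assumptions, not names from this discussion.

```python
from dataclasses import dataclass

@dataclass
class DAG:
    """Minimal stand-in for airflow.DAG (assumption: real code imports it)."""
    dag_id: str
    schedule: str

def make_backfill_dag(schedule: str) -> DAG:
    """Build one DAG per schedule. A real version would also attach the
    SELECT ... BETWEEN task using the interval template variables."""
    suffix = schedule.lstrip("@")  # "@daily" -> "daily"
    return DAG(dag_id=f"events_backfill_{suffix}", schedule=schedule)

# Airflow discovers DAGs by scanning a module's top-level namespace,
# so one Python file can register several generated DAGs:
for sched in ("@daily", "@weekly"):
    dag = make_backfill_dag(sched)
    globals()[dag.dag_id] = dag
```

With this pattern the DAG author, not the scheduler, decides the run granularity: to catch up after an outage you trigger or clear the weekly DAG once instead of re-running hundreds of small intervals.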