Optimize dynamic DAG updates to avoid loading large serialized DAGs (#57592)
When updating dynamic DAGs (those without task instances), Airflow loaded the
entire `SerializedDagModel` object from the database, which could contain
megabytes of JSON, just to update a few fields. Loading the full object was
completely unnecessary.
This change replaces the object-loading approach with a direct SQL UPDATE
statement, significantly improving performance for deployments with large
or frequently-changing dynamic DAGs.
The optimization uses SQLAlchemy's update() construct to modify only the
necessary columns (_data, _data_compressed, dag_hash) without fetching
the existing row, reducing both database load and network transfer.
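A minimal sketch of the pattern, using an in-memory SQLite database and an illustrative `serialized_dag` table (the column names `_data`, `_data_compressed`, and `dag_hash` come from the commit message; the table layout here is a simplified stand-in, not Airflow's exact schema):

```python
from sqlalchemy import (
    Column, LargeBinary, MetaData, String, Table, create_engine, update,
)

metadata = MetaData()
# Simplified stand-in for Airflow's serialized_dag table.
serialized_dag = Table(
    "serialized_dag",
    metadata,
    Column("dag_id", String, primary_key=True),
    Column("_data", String),                 # large serialized JSON payload
    Column("_data_compressed", LargeBinary, nullable=True),
    Column("dag_hash", String),
)

engine = create_engine("sqlite://")
metadata.create_all(engine)

with engine.begin() as conn:
    conn.execute(
        serialized_dag.insert().values(
            dag_id="example",
            _data='{"huge": "payload"}',
            _data_compressed=None,
            dag_hash="old-hash",
        )
    )
    # Direct UPDATE: no SELECT of the potentially multi-megabyte row;
    # only the changed columns cross the wire.
    conn.execute(
        update(serialized_dag)
        .where(serialized_dag.c.dag_id == "example")
        .values(
            _data='{"new": "dag"}',
            _data_compressed=None,
            dag_hash="new-hash",
        )
    )
    row = conn.execute(
        serialized_dag.select().where(serialized_dag.c.dag_id == "example")
    ).one()

print(row.dag_hash)
```

The contrast with the old approach is that an ORM load (`session.get(SerializedDagModel, ...)`) would have pulled the entire `_data` column into memory before any field could be changed.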
Additionally, removed an unnecessary session.merge() call on dag_version,
as the object is already tracked by the session after being loaded.
(cherry picked from commit 27c9b94)