BeamRunPythonPipelineOperator doesn't work with Google Application Default Credentials ADC #42396
Hi @fpopic, as you checked "Yes I am willing to submit a PR!", I'll assign this to you. But please let us know if you are no longer interested in it. Thanks!
I removed the checkmark; I would appreciate help.
Adding a minimal change to the latest setup to reproduce. In x-airflow-common:
  &airflow-common
  # ...
  build: .
  # image: ...
  environment:
    &airflow-common-env
    # ...
    AIRFLOW__LOGGING__LOGGING_LEVEL: 'WARNING'
    AIRFLOW_CONN_GOOGLE_CLOUD_DEFAULT: '{ "conn_type": "google_cloud_platform", "extra": { "scope": "https://www.googleapis.com/auth/cloud-platform,https://www.googleapis.com/auth/drive,https://www.googleapis.com/auth/bigquery", "num_retries": 0 } }'
    GOOGLE_CLOUD_PROJECT: <my-project>
  volumes: &airflow-common-volumes
    # ...
    # Use ADC
    - ~/.config/gcloud/:/home/airflow/.config/gcloud:rw
  # ...

In the Dockerfile, install the Google Cloud SDK:

# specifying explicit python 3.9, otherwise the gcloud sdk installation complains
FROM apache/airflow:2.10.2-python3.9
USER root
## COPY FROM https://github.com/apache/airflow/blob/main/docs/docker-stack/docker-images-recipes/gcloud.Dockerfile
ARG CLOUD_SDK_VERSION=322.0.0
ENV GCLOUD_HOME=/home/airflow/google-cloud-sdk
ENV PATH="${GCLOUD_HOME}/bin/:${PATH}"
USER airflow
RUN DOWNLOAD_URL="https://dl.google.com/dl/cloudsdk/channels/rapid/downloads/google-cloud-sdk-${CLOUD_SDK_VERSION}-linux-x86_64.tar.gz" \
&& TMP_DIR="$(mktemp -d)" \
&& curl -fL "${DOWNLOAD_URL}" --output "${TMP_DIR}/google-cloud-sdk.tar.gz" \
&& mkdir -p "${GCLOUD_HOME}" \
&& tar xzf "${TMP_DIR}/google-cloud-sdk.tar.gz" -C "${GCLOUD_HOME}" --strip-components=1 \
&& "${GCLOUD_HOME}/install.sh" \
--bash-completion=false \
--path-update=false \
--usage-reporting=false \
--additional-components alpha beta kubectl \
--quiet \
&& rm -rf "${TMP_DIR}" \
&& rm -rf "${GCLOUD_HOME}/.install/.backup/" \
&& gcloud --version
# END OF COPY
# pip install airflow has to be run with USER airflow
RUN pip install 'apache-airflow==2.10.2' \
apache-airflow[google_auth] \
apache-airflow-providers-google \
apache-airflow[google] \
apache-airflow-providers-apache-beam
#apache-airflow-providers-hashicorp \
RUN mkdir -p /home/airflow/.config/gcloud/
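To double-check that the mounted gcloud configuration is actually picked up inside the container, a quick ADC probe can be run in the worker. This is only a sketch; it assumes nothing beyond the google-auth library that the Google provider already depends on, and the service name in the comment is just an example:

```python
# Run inside the Airflow container (for example via `docker compose exec <service> python`)
# to verify that the mounted ~/.config/gcloud directory resolves as Application Default Credentials.
import google.auth

credentials, project = google.auth.default(
    scopes=["https://www.googleapis.com/auth/cloud-platform"]
)
# With application_default_credentials.json mounted, this prints the credentials class
# (user credentials created by `gcloud auth application-default login`) and the active project;
# without it, google.auth raises DefaultCredentialsError instead.
print(type(credentials).__name__, project)
```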
Apache Airflow version
Other Airflow 2 version (please specify below)
If "Other Airflow 2 version" selected, which one?
2.6.3 (the problem occurs in the latest version as well; I will try to download the latest and post the log as well)
What happened?
When manually submitting an Apache Beam Python job to the Google Dataflow runner using BeamRunPythonPipelineOperator, the relevant pieces of configuration here are:
- BeamRunPythonPipelineOperator.pipeline_options.service_account_name
- GOOGLE_APPLICATION_CREDENTIALS
- GCP_PROJECT
- gcloud auth application-default login --disable-quota-project
- gcloud auth login
- gcloud config set project <project>
- the google_cloud_default connection
- BeamRunPythonPipelineOperator.DataflowConfiguration.project_id

The task gets stuck in apitools calls that use oauth2client and try to initiate a browser-based Google sign-in, which must fail. I don't understand why the authentication flow ends up in that execution branch, since credentials of type authorized_user exist in the well-known path ~/.config/gcloud/application_default_credentials.json.
What you think should happen instead?
Airflow should submit the job to Dataflow using Application Default Credentials, the same way standalone Apache Beam Python (without Airflow) submits the job to Dataflow.
I see that Apache Beam already solved that problem in apache/beam#15004, hence running Apache Beam Python without Airflow using ADC works.
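For comparison, a standalone Dataflow submission that relies purely on ADC looks roughly like the sketch below; the project, region, and bucket values are placeholders rather than the ones from this report:

```python
# Standalone Apache Beam job submitted to Dataflow with nothing but Application Default
# Credentials: no GOOGLE_APPLICATION_CREDENTIALS, no explicit service account key,
# only `gcloud auth application-default login` run beforehand.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(
    runner="DataflowRunner",
    project="my-project",                 # placeholder project id
    region="europe-west1",                # placeholder region
    temp_location="gs://my-bucket/tmp",   # placeholder staging bucket
)

# A trivial pipeline; submitting it exercises the same Dataflow auth path.
with beam.Pipeline(options=options) as pipeline:
    _ = pipeline | beam.Create(["hello", "adc"]) | beam.Map(print)
```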
How to reproduce
Prepare the environment variables and execute the following DAG, together with the Apache Beam job source code in job_example_beam_dataflow_python_adc.py; a sketch of such a DAG is shown below.
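A minimal sketch of the kind of DAG that exercises this path could look like the following; the DAG id, file path, project, region, and bucket are placeholders, not the values from this report:

```python
import datetime

from airflow import DAG
from airflow.providers.apache.beam.operators.beam import BeamRunPythonPipelineOperator
from airflow.providers.google.cloud.operators.dataflow import DataflowConfiguration

with DAG(
    dag_id="example_beam_dataflow_python_adc",  # hypothetical DAG id
    start_date=datetime.datetime(2024, 1, 1),
    schedule=None,
    catchup=False,
):
    # Submit the Beam job to Dataflow; no service account key or
    # GOOGLE_APPLICATION_CREDENTIALS is configured, only ADC mounted into the container.
    submit_beam_job = BeamRunPythonPipelineOperator(
        task_id="submit_beam_job",
        runner="DataflowRunner",
        py_file="/opt/airflow/dags/job_example_beam_dataflow_python_adc.py",  # placeholder path
        pipeline_options={"temp_location": "gs://my-bucket/tmp"},  # placeholder bucket
        dataflow_config=DataflowConfiguration(
            project_id="my-project",   # placeholder project id
            location="europe-west1",   # placeholder region
            job_name="example-beam-adc",
        ),
    )
```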
Operating System
macOS 14.7
Versions of Apache Airflow Providers
apache-airflow-providers-apache-beam==5.3.0
Deployment
Docker Airflow / Composer Airflow (it doesn't matter; the problem occurs in the latest version as well).
Deployment details
Anything else?
No response
Are you willing to submit PR?
Code of Conduct