Description
Apache Airflow version
2.11.0
If "Other Airflow 2 version" selected, which one?
No response
What happened?
Hi team,
This is related to the Google Cloud Run operator:
```python
from airflow.providers.google.cloud.operators.cloud_run import (
    CloudRunExecuteJobOperator,
)
```
When passing a number of retries to that operator, I expect the task to retry that number of times. However, if the "Number of retries per failed task" parameter is set in the job's definition, the task will retry that many times inside Cloud Run and report it as a single try in Airflow.
What you think should happen instead?
I think passing `retries` to the `CloudRunExecuteJobOperator` should override the "Number of retries per failed task" parameter in the job definition, probably by setting it to 0, so that all retries are coordinated by Airflow.
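As a sketch of that proposal, the operator could zero out the per-task retry count in the job's definition before executing it, so Airflow owns all retries. The helper below is hypothetical (it is not part of the provider); it assumes the Cloud Run Admin API v2 `Job` resource shape, where `template.template.max_retries` holds the "Number of retries per failed task" value.

```python
import copy


def disable_job_retries(job_body: dict) -> dict:
    """Return a copy of a Cloud Run v2 Job body with per-task retries
    set to 0, so failures surface immediately and retries are
    coordinated solely by Airflow.

    Assumed v2 resource layout: Job.template (ExecutionTemplate)
    -> .template (TaskTemplate) -> .max_retries.
    """
    patched = copy.deepcopy(job_body)
    task_template = patched.setdefault("template", {}).setdefault("template", {})
    task_template["max_retries"] = 0
    return patched


# Example: a job configured with 3 Cloud Run retries per failed task.
job = {"template": {"template": {"max_retries": 3, "containers": [{"image": "img"}]}}}
print(disable_job_retries(job)["template"]["template"]["max_retries"])  # 0
```

The original job body is left untouched, so the override only applies to the execution the operator triggers.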
How to reproduce
Create a new Cloud Run job with the "Number of retries per failed task" parameter set (e.g. to 3), and make sure the code in the job will fail. Then execute it with:
```python
CloudRunExecuteJobOperator(
    task_id="example",
    project_id="example",
    region="europe-north1",
    job_name="example_job_name",
    overrides={
        "container_overrides": [
            {
                "args": container_args,
            }
        ]
    },
    dag=dag,
    deferrable=True,
    retries=1,
)
```
The task will retry 3 times inside Cloud Run, report only a single failure to Airflow, get triggered again by Airflow, and retry 3 times again.
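For illustration, when each Airflow try launches a full Cloud Run execution that carries its own internal retries, the number of container attempts multiplies. A small sketch of that arithmetic, assuming Cloud Run attempts a task up to `max_retries + 1` times per execution:

```python
def total_container_attempts(airflow_retries: int, job_max_retries: int) -> int:
    """Total container runs for a persistently failing task:
    Airflow launches the job (airflow_retries + 1) times, and each
    execution internally attempts the task (job_max_retries + 1) times.
    """
    return (airflow_retries + 1) * (job_max_retries + 1)


# retries=1 on the operator with "Number of retries per failed task" = 3:
print(total_container_attempts(1, 3))  # 8 container runs, seen as 2 tries in Airflow
```

With the job-level parameter forced to 0, the count would collapse to `airflow_retries + 1`, matching what the operator's `retries` argument advertises.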
Operating System
Linux
Versions of Apache Airflow Providers
No response
Deployment
Google Cloud Composer
Deployment details
No response
Anything else?
No response
Are you willing to submit PR?
- Yes I am willing to submit a PR!
Code of Conduct
- I agree to follow this project's Code of Conduct