I am getting the same error with the worker on Airflow 3.0.0 with Celery and Redis. It was working fine before the 2.11.0 → 3.0.0 upgrade.
Apache Airflow version
Helm chart version: 1.14.0
What happened
I have set up DAGs with git-sync, disabled persistence, and enabled remote logging. I'm using an external Postgres and S3.
values.yaml:
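(The original values.yaml did not load. Below is a minimal sketch of the kind of configuration described above — git-sync enabled, persistence disabled, remote logging to S3, external Postgres. All hostnames, repo URLs, bucket names, and credentials are placeholders, not the poster's actual values.)

```yaml
executor: CeleryExecutor

dags:
  persistence:
    enabled: false          # persistence disabled, as described
  gitSync:
    enabled: true
    repo: https://example.com/our-dags.git   # placeholder repo
    branch: main

logs:
  persistence:
    enabled: false

config:
  logging:
    remote_logging: "True"
    remote_base_log_folder: s3://example-airflow-logs   # placeholder bucket
    remote_log_conn_id: aws_default                     # assumed connection id

data:
  metadataConnection:       # external Postgres, placeholder values
    user: airflow
    pass: changeme
    host: postgres.example.internal
    port: 5432
    db: airflow
```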
The S3 connection in the UI:
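(The screenshot of the connection did not load. For illustration only, the same kind of AWS connection can also be supplied to the workers as an environment variable in Airflow's connection-URI form; the key, secret, and region below are placeholders, and the connection id `aws_default` is an assumption.)

```
AIRFLOW_CONN_AWS_DEFAULT='aws://AKIAEXAMPLEKEY:examplesecret@/?region_name=eu-west-1'
```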

Now I randomly get:
In the worker pods in Kubernetes I can see that the correct user is used:
There are also no permission issues as far as I can tell, because a simple hello_world DAG does work and creates its files on S3.
Worker logs for the DAG with S3 issues:
I only see the issue when running one specific DAG from our older Airflow instance, and it correlates with the following logs in the worker pod:
Any idea why this may be happening?