write logs to NON-AWS S3 object storage #13961
-
My organization has an internally hosted S3 object storage that is compatible with AWS. I can access it with the botocore libraries outside of Airflow, but I am unable to configure Airflow to write logs to it. The Airflow S3Hook appears to hardcode the endpoint URL to the Amazon location and I cannot seem to override it, so it looks for my S3 bucket out in AWS instead of inside my corporate network. Has anyone else tried to write logs to S3 storage that was NOT AWS?
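
For reference, my remote-logging configuration looks roughly like this; the connection ID and bucket below are placeholders:

```ini
# airflow.cfg -- a sketch of the remote-logging settings in question
# (the section is [logging] on Airflow 2.x; older releases used [core])
# "s3_logs" and the bucket path are placeholders
[logging]
remote_logging = True
remote_log_conn_id = s3_logs
remote_base_log_folder = s3://my-log-bucket/logs
```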
Replies: 1 comment
-
It's not well documented, but it is possible if you put a `host` extra in the connection (see https://airflow.apache.org/docs/apache-airflow-providers-amazon/stable/connections/aws.html):

{ "host": "https://my-s3-alike.internal/" }

If that works, a Doc PR to make this more obvious would be great!
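
Concretely, something along these lines should do it. This is only a sketch: the connection ID `s3_logs` and the endpoint URL are placeholders, and the connection ID has to match your `remote_log_conn_id`:

```bash
# Sketch: create an AWS connection whose "host" extra points at the internal
# S3-compatible endpoint, so remote logging writes there instead of AWS.
airflow connections add s3_logs \
    --conn-type aws \
    --conn-extra '{"host": "https://my-s3-alike.internal/"}'
```

A quick sanity check that the custom endpoint is actually picked up (again a sketch, assuming the apache-airflow-providers-amazon package is installed):

```python
# Sketch: confirm the S3Hook resolves the custom endpoint rather than AWS.
from airflow.providers.amazon.aws.hooks.s3 import S3Hook

hook = S3Hook(aws_conn_id="s3_logs")      # placeholder connection ID
client = hook.get_conn()                  # boto3 S3 client built from the connection
print(client.meta.endpoint_url)           # should print the internal URL
```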