Which metadata database objects can I delete in Airflow 2.8.2?
I have Airflow 2.8.2 installed with Docker.
I need to clean the Postgres database used by Airflow to free up space on the hard drive.
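For reference, the bulk of the space usually sits in the run-history tables (task_instance, dag_run, log, xcom, job, rendered_task_instance_fields), and Airflow 2.3+ ships an `airflow db clean` command that archives and purges rows older than a cutoff. Below is a minimal sketch of a maintenance DAG that wraps that command; the DAG id, weekly schedule, 30-day retention window, and flag choices are assumptions and should be checked against `airflow db clean --help` for your version.

```python
# Sketch of a periodic metadata-cleanup DAG; names and retention are assumptions.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="db_cleanup",                 # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule_interval="@weekly",
    catchup=False,
) as dag:
    BashOperator(
        task_id="clean_metadata_db",
        # Deletes rows older than the cutoff from the high-volume metadata
        # tables (task_instance, dag_run, log, xcom, job, ...).
        bash_command=(
            "airflow db clean "
            "--clean-before-timestamp '{{ macros.ds_add(ds, -30) }}' "
            "--skip-archive --yes"
        ),
    )
```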
Airflow | Task ends with SKIPPED status if it starts outside the schedule_interval window
In my DAG, the schedule_interval parameter is a cron expression: run once per hour from 5-12 and from 13-18, with one hour (12-13) outside the interval.
If I manually start a task (Clear task) outside this interval, it does not run and is immediately set to SKIPPED. And if a task started at 11:30 does not finish before 12 o'clock, it also ends with the SKIPPED status.
How can this be fixed?
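For reference, a minimal sketch of the kind of schedule described above; the exact hour ranges in the cron string, the DAG id, and the task body are assumptions based on the description.

```python
# Sketch of an hourly schedule with a midday gap; details are assumptions.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="hourly_with_midday_gap",      # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    # Run at the top of every hour from 05:00-11:00 and 13:00-17:00,
    # leaving the 12:00-13:00 hour outside the schedule.
    schedule_interval="0 5-11,13-17 * * *",
    catchup=False,
) as dag:
    BashOperator(task_id="hourly_job", bash_command="echo run")
```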
How do I attach the contents of a failed Airflow task log to the failure notification email message?
I have an Airflow DAG that runs a BashOperator task. When it fails, I get an email with not much detail:
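One common pattern (not necessarily the asker's setup) is an on_failure_callback that reads the local task log and mails it with airflow.utils.email.send_email. In the sketch below, the recipient address is hypothetical and the log-path layout assumes the default file task handler with local log storage, so adjust it for your logging backend and Airflow version.

```python
# Sketch of a failure callback that emails the task log; path layout and
# recipient are assumptions.
from pathlib import Path

from airflow.configuration import conf
from airflow.utils.email import send_email


def email_log_on_failure(context):
    ti = context["task_instance"]
    base_log_folder = conf.get("logging", "base_log_folder")
    # Default layout: <base>/dag_id=<dag>/run_id=<run>/task_id=<task>/attempt=<n>.log
    # (attempt-number handling may differ slightly across Airflow versions).
    log_path = (
        Path(base_log_folder)
        / f"dag_id={ti.dag_id}"
        / f"run_id={ti.run_id}"
        / f"task_id={ti.task_id}"
        / f"attempt={ti.try_number}.log"
    )
    log_text = log_path.read_text() if log_path.exists() else f"Log not found at {log_path}"
    send_email(
        to=["alerts@example.com"],  # hypothetical recipient
        subject=f"Airflow task failed: {ti.dag_id}.{ti.task_id}",
        html_content=f"<pre>{log_text}</pre>",
    )
```

The callback would then be attached via `on_failure_callback=email_log_on_failure` in the operator or in `default_args`.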
Apache Airflow built via docker-compose: Web UI still presenting UTC
I am using docker-compose to build Airflow.
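For context, the clock shown in the Web UI follows the webserver's default_ui_timezone / core default_timezone settings, which in a docker-compose setup are typically set through the AIRFLOW__WEBSERVER__DEFAULT_UI_TIMEZONE and AIRFLOW__CORE__DEFAULT_TIMEZONE environment variables, while individual DAGs can carry their own timezone via a timezone-aware start_date. A minimal sketch of the latter, with an assumed timezone and DAG id:

```python
# Sketch of a timezone-aware DAG; the timezone and DAG id are assumptions.
import pendulum

from airflow import DAG
from airflow.operators.bash import BashOperator

local_tz = pendulum.timezone("Europe/London")

with DAG(
    dag_id="tz_aware_dag",                               # hypothetical DAG id
    start_date=pendulum.datetime(2024, 1, 1, tz=local_tz),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    BashOperator(task_id="show_time", bash_command="date")
```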
Dynamically create n tasks in Airflow from params
I want to dynamically create n tasks, where n is defined by params, so I can set it when triggering the DAG in the UI:
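Since params are only known at run time, one way to get this shape is dynamic task mapping (Airflow 2.3+), where a setup task reads the param and the mapped task expands over its result. A minimal sketch, in which the DAG id, the param name "n", and the task bodies are hypothetical:

```python
# Sketch of mapping n tasks off a runtime param; names are assumptions.
from datetime import datetime

from airflow.decorators import dag, task
from airflow.operators.python import get_current_context


@dag(
    dag_id="n_tasks_from_params",      # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule=None,
    catchup=False,
    params={"n": 3},                   # default; can be overridden when triggering
)
def n_tasks_from_params():
    @task
    def make_indices():
        # Params are resolved at run time, so a task builds the list to map over.
        context = get_current_context()
        return list(range(int(context["params"]["n"])))

    @task
    def process(i):
        print(f"processing chunk {i}")

    process.expand(i=make_indices())


n_tasks_from_params()
```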
Passing Airflow Variables to a constructor
I have a PythonOperator task that requires arguments to be passed to the constructor before the python_callable is invoked.
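For reference, op_kwargs is a templated field on PythonOperator, so a Variable can be referenced there as a Jinja template and resolved at run time rather than read at parse time. A minimal sketch, where the Variable name "my_setting", the DAG id, and the callable are assumptions:

```python
# Sketch of passing an Airflow Variable through a templated constructor arg.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def my_callable(setting):
    print(f"got setting: {setting}")


with DAG(
    dag_id="variable_to_constructor",   # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    PythonOperator(
        task_id="use_variable",
        python_callable=my_callable,
        # op_kwargs is templated, so the Variable is resolved when the task
        # runs instead of on every DAG parse.
        op_kwargs={"setting": "{{ var.value.my_setting }}"},
    )
```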
How to pass Airflow connection credentials to DatabricksSubmitNowOperator without exposing them?
I'm rather new to Airflow and couldn't find information anywhere. Currently, my DAG reads an existing Postgres connection configured in Airflow, takes its credentials, and passes them to a task that uses DatabricksSubmitNowOperator. I'm using an existing cluster and passing the credentials as "base_parameters". The process works fine, but the problem is that when you open the Databricks job details, "base_parameters" are visible and credentials like login and password are exposed. I tried passing the credential variables as environment variables, but they are not accessible between tasks. TL;DR: is there a way to read Airflow connection credentials and pass them to a DatabricksSubmitNowOperator task without exposing them?
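One common pattern (not necessarily the asker's) is to pass only the non-secret connection fields from Airflow and let the notebook fetch the password itself from a Databricks secret scope. The sketch below uses DatabricksSubmitRunOperator, the submit-run variant of the operator named in the question; the connection id, cluster id, notebook path, and secret scope name are all assumptions.

```python
# Sketch: keep secrets out of base_parameters; names and ids are assumptions.
from datetime import datetime

from airflow import DAG
from airflow.hooks.base import BaseHook
from airflow.providers.databricks.operators.databricks import DatabricksSubmitRunOperator

with DAG(
    dag_id="databricks_without_exposed_creds",   # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    pg = BaseHook.get_connection("my_postgres")  # hypothetical conn_id; runs at parse time

    DatabricksSubmitRunOperator(
        task_id="run_notebook",
        databricks_conn_id="databricks_default",
        json={
            "existing_cluster_id": "1234-567890-abcdefgh",   # hypothetical cluster id
            "notebook_task": {
                "notebook_path": "/Shared/ingest_postgres",  # hypothetical notebook
                # Only non-secret fields appear in the Databricks job details;
                # the notebook reads the password itself, e.g. via
                # dbutils.secrets.get("postgres-scope", "password").
                "base_parameters": {
                    "pg_host": pg.host,
                    "pg_login": pg.login,
                    "pg_secret_scope": "postgres-scope",     # hypothetical scope name
                },
            },
        },
    )
```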
How to pass in-operator parameters to built-in templating for Airflow?
I have recently started to use Airflow for various jobs and am trying to get a handle on templating.
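For reference, operator-level params are exposed to Jinja in any templated field alongside the built-in variables. A minimal sketch, where the DAG id, param name, and command are assumptions:

```python
# Sketch of per-operator params feeding a templated field; names are assumptions.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="templating_params_demo",   # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    BashOperator(
        task_id="greet",
        # bash_command is a templated field; operator params are available as
        # {{ params.<name> }} next to built-ins like {{ ds }}.
        bash_command="echo 'Hello {{ params.name }} on {{ ds }}'",
        params={"name": "world"},
    )
```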