Airflow – Task date is different from the date in the rendered template
I have an Airflow DAG that should run at 1 AM every day. Now, from the task details, it does seem like it ran at 1 AM on the 24th (the time of this post):
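Worth noting for questions like this one: in Airflow’s data-interval model, the run that starts at 1 AM on the 24th covers the interval that just ended, so templates such as {{ ds }} render the 23rd. A minimal sketch of that behaviour (assuming Airflow 2.4+, where schedule= replaces schedule_interval=; the DAG id is made up):

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Sketch: a DAG scheduled daily at 01:00. The run starting at 1 AM on
# the 24th processes the interval that just ended, so {{ ds }} renders
# the 23rd. That is the data-interval model, not a clock error.
with DAG(
    dag_id="daily_at_one_am",  # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="0 1 * * *",
    catchup=False,
):
    BashOperator(
        task_id="print_dates",
        bash_command=(
            "echo logical date: {{ ds }}, "
            "run actually started: {{ dag_run.start_date }}"
        ),
    )
```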
Pass Airflow Variables to Task
Is there an accepted best practice for passing or accessing encrypted Airflow variables/connections in modules executed during the DagRun? Is passing them as arguments to the PythonOperator with op_kwargs sufficient?
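A hedged sketch of the two usual patterns (the variable name my_secret is made up): fetching the Variable inside the callable, or passing it through op_kwargs as a Jinja template. Both defer the lookup to task runtime; calling Variable.get() at module top level would instead run on every scheduler parse.

```python
from datetime import datetime

from airflow import DAG
from airflow.models import Variable
from airflow.operators.python import PythonOperator

def use_secret(secret):
    # Option A: fetch at runtime inside the callable (never at module
    # top level, where it would run on every scheduler parse).
    also = Variable.get("my_secret")  # hypothetical variable name
    print("secret lengths:", len(secret), len(also))

with DAG("variable_demo", start_date=datetime(2024, 1, 1), schedule=None):
    PythonOperator(
        task_id="use_secret",
        python_callable=use_secret,
        # op_kwargs is a templated field, so this Variable lookup also
        # happens at task runtime, not at parse time.
        op_kwargs={"secret": "{{ var.value.my_secret }}"},
    )
```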
How to apply changes to Airflow job_heartbeat_sec scheduler variable
So I’ve updated the value of job_heartbeat_sec in the [scheduler] section of airflow.cfg. I stopped and restarted the scheduler after updating the value of the variable. I expected the /health endpoint of the webserver UI to only update the heartbeat timestamp every 20 seconds, but it is still using the default 5.
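Two things worth checking in a situation like this, offered as general advice rather than a diagnosis: environment variables such as AIRFLOW__SCHEDULER__JOB_HEARTBEAT_SEC override airflow.cfg (common in Docker deployments), and the /health heartbeat timestamp follows the scheduler’s own heartbeat interval, which is a separate option from job_heartbeat_sec. A quick way to see the values the running installation actually uses:

```python
from airflow.configuration import conf

# Values Airflow actually resolved, after env-var overrides of airflow.cfg.
print(conf.getint("scheduler", "job_heartbeat_sec"))
# The /health endpoint's heartbeat follows this separate option:
print(conf.getint("scheduler", "scheduler_heartbeat_sec"))
```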
How to execute a task in Airflow on another server?
I have already set up the following infrastructure:
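The infrastructure list is cut off in the excerpt. Independent of that setup, one common way to run work on a different machine is the SSHOperator from the apache-airflow-providers-ssh package; the connection id and script path below are made-up placeholders:

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.ssh.operators.ssh import SSHOperator

with DAG("remote_task_demo", start_date=datetime(2024, 1, 1), schedule=None):
    SSHOperator(
        task_id="run_on_other_server",
        # "remote_box" is a hypothetical SSH connection created under
        # Admin -> Connections, pointing at the other server.
        ssh_conn_id="remote_box",
        command="/opt/scripts/do_work.sh",
    )
```

An alternative, with the CeleryExecutor, is to run a worker on the target machine and route specific tasks to it via the operator’s queue argument.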
Set schedule_after_task_execution=False in Airflow from within a DAG
Is it possible to set schedule_after_task_execution=False from within a DAG? I need this functionality only for one particular DAG that has a problem with the “mini scheduler”. Can it be set per DAG?
I tried this code but it doesn’t seem to work:
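The attempted code is missing from the excerpt. For context, and as far as I can tell, schedule_after_task_execution is a global [scheduler] option with no per-DAG override in the Airflow versions I know; a quick check of the effective value:

```python
from airflow.configuration import conf

# True by default; set via airflow.cfg or the environment variable
# AIRFLOW__SCHEDULER__SCHEDULE_AFTER_TASK_EXECUTION, but globally only.
print(conf.getboolean("scheduler", "schedule_after_task_execution"))
```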
Unable to generate dynamic task IDs
I’m trying to execute the same query for a list of BQ tables as separate tasks using BigQueryInsertJobOperator.
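The standard pattern here is a parse-time loop that derives a unique task_id from each table name; the project, dataset, and table names below are placeholders:

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

TABLES = ["orders", "customers", "payments"]  # hypothetical table list

with DAG("bq_per_table", start_date=datetime(2024, 1, 1), schedule=None):
    for table in TABLES:
        BigQueryInsertJobOperator(
            # task_id must be unique within the DAG, so derive it
            # from the table name.
            task_id=f"query_{table}",
            configuration={
                "query": {
                    "query": f"SELECT COUNT(*) FROM `my_project.my_dataset.{table}`",
                    "useLegacySql": False,
                }
            },
        )
```

On Airflow 2.3+, dynamic task mapping (partial().expand()) is an alternative when the table list is only known at runtime.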
When I use “Trigger DAG” on the page to execute a task, the task doesn’t run immediately
I installed Airflow Version: v2.8.2 using Docker. When I use “Trigger DAG” on the page to execute a task, the task doesn’t run immediately. Instead, it stays in the queue and only executes after 5 minutes, even though my worker and resources are sufficient. I want the task to execute immediately whenever I click “Trigger DAG”. What should I do?
How to hide environment variable in Airflow DAG’s log?
I am using Airflow to periodically run a query on an Oracle server. The connection string is defined in an Airflow variable called password_oracle_connections, so that it remains secret.
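Two relevant mechanisms, offered as a sketch rather than a confirmed fix: Airflow 2 automatically masks the values of variables and connections whose names contain sensitive keywords such as "password" (the keyword list is extendable via [core] sensitive_var_conn_names), and mask_secret() can register any other value for redaction:

```python
from airflow.models import Variable
from airflow.utils.log.secrets_masker import mask_secret

def get_connection_string():
    # Because the variable name contains "password", Airflow 2 should
    # already redact its value in task logs.
    conn_str = Variable.get("password_oracle_connections")
    # For values that arrive by other means (e.g. os.environ), register
    # them explicitly so the logger masks them everywhere.
    mask_secret(conn_str)
    return conn_str
```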
Airflow TriggerDagRunOperator ignores allowed_states parameter
I’ve been trying to implement a master DAG that would trigger multiple DAGs, wait for their execution, and proceed with the next tasks.
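A frequent gotcha with this operator: allowed_states and failed_states are only consulted while the operator is waiting, so they have no effect unless wait_for_completion=True. A sketch with a made-up child DAG id:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.trigger_dagrun import TriggerDagRunOperator
from airflow.utils.state import DagRunState

with DAG("master_dag", start_date=datetime(2024, 1, 1), schedule=None):
    TriggerDagRunOperator(
        task_id="trigger_child",
        trigger_dag_id="child_dag",  # hypothetical downstream DAG
        # allowed_states/failed_states only matter while the operator
        # waits, so wait_for_completion must be True for them to apply.
        wait_for_completion=True,
        poke_interval=30,
        allowed_states=[DagRunState.SUCCESS],
        failed_states=[DagRunState.FAILED],
    )
```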
Python Airflow – how can I extract the results from a PythonOperator?
I have a DAG that calls a function that returns a PythonOperator. I want to get the results of this task so that I can pass it to another task. Any ideas if this is possible?
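Yes: a PythonOperator’s return value is pushed to XCom automatically, and a downstream callable can pull it through the task instance. A minimal sketch:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def produce():
    # The return value is pushed to XCom under the key "return_value".
    return {"rows": 42}

def consume(ti):
    # Pull the upstream task's return value from XCom.
    result = ti.xcom_pull(task_ids="produce")
    print("upstream returned:", result)

with DAG("xcom_demo", start_date=datetime(2024, 1, 1), schedule=None):
    produce_task = PythonOperator(task_id="produce", python_callable=produce)
    consume_task = PythonOperator(task_id="consume", python_callable=consume)
    produce_task >> consume_task
```

With the TaskFlow API (the @task decorator), the same wiring collapses to consume(produce()) and the XCom plumbing becomes implicit.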