DAG: c4_financial_data

schedule: 0 0,4,8,11,16 * * *


Task Instance: getDrowToken


Task Instance Details

Dependencies Blocking Task From Getting Scheduled
Dependency            Reason
Pool Slots Available  Tasks using non-existent pool 'default_pool' will not be scheduled.
Dagrun Running        Task instance's dagrun was not in the 'running' state but in the state 'success'.
Task Instance State   Task is in the 'success' state, which is not a valid state for execution. The task must be cleared in order to be run.
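As the last row notes, a task instance already in 'success' will not run again until it is cleared. A sketch of doing that from the command line, assuming the Airflow 1.10-era CLI implied by the /admin/ URLs below (dates and names come from this page; the command itself is an illustration, not taken from the source):

```shell
# Assumption: Airflow 1.10-style CLI (matching the /admin/ log URLs on this page).
# Clear only getDrowToken for the 2025-03-10T11:00:00 run so the scheduler can
# queue it again; -s/-e bound the execution-date window to that single run,
# and -t is a task-id regex.
airflow clear c4_financial_data \
    -t getDrowToken \
    -s 2025-03-10T11:00:00 \
    -e 2025-03-10T11:00:00
```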
Attribute: python_callable
def getDrowToken(**context):
    # Earlier approach, kept for reference: the same call via SimpleHttpOperator.
    # response = SimpleHttpOperator(
    #     task_id="getDrowToken",
    #     http_conn_id="getDrowToken",
    #     endpoint="https://uat2.drow.cloud/api/auth/authenticate",
    #     method="POST",
    #     data={
    #         "username": "icwp2@drow.cloud",
    #         "password": "dGVzdDAxQHRlc3QuY29t"
    #     },
    #     xcom_push=True,
    # )

    # Authenticate against the dRoW API and share the token with downstream
    # tasks via XCom. Requires `import requests` and a module-level
    # `dRoW_api_end_url` in the DAG file.
    response = requests.post(
        url=f"{dRoW_api_end_url}/api/auth/authenticate",
        data={
            "username": "icwp2@drow.cloud",
            "password": "dGVzdDAxQHRlc3QuY29t",
        },
    ).json()
    context["ti"].xcom_push(key="token", value=response["token"])
Task Instance Attributes
Attribute Value
dag_id c4_financial_data
duration 0.409359
end_date 2025-03-10 16:00:29.188699+00:00
execution_date 2025-03-10T11:00:00+00:00
executor_config {}
generate_command <function TaskInstance.generate_command at 0x7f152f9bf320>
hostname 63fbafbc3109
is_premature False
job_id 112233
key ('c4_financial_data', 'getDrowToken', <Pendulum [2025-03-10T11:00:00+00:00]>, 2)
log <Logger airflow.task (INFO)>
log_filepath /usr/local/airflow/logs/c4_financial_data/getDrowToken/2025-03-10T11:00:00+00:00.log
log_url http://localhost:8080/admin/airflow/log?execution_date=2025-03-10T11%3A00%3A00%2B00%3A00&task_id=getDrowToken&dag_id=c4_financial_data
logger <Logger airflow.task (INFO)>
mark_success_url http://localhost:8080/success?task_id=getDrowToken&dag_id=c4_financial_data&execution_date=2025-03-10T11%3A00%3A00%2B00%3A00&upstream=false&downstream=false
max_tries 1
metadata MetaData(bind=None)
next_try_number 2
operator PythonOperator
pid 891341
pool default_pool
prev_attempted_tries 1
previous_execution_date_success 2025-03-10 08:00:00+00:00
previous_start_date_success 2025-03-10 11:00:39.678902+00:00
previous_ti <TaskInstance: c4_financial_data.getDrowToken 2025-03-10 08:00:00+00:00 [success]>
previous_ti_success <TaskInstance: c4_financial_data.getDrowToken 2025-03-10 08:00:00+00:00 [success]>
priority_weight 2
queue default
queued_dttm 2025-03-10 16:00:26.663425+00:00
raw False
run_as_user None
start_date 2025-03-10 16:00:28.779340+00:00
state success
task <Task(PythonOperator): getDrowToken>
task_id getDrowToken
test_mode False
try_number 2
unixname airflow
Task Attributes
Attribute Value
dag <DAG: c4_financial_data>
dag_id c4_financial_data
depends_on_past False
deps {<TIDep(Not In Retry Period)>, <TIDep(Trigger Rule)>, <TIDep(Previous Dagrun State)>}
do_xcom_push True
downstream_list [<Task(PythonOperator): getMongoDB>]
downstream_task_ids {'getMongoDB'}
email None
email_on_failure True
email_on_retry True
end_date None
execution_timeout None
executor_config {}
extra_links []
global_operator_extra_link_dict {}
inlets []
lineage_data None
log <Logger airflow.task.operators (INFO)>
logger <Logger airflow.task.operators (INFO)>
max_retry_delay None
on_failure_callback None
on_retry_callback None
on_success_callback None
op_args []
op_kwargs {}
operator_extra_link_dict {}
operator_extra_links ()
outlets []
owner airflow
params {}
pool default_pool
priority_weight 1
priority_weight_total 2
provide_context True
queue default
resources None
retries 1
retry_delay 0:05:00
retry_exponential_backoff False
run_as_user None
schedule_interval 0 0,4,8,11,16 * * *
shallow_copy_attrs ('python_callable', 'op_kwargs')
sla None
start_date 2022-10-24T00:00:00+00:00
subdag None
task_concurrency None
task_id getDrowToken
task_type PythonOperator
template_ext []
template_fields ('templates_dict', 'op_args', 'op_kwargs')
templates_dict None
trigger_rule all_success
ui_color #ffefeb
ui_fgcolor #000
upstream_list []
upstream_task_ids set()
wait_for_downstream False
weight_rule downstream