Dependency | Reason |
---|---|
Dagrun Running | Task instance's dagrun was not in the 'running' state but in the state 'failed'. |
Trigger Rule | Task's trigger rule 'all_success' requires all upstream tasks to have succeeded, but found 1 non-success(es). upstream_tasks_state={'total': 1, 'successes': 0, 'skipped': 0, 'failed': 1, 'upstream_failed': 0, 'done': 1}, upstream_task_ids={'create_table_task'} |
Task Instance State | Task is in the 'upstream_failed' state which is not a valid state for execution. The task must be cleared in order to be run. |
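The Trigger Rule row is the actual blocker: `getDrowToken` uses the default `all_success` rule, so when its upstream `create_table_task` failed, the scheduler marked it `upstream_failed` without ever running it. If the token fetch should run regardless of the upstream result, a different trigger rule can be set on the operator. A minimal sketch, assuming Airflow 1.10-style imports, that the DAG object is named `dag`, and that the callable is the `getDrowToken` shown in the listing that follows:

```python
from airflow.operators.python_operator import PythonOperator
from airflow.utils.trigger_rule import TriggerRule

# Hypothetical variant, not the live definition: the live DAG uses the default
# TriggerRule.ALL_SUCCESS, which is why the task went to 'upstream_failed'.
get_drow_token_task = PythonOperator(
    task_id="getDrowToken",
    python_callable=getDrowToken,       # callable from the listing below
    provide_context=True,               # matches provide_context=True in the task attributes
    trigger_rule=TriggerRule.ALL_DONE,  # run once upstream finishes, success or not
    dag=dag,                            # assumes the DAG object is named `dag`
)
```

In most cases, though, the right fix is to repair `create_table_task` and then clear `getDrowToken`, since the Task Instance State row says the task must be cleared before it can run again.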
```python
import requests

# dRoW_api_end_url is a module-level constant defined elsewhere in this DAG file.

def getDrowToken(**context):
    # Earlier attempt with SimpleHttpOperator, kept commented out for reference:
    # response = SimpleHttpOperator(
    #     task_id="getDrowToken",
    #     http_conn_id="getDrowToken",
    #     endpoint="https://uat2.drow.cloud/api/auth/authenticate",
    #     method="POST",
    #     data={
    #         "username": "icwp2@drow.cloud",
    #         "password": "dGVzdDAxQHRlc3QuY29t"
    #     },
    #     xcom_push=True,
    # )

    # Authenticate against the dRoW API and push the returned token to XCom
    # so the downstream task (getMongoDB) can pull it.
    response = requests.post(
        url=f"{dRoW_api_end_url}/api/auth/authenticate",
        data={
            "username": "icwp2@drow.cloud",
            "password": "dGVzdDAxQHRlc3QuY29t"
        }
    ).json()
    context["ti"].xcom_push(key="token", value=response["token"])
```
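The callable pushes the token to XCom under the key `token`; the downstream `getMongoDB` task (listed under `downstream_task_ids` further below) would typically pull it back inside its own callable. A hedged sketch of that pull, assuming `getMongoDB` is also a `PythonOperator` callable and that the token is sent as a bearer header; the endpoint is a placeholder, not the real one:

```python
import requests

def getMongoDB(**context):
    # Sketch only: the real getMongoDB callable is not shown here;
    # only its task_id appears in downstream_task_ids.
    token = context["ti"].xcom_pull(task_ids="getDrowToken", key="token")
    # Assumed usage: authenticate subsequent dRoW calls with the token.
    response = requests.get(
        f"{dRoW_api_end_url}/api/some-endpoint",  # hypothetical endpoint
        headers={"Authorization": f"Bearer {token}"},
    )
    return response.json()
```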
Task Instance Attribute | Value |
---|---|
dag_id | nd201907_cleaning |
duration | None |
end_date | 2025-04-26 10:06:44.692421+00:00 |
execution_date | 2025-04-26T04:00:00+00:00 |
executor_config | {} |
generate_command | <function TaskInstance.generate_command at 0x7f152f9bf320> |
hostname | |
is_premature | False |
job_id | None |
key | ('nd201907_cleaning', 'getDrowToken', <Pendulum [2025-04-26T04:00:00+00:00]>, 1) |
log | <Logger airflow.task (INFO)> |
log_filepath | /usr/local/airflow/logs/nd201907_cleaning/getDrowToken/2025-04-26T04:00:00+00:00.log |
log_url | http://localhost:8080/admin/airflow/log?execution_date=2025-04-26T04%3A00%3A00%2B00%3A00&task_id=getDrowToken&dag_id=nd201907_cleaning |
logger | <Logger airflow.task (INFO)> |
mark_success_url | http://localhost:8080/success?task_id=getDrowToken&dag_id=nd201907_cleaning&execution_date=2025-04-26T04%3A00%3A00%2B00%3A00&upstream=false&downstream=false |
max_tries | 1 |
metadata | MetaData(bind=None) |
next_try_number | 1 |
operator | None |
pid | None |
pool | default_pool |
prev_attempted_tries | 0 |
previous_execution_date_success | 2025-04-22 16:00:00+00:00 |
previous_start_date_success | 2025-04-22 22:01:19.250567+00:00 |
previous_ti | <TaskInstance: nd201907_cleaning.getDrowToken 2025-04-25 22:00:00+00:00 [upstream_failed]> |
previous_ti_success | <TaskInstance: nd201907_cleaning.getDrowToken 2025-04-22 16:00:00+00:00 [success]> |
priority_weight | 2 |
queue | default |
queued_dttm | None |
raw | False |
run_as_user | None |
start_date | 2025-04-26 10:06:44.692400+00:00 |
state | upstream_failed |
task | <Task(PythonOperator): getDrowToken> |
task_id | getDrowToken |
test_mode | False |
try_number | 1 |
unixname | airflow |
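All of the values above live on the `TaskInstance` object that the callable receives as `context["ti"]` when `provide_context=True`, the same handle used for the `xcom_push` call earlier. A small hypothetical helper, just to show how a callable can log a few of these attributes:

```python
def log_task_instance_info(**context):
    # Hypothetical helper (not part of the original DAG): print a few of the
    # TaskInstance attributes listed in the table above.
    ti = context["ti"]
    print(f"dag_id={ti.dag_id}, task_id={ti.task_id}")
    print(f"try_number={ti.try_number}, max_tries={ti.max_tries}, state={ti.state}")
    print(f"log_url={ti.log_url}")
```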
Task Attribute | Value |
---|---|
dag | <DAG: nd201907_cleaning> |
dag_id | nd201907_cleaning |
depends_on_past | False |
deps | {<TIDep(Not In Retry Period)>, <TIDep(Trigger Rule)>, <TIDep(Previous Dagrun State)>} |
do_xcom_push | True |
downstream_list | [<Task(PythonOperator): getMongoDB>] |
downstream_task_ids | {'getMongoDB'} |
email | None |
email_on_failure | True |
email_on_retry | True |
end_date | None |
execution_timeout | None |
executor_config | {} |
extra_links | [] |
global_operator_extra_link_dict | {} |
inlets | [] |
lineage_data | None |
log | <Logger airflow.task.operators (INFO)> |
logger | <Logger airflow.task.operators (INFO)> |
max_retry_delay | None |
on_failure_callback | None |
on_retry_callback | None |
on_success_callback | None |
op_args | [] |
op_kwargs | {} |
operator_extra_link_dict | {} |
operator_extra_links | () |
outlets | [] |
owner | airflow |
params | {} |
pool | default_pool |
priority_weight | 1 |
priority_weight_total | 2 |
provide_context | True |
queue | default |
resources | None |
retries | 1 |
retry_delay | 0:05:00 |
retry_exponential_backoff | False |
run_as_user | None |
schedule_interval | 0 4,10,16,22 * * * |
shallow_copy_attrs | ('python_callable', 'op_kwargs') |
sla | None |
start_date | 2023-01-17T00:00:00+00:00 |
subdag | None |
task_concurrency | None |
task_id | getDrowToken |
task_type | PythonOperator |
template_ext | [] |
template_fields | ('templates_dict', 'op_args', 'op_kwargs') |
templates_dict | None |
trigger_rule | all_success |
ui_color | #ffefeb |
ui_fgcolor | #000 |
upstream_list | [<Task(PostgresOperator): create_table_task>] |
upstream_task_ids | {'create_table_task'} |
wait_for_downstream | False |
weight_rule | downstream |
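The Task attributes above map directly onto how the operators are declared in the DAG file. Below is a hedged reconstruction from those values alone (dag_id, schedule_interval, start_date, retries, retry_delay, provide_context, and the upstream/downstream task ids); the connection id, the SQL, and the variable names are assumptions, not the author's actual code:

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python_operator import PythonOperator
from airflow.operators.postgres_operator import PostgresOperator

# Reconstructed from the attribute tables above; argument names, SQL, and the
# connection id are placeholders, not the author's actual DAG file.
dag = DAG(
    dag_id="nd201907_cleaning",
    schedule_interval="0 4,10,16,22 * * *",
    start_date=datetime(2023, 1, 17),
    default_args={
        "owner": "airflow",
        "retries": 1,
        "retry_delay": timedelta(minutes=5),
    },
)

create_table_task = PostgresOperator(
    task_id="create_table_task",
    postgres_conn_id="postgres_default",   # assumed connection id
    sql="CREATE TABLE IF NOT EXISTS ...",  # actual DDL not shown
    dag=dag,
)

get_token_task = PythonOperator(
    task_id="getDrowToken",
    python_callable=getDrowToken,   # callable from the listing above
    provide_context=True,
    dag=dag,
)

get_mongo_task = PythonOperator(
    task_id="getMongoDB",
    python_callable=getMongoDB,     # sketched earlier; real callable not shown
    provide_context=True,
    dag=dag,
)

# Dependency chain implied by upstream_task_ids / downstream_task_ids:
create_table_task >> get_token_task >> get_mongo_task
```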