| Dependency | Reason |
|---|---|
| Task Instance State | Task is in the 'upstream_failed' state which is not a valid state for execution. The task must be cleared in order to be run. |
| Dag Not Paused | Task's DAG 'ReportAPIChecker' is paused. |
| Dagrun Running | Task instance's dagrun was not in the 'running' state but in the state 'failed'. |
| Trigger Rule | Task's trigger rule 'all_success' requires all upstream tasks to have succeeded, but found 1 non-success(es). upstream_tasks_state={'total': 1, 'successes': 0, 'skipped': 0, 'failed': 1, 'upstream_failed': 0, 'done': 1}, upstream_task_ids={'getDrowToken'} |
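The Trigger Rule row above is the root cause: with the default `all_success` rule, the one failed upstream task (`getDrowToken`) puts this task into `upstream_failed`. A minimal stand-alone sketch (the helper names are illustrative, not Airflow APIs) of how that check reduces to counting the `upstream_tasks_state` dict reported above:

```python
def all_success_met(state):
    # 'all_success' requires every upstream task to have succeeded.
    return state['successes'] == state['total']

def all_done_met(state):
    # 'all_done' only requires every upstream task to have finished,
    # so a failed upstream would no longer block the task.
    return state['done'] == state['total']

# The exact counts reported in the Trigger Rule dependency above.
upstream_tasks_state = {'total': 1, 'successes': 0, 'skipped': 0,
                        'failed': 1, 'upstream_failed': 0, 'done': 1}

print(all_success_met(upstream_tasks_state))  # False -> task is blocked
print(all_done_met(upstream_tasks_state))     # True  -> would run under 'all_done'
```

Switching the operator's `trigger_rule` from `all_success` to `all_done` (or fixing `getDrowToken` and clearing this task) would let it run.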
```python
import json
import re

def getReportAPIAndSendReq(**context):
    # Load the workflow -> record-id mapping and the module configurations.
    with open(f'{CUR_DIR}/data/workflowRecordIdMap.json') as f:
        recordIdMapData = json.load(f)
    with open(f'{CUR_DIR}/data/modulesConfigs.json') as f:
        modulesConfigData = json.load(f)

    for data in modulesConfigData:
        result = f'{dRoW_api_end_url}/api/module/reporting?report='
        # Report key, optionally suffixed with the configured output type.
        if data['configs']['config'].get('outputType') is not None:
            result += data['configs']['key'] + '_' + data['configs']['config']['outputType'] + '&workflow_id=' + data['relatedId']
        else:
            result += data['configs']['key'] + '&workflow_id=' + data['relatedId']
        # Attach the record id when this workflow has one in the mapping.
        recordId = next((_data for _data in recordIdMapData if _data.get('workflowId') == data['relatedId']), None)
        if recordId is not None:
            result += '&record_id=' + recordId['recordId']
        # Pass every config entry as a snake_case query parameter.
        for _config in data['configs']['config'].keys():
            __config = re.sub('([a-z0-9])([A-Z])', r'\1_\2', _config).lower()
            result += '&' + __config + '=' + f"{data['configs']['config'][_config]}"
        result += '&year=2022&month=10&day=31'
```
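The function builds the reporting URL by plain string concatenation, so a config value containing a space or `&` would corrupt the query string. A stand-alone sketch (the `camel_to_snake` helper and the sample config are illustrative, not taken from the DAG) of the same key conversion done with `urllib.parse.urlencode`, which escapes values safely:

```python
import re
from urllib.parse import urlencode

def camel_to_snake(name):
    # Same regex the DAG uses: insert '_' at each lowercase/digit ->
    # uppercase boundary, then lowercase the whole string.
    return re.sub('([a-z0-9])([A-Z])', r'\1_\2', name).lower()

# Hypothetical config entry shaped like one element of modulesConfigs.json.
config = {'outputType': 'pdf', 'pageSize': 'A4'}
params = {camel_to_snake(k): v for k, v in config.items()}
params.update({'year': 2022, 'month': 10, 'day': 31})

print(urlencode(params))  # output_type=pdf&page_size=A4&year=2022&month=10&day=31
```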
| Attribute | Value |
|---|---|
| dag_id | ReportAPIChecker |
| duration | None |
| end_date | 2025-07-21 00:07:33.494347+00:00 |
| execution_date | 2025-07-20T00:00:00+00:00 |
| executor_config | {} |
| generate_command | <function TaskInstance.generate_command at 0x7f81b3e42320> |
| hostname | |
| is_premature | False |
| job_id | None |
| key | ('ReportAPIChecker', 'getReportAPIAndSendReq', <Pendulum [2025-07-20T00:00:00+00:00]>, 1) |
| log | <Logger airflow.task (INFO)> |
| log_filepath | /usr/local/airflow/logs/ReportAPIChecker/getReportAPIAndSendReq/2025-07-20T00:00:00+00:00.log |
| log_url | http://localhost:8080/admin/airflow/log?execution_date=2025-07-20T00%3A00%3A00%2B00%3A00&task_id=getReportAPIAndSendReq&dag_id=ReportAPIChecker |
| logger | <Logger airflow.task (INFO)> |
| mark_success_url | http://localhost:8080/success?task_id=getReportAPIAndSendReq&dag_id=ReportAPIChecker&execution_date=2025-07-20T00%3A00%3A00%2B00%3A00&upstream=false&downstream=false |
| max_tries | 1 |
| metadata | MetaData(bind=None) |
| next_try_number | 1 |
| operator | None |
| pid | None |
| pool | default_pool |
| prev_attempted_tries | 0 |
| previous_execution_date_success | None |
| previous_start_date_success | None |
| previous_ti | <TaskInstance: ReportAPIChecker.getReportAPIAndSendReq 2025-07-19 00:00:00+00:00 [upstream_failed]> |
| previous_ti_success | None |
| priority_weight | 1 |
| queue | default |
| queued_dttm | None |
| raw | False |
| run_as_user | None |
| start_date | 2025-07-21 00:07:33.494334+00:00 |
| state | upstream_failed |
| task | <Task(PythonOperator): getReportAPIAndSendReq> |
| task_id | getReportAPIAndSendReq |
| test_mode | False |
| try_number | 1 |
| unixname | airflow |

| Attribute | Value |
|---|---|
| dag | <DAG: ReportAPIChecker> |
| dag_id | ReportAPIChecker |
| depends_on_past | False |
| deps | {<TIDep(Trigger Rule)>, <TIDep(Not In Retry Period)>, <TIDep(Previous Dagrun State)>} |
| do_xcom_push | True |
| downstream_list | [] |
| downstream_task_ids | set() |
| email | None |
| email_on_failure | True |
| email_on_retry | True |
| end_date | None |
| execution_timeout | None |
| executor_config | {} |
| extra_links | [] |
| global_operator_extra_link_dict | {} |
| inlets | [] |
| lineage_data | None |
| log | <Logger airflow.task.operators (INFO)> |
| logger | <Logger airflow.task.operators (INFO)> |
| max_retry_delay | None |
| on_failure_callback | None |
| on_retry_callback | None |
| on_success_callback | None |
| op_args | [] |
| op_kwargs | {'name': 'Dylan'} |
| operator_extra_link_dict | {} |
| operator_extra_links | () |
| outlets | [] |
| owner | airflow |
| params | {} |
| pool | default_pool |
| priority_weight | 1 |
| priority_weight_total | 1 |
| provide_context | True |
| queue | default |
| resources | None |
| retries | 1 |
| retry_delay | 0:05:00 |
| retry_exponential_backoff | False |
| run_as_user | None |
| schedule_interval | 0 0 * * * |
| shallow_copy_attrs | ('python_callable', 'op_kwargs') |
| sla | None |
| start_date | 2022-10-24T00:00:00+00:00 |
| subdag | None |
| task_concurrency | None |
| task_id | getReportAPIAndSendReq |
| task_type | PythonOperator |
| template_ext | [] |
| template_fields | ('templates_dict', 'op_args', 'op_kwargs') |
| templates_dict | None |
| trigger_rule | all_success |
| ui_color | #ffefeb |
| ui_fgcolor | #000 |
| upstream_list | [<Task(PythonOperator): getDrowToken>] |
| upstream_task_ids | {'getDrowToken'} |
| wait_for_downstream | False |
| weight_rule | downstream |