| Dependency | Reason |
|---|---|
| Task Instance State | Task is in the 'success' state which is not a valid state for execution. The task must be cleared in order to be run. |
| Dagrun Running | Task instance's dagrun was not in the 'running' state but in the state 'success'. |
```python
import pandas as pd
from pandas import json_normalize
from sqlalchemy import create_engine


def getPaymentStatistics(**context):
    token = context.get("ti").xcom_pull(key="token")
    # getSheetData is a helper defined elsewhere in this DAG module.
    PaymentData = getSheetData(token, "69045bd4e7623895abe44568")
    FinalStatsData = getSheetData(token, "69045f64c641945865aafd76")

    # PostgreSQL database connection parameters
    host = 'drowdatewarehouse.crlwwhgepgi7.ap-east-1.rds.amazonaws.com'
    dbUserName = 'dRowAdmin'
    dbUserPassword = 'drowsuper'
    database = 'drowDateWareHouse'
    port = "5432"

    # SQLAlchemy 1.4+ only accepts the 'postgresql://' dialect name;
    # the legacy 'postgres://' alias was removed.
    conn_string = ('postgresql://' +
                   dbUserName + ':' + dbUserPassword +
                   '@' + host + ':' + port + '/' + database)
    db = create_engine(conn_string)

    def clean_columns(df):
        # Make column names SQL-friendly. regex=False so '.', '(' and ')'
        # are replaced literally instead of acting as regex metacharacters.
        df.columns = (df.columns
                      .str.replace(' ', '_', regex=False)
                      .str.replace('.', '', regex=False)
                      .str.replace('(', '_', regex=False)
                      .str.replace(')', '', regex=False)
                      .str.replace('%', 'percent', regex=False)
                      .str.replace('/', '_', regex=False))
        return df

    with db.connect() as conn:
        # Payment statistics: build the frame in one pd.concat rather
        # than the deprecated (and quadratic) append-in-a-loop pattern.
        df = pd.concat([json_normalize(x) for x in PaymentData],
                       ignore_index=True)
        df = clean_columns(df)
        # Max over the whole column, not just the last record.
        latest_ip = df['IP_No'].max()
        print("Payment Statistics df:", df)
        df.to_sql('ssm519_payment_statistics', con=conn,
                  if_exists='replace', index=False)
        print("Latest IP No.:", latest_ip)

        # Final statistics for this contract
        df = pd.concat([json_normalize(x) for x in FinalStatsData],
                       ignore_index=True)
        df = clean_columns(df)
        df['Contract_Number'] = 'SSM519'
        df['Latest_IP_No'] = latest_ip

        main_df = pd.read_sql('SELECT * FROM scc_final_stats', con=conn)
        # Remove the old data for this contract number
        main_df = main_df[main_df['Contract_Number'] != 'SSM519']
        # Add the new data
        main_df = pd.concat([main_df, df], ignore_index=True)
        # Replace the SQL table with updated data
        main_df.to_sql('scc_final_stats', con=conn,
                       if_exists='replace', index=False)
```
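The delete-then-append upsert at the end of the task can be exercised without the warehouse. Below is a minimal sketch against an in-memory SQLite engine; the table and column names mirror the task, but the rows are invented for illustration:

```python
import pandas as pd
from sqlalchemy import create_engine

# In-memory SQLite stands in for the Postgres warehouse.
engine = create_engine("sqlite://")

# Invented existing rows: one stale SSM519 row, one other contract.
existing = pd.DataFrame({
    "Contract_Number": ["SSM519", "SSM520"],
    "Latest_IP_No": [3, 7],
})
existing.to_sql("scc_final_stats", con=engine, index=False)

# Invented fresh SSM519 rows, as produced upstream in the task.
fresh = pd.DataFrame({
    "Contract_Number": ["SSM519"],
    "Latest_IP_No": [4],
})

main_df = pd.read_sql("SELECT * FROM scc_final_stats", con=engine)
# Drop the stale rows for this contract, keep the other contracts.
main_df = main_df[main_df["Contract_Number"] != "SSM519"]
main_df = pd.concat([main_df, fresh], ignore_index=True)
# Rewrite the table with the merged data.
main_df.to_sql("scc_final_stats", con=engine,
               if_exists="replace", index=False)

result = pd.read_sql("SELECT * FROM scc_final_stats", con=engine)
```

Note the design trade-off: a `DELETE FROM scc_final_stats WHERE Contract_Number = 'SSM519'` followed by a plain append would avoid re-writing every other contract's rows on each run.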
| Attribute | Value |
|---|---|
| dag_id | ssm519-scc |
| duration | 3.091161 |
| end_date | 2025-11-04 09:58:48.758249+00:00 |
| execution_date | 2025-11-04T09:56:51.825734+00:00 |
| executor_config | {} |
| generate_command | <function TaskInstance.generate_command at 0x7f81b3e42320> |
| hostname | 63fbafbc3109 |
| is_premature | False |
| job_id | 251395 |
| key | ('ssm519-scc', 'getPaymentStatistics', <Pendulum [2025-11-04T09:56:51.825734+00:00]>, 2) |
| log | <Logger airflow.task (INFO)> |
| log_filepath | /usr/local/airflow/logs/ssm519-scc/getPaymentStatistics/2025-11-04T09:56:51.825734+00:00.log |
| log_url | http://localhost:8080/admin/airflow/log?execution_date=2025-11-04T09%3A56%3A51.825734%2B00%3A00&task_id=getPaymentStatistics&dag_id=ssm519-scc |
| logger | <Logger airflow.task (INFO)> |
| mark_success_url | http://localhost:8080/success?task_id=getPaymentStatistics&dag_id=ssm519-scc&execution_date=2025-11-04T09%3A56%3A51.825734%2B00%3A00&upstream=false&downstream=false |
| max_tries | 1 |
| metadata | MetaData(bind=None) |
| next_try_number | 2 |
| operator | PythonOperator |
| pid | 3127631 |
| pool | default_pool |
| prev_attempted_tries | 1 |
| previous_execution_date_success | 2025-11-04 07:00:00+00:00 |
| previous_start_date_success | 2025-11-04 15:03:30.942701+00:00 |
| previous_ti | <TaskInstance: ssm519-scc.getPaymentStatistics 2025-11-04 07:00:00+00:00 [success]> |
| previous_ti_success | <TaskInstance: ssm519-scc.getPaymentStatistics 2025-11-04 07:00:00+00:00 [success]> |
| priority_weight | 1 |
| queue | default |
| queued_dttm | 2025-11-04 09:58:44.062951+00:00 |
| raw | False |
| run_as_user | None |
| start_date | 2025-11-04 09:58:45.667088+00:00 |
| state | success |
| task | <Task(PythonOperator): getPaymentStatistics> |
| task_id | getPaymentStatistics |
| test_mode | False |
| try_number | 2 |
| unixname | airflow |
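As a sanity check, the `duration` attribute above is simply `end_date` minus `start_date`; the reported values are consistent:

```python
from datetime import datetime

# Timestamps copied from the task-instance attributes above.
start = datetime.fromisoformat("2025-11-04 09:58:45.667088+00:00")
end = datetime.fromisoformat("2025-11-04 09:58:48.758249+00:00")

duration = (end - start).total_seconds()
print(duration)  # 3.091161, matching the reported duration
```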

| Attribute | Value |
|---|---|
| dag | <DAG: ssm519-scc> |
| dag_id | ssm519-scc |
| depends_on_past | False |
| deps | {<TIDep(Trigger Rule)>, <TIDep(Not In Retry Period)>, <TIDep(Previous Dagrun State)>} |
| do_xcom_push | True |
| downstream_list | [] |
| downstream_task_ids | set() |
| email | None |
| email_on_failure | True |
| email_on_retry | True |
| end_date | None |
| execution_timeout | None |
| executor_config | {} |
| extra_links | [] |
| global_operator_extra_link_dict | {} |
| inlets | [] |
| lineage_data | None |
| log | <Logger airflow.task.operators (INFO)> |
| logger | <Logger airflow.task.operators (INFO)> |
| max_retry_delay | None |
| on_failure_callback | None |
| on_retry_callback | None |
| on_success_callback | None |
| op_args | [] |
| op_kwargs | {'name': 'Dylan'} |
| operator_extra_link_dict | {} |
| operator_extra_links | () |
| outlets | [] |
| owner | airflow |
| params | {} |
| pool | default_pool |
| priority_weight | 1 |
| priority_weight_total | 1 |
| provide_context | True |
| queue | default |
| resources | None |
| retries | 1 |
| retry_delay | 0:05:00 |
| retry_exponential_backoff | False |
| run_as_user | None |
| schedule_interval | 0 7,15 * * * |
| shallow_copy_attrs | ('python_callable', 'op_kwargs') |
| sla | None |
| start_date | 2023-01-17T00:00:00+00:00 |
| subdag | None |
| task_concurrency | None |
| task_id | getPaymentStatistics |
| task_type | PythonOperator |
| template_ext | [] |
| template_fields | ('templates_dict', 'op_args', 'op_kwargs') |
| templates_dict | None |
| trigger_rule | all_success |
| ui_color | #ffefeb |
| ui_fgcolor | #000 |
| upstream_list | [<Task(PythonOperator): getDrowToken>] |
| upstream_task_ids | {'getDrowToken'} |
| wait_for_downstream | False |
| weight_rule | downstream |
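The `schedule_interval` of `0 7,15 * * *` fires at 07:00 and 15:00 UTC daily. A hand-rolled sketch of the next fire times for this specific hours-of-day schedule (a toy stand-in for a real cron parser, not Airflow's scheduler, which additionally waits until each data interval has closed before running):

```python
from datetime import datetime, timedelta, timezone

def next_runs(after, hours=(7, 15), n=2):
    """Next n fire times for a fixed '0 H1,H2 * * *' cron, in UTC."""
    runs = []
    t = after
    while len(runs) < n:
        candidates = [t.replace(hour=h, minute=0, second=0, microsecond=0)
                      for h in hours]
        future = [c for c in candidates if c > t]
        if future:
            t = min(future)
            runs.append(t)
        else:
            # No slot left today: roll to the start of the next day.
            t = (t + timedelta(days=1)).replace(hour=0, minute=0,
                                                second=0, microsecond=0)
    return runs

# The moment this task instance was queued (from the attributes above).
queued = datetime(2025, 11, 4, 9, 58, 44, tzinfo=timezone.utc)
print(next_runs(queued))  # 15:00 today, then 07:00 tomorrow
```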