Hey @Anonymous - thank you for your reply. Sure, here are the log contents:
*** Found logs served from host http://af-8a562d6a58184d509fe8af27d6267e64-worker-0.af-8a562d6a58184d509fe8af27d6267e64-worker.adf.svc.cluster.local:8793/log/dag_id=Run_Fabric_Item/run_id=manual__2024-07-31T14:04:38.329715+00:00/task_id=run_fabric_pipeline/attempt=1.log
[2024-07-31T14:04:40.081+0000] {logging_mixin.py:150} INFO - Changing /opt/airflow/logs/dag_id=Run_Fabric_Item/run_id=manual__2024-07-31T14:04:38.329715+00:00/task_id=run_fabric_pipeline permission to 509
[2024-07-31T14:04:40.262+0000] {logging_mixin.py:150} INFO - Changing /opt/airflow/logs/dag_id=Run_Fabric_Item/run_id=manual__2024-07-31T14:04:38.329715+00:00/task_id=run_fabric_pipeline permission to 509
[2024-07-31T14:04:40.364+0000] {taskinstance.py:1103} INFO - Dependencies all met for dep_context=non-requeueable deps ti=<TaskInstance: Run_Fabric_Item.run_fabric_pipeline manual__2024-07-31T14:04:38.329715+00:00 [queued]>
[2024-07-31T14:04:40.377+0000] {taskinstance.py:1103} INFO - Dependencies all met for dep_context=requeueable deps ti=<TaskInstance: Run_Fabric_Item.run_fabric_pipeline manual__2024-07-31T14:04:38.329715+00:00 [queued]>
[2024-07-31T14:04:40.377+0000] {taskinstance.py:1308} INFO - Starting attempt 1 of 1
[2024-07-31T14:04:40.395+0000] {taskinstance.py:1327} INFO - Executing <Task(FabricRunItemOperator): run_fabric_pipeline> on 2024-07-31 14:04:38.329715+00:00
[2024-07-31T14:04:40.400+0000] {standard_task_runner.py:57} INFO - Started process 51 to run task
[2024-07-31T14:04:40.406+0000] {standard_task_runner.py:84} INFO - Running: ['airflow', 'tasks', 'run', 'Run_Fabric_Item', 'run_fabric_pipeline', 'manual__2024-07-31T14:04:38.329715+00:00', '--job-id', '48', '--raw', '--subdir', 'DAGS_FOLDER/fabric_test.py', '--cfg-path', '/tmp/tmpv9e7md6v']
[2024-07-31T14:04:40.407+0000] {standard_task_runner.py:85} INFO - Job 48: Subtask run_fabric_pipeline
[2024-07-31T14:04:40.493+0000] {warnings.py:109} WARNING - /home/airflow/.local/lib/python3.8/site-packages/airflow/settings.py:195: DeprecationWarning: The sql_alchemy_conn option in [core] has been moved to the sql_alchemy_conn option in [database] - the old setting has been used, but please update your config.
SQL_ALCHEMY_CONN = conf.get("database", "SQL_ALCHEMY_CONN")
[2024-07-31T14:04:40.731+0000] {logging_mixin.py:150} INFO - Changing /opt/airflow/logs/dag_id=Run_Fabric_Item/run_id=manual__2024-07-31T14:04:38.329715+00:00/task_id=run_fabric_pipeline permission to 509
[2024-07-31T14:04:40.743+0000] {task_command.py:410} INFO - Running <TaskInstance: Run_Fabric_Item.run_fabric_pipeline manual__2024-07-31T14:04:38.329715+00:00 [running]> on host af-8a562d6a58184d509fe8af27d6267e64-worker-0.af-8a562d6a58184d509fe8af27d6267e64-worker.adf.svc.cluster.local
[2024-07-31T14:04:41.172+0000] {taskinstance.py:1545} INFO - Exporting env vars: AIRFLOW_CTX_DAG_OWNER='airflow' AIRFLOW_CTX_DAG_ID='Run_Fabric_Item' AIRFLOW_CTX_TASK_ID='run_fabric_pipeline' AIRFLOW_CTX_EXECUTION_DATE='2024-07-31T14:04:38.329715+00:00' AIRFLOW_CTX_TRY_NUMBER='1' AIRFLOW_CTX_DAG_RUN_ID='manual__2024-07-31T14:04:38.329715+00:00'
[2024-07-31T14:04:41.285+0000] {base.py:73} INFO - Using connection ID 'fabric_default' for task execution.
[2024-07-31T14:04:42.522+0000] {fabric.py:158} INFO - Deferring the task to wait for item run to complete.
[2024-07-31T14:04:42.597+0000] {taskinstance.py:1415} INFO - Pausing task as DEFERRED. dag_id=Run_Fabric_Item, task_id=run_fabric_pipeline, execution_date=20240731T140438, start_date=20240731T140440
[2024-07-31T14:04:42.665+0000] {local_task_job_runner.py:222} INFO - Task exited with return code 100 (task deferral)
[2024-07-31T14:04:43.829+0000] {logging_mixin.py:150} INFO - Changing /opt/airflow/logs/dag_id=Run_Fabric_Item/run_id=manual__2024-07-31T14:04:38.329715+00:00/task_id=run_fabric_pipeline permission to 509
[2024-07-31T14:04:44.004+0000] {logging_mixin.py:150} INFO - Changing /opt/airflow/logs/dag_id=Run_Fabric_Item/run_id=manual__2024-07-31T14:04:38.329715+00:00/task_id=run_fabric_pipeline permission to 509
[2024-07-31T14:04:44.109+0000] {taskinstance.py:1103} INFO - Dependencies all met for dep_context=non-requeueable deps ti=<TaskInstance: Run_Fabric_Item.run_fabric_pipeline manual__2024-07-31T14:04:38.329715+00:00 [queued]>
[2024-07-31T14:04:44.125+0000] {taskinstance.py:1103} INFO - Dependencies all met for dep_context=requeueable deps ti=<TaskInstance: Run_Fabric_Item.run_fabric_pipeline manual__2024-07-31T14:04:38.329715+00:00 [queued]>
[2024-07-31T14:04:44.130+0000] {taskinstance.py:1306} INFO - Resuming after deferral
[2024-07-31T14:04:44.148+0000] {taskinstance.py:1327} INFO - Executing <Task(FabricRunItemOperator): run_fabric_pipeline> on 2024-07-31 14:04:38.329715+00:00
[2024-07-31T14:04:44.153+0000] {standard_task_runner.py:57} INFO - Started process 53 to run task
[2024-07-31T14:04:44.158+0000] {standard_task_runner.py:84} INFO - Running: ['airflow', 'tasks', 'run', 'Run_Fabric_Item', 'run_fabric_pipeline', 'manual__2024-07-31T14:04:38.329715+00:00', '--job-id', '49', '--raw', '--subdir', 'DAGS_FOLDER/fabric_test.py', '--cfg-path', '/tmp/tmpzr7ui6kt']
[2024-07-31T14:04:44.159+0000] {standard_task_runner.py:85} INFO - Job 49: Subtask run_fabric_pipeline
[2024-07-31T14:04:44.241+0000] {warnings.py:109} WARNING - /home/airflow/.local/lib/python3.8/site-packages/airflow/settings.py:195: DeprecationWarning: The sql_alchemy_conn option in [core] has been moved to the sql_alchemy_conn option in [database] - the old setting has been used, but please update your config.
SQL_ALCHEMY_CONN = conf.get("database", "SQL_ALCHEMY_CONN")
[2024-07-31T14:04:44.412+0000] {logging_mixin.py:150} INFO - Changing /opt/airflow/logs/dag_id=Run_Fabric_Item/run_id=manual__2024-07-31T14:04:38.329715+00:00/task_id=run_fabric_pipeline permission to 509
[2024-07-31T14:04:44.434+0000] {task_command.py:410} INFO - Running <TaskInstance: Run_Fabric_Item.run_fabric_pipeline manual__2024-07-31T14:04:38.329715+00:00 [running]> on host af-8a562d6a58184d509fe8af27d6267e64-worker-0.af-8a562d6a58184d509fe8af27d6267e64-worker.adf.svc.cluster.local
[2024-07-31T14:04:44.848+0000] {taskinstance.py:1598} ERROR - Trigger failed:
Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.8/site-packages/airflow/jobs/triggerer_job_runner.py", line 686, in update_triggers
    trigger_class = self.get_trigger_by_classpath(new_trigger_orm.classpath)
  File "/home/airflow/.local/lib/python3.8/site-packages/airflow/jobs/triggerer_job_runner.py", line 727, in get_trigger_by_classpath
    self.trigger_cache[classpath] = import_string(classpath)
  File "/home/airflow/.local/lib/python3.8/site-packages/airflow/utils/module_loading.py", line 36, in import_string
    module = import_module(module_path)
  File "/usr/local/lib/python3.8/importlib/__init__.py", line 127, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
  File "<frozen importlib._bootstrap>", line 991, in _find_and_load
  File "<frozen importlib._bootstrap>", line 961, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
  File "<frozen importlib._bootstrap>", line 991, in _find_and_load
  File "<frozen importlib._bootstrap>", line 961, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
  File "<frozen importlib._bootstrap>", line 991, in _find_and_load
  File "<frozen importlib._bootstrap>", line 973, in _find_and_load_unlocked
ModuleNotFoundError: No module named 'airflow.providers.microsoft.powerbi'
[2024-07-31T14:04:44.921+0000] {taskinstance.py:1824} ERROR - Task failed with exception
airflow.exceptions.TaskDeferralError: Trigger failure
[2024-07-31T14:04:44.926+0000] {taskinstance.py:1345} INFO - Marking task as FAILED. dag_id=Run_Fabric_Item, task_id=run_fabric_pipeline, execution_date=20240731T140438, start_date=20240731T140440, end_date=20240731T140444
[2024-07-31T14:04:44.944+0000] {standard_task_runner.py:104} ERROR - Failed to execute job 49 for task run_fabric_pipeline (Trigger failure; 53)
[2024-07-31T14:04:44.972+0000] {local_task_job_runner.py:225} INFO - Task exited with return code 1
[2024-07-31T14:04:45.066+0000] {taskinstance.py:2653} INFO - 0 downstream tasks scheduled from follow-on schedule check
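From what the traceback shows, the failure happens in the triggerer, not the worker: when the FabricRunItemOperator defers, the triggerer process tries to import the trigger's classpath and can't find the module `airflow.providers.microsoft.powerbi`. That usually means the package providing that module is installed where the worker runs but not in the environment the triggerer runs in. As a quick check (a hedged sketch, assuming you can open a shell or run a script inside the triggerer pod/container; the module name is taken directly from the traceback above):

```python
# Run this inside the *triggerer* environment to confirm whether the
# trigger's module is importable there. The classpath below comes from
# the ModuleNotFoundError in the log.
import importlib

module = "airflow.providers.microsoft.powerbi"
try:
    importlib.import_module(module)
    print(f"{module} is importable - the triggerer should be able to load the trigger")
except ModuleNotFoundError as exc:
    # exc.name is the first missing package in the dotted path,
    # which tells you which distribution needs installing.
    print(f"missing module: {exc.name} - install the matching package in the triggerer image")
```

If the import fails there, the fix is to install the same provider/plugin package (whichever one supplies the FabricRunItemOperator and its trigger) into the triggerer's image or environment, so both worker and triggerer can resolve the same classpath.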