Category: airflow

I am trying to run my Python script via BashOperator, but it is throwing an error:

Running command: ['bash', '-c', 'python /opt/***/dags/scripts/Pscript.py']
[2021-06-16 14:58:19,148] {subprocess.py:75} INFO - Output:
[2021-06-16 14:58:35,107] {subprocess.py:83} INFO - Command exited with return code -9
[2021-06-16 14:58:36,460] {taskinstance.py:1481} ERROR - Task failed with exception

Source: Python..

Read more
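A return code of -9 means the script's process was killed by signal 9 (SIGKILL), which on Linux is most often the out-of-memory killer terminating a script that exhausts the worker's memory. Python's subprocess machinery (which the BashOperator uses under the hood) reports "killed by signal N" as a return code of -N. A self-contained sketch using only the standard library, no Airflow required:

```python
import signal
import subprocess
import sys

# Spawn a child that kills itself with SIGKILL, mimicking what the OOM
# killer does to a memory-hungry script. The parent then observes the
# same "-9" return code that appeared in the Airflow task log.
child_code = "import os, signal; os.kill(os.getpid(), signal.SIGKILL)"
proc = subprocess.run([sys.executable, "-c", child_code])
print(proc.returncode)  # -9 on POSIX systems
assert proc.returncode == -signal.SIGKILL
```

If the return code is -9, checking the worker's memory limits (or `dmesg` for OOM-killer messages) is usually the next step.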

I have an Airflow (2.1.0 stable) task:

Call_Shell_Script = SSHOperator(
    task_id='Call_Shell_Script',
    ssh_conn_id='ssh_remote_machine',
    command="sh /Users/user/Desktop/ip_to_text.sh "
)

which connects remotely to my desktop running macOS and tries to call a shell script. However, the following error occurred:

airflow.exceptions.AirflowException: SSH operator error: error running cmd: sh /Users/user/Desktop/ip_to_text.sh , error: sh: /Users/user/Desktop/ip_to_text.sh: Operation not permitted

I also tried ..

Read more

While the scheduler tries to parse a new *.py DAG file, I'm getting this error in the log file:

Process DagFileProcessor0-Process:
Traceback (most recent call last):
  File "/usr/lib64/python3.6/multiprocessing/process.py", line 258, in _bootstrap
    self.run()
  File "/usr/lib64/python3.6/multiprocessing/process.py", line 93, in run
    self._target(*self._args, **self._kwargs)
  File "/usr/local/lib/python3.6/site-packages/airflow/jobs/scheduler_job.py", line 187, in _run_file_processor
    callback_requests=callback_requests,
  File "/usr/local/lib/python3.6/site-packages/airflow/utils/session.py", line 70, in wrapper
    return func(*args, session=session, **kwargs) ..

Read more

So, I was changing the DAG of my work ETL when I found something like this:

# (…)
def build_spark_task(**kwargs) -> SparkTaskConfig:
    spark_task = SparkTaskConfig(…)
    # (…)
    return spark_task

def get_spark_k8s_tasks_tuple(**kwargs) -> (SparkKubernetesOperator, SparkKubernetesSensor):
    create_k8s_app = SparkKubernetesOperator(…)  # the task
    wait_app_completion = SparkKubernetesSensor(…)  # the task sensor
    # (…)
    return create_k8s_app, wait_app_completion

# Bronze layer ..

Read more
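The pattern in the question, pairing an operator that submits a Spark application with a sensor that waits for it and returning both as a tuple, can be sketched without Airflow installed. The classes below are hypothetical stand-ins for SparkKubernetesOperator and SparkKubernetesSensor, and the task-id naming is an assumption. Note that the conventional return annotation would be `Tuple[A, B]` rather than the bare `(A, B)` tuple literal in the snippet:

```python
from typing import Tuple

# Hypothetical stand-ins so this sketch runs without Airflow; in a real
# DAG these would come from the cncf.kubernetes provider package.
class SparkKubernetesOperator:
    def __init__(self, task_id: str) -> None:
        self.task_id = task_id

class SparkKubernetesSensor:
    def __init__(self, task_id: str) -> None:
        self.task_id = task_id

def get_spark_k8s_tasks_tuple(
    name: str,
) -> Tuple[SparkKubernetesOperator, SparkKubernetesSensor]:
    # One task submits the Spark application to Kubernetes; the sensor
    # polls the resulting application until it completes.
    create_k8s_app = SparkKubernetesOperator(task_id=f"{name}_submit")
    wait_app_completion = SparkKubernetesSensor(task_id=f"{name}_sensor")
    return create_k8s_app, wait_app_completion

submit, waiter = get_spark_k8s_tasks_tuple("bronze")
print(submit.task_id, waiter.task_id)  # bronze_submit bronze_sensor
```

Returning the pair lets the caller wire `create_k8s_app >> wait_app_completion` once per layer (bronze, silver, …) instead of repeating the boilerplate.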

I am trying to find a way to manage connection pools for external connections created in Airflow.

Airflow version: 2.1.0
Python version: 3.9.5
Airflow DB: SQLite
External connections created: MySQL and Snowflake

I know there are properties in the airflow.cfg file:

sql_alchemy_pool_enabled = True
sql_alchemy_pool_size = 5

But these properties are for ..

Read more
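As the question suspects, the `sql_alchemy_pool_*` settings only configure the SQLAlchemy pool for Airflow's own metadata database; they do not pool connections to external systems such as MySQL or Snowflake, which is left to your own code or to each provider's hook. A minimal application-level pool can be sketched with the standard library; `sqlite3` stands in here for an external database driver so the sketch runs anywhere, and the `ConnectionPool` class is illustrative, not an Airflow API:

```python
import queue
import sqlite3

class ConnectionPool:
    """A tiny fixed-size pool: pre-opens N connections and hands them out."""

    def __init__(self, factory, size: int = 5) -> None:
        self._pool: "queue.Queue" = queue.Queue(maxsize=size)
        for _ in range(size):
            self._pool.put(factory())

    def acquire(self):
        # Blocks until a connection is free, bounding concurrent use.
        return self._pool.get()

    def release(self, conn) -> None:
        self._pool.put(conn)

# sqlite3 plays the role of a MySQL/Snowflake driver in this sketch.
pool = ConnectionPool(lambda: sqlite3.connect(":memory:"), size=2)
conn = pool.acquire()
print(conn.execute("SELECT 1").fetchone())  # (1,)
pool.release(conn)
```

In practice a production pool would come from SQLAlchemy's pooling layer or the database driver itself; the point of the sketch is that it lives outside airflow.cfg.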