Replies: 1 comment
-
It seems to be tied to overloading the pod. I gave it more resources and reduced max_workers to a lower number, and things are a little more stable. If we send a SIGTERM, shouldn't the pod be killed?
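For context, a minimal sketch of what "reduced max_workers to a lower number" can look like: capping the worker count explicitly instead of relying on the default, since os.cpu_count() reflects the node rather than the container's resource limits. The helper name and the cap of 2 are illustrative assumptions, not from the original thread.

```python
import os
from concurrent.futures import ProcessPoolExecutor


def conservative_worker_count(cap=2):
    # os.cpu_count() sees the host's CPUs, not the pod's CPU limit, so an
    # explicit cap avoids oversubscribing a small pod. The cap of 2 is an
    # illustrative assumption.
    return min(cap, os.cpu_count() or 1)


def run_in_pool(items, work_fn):
    # Bounded pool; the "with" block shuts the workers down before returning.
    with ProcessPoolExecutor(max_workers=conservative_worker_count()) as pool:
        return list(pool.map(work_fn, items))
```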
-
Airflow Version 1.10.13
I am trying to use the concurrent.futures module within a task, and every time after the first task it sends SIGTERM, so the tasks get stuck in a running state with this error:
[2020-12-10 18:25:24,749] {taskinstance.py:955} ERROR - Received SIGTERM. Terminating subprocesses.
[2020-12-10 18:25:24,749] {taskinstance.py:955} ERROR - Received SIGTERM. Terminating subprocesses.
[2020-12-10 18:25:24,750] {logging_mixin.py:112} WARNING - Process ForkProcess-58:
[2020-12-10 18:25:24,750] {taskinstance.py:955} ERROR - Received SIGTERM. Terminating subprocesses.
[2020-12-10 18:25:24,750] {logging_mixin.py:112} WARNING - Process ForkProcess-59:
[2020-12-10 18:25:24,752] {logging_mixin.py:112} WARNING - Traceback (most recent call last):
[2020-12-10 18:25:24,752] {logging_mixin.py:112} WARNING - File "/usr/lib64/python3.8/multiprocessing/process.py", line 313, in _bootstrap
self.run()
[2020-12-10 18:25:24,752] {logging_mixin.py:112} WARNING - File "/usr/lib64/python3.8/multiprocessing/process.py", line 108, in run
self._target(*self._args, **self._kwargs)
[2020-12-10 18:25:24,752] {logging_mixin.py:112} WARNING - File "/usr/lib64/python3.8/concurrent/futures/process.py", line 233, in _process_worker
call_item = call_queue.get(block=True)
[2020-12-10 18:25:24,752] {logging_mixin.py:112} WARNING - File "/usr/lib64/python3.8/multiprocessing/queues.py", line 96, in get
with self._rlock:
[2020-12-10 18:25:24,752] {logging_mixin.py:112} WARNING - File "/usr/lib64/python3.8/multiprocessing/synchronize.py", line 95, in enter
return self._semlock.enter()
[2020-12-10 18:25:24,752] {logging_mixin.py:112} WARNING - File "/opt/app-root/lib64/python3.8/site-packages/airflow/models/taskinstance.py", line 957, in signal_handler
raise AirflowException("Task received SIGTERM signal")
[2020-12-10 18:25:24,752] {logging_mixin.py:112} WARNING - airflow.exceptions.AirflowException: Task received SIGTERM signal
Is there any way to use ThreadPoolExecutor and ProcessPoolExecutor within tasks?
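A minimal sketch of one way to do this on Airflow 1.10.x: create a bounded ThreadPoolExecutor inside the PythonOperator callable (never at module level, where DAG parsing would repeatedly spawn pools), and let the `with` block shut the pool down before the task exits. The DAG id, task id, and fetch_url helper are hypothetical placeholders; only the PythonOperator import path and DAG arguments are standard 1.10.x API.

```python
from datetime import datetime
from concurrent.futures import ThreadPoolExecutor, as_completed

from airflow import DAG
from airflow.operators.python_operator import PythonOperator  # 1.10.x import path


def fetch_url(url):
    # Placeholder for the real per-item work.
    return len(url)


def fan_out(**context):
    urls = ["https://example.com/a", "https://example.com/b"]
    results = []
    # A small, explicit max_workers keeps the task under the pod's resource
    # limits; the "with" block guarantees workers are joined before the task
    # returns, so nothing is left running when Airflow tears the process down.
    with ThreadPoolExecutor(max_workers=2) as pool:
        futures = [pool.submit(fetch_url, u) for u in urls]
        for fut in as_completed(futures):
            results.append(fut.result())
    return results


dag = DAG(
    dag_id="futures_example",
    start_date=datetime(2020, 12, 1),
    schedule_interval=None,
    catchup=False,
)

fan_out_task = PythonOperator(
    task_id="fan_out",
    python_callable=fan_out,
    provide_context=True,
    dag=dag,
)
```

This is only a sketch under those assumptions; the key choices are keeping the pool inside the callable and keeping max_workers small, in line with the reply above about lowering the worker count on a constrained pod.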