
I have been using multiprocessing to split a time-consuming task into smaller independent subtasks, each of which stores its results in a file. To manage concurrent writes to the file, I created a queue and a dedicated process that is the only one to access the file, avoiding collisions, as suggested in Python multiprocessing safely writing to a file.

All this works fine. However, when I try to run the task from a Flask application using Celery, it throws the following exception:

File "../utils/task_management.py", line 74, in task_management
    manager = mp.Manager()
  File "/usr/lib64/python3.6/multiprocessing/context.py", line 56, in Manager
    m.start()
  File "/usr/lib64/python3.6/multiprocessing/managers.py", line 513, in start
    self._process.start()
  File "/usr/lib64/python3.6/multiprocessing/process.py", line 103, in start
    'daemonic processes are not allowed to have children'
AssertionError: daemonic processes are not allowed to have children

This is the task_management.py code:

import multiprocessing as mp


def write_file(sub_task_list):
    manager = mp.Manager()
    q = manager.Queue()
    pool = mp.Pool(mp.cpu_count())
    # Dedicated listener process owns the output file
    watcher = pool.apply_async(listener, (q,))
    results = pool.map(sub_task_handler, sub_task_list)
    ...


def listener(q):
    output_file = './output/'
    with open(output_file, 'w') as f:
        while True:
            m = q.get()
            if m == 'kill':
                break
            f.write(m)
            f.flush()

Is there any workaround I could use to make the process started by the multiprocessing Manager non-daemonic?

Thanks a lot!

Maria
  • [This thread](https://stackoverflow.com/questions/6974695/python-process-pool-non-daemonic) looks like a similar problem and may be of some help. – sj95126 Aug 20 '21 at 03:54
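For reference, the workaround discussed in that thread overrides the `daemon` flag so a pool worker may spawn children. A minimal sketch of the idea (untested with Celery; the class name is just illustrative):

```python
import multiprocessing

class NoDaemonProcess(multiprocessing.Process):
    # Always report daemon=False so this process is allowed to have children.
    @property
    def daemon(self):
        return False

    @daemon.setter
    def daemon(self, value):
        pass  # silently ignore attempts to make the process daemonic

p = NoDaemonProcess(target=print)
p.daemon = True       # ignored by the setter above
print(p.daemon)       # still False
```

A pool can then be made to use this class by subclassing `multiprocessing.pool.Pool` with a context whose `Process` attribute is `NoDaemonProcess`, as shown in the linked thread.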

0 Answers