I'm using a logger inside a Python process (multiprocessing.Process) to log. I declare a FileHandler to log to disk; once the job is done I try to close the underlying file, but I can't manage it. My real problem is that I spawn a lot of processes, resulting in IOError: [Errno 24] Too many open files.
I can reproduce the error with this snippet (test_process.py):
    import logging
    import multiprocessing

    class TestProcess(multiprocessing.Process):

        def __init__(self):
            multiprocessing.Process.__init__(self)
            self.logger = logging.getLogger('test')
            self.logger.setLevel(logging.INFO)
            self.handler = logging.FileHandler('test.log')
            self.handler.setLevel(logging.INFO)
            formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
            self.handler.setFormatter(formatter)
            self.logger.addHandler(self.handler)

        def run(self):
            self.logger.info('hello')
            self.logger.removeHandler(self.handler)
            self.handler.close()
and when I run this:
    from test_process import TestProcess

    p = TestProcess()
    p.start()
    p.join()
you can check that a file descriptor for the file test.log is still open, even after the handler has been closed.
Any hint on how to overcome this behaviour?
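For reference, a variant that creates the FileHandler inside run(), so that the file is only ever opened in the child process and never in the parent, does not seem to leave a descriptor behind. This is only a sketch of the workaround I'm considering, not necessarily the right fix:

```python
import logging
import multiprocessing

class TestProcess(multiprocessing.Process):

    def run(self):
        # Everything is set up here, inside the child process, so the
        # parent process never opens test.log at all.
        logger = logging.getLogger('test')
        logger.setLevel(logging.INFO)
        handler = logging.FileHandler('test.log')
        handler.setLevel(logging.INFO)
        formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
        handler.setFormatter(formatter)
        logger.addHandler(handler)

        logger.info('hello')

        # Closing here closes the only copy of the descriptor.
        logger.removeHandler(handler)
        handler.close()

if __name__ == '__main__':
    p = TestProcess()
    p.start()
    p.join()
```

With the original code the handler is created in __init__, which runs in the parent, so I suspect the parent keeps its own copy of the descriptor that the child's close() cannot touch.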