Python rq module: how to handle logging inside a Job when a worker executes it



Producer side

async def parse_list(self, key, page):
    start = time.time()
    url, headers = await self.get_list_url_headers(key, page)
    my_job = create_job(func_salve_run, key, page, url, headers)
    logger.debug(f'queued job: "{my_job.get_id()}"')

def create_job(_func, *args):
    # enqueue the function call onto the RQ queue; a worker process picks it up later
    _job = queue.enqueue(_func, *args)
    return _job
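
For reference, the queue object used above is assumed to come from a standard RQ setup along these lines (the Redis parameters and the queue name are placeholders, not my actual config):

from redis import Redis
from rq import Queue

redis_conn = Redis(host='localhost', port=6379)  # placeholder connection settings
queue = Queue('crawl', connection=redis_conn)    # 'crawl' is a placeholder queue name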



Consumer side (job.py)

import os
import json
from loguru import logger

basepath = os.path.abspath(os.path.dirname(__file__))
logger.add(f'{basepath}/logs/Crawl.log', format="{time:YYYY-MM-DD HH:mm:ss} | {level} | {message}", level="DEBUG", retention='5 days')

def func_salve_run(key, page, url, headers):
    resp = fetch(url, headers=headers)
    logger.debug(f'{key} / {page} crawl success')
    itemsArray = json.loads(resp)['data']['itemsArray']
    items = crawl_pipeline(itemsArray, key, page, coll)
    logger.debug(f'{itemsArray}')
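
The workers themselves are separate OS processes (several are started with RQ's rq worker command, or programmatically); roughly equivalent to this sketch, where the queue name and connection settings are placeholders:

from redis import Redis
from rq import Queue, Worker

# each worker runs as its own process; several of these run at the same time
redis_conn = Redis()
worker = Worker([Queue('crawl', connection=redis_conn)], connection=redis_conn)
worker.work()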



What I want to know is: when this runs in the back end, how do I make sure that multiple workers can write to the same log file without running into lock conflicts?
Or, alternatively, how can each worker collect the log output of the Jobs it runs and write it to its own separate file?
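
To make the question concrete, these are the two directions I mean, written as an untested sketch on top of the job.py config above; loguru's enqueue flag and the PID-based filename are assumptions I have not verified against how RQ forks processes:

# Option 1: keep a single shared file and let loguru serialize writes through an internal queue
logger.add(f'{basepath}/logs/Crawl.log',
           format="{time:YYYY-MM-DD HH:mm:ss} | {level} | {message}",
           level="DEBUG", retention='5 days',
           enqueue=True)

# Option 2: give every worker process its own file, keyed on the process id
logger.add(f'{basepath}/logs/Crawl_{os.getpid()}.log',
           format="{time:YYYY-MM-DD HH:mm:ss} | {level} | {message}",
           level="DEBUG", retention='5 days')

I am not sure either of these is right, in particular because RQ's default worker forks a work-horse process for each job, so I do not know whether Option 2 ends up with one file per worker or one file per job.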

https://www.cnblogs.com/gudaojuanma/p/Python-RQ-Wokers.html