Python: starting a large number of processes (1,000) is unstable, throws errors, and even crashes, even though resources look sufficient

Question:
In Python, when I try to start 1,000 processes, errors and instability appear after only about 50 have started; it never gets anywhere near 1,000. Yet in a virtual machine on this same computer (and on another machine with an Intel CPU), both with far weaker specs, 1,000 processes start perfectly stably. Is something wrong with the installed OS, or is this related to the CPU?

This PC:
CPU: AMD R9 3900X, 12 cores / 24 threads; RAM: 32 GB (roughly 20 GB actually available)
Virtual machine running on this PC:
CPU: 4 cores; RAM: 10 GB (roughly 8 GB actually available)
The other PC:
CPU: Intel i7-4710MQ, 2 cores; RAM: 8 GB (roughly 4 GB actually available)

The code:

from multiprocessing import Process
import time


def calc(name):
    a = 0
    while a < 5000:  # a little busy work
        a += 1
    print(name)
    time.sleep(1000)  # park the child so all 1,000 processes stay alive at once


if __name__ == '__main__':
    time1 = time.time()
    px_list = []
    for i in range(1000):
        px = Process(target=calc, args=("p%s" % i,))
        px.start()
        px_list.append(px)
    for i in px_list:
        i.join()

    print("spend: %s" % (time.time() - time1))

The error output is below. It complains about insufficient memory even though actual memory usage stays under 50%, and the errors and instability begin after only about 50 processes, nowhere near 1,000. Because children fail while they are still being spawned, the parent's traceback is mixed with tracebacks printed by the dying children (the capture also appears to be from a slightly different run of the script, hence the px_j name and the concurrent.futures import):

Parent process, where the start() call fails:

Traceback (most recent call last):
    px_j[i].start()
  File "C:\Users\l\anaconda3\envs\AiLearning\lib\multiprocessing\process.py", line 121, in start
    self._popen = self._Popen(self)
  File "C:\Users\l\anaconda3\envs\AiLearning\lib\multiprocessing\context.py", line 224, in _Popen
    return _default_context.get_context().Process._Popen(process_obj)
  File "C:\Users\l\anaconda3\envs\AiLearning\lib\multiprocessing\context.py", line 327, in _Popen
    return Popen(process_obj)
  File "C:\Users\l\anaconda3\envs\AiLearning\lib\multiprocessing\popen_spawn_win32.py", line 73, in __init__
    hp, ht, pid, tid = _winapi.CreateProcess(
OSError: [WinError 1450] Insufficient system resources exist to complete the requested service.

One spawned child dies inside the frozen import machinery (these frames carry no source lines):

  File "<frozen importlib._bootstrap>", line 991, in _find_and_load
  File "<frozen importlib._bootstrap>", line 975, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 671, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 844, in exec_module
  File "<frozen importlib._bootstrap_external>", line 939, in get_code
  File "<frozen importlib._bootstrap_external>", line 1037, in get_data
MemoryError

Another spawned child dies while re-importing the main script during spawn:

  File "C:\Users\l\anaconda3\envs\AiLearning\lib\multiprocessing\spawn.py", line 116, in spawn_main
    exitcode = _main(fd, parent_sentinel)
  File "C:\Users\l\anaconda3\envs\AiLearning\lib\multiprocessing\spawn.py", line 125, in _main
    prepare(preparation_data)
  File "C:\Users\l\anaconda3\envs\AiLearning\lib\multiprocessing\spawn.py", line 236, in prepare
    _fixup_main_from_path(data['init_main_from_path'])
  File "C:\Users\l\anaconda3\envs\AiLearning\lib\multiprocessing\spawn.py", line 287, in _fixup_main_from_path
    main_content = runpy.run_path(main_path,
  File "C:\Users\l\anaconda3\envs\AiLearning\lib\runpy.py", line 265, in run_path
    return _run_module_code(code, init_globals, run_name,
  File "C:\Users\l\anaconda3\envs\AiLearning\lib\runpy.py", line 97, in _run_module_code
    _run_code(code, mod_globals, init_globals,
  File "C:\Users\l\anaconda3\envs\AiLearning\lib\runpy.py", line 87, in _run_code
    exec(code, run_globals)
  File "E:\INDO_ROOT\test3.py", line 5, in <module>
    from concurrent.futures import ProcessPoolExecutor
  File "C:\Users\l\anaconda3\envs\AiLearning\lib\concurrent\futures\__init__.py", line 44, in __getattr__
    from .process import ProcessPoolExecutor as pe
  File "C:\Users\l\anaconda3\envs\AiLearning\lib\concurrent\futures\process.py", line 54, in <module>
    import multiprocessing.connection
  File "C:\Users\l\anaconda3\envs\AiLearning\lib\multiprocessing\connection.py", line 21, in <module>
    import _multiprocessing
ImportError: DLL load failed while importing _multiprocessing: The paging file is too small for this operation to complete.
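
The [WinError 1450] and paging-file errors both refer to the system commit limit (physical RAM plus page file), not to physical RAM alone, which is roughly what the Task Manager memory column shows. A small sketch, using the standard Win32 GlobalMemoryStatusEx call through ctypes (nothing here is from the original post), prints the commit numbers so they can be watched while the children start:

import ctypes
from ctypes import wintypes


class MEMORYSTATUSEX(ctypes.Structure):
    _fields_ = [
        ("dwLength", wintypes.DWORD),
        ("dwMemoryLoad", wintypes.DWORD),
        ("ullTotalPhys", ctypes.c_uint64),
        ("ullAvailPhys", ctypes.c_uint64),
        ("ullTotalPageFile", ctypes.c_uint64),   # commit limit: RAM + page file
        ("ullAvailPageFile", ctypes.c_uint64),   # commit still available
        ("ullTotalVirtual", ctypes.c_uint64),
        ("ullAvailVirtual", ctypes.c_uint64),
        ("ullAvailExtendedVirtual", ctypes.c_uint64),
    ]


def commit_info():
    stat = MEMORYSTATUSEX()
    stat.dwLength = ctypes.sizeof(stat)
    ctypes.windll.kernel32.GlobalMemoryStatusEx(ctypes.byref(stat))
    gb = 1024 ** 3
    print("physical RAM total/avail: %.1f / %.1f GB"
          % (stat.ullTotalPhys / gb, stat.ullAvailPhys / gb))
    print("commit limit/available:   %.1f / %.1f GB"
          % (stat.ullTotalPageFile / gb, stat.ullAvailPageFile / gb))


if __name__ == '__main__':
    commit_info()

If the "commit available" number collapses while the children are starting even though physical RAM stays under 50%, the behaviour matches hitting the commit limit rather than running out of RAM.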

Task Manager screenshot on this PC:

[screenshot]

Task Manager screenshot in the virtual machine:

[screenshot]

A workaround is to throttle process creation, sleeping briefly after each start():

if __name__ == '__main__':
    for i in range(1000):
        ...
        time.sleep(.013)  # time.sleep(.1) or time.sleep(.2) or ...
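
A fuller version of that idea, as a sketch only (the retry cap, the 1-second back-off, and the 0.013 s throttle are illustrative choices, not from the original post): pause briefly between start() calls and retry a child a few times if Windows rejects the CreateProcess call:

from multiprocessing import Process
import time


def calc(name):
    a = 0
    while a < 5000:
        a += 1
    print(name)
    time.sleep(1000)


if __name__ == '__main__':
    px_list = []
    for i in range(1000):
        px = Process(target=calc, args=("p%s" % i,))
        for attempt in range(20):
            try:
                px.start()
                break
            except OSError:
                # Windows refused CreateProcess (e.g. WinError 1450);
                # back off and try this child again.
                time.sleep(1)
        else:
            print("giving up at child %d" % i)
            break
        px_list.append(px)
        time.sleep(.013)  # throttle creation so commits don't all spike at once
    print("started %d children" % len(px_list))
    for p in px_list:
        p.join()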