Error when installing Scrapy with pip

Symptoms and background

Installing Scrapy with pip fails with ERROR: Exception. What should I do?

Relevant code (please do not paste screenshots)

(spider) D:\qq\pythonProject4>pip install scrapy

Output and error message

Collecting scrapy
Downloading Scrapy-2.6.2-py2.py3-none-any.whl (264 kB)
|| 264 kB 23 kB/s
Collecting pyOpenSSL>=16.2.0
Downloading pyOpenSSL-22.0.0-py2.py3-none-any.whl (55 kB)
55 kB 17 kB/s
Collecting protego>=0.1.15
Downloading Protego-0.2.1-py2.py3-none-any.whl (8.2 kB)
Collecting Twisted>=17.9.0
Downloading Twisted-22.4.0-py3-none-any.whl (3.1 MB)
| 122 kB 7.8 kB/s eta 0:06:20
ERROR: Exception:
Traceback (most recent call last):
  File "D:\software\envs\spider\lib\site-packages\pip\_vendor\urllib3\response.py", line 438, in _error_catcher
    yield
  File "D:\software\envs\spider\lib\site-packages\pip\_vendor\urllib3\response.py", line 519, in read
    data = self._fp.read(amt) if not fp_closed else b""
  File "D:\software\envs\spider\lib\site-packages\pip\_vendor\cachecontrol\filewrapper.py", line 62, in read
    data = self.__fp.read(amt)
  File "D:\software\envs\spider\lib\http\client.py", line 447, in read
    n = self.readinto(b)
  File "D:\software\envs\spider\lib\http\client.py", line 491, in readinto
    n = self.fp.readinto(b)
  File "D:\software\envs\spider\lib\socket.py", line 589, in readinto
    return self._sock.recv_into(b)
  File "D:\software\envs\spider\lib\ssl.py", line 1052, in recv_into
    return self.read(nbytes, buffer)
  File "D:\software\envs\spider\lib\ssl.py", line 911, in read
    return self._sslobj.read(len, buffer)
socket.timeout: The read operation timed out

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "D:\software\envs\spider\lib\site-packages\pip\_internal\cli\base_command.py", line 173, in _main
    status = self.run(options, args)
  File "D:\software\envs\spider\lib\site-packages\pip\_internal\cli\req_command.py", line 203, in wrapper
    return func(self, options, args)
  File "D:\software\envs\spider\lib\site-packages\pip\_internal\commands\install.py", line 316, in run
    reqs, check_supported_wheels=not options.target_dir
  File "D:\software\envs\spider\lib\site-packages\pip\_internal\resolution\resolvelib\resolver.py", line 95, in resolve
    collected.requirements, max_rounds=try_to_avoid_resolution_too_deep
  File "D:\software\envs\spider\lib\site-packages\pip\_vendor\resolvelib\resolvers.py", line 472, in resolve
    state = resolution.resolve(requirements, max_rounds=max_rounds)
  File "D:\software\envs\spider\lib\site-packages\pip\_vendor\resolvelib\resolvers.py", line 366, in resolve
    failure_causes = self._attempt_to_pin_criterion(name)
  File "D:\software\envs\spider\lib\site-packages\pip\_vendor\resolvelib\resolvers.py", line 212, in _attempt_to_pin_criterion
    criteria = self._get_updated_criteria(candidate)
  File "D:\software\envs\spider\lib\site-packages\pip\_vendor\resolvelib\resolvers.py", line 203, in _get_updated_criteria
    self._add_to_criteria(criteria, requirement, parent=candidate)
  File "D:\software\envs\spider\lib\site-packages\pip\_vendor\resolvelib\resolvers.py", line 172, in _add_to_criteria
    if not criterion.candidates:
  File "D:\software\envs\spider\lib\site-packages\pip\_vendor\resolvelib\structs.py", line 151, in __bool__
    return bool(self._sequence)
  File "D:\software\envs\spider\lib\site-packages\pip\_internal\resolution\resolvelib\found_candidates.py", line 140, in __bool__
    return any(self)
  File "D:\software\envs\spider\lib\site-packages\pip\_internal\resolution\resolvelib\found_candidates.py", line 128, in <genexpr>
    return (c for c in iterator if id(c) not in self._incompatible_ids)
  File "D:\software\envs\spider\lib\site-packages\pip\_internal\resolution\resolvelib\found_candidates.py", line 32, in _iter_built
    candidate = func()
  File "D:\software\envs\spider\lib\site-packages\pip\_internal\resolution\resolvelib\factory.py", line 209, in _make_candidate_from_link
    version=version,
  File "D:\software\envs\spider\lib\site-packages\pip\_internal\resolution\resolvelib\candidates.py", line 301, in __init__
    version=version,
  File "D:\software\envs\spider\lib\site-packages\pip\_internal\resolution\resolvelib\candidates.py", line 156, in __init__
    self.dist = self._prepare()
  File "D:\software\envs\spider\lib\site-packages\pip\_internal\resolution\resolvelib\candidates.py", line 227, in _prepare
    dist = self._prepare_distribution()
  File "D:\software\envs\spider\lib\site-packages\pip\_internal\resolution\resolvelib\candidates.py", line 306, in _prepare_distribution
    self._ireq, parallel_builds=True
  File "D:\software\envs\spider\lib\site-packages\pip\_internal\operations\prepare.py", line 508, in prepare_linked_requirement
    return self._prepare_linked_requirement(req, parallel_builds)
  File "D:\software\envs\spider\lib\site-packages\pip\_internal\operations\prepare.py", line 552, in _prepare_linked_requirement
    self.download_dir, hashes
  File "D:\software\envs\spider\lib\site-packages\pip\_internal\operations\prepare.py", line 243, in unpack_url
    hashes=hashes,
  File "D:\software\envs\spider\lib\site-packages\pip\_internal\operations\prepare.py", line 102, in get_http_url
    from_path, content_type = download(link, temp_dir.path)
  File "D:\software\envs\spider\lib\site-packages\pip\_internal\network\download.py", line 145, in __call__
    for chunk in chunks:
  File "D:\software\envs\spider\lib\site-packages\pip\_internal\cli\progress_bars.py", line 144, in __iter__
    for x in it:
  File "D:\software\envs\spider\lib\site-packages\pip\_internal\network\utils.py", line 87, in response_chunks
    decode_content=False,
  File "D:\software\envs\spider\lib\site-packages\pip\_vendor\urllib3\response.py", line 576, in stream
    data = self.read(amt=amt, decode_content=decode_content)
  File "D:\software\envs\spider\lib\site-packages\pip\_vendor\urllib3\response.py", line 541, in read
  File "D:\software\envs\spider\lib\site-packages\pip\_vendor\urllib3\response.py", line 443, in _error_catcher
    raise ReadTimeoutError(self._pool, None, "Read timed out.")
pip._vendor.urllib3.exceptions.ReadTimeoutError: HTTPSConnectionPool(host='files.pythonhosted.org', port=443): Read timed out.

My approach and what I have tried

The result I want: how do I fix this?

Try this:

pip install scrapy -i https://pypi.tuna.tsinghua.edu.cn/simple

This is a network failure: pip timed out while downloading from files.pythonhosted.org. Switching to a PyPI mirror usually solves it.
How to switch mirrors:
On Windows, create a pip folder under the %APPDATA% directory, then create a pip.ini file inside that folder.
(If the folder or file already exists on that path, there is no need to create it. You can run echo %APPDATA% in cmd to see which path %APPDATA% points to, or type %APPDATA% into the Explorer address bar to jump straight there.)
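If you prefer to drive pip from a script, the one-off mirror install can be sketched in Python. The Tsinghua mirror URL and the 60-second timeout are example values, not required settings:

```python
import subprocess
import sys

# Build a pip command that uses a mirror and a longer network timeout.
# Adjust the package name and mirror URL to your situation.
cmd = [
    sys.executable, "-m", "pip", "install", "scrapy",
    "-i", "https://pypi.tuna.tsinghua.edu.cn/simple",
    "--default-timeout", "60",
]
print(" ".join(cmd))
# subprocess.run(cmd, check=True)  # uncomment to actually run the install
```

Raising --default-timeout alone sometimes helps on a slow but working connection; combining it with a nearby mirror is the more reliable fix.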
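As a sanity check, the path pip will read that file from can be computed in Python. The fallback to the home directory when %APPDATA% is unset is an assumption for non-Windows shells (pip itself reads ~/.config/pip/pip.conf there):

```python
import os

# Locate pip's per-user config file on Windows: %APPDATA%\pip\pip.ini.
appdata = os.environ.get("APPDATA", os.path.expanduser("~"))
ini_path = os.path.join(appdata, "pip", "pip.ini")
print(ini_path)
```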

Once you have pip.ini, edit it so it looks like the following.
This example uses the Aliyun mirror; you can substitute any other mirror:

[global]
index-url = http://mirrors.aliyun.com/pypi/simple
[install]
trusted-host=mirrors.aliyun.com
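The same file can be generated with Python's configparser. This sketch writes to a temporary directory for safety; point ini_path at %APPDATA%\pip\pip.ini to apply it for real:

```python
import configparser
import os
import tempfile

# Write the mirror config shown above to a pip.ini in a temp directory.
ini_path = os.path.join(tempfile.mkdtemp(), "pip.ini")
config = configparser.ConfigParser()
config["global"] = {"index-url": "http://mirrors.aliyun.com/pypi/simple"}
# trusted-host is needed because this mirror URL uses plain http.
config["install"] = {"trusted-host": "mirrors.aliyun.com"}
with open(ini_path, "w") as f:
    config.write(f)

# Read the file back to confirm the settings were written correctly.
check = configparser.ConfigParser()
check.read(ini_path)
print(check["global"]["index-url"])
```

After the real file is in place, every pip install resolves packages through the mirror without needing the -i flag.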