Spark does not start all workers

When I start Spark, only one worker comes up. I created three workers; the master node started successfully, but only one of the three workers appears.

On the master node:

```shell
[hadoop@master001 ~]$ /home/hadoop/software/spark-2.4.5/sbin/start-all.sh
starting org.apache.spark.deploy.master.Master, logging to /home/hadoop/software/spark-2.4.5/logs/spark-hadoop-org.apache.spark.deploy.master.Master-1-master001.out
Slave001: starting org.apache.spark.deploy.worker.Worker, logging to /home/hadoop/software/spark-2.4.5/logs/spark-hadoop-org.apache.spark.deploy.worker.Worker-1-Slave001.out
Slave003: starting org.apache.spark.deploy.worker.Worker, logging to /home/hadoop/software/spark-2.4.5/logs/spark-hadoop-org.apache.spark.deploy.worker.Worker-1-Slave003.out
Slave002: starting org.apache.spark.deploy.worker.Worker, logging to /home/hadoop/software/spark-2.4.5/logs/spark-hadoop-org.apache.spark.deploy.worker.Worker-1-Slave002.out
[hadoop@master001 ~]$ jps
3392 Jps
1541 SecondaryNameNode
1685 ResourceManager
3335 Master
1357 NameNode

```

On the node that started successfully:

```shell
[hadoop@Slave001 software]$ jps
2610 Jps
2564 Worker
1673 NodeManager

```

The web UI also shows only one worker, even though all three VMs are listed in the slaves file.
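For reference, a minimal conf/slaves matching this setup would list one worker hostname per line; the hostnames below come from the start-all.sh output above, and the file path is assumed from the install directory shown there:

```shell
# /home/hadoop/software/spark-2.4.5/conf/slaves  (path assumed from the install dir above)
# one worker hostname per line, taken from the start-all.sh output
Slave001
Slave002
Slave003
```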


I want all three workers to connect.

Check the startup logs on the nodes where the worker did not come up to see what error was reported.
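For example, on one of the nodes whose worker did not appear (the log path below is the one start-all.sh printed for Slave002):

```shell
# log in to the failing node
ssh hadoop@Slave002

# check whether a Worker process is running at all
jps

# inspect the worker log that start-all.sh pointed to
tail -n 50 /home/hadoop/software/spark-2.4.5/logs/spark-hadoop-org.apache.spark.deploy.worker.Worker-1-Slave002.out
```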