Spark on YARN job fails with "Spark context stopped while waiting for backend"

Problem symptoms and background

On a fully distributed Hadoop cluster, running Spark on YARN fails.

Relevant code (no screenshots)
[root@master logs]# spark-submit --master local --class org.apache.spark.examples.SparkPi /usr/local/src/spark/examples/jars/spark-examples_2.11-2.0.0.jar 40 | grep Pi

This runs correctly and prints the value of Pi.
Submitting the same example with --master yarn, however, fails:

[root@master logs]# spark-submit --master yarn --class org.apache.spark.examples.SparkPi /usr/local/src/spark/examples/jars/spark-examples_2.11-2.0.0.jar 1 | grep Pi
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/src/spark/jars/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/src/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
22/04/09 17:11:54 INFO spark.SparkContext: Running Spark version 2.0.0
22/04/09 17:11:54 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
22/04/09 17:11:54 INFO spark.SecurityManager: Changing view acls to: root
22/04/09 17:11:54 INFO spark.SecurityManager: Changing modify acls to: root
22/04/09 17:11:54 INFO spark.SecurityManager: Changing view acls groups to: 
22/04/09 17:11:54 INFO spark.SecurityManager: Changing modify acls groups to: 
22/04/09 17:11:54 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(root); groups with view permissions: Set(); users  with modify permissions: Set(root); groups with modify permissions: Set()
22/04/09 17:11:55 INFO util.Utils: Successfully started service 'sparkDriver' on port 32920.
22/04/09 17:11:55 INFO spark.SparkEnv: Registering MapOutputTracker
22/04/09 17:11:55 INFO spark.SparkEnv: Registering BlockManagerMaster
22/04/09 17:11:55 INFO storage.DiskBlockManager: Created local directory at /tmp/blockmgr-aafaa5f8-8fa3-44bd-a070-592253eb9fdb
22/04/09 17:11:55 INFO memory.MemoryStore: MemoryStore started with capacity 413.9 MB
22/04/09 17:11:55 INFO spark.SparkEnv: Registering OutputCommitCoordinator
22/04/09 17:11:55 INFO util.log: Logging initialized @2768ms
22/04/09 17:11:55 INFO server.Server: jetty-9.2.z-SNAPSHOT
22/04/09 17:11:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@62417a16{/jobs,null,AVAILABLE}
22/04/09 17:11:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@32057e6{/jobs/json,null,AVAILABLE}
22/04/09 17:11:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@26be6ca7{/jobs/job,null,AVAILABLE}
22/04/09 17:11:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6ea1bcdc{/jobs/job/json,null,AVAILABLE}
22/04/09 17:11:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@759fad4{/stages,null,AVAILABLE}
22/04/09 17:11:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@64712be{/stages/json,null,AVAILABLE}
22/04/09 17:11:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@53499d85{/stages/stage,null,AVAILABLE}
22/04/09 17:11:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@30ed9c6c{/stages/stage/json,null,AVAILABLE}
22/04/09 17:11:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@782a4fff{/stages/pool,null,AVAILABLE}
22/04/09 17:11:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@46c670a6{/stages/pool/json,null,AVAILABLE}
22/04/09 17:11:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@59fc684e{/storage,null,AVAILABLE}
22/04/09 17:11:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5ae81e1{/storage/json,null,AVAILABLE}
22/04/09 17:11:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2fd1731c{/storage/rdd,null,AVAILABLE}
22/04/09 17:11:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5ae76500{/storage/rdd/json,null,AVAILABLE}
22/04/09 17:11:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6063d80a{/environment,null,AVAILABLE}
22/04/09 17:11:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1133ec6e{/environment/json,null,AVAILABLE}
22/04/09 17:11:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@355e34c7{/executors,null,AVAILABLE}
22/04/09 17:11:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@54709809{/executors/json,null,AVAILABLE}
22/04/09 17:11:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2a2da905{/executors/threadDump,null,AVAILABLE}
22/04/09 17:11:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@24f360b2{/executors/threadDump/json,null,AVAILABLE}
22/04/09 17:11:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@60cf80e7{/static,null,AVAILABLE}
22/04/09 17:11:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@302fec27{/,null,AVAILABLE}
22/04/09 17:11:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@770d0ea6{/api,null,AVAILABLE}
22/04/09 17:11:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@48c40605{/stages/stage/kill,null,AVAILABLE}
22/04/09 17:11:55 INFO server.ServerConnector: Started ServerConnector@3574e198{HTTP/1.1}{0.0.0.0:4040}
22/04/09 17:11:55 INFO server.Server: Started @2905ms
22/04/09 17:11:55 INFO util.Utils: Successfully started service 'SparkUI' on port 4040.
22/04/09 17:11:55 INFO ui.SparkUI: Bound SparkUI to 0.0.0.0, and started at http://192.168.5.134:4040
22/04/09 17:11:55 INFO spark.SparkContext: Added JAR file:/usr/local/src/spark/examples/jars/spark-examples_2.11-2.0.0.jar at spark://192.168.5.134:32920/jars/spark-examples_2.11-2.0.0.jar with timestamp 1649495515709
22/04/09 17:11:56 INFO client.RMProxy: Connecting to ResourceManager at master/192.168.5.134:8032
22/04/09 17:11:56 INFO yarn.Client: Requesting a new application from cluster with 2 NodeManagers
22/04/09 17:11:56 INFO yarn.Client: Verifying our application has not requested more than the maximum memory capability of the cluster (8192 MB per container)
22/04/09 17:11:56 INFO yarn.Client: Will allocate AM container, with 896 MB memory including 384 MB overhead
22/04/09 17:11:56 INFO yarn.Client: Setting up container launch context for our AM
22/04/09 17:11:56 INFO yarn.Client: Setting up the launch environment for our AM container
22/04/09 17:11:56 INFO yarn.Client: Preparing resources for our AM container
22/04/09 17:11:57 WARN yarn.Client: Neither spark.yarn.jars nor spark.yarn.archive is set, falling back to uploading libraries under SPARK_HOME.
22/04/09 17:11:58 INFO yarn.Client: Uploading resource file:/tmp/spark-0e9d4106-f05e-4671-8af7-b707d0636e6c/__spark_libs__3495207998487674211.zip -> hdfs://master:9000/user/root/.sparkStaging/application_1649494013949_0005/__spark_libs__3495207998487674211.zip
22/04/09 17:12:01 INFO yarn.Client: Uploading resource file:/tmp/spark-0e9d4106-f05e-4671-8af7-b707d0636e6c/__spark_conf__8627322350494422863.zip -> hdfs://master:9000/user/root/.sparkStaging/application_1649494013949_0005/__spark_conf__.zip
22/04/09 17:12:01 INFO spark.SecurityManager: Changing view acls to: root
22/04/09 17:12:01 INFO spark.SecurityManager: Changing modify acls to: root
22/04/09 17:12:01 INFO spark.SecurityManager: Changing view acls groups to: 
22/04/09 17:12:01 INFO spark.SecurityManager: Changing modify acls groups to: 
22/04/09 17:12:01 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(root); groups with view permissions: Set(); users  with modify permissions: Set(root); groups with modify permissions: Set()
22/04/09 17:12:01 INFO yarn.Client: Submitting application application_1649494013949_0005 to ResourceManager
22/04/09 17:12:01 INFO impl.YarnClientImpl: Submitted application application_1649494013949_0005
22/04/09 17:12:01 INFO cluster.SchedulerExtensionServices: Starting Yarn extension services with app application_1649494013949_0005 and attemptId None
22/04/09 17:12:02 INFO yarn.Client: Application report for application_1649494013949_0005 (state: ACCEPTED)
22/04/09 17:12:02 INFO yarn.Client: 
         client token: N/A
         diagnostics: N/A
         ApplicationMaster host: N/A
         ApplicationMaster RPC port: -1
         queue: default
         start time: 1649495521813
         final status: UNDEFINED
         tracking URL: http://master:8088/proxy/application_1649494013949_0005/
         user: root
22/04/09 17:12:03 INFO yarn.Client: Application report for application_1649494013949_0005 (state: ACCEPTED)
22/04/09 17:12:04 INFO yarn.Client: Application report for application_1649494013949_0005 (state: ACCEPTED)
22/04/09 17:12:05 INFO yarn.Client: Application report for application_1649494013949_0005 (state: ACCEPTED)
22/04/09 17:12:06 INFO yarn.Client: Application report for application_1649494013949_0005 (state: ACCEPTED)
22/04/09 17:12:07 INFO cluster.YarnSchedulerBackend$YarnSchedulerEndpoint: ApplicationMaster registered as NettyRpcEndpointRef(null)
22/04/09 17:12:07 INFO cluster.YarnClientSchedulerBackend: Add WebUI Filter. org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter, Map(PROXY_HOSTS -> master, PROXY_URI_BASES -> http://master:8088/proxy/application_1649494013949_0005), /proxy/application_1649494013949_0005
22/04/09 17:12:07 INFO ui.JettyUtils: Adding filter: org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter
22/04/09 17:12:07 INFO yarn.Client: Application report for application_1649494013949_0005 (state: ACCEPTED)
22/04/09 17:12:08 INFO yarn.Client: Application report for application_1649494013949_0005 (state: RUNNING)
22/04/09 17:12:08 INFO yarn.Client: 
         client token: N/A
         diagnostics: N/A
         ApplicationMaster host: 192.168.5.136
         ApplicationMaster RPC port: 0
         queue: default
         start time: 1649495521813
         final status: UNDEFINED
         tracking URL: http://master:8088/proxy/application_1649494013949_0005/
         user: root
22/04/09 17:12:08 INFO cluster.YarnClientSchedulerBackend: Application application_1649494013949_0005 has started running.
22/04/09 17:12:08 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 33457.
22/04/09 17:12:08 INFO netty.NettyBlockTransferService: Server created on 192.168.5.134:33457
22/04/09 17:12:08 INFO storage.BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 192.168.5.134, 33457)
22/04/09 17:12:08 INFO storage.BlockManagerMasterEndpoint: Registering block manager 192.168.5.134:33457 with 413.9 MB RAM, BlockManagerId(driver, 192.168.5.134, 33457)
22/04/09 17:12:08 INFO storage.BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 192.168.5.134, 33457)
22/04/09 17:12:09 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2849434b{/metrics/json,null,AVAILABLE}
22/04/09 17:12:13 INFO cluster.YarnSchedulerBackend$YarnSchedulerEndpoint: ApplicationMaster registered as NettyRpcEndpointRef(null)
22/04/09 17:12:13 INFO cluster.YarnClientSchedulerBackend: Add WebUI Filter. org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter, Map(PROXY_HOSTS -> master, PROXY_URI_BASES -> http://master:8088/proxy/application_1649494013949_0005), /proxy/application_1649494013949_0005
22/04/09 17:12:13 INFO ui.JettyUtils: Adding filter: org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter
22/04/09 17:12:15 ERROR cluster.YarnClientSchedulerBackend: Yarn application has already exited with state FINISHED!
22/04/09 17:12:15 INFO server.ServerConnector: Stopped ServerConnector@3574e198{HTTP/1.1}{0.0.0.0:4040}
22/04/09 17:12:15 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@48c40605{/stages/stage/kill,null,UNAVAILABLE}
22/04/09 17:12:15 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@770d0ea6{/api,null,UNAVAILABLE}
22/04/09 17:12:15 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@302fec27{/,null,UNAVAILABLE}
22/04/09 17:12:15 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@60cf80e7{/static,null,UNAVAILABLE}
22/04/09 17:12:15 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@24f360b2{/executors/threadDump/json,null,UNAVAILABLE}
22/04/09 17:12:15 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@2a2da905{/executors/threadDump,null,UNAVAILABLE}
22/04/09 17:12:15 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@54709809{/executors/json,null,UNAVAILABLE}
22/04/09 17:12:15 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@355e34c7{/executors,null,UNAVAILABLE}
22/04/09 17:12:15 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@1133ec6e{/environment/json,null,UNAVAILABLE}
22/04/09 17:12:15 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@6063d80a{/environment,null,UNAVAILABLE}
22/04/09 17:12:15 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@5ae76500{/storage/rdd/json,null,UNAVAILABLE}
22/04/09 17:12:15 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@2fd1731c{/storage/rdd,null,UNAVAILABLE}
22/04/09 17:12:15 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@5ae81e1{/storage/json,null,UNAVAILABLE}
22/04/09 17:12:15 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@59fc684e{/storage,null,UNAVAILABLE}
22/04/09 17:12:15 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@46c670a6{/stages/pool/json,null,UNAVAILABLE}
22/04/09 17:12:15 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@782a4fff{/stages/pool,null,UNAVAILABLE}
22/04/09 17:12:15 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@30ed9c6c{/stages/stage/json,null,UNAVAILABLE}
22/04/09 17:12:15 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@53499d85{/stages/stage,null,UNAVAILABLE}
22/04/09 17:12:15 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@64712be{/stages/json,null,UNAVAILABLE}
22/04/09 17:12:15 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@759fad4{/stages,null,UNAVAILABLE}
22/04/09 17:12:15 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@6ea1bcdc{/jobs/job/json,null,UNAVAILABLE}
22/04/09 17:12:15 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@26be6ca7{/jobs/job,null,UNAVAILABLE}
22/04/09 17:12:15 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@32057e6{/jobs/json,null,UNAVAILABLE}
22/04/09 17:12:15 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@62417a16{/jobs,null,UNAVAILABLE}
22/04/09 17:12:15 INFO ui.SparkUI: Stopped Spark web UI at http://192.168.5.134:4040
22/04/09 17:12:16 ERROR spark.SparkContext: Error initializing SparkContext.
java.lang.IllegalStateException: Spark context stopped while waiting for backend
        at org.apache.spark.scheduler.TaskSchedulerImpl.waitBackendReady(TaskSchedulerImpl.scala:581)
        at org.apache.spark.scheduler.TaskSchedulerImpl.postStartHook(TaskSchedulerImpl.scala:162)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:549)
        at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2256)
        at org.apache.spark.sql.SparkSession$Builder$$anonfun$8.apply(SparkSession.scala:831)
        at org.apache.spark.sql.SparkSession$Builder$$anonfun$8.apply(SparkSession.scala:823)
        at scala.Option.getOrElse(Option.scala:121)
        at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:823)
        at org.apache.spark.examples.SparkPi$.main(SparkPi.scala:31)
        at org.apache.spark.examples.SparkPi.main(SparkPi.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:729)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:185)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:210)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:124)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
22/04/09 17:12:16 INFO spark.SparkContext: SparkContext already stopped.
Exception in thread "main" java.lang.IllegalStateException: Spark context stopped while waiting for backend
        at org.apache.spark.scheduler.TaskSchedulerImpl.waitBackendReady(TaskSchedulerImpl.scala:581)
        at org.apache.spark.scheduler.TaskSchedulerImpl.postStartHook(TaskSchedulerImpl.scala:162)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:549)
        at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2256)
        at org.apache.spark.sql.SparkSession$Builder$$anonfun$8.apply(SparkSession.scala:831)
        at org.apache.spark.sql.SparkSession$Builder$$anonfun$8.apply(SparkSession.scala:823)
        at scala.Option.getOrElse(Option.scala:121)
        at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:823)
        at org.apache.spark.examples.SparkPi$.main(SparkPi.scala:31)
        at org.apache.spark.examples.SparkPi.main(SparkPi.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:729)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:185)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:210)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:124)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
22/04/09 17:12:16 ERROR client.TransportClient: Failed to send RPC 8523652673737865811 to /192.168.5.136:36338: java.nio.channels.ClosedChannelException
java.nio.channels.ClosedChannelException
22/04/09 17:12:16 INFO storage.DiskBlockManager: Shutdown hook called
22/04/09 17:12:16 INFO util.ShutdownHookManager: Shutdown hook called
22/04/09 17:12:16 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-0e9d4106-f05e-4671-8af7-b707d0636e6c/userFiles-80bb5ece-70ef-4b91-9d3c-390d02916468
22/04/09 17:12:16 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-0e9d4106-f05e-4671-8af7-b707d0636e6c
Runtime output and error messages (the full client-side log is pasted above)
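The client-side trace only shows that the YARN application exited with state FINISHED and that the SparkContext was stopped while waiting for the backend; it does not contain the underlying error. That error is normally recorded in the ApplicationMaster container logs on the NodeManager that ran the AM (here 192.168.5.136). A minimal sketch for retrieving them, assuming YARN log aggregation is enabled (yarn.log-aggregation-enable=true); the application ID is taken from the report above:

# Pull the aggregated container logs for the failed application
yarn logs -applicationId application_1649494013949_0005

# If log aggregation is disabled, the same stdout/stderr files live in the
# NodeManager's local container-log directory on 192.168.5.136, typically
# $HADOOP_HOME/logs/userlogs/application_1649494013949_0005/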
My approach and what I have tried: so far I have only confirmed that the same example jar runs correctly with --master local (first command above).
The result I want to achieve

The job should run correctly under YARN and print the value of Pi.
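Unrelated to the crash itself, the WARN line "Neither spark.yarn.jars nor spark.yarn.archive is set, falling back to uploading libraries under SPARK_HOME" means every submission re-uploads all of the Spark jars to HDFS. A common cleanup, sketched here with an example HDFS path (not verified as a fix for the failure above), is to stage the jars once and point spark.yarn.jars at them:

# Stage the Spark jars in HDFS once (/spark/jars is just an example path)
hdfs dfs -mkdir -p /spark/jars
hdfs dfs -put /usr/local/src/spark/jars/*.jar /spark/jars/

# Then add to $SPARK_HOME/conf/spark-defaults.conf:
# spark.yarn.jars  hdfs://master:9000/spark/jars/*.jar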