Running the WordCount workload in HiBench fails with the error below, and I don't know what is causing it.
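(For reference, the workload is launched through HiBench's standard scripts; the paths below assume the default layout of HiBench master under /app/HiBench-master and may differ in other setups:)

    bin/workloads/micro/wordcount/prepare/prepare.sh
    bin/workloads/micro/wordcount/spark/run.sh

The full spark-submit output from run.sh follows.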

Warning: Master yarn-client is deprecated since 2.0. Please use master "yarn" with specified deploy mode instead.
22/03/05 20:24:54 WARN util.Utils: Your hostname, localhost.localdomain resolves to a loopback address: 127.0.0.1; using 192.168.183.137 instead (on interface ens33)
22/03/05 20:24:54 WARN util.Utils: Set SPARK_LOCAL_IP if you need to bind to another address
22/03/05 20:24:56 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
22/03/05 20:24:57 INFO spark.SparkContext: Running Spark version 2.4.5
22/03/05 20:24:57 INFO spark.SparkContext: Submitted application: ScalaWordCount
22/03/05 20:24:57 INFO spark.SecurityManager: Changing view acls to: root
22/03/05 20:24:57 INFO spark.SecurityManager: Changing modify acls to: root
22/03/05 20:24:57 INFO spark.SecurityManager: Changing view acls groups to:
22/03/05 20:24:57 INFO spark.SecurityManager: Changing modify acls groups to:
22/03/05 20:24:57 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(root); groups with view permissions: Set(); users  with modify permissions: Set(root); groups with modify permissions: Set()
22/03/05 20:24:58 INFO util.Utils: Successfully started service 'sparkDriver' on port 43165.
22/03/05 20:24:58 INFO spark.SparkEnv: Registering MapOutputTracker
22/03/05 20:24:58 INFO spark.SparkEnv: Registering BlockManagerMaster
22/03/05 20:24:58 INFO storage.BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
22/03/05 20:24:58 INFO storage.BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
22/03/05 20:24:58 INFO storage.DiskBlockManager: Created local directory at /tmp/blockmgr-b9f0530a-d4ec-4700-a833-31183bd95ab8
22/03/05 20:24:58 INFO memory.MemoryStore: MemoryStore started with capacity 2.1 GB
22/03/05 20:24:58 INFO spark.SparkEnv: Registering OutputCommitCoordinator
22/03/05 20:24:58 INFO util.log: Logging initialized @6624ms
22/03/05 20:24:58 INFO server.Server: jetty-9.3.z-SNAPSHOT, build timestamp: unknown, git hash: unknown
22/03/05 20:24:58 INFO server.Server: Started @6785ms
22/03/05 20:24:58 INFO server.AbstractConnector: Started ServerConnector@7dd712e8{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
22/03/05 20:24:58 INFO util.Utils: Successfully started service 'SparkUI' on port 4040.
22/03/05 20:24:58 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7b02e036{/jobs,null,AVAILABLE,@Spark}
22/03/05 20:24:58 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@10567255{/jobs/json,null,AVAILABLE,@Spark}
22/03/05 20:24:58 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@e362c57{/jobs/job,null,AVAILABLE,@Spark}
22/03/05 20:24:58 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@79c4715d{/jobs/job/json,null,AVAILABLE,@Spark}
22/03/05 20:24:58 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5aa360ea{/stages,null,AVAILABLE,@Spark}
22/03/05 20:24:58 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6548bb7d{/stages/json,null,AVAILABLE,@Spark}
22/03/05 20:24:58 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@e27ba81{/stages/stage,null,AVAILABLE,@Spark}
22/03/05 20:24:58 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@35e52059{/stages/stage/json,null,AVAILABLE,@Spark}
22/03/05 20:24:58 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@62577d6{/stages/pool,null,AVAILABLE,@Spark}
22/03/05 20:24:58 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@49bd54f7{/stages/pool/json,null,AVAILABLE,@Spark}
22/03/05 20:24:58 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6b5f8707{/storage,null,AVAILABLE,@Spark}
22/03/05 20:24:58 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@772485dd{/storage/json,null,AVAILABLE,@Spark}
22/03/05 20:24:58 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5a12c728{/storage/rdd,null,AVAILABLE,@Spark}
22/03/05 20:24:58 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@79ab3a71{/storage/rdd/json,null,AVAILABLE,@Spark}
22/03/05 20:24:58 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6e5bfdfc{/environment,null,AVAILABLE,@Spark}
22/03/05 20:24:58 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3d829787{/environment/json,null,AVAILABLE,@Spark}
22/03/05 20:24:58 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@71652c98{/executors,null,AVAILABLE,@Spark}
22/03/05 20:24:58 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@51bde877{/executors/json,null,AVAILABLE,@Spark}
22/03/05 20:24:58 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@60b85ba1{/executors/threadDump,null,AVAILABLE,@Spark}
22/03/05 20:24:58 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@492fc69e{/executors/threadDump/json,null,AVAILABLE,@Spark}
22/03/05 20:24:58 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@117632cf{/static,null,AVAILABLE,@Spark}
22/03/05 20:24:58 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@c0b41d6{/,null,AVAILABLE,@Spark}
22/03/05 20:24:58 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4837595f{/api,null,AVAILABLE,@Spark}
22/03/05 20:24:58 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1f2d2181{/jobs/job/kill,null,AVAILABLE,@Spark}
22/03/05 20:24:58 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@49bf29c6{/stages/stage/kill,null,AVAILABLE,@Spark}
22/03/05 20:24:58 INFO ui.SparkUI: Bound SparkUI to 0.0.0.0, and started at http://192.168.183.137:4040
22/03/05 20:24:58 INFO spark.SparkContext: Added JAR file:/app/HiBench-master/sparkbench/assembly/target/sparkbench-assembly-8.0-SNAPSHOT-dist.jar at spark://192.168.183.137:43165/jars/sparkbench-assembly-8.0-SNAPSHOT-dist.jar with timestamp 1646540698867
22/03/05 20:25:00 INFO client.RMProxy: Connecting to ResourceManager at localhost/127.0.0.1:8032
22/03/05 20:25:00 INFO yarn.Client: Requesting a new application from cluster with 1 NodeManagers
22/03/05 20:25:00 INFO yarn.Client: Verifying our application has not requested more than the maximum memory capability of the cluster (8192 MB per container)
22/03/05 20:25:00 INFO yarn.Client: Will allocate AM container, with 896 MB memory including 384 MB overhead
22/03/05 20:25:00 INFO yarn.Client: Setting up container launch context for our AM
22/03/05 20:25:00 INFO yarn.Client: Setting up the launch environment for our AM container
22/03/05 20:25:00 INFO yarn.Client: Preparing resources for our AM container
22/03/05 20:25:01 WARN yarn.Client: Neither spark.yarn.jars nor spark.yarn.archive is set, falling back to uploading libraries under SPARK_HOME.
22/03/05 20:25:05 INFO yarn.Client: Uploading resource file:/tmp/spark-b6e4c772-9832-4813-91ba-9130a42959a7/__spark_libs__3170012956923171363.zip -> hdfs://localhost:9000/user/root/.sparkStaging/application_1646538013026_0003/__spark_libs__3170012956923171363.zip
22/03/05 20:25:13 INFO yarn.Client: Uploading resource file:/tmp/spark-b6e4c772-9832-4813-91ba-9130a42959a7/__spark_conf__6184710404523455740.zip -> hdfs://localhost:9000/user/root/.sparkStaging/application_1646538013026_0003/__spark_conf__.zip
22/03/05 20:25:13 INFO spark.SecurityManager: Changing view acls to: root
22/03/05 20:25:13 INFO spark.SecurityManager: Changing modify acls to: root
22/03/05 20:25:13 INFO spark.SecurityManager: Changing view acls groups to:
22/03/05 20:25:13 INFO spark.SecurityManager: Changing modify acls groups to:
22/03/05 20:25:13 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(root); groups with view permissions: Set(); users  with modify permissions: Set(root); groups with modify permissions: Set()
22/03/05 20:25:15 INFO yarn.Client: Submitting application application_1646538013026_0003 to ResourceManager
22/03/05 20:25:15 INFO impl.YarnClientImpl: Submitted application application_1646538013026_0003
22/03/05 20:25:15 INFO cluster.SchedulerExtensionServices: Starting Yarn extension services with app application_1646538013026_0003 and attemptId None
22/03/05 20:25:16 INFO yarn.Client: Application report for application_1646538013026_0003 (state: ACCEPTED)
22/03/05 20:25:16 INFO yarn.Client:
     client token: N/A
     diagnostics: N/A
     ApplicationMaster host: N/A
     ApplicationMaster RPC port: -1
     queue: default
     start time: 1646540715160
     final status: UNDEFINED
     tracking URL: http://localhost:8088/proxy/application_1646538013026_0003/
     user: root
22/03/05 20:25:17 INFO yarn.Client: Application report for application_1646538013026_0003 (state: ACCEPTED)
22/03/05 20:25:18 INFO yarn.Client: Application report for application_1646538013026_0003 (state: ACCEPTED)
22/03/05 20:25:19 INFO yarn.Client: Application report for application_1646538013026_0003 (state: ACCEPTED)
22/03/05 20:25:21 INFO yarn.Client: Application report for application_1646538013026_0003 (state: ACCEPTED)
22/03/05 20:25:22 INFO yarn.Client: Application report for application_1646538013026_0003 (state: ACCEPTED)
22/03/05 20:25:23 INFO yarn.Client: Application report for application_1646538013026_0003 (state: ACCEPTED)
22/03/05 20:25:24 INFO yarn.Client: Application report for application_1646538013026_0003 (state: ACCEPTED)
22/03/05 20:25:26 INFO yarn.Client: Application report for application_1646538013026_0003 (state: ACCEPTED)
22/03/05 20:25:27 INFO yarn.Client: Application report for application_1646538013026_0003 (state: ACCEPTED)
22/03/05 20:25:28 INFO yarn.Client: Application report for application_1646538013026_0003 (state: ACCEPTED)
22/03/05 20:25:29 INFO yarn.Client: Application report for application_1646538013026_0003 (state: ACCEPTED)
22/03/05 20:25:30 INFO yarn.Client: Application report for application_1646538013026_0003 (state: ACCEPTED)
22/03/05 20:25:32 INFO yarn.Client: Application report for application_1646538013026_0003 (state: ACCEPTED)
22/03/05 20:25:33 INFO yarn.Client: Application report for application_1646538013026_0003 (state: ACCEPTED)
22/03/05 20:25:34 INFO yarn.Client: Application report for application_1646538013026_0003 (state: ACCEPTED)
22/03/05 20:25:35 INFO yarn.Client: Application report for application_1646538013026_0003 (state: ACCEPTED)
22/03/05 20:25:36 INFO yarn.Client: Application report for application_1646538013026_0003 (state: ACCEPTED)
22/03/05 20:25:37 INFO yarn.Client: Application report for application_1646538013026_0003 (state: ACCEPTED)
22/03/05 20:25:38 INFO yarn.Client: Application report for application_1646538013026_0003 (state: ACCEPTED)
22/03/05 20:25:39 INFO yarn.Client: Application report for application_1646538013026_0003 (state: ACCEPTED)
22/03/05 20:25:40 INFO yarn.Client: Application report for application_1646538013026_0003 (state: ACCEPTED)
22/03/05 20:25:41 INFO yarn.Client: Application report for application_1646538013026_0003 (state: ACCEPTED)
22/03/05 20:25:42 INFO yarn.Client: Application report for application_1646538013026_0003 (state: ACCEPTED)
22/03/05 20:25:43 INFO yarn.Client: Application report for application_1646538013026_0003 (state: ACCEPTED)
22/03/05 20:25:44 INFO yarn.Client: Application report for application_1646538013026_0003 (state: ACCEPTED)
22/03/05 20:25:45 INFO yarn.Client: Application report for application_1646538013026_0003 (state: ACCEPTED)
22/03/05 20:25:46 INFO yarn.Client: Application report for application_1646538013026_0003 (state: ACCEPTED)
22/03/05 20:25:47 INFO yarn.Client: Application report for application_1646538013026_0003 (state: ACCEPTED)
22/03/05 20:25:48 INFO yarn.Client: Application report for application_1646538013026_0003 (state: ACCEPTED)
22/03/05 20:25:49 INFO yarn.Client: Application report for application_1646538013026_0003 (state: ACCEPTED)
22/03/05 20:25:50 INFO yarn.Client: Application report for application_1646538013026_0003 (state: ACCEPTED)
22/03/05 20:25:51 INFO yarn.Client: Application report for application_1646538013026_0003 (state: RUNNING)
22/03/05 20:25:51 INFO yarn.Client:
     client token: N/A
     diagnostics: N/A
     ApplicationMaster host: 192.168.183.137
     ApplicationMaster RPC port: -1
     queue: default
     start time: 1646540715160
     final status: UNDEFINED
     tracking URL: http://localhost:8088/proxy/application_1646538013026_0003/
     user: root
22/03/05 20:25:51 INFO cluster.YarnClientSchedulerBackend: Application application_1646538013026_0003 has started running.
22/03/05 20:25:51 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 41433.
22/03/05 20:25:51 INFO netty.NettyBlockTransferService: Server created on 192.168.183.137:41433
22/03/05 20:25:51 INFO storage.BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
22/03/05 20:25:51 INFO storage.BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 192.168.183.137, 41433, None)
22/03/05 20:25:51 INFO storage.BlockManagerMasterEndpoint: Registering block manager 192.168.183.137:41433 with 2.1 GB RAM, BlockManagerId(driver, 192.168.183.137, 41433, None)
22/03/05 20:25:52 INFO storage.BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 192.168.183.137, 41433, None)
22/03/05 20:25:52 INFO storage.BlockManager: Initialized BlockManager: BlockManagerId(driver, 192.168.183.137, 41433, None)
22/03/05 20:25:52 INFO cluster.YarnClientSchedulerBackend: Add WebUI Filter. org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter, Map(PROXY_HOSTS -> localhost, PROXY_URI_BASES -> http://localhost:8088/proxy/application_1646538013026_0003), /proxy/application_1646538013026_0003
22/03/05 20:25:52 INFO cluster.YarnSchedulerBackend$YarnSchedulerEndpoint: ApplicationMaster registered as NettyRpcEndpointRef(spark-client://YarnAM)
22/03/05 20:25:52 INFO ui.JettyUtils: Adding filter org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter to /metrics/json.
22/03/05 20:25:52 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@28be7fec{/metrics/json,null,AVAILABLE,@Spark}
22/03/05 20:25:53 INFO cluster.YarnClientSchedulerBackend: SchedulerBackend is ready for scheduling beginning after waiting maxRegisteredResourcesWaitingTime: 30000(ms)
Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)[Ljava/lang/Object;
    at com.intel.hibench.sparkbench.common.IOCommon$.getPropertiesFromFile(IOCommon.scala:95)
    at com.intel.hibench.sparkbench.common.IOCommon$.<init>(IOCommon.scala:91)
    at com.intel.hibench.sparkbench.common.IOCommon$.<clinit>(IOCommon.scala)
    at com.intel.hibench.sparkbench.common.IOCommon.$anonfun$load$1(IOCommon.scala:38)
    at scala.Option.getOrElse(Option.scala:121)
    at com.intel.hibench.sparkbench.common.IOCommon.load(IOCommon.scala:38)
    at com.intel.hibench.sparkbench.micro.ScalaWordCount$.main(ScalaWordCount.scala:38)
    at com.intel.hibench.sparkbench.micro.ScalaWordCount.main(ScalaWordCount.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
    at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:845)
    at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:161)
    at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:184)
    at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
    at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:920)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:929)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
22/03/05 20:25:53 INFO spark.SparkContext: Invoking stop() from shutdown hook
22/03/05 20:25:53 INFO server.AbstractConnector: Stopped Spark@7dd712e8{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
22/03/05 20:25:53 INFO ui.SparkUI: Stopped Spark web UI at http://192.168.183.137:4040
22/03/05 20:25:53 INFO cluster.YarnClientSchedulerBackend: Interrupting monitor thread
22/03/05 20:25:53 INFO cluster.YarnClientSchedulerBackend: Shutting down all executors
22/03/05 20:25:53 INFO cluster.YarnSchedulerBackend$YarnDriverEndpoint: Asking each executor to shut down
22/03/05 20:25:53 INFO cluster.SchedulerExtensionServices: Stopping SchedulerExtensionServices
(serviceOption=None,
 services=List(),
 started=false)
22/03/05 20:25:54 INFO cluster.YarnClientSchedulerBackend: Stopped
22/03/05 20:25:54 INFO spark.MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
22/03/05 20:25:54 INFO memory.MemoryStore: MemoryStore cleared
22/03/05 20:25:54 INFO storage.BlockManager: BlockManager stopped
22/03/05 20:25:54 INFO storage.BlockManagerMaster: BlockManagerMaster stopped
22/03/05 20:25:54 INFO scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
22/03/05 20:25:54 INFO spark.SparkContext: Successfully stopped SparkContext
22/03/05 20:25:54 INFO util.ShutdownHookManager: Shutdown hook called
22/03/05 20:25:54 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-b6e4c772-9832-4813-91ba-9130a42959a7
22/03/05 20:25:54 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-bd95ae16-c36c-4eb3-a9e5-e6b44fb61be5
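The driver dies on the NoSuchMethodError for scala.Predef$.refArrayOps in IOCommon.scala. From what I have read, this error usually indicates a Scala binary version mismatch, i.e. the sparkbench-assembly jar was compiled against a different Scala major version (2.11 vs 2.12) than the Spark 2.4.5 runtime on the cluster. I am not certain that is the cause here, but if it is, the usual remedy would be to rebuild HiBench against the cluster's Spark and Scala versions, roughly like this (flags as described in the HiBench build documentation; the version numbers are assumptions to adjust for your environment):

    mvn -Psparkbench -Dspark=2.4 -Dscala=2.11 clean package

and then re-run the workload so it picks up the freshly built sparkbench-assembly jar.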