Error message
"E:\JAVA\开发工具\IDEA\IntelliJ IDEA Community Edition 2019.3.5\jbr\bin\java.exe" "-javaagent:E:\JAVA\开发工具\IDEA\IntelliJ IDEA Community Edition 2019.3.5\lib\idea_rt.jar=53995:E:\JAVA\开发工具\IDEA\IntelliJ IDEA Community Edition 2019.3.5\bin" -Dfile.encoding=UTF-8 -classpath C:\Users\DEII\AppData\Local\Temp\classpath59267419.jar com.xcy.Electricity.Electricity
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/E:/xcy/数据可视化/spark-2.1.1-bin-hadoop2.7/jars/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/C:/Users/DEII/.m2/repository/org/slf4j/slf4j-log4j12/1.7.16/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
Exception in thread "main" java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveSessionState':
at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:981)
at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)
at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:130)
at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:130)
at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:236)
at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
at scala.collection.mutable.HashMap.foreach(HashMap.scala:130)
at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:878)
at com.xcy.Electricity.Electricity$.main(Electricity.scala:11)
at com.xcy.Electricity.Electricity.main(Electricity.scala)
Caused by: java.lang.reflect.InvocationTargetException
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:490)
at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:978)
... 12 more
Caused by: java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveExternalCatalog':
at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:169)
at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:86)
at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:101)
at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:100)
at org.apache.spark.sql.internal.SessionState.<init>(SessionState.scala:157)
at org.apache.spark.sql.hive.HiveSessionState.<init>(HiveSessionState.scala:32)
... 17 more
Caused by: java.lang.reflect.InvocationTargetException
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:490)
at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:166)
... 25 more
Caused by: java.lang.IllegalArgumentException: Unable to locate hive jars to connect to metastore. Please set spark.sql.hive.metastore.jars.
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:298)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:262)
at org.apache.spark.sql.hive.HiveExternalCatalog.<init>(HiveExternalCatalog.scala:66)
... 30 more
21/10/20 14:33:45 INFO SparkContext: Invoking stop() from shutdown hook
21/10/20 14:33:45 INFO SparkUI: Stopped Spark web UI at http://10.21.1.80:4040
21/10/20 14:33:45 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
21/10/20 14:33:45 INFO MemoryStore: MemoryStore cleared
21/10/20 14:33:45 INFO BlockManager: BlockManager stopped
21/10/20 14:33:45 INFO BlockManagerMaster: BlockManagerMaster stopped
21/10/20 14:33:45 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
21/10/20 14:33:45 INFO SparkContext: Successfully stopped SparkContext
21/10/20 14:33:45 INFO ShutdownHookManager: Shutdown hook called
21/10/20 14:33:45 INFO ShutdownHookManager: Deleting directory C:\Users\DEII\AppData\Local\Temp\spark-a63aa538-3bdf-4818-ac0f-231022f3b2e0
Process finished with exit code 1
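The decisive line is the last "Caused by": Spark cannot locate Hive client jars to talk to the metastore. That matches the Maven configuration below, where spark-core and spark-sql are at 2.1.1 while spark-hive and hive-exec are at 1.2.1, so no compatible Hive client ends up on the classpath. The message's own hint (spark.sql.hive.metastore.jars) is another route; a minimal sketch, assuming a local run (the "show databases" call mirrors the later stack trace; the config values are illustrative, not this post's actual fix):

import org.apache.spark.sql.SparkSession

object Electricity {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("Electricity")
      .master("local[*]")
      // Metastore version to speak to; 1.2.1 is Spark 2.1.x's default.
      .config("spark.sql.hive.metastore.version", "1.2.1")
      // Where the Hive client jars come from: "builtin" (bundled with
      // Spark), "maven" (downloaded on first use), or a classpath string.
      .config("spark.sql.hive.metastore.jars", "maven")
      .enableHiveSupport()
      .getOrCreate()

    spark.sql("show databases").show()
    spark.stop()
  }
}

In this post, though, the fix was simply aligning the dependency versions, as described after the configuration.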
Maven configuration
<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>2.1.1</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.11</artifactId>
        <version>2.1.1</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-hive_2.11</artifactId>
        <version>1.2.1</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hive</groupId>
        <artifactId>hive-exec</artifactId>
        <version>1.2.1</version>
    </dependency>
    <dependency>
        <groupId>mysql</groupId>
        <artifactId>mysql-connector-java</artifactId>
        <version>5.1.38</version>
    </dependency>
</dependencies>
The classpath contains multiple SLF4J bindings: one inside the local Spark distribution's jars directory and one from the Maven repository. Remove the redundant jar. Likewise, keep all the Spark artifacts on a single version; here they should all be 2.4.3, as in the sketch below.
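A sketch of the aligned dependency block, assuming the Spark version the second run reports (2.4.3); spark-hive already pulls a compatible hive-exec transitively, so the explicit hive-exec entry can usually be dropped:

<!-- Sketch: all Spark artifacts on one version (2.4.3). -->
<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>2.4.3</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.11</artifactId>
        <version>2.4.3</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-hive_2.11</artifactId>
        <version>2.4.3</version>
    </dependency>
    <dependency>
        <groupId>mysql</groupId>
        <artifactId>mysql-connector-java</artifactId>
        <version>5.1.38</version>
    </dependency>
</dependencies>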
Versions
Scala 2.11
Spark 2.1.1
Hive 2.3.4
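Rerunning after the dependency change (the log below now reports Spark 2.4.3), the SparkSession starts, but a new error surfaces: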
E:\JAVA\jdk\jdk8\bin\java.exe "-javaagent:E:\JAVA\开发工具\IDEA\IntelliJ IDEA Community Edition 2019.3.5\lib\idea_rt.jar=61296:E:\JAVA\开发工具\IDEA\IntelliJ IDEA Community Edition 2019.3.5\bin" -Dfile.encoding=UTF-8 -classpath C:\Users\DEII\AppData\Local\Temp\classpath437323891.jar com.xcy.Electricity.Electricity
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/C:/Users/DEII/.m2/repository/org/slf4j/slf4j-log4j12/1.7.16/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/C:/Users/DEII/.m2/repository/org/apache/logging/log4j/log4j-slf4j-impl/2.6.2/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
21/10/20 20:21:36 INFO SparkContext: Running Spark version 2.4.3
21/10/20 20:21:36 INFO SparkContext: Submitted application: Electricity
21/10/20 20:21:36 INFO SecurityManager: Changing view acls to: DEII
21/10/20 20:21:36 INFO SecurityManager: Changing modify acls to: DEII
21/10/20 20:21:36 INFO SecurityManager: Changing view acls groups to:
21/10/20 20:21:36 INFO SecurityManager: Changing modify acls groups to:
21/10/20 20:21:36 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(DEII); groups with view permissions: Set(); users with modify permissions: Set(DEII); groups with modify permissions: Set()
21/10/20 20:21:38 INFO Utils: Successfully started service 'sparkDriver' on port 61334.
21/10/20 20:21:38 INFO SparkEnv: Registering MapOutputTracker
21/10/20 20:21:38 INFO SparkEnv: Registering BlockManagerMaster
21/10/20 20:21:38 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
21/10/20 20:21:38 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
21/10/20 20:21:38 INFO DiskBlockManager: Created local directory at C:\Users\DEII\AppData\Local\Temp\blockmgr-54045321-4888-40d5-bca0-de07ecb4d59f
21/10/20 20:21:38 INFO MemoryStore: MemoryStore started with capacity 885.6 MB
21/10/20 20:21:38 INFO SparkEnv: Registering OutputCommitCoordinator
21/10/20 20:21:38 INFO Utils: Successfully started service 'SparkUI' on port 4040.
21/10/20 20:21:38 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://DESKTOP-P38CTL8:4040
21/10/20 20:21:38 INFO Executor: Starting executor ID driver on host localhost
21/10/20 20:21:38 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 61375.
21/10/20 20:21:38 INFO NettyBlockTransferService: Server created on DESKTOP-P38CTL8:61375
21/10/20 20:21:38 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
21/10/20 20:21:38 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, DESKTOP-P38CTL8, 61375, None)
21/10/20 20:21:38 INFO BlockManagerMasterEndpoint: Registering block manager DESKTOP-P38CTL8:61375 with 885.6 MB RAM, BlockManagerId(driver, DESKTOP-P38CTL8, 61375, None)
21/10/20 20:21:38 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, DESKTOP-P38CTL8, 61375, None)
21/10/20 20:21:38 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, DESKTOP-P38CTL8, 61375, None)
21/10/20 20:21:39 INFO SharedState: loading hive config file: file:/E:/BigDataContest/code01/target/classes/hive-site.xml
21/10/20 20:21:39 INFO SharedState: spark.sql.warehouse.dir is not set, but hive.metastore.warehouse.dir is set. Setting spark.sql.warehouse.dir to the value of hive.metastore.warehouse.dir ('/user/hive/warehouse').
21/10/20 20:21:39 INFO SharedState: Warehouse path is '/user/hive/warehouse'.
21/10/20 20:21:40 INFO StateStoreCoordinatorRef: Registered StateStoreCoordinator endpoint
Wed Oct 20 20:21:40 CST 2021 WARN: Establishing SSL connection without server's identity verification is not recommended. According to MySQL 5.5.45+, 5.6.26+ and 5.7.6+ requirements SSL connection must be established by default if explicit option isn't set. For compliance with existing applications not using SSL the verifyServerCertificate property is set to 'false'. You need either to explicitly disable SSL by setting useSSL=false, or set useSSL=true and provide truststore for server certificate verification.
21/10/20 20:21:42 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
Exception in thread "main" java.lang.IllegalArgumentException: java.net.UnknownHostException: master
at org.apache.hadoop.security.SecurityUtil.buildTokenService(SecurityUtil.java:374)
at org.apache.hadoop.hdfs.NameNodeProxies.createNonHAProxy(NameNodeProxies.java:310)
at org.apache.hadoop.hdfs.NameNodeProxies.createProxy(NameNodeProxies.java:176)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:668)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:604)
at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:148)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2598)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:91)
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2632)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2614)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:370)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:169)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:354)
at org.apache.hadoop.fs.Path.getFileSystem(Path.java:296)
at org.apache.spark.sql.catalyst.catalog.InMemoryCatalog.liftedTree1$1(InMemoryCatalog.scala:111)
at org.apache.spark.sql.catalyst.catalog.InMemoryCatalog.createDatabase(InMemoryCatalog.scala:109)
at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:117)
at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:102)
at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$3.apply(BaseSessionStateBuilder.scala:133)
at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$3.apply(BaseSessionStateBuilder.scala:133)
at org.apache.spark.sql.catalyst.catalog.SessionCatalog.externalCatalog$lzycompute(SessionCatalog.scala:90)
at org.apache.spark.sql.catalyst.catalog.SessionCatalog.externalCatalog(SessionCatalog.scala:90)
at org.apache.spark.sql.catalyst.catalog.SessionCatalog.listDatabases(SessionCatalog.scala:247)
at org.apache.spark.sql.execution.command.ShowDatabasesCommand$$anonfun$2.apply(databases.scala:44)
at org.apache.spark.sql.execution.command.ShowDatabasesCommand$$anonfun$2.apply(databases.scala:44)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.sql.execution.command.ShowDatabasesCommand.run(databases.scala:44)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:79)
at org.apache.spark.sql.Dataset$$anonfun$6.apply(Dataset.scala:194)
at org.apache.spark.sql.Dataset$$anonfun$6.apply(Dataset.scala:194)
at org.apache.spark.sql.Dataset$$anonfun$53.apply(Dataset.scala:3364)
at org.apache.spark.sql.execution.SQLExecution$$anonfun$withNewExecutionId$1.apply(SQLExecution.scala:78)
at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:125)
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:73)
at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3363)
at org.apache.spark.sql.Dataset.<init>(Dataset.scala:194)
at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:79)
at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:642)
at com.xcy.Electricity.Electricity$.main(Electricity.scala:31)
at com.xcy.Electricity.Electricity.main(Electricity.scala)
Caused by: java.net.UnknownHostException: master
... 42 more
21/10/20 20:21:50 INFO SparkContext: Invoking stop() from shutdown hook
21/10/20 20:21:50 INFO SparkUI: Stopped Spark web UI at http://DESKTOP-P38CTL8:4040
21/10/20 20:21:50 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
21/10/20 20:21:50 INFO MemoryStore: MemoryStore cleared
21/10/20 20:21:50 INFO BlockManager: BlockManager stopped
21/10/20 20:21:50 INFO BlockManagerMaster: BlockManagerMaster stopped
21/10/20 20:21:50 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
21/10/20 20:21:50 INFO SparkContext: Successfully stopped SparkContext
21/10/20 20:21:50 INFO ShutdownHookManager: Shutdown hook called
21/10/20 20:21:50 INFO ShutdownHookManager: Deleting directory C:\Users\DEII\AppData\Local\Temp\spark-743ce9b2-d43d-42ad-959a-ea8d9b1f3814
Process finished with exit code 1
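This second failure is name resolution, not Spark: the hive-site.xml picked up from the classpath (see the SharedState line in the log) points the warehouse at an HDFS namenode addressed as "master", a hostname this Windows machine cannot resolve. One common fix is to map the name in the hosts file; the IP below is a placeholder, substitute the namenode's real address:

# C:\Windows\System32\drivers\etc\hosts (edit as administrator)
192.168.1.100  master

Alternatively, replace "master" with the namenode's IP address (or a resolvable FQDN) in hive-site.xml / core-site.xml.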