Spark cannot access the Hive database?

Code:

    // TODO: basic environment configuration
    val sparkConf: SparkConf = new SparkConf().setMaster("local[*]").setAppName("Electricity")

    // Connect to the MySQL database
    val mysqlSpark: SparkSession = SparkSession.builder().config(sparkConf).getOrCreate()
    import mysqlSpark.implicits._
    // database name
    var dataName: String = "shtd_store"
    // table name
    var tables: String = "PART"

    val mySqlDf = mysqlSpark.read.format("jdbc")
      .option("url", "jdbc:mysql://192.168.1.113:3306/" + dataName)
      .option("driver", "com.mysql.jdbc.Driver")
      .option("user", "root")
      .option("password", "123456")
      .option("dbtable", tables)
      .load()

    // Connect to the Hive database
    val hiveSpark: SparkSession = SparkSession.builder()
      .config(sparkConf)
      .config("spaek.sql.warrhouse.dir", "hdfs:localhost:8020/user/hive/warehouse")
      .enableHiveSupport()
      .getOrCreate()
    import hiveSpark.implicits._
    hiveSpark.sql("show databases").show()

    // TODO: shut down the environment
    mysqlSpark.close()
    hiveSpark.close()

Error log:

E:\JAVA\jdk\jdk8\bin\java.exe "-javaagent:E:\JAVA\开发工具\IDEA\IntelliJ IDEA Community Edition 2019.3.5\lib\idea_rt.jar=61296:E:\JAVA\开发工具\IDEA\IntelliJ IDEA Community Edition 2019.3.5\bin" -Dfile.encoding=UTF-8 -classpath C:\Users\DEII\AppData\Local\Temp\classpath437323891.jar com.xcy.Electricity.Electricity
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/C:/Users/DEII/.m2/repository/org/slf4j/slf4j-log4j12/1.7.16/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/C:/Users/DEII/.m2/repository/org/apache/logging/log4j/log4j-slf4j-impl/2.6.2/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
21/10/20 20:21:36 INFO SparkContext: Running Spark version 2.4.3
21/10/20 20:21:36 INFO SparkContext: Submitted application: Electricity
21/10/20 20:21:36 INFO SecurityManager: Changing view acls to: DEII
21/10/20 20:21:36 INFO SecurityManager: Changing modify acls to: DEII
21/10/20 20:21:36 INFO SecurityManager: Changing view acls groups to: 
21/10/20 20:21:36 INFO SecurityManager: Changing modify acls groups to: 
21/10/20 20:21:36 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(DEII); groups with view permissions: Set(); users  with modify permissions: Set(DEII); groups with modify permissions: Set()
21/10/20 20:21:38 INFO Utils: Successfully started service 'sparkDriver' on port 61334.
21/10/20 20:21:38 INFO SparkEnv: Registering MapOutputTracker
21/10/20 20:21:38 INFO SparkEnv: Registering BlockManagerMaster
21/10/20 20:21:38 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
21/10/20 20:21:38 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
21/10/20 20:21:38 INFO DiskBlockManager: Created local directory at C:\Users\DEII\AppData\Local\Temp\blockmgr-54045321-4888-40d5-bca0-de07ecb4d59f
21/10/20 20:21:38 INFO MemoryStore: MemoryStore started with capacity 885.6 MB
21/10/20 20:21:38 INFO SparkEnv: Registering OutputCommitCoordinator
21/10/20 20:21:38 INFO Utils: Successfully started service 'SparkUI' on port 4040.
21/10/20 20:21:38 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://DESKTOP-P38CTL8:4040
21/10/20 20:21:38 INFO Executor: Starting executor ID driver on host localhost
21/10/20 20:21:38 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 61375.
21/10/20 20:21:38 INFO NettyBlockTransferService: Server created on DESKTOP-P38CTL8:61375
21/10/20 20:21:38 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
21/10/20 20:21:38 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, DESKTOP-P38CTL8, 61375, None)
21/10/20 20:21:38 INFO BlockManagerMasterEndpoint: Registering block manager DESKTOP-P38CTL8:61375 with 885.6 MB RAM, BlockManagerId(driver, DESKTOP-P38CTL8, 61375, None)
21/10/20 20:21:38 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, DESKTOP-P38CTL8, 61375, None)
21/10/20 20:21:38 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, DESKTOP-P38CTL8, 61375, None)
21/10/20 20:21:39 INFO SharedState: loading hive config file: file:/E:/BigDataContest/code01/target/classes/hive-site.xml
21/10/20 20:21:39 INFO SharedState: spark.sql.warehouse.dir is not set, but hive.metastore.warehouse.dir is set. Setting spark.sql.warehouse.dir to the value of hive.metastore.warehouse.dir ('/user/hive/warehouse').
21/10/20 20:21:39 INFO SharedState: Warehouse path is '/user/hive/warehouse'.
21/10/20 20:21:40 INFO StateStoreCoordinatorRef: Registered StateStoreCoordinator endpoint
Wed Oct 20 20:21:40 CST 2021 WARN: Establishing SSL connection without server's identity verification is not recommended. According to MySQL 5.5.45+, 5.6.26+ and 5.7.6+ requirements SSL connection must be established by default if explicit option isn't set. For compliance with existing applications not using SSL the verifyServerCertificate property is set to 'false'. You need either to explicitly disable SSL by setting useSSL=false, or set useSSL=true and provide truststore for server certificate verification.
21/10/20 20:21:42 WARN SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
Exception in thread "main" java.lang.IllegalArgumentException: java.net.UnknownHostException: master
    at org.apache.hadoop.security.SecurityUtil.buildTokenService(SecurityUtil.java:374)
    at org.apache.hadoop.hdfs.NameNodeProxies.createNonHAProxy(NameNodeProxies.java:310)
    at org.apache.hadoop.hdfs.NameNodeProxies.createProxy(NameNodeProxies.java:176)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:668)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:604)
    at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:148)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2598)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:91)
    at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2632)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2614)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:370)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:169)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:354)
    at org.apache.hadoop.fs.Path.getFileSystem(Path.java:296)
    at org.apache.spark.sql.catalyst.catalog.InMemoryCatalog.liftedTree1$1(InMemoryCatalog.scala:111)
    at org.apache.spark.sql.catalyst.catalog.InMemoryCatalog.createDatabase(InMemoryCatalog.scala:109)
    at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:117)
    at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:102)
    at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$3.apply(BaseSessionStateBuilder.scala:133)
    at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$3.apply(BaseSessionStateBuilder.scala:133)
    at org.apache.spark.sql.catalyst.catalog.SessionCatalog.externalCatalog$lzycompute(SessionCatalog.scala:90)
    at org.apache.spark.sql.catalyst.catalog.SessionCatalog.externalCatalog(SessionCatalog.scala:90)
    at org.apache.spark.sql.catalyst.catalog.SessionCatalog.listDatabases(SessionCatalog.scala:247)
    at org.apache.spark.sql.execution.command.ShowDatabasesCommand$$anonfun$2.apply(databases.scala:44)
    at org.apache.spark.sql.execution.command.ShowDatabasesCommand$$anonfun$2.apply(databases.scala:44)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.sql.execution.command.ShowDatabasesCommand.run(databases.scala:44)
    at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
    at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
    at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:79)
    at org.apache.spark.sql.Dataset$$anonfun$6.apply(Dataset.scala:194)
    at org.apache.spark.sql.Dataset$$anonfun$6.apply(Dataset.scala:194)
    at org.apache.spark.sql.Dataset$$anonfun$53.apply(Dataset.scala:3364)
    at org.apache.spark.sql.execution.SQLExecution$$anonfun$withNewExecutionId$1.apply(SQLExecution.scala:78)
    at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:125)
    at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:73)
    at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3363)
    at org.apache.spark.sql.Dataset.<init>(Dataset.scala:194)
    at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:79)
    at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:642)
    at com.xcy.Electricity.Electricity$.main(Electricity.scala:31)
    at com.xcy.Electricity.Electricity.main(Electricity.scala)
Caused by: java.net.UnknownHostException: master
    ... 42 more
21/10/20 20:21:50 INFO SparkContext: Invoking stop() from shutdown hook
21/10/20 20:21:50 INFO SparkUI: Stopped Spark web UI at http://DESKTOP-P38CTL8:4040
21/10/20 20:21:50 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
21/10/20 20:21:50 INFO MemoryStore: MemoryStore cleared
21/10/20 20:21:50 INFO BlockManager: BlockManager stopped
21/10/20 20:21:50 INFO BlockManagerMaster: BlockManagerMaster stopped
21/10/20 20:21:50 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
21/10/20 20:21:50 INFO SparkContext: Successfully stopped SparkContext
21/10/20 20:21:50 INFO ShutdownHookManager: Shutdown hook called
21/10/20 20:21:50 INFO ShutdownHookManager: Deleting directory C:\Users\DEII\AppData\Local\Temp\spark-743ce9b2-d43d-42ad-959a-ea8d9b1f3814

Process finished with exit code 1

Have you mapped the server's hostname to its IP address in your hosts file? The failure is java.net.UnknownHostException: master, and the log shows hive-site.xml being loaded from the classpath, so the warehouse path is resolved against an HDFS NameNode named "master", which your Windows machine cannot resolve. Add the mapping (on Windows: C:\Windows\System32\drivers\etc\hosts), post your hosts file so we can take a look, or refer to this:
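For reference, a minimal sketch of a Hive-enabled session once "master" resolves (pair it with a hosts entry of the form "<namenode-ip>  master"). Two things in the posted code are worth noting: the config key is misspelled ("spaek.sql.warrhouse.dir" instead of "spark.sql.warehouse.dir") and the HDFS URI is missing the "//", so that setting is ignored and Spark falls back to the hive-site.xml on the classpath; and the stack trace goes through InMemoryCatalog while the log warns "Using an existing SparkSession", so the second builder reuses the first, non-Hive session and enableHiveSupport() never takes effect. The sketch below assumes the NameNode is reachable as master:8020 (hostname from the exception, port from the posted config); substitute your own values.

    // Sketch only: build a single Hive-enabled SparkSession up front,
    // instead of a plain session first and a Hive session second.
    import org.apache.spark.SparkConf
    import org.apache.spark.sql.SparkSession

    val conf = new SparkConf().setMaster("local[*]").setAppName("Electricity")
    val spark = SparkSession.builder()
      .config(conf)
      // correct key spelling and a full hdfs:// URI; "master" must resolve via the hosts file
      .config("spark.sql.warehouse.dir", "hdfs://master:8020/user/hive/warehouse")
      .enableHiveSupport()
      .getOrCreate()

    spark.sql("show databases").show()   // should now list the Hive databases
    spark.close()

The MySQL JDBC read can reuse this same session (spark.read.format("jdbc")...), so only one SparkSession is needed.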