21/10/23 11:12:17 INFO MetaStoreDirectSql: Using direct SQL, underlying DB is MYSQL
21/10/23 11:12:17 INFO ObjectStore: Initialized ObjectStore
21/10/23 11:12:18 ERROR ObjectStore: Version information found in metastore differs 1.2.0 from expected schema version 2.3.0. Schema verififcation is disabled hive.metastore.schema.verification
21/10/23 11:12:18 WARN ObjectStore: setMetaStoreSchemaVersion called but recording version is disabled: version = 2.3.0, comment = Set by MetaStore UNKNOWN@172.16.1.100
21/10/23 11:12:18 INFO HiveMetaStore: Added admin role in metastore
21/10/23 11:12:18 INFO HiveMetaStore: Added public role in metastore
21/10/23 11:12:18 INFO HiveMetaStore: No user is added in admin role, since config is empty
21/10/23 11:12:18 INFO HiveMetaStore: 0: get_all_functions
21/10/23 11:12:18 INFO audit: ugi=root ip=unknown-ip-addr cmd=get_all_functions
21/10/23 11:12:18 INFO HiveMetaStore: 0: get_database: default
21/10/23 11:12:18 INFO audit: ugi=root ip=unknown-ip-addr cmd=get_database: default
21/10/23 11:12:18 INFO HiveMetaStore: 0: get_database: global_temp
21/10/23 11:12:18 INFO audit: ugi=root ip=unknown-ip-addr cmd=get_database: global_temp
21/10/23 11:12:18 WARN ObjectStore: Failed to get database global_temp, returning NoSuchObjectException
21/10/23 11:12:18 INFO HiveMetaStore: 0: get_database: shi
21/10/23 11:12:18 INFO audit: ugi=root ip=unknown-ip-addr cmd=get_database: shi
21/10/23 11:12:18 INFO HiveMetaStore: 0: get_database: shi
21/10/23 11:12:18 INFO audit: ugi=root ip=unknown-ip-addr cmd=get_database: shi
21/10/23 11:12:18 INFO HiveMetaStore: 0: get_table : db=shi tbl=user_visit_action
21/10/23 11:12:18 INFO audit: ugi=root ip=unknown-ip-addr cmd=get_table : db=shi tbl=user_visit_action
21/10/23 11:12:20 INFO HiveMetaStore: 0: get_table : db=shi tbl=user_visit_action
21/10/23 11:12:20 INFO audit: ugi=root ip=unknown-ip-addr cmd=get_table : db=shi tbl=user_visit_action
21/10/23 11:12:22 INFO HiveMetaStore: 0: get_table : db=shi tbl=user_visit_action
21/10/23 11:12:22 INFO audit: ugi=root ip=unknown-ip-addr cmd=get_table : db=shi tbl=user_visit_action
21/10/23 11:12:24 INFO HiveMetaStore: 0: get_table : db=shi tbl=user_visit_action
21/10/23 11:12:24 INFO audit: ugi=root ip=unknown-ip-addr cmd=get_table : db=shi tbl=user_visit_action
21/10/23 11:12:26 INFO HiveMetaStore: 0: get_table : db=shi tbl=user_visit_action
21/10/23 11:12:26 INFO audit: ugi=root ip=unknown-ip-addr cmd=get_table : db=shi tbl=user_visit_action
21/10/23 11:12:28 INFO HiveMetaStore: 0: get_table : db=shi tbl=user_visit_action
21/10/23 11:12:28 INFO audit: ugi=root ip=unknown-ip-addr cmd=get_table : db=shi tbl=user_visit_action
21/10/23 11:12:30 INFO HiveMetaStore: 0: get_table : db=shi tbl=user_visit_action
21/10/23 11:12:30 INFO audit: ugi=root ip=unknown-ip-addr cmd=get_table : db=shi tbl=user_visit_action
21/10/23 11:12:32 INFO HiveMetaStore: 0: get_table : db=shi tbl=user_visit_action
21/10/23 11:12:32 INFO audit: ugi=root ip=unknown-ip-addr cmd=get_table : db=shi tbl=user_visit_action
21/10/23 11:12:34 INFO HiveMetaStore: 0: get_table : db=shi tbl=user_visit_action
21/10/23 11:12:34 INFO audit: ugi=root ip=unknown-ip-addr cmd=get_table : db=shi tbl=user_visit_action
21/10/23 11:12:36 INFO HiveMetaStore: 0: get_table : db=shi tbl=user_visit_action
21/10/23 11:12:36 INFO audit: ugi=root ip=unknown-ip-addr cmd=get_table : db=shi tbl=user_visit_action
21/10/23 11:12:38 INFO HiveMetaStore: 0: get_table : db=shi tbl=user_visit_action
21/10/23 11:12:38 INFO audit: ugi=root ip=unknown-ip-addr cmd=get_table : db=shi tbl=user_visit_action
Exception in thread "main" org.apache.spark.sql.AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: Unable to fetch table user_visit_action. Exception thrown when executing query : SELECT DISTINCT 'org.apache.hadoop.hive.metastore.model.MTable' AS `NUCLEUS_TYPE`,`A0`.`CREATE_TIME`,`A0`.`LAST_ACCESS_TIME`,`A0`.`OWNER`,`A0`.`RETENTION`,`A0`.`IS_REWRITE_ENABLED`,`A0`.`TBL_NAME`,`A0`.`TBL_TYPE`,`A0`.`TBL_ID` FROM `TBLS` `A0` LEFT OUTER JOIN `DBS` `B0` ON `A0`.`DB_ID` = `B0`.`DB_ID` WHERE `A0`.`TBL_NAME` = ? AND `B0`.`NAME` = ?;
at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:109)
at org.apache.spark.sql.hive.HiveExternalCatalog.tableExists(HiveExternalCatalog.scala:851)
at org.apache.spark.sql.catalyst.catalog.ExternalCatalogWithListener.tableExists(ExternalCatalogWithListener.scala:146)
at org.apache.spark.sql.catalyst.catalog.SessionCatalog.tableExists(SessionCatalog.scala:432)
at org.apache.spark.sql.catalyst.catalog.SessionCatalog.createTable(SessionCatalog.scala:319)
at org.apache.spark.sql.execution.command.CreateTableCommand.run(tables.scala:165)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:79)
at org.apache.spark.sql.Dataset.$anonfun$logicalPlan$1(Dataset.scala:229)
at org.apache.spark.sql.Dataset.$anonfun$withAction$1(Dataset.scala:3616)
at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$5(SQLExecution.scala:100)
at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:160)
at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:87)
at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:763)
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:64)
at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3614)
at org.apache.spark.sql.Dataset.<init>(Dataset.scala:229)
at org.apache.spark.sql.Dataset$.$anonfun$ofRows$2(Dataset.scala:100)
at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:763)
at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:97)
at org.apache.spark.sql.SparkSession.$anonfun$sql$1(SparkSession.scala:606)
at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:763)
at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:601)
at team_Dsj_Spark_core.sql.hive.Spark_cityproduser$.main(Spark_cityproduser.scala:19)
at team_Dsj_Spark_core.sql.hive.Spark_cityproduser.main(Spark_cityproduser.scala)
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Unable to fetch table user_visit_action. Exception thrown when executing query : SELECT DISTINCT 'org.apache.hadoop.hive.metastore.model.MTable' AS `NUCLEUS_TYPE`,`A0`.`CREATE_TIME`,`A0`.`LAST_ACCESS_TIME`,`A0`.`OWNER`,`A0`.`RETENTION`,`A0`.`IS_REWRITE_ENABLED`,`A0`.`TBL_NAME`,`A0`.`TBL_TYPE`,`A0`.`TBL_ID` FROM `TBLS` `A0` LEFT OUTER JOIN `DBS` `B0` ON `A0`.`DB_ID` = `B0`.`DB_ID` WHERE `A0`.`TBL_NAME` = ? AND `B0`.`NAME` = ?
at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:1282)
at org.apache.spark.sql.hive.client.HiveClientImpl.getRawTableOption(HiveClientImpl.scala:397)
at org.apache.spark.sql.hive.client.HiveClientImpl.$anonfun$tableExists$1(HiveClientImpl.scala:411)
at scala.runtime.java8.JFunction0$mcZ$sp.apply(JFunction0$mcZ$sp.java:23)
at org.apache.spark.sql.hive.client.HiveClientImpl.$anonfun$withHiveState$1(HiveClientImpl.scala:294)
at org.apache.spark.sql.hive.client.HiveClientImpl.liftedTree1$1(HiveClientImpl.scala:227)
at org.apache.spark.sql.hive.client.HiveClientImpl.retryLocked(HiveClientImpl.scala:226)
at org.apache.spark.sql.hive.client.HiveClientImpl.withHiveState(HiveClientImpl.scala:276)
at org.apache.spark.sql.hive.client.HiveClientImpl.tableExists(HiveClientImpl.scala:411)
at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$tableExists$1(HiveExternalCatalog.scala:851)
at scala.runtime.java8.JFunction0$mcZ$sp.apply(JFunction0$mcZ$sp.java:23)
at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:99)
... 25 more
Caused by: MetaException(message:Exception thrown when executing query : SELECT DISTINCT 'org.apache.hadoop.hive.metastore.model.MTable' AS `NUCLEUS_TYPE`,`A0`.`CREATE_TIME`,`A0`.`LAST_ACCESS_TIME`,`A0`.`OWNER`,`A0`.`RETENTION`,`A0`.`IS_REWRITE_ENABLED`,`A0`.`TBL_NAME`,`A0`.`TBL_TYPE`,`A0`.`TBL_ID` FROM `TBLS` `A0` LEFT OUTER JOIN `DBS` `B0` ON `A0`.`DB_ID` = `B0`.`DB_ID` WHERE `A0`.`TBL_NAME` = ? AND `B0`.`NAME` = ?)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invokeInternal(RetryingHMSHandler.java:211)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:107)
at com.sun.proxy.$Proxy22.get_table_req(Unknown Source)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getTable(HiveMetaStoreClient.java:1350)
at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.getTable(SessionHiveMetaStoreClient.java:127)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:173)
at com.sun.proxy.$Proxy23.getTable(Unknown Source)
at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:1274)
... 36 more
Caused by: javax.jdo.JDOException: Exception thrown when executing query : SELECT DISTINCT 'org.apache.hadoop.hive.metastore.model.MTable' AS `NUCLEUS_TYPE`,`A0`.`CREATE_TIME`,`A0`.`LAST_ACCESS_TIME`,`A0`.`OWNER`,`A0`.`RETENTION`,`A0`.`IS_REWRITE_ENABLED`,`A0`.`TBL_NAME`,`A0`.`TBL_TYPE`,`A0`.`TBL_ID` FROM `TBLS` `A0` LEFT OUTER JOIN `DBS` `B0` ON `A0`.`DB_ID` = `B0`.`DB_ID` WHERE `A0`.`TBL_NAME` = ? AND `B0`.`NAME` = ?
NestedThrowables:
com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Unknown column 'A0.IS_REWRITE_ENABLED' in 'field list'
at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:677)
at org.datanucleus.api.jdo.JDOQuery.executeInternal(JDOQuery.java:391)
at org.datanucleus.api.jdo.JDOQuery.execute(JDOQuery.java:241)
at org.apache.hadoop.hive.metastore.ObjectStore.getMTable(ObjectStore.java:1350)
at org.apache.hadoop.hive.metastore.ObjectStore.getMTable(ObjectStore.java:1367)
at org.apache.hadoop.hive.metastore.ObjectStore.getTable(ObjectStore.java:1158)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:101)
at com.sun.proxy.$Proxy21.getTable(Unknown Source)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_table_core(HiveMetaStore.java:2030)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getTableInternal(HiveMetaStore.java:1984)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_table_req(HiveMetaStore.java:1969)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invokeInternal(RetryingHMSHandler.java:148)
... 47 more
Caused by: com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Unknown column 'A0.IS_REWRITE_ENABLED' in 'field list'
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at com.mysql.jdbc.Util.handleNewInstance(Util.java:425)
at com.mysql.jdbc.Util.getInstance(Util.java:408)
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:943)
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3973)
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3909)
at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:2527)
at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2680)
at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2487)
at com.mysql.jdbc.PreparedStatement.executeInternal(PreparedStatement.java:1858)
at com.mysql.jdbc.PreparedStatement.executeQuery(PreparedStatement.java:1966)
at com.jolbox.bonecp.PreparedStatementHandle.executeQuery(PreparedStatementHandle.java:174)
at org.datanucleus.store.rdbms.ParamLoggingPreparedStatement.executeQuery(ParamLoggingPreparedStatement.java:375)
at org.datanucleus.store.rdbms.SQLController.executeStatementQuery(SQLController.java:552)
at org.datanucleus.store.rdbms.query.JDOQLQuery.performExecute(JDOQLQuery.java:617)
at org.datanucleus.store.query.Query.executeQuery(Query.java:1855)
at org.datanucleus.store.query.Query.executeWithArray(Query.java:1744)
at org.datanucleus.api.jdo.JDOQuery.executeInternal(JDOQuery.java:368)
... 65 more
21/10/23 11:12:38 INFO SparkContext: Invoking stop() from shutdown hook
21/10/23 11:12:38 INFO SparkUI: Stopped Spark web UI at http://DESKTOP-5QEFLDC:4040
21/10/23 11:12:38 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
21/10/23 11:12:38 INFO MemoryStore: MemoryStore cleared
21/10/23 11:12:38 INFO BlockManager: BlockManager stopped
21/10/23 11:12:38 INFO BlockManagerMaster: BlockManagerMaster stopped
21/10/23 11:12:38 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
21/10/23 11:12:38 INFO SparkContext: Successfully stopped SparkContext
21/10/23 11:12:38 INFO ShutdownHookManager: Shutdown hook called
21/10/23 11:12:38 INFO ShutdownHookManager: Deleting directory C:\Users\DELL\AppData\Local\Temp\spark-a6ab97d2-02b1-48c1-8684-a7fd334e938a
After connecting to Hive I can read data, but I can't create tables or write data.
It might be an environment-setup issue, a dependency issue, or possibly a version mismatch on my side.
Does anyone understand this? Please help.
Check whether your INSERT SQL is correct. As for Hive, you need these two dependencies; for the MySQL dependency, just mind the difference between the 8.0 driver and the 5.6-or-earlier drivers:
<dependency>
    <groupId>org.apache.hive</groupId>
    <artifactId>hive-jdbc</artifactId>
    <version>1.2.1</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>2.9.2</version>
</dependency>
It's probably insufficient user permissions. Running Spark locally has a lot of pitfalls: it uses the login username of your current OS session to operate on HDFS, and that user usually has no rights on the warehouse directory.
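A common workaround for that is to override the HDFS user before the SparkSession is built. A hedged sketch, assuming the warehouse directory is owned by `root` (substitute whichever user actually owns it on your cluster):

```scala
import org.apache.spark.sql.SparkSession

object AsHdfsUser {
  def main(args: Array[String]): Unit = {
    // Must be set before the SparkSession (and its underlying Hadoop
    // FileSystem) is created, otherwise the OS login name is used.
    System.setProperty("HADOOP_USER_NAME", "root")

    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("as-hdfs-user")
      .enableHiveSupport()
      .getOrCreate()

    // ... run the CREATE TABLE / INSERT statements here ...

    spark.stop()
  }
}
```

Setting the environment variable `HADOOP_USER_NAME` in the run configuration has the same effect and avoids hard-coding the user in source.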