I've recently been learning how to set up a Hadoop platform, but when I get to the Hive component the schema initialization keeps failing. Could you help me figure out the cause?

[hadoop@master ~]$ schematool -initSchema -dbType mysql
which: no hbase in (/usr/local/src/hadoop/bin:/usr/local/src/hadoop/sbin:/usr/lib64/qt-3.3/bin:/usr/local/bin:/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/usr/local/src/jdk1.8.0_131/bin:/usr/local/src/hive/bin:/home/hadoop/.local/bin:/home/hadoop/bin)
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/src/hive/lib/hive-jdbc-2.0.0-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/src/hive/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/src/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Metastore connection URL: jdbc:mysql://master:3306/hive?createDatabaseIfNotExist=true&useSSL=false
Metastore Connection Driver : com.mysql.jdbc.Driver
Metastore connection User: root
org.apache.hadoop.hive.metastore.HiveMetaException: Failed to get schema version.
*** schemaTool failed ***
[hadoop@master ~]$ hive

You can watch the 尚硅谷 tutorials on Bilibili; they include an installation guide for Hive that you can follow step by step. Once Hive is installed, one suggestion: stick with the MR execution engine rather than Tez or Spark. MR is closer to what you are learning, and the EXPLAIN execution plan is easier to follow with it (see the sketch below).
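A minimal sketch of that suggestion (your_table is a placeholder for any existing table): switch the engine to MapReduce, then use EXPLAIN to inspect the MR stages of a query.

$ hive
hive> SET hive.execution.engine=mr;
hive> EXPLAIN SELECT count(*) FROM your_table;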

There are extra SLF4J bindings on the classpath; exclude the redundant ones.
First check your configuration/build files and search for

<groupId>org.slf4j</groupId>

If there are redundant entries, exclude them and keep only one.


Next, check whether the library directories contain duplicate SLF4J jars; if they do, remove the duplicates (see the sketch below).
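A minimal sketch of that check, using the installation paths reported in your log (the mv at the end only sets one duplicate aside as an example; decide for yourself which binding to keep, and back files up rather than deleting them):

$ ls /usr/local/src/hive/lib/ | grep -iE 'slf4j|log4j'                 # bindings bundled with Hive
$ ls /usr/local/src/hadoop/share/hadoop/common/lib/ | grep -i slf4j    # bindings bundled with Hadoop
$ mv /usr/local/src/hive/lib/log4j-slf4j-impl-2.4.1.jar \
     /usr/local/src/hive/lib/log4j-slf4j-impl-2.4.1.jar.bak            # set one duplicate aside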


I wrote a blog post about deploying Hive, personally verified; I hope it helps, and please accept the answer if it does!
Based on your error, check whether the MySQL JDBC driver has been copied over and whether the schema has been initialized.
Copy the MySQL driver jar into hive/lib, then run the following (a concrete sketch follows the link below):
$ ./bin/schematool -dbType mysql -initSchema --verbose
$ hive
For details, see the blog post I just wrote:
https://blog.csdn.net/qq_15604349/article/details/124343882
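A minimal sketch of the driver step, assuming the Connector/J jar has already been downloaded into the current directory and Hive is installed under /usr/local/src/hive (the path shown in your log); the jar name below is only an example, use the version you actually downloaded:

$ cp mysql-connector-java-5.1.47.jar /usr/local/src/hive/lib/
$ cd /usr/local/src/hive
$ ./bin/schematool -dbType mysql -initSchema --verbose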

This error comes up when initializing the Hive metastore; hive-site.xml must be configured correctly.
Here is my configuration for reference:

<configuration>

<property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://192.168.239.131:3306/hive?createDatabaseIfNotExist=true</value>
    <description>
      JDBC connect string for a JDBC metastore.
      To use SSL to encrypt/authenticate the connection, provide database-specific SSL flag in the connection URL.
      For example, jdbc:postgresql://myhost/db?ssl=true for postgres database.
    </description>
</property>

<property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>123456!a</value>
    <description>password to use against metastore database</description>
  </property>


<property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
    <description>Driver class name for a JDBC metastore</description>
  </property>


<property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>root</value>
    <description>Username to use against metastore database</description>
  </property>

<property>
    <name>hive.metastore.schema.verification</name>
    <value>false</value>
</property>

</configuration>
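Before rerunning schematool, it can also help to confirm that MySQL is reachable with exactly the credentials configured above; a minimal sketch using the sample values from this hive-site.xml (replace the host, user, and password with your own):

$ mysql -h 192.168.239.131 -u root -p'123456!a' -e "SHOW DATABASES LIKE 'hive';"
$ schematool -dbType mysql -initSchema

If the mysql command fails, fix the MySQL account or its grants before retrying schematool.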

If you have problems, see this blog post of mine; the steps are very detailed, just follow them one by one. I have deployed the entire big-data cluster this way. If you have more questions you can message me directly or check my technical blog:
https://blog.csdn.net/zhengzaifeidelushang/article/details/109774248


[Big data learning: errors encountered while setting up Hadoop, Spark, HBase, and Hive environments, and how to resolve them] https://minipro.baidu.com/ma/qrcode/parser?app_key=y1lpwNoOyVpW33XOPd72rzN4aUS43Y3O&launchid=32b194e0-8ed9-44f3-99ef-714f119b69bb&path=%2Fpages%2Fblog%2Findex%3FblogId%3D103578475%26_swebFromHost%3Dbaiduboxapp