HBase ImportTsv reports "File does not exist" when importing a CSV file

When importing a CSV file into HBase with the command below, the job fails with: Exception in thread "main" java.io.FileNotFoundException: File does not exist: hdfs://hadoop1:9000/opt/hbase-1.2.11/lib/protobuf-java-2.5.0.jar

hbase org.apache.hadoop.hbase.mapreduce.ImportTsv -Dimporttsv.separator="," -Dimporttsv.bulk.output=/hfile_tmp -Dimporttsv.columns=HBASE_ROW_KEY,cf tb1 simple.csv

Full error output:

2019-03-05 01:16:22,254 INFO  [main] Configuration.deprecation: io.bytes.per.checksum is deprecated. Instead, use dfs.bytes-per-checksum
2019-03-05 01:16:22,351 INFO  [main] client.ConnectionManager$HConnectionImplementation: Closing zookeeper sessionid=0x3694c492a720007
2019-03-05 01:16:22,357 INFO  [main] zookeeper.ZooKeeper: Session: 0x3694c492a720007 closed
2019-03-05 01:16:22,358 INFO  [main-EventThread] zookeeper.ClientCnxn: EventThread shut down
2019-03-05 01:16:22,401 INFO  [main] Configuration.deprecation: session.id is deprecated. Instead, use dfs.metrics.session-id
2019-03-05 01:16:22,402 INFO  [main] jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
2019-03-05 01:16:22,494 INFO  [main] Configuration.deprecation: io.bytes.per.checksum is deprecated. Instead, use dfs.bytes-per-checksum
2019-03-05 01:16:23,151 INFO  [main] mapreduce.JobSubmitter: Cleaning up the staging area file:/home/xinlei/hadoop/tmp/mapred/staging/xinlei1237697110/.staging/job_local1237697110_0001
Exception in thread "main" java.io.FileNotFoundException: File does not exist: hdfs://hadoop1:9000/opt/hbase-1.2.11/lib/protobuf-java-2.5.0.jar
        at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1072)
        at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1064)
        at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
        at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1064)
        at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:288)
        at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:224)
        at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestamps(ClientDistributedCacheManager.java:93)
        at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestampsAndCacheVisibilities(ClientDistributedCacheManager.java:57)
        at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:265)
        at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:301)
        at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:389)
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1285)
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1282)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
        at org.apache.hadoop.mapreduce.Job.submit(Job.java:1282)
        at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1303)
        at org.apache.hadoop.hbase.mapreduce.ImportTsv.run(ImportTsv.java:738)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
        at org.apache.hadoop.hbase.mapreduce.ImportTsv.main(ImportTsv.java:747)

The jar does exist on the local filesystem:

xinlei@hadoop1:~$ ll /opt/hbase-1.2.11/lib/protobuf-java-2.5.0.jar
-rw-rw-r-- 1 xinlei xinlei 533455 Sep  8 05:07 /opt/hbase-1.2.11/lib/protobuf-java-2.5.0.jar
xinlei@hadoop1:~$ 

I suspect the problem is with the path hdfs://hadoop1:9000/opt/hbase-1.2.11/lib/protobuf-java-2.5.0.jar itself. Could anyone take a look? Thanks!
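Note that ll only looks at the local disk, while the stack trace shows the job client resolving the jar path against HDFS (hdfs://hadoop1:9000). A quick way to see the mismatch, assuming the same paths as above:

ls -l /opt/hbase-1.2.11/lib/protobuf-java-2.5.0.jar          # local filesystem: file is present
hdfs dfs -ls /opt/hbase-1.2.11/lib/protobuf-java-2.5.0.jar   # HDFS: expected to report "No such file or directory"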

Have you solved this yet? I'm running into the same problem.

Has this been solved?


Upload the missing jar to HDFS so that hdfs://hadoop1:9000/opt/hbase-1.2.11/lib/protobuf-java-2.5.0.jar actually exists. The easiest fix is to upload all the jars under /opt/hbase-1.2.11/lib/:

hdfs dfs -mkdir -p /opt/hbase-1.2.11/lib/
hdfs dfs -put /opt/hbase-1.2.11/lib/*.jar /opt/hbase-1.2.11/lib/
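Once the jars are on HDFS, re-running the original command should get past the FileNotFoundException. A rough end-to-end sketch, reusing the table tb1 and the paths from the question (simple.csv must itself be readable from HDFS, and since -Dimporttsv.bulk.output only writes HFiles, they still have to be bulk-loaded into the table afterwards):

# re-run the import after the lib jars exist on HDFS
hbase org.apache.hadoop.hbase.mapreduce.ImportTsv -Dimporttsv.separator="," -Dimporttsv.bulk.output=/hfile_tmp -Dimporttsv.columns=HBASE_ROW_KEY,cf tb1 simple.csv

# load the generated HFiles in /hfile_tmp into tb1
hbase org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles /hfile_tmp tb1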