Sqoop error: ERROR tool.BaseSqoopTool: Error parsing arguments for export:

I want to use Sqoop to export data from Hive into MySQL.

hadoop@dblab-VirtualBox:/usr/local/sqoop$ ./bin/sqoop export --connect jdbc:mysql://localhost:3306/dbjd --username root -p --table tv --export-dir '/user/hive/warehouse/dbjd.db/tv' --fields-terminated-by '\t';

Warning: /usr/local/sqoop/../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /usr/local/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
Warning: /usr/local/sqoop/../zookeeper does not exist! Accumulo imports will fail.
Please set $ZOOKEEPER_HOME to the root of your Zookeeper installation.
21/12/19 16:29:13 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6
21/12/19 16:29:14 ERROR tool.BaseSqoopTool: Error parsing arguments for export:
21/12/19 16:29:14 ERROR tool.BaseSqoopTool: Unrecognized argument: -p
21/12/19 16:29:14 ERROR tool.BaseSqoopTool: Unrecognized argument: --table
21/12/19 16:29:14 ERROR tool.BaseSqoopTool: Unrecognized argument: tv
21/12/19 16:29:14 ERROR tool.BaseSqoopTool: Unrecognized argument: --export-dir
21/12/19 16:29:14 ERROR tool.BaseSqoopTool: Unrecognized argument: /user/hive/warehouse/dbjd.db/tv
21/12/19 16:29:14 ERROR tool.BaseSqoopTool: Unrecognized argument: --fields-terminated-by
21/12/19 16:29:14 ERROR tool.BaseSqoopTool: Unrecognized argument: \t

Try --help for usage instructions.

In the command ./bin/sqoop export --connect jdbc:mysql://localhost:3306/dbjd --username root -p --table tv --export-dir '/user/hive/warehouse/dbjd.db/tv' --fields-terminated-by '\t', the -p needs to be changed to -P. Sqoop's parser does not recognize lowercase -p, so it reports it and every argument after it as unrecognized; the uppercase -P option prompts for the database password on the console.
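For reference, this is the corrected command with only the password flag changed:

./bin/sqoop export --connect jdbc:mysql://localhost:3306/dbjd --username root -P --table tv --export-dir '/user/hive/warehouse/dbjd.db/tv' --fields-terminated-by '\t'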
But after making that change, a new problem appeared:
Warning: /usr/local/sqoop/../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /usr/local/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
Warning: /usr/local/sqoop/../zookeeper does not exist! Accumulo imports will fail.
Please set $ZOOKEEPER_HOME to the root of your Zookeeper installation.
21/12/19 16:51:03 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6
Enter password:
21/12/19 16:51:08 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
21/12/19 16:51:08 INFO tool.CodeGenTool: Beginning code generation
Sun Dec 19 16:51:09 CST 2021 WARN: Establishing SSL connection without server's identity verification is not recommended. According to MySQL 5.5.45+, 5.6.26+ and 5.7.6+ requirements SSL connection must be established by default if explicit option isn't set. For compliance with existing applications not using SSL the verifyServerCertificate property is set to 'false'. You need either to explicitly disable SSL by setting useSSL=false, or set useSSL=true and provide truststore for server certificate verification.
21/12/19 16:51:10 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM tv AS t LIMIT 1
21/12/19 16:51:10 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM tv AS t LIMIT 1
21/12/19 16:51:10 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/local/hadoop
Note: /tmp/sqoop-hadoop/compile/67bb807a55e4af3c8969d6a704b27c7e/tv.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
21/12/19 16:51:16 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hadoop/compile/67bb807a55e4af3c8969d6a704b27c7e/tv.jar
21/12/19 16:51:16 INFO mapreduce.ExportJobBase: Beginning export of tv
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/hbase/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
21/12/19 16:51:17 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
21/12/19 16:51:19 WARN mapreduce.ExportJobBase: Input path hdfs://localhost:9000/user/hive/warehouse/dbjd.db/tv does not exist
21/12/19 16:51:19 INFO Configuration.deprecation: mapred.reduce.tasks.speculative.execution is deprecated. Instead, use mapreduce.reduce.speculative
21/12/19 16:51:19 INFO Configuration.deprecation: mapred.map.tasks.speculative.execution is deprecated. Instead, use mapreduce.map.speculative
21/12/19 16:51:19 INFO Configuration.deprecation: mapred.job.tracker is deprecated. Instead, use mapreduce.jobtracker.address
21/12/19 16:51:19 INFO Configuration.deprecation: session.id is deprecated. Instead, use dfs.metrics.session-id
21/12/19 16:51:19 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
21/12/19 16:51:21 INFO mapreduce.JobSubmitter: Cleaning up the staging area file:/usr/local/hadoop/tmp/mapred/staging/hadoop1941714596/.staging/job_local1941714596_0001
21/12/19 16:51:21 ERROR tool.ExportTool: Encountered IOException running export job: org.apache.hadoop.mapreduce.lib.input.InvalidInputException: Input path does not exist: hdfs://localhost:9000/user/hive/warehouse/dbjd.db/tv
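
The job now fails because the export directory hdfs://localhost:9000/user/hive/warehouse/dbjd.db/tv does not exist in HDFS. A likely next step (not confirmed in this thread) is to check where the Hive table's data actually lives and point --export-dir at that path, for example:

hdfs dfs -ls /user/hive/warehouse/dbjd.db/tv
hdfs dfs -ls /user/hive/warehouse/dbjd.db

If the tv table was created in a different database or with a custom LOCATION, running DESCRIBE FORMATTED tv; in the Hive CLI will show the actual Location value to pass to --export-dir.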