Using Sqoop to connect to MySQL and import a table into HDFS: the import fails with an error.
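The exact command is not included in the post, but the log below would typically come from an invocation along these lines. This is a hedged reconstruction: the table name `user_info`, the split column `id`, and the user `zhangyu` are taken from the log; the JDBC host, database name, username, and target directory are placeholders, not facts from the post. The `-P` flag (prompt for password) is the safer alternative that the warning at 02:27:50 recommends over `--password` on the command line.

```shell
# Hypothetical reconstruction of the failing import command.
# Host, database, username, and target-dir are assumptions.
sqoop import \
  --connect jdbc:mysql://master:3306/testdb \
  --username root \
  -P \
  --table user_info \
  --split-by id \
  --target-dir /user/zhangyu/user_info
```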

Warning: /opt/bigdata/sqoop/../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /opt/bigdata/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
21/05/24 02:27:50 INFO sqoop.Sqoop: Running Sqoop version: 1.4.7
21/05/24 02:27:50 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
21/05/24 02:27:50 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
21/05/24 02:27:50 INFO tool.CodeGenTool: Beginning code generation
21/05/24 02:27:51 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `user_info` AS t LIMIT 1
21/05/24 02:27:51 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `user_info` AS t LIMIT 1
21/05/24 02:27:51 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /opt/bigdata/hadoop
Note: /tmp/sqoop-zhangyu/compile/5dbd4960514747a4a548ab01432eaad9/user_info.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
21/05/24 02:27:52 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-zhangyu/compile/5dbd4960514747a4a548ab01432eaad9/user_info.jar
21/05/24 02:27:52 WARN manager.MySQLManager: It looks like you are importing from mysql.
21/05/24 02:27:52 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
21/05/24 02:27:52 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
21/05/24 02:27:52 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
21/05/24 02:27:52 INFO mapreduce.ImportJobBase: Beginning import of user_info
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/bigdata/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/bigdata/hbase/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
21/05/24 02:27:52 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
21/05/24 02:27:53 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
21/05/24 02:27:53 INFO client.RMProxy: Connecting to ResourceManager at master/172.100.3.160:8032
21/05/24 02:27:56 INFO db.DBInputFormat: Using read commited transaction isolation
21/05/24 02:27:56 INFO db.DataDrivenDBInputFormat: BoundingValsQuery: SELECT MIN(`id`), MAX(`id`) FROM `user_info`
21/05/24 02:27:56 INFO db.IntegerSplitter: Split size: 0; Num splits: 4 from: 1 to: 2
21/05/24 02:27:56 INFO mapreduce.JobSubmitter: number of splits:2
21/05/24 02:27:56 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1621823245599_0001
21/05/24 02:27:57 INFO impl.YarnClientImpl: Submitted application application_1621823245599_0001
21/05/24 02:27:57 INFO mapreduce.Job: The url to track the job: http://master:8088/proxy/application_1621823245599_0001/
21/05/24 02:27:57 INFO mapreduce.Job: Running job: job_1621823245599_0001
21/05/24 02:28:05 INFO mapreduce.Job: Job job_1621823245599_0001 running in uber mode : false
21/05/24 02:28:05 INFO mapreduce.Job:  map 0% reduce 0%
21/05/24 02:28:05 INFO mapreduce.Job: Job job_1621823245599_0001 failed with state FAILED due to: Application application_1621823245599_0001 failed 2 times due to AM Container for appattempt_1621823245599_0001_000002 exited with  exitCode: 127
For more detailed output, check application tracking page: http://master:8088/cluster/app/application_1621823245599_0001 Then, click on links to logs of each attempt.
Diagnostics: Exception from container-launch.
Container id: container_1621823245599_0001_02_000001
Exit code: 127
Stack trace: ExitCodeException exitCode=127: 
	at org.apache.hadoop.util.Shell.runCommand(Shell.java:585)
	at org.apache.hadoop.util.Shell.run(Shell.java:482)
	at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:776)
	at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:212)
	at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:302)
	at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:82)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)


Container exited with a non-zero exit code 127
Failing this attempt. Failing the application.
21/05/24 02:28:05 INFO mapreduce.Job: Counters: 0
21/05/24 02:28:05 WARN mapreduce.Counters: Group FileSystemCounters is deprecated. Use org.apache.hadoop.mapreduce.FileSystemCounter instead
21/05/24 02:28:05 INFO mapreduce.ImportJobBase: Transferred 0 bytes in 12.3882 seconds (0 bytes/sec)
21/05/24 02:28:05 WARN mapreduce.Counters: Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
21/05/24 02:28:05 INFO mapreduce.ImportJobBase: Retrieved 0 records.
21/05/24 02:28:05 ERROR tool.ImportTool: Import failed: Import job failed!
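For context on reading this failure: the MapReduce job itself never ran; the ApplicationMaster container exited with code 127 during launch. In a shell, exit code 127 conventionally means "command not found", which in a YARN container most often points to `java` not being resolvable on the NodeManager (e.g. `JAVA_HOME` not set in `hadoop-env.sh` on the worker nodes). A way to check, hedged accordingly: the application id below comes from the log above, while the Hadoop config path and the `JAVA_HOME` value are examples that must be adapted to the actual installation.

```shell
# Fetch the AM container logs to see which command failed
# (exit 127 = "command not found" in the container launch shell):
yarn logs -applicationId application_1621823245599_0001

# If stderr shows something like "java: command not found", set
# JAVA_HOME explicitly on every NodeManager host (path is an example),
# then restart YARN so the NodeManagers pick it up:
echo 'export JAVA_HOME=/usr/lib/jvm/java-8-openjdk' \
  >> /opt/bigdata/hadoop/etc/hadoop/hadoop-env.sh
```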

 
