sqoop:000> start job --name job_1
Submission details
Job Name: job_1
Server URL: http://localhost:12000/sqoop/
Created by: hadoop
Creation date: 2020-07-06 17:25:55 CST
Lastly updated by: hadoop
External ID: job_1594023705104_0003
http://server1:8088/proxy/application_1594023705104_0003/
2020-07-06 17:25:55 CST: BOOTING - Progress is not available
sqoop:000> status job --name job_1
Submission details
Job Name: job_1
Server URL: http://localhost:12000/sqoop/
Created by: hadoop
Creation date: 2020-07-06 17:25:55 CST
Lastly updated by: hadoop
External ID: job_1594023705104_0003
http://server1:8088/proxy/application_1594023705104_0003/
2020-07-06 17:27:36 CST: FAILED
Exception: Job Failed with status:3
Log:
2020-07-06 17:25:39,129 INFO [org.apache.sqoop.handler.JobRequestHandler.handleEvent(JobRequestHandler.java:96)] Got job request
2020-07-06 17:25:39,133 INFO [org.apache.sqoop.audit.FileAuditLogger.logAuditEvent(FileAuditLogger.java:61)] user=hadoop ip=127.0.0.1 op=get obj=job objId=all
2020-07-06 17:25:39,145 INFO [org.apache.sqoop.repository.JdbcRepositoryTransaction.close(JdbcRepositoryTransaction.java:111)] Attempting transaction commit
2020-07-06 17:25:39,159 INFO [org.apache.sqoop.repository.JdbcRepositoryTransaction.close(JdbcRepositoryTransaction.java:111)] Attempting transaction commit
2020-07-06 17:25:43,692 INFO [org.apache.sqoop.audit.FileAuditLogger.logAuditEvent(FileAuditLogger.java:61)] user=hadoop ip=127.0.0.1 op=get obj=link objId=all
2020-07-06 17:25:43,714 INFO [org.apache.sqoop.repository.JdbcRepositoryTransaction.close(JdbcRepositoryTransaction.java:111)] Attempting transaction commit
2020-07-06 17:25:43,721 INFO [org.apache.sqoop.repository.JdbcRepositoryTransaction.close(JdbcRepositoryTransaction.java:111)] Attempting transaction commit
2020-07-06 17:25:43,723 INFO [org.apache.sqoop.repository.JdbcRepositoryTransaction.close(JdbcRepositoryTransaction.java:111)] Attempting transaction commit
2020-07-06 17:25:55,673 INFO [org.apache.sqoop.handler.JobRequestHandler.handleEvent(JobRequestHandler.java:96)] Got job request
2020-07-06 17:25:55,693 INFO [org.apache.sqoop.repository.JdbcRepositoryTransaction.close(JdbcRepositoryTransaction.java:111)] Attempting transaction commit
2020-07-06 17:25:55,725 INFO [org.apache.sqoop.repository.JdbcRepositoryTransaction.close(JdbcRepositoryTransaction.java:111)] Attempting transaction commit
2020-07-06 17:25:55,726 INFO [org.apache.sqoop.audit.FileAuditLogger.logAuditEvent(FileAuditLogger.java:61)] user=hadoop ip=127.0.0.1 op=submit obj=job objId=job_1
2020-07-06 17:25:55,734 INFO [org.apache.sqoop.repository.JdbcRepositoryTransaction.close(JdbcRepositoryTransaction.java:111)] Attempting transaction commit
2020-07-06 17:25:55,739 INFO [org.apache.sqoop.repository.JdbcRepositoryTransaction.close(JdbcRepositoryTransaction.java:111)] Attempting transaction commit
2020-07-06 17:25:55,741 INFO [org.apache.sqoop.repository.JdbcRepositoryTransaction.close(JdbcRepositoryTransaction.java:111)] Attempting transaction commit
2020-07-06 17:25:55,831 INFO [org.apache.sqoop.connector.jdbc.GenericJdbcFromInitializer.configurePartitionProperties(GenericJdbcFromInitializer.java:137)] Found primary key columns [id]
2020-07-06 17:25:55,831 INFO [org.apache.sqoop.connector.jdbc.GenericJdbcFromInitializer.configurePartitionProperties(GenericJdbcFromInitializer.java:152)] Using partition column: id
2020-07-06 17:25:55,833 INFO [org.apache.sqoop.connector.jdbc.GenericJdbcFromInitializer.configurePartitionProperties(GenericJdbcFromInitializer.java:238)] Using min/max query: SELECT MIN( id ), MAX( id ) FROM test1 . student
2020-07-06 17:25:55,836 INFO [org.apache.sqoop.connector.jdbc.GenericJdbcFromInitializer.configurePartitionProperties(GenericJdbcFromInitializer.java:291)] Boundaries for the job: min=1001, max=1010, columnType=4
2020-07-06 17:25:55,838 INFO [org.apache.sqoop.connector.jdbc.GenericJdbcFromInitializer.configureTableProperties(GenericJdbcFromInitializer.java:353)] Using dataSql: SELECT * FROM test1 . student WHERE ${CONDITIONS}
2020-07-06 17:25:55,838 INFO [org.apache.sqoop.connector.jdbc.GenericJdbcFromInitializer.configureTableProperties(GenericJdbcFromInitializer.java:354)] Field names: id , name , classid
2020-07-06 17:25:55,854 INFO [org.apache.sqoop.connector.hdfs.HdfsUtils.createConfiguration(HdfsUtils.java:66)] Found Hadoop configuration file httpfs-site.xml
2020-07-06 17:25:55,857 INFO [org.apache.sqoop.connector.hdfs.HdfsUtils.createConfiguration(HdfsUtils.java:66)] Found Hadoop configuration file kms-site.xml
2020-07-06 17:25:55,857 INFO [org.apache.sqoop.connector.hdfs.HdfsUtils.createConfiguration(HdfsUtils.java:66)] Found Hadoop configuration file hdfs-site.xml
2020-07-06 17:25:55,857 INFO [org.apache.sqoop.connector.hdfs.HdfsUtils.createConfiguration(HdfsUtils.java:66)] Found Hadoop configuration file yarn-site.xml
2020-07-06 17:25:55,857 INFO [org.apache.sqoop.connector.hdfs.HdfsUtils.createConfiguration(HdfsUtils.java:66)] Found Hadoop configuration file core-site.xml
2020-07-06 17:25:55,858 INFO [org.apache.sqoop.connector.hdfs.HdfsUtils.createConfiguration(HdfsUtils.java:66)] Found Hadoop configuration file mapred-site.xml
2020-07-06 17:25:55,914 INFO [org.apache.hadoop.conf.Configuration.logDeprecation(Configuration.java:1395)] fs.default.name is deprecated. Instead, use fs.defaultFS
2020-07-06 17:25:55,968 INFO [org.apache.sqoop.connector.hadoop.security.SecurityUtils.generateDelegationTokens(SecurityUtils.java:75)] Running on unsecured cluster, skipping delegation token generation.
2020-07-06 17:25:55,971 INFO [org.apache.sqoop.connector.hdfs.HdfsToInitializer.initialize(HdfsToInitializer.java:90)] Using working directory: /sqoop_test/.7bf8856a-24ff-4ab8-a0f0-4962f47a9b86
2020-07-06 17:25:56,197 INFO [org.apache.sqoop.repository.JdbcRepositoryTransaction.close(JdbcRepositoryTransaction.java:111)] Attempting transaction commit
2020-07-06 17:25:56,351 INFO [org.apache.hadoop.yarn.client.RMProxy.newProxyInstance(RMProxy.java:133)] Connecting to ResourceManager at server1/172.16.160.128:8032
2020-07-06 17:25:56,381 WARN [org.apache.hadoop.mapreduce.JobResourceUploader.uploadResourcesInternal(JobResourceUploader.java:149)] Hadoop command-line option parsing not performed. Implement the Tool interface and execute your application with ToolRunner to remedy this.
2020-07-06 17:25:56,396 INFO [org.apache.hadoop.mapreduce.JobResourceUploader.disableErasureCodingForPath(JobResourceUploader.java:906)] Disabling Erasure Coding for path: /tmp/hadoop-yarn/staging/hadoop/.staging/job_1594023705104_0003
2020-07-06 17:25:56,479 INFO [org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.checkTrustAndSend(SaslDataTransferClient.java:239)] SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-07-06 17:25:56,945 INFO [org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.checkTrustAndSend(SaslDataTransferClient.java:239)] SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-07-06 17:25:56,982 INFO [org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.checkTrustAndSend(SaslDataTransferClient.java:239)] SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-07-06 17:25:57,432 INFO [org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.checkTrustAndSend(SaslDataTransferClient.java:239)] SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-07-06 17:25:57,905 INFO [org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.checkTrustAndSend(SaslDataTransferClient.java:239)] SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-07-06 17:25:58,377 INFO [org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.checkTrustAndSend(SaslDataTransferClient.java:239)] SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-07-06 17:25:58,418 INFO [org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.checkTrustAndSend(SaslDataTransferClient.java:239)] SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-07-06 17:26:00,565 INFO [org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.checkTrustAndSend(SaslDataTransferClient.java:239)] SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-07-06 17:26:00,902 INFO [org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.checkTrustAndSend(SaslDataTransferClient.java:239)] SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-07-06 17:26:01,368 INFO [org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.checkTrustAndSend(SaslDataTransferClient.java:239)] SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-07-06 17:26:01,397 INFO [org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.checkTrustAndSend(SaslDataTransferClient.java:239)] SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-07-06 17:26:01,830 WARN [org.apache.hadoop.mapreduce.JobResourceUploader.uploadJobJar(JobResourceUploader.java:482)] No job jar file set. User classes may not be found. See Job or Job#setJar(String).
2020-07-06 17:26:01,861 INFO [org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.checkTrustAndSend(SaslDataTransferClient.java:239)] SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-07-06 17:26:01,879 INFO [org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.checkTrustAndSend(SaslDataTransferClient.java:239)] SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-07-06 17:26:02,302 INFO [org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:202)] number of splits:9
2020-07-06 17:26:02,359 INFO [org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.checkTrustAndSend(SaslDataTransferClient.java:239)] SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-07-06 17:26:02,843 INFO [org.apache.hadoop.mapreduce.JobSubmitter.printTokens(JobSubmitter.java:298)] Submitting tokens for job: job_1594023705104_0003
2020-07-06 17:26:02,843 INFO [org.apache.hadoop.mapreduce.JobSubmitter.printTokens(JobSubmitter.java:299)] Executing with tokens: []
2020-07-06 17:26:02,855 INFO [org.apache.hadoop.mapred.YARNRunner.setupLocalResources(YARNRunner.java:426)] Job jar is not present. Not adding any jar to the list of resources.
2020-07-06 17:26:03,220 INFO [org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.submitApplication(YarnClientImpl.java:329)] Submitted application application_1594023705104_0003
2020-07-06 17:26:03,243 INFO [org.apache.hadoop.mapreduce.Job.submit(Job.java:1574)] The url to track the job: http://server1:8088/proxy/application_1594023705104_0003/
2020-07-06 17:26:03,888 INFO [org.apache.sqoop.repository.JdbcRepositoryTransaction.close(JdbcRepositoryTransaction.java:111)] Attempting transaction commit
2020-07-06 17:27:35,957 INFO [org.apache.sqoop.repository.JdbcRepositoryTransaction.close(JdbcRepositoryTransaction.java:111)] Attempting transaction commit
2020-07-06 17:27:36,045 INFO [org.apache.sqoop.repository.JdbcRepositoryTransaction.close(JdbcRepositoryTransaction.java:111)] Attempting transaction commit
2020-07-06 17:30:14,020 INFO [org.apache.sqoop.handler.JobRequestHandler.handleEvent(JobRequestHandler.java:96)] Got job request
2020-07-06 17:30:14,062 INFO [org.apache.sqoop.repository.JdbcRepositoryTransaction.close(JdbcRepositoryTransaction.java:111)] Attempting transaction commit
2020-07-06 17:30:14,064 INFO [org.apache.sqoop.repository.JdbcRepositoryTransaction.close(JdbcRepositoryTransaction.java:111)] Attempting transaction commit
2020-07-06 17:30:14,064 INFO [org.apache.sqoop.audit.FileAuditLogger.logAuditEvent(FileAuditLogger.java:61)] user=hadoop ip=127.0.0.1 op=status obj=job objId=job_1
2020-07-06 17:30:14,080 INFO [org.apache.sqoop.repository.JdbcRepositoryTransaction.close(JdbcRepositoryTransaction.java:111)] Attempting transaction commit
2020-07-06 17:32:36,052 INFO [org.apache.sqoop.repository.JdbcRepositoryTransaction.close(JdbcRepositoryTransaction.java:111)] Attempting transaction commit
No error is reported in the log, only a few WARNs. I have searched Baidu for all of them but still have no clue. Any help would be appreciated.
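For more detail than "Job Failed with status:3", the container logs of the failed application can be pulled straight from YARN. A minimal sketch, assuming the yarn CLI is available on the Sqoop server host and log aggregation is enabled (the application ID is the External ID shown in the output above):

# Pull the aggregated container logs for the failed MapReduce application
yarn logs -applicationId application_1594023705104_0003

# Or just the final status and diagnostics reported by the ResourceManager
yarn application -status application_1594023705104_0003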
https://blog.csdn.net/weixin_33982670/article/details/92033816
Go through the linked post item by item against your setup, paying particular attention to network ports and the firewall.
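A rough sketch of that kind of check, assuming a CentOS-style node running firewalld; the host name and ports below are taken from the output above (12000 = Sqoop server, 8032/8088 = ResourceManager), adjust them to your cluster:

# Are the relevant services actually listening?
ss -tlnp | grep -E ':12000|:8032|:8088'

# Is the firewall up, and which ports/services does it allow?
systemctl status firewalld
firewall-cmd --list-all

# Open a blocked port, or (test environments only) stop the firewall entirely:
# firewall-cmd --permanent --add-port=8032/tcp && firewall-cmd --reload
# systemctl stop firewalld

# Can the node running the Sqoop server reach the ResourceManager?
curl -s http://server1:8088/ws/v1/cluster/info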