A mapred job fails with an ExitCodeException exitCode=255 error. Has anyone run into this? Any pointers would be appreciated.

15/03/10 13:54:37 INFO mapreduce.Job: Task Id : attempt_1424832738800_0060_m_000002_2, Status : FAILED
Exception from container-launch.
Container id: container_1424832738800_0060_01_000018
Exit code: 255
Stack trace: ExitCodeException exitCode=255:
at org.apache.hadoop.util.Shell.runCommand(Shell.java:538)
at org.apache.hadoop.util.Shell.run(Shell.java:455)
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:702)
at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:196)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:299)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:81)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)

Container exited with a non-zero exit code 255

Same here — "Container exited with a non-zero exit code 255". Also looking for an answer.

I ran into this problem too. Later I found the following in the YARN container logs:
java.lang.NoSuchMethodError: org.apache.hadoop.fs.FSOutputSummer.&lt;init&gt;(Ljava/util/zip/Checksum;II)V
at org.apache.hadoop.hdfs.DFSOutputStream.&lt;init&gt;(DFSOutputStream.java:1563)
at org.apache.hadoop.hdfs.DFSOutputStream.&lt;init&gt;(DFSOutputStream.java:1594)
at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:1626)
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1488)
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1413)
at org.apache.hadoop.hdfs.DistributedFileSystem$6.doCall(DistributedFileSystem.java:387)
at org.apache.hadoop.hdfs.DistributedFileSystem$6.doCall(DistributedFileSystem.java:383)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:383)
at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:327)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:908)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:889)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:786)
at org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler.createEventWriter(JobHistoryEventHandler.java:379)
at org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler.setupEventWriter(JobHistoryEventHandler.java:419)
at org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler.handleEvent(JobHistoryEventHandler.java:504)
at org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler$1.run(JobHistoryEventHandler.java:280)
at java.lang.Thread.run(Thread.java:745)
2016-07-15 11:47:11,373 INFO [eventHandlingThread] org.apache.hadoop.util.ExitUtil: Exiting with status -1

The cause was a missing/conflicting dependency. The MapReduce job needed the HBase jars, so I copied all of HBase's jars into $HADOOP_COMMON_HOME/share/hadoop/common/*, one of the directories listed in yarn.application.classpath. But the HBase distribution bundles hadoop-common-2.5.1.jar, while my Hadoop installation ships hadoop-common-2.6.4.jar, so two versions of hadoop-common ended up on the container classpath and conflicted (hence the NoSuchMethodError above, since the FSOutputSummer constructor signature differs between the two versions). Deleting hadoop-common-2.6.4.jar resolved it, leaving only one version on the classpath.
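To spot this kind of clash before it bites, you can scan the classpath directories for duplicate hadoop-common versions. A minimal sketch (the helper name and the example directories are mine, not from the thread — substitute whatever directories your yarn.application.classpath actually lists):

```shell
# find_hadoop_common_jars: print the basename of every hadoop-common-*.jar
# under the given directories, deduplicated, so version clashes stand out.
# More than one line of output means two versions are on the classpath.
find_hadoop_common_jars() {
  for dir in "$@"; do
    find "$dir" -name 'hadoop-common-*.jar' 2>/dev/null
  done | sed 's|.*/||' | sort -u
}

# Example invocation (directories are assumptions about a typical layout):
# find_hadoop_common_jars "$HADOOP_COMMON_HOME/share/hadoop/common" "$HBASE_HOME/lib"
```

If the function prints both hadoop-common-2.5.1.jar and hadoop-common-2.6.4.jar, you have the conflict described above and should remove one of them.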