Error when reading an .lzo_deflate file from HDFS with Java

The Linux environment is fine, and the Hadoop installation and configuration are fine as well: the hadoop fs -text command opens the compressed file without any problem. But reading it from Java throws an error. Could someone please take a look? Thanks.

The code:
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.compress.CompressionCodec;
import org.apache.hadoop.io.compress.CompressionCodecFactory;
import org.apache.hadoop.io.compress.CompressionInputStream;

public class FileDecompressor {

    public static void main(String[] args) {
        String uri = "/daas/****/MBLDPI3G.2016081823_10.1471532401822.lzo_deflate";
        Configuration conf = new Configuration();
        String path = "/software/servers/hadoop-2.6.3-bin/hadoop-2.6.3/etc/hadoop/";
        conf.addResource(new Path(path + "core-site.xml"));
        conf.addResource(new Path(path + "hdfs-site.xml"));
        conf.addResource(new Path(path + "mapred-site.xml"));
        try {
            // pick the codec from the file extension
            CompressionCodecFactory factory = new CompressionCodecFactory(conf);
            CompressionCodec codec = factory.getCodec(new Path(uri));
            if (codec == null) {
                System.out.println("Codec for " + uri + " not found.");
            } else {
                CompressionInputStream in = null;
                try {
                    // open the file and wrap it with the decompressing stream
                    in = codec.createInputStream(new java.io.FileInputStream(uri));
                    byte[] buffer = new byte[100];
                    int len = in.read(buffer);
                    while (len > 0) {
                        System.out.write(buffer, 0, len);
                        len = in.read(buffer);
                    }
                } finally {
                    if (in != null) {
                        in.close();
                    }
                }
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}

The error output:
log4j:WARN No appenders could be found for logger (org.apache.hadoop.util.NativeCodeLoader).

log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
java.io.FileNotFoundException: /daas/***/MBLDPI3G.2016081823_10.1471532401822.lzo_deflate (No such file or directory)
at java.io.FileInputStream.open(Native Method)
at java.io.FileInputStream.<init>(FileInputStream.java:146)
at java.io.FileInputStream.<init>(FileInputStream.java:101)
at FileDecompressor.main(FileDecompressor.java:53)

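For reference, the stack trace shows the exception being thrown from java.io.FileInputStream, which reads the local filesystem rather than HDFS. Below is a minimal, untested sketch of opening the same path through the Hadoop FileSystem API instead (it assumes fs.defaultFS in the loaded core-site.xml points at the cluster; the class name HdfsLzoRead is made up for illustration):

import java.io.InputStream;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;
import org.apache.hadoop.io.compress.CompressionCodec;
import org.apache.hadoop.io.compress.CompressionCodecFactory;

public class HdfsLzoRead {
    public static void main(String[] args) throws Exception {
        // same HDFS path and cluster config files as above
        String uri = "/daas/****/MBLDPI3G.2016081823_10.1471532401822.lzo_deflate";
        String path = "/software/servers/hadoop-2.6.3-bin/hadoop-2.6.3/etc/hadoop/";
        Configuration conf = new Configuration();
        conf.addResource(new Path(path + "core-site.xml"));
        conf.addResource(new Path(path + "hdfs-site.xml"));

        FileSystem fs = FileSystem.get(conf);                      // HDFS, not the local filesystem
        CompressionCodecFactory factory = new CompressionCodecFactory(conf);
        CompressionCodec codec = factory.getCodec(new Path(uri));  // resolved from the file extension
        if (codec == null) {
            System.out.println("Codec for " + uri + " not found.");
            return;
        }
        try (InputStream in = codec.createInputStream(fs.open(new Path(uri)))) {
            IOUtils.copyBytes(in, System.out, 4096, false);        // decompressed bytes to stdout
        }
    }
}

The only difference from the code above is that fs.open(new Path(uri)) replaces new java.io.FileInputStream(uri); the codec lookup stays the same.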
Jars on the classpath:
    <classpathentry kind="lib" path="lib/commons-cli-1.2.jar"/>
<classpathentry kind="lib" path="lib/commons-collections-3.2.2.jar"/>
<classpathentry kind="lib" path="lib/commons-configuration-1.6.jar"/>
<classpathentry kind="lib" path="lib/commons-lang-2.6.jar"/>
<classpathentry kind="lib" path="lib/commons-logging-1.1.3.jar"/>
<classpathentry kind="lib" path="lib/guava-18.0.jar"/>
<classpathentry kind="lib" path="lib/hadoop-auth-2.6.3.jar"/>
<classpathentry kind="lib" path="lib/hadoop-common-2.6.3.jar"/>
<classpathentry kind="lib" path="lib/hadoop-hdfs-2.6.3.jar"/>
<classpathentry kind="lib" path="lib/htrace-core-3.0.4.jar"/>
<classpathentry kind="lib" path="lib/log4j-1.2.17.jar"/>
<classpathentry kind="lib" path="lib/protobuf-java-2.5.0.jar"/>
<classpathentry kind="lib" path="lib/slf4j-api-1.7.5.jar"/>
<classpathentry kind="lib" path="lib/slf4j-log4j12-1.7.5.jar"/>
<classpathentry kind="lib" path="lib/hadoop-lzo-0.4.20.jar"/>

This is a strange one and I haven't been able to crack it with Baidu searches. Any pointers would be much appreciated, thanks!