hadoop-common-user mailing list archives

From Bing Li <lbl...@gmail.com>
Subject Re: Unable to Load Native-Hadoop Library for Your Platform
Date Tue, 07 Feb 2012 14:05:56 GMT
Dear Uma,

Thanks so much for your reply!

Is the compression technique critical to Hadoop? I am not familiar with
native libraries.

Best regards,
Bing

On Tue, Feb 7, 2012 at 6:04 PM, Uma Maheswara Rao G <maheswara@huawei.com> wrote:

> It looks like you are not using any compression in your code. Hadoop has some
> native libraries to load, mainly for the compression codecs.
> When you want to use those compression techniques, you need to compile Hadoop with
> the compile.native option enabled, and you also need to add the native libraries to
> the java.library.path. If you are not using any such features, then you need not
> worry about that warning.
> Please look at the below link for more information.
> http://hadoop.apache.org/common/docs/current/native_libraries.html
>
> Regards,
> Uma
> ________________________________________
> From: Bing Li [lblabs@gmail.com]
> Sent: Tuesday, February 07, 2012 3:08 PM
> To: common-user@hadoop.apache.org
> Subject: Unable to Load Native-Hadoop Library for Your Platform
>
> Dear all,
>
> I got a warning when running a simple Java program on Hadoop. The program
> just merges some local files into one file and puts it on HDFS.
>
> The code is as follows.
>
>                ......
>
>                Configuration conf = new Configuration();
>                try
>                {
>                        FileSystem hdfs = FileSystem.get(conf);
>                        FileSystem local = FileSystem.getLocal(conf);
>                        Path inputDir = new Path("/home/libing/Temp/");
>                        Path hdfsFile = new Path("/tmp/user/libing/example.txt");
>
>                        try
>                        {
>                                // List the local input files and create the target file on HDFS.
>                                FileStatus[] inputFiles = local.listStatus(inputDir);
>                                FSDataOutputStream out = hdfs.create(hdfsFile);
>                                for (int i = 0; i < inputFiles.length; i++)
>                                {
>                                        System.out.println(inputFiles[i].getPath().getName());
>                                        // Copy each local file into the single HDFS output file.
>                                        FSDataInputStream in = local.open(inputFiles[i].getPath());
>                                        byte[] buffer = new byte[256];
>                                        int bytesRead = 0;
>                                        while ((bytesRead = in.read(buffer)) > 0)
>                                        {
>                                                out.write(buffer, 0, bytesRead);
>                                        }
>                                        in.close();
>                                }
>                                out.close();
>                        }
>                        catch (IOException e)
>                        {
>                                e.printStackTrace();
>                        }
>                }
>                catch (IOException e)
>                {
>                        e.printStackTrace();
>                }
>
>                ......
>
> I ran it with ant and got the following warning. By the way, all of the relevant
> Hadoop jar packages are specified in the build.xml.
>
>     [java] 2012-2-7 17:16:18 org.apache.hadoop.util.NativeCodeLoader
> <clinit>
>     [java] Warning: Unable to load native-hadoop library for your
> platform... using builtin-java classes where applicable
>
> The program produced the correct result, but I cannot figure out what the warning
> above means.
>
> Thanks so much!
> Bing
>
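
Uma's point above is that the native-hadoop library is loaded mainly for the compression
codecs, so the warning only matters once compression is actually used. As a minimal
sketch (not taken from this thread; the output path and the choice of GzipCodec are
assumptions for illustration), writing a compressed file through Hadoop's
CompressionCodec API looks roughly like the following. When the native library is on
the java.library.path the native zlib implementation is used; otherwise Hadoop falls
back to the built-in Java classes, which is exactly what the warning reports.

        import java.io.OutputStream;

        import org.apache.hadoop.conf.Configuration;
        import org.apache.hadoop.fs.FileSystem;
        import org.apache.hadoop.fs.Path;
        import org.apache.hadoop.io.compress.CompressionCodec;
        import org.apache.hadoop.io.compress.GzipCodec;
        import org.apache.hadoop.util.ReflectionUtils;

        public class CompressedWrite
        {
                public static void main(String[] args) throws Exception
                {
                        Configuration conf = new Configuration();
                        FileSystem hdfs = FileSystem.get(conf);

                        // Instantiate a codec; GzipCodec is just an example choice.
                        CompressionCodec codec =
                                ReflectionUtils.newInstance(GzipCodec.class, conf);

                        // Hypothetical output path, following the example in the original mail.
                        Path compressedFile = new Path("/tmp/user/libing/example.txt.gz");

                        // Wrapping the HDFS output stream with the codec is where the native
                        // library (if found) is used; otherwise the built-in Java codec runs.
                        OutputStream out = codec.createOutputStream(hdfs.create(compressedFile));
                        out.write("hello, compressed world\n".getBytes("UTF-8"));
                        out.close();
                }
        }

In that case, the compile.native build option and the java.library.path setting that Uma
mentions are what make the native implementation available at run time.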
