hadoop-mapreduce-user mailing list archives

From Brahma Reddy Battula <brahmareddy.batt...@huawei.com>
Subject RE: UnsatisfiedLinkError - Windows Environment
Date Tue, 29 Mar 2016 07:04:03 GMT
Are you using the right libraries (built for 64-bit Windows and Hadoop 2.6.2)?

From: karthi keyan [mailto:karthi93.sankar@gmail.com]
Sent: 29 March 2016 14:51
To: Brahma Reddy Battula
Cc: user@hadoop.apache.org
Subject: Re: UnsatisfiedLinkError - Windows Environment

Hi Brahma,

I have added those libraries to the bin path. Every time I communicate with another Hadoop
cluster I am facing this issue.
Is there a backward-compatibility problem, or is it something else?

On Tue, Mar 29, 2016 at 12:09 PM, Brahma Reddy Battula <brahmareddy.battula@huawei.com<mailto:brahmareddy.battula@huawei.com>>
wrote:
Is the Hadoop cluster installed on Windows, or is only the client on Windows?

Does your Hadoop distribution contain the Windows native library files, and is
<HADOOP_HOME>/bin added to PATH?
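The checks above can be partly automated from the client side. The following is a minimal sketch using only plain JDK system properties (no Hadoop dependency; the class name NativeLibCheck is made up for illustration): it prints the JVM bitness and java.library.path, which is where a matching hadoop.dll must be findable on Windows for NativeCrc32 to link.

```java
// Minimal sketch: verify JVM bitness and library search path.
// On Windows, hadoop.dll must sit on java.library.path (typically
// <HADOOP_HOME>\bin) and must match the JVM bitness - a 32-bit JVM
// cannot load a 64-bit hadoop.dll and raises UnsatisfiedLinkError.
public class NativeLibCheck {
    public static void main(String[] args) {
        // "amd64"/"x86_64" indicate a 64-bit JVM; "x86" is 32-bit.
        String arch = System.getProperty("os.arch");
        // Directories the JVM searches when loading native libraries.
        String libPath = System.getProperty("java.library.path");

        System.out.println("JVM arch: " + arch);
        System.out.println("java.library.path: " + libPath);
    }
}
```

If the printed path does not include <HADOOP_HOME>\bin, or the arch does not match the bitness the native libraries were built for, that mismatch alone is enough to produce the error below.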


From: karthi keyan [mailto:karthi93.sankar@gmail.com<mailto:karthi93.sankar@gmail.com>]
Sent: 29 March 2016 14:29
To: user@hadoop.apache.org<mailto:user@hadoop.apache.org>
Subject: UnstaisfiedLinkError - Windows Environment

Hi,

I am frequently facing this issue while reading data from HDFS; every time I have replaced
(rebuilt) the jars. Can anyone suggest the right way to resolve this issue, or tell me the
root cause of this error?

JDK > 1.7
System env - win 64 bit


Caused by: java.lang.UnsatisfiedLinkError: org.apache.hadoop.util.NativeCrc32.nativeComputeChunkedSums(IILjava/nio/ByteBuffer;ILjava/nio/ByteBuffer;IILjava/lang/String;JZ)V
            at org.apache.hadoop.util.NativeCrc32.nativeComputeChunkedSums(Native Method) ~[hadoop-common-2.6.2.jar:na]
            at org.apache.hadoop.util.NativeCrc32.verifyChunkedSums(NativeCrc32.java:59) ~[hadoop-common-2.6.2.jar:na]
            at org.apache.hadoop.util.DataChecksum.verifyChunkedSums(DataChecksum.java:301) ~[hadoop-common-2.6.2.jar:na]
            at org.apache.hadoop.hdfs.RemoteBlockReader2.readNextPacket(RemoteBlockReader2.java:216) ~[hadoop-hdfs-2.6.2.jar:na]
            at org.apache.hadoop.hdfs.RemoteBlockReader2.read(RemoteBlockReader2.java:146) ~[hadoop-hdfs-2.6.2.jar:na]
            at org.apache.hadoop.hdfs.DFSInputStream$ByteArrayStrategy.doRead(DFSInputStream.java:693) ~[hadoop-hdfs-2.6.2.jar:na]
            at org.apache.hadoop.hdfs.DFSInputStream.readBuffer(DFSInputStream.java:749) ~[hadoop-hdfs-2.6.2.jar:na]
            at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:807) ~[hadoop-hdfs-2.6.2.jar:na]
            at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:848) ~[hadoop-hdfs-2.6.2.jar:na]
            at java.io.DataInputStream.read(DataInputStream.java:100) ~[na:1.7.0]

Regards,
Karthikeyan S
