Subject: Re: UnsatisfiedLinkError - Windows Environment
From: karthi keyan <karthi93.sankar@gmail.com>
To: Brahma Reddy Battula <brahmareddy.battula@huawei.com>
Cc: user@hadoop.apache.org
Date: Tue, 29 Mar 2016 12:45:53 +0530

Yes, built with the right libraries.

In my case I have to connect to a remote cluster that runs Hadoop (built for 64-bit Windows and Hadoop 2.5.2).

On Tue, Mar 29, 2016 at 12:34 PM, Brahma Reddy Battula <brahmareddy.battula@huawei.com> wrote:

> Are you using the right libraries (built for 64-bit Windows and Hadoop 2.6.2)?
>
> From: karthi keyan [mailto:karthi93.sankar@gmail.com]
> Sent: 29 March 2016 14:51
> To: Brahma Reddy Battula
> Cc: user@hadoop.apache.org
> Subject: Re: UnsatisfiedLinkError - Windows Environment
>
> Hi Brahma,
>
> I have added those libraries to the bin path. Every time I communicate with the other (Hadoop) cluster, I face this issue.
>
> Is there some backward-compatibility problem, or something else?
>
> On Tue, Mar 29, 2016 at 12:09 PM, Brahma Reddy Battula <brahmareddy.battula@huawei.com> wrote:
>
> Is the Hadoop cluster installed on Windows, or only the client?
>
> Does the Hadoop distribution contain the Windows native library files, and is <HADOOP_HOME>/bin added to PATH?
>
> From: karthi keyan [mailto:karthi93.sankar@gmail.com]
> Sent: 29 March 2016 14:29
> To: user@hadoop.apache.org
> Subject: UnsatisfiedLinkError - Windows Environment
>
> Hi,
>
> I frequently face this issue while reading data from HDFS, and every time I have replaced (rebuilt) the jars. Can anyone suggest the right way to resolve this issue, or tell me the root cause of this error?
>
> JDK: 1.7
> System env: Windows 64-bit
>
> Caused by: java.lang.UnsatisfiedLinkError: org.apache.hadoop.util.NativeCrc32.nativeComputeChunkedSums(IILjava/nio/ByteBuffer;ILjava/nio/ByteBuffer;IILjava/lang/String;JZ)V
>     at org.apache.hadoop.util.NativeCrc32.nativeComputeChunkedSums(Native Method) ~[hadoop-common-2.6.2.jar:na]
>     at org.apache.hadoop.util.NativeCrc32.verifyChunkedSums(NativeCrc32.java:59) ~[hadoop-common-2.6.2.jar:na]
>     at org.apache.hadoop.util.DataChecksum.verifyChunkedSums(DataChecksum.java:301) ~[hadoop-common-2.6.2.jar:na]
>     at org.apache.hadoop.hdfs.RemoteBlockReader2.readNextPacket(RemoteBlockReader2.java:216) ~[hadoop-hdfs-2.6.2.jar:na]
>     at org.apache.hadoop.hdfs.RemoteBlockReader2.read(RemoteBlockReader2.java:146) ~[hadoop-hdfs-2.6.2.jar:na]
>     at org.apache.hadoop.hdfs.DFSInputStream$ByteArrayStrategy.doRead(DFSInputStream.java:693) ~[hadoop-hdfs-2.6.2.jar:na]
>     at org.apache.hadoop.hdfs.DFSInputStream.readBuffer(DFSInputStream.java:749) ~[hadoop-hdfs-2.6.2.jar:na]
>     at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:807) ~[hadoop-hdfs-2.6.2.jar:na]
>     at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:848) ~[hadoop-hdfs-2.6.2.jar:na]
>     at java.io.DataInputStream.read(DataInputStream.java:100) ~[na:1.7.0]
>
> Regards,
>
> Karthikeyan S
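
A quick way to confirm whether the client JVM actually loaded the native Hadoop library (hadoop.dll on Windows) is to query org.apache.hadoop.util.NativeCodeLoader before opening any HDFS stream. The sketch below is a minimal illustration only: the NameNode URI hdfs://namenode:9000, the file path /tmp/sample.txt, and the class name NativeLibCheck are placeholders, not values from this thread.

import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.util.NativeCodeLoader;

public class NativeLibCheck {
    public static void main(String[] args) throws Exception {
        // True only if hadoop.dll (Windows) / libhadoop.so was found and loaded.
        System.out.println("native hadoop loaded: " + NativeCodeLoader.isNativeCodeLoaded());
        System.out.println("java.library.path = " + System.getProperty("java.library.path"));

        // Placeholder endpoint and file; adjust to the actual cluster.
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(URI.create("hdfs://namenode:9000"), conf);
        try (FSDataInputStream in = fs.open(new Path("/tmp/sample.txt"))) {
            byte[] buf = new byte[4096];
            // The checksum verification during this read is where
            // NativeCrc32.nativeComputeChunkedSums is invoked.
            int n = in.read(buf);
            System.out.println("read " + n + " bytes");
        }
    }
}

Equivalently, running "hadoop checknative -a" from the same environment reports which native components were found. Note that a hadoop.dll built for one release can load successfully yet still throw UnsatisfiedLinkError on an individual native method whose signature changed in a later release, which is consistent with the mix seen above (2.6.2 client jars against native libraries built for 2.5.2).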