From: karthi keyan
To: user@hadoop.apache.org
Date: Tue, 29 Mar 2016 11:59:18 +0530
Subject: UnsatisfiedLinkError - Windows Environment

Hi,

I am frequently facing this issue while reading data from HDFS; every time, I have replaced (rebuilt) the jars.
Can anyone suggest the right way to resolve this issue, or tell me the root cause of this error?

JDK: 1.7
System env: Windows, 64-bit

Caused by: java.lang.UnsatisfiedLinkError: org.apache.hadoop.util.NativeCrc32.nativeComputeChunkedSums(IILjava/nio/ByteBuffer;ILjava/nio/ByteBuffer;IILjava/lang/String;JZ)V
        at org.apache.hadoop.util.NativeCrc32.nativeComputeChunkedSums(Native Method) ~[hadoop-common-2.6.2.jar:na]
        at org.apache.hadoop.util.NativeCrc32.verifyChunkedSums(NativeCrc32.java:59) ~[hadoop-common-2.6.2.jar:na]
        at org.apache.hadoop.util.DataChecksum.verifyChunkedSums(DataChecksum.java:301) ~[hadoop-common-2.6.2.jar:na]
        at org.apache.hadoop.hdfs.RemoteBlockReader2.readNextPacket(RemoteBlockReader2.java:216) ~[hadoop-hdfs-2.6.2.jar:na]
        at org.apache.hadoop.hdfs.RemoteBlockReader2.read(RemoteBlockReader2.java:146) ~[hadoop-hdfs-2.6.2.jar:na]
        at org.apache.hadoop.hdfs.DFSInputStream$ByteArrayStrategy.doRead(DFSInputStream.java:693) ~[hadoop-hdfs-2.6.2.jar:na]
        at org.apache.hadoop.hdfs.DFSInputStream.readBuffer(DFSInputStream.java:749) ~[hadoop-hdfs-2.6.2.jar:na]
        at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:807) ~[hadoop-hdfs-2.6.2.jar:na]
        at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:848) ~[hadoop-hdfs-2.6.2.jar:na]
        at java.io.DataInputStream.read(DataInputStream.java:100) ~[na:1.7.0]

Regards,
Karthikeyan S
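For anyone hitting the same trace: an UnsatisfiedLinkError on a NativeCrc32 native method usually means the JVM could not load Hadoop's native library (hadoop.dll on Windows), so rebuilding the jars will not help — the missing piece is native code, not Java code. A minimal diagnostic sketch, assuming the standard layout where HADOOP_HOME (or the hadoop.home.dir system property) points at a directory whose bin folder holds hadoop.dll and winutils.exe; the class name NativeLibCheck is made up for illustration:

```java
import java.io.File;

public class NativeLibCheck {
    // Where Hadoop's Windows shell code expects the native DLL,
    // given a Hadoop home directory (i.e. <home>/bin/hadoop.dll).
    static File expectedHadoopDll(String hadoopHome) {
        return new File(new File(hadoopHome, "bin"), "hadoop.dll");
    }

    public static void main(String[] args) {
        // hadoop.home.dir takes precedence; fall back to the HADOOP_HOME env var.
        String home = System.getProperty("hadoop.home.dir",
                System.getenv().getOrDefault("HADOOP_HOME", ""));
        System.out.println("hadoop home       = " + home);
        System.out.println("hadoop.dll found  = " + expectedHadoopDll(home).isFile());
        // Directories the JVM searches when loading native libraries:
        System.out.println("java.library.path = " + System.getProperty("java.library.path"));
    }
}
```

If hadoop.dll is absent, or was built for a different Hadoop version or architecture than the 2.6.2 jars on the classpath, this exact error appears; the usual fix is to obtain hadoop.dll/winutils.exe built for the matching version and either place them under %HADOOP_HOME%\bin or add their directory to java.library.path and PATH.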