From: neil al
To: user@hadoop.apache.org
Date: Thu, 4 Jun 2015 16:03:27 +0800
Subject: Re: UnsatisfiedLinkError installing Hadoop on Windows

Hi,

You can also try this link:
http://mariuszprzydatek.com/2015/05/10/installing_hadoop_on_windows_8_or_8_1/

It uses Hadoop version 2.7.0. I didn't encounter any problems with version 2.7.0.
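If you want a quick sanity check once things are set up, something along these
lines from a command prompt should tell you whether the native bits are being
picked up (just a sketch; the paths assume the stock layout under
%HADOOP_PREFIX%, so adjust them to wherever you unpacked Hadoop):

    rem the Windows native helpers have to exist under bin
    dir %HADOOP_PREFIX%\bin\winutils.exe
    dir %HADOOP_PREFIX%\bin\hadoop.dll

    rem hadoop.dll is loaded by the JVM, so keep bin on PATH
    set PATH=%HADOOP_PREFIX%\bin;%PATH%

    rem ask Hadoop which native libraries it can actually load
    %HADOOP_PREFIX%\bin\hadoop checknative -a

If checknative reports the hadoop native library as false (or winutils.exe and
hadoop.dll are simply missing), that would explain the UnsatisfiedLinkError
further down in this thread.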

Thanks

On Thu, Jun 4, 2015 at 2:19 PM, Kiran Kumar.M.R <Kiran.Kumar.MR@huawei.com> wrote:

> Hi Joshua,
>
> You will get this error when the native and winutils projects are not built
> properly.
> (hadoop-common-project\hadoop-common\src\main\native and
> hadoop-common-project\hadoop-common\src\main\winutils)
>
> Even if they are built, you may not have the proper version of the MSVCRT
> (Visual C++ runtime) DLL in the path.
>
> Provide this information:
>
> Did you successfully build the Hadoop-Common native and winutils projects?
> Which VC++ compiler did you use?
>
> Which version of Hadoop are you using?
>
> Is your Windows 32-bit or 64-bit?
>
> If you are using Win32 and your Hadoop version is lower than 2.7, apply the
> patch from HADOOP-9922 to compile on 32-bit.
>
> Also have a look at the compilation steps in this blog:
> http://zutai.blogspot.com/2014/06/build-install-and-run-hadoop-24-240-on.html?showComment=1422091525887#c2264594416650430988
>
> Regards,
> Kiran
>
> ____________________________________________________________________
> This e-mail and its attachments contain confidential information from
> HUAWEI, which is intended only for the person or entity whose address is
> listed above. Any use of the information contained herein in any way
> (including, but not limited to, total or partial disclosure, reproduction,
> or dissemination) by persons other than the intended recipient(s) is
> prohibited. If you receive this e-mail in error, please notify the sender
> by phone or email immediately and delete it!
> ____________________________________________________________________
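For what it's worth, the native-win build Kiran refers to is roughly the
following, going by the BUILDING.txt shipped with the Hadoop source (treat it
as a sketch; the exact SDK/Visual C++ setup and the JDK path below are
examples, not requirements):

    rem run from a Windows SDK / Visual Studio command prompt, in the Hadoop source root
    set Platform=x64
    rem use "set Platform=Win32" instead on a 32-bit system
    set JAVA_HOME=C:\Progra~1\Java\jdk1.7.0_79
    rem example JDK path; point it at your own JDK and avoid spaces in the path
    mvn package -Pdist,native-win -DskipTests -Dtar

A successful build should leave winutils.exe and hadoop.dll under
hadoop-dist\target, and those are the files that need to end up in
%HADOOP_PREFIX%\bin on the machine where you run the hdfs commands.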

> From: Edwards, Joshua [mailto:Joshua.Edwards@capitalone.com]
> Sent: Wednesday, June 03, 2015 21:48
> To: user@hadoop.apache.org
> Subject: UnsatisfiedLinkError installing Hadoop on Windows
>
> Hello –
>
> I am trying to work through the documentation at
> http://wiki.apache.org/hadoop/Hadoop2OnWindows to get a basic single-node
> instance of Hadoop running on Windows. I am on step 3.5, where I am
> executing the line “%HADOOP_PREFIX%\bin\hdfs dfs -put myfile.txt /”, and I
> get the following stack trace:
>
> Exception in thread "main" java.lang.UnsatisfiedLinkError: org.apache.hadoop.util.NativeCrc32.nativeComputeChunkedSumsByteArray(II[BI[BIILjava/lang/String;JZ)V
>         at org.apache.hadoop.util.NativeCrc32.nativeComputeChunkedSumsByteArray(Native Method)
>         at org.apache.hadoop.util.NativeCrc32.calculateChunkedSumsByteArray(NativeCrc32.java:86)
>         at org.apache.hadoop.util.DataChecksum.calculateChunkedSums(DataChecksum.java:430)
>         at org.apache.hadoop.fs.FSOutputSummer.writeChecksumChunks(FSOutputSummer.java:202)
>         at org.apache.hadoop.fs.FSOutputSummer.flushBuffer(FSOutputSummer.java:163)
>         at org.apache.hadoop.fs.FSOutputSummer.flushBuffer(FSOutputSummer.java:144)
>         at org.apache.hadoop.hdfs.DFSOutputStream.closeImpl(DFSOutputStream.java:2220)
>         at org.apache.hadoop.hdfs.DFSOutputStream.close(DFSOutputStream.java:2204)
>         at org.apache.hadoop.fs.FSDataOutputStream$PositionCache.close(FSDataOutputStream.java:72)
>         at org.apache.hadoop.fs.FSDataOutputStream.close(FSDataOutputStream.java:106)
>         at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:61)
>         at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:119)
>         at org.apache.hadoop.fs.shell.CommandWithDestination$TargetFileSystem.writeStreamToFile(CommandWithDestination.java:466)
>         at org.apache.hadoop.fs.shell.CommandWithDestination.copyStreamToTarget(CommandWithDestination.java:391)
>         at org.apache.hadoop.fs.shell.CommandWithDestination.copyFileToTarget(CommandWithDestination.java:328)
>         at org.apache.hadoop.fs.shell.CommandWithDestination.processPath(CommandWithDestination.java:263)
>         at org.apache.hadoop.fs.shell.CommandWithDestination.processPath(CommandWithDestination.java:248)
>         at org.apache.hadoop.fs.shell.Command.processPaths(Command.java:317)
>         at org.apache.hadoop.fs.shell.Command.processPathArgument(Command.java:289)
>         at org.apache.hadoop.fs.shell.CommandWithDestination.processPathArgument(CommandWithDestination.java:243)
>         at org.apache.hadoop.fs.shell.Command.processArgument(Command.java:271)
>         at org.apache.hadoop.fs.shell.Command.processArguments(Command.java:255)
>         at org.apache.hadoop.fs.shell.CommandWithDestination.processArguments(CommandWithDestination.java:220)
>         at org.apache.hadoop.fs.shell.CopyCommands$Put.processArguments(CopyCommands.java:267)
>         at org.apache.hadoop.fs.shell.Command.processRawArguments(Command.java:201)
>         at org.apache.hadoop.fs.shell.Command.run(Command.java:165)
>         at org.apache.hadoop.fs.FsShell.run(FsShell.java:287)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
>         at org.apache.hadoop.fs.FsShell.main(FsShell.java:340)
>
> Could you please help?
>
> Thanks,
> Josh
>
> ------------------------------
>
> The information contained in this e-mail is confidential and/or
> proprietary to Capital One and/or its affiliates. The information
> transmitted herewith is intended only for use by the individual or entity
> to which it is addressed. If the reader of this message is not the
> intended recipient, you are hereby notified that any review,
> retransmission, dissemination, distribution, copying or other use of, or
> taking of any action in reliance upon this information is strictly
> prohibited.
> If you have received this communication in error, please
> contact the sender and delete the material from your computer.