hadoop-common-user mailing list archives

From neil al <nva0...@gmail.com>
Subject Re: UnsatisfiedLinkError installing Hadoop on Windows
Date Thu, 04 Jun 2015 08:03:27 GMT
Hi,

You can also try this link :

http://mariuszprzydatek.com/2015/05/10/installing_hadoop_on_windows_8_or_8_1/

It uses Hadoop version 2.7.0. I didn't encounter any problems with that
version.

Thanks

On Thu, Jun 4, 2015 at 2:19 PM, Kiran Kumar.M.R <Kiran.Kumar.MR@huawei.com>
wrote:

>  Hi Joshua,
>
> You will get this error when the native and winutils projects are not built
> properly
>
> (hadoop-common-project\hadoop-common\src\main\native and
> hadoop-common-project\hadoop-common\src\main\winutils).
>
> Even if they are built, you may not have the proper version of the MSVCRT
> (Visual C++ runtime) DLL on your PATH.
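A java.lang.UnsatisfiedLinkError like the one reported means the JVM found the Java class but could not bind its native method to any loaded DLL. A minimal, self-contained sketch of that failure mode, using a hypothetical unbound native method rather than Hadoop's actual JNI code:

```java
public class NativeLinkDemo {
    // Hypothetical native method with no matching implementation in any
    // library on java.library.path -- calling it reproduces the same
    // failure mode as the missing hadoop.dll JNI binding.
    private static native void unboundNativeCall();

    public static void main(String[] args) {
        try {
            unboundNativeCall();
        } catch (UnsatisfiedLinkError e) {
            // The JVM throws at call time because no loaded library
            // exports a symbol matching this method.
            System.out.println("caught UnsatisfiedLinkError for unbound native method");
        }
    }
}
```

In the real Hadoop case the fix is to make the JVM able to bind the symbol, i.e. a correctly built hadoop.dll reachable via java.library.path (or PATH on Windows), not to catch the error.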
>
>
>
> Provide this information:
>
> Did you successfully build Hadoop-Common native and winutils? Which VC++
> compiler did you use?
>
> Which version of Hadoop are you using?
>
> Is your Windows 32-bit or 64-bit?
>
>
>
> If you are using Win32 and your Hadoop version is older than 2.7, apply the
> patch from HADOOP-9922 to compile on 32-bit.
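The bitness question matters because a 64-bit JVM cannot load a 32-bit hadoop.dll, and vice versa. A quick way to see what the JVM itself reports, using standard JDK system properties (the printed values depend on your machine, e.g. "amd64" for a 64-bit JVM on x86-64 Windows):

```java
public class ArchCheck {
    public static void main(String[] args) {
        // Standard JDK system properties describing the OS and the
        // architecture the JVM was built for.
        System.out.println("os.name=" + System.getProperty("os.name"));
        System.out.println("os.arch=" + System.getProperty("os.arch"));
    }
}
```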
>
>
>
> Also have a look at the compilation steps in this blog:
>
>
> http://zutai.blogspot.com/2014/06/build-install-and-run-hadoop-24-240-on.html?showComment=1422091525887#c2264594416650430988
>
>
>
>
>
> Regards,
>
> Kiran
>
>
> *From:* Edwards, Joshua [mailto:Joshua.Edwards@capitalone.com]
> *Sent:* Wednesday, June 03, 2015 21:48
> *To:* user@hadoop.apache.org
> *Subject:* UnsatisfiedLinkError installing Hadoop on Windows
>
>
>
> Hello –
>
>
>
> I am trying to work through the documentation at
> http://wiki.apache.org/hadoop/Hadoop2OnWindows to get a basic single node
> instance of Hadoop running on Windows.  I am on step 3.5, where I am
> executing the line “%HADOOP_PREFIX%\bin\hdfs dfs -put myfile.txt /”, and I
> get the following stack trace:
>
>
>
> Exception in thread "main" java.lang.UnsatisfiedLinkError: org.apache.hadoop.util.NativeCrc32.nativeComputeChunkedSumsByteArray(II[BI[BIILjava/lang/String;JZ)V
>         at org.apache.hadoop.util.NativeCrc32.nativeComputeChunkedSumsByteArray(Native Method)
>         at org.apache.hadoop.util.NativeCrc32.calculateChunkedSumsByteArray(NativeCrc32.java:86)
>         at org.apache.hadoop.util.DataChecksum.calculateChunkedSums(DataChecksum.java:430)
>         at org.apache.hadoop.fs.FSOutputSummer.writeChecksumChunks(FSOutputSummer.java:202)
>         at org.apache.hadoop.fs.FSOutputSummer.flushBuffer(FSOutputSummer.java:163)
>         at org.apache.hadoop.fs.FSOutputSummer.flushBuffer(FSOutputSummer.java:144)
>         at org.apache.hadoop.hdfs.DFSOutputStream.closeImpl(DFSOutputStream.java:2220)
>         at org.apache.hadoop.hdfs.DFSOutputStream.close(DFSOutputStream.java:2204)
>         at org.apache.hadoop.fs.FSDataOutputStream$PositionCache.close(FSDataOutputStream.java:72)
>         at org.apache.hadoop.fs.FSDataOutputStream.close(FSDataOutputStream.java:106)
>         at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:61)
>         at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:119)
>         at org.apache.hadoop.fs.shell.CommandWithDestination$TargetFileSystem.writeStreamToFile(CommandWithDestination.java:466)
>         at org.apache.hadoop.fs.shell.CommandWithDestination.copyStreamToTarget(CommandWithDestination.java:391)
>         at org.apache.hadoop.fs.shell.CommandWithDestination.copyFileToTarget(CommandWithDestination.java:328)
>         at org.apache.hadoop.fs.shell.CommandWithDestination.processPath(CommandWithDestination.java:263)
>         at org.apache.hadoop.fs.shell.CommandWithDestination.processPath(CommandWithDestination.java:248)
>         at org.apache.hadoop.fs.shell.Command.processPaths(Command.java:317)
>         at org.apache.hadoop.fs.shell.Command.processPathArgument(Command.java:289)
>         at org.apache.hadoop.fs.shell.CommandWithDestination.processPathArgument(CommandWithDestination.java:243)
>         at org.apache.hadoop.fs.shell.Command.processArgument(Command.java:271)
>         at org.apache.hadoop.fs.shell.Command.processArguments(Command.java:255)
>         at org.apache.hadoop.fs.shell.CommandWithDestination.processArguments(CommandWithDestination.java:220)
>         at org.apache.hadoop.fs.shell.CopyCommands$Put.processArguments(CopyCommands.java:267)
>         at org.apache.hadoop.fs.shell.Command.processRawArguments(Command.java:201)
>         at org.apache.hadoop.fs.shell.Command.run(Command.java:165)
>         at org.apache.hadoop.fs.FsShell.run(FsShell.java:287)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
>         at org.apache.hadoop.fs.FsShell.main(FsShell.java:340)
>
>
>
> Could you please help?
>
>
>
> Thanks,
>
> Josh
>
>
>
>
>
