From: "Edwards, Joshua"
To: "user@hadoop.apache.org"
Date: Wed, 3 Jun 2015 12:18:26 -0400
Subject: UnsatisfiedLinkError installing Hadoop on Windows

Hello -

I am trying to work through the documentation at http://wiki.apache.org/hadoop/Hadoop2OnWindows to get a basic single node instance of Hadoop running on Windows.
I am on step 3.5, where I am executing the line "%HADOOP_PREFIX%\bin\hdfs dfs -put myfile.txt /", and I get the following stack trace:

Exception in thread "main" java.lang.UnsatisfiedLinkError: org.apache.hadoop.util.NativeCrc32.nativeComputeChunkedSumsByteArray(II[BI[BIILjava/lang/String;JZ)V
        at org.apache.hadoop.util.NativeCrc32.nativeComputeChunkedSumsByteArray(Native Method)
        at org.apache.hadoop.util.NativeCrc32.calculateChunkedSumsByteArray(NativeCrc32.java:86)
        at org.apache.hadoop.util.DataChecksum.calculateChunkedSums(DataChecksum.java:430)
        at org.apache.hadoop.fs.FSOutputSummer.writeChecksumChunks(FSOutputSummer.java:202)
        at org.apache.hadoop.fs.FSOutputSummer.flushBuffer(FSOutputSummer.java:163)
        at org.apache.hadoop.fs.FSOutputSummer.flushBuffer(FSOutputSummer.java:144)
        at org.apache.hadoop.hdfs.DFSOutputStream.closeImpl(DFSOutputStream.java:2220)
        at org.apache.hadoop.hdfs.DFSOutputStream.close(DFSOutputStream.java:2204)
        at org.apache.hadoop.fs.FSDataOutputStream$PositionCache.close(FSDataOutputStream.java:72)
        at org.apache.hadoop.fs.FSDataOutputStream.close(FSDataOutputStream.java:106)
        at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:61)
        at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:119)
        at org.apache.hadoop.fs.shell.CommandWithDestination$TargetFileSystem.writeStreamToFile(CommandWithDestination.java:466)
        at org.apache.hadoop.fs.shell.CommandWithDestination.copyStreamToTarget(CommandWithDestination.java:391)
        at org.apache.hadoop.fs.shell.CommandWithDestination.copyFileToTarget(CommandWithDestination.java:328)
        at org.apache.hadoop.fs.shell.CommandWithDestination.processPath(CommandWithDestination.java:263)
        at org.apache.hadoop.fs.shell.CommandWithDestination.processPath(CommandWithDestination.java:248)
        at org.apache.hadoop.fs.shell.Command.processPaths(Command.java:317)
        at org.apache.hadoop.fs.shell.Command.processPathArgument(Command.java:289)
        at org.apache.hadoop.fs.shell.CommandWithDestination.processPathArgument(CommandWithDestination.java:243)
        at org.apache.hadoop.fs.shell.Command.processArgument(Command.java:271)
        at org.apache.hadoop.fs.shell.Command.processArguments(Command.java:255)
        at org.apache.hadoop.fs.shell.CommandWithDestination.processArguments(CommandWithDestination.java:220)
        at org.apache.hadoop.fs.shell.CopyCommands$Put.processArguments(CopyCommands.java:267)
        at org.apache.hadoop.fs.shell.Command.processRawArguments(Command.java:201)
        at org.apache.hadoop.fs.shell.Command.run(Command.java:165)
        at org.apache.hadoop.fs.FsShell.run(FsShell.java:287)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
        at org.apache.hadoop.fs.FsShell.main(FsShell.java:340)

Could you please help?

Thanks,
Josh
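P.S. From what I have read, this kind of UnsatisfiedLinkError on Windows usually means the JVM cannot find a native hadoop.dll that matches the build (for example, a 32-bit DLL under a 64-bit JVM), so the NativeCrc32 JNI call cannot be resolved. Below is a minimal sketch of the checks I am planning to run, assuming the standard Hadoop 2.x Windows layout; the exact paths and the use of the checknative subcommand are my assumptions, not steps from the wiki page:

    REM Assumes %HADOOP_PREFIX% points at the root of the Hadoop 2.x install.

    REM 1. The Windows native helpers should sit next to the launch scripts.
    dir %HADOOP_PREFIX%\bin\winutils.exe
    dir %HADOOP_PREFIX%\bin\hadoop.dll

    REM 2. hadoop.dll has to be resolvable at runtime, so put bin on the PATH.
    set PATH=%HADOOP_PREFIX%\bin;%PATH%

    REM 3. Report which native components Hadoop can actually load.
    %HADOOP_PREFIX%\bin\hadoop checknative -a

If checknative reports "hadoop: false" (or the DLLs are simply missing), I assume I need a hadoop.dll/winutils.exe built for my platform rather than any change to the HDFS commands themselves.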