Subject: Re: Building Hadoop 2.2.0 On Windows 7 64-bit
From: Arpit Agarwal <aagarwal@hortonworks.com>
To: user@hadoop.apache.org
Date: Tue, 21 Jan 2014 13:08:00 -0800

Folks, please refer to the wiki page https://wiki.apache.org/hadoop/Hadoop2OnWindows and also BUILDING.txt in the source tree. We believe we have captured all the prerequisites in BUILDING.txt, so let us know if anything is missing.

On Fri, Jan 17, 2014 at 8:16 AM, Steve Lewis wrote:
> At least for development work, I find that replacing two classes in the
> Hadoop jar (putting the following code ahead of the Hadoop jars in a
> project) fixes most Windows issues, at least in my hands.
>
>
> On Fri, Jan 17, 2014 at 6:41 AM, Silvina Caíno Lores <
> silvi.caino@gmail.com> wrote:
>
>> Hey again,
>>
>> I'm not a Windows user, so I'm not very familiar with these issues.
>> However, I recall this link as a useful source for installation problems
>> I've had on my own; since it's for Windows, it might help you even further.
>>
>> The error
>>
>> stdint.h: No such file or directory
>>
>> is causing your build to fail. It seems like you don't have the headers
>> installed, or they aren't properly referenced. Sorry that I can't be of more
>> help; I'm not sure how MinGW handles these includes.
>>
>> Good luck :D
>>
>>
>>
>> On 17 January 2014 15:18, Jian Feng JF She wrote:
>>
>>> I have the same environment, Windows 7 (64-bit) and Hadoop 2.2. Yes, I have
>>> installed protobuf previously and, according to the guide, put
>>> protoc.exe and
>>>
>>> libprotobuf.lib, libprotobuf-lite.lib, libprotoc.lib into PATH.
>>>
>>> Running "protoc --version" gives the output: libprotoc 2.5.0
>>>
>>> Now it seems everything is ready, but running mvn install -DskipTests -e
>>> gives an error message:
>>>
>>> can not execute: compile-ms-native-dll in
>>> ..\hadoop-common-project\hadoop-common\pom.xml
>>>
>>> Do you have any suggestions?
>>>
>>> Thanks.
>>>
>>> Nikshe
>>>
>>>
>>> From: Silvina Caíno Lores
>>> To: user@hadoop.apache.org
>>> Date: 01/17/2014 06:43 PM
>>>
>>> Subject: Re: Building Hadoop 2.2.0 On Windows 7 64-bit
>>> ------------------------------
>>>
>>>
>>>
>>> 'protoc --version' did not return a version
>>>
>>> Are you sure that you have Protocol Buffers installed?
>>>
>>>
>>>
>>> On 17 January 2014 11:29, Nirmal Kumar <nirmal.kumar@impetus.co.in>
>>> wrote:
>>>
>>> Hi All,
>>>
>>> I am trying to build Hadoop 2.2.0 on a Windows 7 64-bit environment.
>>>
>>> Can you let me know what else is needed for building Hadoop 2.2.0 on
>>> the Windows platform?
>>>
>>> I am getting the following error building the *hadoop-common* project:
>>>
>>> [INFO] ------------------------------------------------------------------------
>>> [INFO] Building Apache Hadoop Common 2.2.0
>>> [INFO] ------------------------------------------------------------------------
>>> [INFO]
>>> [INFO] --- maven-clean-plugin:2.4.1:clean (default-clean) @ hadoop-common ---
>>> [INFO] Deleting D:\YARN_Setup\hadoop-2.2.0-src\hadoop-common-project\hadoop-common\target
>>> [INFO]
>>> [INFO] --- maven-antrun-plugin:1.6:run (create-testdirs) @ hadoop-common ---
>>> [INFO] Executing tasks
>>>
>>> main:
>>>     [mkdir] Created dir: D:\YARN_Setup\hadoop-2.2.0-src\hadoop-common-project\hadoop-common\target\test-dir
>>>     [mkdir] Created dir: D:\YARN_Setup\hadoop-2.2.0-src\hadoop-common-project\hadoop-common\target\test\data
>>> [INFO] Executed tasks
>>> [INFO]
>>> [INFO] --- maven-enforcer-plugin:1.3.1:enforce (enforce-os) @ hadoop-common ---
>>> [INFO]
>>> [INFO] --- hadoop-maven-plugins:2.2.0:protoc (compile-protoc) @ hadoop-common ---
>>> [WARNING] [protoc, --version] failed: java.io.IOException: Cannot run program "protoc": CreateProcess error=2, The system cannot find the file specified
>>> [ERROR] stdout: []
>>> [INFO] ------------------------------------------------------------------------
>>> [INFO] BUILD FAILURE
>>> [INFO] ------------------------------------------------------------------------
>>> [INFO] Total time: 1.153s
>>> [INFO] Finished at: Fri Jan 17 15:55:10 IST 2014
>>> [INFO] Final Memory: 7M/18M
>>> [INFO] ------------------------------------------------------------------------
>>> [ERROR] Failed to execute goal org.apache.hadoop:hadoop-maven-plugins:2.2.0:protoc (compile-protoc) on project hadoop-common: org.apache.maven.plugin.MojoExecutionException: 'protoc --version' did not return a version -> [Help 1]
>>> [ERROR]
>>> [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
>>> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
>>> [ERROR]
>>> [ERROR] For more information about the errors and possible solutions, please read the following articles:
>>> [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
>>>
>>> D:\YARN_Setup\hadoop-2.2.0-src\hadoop-common-project\hadoop-common>
>>>
>>> Thanks,
>>> -Nirmal
>>>
>>> ------------------------------
>>>
>>> NOTE: This message may contain information that is confidential,
>>> proprietary, privileged or otherwise protected by law. The message is
>>> intended solely for the named addressee. If received in error, please
>>> destroy and notify the sender. Any use of this email is prohibited when
>>> received in error. Impetus does not represent, warrant and/or guarantee
>>> that the integrity of this communication has been maintained, nor that the
>>> communication is free of errors, virus, interception or interference.
>>>
>
> --
> Steven M. Lewis PhD
> 4221 105th Ave NE
> Kirkland, WA 98033
> 206-384-1340 (cell)
> Skype lordjoe_com

--
CONFIDENTIALITY NOTICE
NOTICE: This message is intended for the use of the individual or entity to
which it is addressed and may contain information that is confidential,
privileged and exempt from disclosure under applicable law. If the reader
of this message is not the intended recipient, you are hereby notified that
any printing, copying, dissemination, distribution, disclosure or
forwarding of this communication is strictly prohibited. If you have
received this communication in error, please contact the sender immediately
and delete it from your system.
Thank You.
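[Archive note: the BUILDING.txt steps Arpit points to can be condensed into a short sketch. The checkout path below is an illustrative assumption, not taken from the thread; the `Platform=x64` setting and the `-Pdist,native-win` profile are the ones BUILDING.txt describes for 64-bit Windows builds.]

```shell
# Sketch of the Hadoop 2.2.0 Windows build invocation per BUILDING.txt.
# HADOOP_SRC is a hypothetical checkout location, used only for illustration.
HADOOP_SRC='D:\YARN_Setup\hadoop-2.2.0-src'

# In cmd.exe this would be "set Platform=x64"; required so the native-win
# profile builds 64-bit native components.
export Platform=x64

# The distribution build described in BUILDING.txt (run from HADOOP_SRC,
# in a Windows SDK command prompt, with protoc 2.5.0 and cmake on PATH).
BUILD_CMD='mvn package -Pdist,native-win -DskipTests -Dtar'
echo "cd $HADOOP_SRC && $BUILD_CMD"
```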
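[Archive note: the failures in this thread ('protoc' not found, stdint.h missing) are missing-prerequisite problems. A pre-flight check like the following can surface them before running Maven; the tool list and messages here are my own suggestion drawn from the errors above, not something BUILDING.txt prescribes.]

```shell
# Pre-flight check for the build prerequisites discussed in this thread.
# Prints "found:" or "MISSING:" for each tool so PATH problems are caught
# before mvn fails mid-build with "CreateProcess error=2".
check_tool() {
  if command -v "$1" >/dev/null 2>&1; then
    echo "found: $1"
  else
    echo "MISSING: $1 (add it to PATH before running mvn)"
  fi
}

# Tool list is an assumption based on the errors reported in the thread.
for tool in protoc mvn cmake; do
  check_tool "$tool"
done
```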