Subject: Re: Problems compiling hadoop trunk on windows
From: Arun Suresh
Date: Tue, 31 Jan 2017 17:37:54 -0800
To: Brahma Reddy Battula
Cc: Hadoop Common; yarn-dev@hadoop.apache.org; Hdfs-dev

Thanks Brahma, that worked. Unfortunately, the compile now complains that I
do not have the right SDK, even though I have VS 2015, Windows SDK 8.1, and
SDK 7.1 installed.

Cheers
-Arun

On Sun, Jan 29, 2017 at 7:31 PM, Brahma Reddy Battula <
brahmareddy.battula@huawei.com> wrote:

> It might be a problem with the Maven repository. What is your Maven
> repository path? Maybe you can change it to a shorter path and check.
>
> Regards
> Brahma Reddy Battula
>
> -----Original Message-----
> From: Arun Suresh [mailto:asuresh@apache.org]
> Sent: 29 January 2017 03:15
> To: Hadoop Common; yarn-dev@hadoop.apache.org; Hdfs-dev
> Subject: Problems compiling hadoop trunk on windows
>
> Hi
>
> I was wondering if folks who compile Hadoop trunk on Windows regularly
> have hit this issue:
>
> [INFO] --- native-maven-plugin:1.0-alpha-8:javah (default) @ hadoop-common ---
> [INFO] cmd.exe /X /C ""C:\Program Files\Java\jdk1.8.0_121\bin\javah" -d
> C:\Users\arsuresh\stuff\hadoop\hadoop-common-project\hadoop-common\target\native\javah
> -classpath org.apache.hadoop.util.NativeCrc32"
> ....
> ....
> *The command line is too long.*
> [INFO] ------------------------------------------------------------------------
> [INFO] Reactor Summary:
> [INFO]
> [INFO] Apache Hadoop Main ................................. SUCCESS [  1.232 s]
> .....
> .....
> [INFO] Apache Hadoop Common ............................... FAILURE [ 20.245 s]
> .....
> .....
> [ERROR] Failed to execute goal
> org.codehaus.mojo:native-maven-plugin:1.0-alpha-8:javah (default) on
> project hadoop-common: Error running javah command: Error executing
> command line. Exit code:1 -> [Help 1]
>
> It looks like the arguments passed to 'javah' cause the command line to
> grow too long. I tried moving the root dir to C:\hadoopo, but that doesn't
> work either. Is there some config I need to set before compiling?
>
> Cheers
> -Arun
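
For anyone hitting the same "The command line is too long" failure: the javah
invocation in the quoted log presumably puts every dependency jar from the
local Maven repository on its -classpath (the log truncates the value), so the
overall length is dominated by the repository path rather than the source tree
path. A minimal sketch of the workaround Brahma suggests, using a hypothetical
short repository path C:\m2:

    :: One-off: point Maven at a short local repository for this build only.
    :: C:\m2 is only an example; any short directory works.
    mvn install -DskipTests -Dmaven.repo.local=C:\m2

    :: Or make it permanent in %USERPROFILE%\.m2\settings.xml:
    ::   <localRepository>C:\m2</localRepository>

Both -Dmaven.repo.local and the <localRepository> element are standard Maven
settings; shortening the repository path shortens every jar path that ends up
on javah's command line.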
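
On the follow-up SDK complaint: the Windows native build is sensitive to the
environment it is launched from. A hedged sketch of the usual setup, based on
the Windows notes in Hadoop's BUILDING.txt of that era (the exact Visual
Studio / Windows SDK combination trunk expects may differ, so treat
BUILDING.txt as the authority):

    :: Run from a Visual Studio / Windows SDK developer command prompt so that
    :: msbuild, the native compilers, and the SDK headers are on PATH.

    :: BUILDING.txt asks for the target platform to be set explicitly on Windows.
    set Platform=x64

    :: Build with the Windows native profile; the short repository path from the
    :: sketch above is reused here.
    mvn package -Pdist,native-win -DskipTests -Dtar -Dmaven.repo.local=C:\m2

If the build still reports a missing SDK, the installed Visual Studio version
may simply not match the toolset the native project files expect; BUILDING.txt
lists the supported combination.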