hadoop-mapreduce-dev mailing list archives

From Arun C Murthy <...@hortonworks.com>
Subject Re: Build failure in map reduce trunk
Date Sun, 28 Aug 2011 11:06:03 GMT

On Aug 28, 2011, at 3:47 AM, Tharindu Mathew wrote:

> Arun,
> 
> One of the main reasons I built the source is to start contributing. :)
> 

Great, welcome!

> Although it's clear how to make code contributions from
> http://wiki.apache.org/hadoop/HowToContribute, I'm a bit unclear on how to
> make documentation contributions. Do I just create a JIRA and attach a text
> file, so that it can be included in the wiki?
> 
> 

Yep. You can start by providing a patch to fix hadoop-mapreduce-project/INSTALL and/or hadoop-mapreduce-project/hadoop-yarn/README.
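[For anyone following along, the usual flow from the HowToContribute wiki looks roughly like this; the JIRA key is a placeholder for the issue you'd file first, and paths are illustrative.]

```shell
# Sketch of the standard patch workflow (JIRA key and edited file illustrative).
svn checkout http://svn.apache.org/repos/asf/hadoop/common/trunk hadoop-trunk
cd hadoop-trunk
# ...edit hadoop-mapreduce-project/INSTALL or hadoop-yarn/README...
svn diff > MAPREDUCE-NNNN.patch   # attach this to the JIRA you created
```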

Better yet, you can start by adding docs via docbook for maven, which we don't currently have.

IIRC, that is the standard way of documenting - I know HBase uses that too.

> Also, the quick start page at
> http://hadoop.apache.org/common/docs/r0.20.0/quickstart.html has a small
> error as well. A grep doesn't seem to show where it's generated from. Can
> you help me out here too?
> 

That is done via Forrest; look at hadoop-mapreduce-project/src/docs/src/documentation/content/xdocs/.
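[A sketch, not from the thread: with Apache Forrest installed locally, the old Forrest-driven docs were typically regenerated through ant; the target name and property are from memory and may differ by branch.]

```shell
# Assumes Apache Forrest is installed; 'docs' target name may vary by branch.
cd hadoop-mapreduce-project
ant docs -Dforrest.home=/path/to/apache-forrest
```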

However, for MRv2 I think we should move to maven based docs via docbook or some such.

Thanks again.

Arun

> 
> On Sun, Aug 28, 2011 at 3:44 PM, Arun C Murthy <acm@hortonworks.com> wrote:
> 
>> I'd strongly encourage you to help out by providing documentation patches
>> with similar content... thus, you'll help the project and future users.
>> Thanks in advance!
>> 
>> Arun
>> 
>> On Aug 28, 2011, at 2:22 AM, Tharindu Mathew wrote:
>> 
>>> Thanks for the explanation Arun.
>>> 
>>> Created a post with the info gathered here, so that it will help someone
>>> else as well:
>>> http://tharindu-mathew.blogspot.com/2011/08/building-apache-hadoop-from-source.html
>>> 
>>> 
>>> On Sun, Aug 28, 2011 at 1:32 PM, Arun C Murthy <acm@hortonworks.com> wrote:
>>> 
>>>> Mathew,
>>>> 
>>>> The native code, in this context, is the C executable used to launch the
>>>> containers (tasks) by the NodeManager. The short summary of the executable
>>>> is that it's a setuid executable used to ensure that the unix process runs
>>>> as the actual user who submitted the job, not as the unix user of the
>>>> NodeManager.
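[To make the setuid point concrete, here is a hedged sketch of how such a binary gets its permission bits at deployment time. The file below is just a scratch stand-in; in a real cluster the binary must be owned by root, with execute permission limited to the NodeManager's group.]

```shell
# Illustration only: mark a (scratch) file setuid+setgid, with execute
# permission restricted to the group (real deployments use a root-owned binary).
touch container-executor
chmod 6050 container-executor
stat -c '%a' container-executor
```

With mode `6050`, the owner's uid (root in a real deployment) is assumed on exec, which is what lets the NodeManager launch each task as the submitting user rather than as itself.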
>>>> 
>>>> Arun
>>>> 
>>>> On Aug 28, 2011, at 12:51 AM, Tharindu Mathew wrote:
>>>> 
>>>>> Hi Praveen/Ravi,
>>>>> 
>>>>> Thanks for all the help. It built successfully.
>>>>> 
>>>>> I'm trying to get a feel for the project structure, build structure and
>>>>> map reduce. Is there a document I can read to understand the use of
>>>>> native code and where it fits within the mapreduce project?
>>>>> 
>>>>> On Fri, Aug 26, 2011 at 10:00 AM, Ravi Teja <raviteja@huawei.com> wrote:
>>>>> 
>>>>>> Hi Tharindu,
>>>>>> 
>>>>>> I think it is trying to compile the native code. You can add -P-cbuild
>>>>>> as an argument to skip it, as mentioned earlier by Arun.
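[Spelled out as a command sketch; add whatever other flags you normally build with.]

```shell
# The leading '-' in -P-cbuild deactivates the 'cbuild' profile,
# so Maven skips compiling the native container-executor code.
mvn clean install -P-cbuild -DskipTests
```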
>>>>>> 
>>>>>> Regards,
>>>>>> Ravi Teja
>>>>>> 
>>>>>> 
>>>>>> Thanks Praveen.
>>>>>> 
>>>>>> I managed to proceed further. Now I'm stuck at this point. I'd appreciate
>>>>>> it if you can tell me what I'm doing wrong.
>>>>>> 
>>>>>> Stacktrace:
>>>>>> 
>>>>>> [INFO] --- make-maven-plugin:1.0-beta-1:configure (make) @ hadoop-yarn-server-nodemanager ---
>>>>>> [INFO] checking for a BSD-compatible install... /usr/bin/install -c
>>>>>> [INFO] checking whether build environment is sane... yes
>>>>>> [INFO] checking for a thread-safe mkdir -p... ./install-sh -c -d
>>>>>> [INFO] checking for gawk... no
>>>>>> [INFO] checking for mawk... no
>>>>>> [INFO] checking for nawk... no
>>>>>> [INFO] checking for awk... awk
>>>>>> [INFO] checking whether make sets $(MAKE)... yes
>>>>>> [INFO] ./configure: line 2226: CHECK_INSTALL_CFLAG: command not found
>>>>>> [INFO] ./configure: line 2227: HADOOP_UTILS_SETUP: command not found
>>>>>> [INFO] checking for gcc... gcc
>>>>>> [INFO] checking for C compiler default output file name... a.out
>>>>>> [INFO] checking whether the C compiler works... yes
>>>>>> [INFO] checking whether we are cross compiling... no
>>>>>> [INFO] checking for suffix of executables...
>>>>>> [INFO] checking for suffix of object files... o
>>>>>> [INFO] checking whether we are using the GNU C compiler... yes
>>>>>> [INFO] checking whether gcc accepts -g... yes
>>>>>> [INFO] checking for gcc option to accept ISO C89... none needed
>>>>>> [INFO] checking for style of include used by make... GNU
>>>>>> [INFO] checking dependency style of gcc... gcc3
>>>>>> [INFO] checking whether gcc and cc understand -c and -o together... yes
>>>>>> [INFO] checking how to run the C preprocessor... gcc -E
>>>>>> [INFO] checking for grep that handles long lines and -e... /usr/bin/grep
>>>>>> [INFO] checking for egrep... /usr/bin/grep -E
>>>>>> [INFO] checking for ANSI C header files... yes
>>>>>> [INFO] checking for sys/types.h... yes
>>>>>> [INFO] checking for sys/stat.h... yes
>>>>>> [INFO] checking for stdlib.h... yes
>>>>>> [INFO] checking for string.h... yes
>>>>>> [INFO] checking for memory.h... yes
>>>>>> [INFO] checking for strings.h... yes
>>>>>> [INFO] checking for inttypes.h... yes
>>>>>> [INFO] checking for stdint.h... yes
>>>>>> [INFO] checking for unistd.h... yes
>>>>>> [INFO] checking for unistd.h... (cached) yes
>>>>>> [INFO] checking for stdbool.h that conforms to C99... yes
>>>>>> [INFO] checking for _Bool... yes
>>>>>> [INFO] checking for an ANSI C-conforming const... yes
>>>>>> [INFO] checking for off_t... yes
>>>>>> [INFO] checking for size_t... yes
>>>>>> [INFO] checking whether strerror_r is declared... yes
>>>>>> [INFO] checking for strerror_r... yes
>>>>>> [INFO] checking whether strerror_r returns char *... no
>>>>>> [INFO] checking for mkdir... yes
>>>>>> [INFO] checking for uname... yes
>>>>>> [INFO] configure: creating ./config.status
>>>>>> [INFO] config.status: creating Makefile
>>>>>> [INFO] config.status: executing depfiles commands
>>>>>> [INFO]
>>>>>> [INFO] --- make-maven-plugin:1.0-beta-1:make-install (install) @ hadoop-yarn-server-nodemanager ---
>>>>>> [INFO] depbase=`echo impl/configuration.o | sed 's|[^/]*$|.deps/&|;s|\.o$||'`;\
>>>>>> [INFO] gcc -DPACKAGE_NAME=\"linux-container-executor\" -DPACKAGE_TARNAME=\"linux-container-executor\" -DPACKAGE_VERSION=\"1.0.0\" -DPACKAGE_STRING=\"linux-container-executor\ 1.0.0\" -DPACKAGE_BUGREPORT=\"yarn-dev@yahoo-inc.com\" -D_GNU_SOURCE=1 -DPACKAGE=\"linux-container-executor\" -DVERSION=\"1.0.0\" -DSTDC_HEADERS=1 -DHAVE_SYS_TYPES_H=1 -DHAVE_SYS_STAT_H=1 -DHAVE_STDLIB_H=1 -DHAVE_STRING_H=1 -DHAVE_MEMORY_H=1 -DHAVE_STRINGS_H=1 -DHAVE_INTTYPES_H=1 -DHAVE_STDINT_H=1 -DHAVE_UNISTD_H=1 -DHAVE_UNISTD_H=1 -DHAVE__BOOL=1 -DHAVE_STDBOOL_H=1 -DHAVE_DECL_STRERROR_R=1 -DHAVE_STRERROR_R=1 -DHAVE_MKDIR=1 -DHAVE_UNAME=1 -I. -I./impl -Wall -g -Werror -DHADOOP_CONF_DIR= -MT impl/configuration.o -MD -MP -MF $depbase.Tpo -c -o impl/configuration.o impl/configuration.c &&\
>>>>>> [INFO] mv -f $depbase.Tpo $depbase.Po
>>>>>> [INFO] cc1: warnings being treated as errors
>>>>>> [INFO] impl/configuration.c: In function 'read_config':
>>>>>> [INFO] impl/configuration.c:144: warning: implicit declaration of function 'getline'
>>>>>> [INFO] make: *** [impl/configuration.o] Error 1
>>>>>> [INFO]
>>>>>> 
>> ------------------------------------------------------------------------
>>>>>> [INFO] Reactor Summary:
>>>>>> [INFO]
>>>>>> [INFO] hadoop-yarn-api ................................... SUCCESS [19.783s]
>>>>>> [INFO] hadoop-yarn-common ................................ SUCCESS [15.172s]
>>>>>> [INFO] hadoop-yarn-server-common ......................... SUCCESS [7.966s]
>>>>>> [INFO] hadoop-yarn-server-nodemanager .................... FAILURE [1:08.482s]
>>>>>> [INFO] hadoop-yarn-server-resourcemanager ................ SKIPPED
>>>>>> [INFO] hadoop-yarn-server-tests .......................... SKIPPED
>>>>>> [INFO] hadoop-yarn-server ................................ SKIPPED
>>>>>> [INFO] hadoop-yarn ....................................... SKIPPED
>>>>>> [INFO] hadoop-mapreduce-client-core ...................... SKIPPED
>>>>>> [INFO] hadoop-mapreduce-client-common .................... SKIPPED
>>>>>> [INFO] hadoop-mapreduce-client-shuffle ................... SKIPPED
>>>>>> [INFO] hadoop-mapreduce-client-app ....................... SKIPPED
>>>>>> [INFO] hadoop-mapreduce-client-hs ........................ SKIPPED
>>>>>> [INFO] hadoop-mapreduce-client-jobclient ................. SKIPPED
>>>>>> [INFO] hadoop-mapreduce-client ........................... SKIPPED
>>>>>> [INFO] hadoop-mapreduce .................................. SKIPPED
>>>>>> [INFO]
>>>>>> 
>> ------------------------------------------------------------------------
>>>>>> [INFO] BUILD FAILURE
>>>>>> [INFO]
>>>>>> 
>> ------------------------------------------------------------------------
>>>>>> [INFO] Total time: 1:51.950s
>>>>>> [INFO] Finished at: Fri Aug 26 01:46:51 IST 2011
>>>>>> [INFO] Final Memory: 27M/114M
>>>>>> [INFO]
>>>>>> 
>> ------------------------------------------------------------------------
>>>>>> [ERROR] Failed to execute goal org.codehaus.mojo:make-maven-plugin:1.0-beta-1:make-install (install) on project hadoop-yarn-server-nodemanager: make returned an exit value != 0. Aborting build; see command output above for more information. -> [Help 1]
>>>>>> [ERROR]
>>>>>> [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
>>>>>> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
>>>>>> 
>>>>>> On Thu, Aug 25, 2011 at 5:19 PM, Praveen Sripati
>>>>>> <praveensripati@gmail.com> wrote:
>>>>>> 
>>>>>>> Tharindu,
>>>>>>> 
>>>>>>> Looks like protoc is not available.
>>>>>>> 
>>>>>>> ---
>>>>>>> Cannot run program "protoc" (in directory "HOME/hadoop-trunk/hadoop-mapreduce/hadoop-yarn/hadoop-yarn-api"): error=2, No such file or directory -> [Help 1]
>>>>>>> ---
>>>>>>> 
>>>>>>> Here are instructions to build protoc. See
>>>>>>> http://svn.apache.org/repos/asf/hadoop/common/trunk/hadoop-mapreduce/hadoop-yarn/README
>>>>>>> 
>>>>>>> Make sure the protobuf library is in your library path, or set: export
>>>>>>> LD_LIBRARY_PATH=/usr/local/lib
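[For reference, building protoc from a protobuf source release follows the usual autotools steps; the version number below is illustrative, so use whatever the README above calls for.]

```shell
# Build and install protobuf, which provides the protoc compiler
# that the hadoop-yarn-api build shells out to.
tar xzf protobuf-2.4.1.tar.gz
cd protobuf-2.4.1
./configure --prefix=/usr/local
make
sudo make install
export LD_LIBRARY_PATH=/usr/local/lib   # per the README above
protoc --version
```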
>>>>>>> 
>>>>>>> Thanks,
>>>>>>> Praveen
>>>>>>> 
>>>>>>> On Thu, Aug 25, 2011 at 3:43 PM, Tharindu Mathew <mccloud35@gmail.com> wrote:
>>>>>>> 
>>>>>>>> Hi everyone,
>>>>>>>> 
>>>>>>>> I'm a newbie to this list. Hope this is not too much of a dumb question.
>>>>>>>> 
>>>>>>>> I'm trying to build the map reduce trunk. (I already built from the root
>>>>>>>> pom and everything built fine, but map reduce is not included in the
>>>>>>>> root pom.)
>>>>>>>> 
>>>>>>>> The build fails at the point given below. I'd appreciate it if someone
>>>>>>>> can help me out. Thanks in advance.
>>>>>>>> 
>>>>>>>> Stack trace:
>>>>>>>> 
>>>>>>>> [INFO]
>>>>>>>> 
>>>>>> 
>> ------------------------------------------------------------------------
>>>>>>>> [INFO] Building hadoop-yarn-api 1.0-SNAPSHOT
>>>>>>>> [INFO]
>>>>>>>> 
>>>>>> 
>> ------------------------------------------------------------------------
>>>>>>>> [INFO]
>>>>>>>> [INFO] --- maven-clean-plugin:2.4.1:clean (default-clean) @ hadoop-yarn-api ---
>>>>>>>> [INFO] Deleting
>>>>>>>> /Users/mackie/source-checkouts/hadoop-trunk/hadoop-mapreduce/hadoop-yarn/hadoop-yarn-api/target
>>>>>>>> [INFO]
>>>>>>>> [INFO] --- maven-antrun-plugin:1.6:run (create-protobuf-generated-sources-directory) @ hadoop-yarn-api ---
>>>>>>>> [INFO] Executing tasks
>>>>>>>> 
>>>>>>>> main:
>>>>>>>> [mkdir] Created dir:
>>>>>>>> /Users/mackie/source-checkouts/hadoop-trunk/hadoop-mapreduce/hadoop-yarn/hadoop-yarn-api/target/generated-sources/proto
>>>>>>>> [INFO] Executed tasks
>>>>>>>> [INFO]
>>>>>>>> [INFO] --- exec-maven-plugin:1.2:exec (generate-sources) @ hadoop-yarn-api ---
>>>>>>>> [INFO]
>>>>>>>> 
>>>>>> 
>> ------------------------------------------------------------------------
>>>>>>>> [INFO] Reactor Summary:
>>>>>>>> [INFO]
>>>>>>>> [INFO] hadoop-yarn-api ................................... FAILURE [2.027s]
>>>>>>>> [INFO] hadoop-yarn-common ................................ SKIPPED
>>>>>>>> [INFO] hadoop-yarn-server-common ......................... SKIPPED
>>>>>>>> [INFO] hadoop-yarn-server-nodemanager .................... SKIPPED
>>>>>>>> [INFO] hadoop-yarn-server-resourcemanager ................ SKIPPED
>>>>>>>> [INFO] hadoop-yarn-server-tests .......................... SKIPPED
>>>>>>>> [INFO] hadoop-yarn-server ................................ SKIPPED
>>>>>>>> [INFO] hadoop-yarn ....................................... SKIPPED
>>>>>>>> [INFO] hadoop-mapreduce-client-core ...................... SKIPPED
>>>>>>>> [INFO] hadoop-mapreduce-client-common .................... SKIPPED
>>>>>>>> [INFO] hadoop-mapreduce-client-shuffle ................... SKIPPED
>>>>>>>> [INFO] hadoop-mapreduce-client-app ....................... SKIPPED
>>>>>>>> [INFO] hadoop-mapreduce-client-hs ........................ SKIPPED
>>>>>>>> [INFO] hadoop-mapreduce-client-jobclient ................. SKIPPED
>>>>>>>> [INFO] hadoop-mapreduce-client ........................... SKIPPED
>>>>>>>> [INFO] hadoop-mapreduce .................................. SKIPPED
>>>>>>>> [INFO]
>>>>>>>> 
>>>>>> 
>> ------------------------------------------------------------------------
>>>>>>>> [INFO] BUILD FAILURE
>>>>>>>> [INFO]
>>>>>>>> 
>>>>>> 
>> ------------------------------------------------------------------------
>>>>>>>> [INFO] Total time: 2.606s
>>>>>>>> [INFO] Finished at: Thu Aug 25 15:33:02 IST 2011
>>>>>>>> [INFO] Final Memory: 9M/80M
>>>>>>>> [INFO]
>>>>>>>> 
>>>>>> 
>> ------------------------------------------------------------------------
>>>>>>>> [ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.2:exec (generate-sources) on project hadoop-yarn-api: Command execution failed.
>>>>>>>> Cannot run program "protoc" (in directory "HOME/hadoop-trunk/hadoop-mapreduce/hadoop-yarn/hadoop-yarn-api"): error=2, No such file or directory -> [Help 1]
>>>>>>>> org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.2:exec (generate-sources) on project hadoop-yarn-api: Command execution failed.
>>>>>>>> at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:217)
>>>>>>>> at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
>>>>>>>> at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
>>>>>>>> at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:84)
>>>>>>>> at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:59)
>>>>>>>> at org.apache.maven.lifecycle.internal.LifecycleStarter.singleThreadedBuild(LifecycleStarter.java:183)
>>>>>>>> at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:161)
>>>>>>>> at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:319)
>>>>>>>> at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:156)
>>>>>>>> at org.apache.maven.cli.MavenCli.execute(MavenCli.java:537)
>>>>>>>> at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:196)
>>>>>>>> at org.apache.maven.cli.MavenCli.main(MavenCli.java:141)
>>>>>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>>>>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>>>>>>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>>>>>>> at java.lang.reflect.Method.invoke(Method.java:597)
>>>>>>>> at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:290)
>>>>>>>> at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:230)
>>>>>>>> at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:409)
>>>>>>>> at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:352)
>>>>>>>> Caused by: org.apache.maven.plugin.MojoExecutionException: Command execution failed.
>>>>>>>> at org.codehaus.mojo.exec.ExecMojo.execute(ExecMojo.java:350)
>>>>>>>> at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:101)
>>>>>>>> at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:209)
>>>>>>>> ... 19 more
>>>>>>>> Caused by: java.io.IOException: Cannot run program "protoc" (in directory "/Users/mackie/source-checkouts/hadoop-trunk/hadoop-mapreduce/hadoop-yarn/hadoop-yarn-api"): error=2, No such file or directory
>>>>>>>> at java.lang.ProcessBuilder.start(ProcessBuilder.java:459)
>>>>>>>> at java.lang.Runtime.exec(Runtime.java:593)
>>>>>>>> at org.apache.commons.exec.launcher.Java13CommandLauncher.exec(Java13CommandLauncher.java:58)
>>>>>>>> at org.apache.commons.exec.DefaultExecutor.launch(DefaultExecutor.java:246)
>>>>>>>> at org.apache.commons.exec.DefaultExecutor.executeInternal(DefaultExecutor.java:302)
>>>>>>>> at org.apache.commons.exec.DefaultExecutor.execute(DefaultExecutor.java:149)
>>>>>>>> at org.codehaus.mojo.exec.ExecMojo.executeCommandLine(ExecMojo.java:589)
>>>>>>>> at org.codehaus.mojo.exec.ExecMojo.execute(ExecMojo.java:335)
>>>>>>>> ... 21 more
>>>>>>>> Caused by: java.io.IOException: error=2, No such file or directory
>>>>>>>> at java.lang.UNIXProcess.forkAndExec(Native Method)
>>>>>>>> at java.lang.UNIXProcess.<init>(UNIXProcess.java:53)
>>>>>>>> at java.lang.ProcessImpl.start(ProcessImpl.java:91)
>>>>>>>> at java.lang.ProcessBuilder.start(ProcessBuilder.java:452)
>>>>>>>> ... 28 more
>>>>>>>> [ERROR]
>>>>>>>> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
>>>>>>>> [ERROR]
>>>>>>>> [ERROR] For more information about the errors and possible solutions, please read the following articles:
>>>>>>>> [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
>>>>>>>> 
>>>>>>>> --
>>>>>>>> Regards,
>>>>>>>> 
>>>>>>>> Tharindu
>>>>>>>> 
>>>>>>> 
>>>>>> 
>>>>>> 
>>>>>> 
>>>>>> --
>>>>>> Regards,
>>>>>> 
>>>>>> Tharindu
>>>>>> 
>>>>>> 
>>>>> 
>>>>> 
>>>>> --
>>>>> Regards,
>>>>> 
>>>>> Tharindu
>>>> 
>>>> 
>>> 
>>> 
>>> --
>>> Regards,
>>> 
>>> Tharindu
>> 
>> 
> 
> 
> -- 
> Regards,
> 
> Tharindu

