hadoop-common-user mailing list archives

From: Harish Mallipeddi <harish.mallipe...@gmail.com>
Subject: Re: Building Hadoop from source
Date: Mon, 13 Jul 2009 09:22:01 GMT
OK, this turned out to be quite simple. I'm writing it down just in case
somebody faces the same problem.
The "mapreduce" repo actually contains builds of "common" and "hdfs" within
its lib/ folder. The only thing that's missing is the "scripts" folder (i.e.
$HADOOP_HOME/bin), which contains things like start-all.sh and stop-all.sh.
This bin/ folder can be copied from the "common" repo, and once you do that
everything works fine.
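
Roughly, the steps look like this (the directory names are only examples;
substitute wherever you checked out the sub-projects):

    # build the mapreduce sub-project; its lib/ already bundles the
    # common and hdfs jars
    cd ~/src/hadoop-mapreduce
    ant

    # copy the control scripts from the common checkout, since the
    # mapreduce build has no bin/ of its own
    cp -r ~/src/hadoop-common/bin ~/src/hadoop-mapreduce/bin

    # after that the usual scripts are available, e.g.
    ~/src/hadoop-mapreduce/bin/start-all.sh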

Cheers,
Harish

On Fri, Jul 10, 2009 at 10:41 AM, Harish Mallipeddi <
harish.mallipeddi@gmail.com> wrote:

> Any ideas, people?
> I found this page, which includes instructions for core committers on how to
> make a release from SVN (but it looks outdated too).
>
> http://wiki.apache.org/hadoop/HowToRelease
>
> Thanks,
> Harish
>
>
> On Thu, Jul 9, 2009 at 5:59 PM, Harish Mallipeddi <
> harish.mallipeddi@gmail.com> wrote:
>
>> "ant jar" builds a jar. But the project has been split into 3 separate
>> entities. There has to be a script which combines the builds from the 3
>> sub-projects and produces one neat hadoop tarball similar to the
>> hadoop-0.20.0 release tarball which can be deployed?
>> - Harish
>>
>>
>> On Thu, Jul 9, 2009 at 5:40 PM, Mafish Liu <mafish@gmail.com> wrote:
>>
>>> Use "ant jar" if you want to jar file.
>>>
>>> 2009/7/9 Harish Mallipeddi <harish.mallipeddi@gmail.com>:
>>> > Hi,
>>> > Are there any instructions on how to build Hadoop from source? Now that
>>> > the project seems to have been split into separate projects (common,
>>> > hdfs, and mapreduce), there are 3 separate repositories under svn.
>>> > Information on this page is no longer correct:
>>> > http://hadoop.apache.org/core/version_control.html
>>> >
>>> > I checked out all three repos and tried building them with "ant". Even
>>> > though the builds completed without errors, I'm not sure how to get a
>>> > single hadoop tarball release (similar to hadoop-0.20.0.tar.gz from the
>>> > website) from them. Also, is the latest trunk of mapreduce supposed to
>>> > work with the latest trunk of common and hdfs, or will it only work with
>>> > specific versions of common and hdfs?
>>> >
>>> > Cheers,
>>> >
>>> > --
>>> > Harish Mallipeddi
>>> > http://blog.poundbang.in
>>> >
>>>
>>>
>>>
>>> --
>>> Mafish@gmail.com
>>> Institute of Computing Technology, Chinese Academy of Sciences, Beijing.
>>>
>>
>>
>>
>> --
>> Harish Mallipeddi
>> http://blog.poundbang.in
>>
>
>
>
> --
> Harish Mallipeddi
> http://blog.poundbang.in
>



-- 
Harish Mallipeddi
http://blog.poundbang.in
