Subject: Re: Hadoop virtual machine
From: jay vyas
To: "common-user@hadoop.apache.org"
Date: Sun, 6 Jul 2014 14:01:24 -0400

I really like the Cascading recipes above, thanks for sharing those!

Also, we have the *apache bigtop vagrant recipes*, which we curate for exactly this kind of thing. They are probably the most configurable and flexible option: you can spin up a single-node or multi-node cluster just by running the startup.sh script, they are super easy to use, and they give you maximal control over your environment.

1) git clone https://github.com/apache/bigtop
2) cd bigtop-deploy/vm/vagrant/vagrant-puppet
3) Follow the directions in the README to create your hadoop cluster

You can look into the provision script to see how to customize exactly which components (hbase, mahout, pig, ...) come installed in your distribution.

Feel free to drop a line on the bigtop mailing list if you need any help getting them up and running.

On Sun, Jul 6, 2014 at 12:47 PM, Andre Kelpe wrote:

> We have a multi-vm or single-vm setup with apache hadoop, if you want to
> give that a spin:
> https://github.com/Cascading/vagrant-cascading-hadoop-cluster
>
> - André
>
>
> On Sun, Jul 6, 2014 at 9:05 AM, MrAsanjar . wrote:
>
>> For my hadoop development and testing I use LXC (Linux containers) instead
>> of a VM, mainly due to its lightweight resource consumption.
As a matter of
>> fact, as I am typing, my Ubuntu system is automatically building a 6-node
>> hadoop cluster on my 16G laptop.
>> If you have an Ubuntu system you can install a fully configurable
>> Hadoop 2.2.0 single-node or multi-node cluster in less than 10 minutes.
>> Here is what you need to do:
>> 1) Install and learn Ubuntu Juju (shouldn't take an hour); instructions:
>> https://juju.ubuntu.com/docs/getting-started.html
>> 2) There are two types of hadoop charms:
>>      a) single node, for hadoop development:
>> https://jujucharms.com/?text=hadoop2-devel
>>      b) multi-node, for testing:
>> https://jujucharms.com/?text=hadoop
>> Let me know if you need more help.
>>
>>
>> On Sun, Jul 6, 2014 at 7:59 AM, Marco Shaw wrote:
>>
>>> Note that the CDH link is for Cloudera, which only provides Hadoop for
>>> Linux.
>>>
>>> HDP has "pre-built VMs" for both Linux and Windows hosts.
>>>
>>> You can also search for "HDInsight emulator", which runs on Windows and
>>> is based on HDP.
>>>
>>> Marco
>>>
>>> On Jul 6, 2014, at 12:38 AM, Gavin Yue wrote:
>>>
>>> http://hortonworks.com/products/hortonworks-sandbox/
>>>
>>> or
>>>
>>> CDH5
>>>
>>> http://www.cloudera.com/content/cloudera-content/cloudera-docs/DemoVMs/Cloudera-QuickStart-VM/cloudera_quickstart_vm.html
>>>
>>>
>>>
>>> On Sat, Jul 5, 2014 at 11:27 PM, Manar Elkady wrote:
>>>
>>>> Hi,
>>>>
>>>> I am a newcomer to Hadoop, and I have read many online tutorials on setting
>>>> up Hadoop on Windows using virtual machines, but all of them link to old
>>>> versions of Hadoop virtual machines.
>>>> Could anyone help me find a Hadoop virtual machine that includes a
>>>> newer version of Hadoop? Or should I build one myself from scratch?
>>>> Also, any well-explained Hadoop installation tutorial and any other
>>>> helpful material are appreciated.
>>>>
>>>> Manar,
>>>>
>>>
>>
>
> --
> André Kelpe
> andre@concurrentinc.com
> http://concurrentinc.com

--
jay vyas
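The Bigtop Vagrant steps at the top of the thread can be condensed into a short shell session. This is a sketch under assumptions: git, Vagrant, and a VM provider such as VirtualBox are already installed, and the directory layout and startup.sh script name are as given in the message; check the README in your checkout, since the layout may have changed since 2014.

```shell
# Sketch of the Bigtop Vagrant workflow described in the thread.
# Assumes git, Vagrant, and a provider (e.g. VirtualBox) are installed.
git clone https://github.com/apache/bigtop
cd bigtop/bigtop-deploy/vm/vagrant/vagrant-puppet

# Before booting, look into the provision script (see the README for its
# name) to choose which components (hbase, mahout, pig, ...) get installed.

./startup.sh   # spins up the cluster; single node by default per the message
```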
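The Juju-based alternative quoted in the thread can likewise be sketched in a few commands. This assumes the 2014-era juju-core client on Ubuntu; the charm names (hadoop2-devel, hadoop) are inferred from the message's search links and are not verified here.

```shell
# Sketch of the Juju workflow quoted in the thread (2014-era juju-core CLI).
sudo apt-get install juju-core   # step 1: install Juju, then read the getting-started docs
juju bootstrap                   # stand up a Juju environment first

# Step 2a: single-node charm for hadoop development (name unverified)
juju deploy hadoop2-devel

# Step 2b: or a multi-node cluster for testing, from the hadoop charm
# juju deploy hadoop

juju status                      # watch the units come up
```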