ambari-user mailing list archives

From Jeetendra G <jeetendr...@housing.com>
Subject Re: Running spark and map reduce jobs
Date Fri, 28 Aug 2015 06:10:12 GMT
Is it necessary to run Spark and MapReduce jobs from Oozie?


On Fri, Aug 28, 2015 at 11:23 AM, Jeetendra G <jeetendra.g@housing.com>
wrote:

> When I go to Admin -> Versions, I see HDP-2.3.0.0-2557.
>
>
>
>
>
> On Thu, Aug 27, 2015 at 11:36 PM, Alejandro Fernandez <
> afernandez@hortonworks.com> wrote:
>
>> Hi Jeetendra,
>>
>> What version of Ambari and HDP are you running?
>> You just need to install the Oozie server on any host, and pick the hosts
>> for the clients.
>> In HDP 2.3, it's possible to have multiple Oozie servers for High
>> Availability.
>>
>> The HDP binaries live under /usr/hdp/current; for Spark they are in
>> /usr/hdp/current/spark-server/bin.
>> Note that /usr/hdp/current/spark-server is a symlink to
>> /usr/hdp/2.#.#.#-####/spark
>>
>> Thanks,
>> Alejandro
>>
>> From: Jeetendra G <jeetendra.g@housing.com>
>> Reply-To: "user@ambari.apache.org" <user@ambari.apache.org>
>> Date: Thursday, August 27, 2015 at 4:06 AM
>> To: "user@ambari.apache.org" <user@ambari.apache.org>
>> Subject: Running spark and map reduce jobs
>>
>> Hi All, I have installed Ambari, and with Ambari I have installed
>> Hadoop, Spark, Hive, and Oozie.
>> When I was installing Oozie, it asked me on which hosts in my cluster
>> I need Oozie, i.e., on how many nodes.
>> I don't really understand why it asks which nodes Oozie should be
>> installed on; shouldn't it just be installed on a single node?
>>
>>
>> Also, how can I run my MapReduce and Spark jobs?
>>
>> Where does Ambari install the binaries for the installed packages? Are they in /bin?
>>
>>
>> Regards
>> Jeetendra
>>
>
>
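To illustrate the answers above: with the client binaries under /usr/hdp/current
that Alejandro mentions, Spark and MapReduce jobs can be submitted directly from
the command line; Oozie is not required just to run them, only to schedule or
chain them as workflows. Below is a minimal sketch. The example jar paths, the
SparkPi class, the YARN master setting, and the Oozie host/port are illustrative
assumptions, not taken from the thread, and may differ on your cluster.

    # Run a MapReduce example job directly with the HDP client binaries
    # (the examples jar path is an assumption about a default HDP layout).
    hadoop jar /usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-examples.jar pi 10 100

    # Submit a Spark example job on YARN; the spark-submit location and the
    # examples jar path follow the /usr/hdp/current layout mentioned above.
    /usr/hdp/current/spark-server/bin/spark-submit \
        --class org.apache.spark.examples.SparkPi \
        --master yarn-cluster \
        /usr/hdp/current/spark-server/lib/spark-examples*.jar 10

    # If the jobs should instead be scheduled through Oozie, a workflow would be
    # submitted with the Oozie CLI, e.g. (host and port are placeholders):
    # oozie job -oozie http://OOZIE_HOST:11000/oozie -config job.properties -run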
