ambari-user mailing list archives

From Jeetendra G <jeetendr...@housing.com>
Subject Running spark and map reduce jobs
Date Thu, 27 Aug 2015 11:06:50 GMT
Hi All,

I have installed Ambari, and through Ambari I have installed Hadoop, Spark, Hive, and Oozie.

When I was installing Oozie, it asked me on which nodes of my cluster I need Oozie, i.e. on how many nodes it should be placed. I don't really understand why it asks which nodes I want Oozie on; shouldn't it just install on a single node?


Also, how can I run my MapReduce and Spark jobs?
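For reference, this is roughly what I would expect to run from the command line, assuming a standard HDP-style layout under /usr/hdp; the jar paths and the input/output directories below are just my own guesses, so please correct me if this is not how it is meant to be done:

    # MapReduce: run the wordcount example that ships with Hadoop
    # (jar path assumes an HDP-style install)
    hadoop jar /usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-examples.jar \
        wordcount /user/jeetendra/input /user/jeetendra/output

    # Spark: submit the bundled SparkPi example to YARN in client mode
    spark-submit \
        --class org.apache.spark.examples.SparkPi \
        --master yarn-client \
        /usr/hdp/current/spark-client/lib/spark-examples*.jar 10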

Also, where does Ambari install the binaries of the packages it installs? Are they under /bin?
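My guess is that on an HDP stack they go under /usr/hdp rather than /bin, with /usr/hdp/current holding symlinks to the active versions, but I am not sure:

    # List the components Ambari has laid down (path assumes an HDP stack)
    ls /usr/hdp/current/
    # The client commands themselves seem to be symlinked into /usr/bin
    which hadoop spark-submit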


Regards
Jeetendra
