ambari-user mailing list archives

From aman poonia <aman.poonia...@gmail.com>
Subject Re: Can Ambari be used to manage an existing vanilla apache hadoop cluster?
Date Tue, 23 Jan 2018 07:00:53 GMT
There is no easy approach, but you can try the following:
1. Change the configuration in Ambari to match your existing cluster.
2. Change the HDFS data directories (e.g. dfs.namenode.name.dir and
dfs.datanode.data.dir) so they point to the directories your current
cluster uses.
3. Install the ambari-agents and the ambari-server.
4. On one master host, stop the services of the existing cluster, then
start the services on that host through Ambari.
5. Repeat the above for all hosts.

Hope this helps and solves your case
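
For step 2, this amounts to pointing the HDFS site properties (set through Ambari's HDFS config screen, which writes hdfs-site.xml) at the directories the existing cluster already uses. A minimal sketch, where the /data/hadoop/... paths are placeholders for whatever your current cluster actually has:

```xml
<!-- hdfs-site.xml fragment; once Ambari manages the cluster, set these
     in the Ambari UI rather than editing the file by hand.
     The /data/hadoop/... values below are example paths, not defaults. -->
<property>
  <name>dfs.namenode.name.dir</name>
  <value>/data/hadoop/hdfs/namenode</value>
</property>
<property>
  <name>dfs.datanode.data.dir</name>
  <value>/data/hadoop/hdfs/data</value>
</property>
```

Getting these two values wrong (or letting Ambari format the defaults) is what would make the NameNode come up empty, so double-check them before starting any service through Ambari.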


-- 
*With Regards,*
*Aman Poonia*

On Tue, Jan 23, 2018 at 12:25 PM, 李兆罚 <lizhaohua25@163.com> wrote:

>     Hi !
>
>
>
>    We have an existing Hadoop cluster, which consists of
> vanilla Apache Hadoop 2.7.3 (not HDP) and Apache Hive 1.2.1 (not HDP).
> It has been running in production for about two years and holds a lot
> of important data.
>
>    We want to use Ambari. Since the version of Hadoop in HDP
> 2.6.3 is Hadoop 2.7.3 and the version of Hive is Hive 1.2.1, we think we
> might be able to manage the cluster with Ambari. Here is what we plan to do:
>
>     1. Install ambari 2.6.0.0
>     2. Install Hadoop and Hive (HDP 2.6.3) with Ambari
>     3. Replace the data (including metadata and block data) of
> the HDP HDFS with our existing HDFS data
>     4. Replace the metadata of HDP Hive with our existing Hive metadata
>
>     5. Use Ambari to manage the whole cluster
>
>     Does this approach make sense? Has anyone tried this
> before? What are the risks?
>
>     Thanks in advance!
>
>
>
>
> --
>
> lizhaohua25@163.com
>
>
>
>
>
>
