ambari-user mailing list archives

From Alejandro Fernandez <afernan...@hortonworks.com>
Subject Re: ambari hostcheck failing
Date Mon, 17 Nov 2014 18:43:01 GMT
Try increasing "DataNode volumes failure toleration"
(dfs.datanode.failed.volumes.tolerated).

http://hadoop.apache.org/docs/r2.3.0/hadoop-project-dist/hadoop-hdfs/hdfs-default.xml
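For reference, a minimal sketch of how that property might look in hdfs-site.xml (the value 1 below is only an example; choose a value lower than the number of data directories configured for the DataNode, the default is 0):

<property>
  <name>dfs.datanode.failed.volumes.tolerated</name>
  <!-- Number of volumes allowed to fail before the DataNode shuts down.
       Example value only; adjust to your volume count. -->
  <value>1</value>
</property>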

Thanks,
Alejandro

On Mon, Nov 17, 2014 at 7:42 AM, SSamineni <ssamineni009@yahoo.com> wrote:

> I am trying to install Hadoop through Ambari, but the host check is failing
> with the error "Not enough disk space on host (). A minimum of 2GB is
> required for ". Is there a way to bypass the disk check?
>
>


-- 
Alejandro Fernandez <afernandez@hortonworks.com>
Engineering - Ambari, Hortonworks, Inc. <http://hortonworks.com/>
786.303.7149

