hadoop-mapreduce-user mailing list archives

From Nitin Pawar <nitinpawar...@gmail.com>
Subject Re: How to estimate hadoop.tmp.dir disk space
Date Wed, 26 Dec 2012 08:01:39 GMT
Do you have drives mounted in a JBOD setup, where only some of the drives
are allocated to HDFS?

Run df -h on all the nodes; the mount that holds the logs, or any other
data that lives outside DFS, may be full.
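A minimal sketch of that check, assuming a POSIX df and that hadoop.tmp.dir sits under /tmp (the actual value comes from core-site.xml; the path here is only an illustration):

```shell
#!/bin/sh
# Check how full the filesystem backing a directory is. A full non-DFS
# mount here explains "No space left on device" even when
# `hadoop dfsadmin -report` still shows plenty of DFS capacity.

dir="/tmp"   # assumed location of hadoop.tmp.dir; substitute your configured value

# df -P gives a stable one-record-per-line format; column 5 is "Use%".
used=$(df -P "$dir" | awk 'NR==2 { gsub("%", "", $5); print $5 }')
echo "mount holding $dir is ${used}% full"
```

Running this on every node (e.g. via ssh in a loop) should reveal which machine's local mount is exhausted.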


On Wed, Dec 26, 2012 at 1:25 PM, centerqi hu <centerqi@gmail.com> wrote:

> hi all
>  I have run into the following error:
>
> Message: org.apache.hadoop.ipc.RemoteException: java.io.IOException: org.apache.hadoop.fs.FSError:
java.io.IOException: No space left on device
>
>
>  hadoop dfsadmin -report
>
>     Configured Capacity: 44302785945600 (40.29 TB)
>     Present Capacity: 42020351946752 (38.22 TB)
>     DFS Remaining: 8124859072512 (7.39 TB)
>     DFS Used: 33895492874240 (30.83 TB)
>     DFS Used%: 80.66%
>     Under replicated blocks: 1687
>     Blocks with corrupt replicas: 0
>     Missing blocks: 0
>
> However, my HDFS space looks adequate.
>
> After I increased the space available to hadoop.tmp.dir, the error disappeared.
> How can I estimate the disk space needed for hadoop.tmp.dir?
>
> thx
> --
> centerqi@gmail.com
>



-- 
Nitin Pawar
