hadoop-common-user mailing list archives

From Kevin <klz...@gmail.com>
Subject Re: Errors when hadoop.tmp.dir is set to multiple directories
Date Tue, 26 Aug 2008 20:43:23 GMT
It turns out that I should not set hadoop.tmp.dir to multiple
directories. Instead, I should override dfs.data.dir and
dfs.name.dir, which do accept comma-separated lists of directories.
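
For anyone hitting the same thing, here is a minimal sketch of the
change in hadoop-site.xml; the /disk1 and /disk2 paths below are just
placeholders for illustration, not my actual layout:

  <!-- Broken: hadoop.tmp.dir does not take a comma-separated list -->
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/disk1/hadoop/tmp,/disk2/hadoop/tmp</value>
  </property>

  <!-- Working: keep hadoop.tmp.dir as a single directory and list the
       storage directories explicitly. dfs.data.dir spreads block
       storage across the listed directories on each datanode, while
       multiple entries in dfs.name.dir keep redundant copies of the
       namenode metadata. -->
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/disk1/hadoop/tmp</value>
  </property>
  <property>
    <name>dfs.data.dir</name>
    <value>/disk1/hadoop/dfs/data,/disk2/hadoop/dfs/data</value>
  </property>
  <property>
    <name>dfs.name.dir</name>
    <value>/disk1/hadoop/dfs/name,/disk2/hadoop/dfs/name</value>
  </property>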

-Kevin



On Mon, Aug 18, 2008 at 3:03 PM, Kevin <klzhao@gmail.com> wrote:
> Hi,
>
> I guess it is not a rare use case to have Hadoop DFS running with
> multiple native OS directories by setting them in "hadoop.tmp.dir",
> separated by commas. However, mine does not work. I get errors like
> this on some datanodes
>
> 2008-08-18 15:05:35,167 ERROR org.apache.hadoop.dfs.DataNode:
> Exception: java.lang.NullPointerException
>        at org.apache.hadoop.dfs.FSDataset$FSDir.getBlockInfo(FSDataset.java:154)
>        at org.apache.hadoop.dfs.FSDataset$FSVolume.getBlockInfo(FSDataset.java:377)
>        at org.apache.hadoop.dfs.FSDataset$FSVolumeSet.getBlockInfo(FSDataset.java:475)
>        at org.apache.hadoop.dfs.FSDataset.getBlockReport(FSDataset.java:830)
>        at org.apache.hadoop.dfs.DataNode.offerService(DataNode.java:671)
>        at org.apache.hadoop.dfs.DataNode.run(DataNode.java:2667)
>        at java.lang.Thread.run(Thread.java:595)
>
> when I tried to upload (put) a file, but everything works fine when
> I use only one directory on each node.
>
> Does anyone know about this issue? Thank you!
>
> Best,
> -Kevin
>
