hadoop-mapreduce-user mailing list archives

From "real great.." <greatness.hardn...@gmail.com>
Subject Change the storage directory.
Date Fri, 25 Feb 2011 00:32:19 GMT
Hi,
As far as I can tell, Hadoop creates the DFS in the temp directory by default.
I tried changing this by editing hdfs-site.xml to:
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<!-- Put site-specific property overrides in this file. -->

<configuration>
  <property>
    <name>dfs.replication</name>
    <value>2</value>
  </property>


  <property>
    <name>dfs.data.dir</name>
    <value>/home/supreme/hdfs</value>
    <description>Comma-separated list of paths on the local filesystem of a
    DataNode where it should store its blocks.</description>
  </property>
</configuration>

However, this does not seem to work: HDFS is still being created in the temp
directory even after I reformat it.
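
Do I also need to point hadoop.tmp.dir somewhere else? As far as I understand,
dfs.data.dir and dfs.name.dir default to paths under ${hadoop.tmp.dir}, so I was
thinking of adding something like the following to core-site.xml (the path below
is just an example of what I had in mind):

<configuration>
  <property>
    <name>hadoop.tmp.dir</name>
    <!-- Example path only; defaults to a directory under /tmp,
         which may be why everything still ends up there. -->
    <value>/home/supreme/hadoop-tmp</value>
    <description>Base directory for other temporary directories.</description>
  </property>
</configuration>

Is that the right approach, or is something else overriding my hdfs-site.xml?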

-- 
Regards,
R.V.
