hadoop-common-user mailing list archives

From "Ryan Wang" <ryanwang.em...@gmail.com>
Subject Re: Any one can tell me about how to write to HDFS?
Date Fri, 30 Nov 2007 15:05:22 GMT
Hadoop version: 0.15.0. Here is my hadoop-site.xml:
<configuration>

<property>
  <name>fs.default.name</name>
  <value>hdfs://node01:9000</value>
  <description>
    The name of the default file system. Either the literal string
    "local" or a host:port for NDFS.
  </description>
</property>

<property>
  <name>mapred.job.tracker</name>
  <value>node01:9001</value>
  <description>
    The host and port that the MapReduce job tracker runs at. If
    "local", then jobs are run in-process as a single map and
    reduce task.
  </description>
</property>

<property>
  <name>mapred.map.tasks</name>
  <value>4</value>
  <description>
    Define mapred.map.tasks to be the number of slave hosts.
  </description>
</property>

<property>
  <name>mapred.reduce.tasks</name>
  <value>4</value>
  <description>
    Define mapred.reduce.tasks to be the number of slave hosts.
  </description>
</property>

<property>
  <name>dfs.name.dir</name>
  <value>/nutch/hdfs/name</value>
</property>

<property>
  <name>dfs.data.dir</name>
  <value>/nutch/hdfs/data</value>
</property>

<property>
  <name>mapred.system.dir</name>
  <value>/nutch/hdfs/mapreduce/system</value>
</property>

<property>
  <name>mapred.local.dir</name>
  <value>/nutch/hdfs/mapreduce/local</value>
</property>

<property>
  <name>dfs.replication</name>
  <value>1</value>
</property>

</configuration>
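
For reference, here is a minimal sketch of the copy program with the file
system pinned explicitly to the fs.default.name value above, instead of
relying on hadoop-site.xml being found on the classpath. The class name and
the explicit conf.set call are illustrative assumptions, not part of the
original program:

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class ExplicitHdfsCopy {

    public static void main(String[] argv) throws IOException {
        Configuration conf = new Configuration();
        // Without hadoop-site.xml on the classpath the client falls back to
        // the local file system, so point it at the namenode explicitly.
        conf.set("fs.default.name", "hdfs://node01:9000");

        FileSystem fs = FileSystem.get(conf);
        // Should name a DistributedFileSystem class when HDFS is in use.
        System.out.println("Using file system: " + fs.getClass().getName());

        // Copy the local file argv[0] into HDFS at argv[1].
        fs.copyFromLocalFile(new Path(argv[0]), new Path(argv[1]));
    }
}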


On Nov 30, 2007 10:57 PM, Arun C Murthy <arunc@yahoo-inc.com> wrote:

> Ryan,
>
> On Fri, Nov 30, 2007 at 10:48:30PM +0800, Ryan Wang wrote:
> >Hi,
> >I can communicate with the file system via shell commands, and it worked
> >correctly.
> >But when I try to write a program that writes a file to the file system,
> >it fails.
> >
>
> Could you provide more info on the errors, your configuration,
> hadoop-version etc.?
>
> http://wiki.apache.org/lucene-hadoop/Help
>
> Arun
> >import java.io.IOException;
> >
> >import org.apache.hadoop.conf.Configuration;
> >import org.apache.hadoop.fs.FileSystem;
> >import org.apache.hadoop.fs.Path;
> >
> >public class HadoopDFSFileReadWrite {
> >
> >    public static void main(String[] argv) throws IOException {
> >
> >        // Reads hadoop-default.xml and, if it is on the classpath,
> >        // hadoop-site.xml.
> >        Configuration dfsconf = new Configuration();
> >        FileSystem dfs = FileSystem.get(dfsconf);
> >
> >        Path inFile = new Path(argv[0]);
> >        Path outFile = new Path(argv[1]);
> >
> >        // Copies the local file inFile to outFile on the configured file system.
> >        dfs.copyFromLocalFile(inFile, outFile);
> >    }
> >}
> >
> >argv[0]=nutch/search/bin/javalibTest.tar.gz argv[1]=ryan/test.tar.gz
> >The program writes javalibTest.tar.gz to ryan/test.tar.gz under the
> >project's directory on the local file system, not into HDFS.
> >I also placed the modified hadoop-site.xml in the project's path.
> >I don't know why. Can anyone help me out?
> >
> >Thanks
> >Ryan
>
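
A quick way to check whether the modified hadoop-site.xml is actually being
picked up from the classpath (which would explain the file landing in the
local project directory) is to print the resolved default file system; a
sketch, not a tested fix:

import org.apache.hadoop.conf.Configuration;

public class CheckDefaultFs {
    public static void main(String[] argv) {
        // new Configuration() reads hadoop-default.xml plus any hadoop-site.xml
        // found on the classpath; if the site file is missing, this prints the
        // built-in local-file-system default rather than hdfs://node01:9000.
        Configuration conf = new Configuration();
        System.out.println("fs.default.name = " + conf.get("fs.default.name"));
    }
}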
