hadoop-hdfs-user mailing list archives

From: Harsh J <ha...@cloudera.com>
Subject: Re: How to use the HDFS just like the common file system?
Date: Mon, 09 May 2011 05:48:35 GMT
Note that you won't be able to write using that approach; it only
lets you read out streams given a URL. For the complete set of
operations, it's best to use the FileSystem classes, as Jeff pointed out.
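
For reference, here is a minimal sketch of the FileSystem route. It
assumes a default Configuration that picks up your cluster settings
(core-site.xml on the classpath); the path /tmp/common is the one from
the original question:

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.io.OutputStream;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class HdfsFileSystemExample {
      public static void main(String[] args) throws Exception {
        // Reads the default FS from core-site.xml on the classpath,
        // so this resolves to HDFS rather than the local file system.
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);

        // An HDFS path, not a java.io.File
        Path path = new Path("/tmp/common");

        // Write (overwrite if it already exists)
        OutputStream out = fs.create(path, true);
        out.write("hello hdfs\n".getBytes("UTF-8"));
        out.close();

        // Read it back
        BufferedReader in =
            new BufferedReader(new InputStreamReader(fs.open(path)));
        System.out.println(in.readLine());
        in.close();

        fs.close();
      }
    }

Run it with the Hadoop jars and your configuration directory on the
classpath, otherwise FileSystem.get() will fall back to the local file
system.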

On Mon, May 9, 2011 at 11:15 AM, Harsh J <harsh@cloudera.com> wrote:
> Tom White's book "Hadoop: The Definitive Guide" has a neat little
> section about this in Chapter 3. Look at the FsUrlStreamHandlerFactory
> class (http://hadoop.apache.org/common/docs/r0.20.2/api/org/apache/hadoop/fs/FsUrlStreamHandlerFactory.html)
>
> Once registered with java.net.URL, you should be able to use most
> regular Java classes, as long as you provide "hdfs" as the scheme
> (and the FileSystem configuration knows what to use for it).
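
(Inline, for reference: a minimal, read-only sketch of that URL route.
The namenode host and port in the hdfs:// URL below are assumptions;
substitute your own.)

    import java.io.InputStream;
    import java.net.URL;

    import org.apache.hadoop.fs.FsUrlStreamHandlerFactory;
    import org.apache.hadoop.io.IOUtils;

    public class HdfsUrlCat {
      static {
        // setURLStreamHandlerFactory may only be called once per JVM
        URL.setURLStreamHandlerFactory(new FsUrlStreamHandlerFactory());
      }

      public static void main(String[] args) throws Exception {
        InputStream in = null;
        try {
          // Host and port are placeholders for your namenode
          in = new URL("hdfs://namenode:8020/tmp/common").openStream();
          IOUtils.copyBytes(in, System.out, 4096, false);
        } finally {
          IOUtils.closeStream(in);
        }
      }
    }

(Because the factory can be set only once per JVM, this approach breaks
if another library has already registered its own; in that case use the
FileSystem API above instead.)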
>
> 2011/5/9 ltomuno <ltomuno@163.com>:
>> Using Java, I would write new File("/tmp/common"),
>> but /tmp/common is an HDFS file.
>> How can I implement this?
>> Thanks
>>
>>
>>
>
>
>
> --
> Harsh J
>



-- 
Harsh J
