hbase-user mailing list archives

From Ted Yu <yuzhih...@gmail.com>
Subject Re: Store Large files on HBase/HDFS
Date Sun, 21 Feb 2016 15:30:27 GMT
For #1, please take a look at
hadoop-hdfs-project/hadoop-hdfs-client/src/main/java/org/apache/hadoop/hdfs/DFSClient.java

e.g. the following methods:

  public DFSInputStream open(String src) throws IOException {

  public HdfsDataOutputStream append(final String src, final int buffersize,
      EnumSet<CreateFlag> flag, final Progressable progress,
      final FileSystem.Statistics statistics) throws IOException {
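
For illustration, a minimal sketch of the PUT/GET flow the thread describes: stream the document into an HDFS file through the public FileSystem API (which wraps DFSClient), then record its location as HBase metadata. The table name "docs", column family "meta", and the "/documents/" path layout are hypothetical choices, not from this thread; it assumes a running HDFS cluster and HBase 1.x client on the classpath.

```java
import java.io.InputStream;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.io.IOUtils;

public class DocumentStore {

  // PUT: copy the document bytes into an HDFS file, then store the
  // file's path as a metadata row in HBase (table/column names are
  // hypothetical examples).
  public static void putDocument(Configuration conf, String docId,
      InputStream in) throws Exception {
    FileSystem fs = FileSystem.get(conf);
    Path path = new Path("/documents/" + docId);  // hypothetical layout
    try (FSDataOutputStream out = fs.create(path)) {
      IOUtils.copyBytes(in, out, conf, false);
    }
    try (Connection conn = ConnectionFactory.createConnection(conf);
         Table table = conn.getTable(TableName.valueOf("docs"))) {
      Put put = new Put(Bytes.toBytes(docId));
      put.addColumn(Bytes.toBytes("meta"), Bytes.toBytes("hdfsPath"),
          Bytes.toBytes(path.toString()));
      table.put(put);
    }
  }

  // GET: look up the HDFS path in the HBase metadata row, then open
  // the file for streaming back to the caller.
  public static InputStream getDocument(Configuration conf, String docId)
      throws Exception {
    try (Connection conn = ConnectionFactory.createConnection(conf);
         Table table = conn.getTable(TableName.valueOf("docs"))) {
      byte[] raw = table.get(new Get(Bytes.toBytes(docId)))
          .getValue(Bytes.toBytes("meta"), Bytes.toBytes("hdfsPath"));
      return FileSystem.get(conf).open(new Path(Bytes.toString(raw)));
    }
  }
}
```

Streaming via IOUtils.copyBytes avoids buffering a 100 MB+ document in memory, which is the main reason to keep the blob in HDFS rather than in an HBase cell.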


On Wed, Feb 17, 2016 at 3:40 PM, Arun Patel <arunp.bigdata@gmail.com> wrote:

> I would like to store large documents (over 100 MB) on HDFS and insert
> metadata in HBase.
> 1) Users will use the HBase REST API for PUT and GET requests for storing
> and retrieving documents. In this case, how do I PUT and GET documents
> to/from HDFS? What are the recommended ways of storing and accessing
> documents to/from HDFS that provide optimum performance?
> Can you please share any sample code or a GitHub project?
> 2) What are the performance issues I need to know about?
> Regards,
> Arun
