hadoop-hdfs-user mailing list archives

From Peyman Mohajerian <mohaj...@gmail.com>
Subject Re: XML files in Hadoop
Date Sat, 03 Jan 2015 16:38:17 GMT
You can land the data in HDFS as XML files and use the Hive XML SerDe to read
the data and write it back in a more optimal format, e.g. ORC or Parquet
(depending somewhat on your choice of Hadoop distro). Querying XML data
directly via Hive is also doable, but slow. Converting to Avro is also
doable, but in my experience not as fast as ORC or Parquet. Columnar formats
give you better performance, but Avro has its own strengths, e.g. it handles
schema changes better.
You can also convert the format before you land the data in HDFS, e.g.
using Flume or some other tool to change the format in flight.
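The "convert before you land it" idea can be sketched with nothing but the Python standard library. The `<product>` record layout below is invented for illustration; a real pipeline (a Flume interceptor, a Spark job, etc.) would write the flattened rows out as ORC/Parquet rather than keep them in memory:

```python
import xml.etree.ElementTree as ET

def flatten_records(xml_text, record_tag="product"):
    """Parse an XML document and flatten each <record_tag> element into a
    flat dict of child tag -> text, ready to be written out in a
    columnar-friendly format (CSV, Avro, Parquet, ...)."""
    root = ET.fromstring(xml_text)
    rows = []
    for rec in root.iter(record_tag):
        rows.append({child.tag: (child.text or "").strip() for child in rec})
    return rows

# Hypothetical sample input; the tag names are not from the thread.
sample = """
<catalog>
  <product><name>Widget</name><store>A</store><year>2014</year></product>
  <product><name>Gadget</name><store>B</store><year>2014</year></product>
</catalog>
"""

rows = flatten_records(sample)
```

Once the data is tabular like this, the choice between ORC, Parquet, and Avro is just a question of which writer you hand the rows to.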



On Sat, Jan 3, 2015 at 8:33 AM, Shashidhar Rao <raoshashidhar123@gmail.com>
wrote:

> Sorry, not Hive files but XML files; converting them to some Avro format
> and storing these into Hive will be fast.
>
> On Sat, Jan 3, 2015 at 9:59 PM, Shashidhar Rao <raoshashidhar123@gmail.com
> > wrote:
>
>> Hi,
>>
>> The exact number of files is not known, but it will run into millions of
>> files, depending on the client, who collects terabytes of XML data every day.
>> Basically, storing is just one part; the main part will be how to query
>> these data, e.g. aggregations, counts, and some analytics over them.
>> Fast retrieval is required, e.g. for a particular year, what are the
>> top 10 products, top ten manufacturers, top ten stores, etc.
>>
>> Will Hive be a better choice? And will converting these Hive files to
>> some format work out?
>>
>> Thanks
>> Shashi
>>
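The kind of query described above ("top 10 products for a particular year") is a plain group-and-count; in Hive it would be a GROUP BY with ORDER BY and LIMIT 10. The same logic, sketched against hypothetical flattened rows (field names invented for illustration):

```python
from collections import Counter

# Hypothetical flattened records, as they might look after the XML
# has been converted into rows.
rows = [
    {"year": 2014, "product": "Widget"},
    {"year": 2014, "product": "Widget"},
    {"year": 2014, "product": "Gadget"},
    {"year": 2013, "product": "Sprocket"},
]

def top_products(rows, year, n=10):
    """Count products sold in `year` and return the n most frequent."""
    counts = Counter(r["product"] for r in rows if r["year"] == year)
    return counts.most_common(n)

result = top_products(rows, 2014)
```

The point is that this kind of aggregation only becomes fast at terabyte scale once the XML has been flattened into a columnar table that Hive can scan efficiently.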
>> On Sat, Jan 3, 2015 at 9:44 PM, Wilm Schumacher <
>> wilm.schumacher@gmail.com> wrote:
>>
>>> Hi,
>>>
>>> How many XML files are you planning to store? Perhaps it is possible to
>>> store them directly on HDFS and save the metadata in HBase. This sounds
>>> more reasonable to me.
>>>
>>> If the number of XML files is too large (millions or billions), then you
>>> can use Hadoop MapFiles to bundle files together, e.g. grouped by year
>>> or month.
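For intuition: a Hadoop MapFile is roughly a sorted key/value container plus an index that lets you seek to a key without scanning everything. The toy stdlib sketch below shows that idea (it is an analogy only, not the real `org.apache.hadoop.io.MapFile` API; key layout and names are invented), packing many small XML files under sorted keys such as year/month paths:

```python
import io

class TinyMapFile:
    """In-memory analogy of a Hadoop MapFile: values are appended to one
    data stream, and an index maps each key to (offset, length) so a
    single small file can be fetched without scanning the whole blob."""

    def __init__(self):
        self._data = io.BytesIO()
        self._index = {}  # key -> (offset, length)

    def append(self, key, value):
        # Hadoop's MapFile likewise requires keys in sorted order.
        if self._index and key <= max(self._index):
            raise ValueError("keys must be appended in increasing order")
        self._data.seek(0, io.SEEK_END)
        self._index[key] = (self._data.tell(), len(value))
        self._data.write(value)

    def get(self, key):
        off, length = self._index[key]
        self._data.seek(off)
        return self._data.read(length)

mf = TinyMapFile()
mf.append("2014-01/a.xml", b"<doc>A</doc>")
mf.append("2014-01/b.xml", b"<doc>B</doc>")
```

Grouping by year or month, as suggested above, just means choosing keys like `2014-01/...` so that related small files end up adjacent in one large HDFS file instead of burdening the NameNode with millions of tiny ones.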
>>>
>>> Regards,
>>>
>>> Wilm
>>>
>>> On 03.01.2015 at 17:06, Shashidhar Rao wrote:
>>> > Hi,
>>> >
>>> > Can someone help me by suggesting the best way to solve this use case?
>>> >
>>> > 1. XML files keep flowing in from an external system and need to be
>>> > stored in HDFS.
>>> > 2. These files can be stored directly using a NoSQL database, e.g. any
>>> > NoSQL database with XML support; or
>>> > 3. These files need to be processed and stored in one of the databases,
>>> > HBase, Hive, etc.
>>> > 4. There won't be any updates, only reads; data has to be retrieved
>>> > based on some queries, a dashboard has to be created, and bits of
>>> > analytics done.
>>> >
>>> > The XML files are huge and the expected number of nodes is roughly
>>> > around 12.
>>> > I am stuck on the storage part: if I convert XML to JSON and store
>>> > it in HBase, the processing cost from XML to JSON will be huge.
>>> >
>>> > It will be only reads and no updates.
>>> >
>>> > Please suggest how to store these XML files.
>>> >
>>> > Thanks
>>> > Shashi
>>>
>>>
>>
>
