accumulo-user mailing list archives

From Aji Janis <aji1...@gmail.com>
Subject Re: importdirectory in accumulo
Date Wed, 03 Apr 2013 18:15:12 GMT
I have some data in a text file in the following format.

rowid1 columnFamily1 colQualifier1 value
rowid1 columnFamily1 colQualifier2 value
rowid1 columnFamily2 colQualifier1 value
rowid2 columnFamily1 colQualifier1 value
rowid3 columnFamily1 colQualifier1 value

I want to import this data into a table in Accumulo. My end goal is to
understand how to use the bulk import feature in Accumulo. I tried to log in
to the Accumulo shell as root and then run:

#table mytable
#importdirectory /home/inputDir /home/failureDir true

but it didn't work. My data file was saved as data.txt in /home/inputDir. I
tried creating the directory/file structure both in HDFS and on the local
Linux filesystem, but neither worked. When trying locally, it keeps
complaining that failureDir doesn't exist:
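
For reference, this is roughly how I set up the directories in HDFS before
rerunning the command (paths simplified; I'm assuming the failure directory
has to exist and be empty before the import):

  $ hadoop fs -mkdir /home/inputDir
  $ hadoop fs -put data.txt /home/inputDir/
  $ hadoop fs -mkdir /home/failureDir

and then in the Accumulo shell:

  root@instance> table mytable
  root@instance mytable> importdirectory /home/inputDir /home/failureDir true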
...
java.io.FileNotFoundException: File does not exist: failures

When trying with the files in HDFS, I got no error on the console, but the
logger had the following messages:
...
[tableOps.BulkImport] WARN : hdfs://node....//inputDir/data.txt does not
have a valid extension, ignoring

or,

[tableOps.BulkImport] WARN : hdfs://node....//inputDir/data.txt is not a
map file, ignoring
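
From these warnings I'm guessing importdirectory only accepts Accumulo's own
file formats (RFiles with an .rf extension, or map files), so the text file
would first have to be converted. Is something along these lines the right
direction? This is an untested sketch; I'm assuming the FileOperations /
FileSKVWriter classes from accumulo-core, and that keys have to be appended
in sorted order:

  // Untested sketch: convert data.txt rows into an RFile
  // that importdirectory might accept.
  Configuration conf = CachedConfiguration.getInstance();
  FileSystem fs = FileSystem.get(conf);
  FileSKVWriter writer = FileOperations.getInstance().openWriter(
      "/home/inputDir/data.rf", fs, conf,
      AccumuloConfiguration.getDefaultConfiguration());
  writer.startDefaultLocalityGroup();
  // keys must be appended in sorted order
  writer.append(new Key("rowid1", "columnFamily1", "colQualifier1"),
      new Value("value".getBytes()));
  writer.close();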


Suggestions? Am I not setting up the job correctly? Thank you in advance for
your help.


On Wed, Apr 3, 2013 at 2:04 PM, Aji Janis <aji1705@gmail.com> wrote:

> I have some data in a text file in the following format:
>
> rowid1 columnFamily colQualifier value
> rowid1 columnFamily colQualifier value
> rowid1 columnFamily colQualifier value
>
