Date: Wed, 3 Apr 2013 14:36:31 -0400
Subject: Re: importdirectory in accumulo
From: Eric Newton <eric.newton@gmail.com>
To: user@accumulo.apache.org

You will have to write your own InputFormat class, which will parse your file and pass records to your reducer.

-Eric

On Wed, Apr 3, 2013 at 2:29 PM, Aji Janis wrote:
> Looking at the BulkIngestExample: it uses GenerateTestData to create a
> .txt file that contains Key: Value pairs. Correct me if I am wrong, but
> each new line is a new row, right?
>
> I need to know how to include families and qualifiers as well. In other
> words:
>
> 1) Do I set up a .txt file that can be converted into an Accumulo RFile
> using AccumuloFileOutputFormat, which can then be imported into my table?
>
> 2) If yes, what is the format of the .txt file?
>
> On Wed, Apr 3, 2013 at 2:19 PM, Eric Newton wrote:
>> Your data needs to be in the RFile format, and more importantly it needs
>> to be sorted.
>>
>> It's handy to use a Map/Reduce job to convert/sort your data. See the
>> BulkIngestExample.
>>
>> -Eric
>>
>> On Wed, Apr 3, 2013 at 2:15 PM, Aji Janis wrote:
>>> I have some data in a text file in the following format:
>>>
>>> rowid1 columnFamily1 colQualifier1 value
>>> rowid1 columnFamily1 colQualifier2 value
>>> rowid1 columnFamily2 colQualifier1 value
>>> rowid2 columnFamily1 colQualifier1 value
>>> rowid3 columnFamily1 colQualifier1 value
>>>
>>> I want to import this data into a table in Accumulo.
>>> My end goal is to understand how to use the BulkImport feature in
>>> Accumulo. I tried to log in to the Accumulo shell as root and then run:
>>>
>>> #table mytable
>>> #importdirectory /home/inputDir /home/failureDir true
>>>
>>> but it didn't work. My data file was saved as data.txt in
>>> /home/inputDir. I tried to create the directory/file structure both in
>>> HDFS and on the local filesystem, but neither worked. When trying
>>> locally, it keeps complaining about failureDir not existing:
>>> ...
>>> java.io.FileNotFoundException: File does not exist: failures
>>>
>>> When trying with files on HDFS, I get no error on the console, but the
>>> logger had the following messages:
>>> ...
>>> [tableOps.BulkImport] WARN : hdfs://node....//inputDir/data.txt does not
>>> have a valid extension, ignoring
>>>
>>> or,
>>>
>>> [tableOps.BulkImport] WARN : hdfs://node....//inputDir/data.txt is not a
>>> map file, ignoring
>>>
>>> Suggestions? Am I not setting up the job right? Thank you for your help
>>> in advance.
>>>
>>> On Wed, Apr 3, 2013 at 2:04 PM, Aji Janis wrote:
>>>> I have some data in a text file in the following format:
>>>>
>>>> rowid1 columnFamily colQualifier value
>>>> rowid1 columnFamily colQualifier value
>>>> rowid1 columnFamily colQualifier value
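[Editor's note] Eric's advice boils down to two requirements the conversion job must satisfy before AccumuloFileOutputFormat can write RFiles: parse each whitespace-delimited line into row/family/qualifier/value, and emit records sorted in Accumulo key order (row, then column family, then column qualifier). The sketch below illustrates just that parsing and ordering in plain Java; the `Record` class and `toSorted` helper are hypothetical names, and a real job would instead emit Accumulo `Key`/`Value` pairs from a mapper and let the MapReduce shuffle do the sorting, as in the BulkIngestExample.

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

// Minimal sketch of the parse-and-sort step a bulk-ingest job must perform.
// Not the BulkIngestExample itself: it shows only the ordering contract that
// AccumuloFileOutputFormat relies on (row, then family, then qualifier).
public class BulkSortSketch {

    // One parsed line of the input file: "rowid columnFamily colQualifier value"
    static final class Record {
        final String row, family, qualifier, value;

        Record(String line) {
            String[] f = line.trim().split("\\s+", 4);
            if (f.length != 4) {
                throw new IllegalArgumentException("expected 4 fields: " + line);
            }
            row = f[0]; family = f[1]; qualifier = f[2]; value = f[3];
        }

        @Override
        public String toString() {
            return row + " " + family + " " + qualifier + " " + value;
        }
    }

    // Parse all lines and sort them the way Accumulo expects keys in an RFile.
    static List<Record> toSorted(List<String> lines) {
        List<Record> recs = new ArrayList<>();
        for (String l : lines) {
            recs.add(new Record(l));
        }
        recs.sort(Comparator.comparing((Record r) -> r.row)
                .thenComparing(r -> r.family)
                .thenComparing(r -> r.qualifier));
        return recs;
    }
}
```

In a real MapReduce job this ordering comes for free: the mapper emits `Key` objects, and because `Key` is comparable in exactly this order, the framework's shuffle/sort delivers them to the reducer already sorted.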
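[Editor's note] The WARN messages in the thread ("does not have a valid extension" / "is not a map file") come from the bulk-import code skipping files it cannot treat as RFiles: `importdirectory` imports pre-built, sorted RFiles, not raw text, so a `data.txt` in the import directory is logged and ignored. The sketch below mirrors that pre-flight filtering; the `.rf` suffix check is an assumption based on Accumulo's RFile naming convention, and `BulkImportPreflight`/`importable` are hypothetical names, not part of the real server-side BulkImport code.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the filtering behind the "does not have a valid extension,
// ignoring" warning: bulk import only picks up files it recognizes as RFiles.
public class BulkImportPreflight {

    // Return the candidate file names the bulk import would accept; anything
    // else (e.g. data.txt) would be skipped with a warning in the logs.
    static List<String> importable(List<String> fileNames) {
        List<String> ok = new ArrayList<>();
        for (String name : fileNames) {
            if (name.endsWith(".rf")) {  // assumed RFile suffix convention
                ok.add(name);
            }
        }
        return ok;
    }
}
```

This is why the shell command in the thread fails silently on HDFS: the job must first convert the text to sorted RFiles (e.g. via AccumuloFileOutputFormat), and the failure directory must already exist before `importdirectory` runs.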