Subject: Re: Error executing the Bulk Ingest Example
From: Eric Newton <eric.newton@gmail.com>
To: accumulo-user@incubator.apache.org
Date: Fri, 17 Feb 2012 08:49:31 -0500

Please use full pathnames for the bulk directories:

./bin/tool.sh lib/accumulo-examples-*[^c].jar org.apache.accumulo.examples.mapreduce.bulk.BulkIngestExample <instance> <zookeepers> <username> <PW> test_bulk /tmp/bulk /tmp/bulkWork

-Eric

On Fri, Feb 17, 2012 at 2:00 AM, Scott Roberts <scotty@jhu.edu> wrote:
> All,
>
> I'm getting a stack trace when I run the Bulk Ingest example and the
> entries never get added to the test_bulk table. I'm using the example
> straight from the documentation, substituting the appropriate values for
> instance, zookeepers, username, and password:
>
> ./bin/tool.sh lib/accumulo-examples-*[^c].jar org.apache.accumulo.examples.mapreduce.bulk.BulkIngestExample <instance> <zookeepers> <username> <PW> test_bulk bulk tmp/bulkWork
>
> ------
> Mapreduce log snippet:
>
> 12/02/17 01:45:58 INFO input.FileInputFormat: Total input paths to process : 1
> 12/02/17 01:45:59 INFO mapred.JobClient: Running job: job_201202162354_0003
> 12/02/17 01:46:00 INFO mapred.JobClient:  map 0% reduce 0%
> 12/02/17 01:46:13 INFO mapred.JobClient:  map 100% reduce 0%
> 12/02/17 01:46:25 INFO mapred.JobClient:  map 100% reduce 100%
> 12/02/17 01:46:30 INFO mapred.JobClient: Job complete: job_201202162354_0003
> 12/02/17 01:46:30 INFO mapred.JobClient: Counters: 25
> 12/02/17 01:46:30 INFO mapred.JobClient:   Job Counters
> 12/02/17 01:46:30 INFO mapred.JobClient:     Launched reduce tasks=3
> 12/02/17 01:46:30 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=12712
> 12/02/17 01:46:30 INFO mapred.JobClient:     Total time spent by all reduces waiting after reserving slots (ms)=0
> 12/02/17 01:46:30 INFO mapred.JobClient:     Total time spent by all maps waiting after reserving slots (ms)=0
> 12/02/17 01:46:30 INFO mapred.JobClient:     Rack-local map tasks=1
> 12/02/17 01:46:30 INFO mapred.JobClient:     Launched map tasks=1
> 12/02/17 01:46:30 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=30570
> 12/02/17 01:46:30 INFO mapred.JobClient:   File Output Format Counters
> 12/02/17 01:46:30 INFO mapred.JobClient:     Bytes Written=5552
> 12/02/17 01:46:30 INFO mapred.JobClient:   FileSystemCounters
> 12/02/17 01:46:30 INFO mapred.JobClient:     FILE_BYTES_READ=30018
> 12/02/17 01:46:30 INFO mapred.JobClient:     HDFS_BYTES_READ=28111
> 12/02/17 01:46:30 INFO mapred.JobClient:     FILE_BYTES_WRITTEN=181390
> 12/02/17 01:46:30 INFO mapred.JobClient:     HDFS_BYTES_WRITTEN=5552
> 12/02/17 01:46:30 INFO mapred.JobClient:   File Input Format Counters
> 12/02/17 01:46:30 INFO mapred.JobClient:     Bytes Read=28000
> 12/02/17 01:46:30 INFO mapred.JobClient:   Map-Reduce Framework
> 12/02/17 01:46:30 INFO mapred.JobClient:     Reduce input groups=1000
> 12/02/17 01:46:30 INFO mapred.JobClient:     Map output materialized bytes=30018
> 12/02/17 01:46:30 INFO mapred.JobClient:     Combine output records=0
> 12/02/17 01:46:30 INFO mapred.JobClient:     Map input records=1000
> 12/02/17 01:46:30 INFO mapred.JobClient:     Reduce shuffle bytes=0
> 12/02/17 01:46:30 INFO mapred.JobClient:     Reduce output records=1000
> 12/02/17 01:46:30 INFO mapred.JobClient:     Spilled Records=2000
> 12/02/17 01:46:30 INFO mapred.JobClient:     Map output bytes=28000
> 12/02/17 01:46:30 INFO mapred.JobClient:     Combine input records=0
> 12/02/17 01:46:30 INFO mapred.JobClient:     Map output records=1000
> 12/02/17 01:46:30 INFO mapred.JobClient:     SPLIT_RAW_BYTES=111
> 12/02/17 01:46:30 INFO mapred.JobClient:     Reduce input records=1000
> 12/02/17 01:46:30 ERROR util.BulkImportHelper: org.apache.thrift.TApplicationException: prepareBulkImport failed: unknown result
> org.apache.thrift.TApplicationException: prepareBulkImport failed: unknown result
>         at org.apache.accumulo.core.client.impl.thrift.ClientService$Client.recv_prepareBulkImport(ClientService.java:245)
>         at org.apache.accumulo.core.client.impl.thrift.ClientService$Client.prepareBulkImport(ClientService.java:206)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>         at java.lang.reflect.Method.invoke(Method.java:597)
>         at cloudtrace.instrument.thrift.TraceWrap$2.invoke(TraceWrap.java:83)
>         at $Proxy1.prepareBulkImport(Unknown Source)
>         at org.apache.accumulo.core.util.BulkImportHelper.importDirectory(BulkImportHelper.java:152)
>         at org.apache.accumulo.core.client.admin.TableOperationsImpl.importDirectory(TableOperationsImpl.java:717)
>         at org.apache.accumulo.examples.mapreduce.bulk.BulkIngestExample.run(BulkIngestExample.java:143)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>         at org.apache.accumulo.examples.mapreduce.bulk.BulkIngestExample.main(BulkIngestExample.java:161)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>         at java.lang.reflect.Method.invoke(Method.java:597)
>         at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
> 12/02/17 01:46:30 ERROR util.BulkImportHelper: prepareBulkImport failed: unknown result
> Exception in thread "main" java.lang.RuntimeException: java.lang.RuntimeException: prepareBulkImport failed: unknown result
>         at org.apache.accumulo.examples.mapreduce.bulk.BulkIngestExample.run(BulkIngestExample.java:146)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>         at org.apache.accumulo.examples.mapreduce.bulk.BulkIngestExample.main(BulkIngestExample.java:161)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>         at java.lang.reflect.Method.invoke(Method.java:597)
>         at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
> Caused by: java.lang.RuntimeException: prepareBulkImport failed: unknown result
>         at org.apache.accumulo.core.util.BulkImportHelper.importDirectory(BulkImportHelper.java:160)
>         at org.apache.accumulo.core.client.admin.TableOperationsImpl.importDirectory(TableOperationsImpl.java:717)
>         at org.apache.accumulo.examples.mapreduce.bulk.BulkIngestExample.run(BulkIngestExample.java:143)
>         ... 7 more
>
> ------
> Monitor log snippet:
>
> 17 01:46:30,364 [client.ClientServiceHandler] ERROR: tserver:compute-0-0.local Error preparing bulk import directory tmp/bulkWork/files
> java.lang.NullPointerException
>         at org.apache.accumulo.server.client.ClientServiceHandler.prepareBulkImport(ClientServiceHandler.java:268)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>         at java.lang.reflect.Method.invoke(Method.java:597)
>         at cloudtrace.instrument.thrift.TraceWrap$1.invoke(TraceWrap.java:58)
>         at $Proxy2.prepareBulkImport(Unknown Source)
>         at org.apache.accumulo.core.client.impl.thrift.ClientService$Processor$prepareBulkImport.process(ClientService.java:984)
>         at org.apache.accumulo.core.tabletserver.thrift.TabletClientService$Processor.process(TabletClientService.java:904)
>         at org.apache.accumulo.server.util.TServerUtils$TimedProcessor.process(TServerUtils.java:141)
>         at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:253)
>         at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
>         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
>         at java.lang.Thread.run(Thread.java:662)
> ------
>
> I created the table and test data with no issues.
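The fix Eric suggests is to pass full pathnames rather than the relative `bulk` and `tmp/bulkWork` directories. A tiny, hypothetical shell helper (not part of the Accumulo examples) sketches how the directory arguments could be normalized to absolute paths before being handed to tool.sh:

```shell
#!/bin/sh
# abspath: print the given path unchanged if it is already absolute,
# otherwise anchor it at the current working directory. This avoids
# passing relative bulk directories like "tmp/bulkWork" to the tool.
abspath() {
  case "$1" in
    /*) printf '%s\n' "$1" ;;              # already absolute: keep as-is
    *)  printf '%s/%s\n' "$(pwd)" "$1" ;;  # relative: prefix the cwd
  esac
}

abspath /tmp/bulk        # -> /tmp/bulk
abspath tmp/bulkWork     # prints "$(pwd)/tmp/bulkWork"
```

One could then invoke the example as `... test_bulk "$(abspath bulk)" "$(abspath tmp/bulkWork)"`, though simply typing the absolute paths, as in Eric's command line, works just as well.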