Subject: Error using hadoop in non-distributed mode
From: Pat Ferrel
Date: Sun, 2 Sep 2012 15:10:08 -0700
Message-Id: <62F3AC2C-7D24-47A6-9FA4-415062F1D7C0@gmail.com>
To: user@hadoop.apache.org

I'm using Mahout with a local filesystem/non-HDFS config for debugging purposes. I'm running inside IntelliJ IDEA. When I run one particular part of the analysis I get the following error. I didn't write the code, but we are looking for some hint about what might cause it. This job completes without error in a single-node pseudo-clustered config, but I can't use the debugger on it very easily.

Several jobs in the pipeline complete without error, creating part files just fine…

12/09/02 14:56:29 INFO compress.CodecPool: Got brand-new decompressor
12/09/02 14:56:29 INFO compress.CodecPool: Got brand-new decompressor
12/09/02 14:56:29 INFO compress.CodecPool: Got brand-new decompressor
12/09/02 14:56:29 WARN mapred.LocalJobRunner: job_local_0002
java.io.FileNotFoundException: File /tmp/hadoop-pat/mapred/local/archive/-4686065962599733460_1587570556_150738331/file/Users/pat/Projects/big-data/b/ssvd/Q-job/R-m-00000 does not exist.
	at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:371)
	at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:245)
	at org.apache.mahout.common.iterator.sequencefile.SequenceFileDirValueIterator.<init>(SequenceFileDirValueIterator.java:92)
	at org.apache.mahout.math.hadoop.stochasticsvd.BtJob$BtMapper.setup(BtJob.java:219)
	at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:142)
	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
	at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:212)
Exception in thread "main" java.io.IOException: Bt job unsuccessful.
	at org.apache.mahout.math.hadoop.stochasticsvd.BtJob.run(BtJob.java:609)
	at org.apache.mahout.math.hadoop.stochasticsvd.SSVDSolver.run(SSVDSolver.java:397)
	at com.finderbots.analysis.AnalysisPipeline.SSVDTransformAndBack(AnalysisPipeline.java:257)
	at com.finderbots.analysis.AnalysisJob.run(AnalysisJob.java:20)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
	at com.finderbots.analysis.AnalysisJob.main(AnalysisJob.java:34)
Disconnected from the target VM, address: '127.0.0.1:63483', transport: 'socket'

The file /tmp/hadoop-pat/mapred/local/archive/6590995089539988730_1587570556_37122331/file/Users/pat/Projects/big-data/b/ssvd/Q-job/R-m-00000, which is the subject of the error, does not exist.

/Users/pat/Projects/big-data/b/ssvd/Q-job/R-m-00000 does exist at the time of the error. So the code is looking for the data in the wrong place?
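For context, the "local filesystem/non-HDFS" setup described above is usually just the stock Hadoop 1.x local-mode properties. A minimal sketch of what such a configuration might look like (the property names are the standard Hadoop 1.x ones; the values here are assumptions, not taken from this report):

```xml
<!-- Hypothetical local-mode configuration (standard Hadoop 1.x property names).
     fs.default.name = file:/// resolves paths against the local filesystem
     instead of HDFS; mapred.job.tracker = local selects LocalJobRunner,
     the in-process runner seen in the log above. -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>file:///</value>
  </property>
  <property>
    <name>mapred.job.tracker</name>
    <value>local</value>
  </property>
</configuration>
```

In this mode the job runner localizes DistributedCache entries under hadoop.tmp.dir (which defaults to /tmp/hadoop-${user.name}), producing paths of the form /tmp/hadoop-pat/mapred/local/archive/..., which matches the path in the error above.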