From: kaveh minooie
Date: Fri, 19 Apr 2013 17:35:50 -0700
To: user@hadoop.apache.org
Subject: error while running TestDFSIO

Hi everyone,

I am getting the error below when I run TestDFSIO. The job itself actually finishes successfully
(according to the jobtracker, at least), but this is what I get on the console:

crawler@d1r2n2:/hadoop$ bin/hadoop jar hadoop-test-1.1.1.jar TestDFSIO -write -nrFiles 10 -fileSize 1000
TestDFSIO.0.0.4
13/04/19 17:23:43 INFO fs.TestDFSIO: nrFiles = 10
13/04/19 17:23:43 INFO fs.TestDFSIO: fileSize (MB) = 1000
13/04/19 17:23:43 INFO fs.TestDFSIO: bufferSize = 1000000
13/04/19 17:23:43 INFO fs.TestDFSIO: creating control file: 1000 mega bytes, 10 files
13/04/19 17:23:44 INFO fs.TestDFSIO: created control files for: 10 files
13/04/19 17:23:44 INFO mapred.FileInputFormat: Total input paths to process : 10
13/04/19 17:23:44 INFO mapred.JobClient: Running job: job_201304191712_0002
13/04/19 17:23:45 INFO mapred.JobClient:  map 0% reduce 0%
13/04/19 17:24:06 INFO mapred.JobClient:  map 20% reduce 0%
13/04/19 17:24:07 INFO mapred.JobClient:  map 30% reduce 0%
13/04/19 17:24:09 INFO mapred.JobClient:  map 50% reduce 0%
13/04/19 17:24:11 INFO mapred.JobClient:  map 60% reduce 0%
13/04/19 17:24:12 INFO mapred.JobClient:  map 90% reduce 0%
13/04/19 17:24:13 INFO mapred.JobClient:  map 100% reduce 0%
13/04/19 17:24:21 INFO mapred.JobClient:  map 100% reduce 33%
13/04/19 17:24:22 INFO mapred.JobClient:  map 100% reduce 100%
13/04/19 17:24:23 INFO mapred.JobClient: Job complete: job_201304191712_0002
13/04/19 17:24:23 INFO mapred.JobClient: Counters: 33
13/04/19 17:24:23 INFO mapred.JobClient:   Job Counters
13/04/19 17:24:23 INFO mapred.JobClient:     Launched reduce tasks=1
13/04/19 17:24:23 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=210932
13/04/19 17:24:23 INFO mapred.JobClient:     Total time spent by all reduces waiting after reserving slots (ms)=0
13/04/19 17:24:23 INFO mapred.JobClient:     Total time spent by all maps waiting after reserving slots (ms)=0
13/04/19 17:24:23 INFO mapred.JobClient:     Rack-local map tasks=2
13/04/19 17:24:23 INFO mapred.JobClient:     Launched map tasks=10
13/04/19 17:24:23 INFO mapred.JobClient:     Data-local map tasks=8
13/04/19 17:24:23 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=8650
13/04/19 17:24:23 INFO mapred.JobClient:   File Input Format Counters
13/04/19 17:24:23 INFO mapred.JobClient:     Bytes Read=1120
13/04/19 17:24:23 INFO mapred.JobClient:   SkippingTaskCounters
13/04/19 17:24:23 INFO mapred.JobClient:     MapProcessedRecords=10
13/04/19 17:24:23 INFO mapred.JobClient:     ReduceProcessedGroups=5
13/04/19 17:24:23 INFO mapred.JobClient:   File Output Format Counters
13/04/19 17:24:23 INFO mapred.JobClient:     Bytes Written=79
13/04/19 17:24:23 INFO mapred.JobClient:   FileSystemCounters
13/04/19 17:24:23 INFO mapred.JobClient:     FILE_BYTES_READ=871
13/04/19 17:24:23 INFO mapred.JobClient:     HDFS_BYTES_READ=2330
13/04/19 17:24:23 INFO mapred.JobClient:     FILE_BYTES_WRITTEN=272508
13/04/19 17:24:23 INFO mapred.JobClient:     HDFS_BYTES_WRITTEN=10485760079
13/04/19 17:24:23 INFO mapred.JobClient:   Map-Reduce Framework
13/04/19 17:24:23 INFO mapred.JobClient:     Map output materialized bytes=925
13/04/19 17:24:23 INFO mapred.JobClient:     Map input records=10
13/04/19 17:24:23 INFO mapred.JobClient:     Reduce shuffle bytes=925
13/04/19 17:24:23 INFO mapred.JobClient:     Spilled Records=100
13/04/19 17:24:23 INFO mapred.JobClient:     Map output bytes=765
13/04/19 17:24:23 INFO mapred.JobClient:     Total committed heap usage (bytes)=7996702720
13/04/19 17:24:23 INFO mapred.JobClient:     CPU time spent (ms)=104520
13/04/19 17:24:23 INFO mapred.JobClient:     Map input bytes=260
13/04/19 17:24:23 INFO mapred.JobClient:     SPLIT_RAW_BYTES=1210
13/04/19 17:24:23 INFO mapred.JobClient:     Combine input records=0
13/04/19 17:24:23 INFO mapred.JobClient:     Reduce input records=50
13/04/19 17:24:23 INFO mapred.JobClient:     Reduce input groups=5
13/04/19 17:24:23 INFO mapred.JobClient:     Combine output records=0
13/04/19 17:24:23 INFO mapred.JobClient:     Physical memory (bytes) snapshot=7111999488
13/04/19 17:24:23 INFO mapred.JobClient:     Reduce output records=5
13/04/19 17:24:23 INFO mapred.JobClient:     Virtual memory (bytes) snapshot=28466053120
13/04/19 17:24:23 INFO mapred.JobClient:     Map output records=50
java.io.FileNotFoundException: File does not exist: /benchmarks/TestDFSIO/io_write/part-00000
        at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.fetchLocatedBlocks(DFSClient.java:1975)
        at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.openInfo(DFSClient.java:1944)
        at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.<init>(DFSClient.java:1936)
        at org.apache.hadoop.hdfs.DFSClient.open(DFSClient.java:731)
        at org.apache.hadoop.hdfs.DistributedFileSystem.open(DistributedFileSystem.java:165)
        at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:427)
        at org.apache.hadoop.fs.TestDFSIO.analyzeResult(TestDFSIO.java:339)
        at org.apache.hadoop.fs.TestDFSIO.run(TestDFSIO.java:462)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
        at org.apache.hadoop.fs.TestDFSIO.main(TestDFSIO.java:317)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
        at java.lang.reflect.Method.invoke(Unknown Source)
        at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
        at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
        at org.apache.hadoop.test.AllTestDriver.main(AllTestDriver.java:81)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
        at java.lang.reflect.Method.invoke(Unknown Source)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:156)

crawler@d1r2n2:/hadoop$ bin/hadoop fs -ls /benchmarks/TestDFSIO/io_write
Found 3 items
-rw-r--r--   2 crawler supergroup          0 2013-04-19 17:24 /benchmarks/TestDFSIO/io_write/_SUCCESS
drwxr-xr-x   - crawler supergroup          0 2013-04-19 17:23 /benchmarks/TestDFSIO/io_write/_logs
-rw-r--r--   2 crawler supergroup         79 2013-04-19 17:24 /benchmarks/TestDFSIO/io_write/part-00000.deflate
crawler@d1r2n2:/hadoop$

Does anyone have any idea what might be wrong here?
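P.S. As far as I can tell the data really did get written. HDFS_BYTES_WRITTEN=10485760079 works out exactly: 10 files x 1000 MB x 1,048,576 bytes/MB = 10,485,760,000 bytes of test data, plus the 79 bytes of the result file that shows up under "Bytes Written=79". So only the analyze step at the end seems to fail, not the write itself.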
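One more thing I notice: the reduce output landed as part-00000.deflate, while the stack trace shows TestDFSIO.analyzeResult() opening a plain part-00000. So I suspect our cluster config turns on job output compression (mapred.output.compress=true in mapred-site.xml), which TestDFSIO apparently doesn't expect. Since the trace shows it runs through ToolRunner, it should accept the generic -D options, so something like this might work around it (untested sketch on my part; the property name is the old mapred.* one from 1.x):

  # rerun the write test with output compression forced off for this job only
  bin/hadoop jar hadoop-test-1.1.1.jar TestDFSIO -D mapred.output.compress=false -write -nrFiles 10 -fileSize 1000

  # the existing compressed result should still be readable, since fs -text
  # decompresses .deflate output before printing it
  bin/hadoop fs -text /benchmarks/TestDFSIO/io_write/part-00000.deflate

Can anyone confirm whether that is actually the cause?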