hadoop-mapreduce-user mailing list archives

From Steve Lewis <lordjoe2...@gmail.com>
Subject Re: Why doesn't it print anything into the part-00000 file
Date Tue, 31 Jan 2012 16:09:10 GMT
1. Run locally with a small problem; until you can run a small local
problem, do not move to the cluster.
2. Make sure the mapper writes something: set a breakpoint at the write
call.
3. Make sure the reducer reads something.
4. Add write calls in the reducer's setup and cleanup methods until something
is written in between.
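
A minimal sketch of what points 2-4 can look like, assuming the new
org.apache.hadoop.mapreduce API (your logs show the old mapred API, so adapt
as needed); the class names and the "__reducer_setup__"/"__reducer_cleanup__"
markers are only illustrative, not your actual code:

import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;

public class DebugWordCount {

  public static class DebugMapper
      extends Mapper<Object, Text, Text, IntWritable> {
    private static final IntWritable ONE = new IntWritable(1);

    @Override
    protected void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      for (String token : value.toString().split("\\s+")) {
        if (!token.isEmpty()) {
          // Point 2: set a breakpoint here and confirm it is actually hit.
          context.write(new Text(token), ONE);
        }
      }
    }
  }

  public static class DebugReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {

    @Override
    protected void setup(Context context)
        throws IOException, InterruptedException {
      // Point 4: if only these markers appear in part-00000, the reducer
      // ran but received no map output.
      context.write(new Text("__reducer_setup__"), new IntWritable(0));
    }

    @Override
    protected void reduce(Text key, Iterable<IntWritable> values,
        Context context) throws IOException, InterruptedException {
      // Point 3: a breakpoint here shows whether the reducer reads anything.
      int sum = 0;
      for (IntWritable v : values) {
        sum += v.get();
      }
      context.write(key, new IntWritable(sum));
    }

    @Override
    protected void cleanup(Context context)
        throws IOException, InterruptedException {
      context.write(new Text("__reducer_cleanup__"), new IntWritable(0));
    }
  }
}

If the "Map output records" counter stays at 0, the problem is before the
shuffle and the reducer has nothing to read.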
On Jan 31, 2012 3:39 AM, "Luiz Antonio Falaguasta Barbosa" <
lafbarbosa@gmail.com> wrote:

> Ronald,
>
> It is the directory where _SUCCESS and part-00000 are generated, but they
> are empty. In Eclipse, I put the following in 'Run Configurations', in the
> Arguments tab:
>
> input.small out.ivory.small (the file containing the words and the directory
> where Hadoop should write the results, respectively)
>
> I'd appreciate any idea you could give me.
>
> Thanks so far and thanks in advance!
>
> Regards,
>
> Luiz
>
> 2012/1/30 Ronald Petty <ronald.petty@gmail.com>
>
>> Luiz,
>>
>> What is in this file: hdfs://10.22.1.2:54310/user/hadoop/out.ivory.small ?
>>
>> Kindest regards.
>>
>> Ron
>>
>> On Mon, Jan 30, 2012 at 7:24 AM, Luiz Antonio Falaguasta Barbosa <
>> lafbarbosa@gmail.com> wrote:
>>
>>> Hi Ronald,
>>>
>>> I didn't try to run it locally. I used a cluster at the university where
>>> I study.
>>>
>>> The console of Eclipse returns the following:
>>>
>>> 2012-01-28 10:34:54.450 java[22689:1903] Unable to load realm info from
>>> SCDynamicStore
>>>
>>> 12/01/28 10:34:58 INFO mapred.FileInputFormat: Total input paths to
>>> process : 2
>>>
>>> 12/01/28 10:34:58 INFO mapred.JobClient: Running job: job_local_0001
>>>
>>> 12/01/28 10:34:59 INFO mapred.MapTask: numReduceTasks: 1
>>>
>>> 12/01/28 10:34:59 INFO mapred.MapTask: io.sort.mb = 100
>>>
>>> 12/01/28 10:34:59 INFO mapred.MapTask: data buffer = 79691776/99614720
>>>
>>> 12/01/28 10:34:59 INFO mapred.MapTask: record buffer = 262144/327680
>>>
>>> 12/01/28 10:34:59 INFO mapred.JobClient:  map 0% reduce 0%
>>>
>>> 12/01/28 10:35:05 INFO mapred.LocalJobRunner: hdfs://
>>> 10.22.1.2:54310/user/hadoop/input.small/CHANGES.txt:0+412413
>>>
>>> 12/01/28 10:35:27 INFO mapred.MapTask: Starting flush of map output
>>>
>>> 12/01/28 10:35:27 INFO mapred.Task: Task:attempt_local_0001_m_000000_0
>>> is done. And is in the process of commiting
>>>
>>> 12/01/28 10:35:29 INFO mapred.LocalJobRunner: hdfs://
>>> 10.22.1.2:54310/user/hadoop/input.small/CHANGES.txt:0+412413
>>>
>>> 12/01/28 10:35:29 INFO mapred.Task: Task 'attempt_local_0001_m_000000_0'
>>> done.
>>>
>>> 12/01/28 10:35:29 INFO mapred.MapTask: numReduceTasks: 1
>>>
>>> 12/01/28 10:35:29 INFO mapred.MapTask: io.sort.mb = 100
>>>
>>> 12/01/28 10:35:29 INFO mapred.MapTask: data buffer = 79691776/99614720
>>>
>>> 12/01/28 10:35:29 INFO mapred.MapTask: record buffer = 262144/327680
>>>
>>> 12/01/28 10:35:29 INFO mapred.JobClient:  map 100% reduce 0%
>>>
>>> 12/01/28 10:35:30 INFO mapred.MapTask: Starting flush of map output
>>>
>>> 12/01/28 10:35:30 INFO mapred.Task: Task:attempt_local_0001_m_000001_0
>>> is done. And is in the process of commiting
>>>
>>> 12/01/28 10:35:32 INFO mapred.LocalJobRunner: hdfs://
>>> 10.22.1.2:54310/user/hadoop/input.small/CETEMPublico.small:0+3801
>>>
>>> 12/01/28 10:35:32 INFO mapred.Task: Task 'attempt_local_0001_m_000001_0'
>>> done.
>>>
>>> 12/01/28 10:35:32 INFO mapred.LocalJobRunner:
>>>
>>> 12/01/28 10:35:32 INFO mapred.Merger: Merging 2 sorted segments
>>>
>>> 12/01/28 10:35:32 INFO mapred.Merger: Down to the last merge-pass, with
>>> 0 segments left of total size: 0 bytes
>>>
>>> 12/01/28 10:35:32 INFO mapred.LocalJobRunner:
>>>
>>> 12/01/28 10:35:33 INFO mapred.Task: Task:attempt_local_0001_r_000000_0
>>> is done. And is in the process of commiting
>>>
>>> 12/01/28 10:35:33 INFO mapred.LocalJobRunner:
>>>
>>> 12/01/28 10:35:33 INFO mapred.Task: Task attempt_local_0001_r_000000_0
>>> is allowed to commit now
>>>
>>> 12/01/28 10:35:35 INFO mapred.FileOutputCommitter: Saved output of task
>>> 'attempt_local_0001_r_000000_0' to hdfs://
>>> 10.22.1.2:54310/user/hadoop/out.ivory.small
>>>
>>> 12/01/28 10:35:38 INFO mapred.LocalJobRunner: reduce > reduce
>>>
>>> 12/01/28 10:35:38 INFO mapred.LocalJobRunner: reduce > reduce
>>>
>>> 12/01/28 10:35:38 INFO mapred.Task: Task 'attempt_local_0001_r_000000_0'
>>> done.
>>>
>>> 12/01/28 10:35:38 INFO mapred.JobClient:  map 100% reduce 100%
>>>
>>> 12/01/28 10:35:42 INFO mapred.JobClient: Job complete: job_local_0001
>>>
>>> 12/01/28 10:35:42 INFO mapred.JobClient: Counters: 18
>>>
>>> 12/01/28 10:35:42 INFO mapred.JobClient:   File Input Format Counters
>>>
>>> 12/01/28 10:35:42 INFO mapred.JobClient:     Bytes Read=3800
>>>
>>> 12/01/28 10:35:42 INFO mapred.JobClient:   File Output Format Counters
>>>
>>> 12/01/28 10:35:42 INFO mapred.JobClient:     Bytes Written=0
>>>
>>> 12/01/28 10:35:42 INFO mapred.JobClient:   FileSystemCounters
>>>
>>> 12/01/28 10:35:42 INFO mapred.JobClient:     FILE_BYTES_READ=1402219
>>>
>>> 12/01/28 10:35:42 INFO mapred.JobClient:     HDFS_BYTES_READ=1244841
>>>
>>> 12/01/28 10:35:42 INFO mapred.JobClient:     FILE_BYTES_WRITTEN=1537564
>>>
>>> 12/01/28 10:35:42 INFO mapred.JobClient:   Map-Reduce Framework
>>>
>>> 12/01/28 10:35:42 INFO mapred.JobClient:     Map output materialized
>>> bytes=12
>>>
>>> 12/01/28 10:35:42 INFO mapred.JobClient:     Map input records=5
>>>
>>> 12/01/28 10:35:42 INFO mapred.JobClient:     Reduce shuffle bytes=0
>>>
>>> 12/01/28 10:35:42 INFO mapred.JobClient:     Spilled Records=0
>>>
>>> 12/01/28 10:35:42 INFO mapred.JobClient:     Map output bytes=0
>>>
>>> 12/01/28 10:35:42 INFO mapred.JobClient:     Map input bytes=3800
>>>
>>> 12/01/28 10:35:42 INFO mapred.JobClient:     SPLIT_RAW_BYTES=229
>>>
>>> 12/01/28 10:35:42 INFO mapred.JobClient:     Combine input records=0
>>>
>>> 12/01/28 10:35:42 INFO mapred.JobClient:     Reduce input records=0
>>>
>>> 12/01/28 10:35:42 INFO mapred.JobClient:     Reduce input groups=0
>>>
>>> 12/01/28 10:35:42 INFO mapred.JobClient:     Combine output records=0
>>>
>>> 12/01/28 10:35:42 INFO mapred.JobClient:     Reduce output records=0
>>>
>>> 12/01/28 10:35:42 INFO mapred.JobClient:     Map output records=0
>>>
>>> Thanks in advance!
>>>
>>> Regards,
>>>
>>> Luiz
>>>
>>> 2012/1/29 Ronald Petty <ronald.petty@gmail.com>
>>>
>>>> Luiz,
>>>>
>>>> Does your code work locally? I need a couple more details to help.
>>>>
>>>> Ron
>>>>
>>>> On Sat, Jan 28, 2012 at 4:51 PM, Luiz <luixbr@gmail.com> wrote:
>>>>
>>>>> Hi people,
>>>>>
>>>>> I wrote this code to implement per-term indexing (Ivory), like figure
>>>>> 4 of the paper http://www.dcs.gla.ac.uk/~richardm/papers/IPM_MapReduce.pdf
>>>>> but it doesn't print anything into the part-00000 file.
>>>>>
>>>>> Does somebody know why it doesn't print anything?
>>>>>
>>>>>
>>>>>
>>>>>
>>>>> Thanks in advance!
>>>>>
>>>>> Luiz
>>>>>
>>>>>
>>>>
>>>
>>>
>>> --
>>> []s,
>>>
>>> Luiz
>>>
>>
>>
>
>
> --
> []s,
>
> Luiz
>
