mahout-dev mailing list archives

From: Apache Jenkins Server <jenk...@builds.apache.org>
Subject: Build failed in Jenkins: Mahout-Examples-Cluster-Reuters #143
Date: Fri, 25 May 2012 19:26:14 GMT
See <https://builds.apache.org/job/Mahout-Examples-Cluster-Reuters/143/>

------------------------------------------
[...truncated 6021 lines...]
12/05/25 19:25:33 INFO mapred.LocalJobRunner: 
12/05/25 19:25:33 INFO mapred.Task: Task 'attempt_local_0003_m_000000_0' done.
12/05/25 19:25:33 INFO mapred.LocalJobRunner: 
12/05/25 19:25:33 INFO mapred.Merger: Merging 1 sorted segments
12/05/25 19:25:33 INFO mapred.Merger: Down to the last merge-pass, with 0 segments left of
total size: 0 bytes
12/05/25 19:25:33 INFO mapred.LocalJobRunner: 
12/05/25 19:25:33 INFO mapred.JobClient:  map 100% reduce 0%
12/05/25 19:25:33 INFO mapred.Task: Task:attempt_local_0003_r_000000_0 is done. And is in
the process of commiting
12/05/25 19:25:33 INFO mapred.LocalJobRunner: 
12/05/25 19:25:33 INFO mapred.Task: Task attempt_local_0003_r_000000_0 is allowed to commit
now
12/05/25 19:25:33 INFO output.FileOutputCommitter: Saved output of task 'attempt_local_0003_r_000000_0'
to /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/partial-vectors-0
12/05/25 19:25:36 INFO mapred.LocalJobRunner: reduce > reduce
12/05/25 19:25:36 INFO mapred.Task: Task 'attempt_local_0003_r_000000_0' done.
12/05/25 19:25:36 INFO mapred.JobClient:  map 100% reduce 100%
12/05/25 19:25:36 INFO mapred.JobClient: Job complete: job_local_0003
12/05/25 19:25:36 INFO mapred.JobClient: Counters: 16
12/05/25 19:25:36 INFO mapred.JobClient:   File Output Format Counters 
12/05/25 19:25:36 INFO mapred.JobClient:     Bytes Written=102
12/05/25 19:25:36 INFO mapred.JobClient:   FileSystemCounters
12/05/25 19:25:36 INFO mapred.JobClient:     FILE_BYTES_READ=133987827
12/05/25 19:25:36 INFO mapred.JobClient:     FILE_BYTES_WRITTEN=135233642
12/05/25 19:25:36 INFO mapred.JobClient:   File Input Format Counters 
12/05/25 19:25:36 INFO mapred.JobClient:     Bytes Read=101
12/05/25 19:25:36 INFO mapred.JobClient:   Map-Reduce Framework
12/05/25 19:25:36 INFO mapred.JobClient:     Reduce input groups=0
12/05/25 19:25:36 INFO mapred.JobClient:     Map output materialized bytes=6
12/05/25 19:25:36 INFO mapred.JobClient:     Combine output records=0
12/05/25 19:25:36 INFO mapred.JobClient:     Map input records=0
12/05/25 19:25:36 INFO mapred.JobClient:     Reduce shuffle bytes=0
12/05/25 19:25:36 INFO mapred.JobClient:     Reduce output records=0
12/05/25 19:25:36 INFO mapred.JobClient:     Spilled Records=0
12/05/25 19:25:36 INFO mapred.JobClient:     Map output bytes=0
12/05/25 19:25:36 INFO mapred.JobClient:     Combine input records=0
12/05/25 19:25:36 INFO mapred.JobClient:     Map output records=0
12/05/25 19:25:36 INFO mapred.JobClient:     SPLIT_RAW_BYTES=159
12/05/25 19:25:36 INFO mapred.JobClient:     Reduce input records=0
12/05/25 19:25:36 INFO common.HadoopUtil: Deleting /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/tf-vectors
12/05/25 19:25:36 INFO input.FileInputFormat: Total input paths to process : 1
12/05/25 19:25:36 INFO mapred.JobClient: Running job: job_local_0004
12/05/25 19:25:36 INFO mapred.MapTask: io.sort.mb = 100
12/05/25 19:25:37 INFO mapred.MapTask: data buffer = 79691776/99614720
12/05/25 19:25:37 INFO mapred.MapTask: record buffer = 262144/327680
12/05/25 19:25:37 INFO mapred.MapTask: Starting flush of map output
12/05/25 19:25:37 INFO mapred.Task: Task:attempt_local_0004_m_000000_0 is done. And is in
the process of commiting
12/05/25 19:25:37 INFO mapred.JobClient:  map 0% reduce 0%
12/05/25 19:25:39 INFO mapred.LocalJobRunner: 
12/05/25 19:25:39 INFO mapred.Task: Task 'attempt_local_0004_m_000000_0' done.
12/05/25 19:25:39 INFO mapred.LocalJobRunner: 
12/05/25 19:25:39 INFO mapred.Merger: Merging 1 sorted segments
12/05/25 19:25:39 INFO mapred.Merger: Down to the last merge-pass, with 0 segments left of
total size: 0 bytes
12/05/25 19:25:39 INFO mapred.LocalJobRunner: 
12/05/25 19:25:39 INFO mapred.Task: Task:attempt_local_0004_r_000000_0 is done. And is in
the process of commiting
12/05/25 19:25:39 INFO mapred.LocalJobRunner: 
12/05/25 19:25:39 INFO mapred.Task: Task attempt_local_0004_r_000000_0 is allowed to commit
now
12/05/25 19:25:39 INFO output.FileOutputCommitter: Saved output of task 'attempt_local_0004_r_000000_0'
to /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/tf-vectors
12/05/25 19:25:39 INFO mapred.JobClient:  map 100% reduce 0%
12/05/25 19:25:42 INFO mapred.LocalJobRunner: reduce > reduce
12/05/25 19:25:42 INFO mapred.Task: Task 'attempt_local_0004_r_000000_0' done.
12/05/25 19:25:43 INFO mapred.JobClient:  map 100% reduce 100%
12/05/25 19:25:43 INFO mapred.JobClient: Job complete: job_local_0004
12/05/25 19:25:43 INFO mapred.JobClient: Counters: 16
12/05/25 19:25:43 INFO mapred.JobClient:   File Output Format Counters 
12/05/25 19:25:43 INFO mapred.JobClient:     Bytes Written=102
12/05/25 19:25:43 INFO mapred.JobClient:   FileSystemCounters
12/05/25 19:25:43 INFO mapred.JobClient:     FILE_BYTES_READ=178650408
12/05/25 19:25:43 INFO mapred.JobClient:     FILE_BYTES_WRITTEN=180311008
12/05/25 19:25:43 INFO mapred.JobClient:   File Input Format Counters 
12/05/25 19:25:43 INFO mapred.JobClient:     Bytes Read=102
12/05/25 19:25:43 INFO mapred.JobClient:   Map-Reduce Framework
12/05/25 19:25:43 INFO mapred.JobClient:     Reduce input groups=0
12/05/25 19:25:43 INFO mapred.JobClient:     Map output materialized bytes=6
12/05/25 19:25:43 INFO mapred.JobClient:     Combine output records=0
12/05/25 19:25:43 INFO mapred.JobClient:     Map input records=0
12/05/25 19:25:43 INFO mapred.JobClient:     Reduce shuffle bytes=0
12/05/25 19:25:43 INFO mapred.JobClient:     Reduce output records=0
12/05/25 19:25:43 INFO mapred.JobClient:     Spilled Records=0
12/05/25 19:25:43 INFO mapred.JobClient:     Map output bytes=0
12/05/25 19:25:43 INFO mapred.JobClient:     Combine input records=0
12/05/25 19:25:43 INFO mapred.JobClient:     Map output records=0
12/05/25 19:25:43 INFO mapred.JobClient:     SPLIT_RAW_BYTES=157
12/05/25 19:25:43 INFO mapred.JobClient:     Reduce input records=0
12/05/25 19:25:43 INFO common.HadoopUtil: Deleting /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/partial-vectors-0
12/05/25 19:25:43 INFO common.HadoopUtil: Deleting /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/df-count
12/05/25 19:25:43 INFO input.FileInputFormat: Total input paths to process : 1
12/05/25 19:25:43 INFO mapred.JobClient: Running job: job_local_0005
12/05/25 19:25:43 INFO mapred.MapTask: io.sort.mb = 100
12/05/25 19:25:43 INFO mapred.MapTask: data buffer = 79691776/99614720
12/05/25 19:25:43 INFO mapred.MapTask: record buffer = 262144/327680
12/05/25 19:25:43 INFO mapred.MapTask: Starting flush of map output
12/05/25 19:25:43 INFO mapred.Task: Task:attempt_local_0005_m_000000_0 is done. And is in
the process of commiting
12/05/25 19:25:44 INFO mapred.JobClient:  map 0% reduce 0%
12/05/25 19:25:46 INFO mapred.LocalJobRunner: 
12/05/25 19:25:46 INFO mapred.Task: Task 'attempt_local_0005_m_000000_0' done.
12/05/25 19:25:46 INFO mapred.LocalJobRunner: 
12/05/25 19:25:46 INFO mapred.Merger: Merging 1 sorted segments
12/05/25 19:25:46 INFO mapred.Merger: Down to the last merge-pass, with 0 segments left of
total size: 0 bytes
12/05/25 19:25:46 INFO mapred.LocalJobRunner: 
12/05/25 19:25:46 INFO mapred.Task: Task:attempt_local_0005_r_000000_0 is done. And is in
the process of commiting
12/05/25 19:25:46 INFO mapred.LocalJobRunner: 
12/05/25 19:25:46 INFO mapred.Task: Task attempt_local_0005_r_000000_0 is allowed to commit
now
12/05/25 19:25:46 INFO output.FileOutputCommitter: Saved output of task 'attempt_local_0005_r_000000_0'
to /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/df-count
12/05/25 19:25:46 INFO mapred.JobClient:  map 100% reduce 0%
12/05/25 19:25:49 INFO mapred.LocalJobRunner: reduce > reduce
12/05/25 19:25:49 INFO mapred.Task: Task 'attempt_local_0005_r_000000_0' done.
12/05/25 19:25:49 INFO mapred.JobClient:  map 100% reduce 100%
12/05/25 19:25:49 INFO mapred.JobClient: Job complete: job_local_0005
12/05/25 19:25:49 INFO mapred.JobClient: Counters: 16
12/05/25 19:25:49 INFO mapred.JobClient:   File Output Format Counters 
12/05/25 19:25:49 INFO mapred.JobClient:     Bytes Written=105
12/05/25 19:25:49 INFO mapred.JobClient:   FileSystemCounters
12/05/25 19:25:49 INFO mapred.JobClient:     FILE_BYTES_READ=223312878
12/05/25 19:25:49 INFO mapred.JobClient:     FILE_BYTES_WRITTEN=225387847
12/05/25 19:25:49 INFO mapred.JobClient:   File Input Format Counters 
12/05/25 19:25:49 INFO mapred.JobClient:     Bytes Read=102
12/05/25 19:25:49 INFO mapred.JobClient:   Map-Reduce Framework
12/05/25 19:25:49 INFO mapred.JobClient:     Reduce input groups=0
12/05/25 19:25:49 INFO mapred.JobClient:     Map output materialized bytes=6
12/05/25 19:25:49 INFO mapred.JobClient:     Combine output records=0
12/05/25 19:25:49 INFO mapred.JobClient:     Map input records=0
12/05/25 19:25:49 INFO mapred.JobClient:     Reduce shuffle bytes=0
12/05/25 19:25:49 INFO mapred.JobClient:     Reduce output records=0
12/05/25 19:25:49 INFO mapred.JobClient:     Spilled Records=0
12/05/25 19:25:49 INFO mapred.JobClient:     Map output bytes=0
12/05/25 19:25:49 INFO mapred.JobClient:     Combine input records=0
12/05/25 19:25:49 INFO mapred.JobClient:     Map output records=0
12/05/25 19:25:49 INFO mapred.JobClient:     SPLIT_RAW_BYTES=150
12/05/25 19:25:49 INFO mapred.JobClient:     Reduce input records=0
12/05/25 19:25:49 INFO input.FileInputFormat: Total input paths to process : 1
12/05/25 19:25:49 INFO filecache.TrackerDistributedCacheManager: Creating frequency.file-0
in /tmp/hadoop-hudson/mapred/local/archive/682467587838248033_1334525619_91636919/file/tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans-work--6348288640532832657
with rwxr-xr-x
12/05/25 19:25:49 INFO filecache.TrackerDistributedCacheManager: Cached /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/frequency.file-0
as /tmp/hadoop-hudson/mapred/local/archive/682467587838248033_1334525619_91636919/file/tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/frequency.file-0
12/05/25 19:25:49 INFO filecache.TrackerDistributedCacheManager: Cached /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/frequency.file-0
as /tmp/hadoop-hudson/mapred/local/archive/682467587838248033_1334525619_91636919/file/tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/frequency.file-0
12/05/25 19:25:49 INFO mapred.JobClient: Running job: job_local_0006
12/05/25 19:25:50 INFO mapred.MapTask: io.sort.mb = 100
12/05/25 19:25:50 INFO mapred.MapTask: data buffer = 79691776/99614720
12/05/25 19:25:50 INFO mapred.MapTask: record buffer = 262144/327680
12/05/25 19:25:50 INFO mapred.MapTask: Starting flush of map output
12/05/25 19:25:50 INFO mapred.Task: Task:attempt_local_0006_m_000000_0 is done. And is in
the process of commiting
12/05/25 19:25:50 INFO mapred.JobClient:  map 0% reduce 0%
12/05/25 19:25:53 INFO mapred.LocalJobRunner: 
12/05/25 19:25:53 INFO mapred.Task: Task 'attempt_local_0006_m_000000_0' done.
12/05/25 19:25:53 INFO mapred.LocalJobRunner: 
12/05/25 19:25:53 INFO mapred.Merger: Merging 1 sorted segments
12/05/25 19:25:53 INFO mapred.Merger: Down to the last merge-pass, with 0 segments left of
total size: 0 bytes
12/05/25 19:25:53 INFO mapred.LocalJobRunner: 
12/05/25 19:25:53 INFO mapred.JobClient:  map 100% reduce 0%
12/05/25 19:25:53 INFO mapred.Task: Task:attempt_local_0006_r_000000_0 is done. And is in
the process of commiting
12/05/25 19:25:53 INFO mapred.LocalJobRunner: 
12/05/25 19:25:53 INFO mapred.Task: Task attempt_local_0006_r_000000_0 is allowed to commit
now
12/05/25 19:25:53 INFO output.FileOutputCommitter: Saved output of task 'attempt_local_0006_r_000000_0'
to /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/partial-vectors-0
12/05/25 19:25:56 INFO mapred.LocalJobRunner: reduce > reduce
12/05/25 19:25:56 INFO mapred.Task: Task 'attempt_local_0006_r_000000_0' done.
12/05/25 19:25:56 INFO mapred.JobClient:  map 100% reduce 100%
12/05/25 19:25:56 INFO mapred.JobClient: Job complete: job_local_0006
12/05/25 19:25:56 INFO mapred.JobClient: Counters: 16
12/05/25 19:25:56 INFO mapred.JobClient:   File Output Format Counters 
12/05/25 19:25:56 INFO mapred.JobClient:     Bytes Written=102
12/05/25 19:25:56 INFO mapred.JobClient:   FileSystemCounters
12/05/25 19:25:56 INFO mapred.JobClient:     FILE_BYTES_READ=267975873
12/05/25 19:25:56 INFO mapred.JobClient:     FILE_BYTES_WRITTEN=270468540
12/05/25 19:25:56 INFO mapred.JobClient:   File Input Format Counters 
12/05/25 19:25:56 INFO mapred.JobClient:     Bytes Read=102
12/05/25 19:25:56 INFO mapred.JobClient:   Map-Reduce Framework
12/05/25 19:25:56 INFO mapred.JobClient:     Reduce input groups=0
12/05/25 19:25:56 INFO mapred.JobClient:     Map output materialized bytes=6
12/05/25 19:25:56 INFO mapred.JobClient:     Combine output records=0
12/05/25 19:25:56 INFO mapred.JobClient:     Map input records=0
12/05/25 19:25:56 INFO mapred.JobClient:     Reduce shuffle bytes=0
12/05/25 19:25:56 INFO mapred.JobClient:     Reduce output records=0
12/05/25 19:25:56 INFO mapred.JobClient:     Spilled Records=0
12/05/25 19:25:56 INFO mapred.JobClient:     Map output bytes=0
12/05/25 19:25:56 INFO mapred.JobClient:     Combine input records=0
12/05/25 19:25:56 INFO mapred.JobClient:     Map output records=0
12/05/25 19:25:56 INFO mapred.JobClient:     SPLIT_RAW_BYTES=150
12/05/25 19:25:56 INFO mapred.JobClient:     Reduce input records=0
12/05/25 19:25:56 INFO common.HadoopUtil: Deleting /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/tfidf-vectors
12/05/25 19:25:56 INFO input.FileInputFormat: Total input paths to process : 1
12/05/25 19:25:56 INFO mapred.JobClient: Running job: job_local_0007
12/05/25 19:25:56 INFO mapred.MapTask: io.sort.mb = 100
12/05/25 19:25:57 INFO mapred.MapTask: data buffer = 79691776/99614720
12/05/25 19:25:57 INFO mapred.MapTask: record buffer = 262144/327680
12/05/25 19:25:57 INFO mapred.MapTask: Starting flush of map output
12/05/25 19:25:57 INFO mapred.Task: Task:attempt_local_0007_m_000000_0 is done. And is in
the process of commiting
12/05/25 19:25:57 INFO mapred.JobClient:  map 0% reduce 0%
12/05/25 19:25:59 INFO mapred.LocalJobRunner: 
12/05/25 19:25:59 INFO mapred.Task: Task 'attempt_local_0007_m_000000_0' done.
12/05/25 19:25:59 INFO mapred.LocalJobRunner: 
12/05/25 19:25:59 INFO mapred.Merger: Merging 1 sorted segments
12/05/25 19:25:59 INFO mapred.Merger: Down to the last merge-pass, with 0 segments left of
total size: 0 bytes
12/05/25 19:25:59 INFO mapred.LocalJobRunner: 
12/05/25 19:25:59 INFO mapred.JobClient:  map 100% reduce 0%
12/05/25 19:25:59 INFO mapred.Task: Task:attempt_local_0007_r_000000_0 is done. And is in
the process of commiting
12/05/25 19:25:59 INFO mapred.LocalJobRunner: 
12/05/25 19:25:59 INFO mapred.Task: Task attempt_local_0007_r_000000_0 is allowed to commit
now
12/05/25 19:25:59 INFO output.FileOutputCommitter: Saved output of task 'attempt_local_0007_r_000000_0'
to /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/tfidf-vectors
12/05/25 19:26:02 INFO mapred.LocalJobRunner: reduce > reduce
12/05/25 19:26:02 INFO mapred.Task: Task 'attempt_local_0007_r_000000_0' done.
12/05/25 19:26:02 INFO mapred.JobClient:  map 100% reduce 100%
12/05/25 19:26:02 INFO mapred.JobClient: Job complete: job_local_0007
12/05/25 19:26:02 INFO mapred.JobClient: Counters: 16
12/05/25 19:26:02 INFO mapred.JobClient:   File Output Format Counters 
12/05/25 19:26:02 INFO mapred.JobClient:     Bytes Written=102
12/05/25 19:26:02 INFO mapred.JobClient:   FileSystemCounters
12/05/25 19:26:02 INFO mapred.JobClient:     FILE_BYTES_READ=312638462
12/05/25 19:26:02 INFO mapred.JobClient:     FILE_BYTES_WRITTEN=315545934
12/05/25 19:26:02 INFO mapred.JobClient:   File Input Format Counters 
12/05/25 19:26:02 INFO mapred.JobClient:     Bytes Read=102
12/05/25 19:26:02 INFO mapred.JobClient:   Map-Reduce Framework
12/05/25 19:26:02 INFO mapred.JobClient:     Reduce input groups=0
12/05/25 19:26:02 INFO mapred.JobClient:     Map output materialized bytes=6
12/05/25 19:26:02 INFO mapred.JobClient:     Combine output records=0
12/05/25 19:26:02 INFO mapred.JobClient:     Map input records=0
12/05/25 19:26:02 INFO mapred.JobClient:     Reduce shuffle bytes=0
12/05/25 19:26:02 INFO mapred.JobClient:     Reduce output records=0
12/05/25 19:26:02 INFO mapred.JobClient:     Spilled Records=0
12/05/25 19:26:02 INFO mapred.JobClient:     Map output bytes=0
12/05/25 19:26:02 INFO mapred.JobClient:     Combine input records=0
12/05/25 19:26:02 INFO mapred.JobClient:     Map output records=0
12/05/25 19:26:02 INFO mapred.JobClient:     SPLIT_RAW_BYTES=157
12/05/25 19:26:02 INFO mapred.JobClient:     Reduce input records=0
12/05/25 19:26:02 INFO common.HadoopUtil: Deleting /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/partial-vectors-0
12/05/25 19:26:02 INFO driver.MahoutDriver: Program took 44100 ms (Minutes: 0.735)
hadoop binary is not in PATH,HADOOP_HOME/bin,HADOOP_PREFIX/bin, running locally
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/zonestorage/hudson_solaris/home/hudson/hudson-slave/workspace/Mahout-Examples-Cluster-Reuters/trunk/examples/target/mahout-examples-0.7-SNAPSHOT-job.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/zonestorage/hudson_solaris/home/hudson/hudson-slave/workspace/Mahout-Examples-Cluster-Reuters/trunk/examples/target/dependency/slf4j-jcl-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/zonestorage/hudson_solaris/home/hudson/hudson-slave/workspace/Mahout-Examples-Cluster-Reuters/trunk/examples/target/dependency/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
12/05/25 19:26:03 INFO common.AbstractJob: Command line arguments: {--clustering=null, --clusters=[/tmp/mahout-work-hudson/reuters-kmeans-clusters], --convergenceDelta=[0.5], --distanceMeasure=[org.apache.mahout.common.distance.CosineDistanceMeasure], --endPhase=[2147483647], --input=[/tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/tfidf-vectors/], --maxIter=[10], --method=[mapreduce], --numClusters=[20], --output=[/tmp/mahout-work-hudson/reuters-kmeans], --overwrite=null, --startPhase=[0], --tempDir=[temp]}
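The arguments above describe the k-means step of the Reuters example: TF-IDF vectors as input, 20 random seed clusters, cosine distance, and at most 10 iterations. For reference only, the sketch below shows one way to reproduce this step programmatically by handing the same values to KMeansDriver via ToolRunner (the entry path visible in the stack trace at the end of this log). It is an illustrative, untested snippet, not part of the build; the class name RerunReutersKMeans and the bare-flag forms of --overwrite and --clustering are assumptions.

    // Hypothetical re-run of the k-means step with the argument values taken from the log above.
    // Not from the Mahout build script; ToolRunner/KMeansDriver usage mirrors KMeansDriver.main.
    import org.apache.hadoop.util.ToolRunner;
    import org.apache.mahout.clustering.kmeans.KMeansDriver;

    public class RerunReutersKMeans {
      public static void main(String[] args) throws Exception {
        int rc = ToolRunner.run(new KMeansDriver(), new String[] {
            "--input", "/tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/tfidf-vectors/",
            "--clusters", "/tmp/mahout-work-hudson/reuters-kmeans-clusters",
            "--output", "/tmp/mahout-work-hudson/reuters-kmeans",
            "--distanceMeasure", "org.apache.mahout.common.distance.CosineDistanceMeasure",
            "--convergenceDelta", "0.5",
            "--maxIter", "10",
            "--numClusters", "20",
            "--method", "mapreduce",
            "--overwrite",      // appears as --overwrite=null in the parsed argument map above
            "--clustering"      // appears as --clustering=null in the parsed argument map above
        });
        System.exit(rc);
      }
    }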
12/05/25 19:26:03 INFO common.HadoopUtil: Deleting /tmp/mahout-work-hudson/reuters-kmeans-clusters
12/05/25 19:26:04 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your
platform... using builtin-java classes where applicable
12/05/25 19:26:04 INFO compress.CodecPool: Got brand-new compressor
12/05/25 19:26:04 INFO kmeans.RandomSeedGenerator: Wrote 20 vectors to /tmp/mahout-work-hudson/reuters-kmeans-clusters/part-randomSeed
12/05/25 19:26:04 INFO kmeans.KMeansDriver: Input: /tmp/mahout-work-hudson/reuters-out-seqdir-sparse-kmeans/tfidf-vectors Clusters In: /tmp/mahout-work-hudson/reuters-kmeans-clusters/part-randomSeed Out: /tmp/mahout-work-hudson/reuters-kmeans Distance: org.apache.mahout.common.distance.CosineDistanceMeasure
12/05/25 19:26:04 INFO kmeans.KMeansDriver: convergence: 0.5 max Iterations: 10 num Reduce Tasks: org.apache.mahout.math.VectorWritable Input Vectors: {}
12/05/25 19:26:04 INFO compress.CodecPool: Got brand-new decompressor
Exception in thread "main" java.lang.IllegalStateException: No input clusters found. Check your -c argument.
	at org.apache.mahout.clustering.kmeans.KMeansDriver.buildClusters(KMeansDriver.java:218)
	at org.apache.mahout.clustering.kmeans.KMeansDriver.run(KMeansDriver.java:149)
	at org.apache.mahout.clustering.kmeans.KMeansDriver.run(KMeansDriver.java:108)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
	at org.apache.mahout.clustering.kmeans.KMeansDriver.main(KMeansDriver.java:49)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
	at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
	at org.apache.mahout.driver.MahoutDriver.main(MahoutDriver.java:188)
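This is where the build breaks: KMeansDriver.buildClusters throws IllegalStateException because it finds no seed clusters at the path passed via --clusters (-c), even though RandomSeedGenerator reported writing 20 vectors to part-randomSeed a moment earlier. The counters from the vectorization jobs above point at the likely cause: every job reports Map input records=0 and only ~100 bytes read or written, so the tfidf-vectors directory feeding the seeding step appears to be empty. One quick check is to count the records actually present in the seed file; the snippet below is a rough diagnostic sketch using plain Hadoop SequenceFile APIs (the class name CountSeedClusters is invented for illustration), not code from the Mahout build.

    // Diagnostic sketch (hypothetical): count the records in the seed-cluster file
    // that --clusters points at. A count of 0 is consistent with the
    // "No input clusters found" failure above.
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.SequenceFile;
    import org.apache.hadoop.io.Writable;
    import org.apache.hadoop.util.ReflectionUtils;

    public class CountSeedClusters {
      public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Path seeds = new Path("/tmp/mahout-work-hudson/reuters-kmeans-clusters/part-randomSeed");
        FileSystem fs = FileSystem.get(conf);  // assumed local FS, since the jobs ran with LocalJobRunner
        SequenceFile.Reader reader = new SequenceFile.Reader(fs, seeds, conf);
        try {
          Writable key = (Writable) ReflectionUtils.newInstance(reader.getKeyClass(), conf);
          Writable value = (Writable) ReflectionUtils.newInstance(reader.getValueClass(), conf);
          int count = 0;
          while (reader.next(key, value)) {
            count++;
          }
          System.out.println("records in " + seeds + ": " + count);
        } finally {
          reader.close();
        }
      }
    }

If the count is zero, the exception is the driver's expected behaviour for an empty seed path, and the real problem sits upstream in the vectorization steps that consumed no input records.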
Build step 'Execute shell' marked build as failure
