hadoop-common-commits mailing list archives

From Apache Wiki <wikidi...@apache.org>
Subject [Lucene-hadoop Wiki] Update of "HowToDebugMapReducePrograms" by Amareshwari
Date Thu, 11 Oct 2007 10:01:30 GMT
Dear Wiki user,

You have subscribed to a wiki page or wiki category on "Lucene-hadoop Wiki" for change notification.

The following page has been changed by Amareshwari:
http://wiki.apache.org/lucene-hadoop/HowToDebugMapReducePrograms

------------------------------------------------------------------------------
  [http://lucene.apache.org/hadoop/api/org/apache/hadoop/filecache/DistributedCache.html#addCacheFile(java.net.URI,%20org.apache.hadoop.conf.Configuration)
DistributedCache.addCacheFile(URI,conf)] and [http://lucene.apache.org/hadoop/api/org/apache/hadoop/filecache/DistributedCache.html#setCacheFiles
DistributedCache.setCacheFiles(URIs,conf)] where URI is of the form "hdfs://host:port/<absolutepath>#<script-name>".
  For Streaming, the file can be added through command line option -cacheFile.
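As a hedged sketch of the step above: the cache-file URI has the form "hdfs://host:port/<absolutepath>#<script-name>", where the fragment after '#' names the symlink the task sees in its working directory. The host, port, path, and script name below are placeholder assumptions, not values from this page; with a real org.apache.hadoop.conf.Configuration in scope, the resulting URI would be passed to DistributedCache.addCacheFile(uri, conf).

```java
import java.net.URI;

// Sketch only: builds a cache-file URI of the documented form
// "hdfs://host:port/<absolutepath>#<script-name>". All concrete values
// (host, port, paths) are hypothetical placeholders.
public class CacheFileUri {
    public static URI debugScriptUri(String host, int port,
                                     String absolutePath, String scriptName) {
        // The '#<script-name>' fragment is the link name created in the
        // task's working directory for the cached file.
        return URI.create("hdfs://" + host + ":" + port + absolutePath
                          + "#" + scriptName);
    }

    public static void main(String[] args) {
        URI uri = debugScriptUri("namenode.example.com", 9000,
                                 "/debug/debug-script.sh", "debugscript");
        System.out.println(uri);
        // With a Hadoop Configuration available, the call would be:
        // DistributedCache.addCacheFile(uri, conf);
    }
}
```

For Streaming, the equivalent would be passing the same URI through the -cacheFile command line option.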
  
+ == Default Behavior ==
+ 
+ For Java programs:
+ Stdout and stderr are shown on the job UI. The stack trace is printed on the task diagnostics.
+ 
+ For Pipes:
+ Stdout and stderr are shown on the job UI.
+ A default gdb script is run, which prints thread info (each thread's id and the function it was running when the task failed)
+ and the stack trace at the point of failure.
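The default gdb script is not reproduced on this page; as a rough sketch, the thread info and stack trace it reports correspond to gdb commands along these lines (an assumption about the kind of commands used, not the actual script):

```
# hypothetical sketch, not the actual default gdb script
info threads   # list thread ids and the function each was executing
bt             # backtrace (stack trace) of the failed task
quit
```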
+ 
+ For Streaming:
+ Stdout and stderr are shown on the job UI.
+ The exception details are shown on the task diagnostics.
+ 
