From: Aaron Kimball
Date: Mon, 24 Aug 2009 17:42:12 -0700
Subject: Re: Writing to a db with DBOutputFormat spits out IOException Error
To: common-user@hadoop.apache.org

As a more general note -- any jars needed by your mappers and reducers
either need to be in the lib/ directory of your job jar, or in
$HADOOP_HOME/lib/ on all tasktracker nodes where mappers and reducers run.

- Aaron

On Fri, Aug 21, 2009 at 10:47 AM, ishwar ramani wrote:

> For future reference:
>
> This is a ClassNotFoundException for the MySQL driver. DBOutputFormat
> converts it into an IOException, grrrrr.
>
> I had the mysql-connector jar in both $HADOOP/lib and $HADOOP_CLASSPATH.
> That did not help.
>
> I had to package the mysql jar into my map reduce jar to fix this problem.
>
> Hope that saves a day for someone!
>
> On Thu, Aug 20, 2009 at 4:52 PM, ishwar ramani wrote:
> > Hi,
> >
> > I am trying to run a simple map reduce job that writes the result from
> > the reducer to a mysql db.
> >
> > I keep getting
> >
> > 09/08/20 15:44:59 INFO mapred.JobClient: Task Id :
> > attempt_200908201210_0013_r_000000_0, Status : FAILED
> > java.io.IOException: com.mysql.jdbc.Driver
> >     at org.apache.hadoop.mapred.lib.db.DBOutputFormat.getRecordWriter(DBOutputFormat.java:162)
> >     at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:435)
> >     at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:413)
> >     at org.apache.hadoop.mapred.Child.main(Child.java:170)
> >
> > when the reducer is run.
> >
> > Here is my code. The username and password are valid and work fine.
> > Is there any way to get more info on this exception?
> >
> > static class MyWritable implements Writable, DBWritable {
> >   long id;
> >   String description;
> >
> >   MyWritable(long mid, String mdescription) {
> >     id = mid;
> >     description = mdescription;
> >   }
> >
> >   public void readFields(DataInput in) throws IOException {
> >     this.id = in.readLong();
> >     this.description = Text.readString(in);
> >   }
> >
> >   public void readFields(ResultSet resultSet) throws SQLException {
> >     this.id = resultSet.getLong(1);
> >     this.description = resultSet.getString(2);
> >   }
> >
> >   public void write(DataOutput out) throws IOException {
> >     out.writeLong(this.id);
> >     Text.writeString(out, this.description);
> >   }
> >
> >   public void write(PreparedStatement stmt) throws SQLException {
> >     stmt.setLong(1, this.id);
> >     stmt.setString(2, this.description);
> >   }
> > }
> >
> > public static class Reduce extends MapReduceBase
> >     implements Reducer<Text, IntWritable, MyWritable, IntWritable> {
> >   public void reduce(Text key, Iterator<IntWritable> values,
> >       OutputCollector<MyWritable, IntWritable> output, Reporter reporter)
> >       throws IOException {
> >     int sum = 0;
> >     while (values.hasNext()) {
> >       sum += values.next().get();
> >     }
> >
> >     output.collect(new MyWritable(sum, key.toString()), new IntWritable(sum));
> >   }
> > }
> >
> > public static void main(String[] args) throws Exception {
> >   JobConf conf = new JobConf(WordCount.class);
> >   conf.setJobName("wordcount");
> >
> >   conf.setMapperClass(Map.class);
> >   conf.setReducerClass(Reduce.class);
> >
> >   DBConfiguration.configureDB(conf, "com.mysql.jdbc.Driver",
> >       "jdbc:mysql://localhost:8100/testvmysqlsb", "dummy", "pass");
> >
> >   String fields[] = {"id", "description"};
> >   DBOutputFormat.setOutput(conf, "funtable", fields);
> >
> >   conf.setNumMapTasks(1);
> >   conf.setNumReduceTasks(1);
> >
> >   conf.setMapOutputKeyClass(Text.class);
> >   conf.setMapOutputValueClass(IntWritable.class);
> >
> >   conf.setOutputKeyClass(MyWritable.class);
> >   conf.setOutputValueClass(IntWritable.class);
> >
> >   conf.setInputFormat(TextInputFormat.class);
> >
> >   FileInputFormat.setInputPaths(conf, new Path(args[0]));
> >
> >   JobClient.runJob(conf);
> > }
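
As for getting more information out of the failure: the stack trace only shows
the IOException thrown by DBOutputFormat.getRecordWriter(), so one option is to
try loading the driver class yourself in the task's configure() hook and fail
fast with the underlying ClassNotFoundException. A minimal sketch, assuming the
same driver class name and old mapred API used above (DriverCheckBase is just
an illustrative helper name, not part of Hadoop):

import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.MapReduceBase;

public class DriverCheckBase extends MapReduceBase {
  // configure() runs on the tasktracker before any records are processed,
  // so a missing driver jar surfaces here with a clear message instead of
  // the wrapped IOException thrown later by DBOutputFormat.getRecordWriter().
  public void configure(JobConf job) {
    try {
      // Same driver class name that was passed to DBConfiguration.configureDB()
      // in the code above.
      Class.forName("com.mysql.jdbc.Driver");
    } catch (ClassNotFoundException e) {
      throw new RuntimeException(
          "JDBC driver not on the task classpath; bundle it in the lib/ "
          + "directory of the job jar or put it in $HADOOP_HOME/lib on every node", e);
    }
  }
}

The Reduce class above could extend DriverCheckBase instead of MapReduceBase to
pick up the check without any other changes.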