From: Blanca Hernandez <Blanca.Hernandez@willhaben.at>
To: user@hadoop.apache.org
Subject: MRUnit tests with mongo hadoop
Date: Thu, 25 Sep 2014 12:09:43 +0000

Hi!

I am not sure whether this question belongs on the Hadoop list or the MongoDB one. Let's try here:

I am using the mongo-hadoop integration and wrote a MapReduce job. I want to test it, and I found the MRUnit framework (https://mrunit.apache.org/), which sounds great.

I am running into some difficulties with it, since apparently the MongoDB classes are not supported by the framework (?).
A simple example:

import java.io.IOException;

import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mrunit.mapreduce.MapDriver;
import org.bson.BSONObject;
import org.bson.types.ObjectId;
import org.junit.Test;

import com.mongodb.BasicDBObject;

public class MrUnitBasicTests {

    @Test
    public void testVeryBasicOneAttributeDocument() throws Exception {
        // Trivial mapper that just emits one fixed key/value pair.
        Mapper<Object, BSONObject, BSONObject, BSONObject> mapper =
                new Mapper<Object, BSONObject, BSONObject, BSONObject>() {
            @SuppressWarnings("unchecked")
            @Override
            protected void map(Object key, BSONObject value,
                    org.apache.hadoop.mapreduce.Mapper.Context context)
                    throws IOException, InterruptedException {
                Object writeKey = createOutputKey();
                Object writeValue = createOutputValue();
                context.write(writeKey, writeValue);
            }
        };

        BSONObject input = new BasicDBObject("key", "value");
        // ParseMetadataAsTextIntoAvroMapper mapper = new ParseMetadataAsTextIntoAvroMapper();
        MapDriver<Object, BSONObject, BSONObject, BSONObject> mapDriver =
                MapDriver.newMapDriver(mapper);
        mapDriver.withInput(new LongWritable(1), input);
        mapDriver.withOutput(createOutputKey(), createOutputValue());
        mapDriver.runTest();
    }

    private BasicDBObject createOutputKey() {
        return new BasicDBObject("zonid", new ObjectId("5179577adb2da69ad0ee98e9"));
    }

    private BasicDBObject createOutputValue() {
        return new BasicDBObject("key", "value");
    }
}

And the exception:

java.lang.IllegalStateException: No applicable class implementing Serialization in conf at io.serializations for class com.mongodb.BasicDBObject

The io.serializations configuration property contains WritableSerialization, which implements org.apache.hadoop.io.serializer.Serialization. I saw that com.mongodb.BasicDBObject implements java.io.Serializable.

Is there any connection between the two? How can I get the tests running? Does anyone have experience with this?

Many thanks
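PS: One direction I have been considering, purely as an untested sketch: since the exception says that nothing registered under io.serializations applies to com.mongodb.BasicDBObject, maybe the test can add Hadoop's org.apache.hadoop.io.serializer.JavaSerialization (which accepts any java.io.Serializable class) to the MapDriver's Configuration. The property name, the JavaSerialization/WritableSerialization classes, and getConfiguration() are standard Hadoop/MRUnit pieces, but whether this combination actually works with mongo-hadoop objects is my assumption, and the test class name below is made up:

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.serializer.JavaSerialization;
import org.apache.hadoop.io.serializer.WritableSerialization;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mrunit.mapreduce.MapDriver;
import org.bson.BSONObject;
import org.bson.types.ObjectId;
import org.junit.Test;

import com.mongodb.BasicDBObject;

// Hypothetical test name; same trivial mapper as above, plus an extra
// serialization registered on the driver's Configuration (untested idea).
public class MrUnitSerializationSketchTest {

    @Test
    public void testWithJavaSerializationRegistered() throws Exception {
        Mapper<Object, BSONObject, BSONObject, BSONObject> mapper =
                new Mapper<Object, BSONObject, BSONObject, BSONObject>() {
            @Override
            protected void map(Object key, BSONObject value, Context context)
                    throws IOException, InterruptedException {
                context.write(createOutputKey(), createOutputValue());
            }
        };

        MapDriver<Object, BSONObject, BSONObject, BSONObject> mapDriver =
                MapDriver.newMapDriver(mapper);

        // Tell the driver's Configuration how to (de)serialize BasicDBObject:
        // keep WritableSerialization (for LongWritable etc.) and add
        // JavaSerialization, which handles java.io.Serializable classes.
        Configuration conf = mapDriver.getConfiguration();
        conf.setStrings("io.serializations",
                WritableSerialization.class.getName(),
                JavaSerialization.class.getName());

        mapDriver.withInput(new LongWritable(1), new BasicDBObject("key", "value"));
        mapDriver.withOutput(createOutputKey(), createOutputValue());
        mapDriver.runTest();
    }

    private BasicDBObject createOutputKey() {
        return new BasicDBObject("zonid", new ObjectId("5179577adb2da69ad0ee98e9"));
    }

    private BasicDBObject createOutputValue() {
        return new BasicDBObject("key", "value");
    }
}

If that is a bad idea (JavaSerialization is usually discouraged for real jobs), I guess the cleaner alternative would be a dedicated Serialization implementation for BSON objects registered the same way, but I have not tried either variant yet.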