hadoop-common-user mailing list archives

From: "Xian Boullosa" <xian.boull...@ctag.com>
Subject: RE: Trying to serialize a image
Date: Mon, 22 Aug 2011 15:48:12 GMT
I found out why that string was getting wiped.

It happens in context.getInputValue(). The string is serialized exactly as I
wanted, but I was copying it as a new object instance
(std::string s = context.getInputValue()) instead of reserving space and
memcpy'ing the content. Once I did that, it worked like a charm :)


I'm running a 2-node Hadoop cluster (v 0.20.203).

I'm trying to do distributed image processing, and now I'm facing an issue.

The job is a Hadoop Pipes job with a custom RecordReader, Mapper, and Reducer
(still using the standard RecordWriter).

The job itself uses C++, with OpenCV as the image-processing library.

I get all the data I need in the RecordReader (the image's pixel values in an
array) and do a custom serialization into the value variable in the .next() method.

It goes well until that string reaches the map method.


It gets wiped, I mean. The string holds the binary data (it also has a
28-bit header), but when it reaches map, it's empty.

To keep space reserved I use the string's reserve() method, and I'm sure I
reserve enough memory (I did some local tests on that and it works like a charm).

I write the values into the string with memcpy(&string[x], &Image->data[x],
sizeof(char)), and I use capacity() to measure how much I have reserved.


I would appreciate any tip that could help me solve this problem.



Thank you all in advance :-)
