hbase-user mailing list archives

From Bryan Baugher <bjb...@gmail.com>
Subject Re: Custom HBase Filter : Error in readFields
Date Thu, 21 Feb 2013 04:38:02 GMT
Sure, http://pastebin.com/jeXUqhsP


On Wed, Feb 20, 2013 at 10:16 PM, Ted Yu <yuzhihong@gmail.com> wrote:

> Can you show us your updated code?
>
> Thanks
>
> On Wed, Feb 20, 2013 at 7:46 PM, Bryan Baugher <bjbq4d@gmail.com> wrote:
>
> > I updated my code to use the Bytes class for serialization and added more
> > log messages. I see this[1] now. It is able to create the filter the first
> > time but when it gets to the second region (on the same region server) it
> > attempts to create the filter again but the data read in from readFields
> > seems corrupted.
> >
> > [1] - http://pastebin.com/TqNsUVSk
> >
> >
> > On Wed, Feb 20, 2013 at 8:48 PM, Ted Yu <yuzhihong@gmail.com> wrote:
> >
> > > Can you use code similar to the following for serialization?
> > >   public void readFields(DataInput in) throws IOException {
> > >     this.prefix = Bytes.readByteArray(in);
> > >   }
> > >
> > > See src/main/java/org/apache/hadoop/hbase/filter/PrefixFilter.java
> > >
> > > Thanks
> > >
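The write() half has to mirror that read: Bytes.writeByteArray puts a vint
length on the wire that Bytes.readByteArray then consumes, so the two sides
always agree on the array size. A minimal sketch, reusing the prefix field
from the snippet above rather than RowRangeFilter's actual fields:

  public void write(DataOutput out) throws IOException {
    // Length prefix first, then the bytes; readByteArray reverses this.
    Bytes.writeByteArray(out, this.prefix);
  }
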
> > > On Wed, Feb 20, 2013 at 5:58 PM, Bryan Baugher <bjbq4d@gmail.com> wrote:
> > >
> > > > Here[1] is the code for the filter.
> > > >
> > > > -Bryan
> > > >
> > > > [1] - http://pastebin.com/5Qjas88z
> > > >
> > > > > Bryan:
> > > > > Looks like you may have missed adding a unit test for your filter.
> > > > >
> > > > > A unit test should have caught this situation much earlier.
> > > > >
> > > > > Cheers
> > > > >
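A round-trip test is the quickest way to catch a write()/readFields()
asymmetry before it ever reaches a region server. A sketch only; the
RowRangeFilter constructor and getStartRow() accessor below are assumed for
illustration, not taken from the actual class:

  @Test
  public void roundTripsThroughWritable() throws IOException {
    // Assumed constructor; substitute the filter's real one.
    RowRangeFilter original =
        new RowRangeFilter(Bytes.toBytes("aaa"), Bytes.toBytes("zzz"));

    // Serialize exactly the way the HBase RPC layer would.
    ByteArrayOutputStream buffer = new ByteArrayOutputStream();
    original.write(new DataOutputStream(buffer));

    // Deserialize into a fresh, empty instance, as a region server does.
    RowRangeFilter copy = new RowRangeFilter();
    copy.readFields(
        new DataInputStream(new ByteArrayInputStream(buffer.toByteArray())));

    // Assumed accessor; any field-by-field comparison works here.
    assertArrayEquals(Bytes.toBytes("aaa"), copy.getStartRow());
  }
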
> > > > > On Wed, Feb 20, 2013 at 3:42 PM, Viral Bajaria <viral.bajaria@gmail.com> wrote:
> > > > >
> > > > > > Also the readFields is your implementation of how to read the
> > > > > > byte array transferred from the client. So I think there has to
> > > > > > be some issue in how you write the byte array to the network and
> > > > > > what you are reading out of that, i.e. the sizes of the arrays
> > > > > > might not be identical.
> > > > > >
> > > > > > But as Ted mentioned, looking at the code will help troubleshoot
> > > > > > it better.
> > > > > >
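The usual shape of that mismatch is a write() that sends raw bytes with no
length, leaving readFields() to guess how many to pull back; readFully then
runs past the end of the RPC buffer, which can surface as exactly the kind of
ArrayIndexOutOfBoundsException out of ByteArrayInputStream seen below. A
hypothetical before/after sketch (startRow is an assumed field name, and the
two pairs are alternatives, not one class):

  // Broken pattern: no length on the wire, so the reader has to guess.
  public void write(DataOutput out) throws IOException {
    out.write(this.startRow);
  }
  public void readFields(DataInput in) throws IOException {
    this.startRow = new byte[64]; // guessed size, not what was written
    in.readFully(this.startRow);
  }

  // Safe pattern: the length travels with the bytes.
  public void write(DataOutput out) throws IOException {
    Bytes.writeByteArray(out, this.startRow);
  }
  public void readFields(DataInput in) throws IOException {
    this.startRow = Bytes.readByteArray(in);
  }
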
> > > > > > On Wed, Feb 20, 2013 at 3:32 PM, Ted Yu <yuzhihong@gmail.com> wrote:
> > > > > >
> > > > > > > If you show us the code for RowRangeFilter, that would help us
> > > > > > > troubleshoot.
> > > > > > >
> > > > > > > Cheers
> > > > > > >
> > > > > > > On Wed, Feb 20, 2013 at 2:05 PM, Bryan Baugher <bjbq4d@gmail.com> wrote:
> > > > > > >
> > > > > > > > Hi everyone,
> > > > > > > >
> > > > > > > > I am trying to write my own custom Filter but I have been
> > > > > > > > having issues. When there is only 1 region in my table the
> > > > > > > > scan works as expected, but when there are more, it attempts
> > > > > > > > to create a new version of my filter and deserialize the
> > > > > > > > information again, but the data seems to be gone. I am
> > > > > > > > running HBase 0.92.1-cdh4.1.1.
> > > > > > > >
> > > > > > > > 2013-02-20 15:39:53,220 DEBUG com.cerner.kepler.filters.RowRangeFilter: Reading fields
> > > > > > > > 2013-02-20 15:40:08,612 WARN org.apache.hadoop.hbase.util.Sleeper: We slept 15346ms instead of 3000ms, this is likely due to a long garbage collecting pause and it's usually bad, see http://hbase.apache.org/book.html#trouble.rs.runtime.zkexpired
> > > > > > > > 2013-02-20 15:40:09,142 ERROR org.apache.hadoop.hbase.io.HbaseObjectWritable: Error in readFields
> > > > > > > > java.lang.ArrayIndexOutOfBoundsException
> > > > > > > >         at java.lang.System.arraycopy(Native Method)
> > > > > > > >         at java.io.ByteArrayInputStream.read(ByteArrayInputStream.java:174)
> > > > > > > >         at java.io.DataInputStream.readFully(DataInputStream.java:178)
> > > > > > > >         at java.io.DataInputStream.readFully(DataInputStream.java:152)
> > > > > > > >         at com.cerner.kepler.filters.RowRangeFilter.readFields(RowRangeFilter.java:226)
> > > > > > > >         at org.apache.hadoop.hbase.client.Scan.readFields(Scan.java:548)
> > > > > > > >         at org.apache.hadoop.hbase.io.HbaseObjectWritable.readObject(HbaseObjectWritable.java:652)
> > > > > > > >         at org.apache.hadoop.hbase.ipc.Invocation.readFields(Invocation.java:125)
> > > > > > > >         at org.apache.hadoop.hbase.ipc.HBaseServer$Connection.processData(HBaseServer.java:1254)
> > > > > > > >         at org.apache.hadoop.hbase.ipc.HBaseServer$Connection.readAndProcess(HBaseServer.java:1183)
> > > > > > > >         at org.apache.hadoop.hbase.ipc.HBaseServer$Listener.doRead(HBaseServer.java:719)
> > > > > > > >         at org.apache.hadoop.hbase.ipc.HBaseServer$Listener$Reader.doRunLoop(HBaseServer.java:511)
> > > > > > > >         at org.apache.hadoop.hbase.ipc.HBaseServer$Listener$Reader.run(HBaseServer.java:486)
> > > > > > > >         at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
> > > > > > > >         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
> > > > > > > >         at java.lang.Thread.run(Thread.java:662)
> > > > > > > > 2013-02-20 15:40:17,498 WARN org.apache.hadoop.ipc.HBaseServer: Unable to read call parameters for client ***
> > > > > > > > java.io.IOException: Error in readFields
> > > > > > > >         at org.apache.hadoop.hbase.io.HbaseObjectWritable.readObject(HbaseObjectWritable.java:655)
> > > > > > > >         at org.apache.hadoop.hbase.ipc.Invocation.readFields(Invocation.java:125)
> > > > > > > >         at org.apache.hadoop.hbase.ipc.HBaseServer$Connection.processData(HBaseServer.java:1254)
> > > > > > > >         at org.apache.hadoop.hbase.ipc.HBaseServer$Connection.readAndProcess(HBaseServer.java:1183)
> > > > > > > >         at org.apache.hadoop.hbase.ipc.HBaseServer$Listener.doRead(HBaseServer.java:719)
> > > > > > > >         at org.apache.hadoop.hbase.ipc.HBaseServer$Listener$Reader.doRunLoop(HBaseServer.java:511)
> > > > > > > >         at org.apache.hadoop.hbase.ipc.HBaseServer$Listener$Reader.run(HBaseServer.java:486)
> > > > > > > >         at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
> > > > > > > >         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
> > > > > > > >         at java.lang.Thread.run(Thread.java:662)
> > > > > > > > Caused by: java.lang.ArrayIndexOutOfBoundsException
> > > > > > > >         at java.lang.System.arraycopy(Native Method)
> > > > > > > >         at java.io.ByteArrayInputStream.read(ByteArrayInputStream.java:174)
> > > > > > > >         at java.io.DataInputStream.readFully(DataInputStream.java:178)
> > > > > > > >         at java.io.DataInputStream.readFully(DataInputStream.java:152)
> > > > > > > >         at com.cerner.kepler.filters.RowRangeFilter.readFields(RowRangeFilter.java:226)
> > > > > > > >         at org.apache.hadoop.hbase.client.Scan.readFields(Scan.java:548)
> > > > > > > >         at org.apache.hadoop.hbase.io.HbaseObjectWritable.readObject(HbaseObjectWritable.java:652)
> > > > > > > >         ... 9 more
> > > > > > > >
> > > > > > > > -Bryan
> > > > > > > >
> >
> > --
> > -Bryan
> >
>
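As background on why the filter gets rebuilt at all: in 0.92 the filter
travels inside the Scan as a Writable, and the client re-sends the Scan each
time it opens a scanner on a new region, so every region (even two on the
same region server, as in the log above) deserializes its own copy through
readFields. Nothing is cached between regions, which is why the first region
can work while the next one reads a corrupted stream. A minimal client-side
sketch; the table name, row keys, and RowRangeFilter constructor are made up:

  Configuration conf = HBaseConfiguration.create();
  HTable table = new HTable(conf, "mytable");
  Scan scan = new Scan();
  // This filter instance is serialized along with the Scan; each region's
  // scanner deserializes its own copy before any rows come back.
  scan.setFilter(new RowRangeFilter(Bytes.toBytes("a"), Bytes.toBytes("m")));
  ResultScanner scanner = table.getScanner(scan);
  try {
    for (Result result : scanner) {
      System.out.println(Bytes.toString(result.getRow()));
    }
  } finally {
    scanner.close();
  }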



-- 
-Bryan
