drill-issues mailing list archives

From "Jacques Nadeau (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (DRILL-1214) Query with filter conditions fail while querying against Views built against Hbase tables
Date Thu, 07 Aug 2014 04:00:16 GMT

    [ https://issues.apache.org/jira/browse/DRILL-1214?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14088791#comment-14088791 ]

Jacques Nadeau commented on DRILL-1214:
---------------------------------------

Resolved by 8c66525 or earlier.

> Query with filter conditions fail while querying against Views built against Hbase tables
> -----------------------------------------------------------------------------------------
>
>                 Key: DRILL-1214
>                 URL: https://issues.apache.org/jira/browse/DRILL-1214
>             Project: Apache Drill
>          Issue Type: Bug
>          Components: Storage - HBase
>            Reporter: Amit Katti
>             Fix For: 0.5.0
>
>         Attachments: DRILL-1214.patch
>
>
> I created an HBase table called hbase_student as follows in the hbase shell:
> {code}
> create 'hbase_student', 'stats'
> put 'hbase_student', '1', 'stats:name', 'fred ovid'
> put 'hbase_student', '1', 'stats:age', '76'
> put 'hbase_student', '1', 'stats:gpa', '1.55'
> put 'hbase_student', '1', 'stats:studentnum', '692315658449'
> put 'hbase_student', '1', 'stats:create_time', '2014-05-27 00:26:07'
> put 'hbase_student', '2', 'stats:name', 'bob brown'
> put 'hbase_student', '2', 'stats:age', '63'
> put 'hbase_student', '2', 'stats:gpa', '3.18'
> put 'hbase_student', '2', 'stats:studentnum', '650706039334'
> put 'hbase_student', '2', 'stats:create_time', '2014-12-04 21:43:14'
> put 'hbase_student', '3', 'stats:name', 'bob hernandez'
> put 'hbase_student', '3', 'stats:age', '28'
> put 'hbase_student', '3', 'stats:gpa', '1.09'
> put 'hbase_student', '3', 'stats:studentnum', '293612255322'
> put 'hbase_student', '3', 'stats:create_time', '2014-05-31 14:33:06'
> {code}
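(Editor's note, not part of the original report: the hbase shell's `put` stores every cell as raw bytes, so the numeric-looking values above are UTF-8 strings, which is why the Drill view below must CAST each column to a typed value. A minimal sketch in plain Java, with hypothetical helper names:)

```java
import java.nio.charset.StandardCharsets;

public class HBaseBytesSketch {
    // hbase shell's put stores every cell as raw bytes; '76' becomes the
    // UTF-8 bytes of the string "76", not a binary integer.
    static byte[] store(String value) {
        return value.getBytes(StandardCharsets.UTF_8);
    }

    // A reader gets bytes back and must decode and convert, which is what
    // each CAST(...) in the view below does for its column.
    static int decodeInt(byte[] stored) {
        return Integer.parseInt(new String(stored, StandardCharsets.UTF_8));
    }

    public static void main(String[] args) {
        System.out.println(decodeInt(store("76"))); // prints 76
    }
}
```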
> I then logged in to sqlline using the dfs.root schema and am able to successfully create a view and query it, as long as no filter conditions are present:
> {code}
> create view student_view as
> select cast(tbl.row_key as int)rownum, 
> cast(tbl.stats.name as varchar(20))name,
> cast(tbl.stats.age as int)age, 
> cast(tbl.stats.gpa as float)gpa,
> cast(tbl.stats.studentnum as bigint)studentnum, 
> cast(tbl.stats.create_time as varchar(20))create_time 
> from hbase.hbase_student tbl
> order by rownum;
> Select * from student_view;
> +------------+---------------+------------+------------+--------------+---------------------+
> |   rownum   |     name      |    age     |    gpa     |  studentnum  |     create_time     |
> +------------+---------------+------------+------------+--------------+---------------------+
> | 1          | fred ovid     | 76         | 1.55       | 692315658449 | 2014-05-27 00:26:07 |
> | 2          | bob brown     | 63         | 3.18       | 650706039334 | 2014-12-04 21:43:14 |
> | 3          | bob hernandez | 28         | 1.09       | 293612255322 | 2014-05-31 14:33:06 |
> +------------+---------------+------------+------------+--------------+---------------------+
> Select name,age,create_time from student_view;
> +---------------+------------+---------------------+
> |     name      |    age     |     create_time     |
> +---------------+------------+---------------------+
> | fred ovid     | 76         | 2014-05-27 00:26:07 |
> | bob brown     | 63         | 2014-12-04 21:43:14 |
> | bob hernandez | 28         | 2014-05-31 14:33:06 |
> +---------------+------------+---------------------+
> {code}
> However, if I have a filter condition, the query fails as follows:
> {code}
> select * from student_view where age > 50;
> error_type: 0
> message: "Screen received stop request sent. < SchemaChangeException:[ Failure while attempting to load generated class ] < ClassTransformationException:[ Failure generating transformation classes for value: 
>  
> package org.apache.drill.exec.test.generated;
> import org.apache.drill.exec.exception.SchemaChangeException;
> import org.apache.drill.exec.ops.FragmentContext;
> import org.apache.drill.exec.record.RecordBatch;
> import org.apache.drill.exec.vector.IntVector;
> import org.apache.drill.exec.vector.NullableBigIntVector;
> import org.apache.drill.exec.vector.NullableFloat4Vector;
> import org.apache.drill.exec.vector.NullableIntVector;
> import org.apache.drill.exec.vector.NullableVarCharVector;
> public class CopierGen287 {
>     IntVector vv0;
>     IntVector vv3;
>     NullableVarCharVector vv6;
>     NullableVarCharVector vv9;
>     NullableIntVector vv12;
>     NullableIntVector vv15;
>     NullableFloat4Vector vv18;
>     NullableFloat4Vector vv21;
>     NullableBigIntVector vv24;
>     NullableBigIntVector vv27;
>     NullableVarCharVector vv30;
>     NullableVarCharVector vv33;
>     public void doSetup(FragmentContext context, RecordBatch incoming, RecordBatch outgoing)
>         throws SchemaChangeException
>     {
>         {
>             int[] fieldIds1 = new int[ 1 ] ;
>             fieldIds1 [ 0 ] = 0;
>             Object tmp2 = (incoming).getValueAccessorById(IntVector.class, fieldIds1).getValueVector();
>             if (tmp2 == null) {
>                 throw new SchemaChangeException("Failure while loading vector vv0 with id: org.apache.drill.exec.record.TypedFieldId@21b323dc.");
>             }
>             vv0 = ((IntVector) tmp2);
>             int[] fieldIds4 = new int[ 1 ] ;
>             fieldIds4 [ 0 ] = 0;
>             Object tmp5 = (outgoing).getValueAccessorById(IntVector.class, fieldIds4).getValueVector();
>             if (tmp5 == null) {
>                 throw new SchemaChangeException("Failure while loading vector vv3 with id: org.apache.drill.exec.record.TypedFieldId@21b323dc.");
>             }
>             vv3 = ((IntVector) tmp5);
>             int[] fieldIds7 = new int[ 1 ] ;
>             fieldIds7 [ 0 ] = 1;
>             Object tmp8 = (incoming).getValueAccessorById(NullableVarCharVector.class, fieldIds7).getValueVector();
>             if (tmp8 == null) {
>                 throw new SchemaChangeException("Failure while loading vector vv6 with id: org.apache.drill.exec.record.TypedFieldId@f66d7cdd.");
>             }
>             vv6 = ((NullableVarCharVector) tmp8);
>             int[] fieldIds10 = new int[ 1 ] ;
>             fieldIds10 [ 0 ] = 1;
>             Object tmp11 = (outgoing).getValueAccessorById(NullableVarCharVector.class, fieldIds10).getValueVector();
>             if (tmp11 == null) {
>                 throw new SchemaChangeException("Failure while loading vector vv9 with id: org.apache.drill.exec.record.TypedFieldId@f66d7cdd.");
>             }
>             vv9 = ((NullableVarCharVector) tmp11);
>             int[] fieldIds13 = new int[ 1 ] ;
>             fieldIds13 [ 0 ] = 2;
>             Object tmp14 = (incoming).getValueAccessorById(NullableIntVector.class, fieldIds13).getValueVector();
>             if (tmp14 == null) {
>                 throw new SchemaChangeException("Failure while loading vector vv12 with id: org.apache.drill.exec.record.TypedFieldId@2376fc9d.");
>             }
>             vv12 = ((NullableIntVector) tmp14);
>             int[] fieldIds16 = new int[ 1 ] ;
>             fieldIds16 [ 0 ] = 2;
>             Object tmp17 = (outgoing).getValueAccessorById(NullableIntVector.class, fieldIds16).getValueVector();
>             if (tmp17 == null) {
>                 throw new SchemaChangeException("Failure while loading vector vv15 with id: org.apache.drill.exec.record.TypedFieldId@2376fc9d.");
>             }
>             vv15 = ((NullableIntVector) tmp17);
>             int[] fieldIds19 = new int[ 1 ] ;
>             fieldIds19 [ 0 ] = 3;
>             Object tmp20 = (incoming).getValueAccessorById(NullableFloat4Vector.class, fieldIds19).getValueVector();
>             if (tmp20 == null) {
>                 throw new SchemaChangeException("Failure while loading vector vv18 with id: org.apache.drill.exec.record.TypedFieldId@3d6b2cfd.");
>             }
>             vv18 = ((NullableFloat4Vector) tmp20);
>             int[] fieldIds22 = new int[ 1 ] ;
>             fieldIds22 [ 0 ] = 3;
>             Object tmp23 = (outgoing).getValueAccessorById(NullableFloat4Vector.class, fieldIds22).getValueVector();
>             if (tmp23 == null) {
>                 throw new SchemaChangeException("Failure while loading vector vv21 with id: org.apache.drill.exec.record.TypedFieldId@3d6b2cfd.");
>             }
>             vv21 = ((NullableFloat4Vector) tmp23);
>             int[] fieldIds25 = new int[ 1 ] ;
>             fieldIds25 [ 0 ] = 4;
>             Object tmp26 = (incoming).getValueAccessorById(NullableBigIntVector.class, fieldIds25).getValueVector();
>             if (tmp26 == null) {
>                 throw new SchemaChangeException("Failure while loading vector vv24 with id: org.apache.drill.exec.record.TypedFieldId@c6480360.");
>             }
>             vv24 = ((NullableBigIntVector) tmp26);
>             int[] fieldIds28 = new int[ 1 ] ;
>             fieldIds28 [ 0 ] = 4;
>             Object tmp29 = (outgoing).getValueAccessorById(NullableBigIntVector.class, fieldIds28).getValueVector();
>             if (tmp29 == null) {
>                 throw new SchemaChangeException("Failure while loading vector vv27 with id: org.apache.drill.exec.record.TypedFieldId@c6480360.");
>             }
>             vv27 = ((NullableBigIntVector) tmp29);
>             int[] fieldIds31 = new int[ 1 ] ;
>             fieldIds31 [ 0 ] = 5;
>             Object tmp32 = (incoming).getValueAccessorById(NullableVarCharVector.class, fieldIds31).getValueVector();
>             if (tmp32 == null) {
>                 throw new SchemaChangeException("Failure while loading vector vv30 with id: org.apache.drill.exec.record.TypedFieldId@fd40df59.");
>             }
>             vv30 = ((NullableVarCharVector) tmp32);
>             int[] fieldIds34 = new int[ 1 ] ;
>             fieldIds34 [ 0 ] = 5;
>             Object tmp35 = (outgoing).getValueAccessorById(NullableVarCharVector.class, fieldIds34).getValueVector();
>             if (tmp35 == null) {
>                 throw new SchemaChangeException("Failure while loading vector vv33 with id: org.apache.drill.exec.record.TypedFieldId@fd40df59.");
>             }
>             vv33 = ((NullableVarCharVector) tmp35);
>         }
>     }
>     public boolean doEval(int inIndex, int outIndex)
>         throws SchemaChangeException
>     {
>         {
>             if (!vv3 .copyFromSafe(((inIndex)& 65535), (outIndex), vv0 [((inIndex)>>> 16)])) {
>                 return false;
>             }
>             if (!vv9 .copyFromSafe(((inIndex)& 65535), (outIndex), vv6 [((inIndex)>>> 16)])) {
>                 return false;
>             }
>             if (!vv15 .copyFromSafe(((inIndex)& 65535), (outIndex), vv12 [((inIndex)>>> 16)])) {
>                 return false;
>             }
>             if (!vv21 .copyFromSafe(((inIndex)& 65535), (outIndex), vv18 [((inIndex)>>> 16)])) {
>                 return false;
>             }
>             if (!vv27 .copyFromSafe(((inIndex)& 65535), (outIndex), vv24 [((inIndex)>>> 16)])) {
>                 return false;
>             }
>             if (!vv33 .copyFromSafe(((inIndex)& 65535), (outIndex), vv30 [((inIndex)>>> 16)])) {
>                 return false;
>             }
>         }
>         {
>             return true;
>         }
>     }
> }
>  ] < CompileException:[ Line 123, Column 36: No applicable constructor/method found for actual parameters "int, int, java.lang.Object"; candidates are: "public boolean org.apache.drill.exec.vector.IntVector.copyFromSafe(int, int, org.apache.drill.exec.vector.IntVector)" ]"
> ]
> {code}
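(Editor's note, a hedged reading rather than part of the original report: the CompileException suggests the generated `doEval` indexes its vectors as arrays, `vv0 [((inIndex)>>> 16)]`, following the four-byte selection-vector convention where the upper 16 bits pick a batch, while `doSetup` declared them as scalar fields, so Janino cannot resolve the `copyFromSafe(int, int, IntVector)` overload. A minimal, simplified sketch of the convention the code appears to expect, with hypothetical names:)

```java
public class CopierSketch {
    // Stand-in for Drill's IntVector (hypothetical, heavily simplified).
    static class IntVector {
        int[] data = new int[16];
        boolean copyFromSafe(int inIndex, int outIndex, IntVector from) {
            data[outIndex] = from.data[inIndex];
            return true;
        }
    }

    // Copy one record identified by an SV4-style compound index:
    // upper 16 bits select the batch, lower 16 bits select the record.
    static int copyRecord(IntVector[] incoming, IntVector outgoing, int inIndex) {
        // Indexing the ARRAY first yields an IntVector, so the
        // copyFromSafe(int, int, IntVector) overload resolves; indexing a
        // scalar field (as the generated code effectively did) cannot.
        outgoing.copyFromSafe(inIndex & 65535, 0, incoming[inIndex >>> 16]);
        return outgoing.data[0];
    }

    public static void main(String[] args) {
        IntVector[] incoming = { new IntVector() };
        incoming[0].data[3] = 42;
        // batch 0 (upper 16 bits), record 3 (lower 16 bits)
        System.out.println(copyRecord(incoming, new IntVector(), (0 << 16) | 3)); // prints 42
    }
}
```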
> The Exception in the drillbit.log is:
> {code}
> 2014-07-28 17:33:31,496 [7b0e3219-8f05-480e-a5f5-bcf1e505c044:frag:0:0] ERROR o.a.d.e.w.f.AbstractStatusReporter - Error 373d229a-9697-46c0-a776-a747d5b0cf7a: Failure while running fragment.
> org.codehaus.commons.compiler.CompileException: Line 123, Column 36: No applicable constructor/method found for actual parameters "int, int, java.lang.Object"; candidates are: "public boolean org.apache.drill.exec.vector.IntVector.copyFromSafe(int, int, org.apache.drill.exec.vector.IntVector)"
> 	at org.codehaus.janino.UnitCompiler.compileError(UnitCompiler.java:10056) ~[janino-2.7.4.jar:2.7.4]
> 	at org.codehaus.janino.UnitCompiler.findMostSpecificIInvocable(UnitCompiler.java:7466) ~[janino-2.7.4.jar:2.7.4]
> 	at org.codehaus.janino.UnitCompiler.findIMethod(UnitCompiler.java:7336) ~[janino-2.7.4.jar:2.7.4]
> 	at org.codehaus.janino.UnitCompiler.findIMethod(UnitCompiler.java:7239) ~[janino-2.7.4.jar:2.7.4]
> 	at org.codehaus.janino.UnitCompiler.compileGet2(UnitCompiler.java:3860) ~[janino-2.7.4.jar:2.7.4]
> 	at org.codehaus.janino.UnitCompiler.access$6900(UnitCompiler.java:182) ~[janino-2.7.4.jar:2.7.4]
> 	at org.codehaus.janino.UnitCompiler$10.visitMethodInvocation(UnitCompiler.java:3261) ~[janino-2.7.4.jar:2.7.4]
> 	at org.codehaus.janino.Java$MethodInvocation.accept(Java.java:3978) ~[janino-2.7.4.jar:2.7.4]
> 	at org.codehaus.janino.UnitCompiler.compileGet(UnitCompiler.java:3288) ~[janino-2.7.4.jar:2.7.4]
> 	at org.codehaus.janino.UnitCompiler.compileGetValue(UnitCompiler.java:4354) ~[janino-2.7.4.jar:2.7.4]
> 	at org.codehaus.janino.UnitCompiler.compileBoolean2(UnitCompiler.java:2854) ~[janino-2.7.4.jar:2.7.4]
> 	at org.codehaus.janino.UnitCompiler.access$4800(UnitCompiler.java:182) ~[janino-2.7.4.jar:2.7.4]
> 	at org.codehaus.janino.UnitCompiler$8.visitMethodInvocation(UnitCompiler.java:2815) ~[janino-2.7.4.jar:2.7.4]
> 	at org.codehaus.janino.Java$MethodInvocation.accept(Java.java:3978) ~[janino-2.7.4.jar:2.7.4]
> 	at org.codehaus.janino.UnitCompiler.compileBoolean(UnitCompiler.java:2842) ~[janino-2.7.4.jar:2.7.4]
> 	at org.codehaus.janino.UnitCompiler.compileBoolean2(UnitCompiler.java:2872) ~[janino-2.7.4.jar:2.7.4]
> 	at org.codehaus.janino.UnitCompiler.access$4900(UnitCompiler.java:182) ~[janino-2.7.4.jar:2.7.4]
> 	at org.codehaus.janino.UnitCompiler$8.visitUnaryOperation(UnitCompiler.java:2808) ~[janino-2.7.4.jar:2.7.4]
> 	at org.codehaus.janino.Java$UnaryOperation.accept(Java.java:3651) ~[janino-2.7.4.jar:2.7.4]
> 	at org.codehaus.janino.UnitCompiler.compileBoolean(UnitCompiler.java:2842) ~[janino-2.7.4.jar:2.7.4]
> 	at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:1743) ~[janino-2.7.4.jar:2.7.4]
> 	at org.codehaus.janino.UnitCompiler.access$1200(UnitCompiler.java:182) ~[janino-2.7.4.jar:2.7.4]
> 	at org.codehaus.janino.UnitCompiler$4.visitIfStatement(UnitCompiler.java:941) ~[janino-2.7.4.jar:2.7.4]
> 	at org.codehaus.janino.Java$IfStatement.accept(Java.java:2145) ~[janino-2.7.4.jar:2.7.4]
> 	at org.codehaus.janino.UnitCompiler.compile(UnitCompiler.java:962) ~[janino-2.7.4.jar:2.7.4]
> 	at org.codehaus.janino.UnitCompiler.compileStatements(UnitCompiler.java:1004) ~[janino-2.7.4.jar:2.7.4]
> 	at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:989) ~[janino-2.7.4.jar:2.7.4]
> 	at org.codehaus.janino.UnitCompiler.access$1000(UnitCompiler.java:182) ~[janino-2.7.4.jar:2.7.4]
> 	at org.codehaus.janino.UnitCompiler$4.visitBlock(UnitCompiler.java:939) ~[janino-2.7.4.jar:2.7.4]
> 	at org.codehaus.janino.Java$Block.accept(Java.java:2005) ~[janino-2.7.4.jar:2.7.4]
> 	at org.codehaus.janino.UnitCompiler.compile(UnitCompiler.java:962) ~[janino-2.7.4.jar:2.7.4]
> 	at org.codehaus.janino.UnitCompiler.compileStatements(UnitCompiler.java:1004) ~[janino-2.7.4.jar:2.7.4]
> 	at org.codehaus.janino.UnitCompiler.compile(UnitCompiler.java:2284) ~[janino-2.7.4.jar:2.7.4]
> 	at org.codehaus.janino.UnitCompiler.compileDeclaredMethods(UnitCompiler.java:826) ~[janino-2.7.4.jar:2.7.4]
> 	at org.codehaus.janino.UnitCompiler.compileDeclaredMethods(UnitCompiler.java:798) ~[janino-2.7.4.jar:2.7.4]
> 	at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:503) ~[janino-2.7.4.jar:2.7.4]
> 	at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:389) ~[janino-2.7.4.jar:2.7.4]
> 	at org.codehaus.janino.UnitCompiler.access$400(UnitCompiler.java:182) ~[janino-2.7.4.jar:2.7.4]
> 	at org.codehaus.janino.UnitCompiler$2.visitPackageMemberClassDeclaration(UnitCompiler.java:343) ~[janino-2.7.4.jar:2.7.4]
> 	at org.codehaus.janino.Java$PackageMemberClassDeclaration.accept(Java.java:1136) ~[janino-2.7.4.jar:2.7.4]
> 	at org.codehaus.janino.UnitCompiler.compile(UnitCompiler.java:350) ~[janino-2.7.4.jar:2.7.4]
> 	at org.codehaus.janino.UnitCompiler.compileUnit(UnitCompiler.java:318) ~[janino-2.7.4.jar:2.7.4]
> 	at org.apache.drill.exec.compile.JaninoClassCompiler.getByteCode(JaninoClassCompiler.java:48) ~[drill-java-exec-1.0.0-m2-incubating-SNAPSHOT-rebuffed.jar:1.0.0-m2-incubating-SNAPSHOT]
> 	at org.apache.drill.exec.compile.AbstractClassCompiler.getClassByteCode(AbstractClassCompiler.java:43) ~[drill-java-exec-1.0.0-m2-incubating-SNAPSHOT-rebuffed.jar:1.0.0-m2-incubating-SNAPSHOT]
> 	at org.apache.drill.exec.compile.QueryClassLoader$ClassCompilerSelector.getClassByteCode(QueryClassLoader.java:127) ~[drill-java-exec-1.0.0-m2-incubating-SNAPSHOT-rebuffed.jar:1.0.0-m2-incubating-SNAPSHOT]
> 	at org.apache.drill.exec.compile.QueryClassLoader$ClassCompilerSelector.access$000(QueryClassLoader.java:100) ~[drill-java-exec-1.0.0-m2-incubating-SNAPSHOT-rebuffed.jar:1.0.0-m2-incubating-SNAPSHOT]
> 	at org.apache.drill.exec.compile.QueryClassLoader.getClassByteCode(QueryClassLoader.java:93) ~[drill-java-exec-1.0.0-m2-incubating-SNAPSHOT-rebuffed.jar:1.0.0-m2-incubating-SNAPSHOT]
> 	at org.apache.drill.exec.compile.ClassTransformer.getImplementationClass(ClassTransformer.java:254) ~[drill-java-exec-1.0.0-m2-incubating-SNAPSHOT-rebuffed.jar:1.0.0-m2-incubating-SNAPSHOT]
> 	at org.apache.drill.exec.ops.FragmentContext.getImplementationClass(FragmentContext.java:182) ~[drill-java-exec-1.0.0-m2-incubating-SNAPSHOT-rebuffed.jar:1.0.0-m2-incubating-SNAPSHOT]
> 	at org.apache.drill.exec.physical.impl.svremover.RemovingRecordBatch.getGenerated4Copier(RemovingRecordBatch.java:264) ~[drill-java-exec-1.0.0-m2-incubating-SNAPSHOT-rebuffed.jar:1.0.0-m2-incubating-SNAPSHOT]
> 	at org.apache.drill.exec.physical.impl.svremover.RemovingRecordBatch.getGenerated4Copier(RemovingRecordBatch.java:250) ~[drill-java-exec-1.0.0-m2-incubating-SNAPSHOT-rebuffed.jar:1.0.0-m2-incubating-SNAPSHOT]
> 	at org.apache.drill.exec.physical.impl.svremover.RemovingRecordBatch.setupNewSchema(RemovingRecordBatch.java:80) ~[drill-java-exec-1.0.0-m2-incubating-SNAPSHOT-rebuffed.jar:1.0.0-m2-incubating-SNAPSHOT]
> 	at org.apache.drill.exec.record.AbstractSingleRecordBatch.innerNext(AbstractSingleRecordBatch.java:66) ~[drill-java-exec-1.0.0-m2-incubating-SNAPSHOT-rebuffed.jar:1.0.0-m2-incubating-SNAPSHOT]
> 	at org.apache.drill.exec.physical.impl.svremover.RemovingRecordBatch.innerNext(RemovingRecordBatch.java:96) ~[drill-java-exec-1.0.0-m2-incubating-SNAPSHOT-rebuffed.jar:1.0.0-m2-incubating-SNAPSHOT]
> 	at org.apache.drill.exec.record.AbstractRecordBatch.next(AbstractRecordBatch.java:91) ~[drill-java-exec-1.0.0-m2-incubating-SNAPSHOT-rebuffed.jar:1.0.0-m2-incubating-SNAPSHOT]
> 	at org.apache.drill.exec.physical.impl.validate.IteratorValidatorBatchIterator.next(IteratorValidatorBatchIterator.java:116) ~[drill-java-exec-1.0.0-m2-incubating-SNAPSHOT-rebuffed.jar:1.0.0-m2-incubating-SNAPSHOT]
> 	at org.apache.drill.exec.record.AbstractRecordBatch.next(AbstractRecordBatch.java:72) ~[drill-java-exec-1.0.0-m2-incubating-SNAPSHOT-rebuffed.jar:1.0.0-m2-incubating-SNAPSHOT]
> 	at org.apache.drill.exec.record.AbstractRecordBatch.next(AbstractRecordBatch.java:65) ~[drill-java-exec-1.0.0-m2-incubating-SNAPSHOT-rebuffed.jar:1.0.0-m2-incubating-SNAPSHOT]
> 	at org.apache.drill.exec.record.AbstractSingleRecordBatch.innerNext(AbstractSingleRecordBatch.java:45) ~[drill-java-exec-1.0.0-m2-incubating-SNAPSHOT-rebuffed.jar:1.0.0-m2-incubating-SNAPSHOT]
> 	at org.apache.drill.exec.physical.impl.project.ProjectRecordBatch.innerNext(ProjectRecordBatch.java:95) ~[drill-java-exec-1.0.0-m2-incubating-SNAPSHOT-rebuffed.jar:1.0.0-m2-incubating-SNAPSHOT]
> 	at org.apache.drill.exec.record.AbstractRecordBatch.next(AbstractRecordBatch.java:91) ~[drill-java-exec-1.0.0-m2-incubating-SNAPSHOT-rebuffed.jar:1.0.0-m2-incubating-SNAPSHOT]
> 	at org.apache.drill.exec.physical.impl.validate.IteratorValidatorBatchIterator.next(IteratorValidatorBatchIterator.java:116) ~[drill-java-exec-1.0.0-m2-incubating-SNAPSHOT-rebuffed.jar:1.0.0-m2-incubating-SNAPSHOT]
> 	at org.apache.drill.exec.physical.impl.BaseRootExec.next(BaseRootExec.java:58) ~[drill-java-exec-1.0.0-m2-incubating-SNAPSHOT-rebuffed.jar:1.0.0-m2-incubating-SNAPSHOT]
> 	at org.apache.drill.exec.physical.impl.ScreenCreator$ScreenRoot.innerNext(ScreenCreator.java:97) ~[drill-java-exec-1.0.0-m2-incubating-SNAPSHOT-rebuffed.jar:1.0.0-m2-incubating-SNAPSHOT]
> 	at org.apache.drill.exec.physical.impl.BaseRootExec.next(BaseRootExec.java:48) ~[drill-java-exec-1.0.0-m2-incubating-SNAPSHOT-rebuffed.jar:1.0.0-m2-incubating-SNAPSHOT]
> 	at org.apache.drill.exec.work.fragment.FragmentExecutor.run(FragmentExecutor.java:100) ~[drill-java-exec-1.0.0-m2-incubating-SNAPSHOT-rebuffed.jar:1.0.0-m2-incubating-SNAPSHOT]
> 	at org.apache.drill.exec.work.WorkManager$RunnableWrapper.run(WorkManager.java:242) [drill-java-exec-1.0.0-m2-incubating-SNAPSHOT-rebuffed.jar:1.0.0-m2-incubating-SNAPSHOT]
> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) [na:1.7.0_55]
> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) [na:1.7.0_55]
> 	at java.lang.Thread.run(Thread.java:744) [na:1.7.0_55]
> {code}
> But the same filter condition works if I apply it directly to the HBase table instead of the view:
> {code}
> select cast(tbl.row_key as int)rownum, 
> cast(tbl.stats.name as varchar(20))name,
> cast(tbl.stats.age as int)age, 
> cast(tbl.stats.gpa as float)gpa,
> cast(tbl.stats.studentnum as bigint)studentnum, 
> cast(tbl.stats.create_time as varchar(20))create_time 
> from hbase.hbase_student tbl
> where tbl.stats.age > 50;
> +------------+---------------+------------+------------+--------------+---------------------+
> |   rownum   |     name      |    age     |    gpa     |  studentnum  |     create_time     |
> +------------+---------------+------------+------------+--------------+---------------------+
> | 1          | fred ovid     | 76         | 1.55       | 692315658449 | 2014-05-27 00:26:07 |
> | 2          | bob brown     | 63         | 3.18       | 650706039334 | 2014-12-04 21:43:14 |
> +------------+---------------+------------+------------+--------------+---------------------+
> {code}



--
This message was sent by Atlassian JIRA
(v6.2#6252)
