drill-commits mailing list archives

From j..@apache.org
Subject drill git commit: DRILL-3492: Add support for encoding/decoding to/from OrderedBytes format
Date Wed, 02 Sep 2015 23:48:24 GMT
Repository: drill
Updated Branches:
  refs/heads/master 4b8e85ad6 -> 95623912e


DRILL-3492: Add support for encoding/decoding to/from OrderedBytes
format

Description:
This change adds support for encoding and decoding data between the
'double', 'float', 'bigint', 'int', and 'utf8' data types and the
OrderedBytes format.
It also allows OrderedBytes-encoded row keys to be stored in
ascending as well as descending order.

The following JIRA added the OrderedBytes encoding to HBase:
https://issues.apache.org/jira/browse/HBASE-8201

This encoding scheme preserves the sort order of the native
data type when values are stored as sorted byte arrays on disk.
It therefore helps the HBase storage plugin when the row keys have been
encoded in OrderedBytes format.

This functionality allows us to prune the scan ranges, and thus read
much less data from the server.
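The pruning works because an order-preserving encoding lets the planner translate a numeric predicate directly into a byte range over the row keys. The sketch below is illustrative only: it uses a simplified sign-flipped big-endian scheme and hypothetical class/method names, not HBase's actual OrderedBytes wire format (which prepends a marker byte); it exists only to show why unsigned byte-wise comparison of the encoded keys agrees with numeric order.

```java
// Illustrative sketch of an order-preserving long encoding.
// NOTE: this is NOT the real HBase OrderedBytes format; it only
// demonstrates the property the storage plugin relies on.
public class OrderPreservingEncodingSketch {

  // Flip the sign bit and write big-endian so that unsigned
  // lexicographic byte comparison matches signed long comparison.
  static byte[] encode(long v) {
    long biased = v ^ Long.MIN_VALUE;  // negatives now sort below positives
    byte[] out = new byte[8];
    for (int i = 7; i >= 0; i--) {
      out[i] = (byte) biased;
      biased >>>= 8;
    }
    return out;
  }

  // Unsigned lexicographic comparison, as HBase uses for row keys.
  static int compareBytes(byte[] a, byte[] b) {
    for (int i = 0; i < 8; i++) {
      int d = (a[i] & 0xFF) - (b[i] & 0xFF);
      if (d != 0) {
        return d;
      }
    }
    return 0;
  }

  public static void main(String[] args) {
    long[] values = {Long.MIN_VALUE, -42L, -1L, 0L, 1L, 42L, Long.MAX_VALUE};
    for (int i = 0; i + 1 < values.length; i++) {
      // encoded order must match numeric order
      if (compareBytes(encode(values[i]), encode(values[i + 1])) >= 0) {
        throw new AssertionError("order not preserved at " + values[i]);
      }
    }
    System.out.println("order preserved");
  }
}
```

With this property, a filter such as row_key >= encode(100) can be pushed down as a scan start-key instead of a full table scan, which is what the TestHBaseFilterPushDown cases below exercise (via Drill's CONVERT_FROM/CONVERT_TO functions with the new *_OB encodings).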

Testing Done:
Added a new unit-test class, TestOrderedBytesConvertFunctions.java,
which derives from the TestConvertFunctions.java class.
Also added new test cases to the TestHBaseFilterPushDown class that
verify filters are pushed down correctly and the results are correct.

DRILL-3492 -
* Remove repeated allocations of byte arrays and PositionedByteRange objects
on the heap (as suggested by Jason).
* Remove OrderedBytes encode/decode operations on UTF8 types.
Reasons:
1. These operations are slow and incur many heap allocations.
2. UTF8 types maintain their natural sort order when stored as binary arrays.
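Reason 2 holds because UTF-8 was designed so that unsigned byte-wise comparison of the encoded bytes matches comparison by Unicode code point, making a dedicated UTF8 OrderedBytes encoding redundant. A minimal sketch of that property (plain JDK, no HBase classes; the class name is hypothetical):

```java
import java.nio.charset.StandardCharsets;

// Illustrative check that UTF-8 byte arrays sort in the same order
// (by Unicode code point) as the strings they encode.
public class Utf8OrderSketch {

  // Unsigned lexicographic comparison of byte arrays.
  static int compareUnsigned(byte[] a, byte[] b) {
    int n = Math.min(a.length, b.length);
    for (int i = 0; i < n; i++) {
      int d = (a[i] & 0xFF) - (b[i] & 0xFF);
      if (d != 0) {
        return d;
      }
    }
    return a.length - b.length;
  }

  public static void main(String[] args) {
    // Strings listed in ascending code-point order, including
    // multi-byte characters (é = U+00E9, 中 = U+4E2D).
    String[] sorted = {"", "abc", "abd", "ab\u00e9", "\u4e2d\u6587"};
    for (int i = 0; i + 1 < sorted.length; i++) {
      byte[] x = sorted[i].getBytes(StandardCharsets.UTF_8);
      byte[] y = sorted[i + 1].getBytes(StandardCharsets.UTF_8);
      if (compareUnsigned(x, y) >= 0) {
        throw new AssertionError("UTF-8 byte order diverged at " + sorted[i]);
      }
    }
    System.out.println("utf-8 order matches");
  }
}
```

So storing UTF8 values as raw binary already yields correctly ordered row keys, without the extra allocations the OrderedBytes round trip would incur.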

DRILL-3492 - Remove test code that creates test tables with UTF8 OrderedByte encoding.


Project: http://git-wip-us.apache.org/repos/asf/drill/repo
Commit: http://git-wip-us.apache.org/repos/asf/drill/commit/95623912
Tree: http://git-wip-us.apache.org/repos/asf/drill/tree/95623912
Diff: http://git-wip-us.apache.org/repos/asf/drill/diff/95623912

Branch: refs/heads/master
Commit: 95623912ebf348962fe8a8846c5f47c5fdcf2f78
Parents: 4b8e85a
Author: spanchamia <spanchamia@maprtech.com>
Authored: Wed Jul 29 22:53:04 2015 -0700
Committer: Jinfeng Ni <jni@apache.org>
Committed: Wed Sep 2 15:12:50 2015 -0700

----------------------------------------------------------------------
 contrib/storage-hbase/pom.xml                   |  31 +++
 .../conv/OrderedBytesBigIntConvertFrom.java     |  53 ++++
 .../impl/conv/OrderedBytesBigIntConvertTo.java  |  63 +++++
 .../conv/OrderedBytesBigIntDescConvertTo.java   |  63 +++++
 .../conv/OrderedBytesDoubleConvertFrom.java     |  53 ++++
 .../impl/conv/OrderedBytesDoubleConvertTo.java  |  63 +++++
 .../conv/OrderedBytesDoubleDescConvertTo.java   |  63 +++++
 .../impl/conv/OrderedBytesFloatConvertFrom.java |  53 ++++
 .../impl/conv/OrderedBytesFloatConvertTo.java   |  63 +++++
 .../conv/OrderedBytesFloatDescConvertTo.java    |  63 +++++
 .../impl/conv/OrderedBytesIntConvertFrom.java   |  53 ++++
 .../fn/impl/conv/OrderedBytesIntConvertTo.java  |  63 +++++
 .../impl/conv/OrderedBytesIntDescConvertTo.java |  63 +++++
 .../store/hbase/CompareFunctionsProcessor.java  |  70 +++++
 .../exec/store/hbase/HBaseFilterBuilder.java    |  59 +++-
 .../org/apache/drill/hbase/HBaseTestsSuite.java |  48 +++-
 .../drill/hbase/TestHBaseFilterPushDown.java    | 228 ++++++++++++++++
 .../hbase/TestOrderedBytesConvertFunctions.java | 150 ++++++++++
 .../apache/drill/hbase/TestTableGenerator.java  | 273 +++++++++++++++++++
 19 files changed, 1557 insertions(+), 18 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/drill/blob/95623912/contrib/storage-hbase/pom.xml
----------------------------------------------------------------------
diff --git a/contrib/storage-hbase/pom.xml b/contrib/storage-hbase/pom.xml
index d027771..254035c 100644
--- a/contrib/storage-hbase/pom.xml
+++ b/contrib/storage-hbase/pom.xml
@@ -93,6 +93,37 @@
           </systemProperties>
         </configuration>
       </plugin>
+      <plugin>
+        <artifactId>maven-resources-plugin</artifactId>
+        <executions>
+          <execution>
+            <id>copy-java-sources</id>
+            <phase>process-sources</phase>
+            <goals>
+              <goal>copy-resources</goal>
+            </goals>
+            <configuration>
+              <outputDirectory>${basedir}/target/classes/org/apache/drill/exec/expr/fn/impl</outputDirectory>
+              <resources>
+                <resource>
+                  <directory>src/main/java/org/apache/drill/exec/expr/fn/impl</directory>
+                  <filtering>true</filtering>
+                </resource>
+                <resource>
+                  <directory>src/test/java</directory>
+                  <filtering>true</filtering>
+                </resource>
+                <resource>
+                  <directory>target/generated-sources</directory>
+                  <!-- <include>*/org</include> -->
+                  <filtering>true</filtering>
+                </resource>
+              </resources>
+            </configuration>
+          </execution>
+        </executions>
+      </plugin>
+
     </plugins>
   </build>
 

http://git-wip-us.apache.org/repos/asf/drill/blob/95623912/contrib/storage-hbase/src/main/java/org/apache/drill/exec/expr/fn/impl/conv/OrderedBytesBigIntConvertFrom.java
----------------------------------------------------------------------
diff --git a/contrib/storage-hbase/src/main/java/org/apache/drill/exec/expr/fn/impl/conv/OrderedBytesBigIntConvertFrom.java b/contrib/storage-hbase/src/main/java/org/apache/drill/exec/expr/fn/impl/conv/OrderedBytesBigIntConvertFrom.java
new file mode 100644
index 0000000..3b8391d
--- /dev/null
+++ b/contrib/storage-hbase/src/main/java/org/apache/drill/exec/expr/fn/impl/conv/OrderedBytesBigIntConvertFrom.java
@@ -0,0 +1,53 @@
+/*******************************************************************************
+
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ ******************************************************************************/
+package org.apache.drill.exec.expr.fn.impl.conv;
+
+import org.apache.drill.exec.expr.DrillSimpleFunc;
+import org.apache.drill.exec.expr.annotations.FunctionTemplate;
+import org.apache.drill.exec.expr.annotations.FunctionTemplate.FunctionScope;
+import org.apache.drill.exec.expr.annotations.FunctionTemplate.NullHandling;
+import org.apache.drill.exec.expr.annotations.Output;
+import org.apache.drill.exec.expr.annotations.Param;
+import org.apache.drill.exec.expr.annotations.Workspace;
+import org.apache.drill.exec.expr.holders.BigIntHolder;
+import org.apache.drill.exec.expr.holders.VarBinaryHolder;
+
+@FunctionTemplate(names = {"convert_fromBIGINT_OB", "convert_fromBIGINT_OBD"},
+    scope = FunctionScope.SIMPLE, nulls = NullHandling.NULL_IF_NULL)
+public class OrderedBytesBigIntConvertFrom implements DrillSimpleFunc {
+
+  @Param VarBinaryHolder in;
+  @Output BigIntHolder out;
+  @Workspace byte[] bytes;
+  @Workspace org.apache.hadoop.hbase.util.PositionedByteRange br;
+
+  @Override
+  public void setup() {
+    bytes = new byte[9];
+    br = new org.apache.hadoop.hbase.util.SimplePositionedByteRange();
+  }
+
+  @Override
+  public void eval() {
+    org.apache.drill.exec.util.ByteBufUtil.checkBufferLength(in.buffer, in.start, in.end, 9);
+    in.buffer.getBytes(in.start, bytes, 0, 9);
+    br.set(bytes);
+    out.value = org.apache.hadoop.hbase.util.OrderedBytes.decodeInt64(br);
+  }
+}

http://git-wip-us.apache.org/repos/asf/drill/blob/95623912/contrib/storage-hbase/src/main/java/org/apache/drill/exec/expr/fn/impl/conv/OrderedBytesBigIntConvertTo.java
----------------------------------------------------------------------
diff --git a/contrib/storage-hbase/src/main/java/org/apache/drill/exec/expr/fn/impl/conv/OrderedBytesBigIntConvertTo.java b/contrib/storage-hbase/src/main/java/org/apache/drill/exec/expr/fn/impl/conv/OrderedBytesBigIntConvertTo.java
new file mode 100644
index 0000000..d012531
--- /dev/null
+++ b/contrib/storage-hbase/src/main/java/org/apache/drill/exec/expr/fn/impl/conv/OrderedBytesBigIntConvertTo.java
@@ -0,0 +1,63 @@
+/*******************************************************************************
+
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ ******************************************************************************/
+package org.apache.drill.exec.expr.fn.impl.conv;
+
+import io.netty.buffer.DrillBuf;
+
+import javax.inject.Inject;
+
+import org.apache.drill.exec.expr.DrillSimpleFunc;
+import org.apache.drill.exec.expr.annotations.FunctionTemplate;
+import org.apache.drill.exec.expr.annotations.FunctionTemplate.FunctionScope;
+import org.apache.drill.exec.expr.annotations.FunctionTemplate.NullHandling;
+import org.apache.drill.exec.expr.annotations.Output;
+import org.apache.drill.exec.expr.annotations.Param;
+import org.apache.drill.exec.expr.annotations.Workspace;
+import org.apache.drill.exec.expr.holders.BigIntHolder;
+import org.apache.drill.exec.expr.holders.VarBinaryHolder;
+
+@FunctionTemplate(name = "convert_toBIGINT_OB", scope = FunctionScope.SIMPLE, nulls = NullHandling.NULL_IF_NULL)
+public class OrderedBytesBigIntConvertTo implements DrillSimpleFunc {
+
+  @Param BigIntHolder in;
+  @Output VarBinaryHolder out;
+  @Inject DrillBuf buffer;
+  @Workspace byte[] bytes;
+  @Workspace org.apache.hadoop.hbase.util.PositionedByteRange br;
+
+  @Override
+  public void setup() {
+    buffer = buffer.reallocIfNeeded(9);
+    bytes = new byte[9];
+    br = new org.apache.hadoop.hbase.util.SimplePositionedByteRange();
+  }
+
+  @Override
+  public void eval() {
+    buffer.clear();
+    br.set(bytes);
+    org.apache.hadoop.hbase.util.OrderedBytes.encodeInt64(br, in.value,
+            org.apache.hadoop.hbase.util.Order.ASCENDING);
+
+    buffer.setBytes(0, bytes, 0, 9);
+    out.buffer = buffer;
+    out.start = 0;
+    out.end = 9;
+  }
+}

http://git-wip-us.apache.org/repos/asf/drill/blob/95623912/contrib/storage-hbase/src/main/java/org/apache/drill/exec/expr/fn/impl/conv/OrderedBytesBigIntDescConvertTo.java
----------------------------------------------------------------------
diff --git a/contrib/storage-hbase/src/main/java/org/apache/drill/exec/expr/fn/impl/conv/OrderedBytesBigIntDescConvertTo.java b/contrib/storage-hbase/src/main/java/org/apache/drill/exec/expr/fn/impl/conv/OrderedBytesBigIntDescConvertTo.java
new file mode 100644
index 0000000..463483c
--- /dev/null
+++ b/contrib/storage-hbase/src/main/java/org/apache/drill/exec/expr/fn/impl/conv/OrderedBytesBigIntDescConvertTo.java
@@ -0,0 +1,63 @@
+/*******************************************************************************
+
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ ******************************************************************************/
+package org.apache.drill.exec.expr.fn.impl.conv;
+
+import io.netty.buffer.DrillBuf;
+
+import javax.inject.Inject;
+
+import org.apache.drill.exec.expr.DrillSimpleFunc;
+import org.apache.drill.exec.expr.annotations.FunctionTemplate;
+import org.apache.drill.exec.expr.annotations.FunctionTemplate.FunctionScope;
+import org.apache.drill.exec.expr.annotations.FunctionTemplate.NullHandling;
+import org.apache.drill.exec.expr.annotations.Output;
+import org.apache.drill.exec.expr.annotations.Param;
+import org.apache.drill.exec.expr.annotations.Workspace;
+import org.apache.drill.exec.expr.holders.BigIntHolder;
+import org.apache.drill.exec.expr.holders.VarBinaryHolder;
+
+@FunctionTemplate(name = "convert_toBIGINT_OBD", scope = FunctionScope.SIMPLE, nulls = NullHandling.NULL_IF_NULL)
+public class OrderedBytesBigIntDescConvertTo implements DrillSimpleFunc {
+
+  @Param BigIntHolder in;
+  @Output VarBinaryHolder out;
+  @Inject DrillBuf buffer;
+  @Workspace byte[] bytes;
+  @Workspace org.apache.hadoop.hbase.util.PositionedByteRange br;
+
+  @Override
+  public void setup() {
+    buffer = buffer.reallocIfNeeded(9);
+    bytes = new byte[9];
+    br = new org.apache.hadoop.hbase.util.SimplePositionedByteRange();
+  }
+
+  @Override
+  public void eval() {
+    buffer.clear();
+    br.set(bytes);
+    org.apache.hadoop.hbase.util.OrderedBytes.encodeInt64(br, in.value,
+            org.apache.hadoop.hbase.util.Order.DESCENDING);
+
+    buffer.setBytes(0, bytes, 0, 9);
+    out.buffer = buffer;
+    out.start = 0;
+    out.end = 9;
+  }
+}

http://git-wip-us.apache.org/repos/asf/drill/blob/95623912/contrib/storage-hbase/src/main/java/org/apache/drill/exec/expr/fn/impl/conv/OrderedBytesDoubleConvertFrom.java
----------------------------------------------------------------------
diff --git a/contrib/storage-hbase/src/main/java/org/apache/drill/exec/expr/fn/impl/conv/OrderedBytesDoubleConvertFrom.java b/contrib/storage-hbase/src/main/java/org/apache/drill/exec/expr/fn/impl/conv/OrderedBytesDoubleConvertFrom.java
new file mode 100644
index 0000000..b2ae268
--- /dev/null
+++ b/contrib/storage-hbase/src/main/java/org/apache/drill/exec/expr/fn/impl/conv/OrderedBytesDoubleConvertFrom.java
@@ -0,0 +1,53 @@
+/*******************************************************************************
+
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ ******************************************************************************/
+package org.apache.drill.exec.expr.fn.impl.conv;
+
+import org.apache.drill.exec.expr.DrillSimpleFunc;
+import org.apache.drill.exec.expr.annotations.FunctionTemplate;
+import org.apache.drill.exec.expr.annotations.FunctionTemplate.FunctionScope;
+import org.apache.drill.exec.expr.annotations.FunctionTemplate.NullHandling;
+import org.apache.drill.exec.expr.annotations.Output;
+import org.apache.drill.exec.expr.annotations.Param;
+import org.apache.drill.exec.expr.annotations.Workspace;
+import org.apache.drill.exec.expr.holders.Float8Holder;
+import org.apache.drill.exec.expr.holders.VarBinaryHolder;
+
+@FunctionTemplate(names = {"convert_fromDOUBLE_OB", "convert_fromDOUBLE_OBD"},
+    scope = FunctionScope.SIMPLE, nulls = NullHandling.NULL_IF_NULL)
+public class OrderedBytesDoubleConvertFrom implements DrillSimpleFunc {
+
+  @Param VarBinaryHolder in;
+  @Output Float8Holder out;
+  @Workspace byte[] bytes;
+  @Workspace org.apache.hadoop.hbase.util.PositionedByteRange br;
+
+  @Override
+  public void setup() {
+    bytes = new byte[9];
+    br = new org.apache.hadoop.hbase.util.SimplePositionedByteRange();
+  }
+
+  @Override
+  public void eval() {
+    org.apache.drill.exec.util.ByteBufUtil.checkBufferLength(in.buffer, in.start, in.end, 9);
+    in.buffer.getBytes(in.start, bytes, 0, 9);
+    br.set(bytes);
+    out.value = org.apache.hadoop.hbase.util.OrderedBytes.decodeFloat64(br);
+  }
+}

http://git-wip-us.apache.org/repos/asf/drill/blob/95623912/contrib/storage-hbase/src/main/java/org/apache/drill/exec/expr/fn/impl/conv/OrderedBytesDoubleConvertTo.java
----------------------------------------------------------------------
diff --git a/contrib/storage-hbase/src/main/java/org/apache/drill/exec/expr/fn/impl/conv/OrderedBytesDoubleConvertTo.java b/contrib/storage-hbase/src/main/java/org/apache/drill/exec/expr/fn/impl/conv/OrderedBytesDoubleConvertTo.java
new file mode 100644
index 0000000..d90b620
--- /dev/null
+++ b/contrib/storage-hbase/src/main/java/org/apache/drill/exec/expr/fn/impl/conv/OrderedBytesDoubleConvertTo.java
@@ -0,0 +1,63 @@
+/*******************************************************************************
+
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ ******************************************************************************/
+package org.apache.drill.exec.expr.fn.impl.conv;
+
+import io.netty.buffer.DrillBuf;
+
+import javax.inject.Inject;
+
+import org.apache.drill.exec.expr.DrillSimpleFunc;
+import org.apache.drill.exec.expr.annotations.FunctionTemplate;
+import org.apache.drill.exec.expr.annotations.FunctionTemplate.FunctionScope;
+import org.apache.drill.exec.expr.annotations.FunctionTemplate.NullHandling;
+import org.apache.drill.exec.expr.annotations.Output;
+import org.apache.drill.exec.expr.annotations.Param;
+import org.apache.drill.exec.expr.annotations.Workspace;
+import org.apache.drill.exec.expr.holders.Float8Holder;
+import org.apache.drill.exec.expr.holders.VarBinaryHolder;
+
+@FunctionTemplate(name = "convert_toDOUBLE_OB", scope = FunctionScope.SIMPLE, nulls = NullHandling.NULL_IF_NULL)
+public class OrderedBytesDoubleConvertTo implements DrillSimpleFunc {
+
+  @Param Float8Holder in;
+  @Output VarBinaryHolder out;
+  @Inject DrillBuf buffer;
+  @Workspace byte[] bytes;
+  @Workspace org.apache.hadoop.hbase.util.PositionedByteRange br;
+
+  @Override
+  public void setup() {
+    buffer = buffer.reallocIfNeeded(9);
+    bytes = new byte[9];
+    br = new org.apache.hadoop.hbase.util.SimplePositionedByteRange();
+  }
+
+  @Override
+  public void eval() {
+    buffer.clear();
+    br.set(bytes);
+    org.apache.hadoop.hbase.util.OrderedBytes.encodeFloat64(br, in.value,
+            org.apache.hadoop.hbase.util.Order.ASCENDING);
+
+    buffer.setBytes(0, bytes, 0, 9);
+    out.buffer = buffer;
+    out.start = 0;
+    out.end = 9;
+  }
+}

http://git-wip-us.apache.org/repos/asf/drill/blob/95623912/contrib/storage-hbase/src/main/java/org/apache/drill/exec/expr/fn/impl/conv/OrderedBytesDoubleDescConvertTo.java
----------------------------------------------------------------------
diff --git a/contrib/storage-hbase/src/main/java/org/apache/drill/exec/expr/fn/impl/conv/OrderedBytesDoubleDescConvertTo.java b/contrib/storage-hbase/src/main/java/org/apache/drill/exec/expr/fn/impl/conv/OrderedBytesDoubleDescConvertTo.java
new file mode 100644
index 0000000..944b1d1
--- /dev/null
+++ b/contrib/storage-hbase/src/main/java/org/apache/drill/exec/expr/fn/impl/conv/OrderedBytesDoubleDescConvertTo.java
@@ -0,0 +1,63 @@
+/*******************************************************************************
+
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ ******************************************************************************/
+package org.apache.drill.exec.expr.fn.impl.conv;
+
+import io.netty.buffer.DrillBuf;
+
+import javax.inject.Inject;
+
+import org.apache.drill.exec.expr.DrillSimpleFunc;
+import org.apache.drill.exec.expr.annotations.FunctionTemplate;
+import org.apache.drill.exec.expr.annotations.FunctionTemplate.FunctionScope;
+import org.apache.drill.exec.expr.annotations.FunctionTemplate.NullHandling;
+import org.apache.drill.exec.expr.annotations.Output;
+import org.apache.drill.exec.expr.annotations.Param;
+import org.apache.drill.exec.expr.annotations.Workspace;
+import org.apache.drill.exec.expr.holders.Float8Holder;
+import org.apache.drill.exec.expr.holders.VarBinaryHolder;
+
+@FunctionTemplate(name = "convert_toDOUBLE_OBD", scope = FunctionScope.SIMPLE, nulls = NullHandling.NULL_IF_NULL)
+public class OrderedBytesDoubleDescConvertTo implements DrillSimpleFunc {
+
+  @Param Float8Holder in;
+  @Output VarBinaryHolder out;
+  @Inject DrillBuf buffer;
+  @Workspace byte[] bytes;
+  @Workspace org.apache.hadoop.hbase.util.PositionedByteRange br;
+
+  @Override
+  public void setup() {
+    buffer = buffer.reallocIfNeeded(9);
+    bytes = new byte[9];
+    br = new org.apache.hadoop.hbase.util.SimplePositionedByteRange();
+  }
+
+  @Override
+  public void eval() {
+    buffer.clear();
+    br.set(bytes);
+    org.apache.hadoop.hbase.util.OrderedBytes.encodeFloat64(br, in.value,
+            org.apache.hadoop.hbase.util.Order.DESCENDING);
+
+    buffer.setBytes(0, bytes, 0, 9);
+    out.buffer = buffer;
+    out.start = 0;
+    out.end = 9;
+  }
+}

http://git-wip-us.apache.org/repos/asf/drill/blob/95623912/contrib/storage-hbase/src/main/java/org/apache/drill/exec/expr/fn/impl/conv/OrderedBytesFloatConvertFrom.java
----------------------------------------------------------------------
diff --git a/contrib/storage-hbase/src/main/java/org/apache/drill/exec/expr/fn/impl/conv/OrderedBytesFloatConvertFrom.java b/contrib/storage-hbase/src/main/java/org/apache/drill/exec/expr/fn/impl/conv/OrderedBytesFloatConvertFrom.java
new file mode 100644
index 0000000..a66e580
--- /dev/null
+++ b/contrib/storage-hbase/src/main/java/org/apache/drill/exec/expr/fn/impl/conv/OrderedBytesFloatConvertFrom.java
@@ -0,0 +1,53 @@
+/*******************************************************************************
+
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ ******************************************************************************/
+package org.apache.drill.exec.expr.fn.impl.conv;
+
+import org.apache.drill.exec.expr.DrillSimpleFunc;
+import org.apache.drill.exec.expr.annotations.FunctionTemplate;
+import org.apache.drill.exec.expr.annotations.FunctionTemplate.FunctionScope;
+import org.apache.drill.exec.expr.annotations.FunctionTemplate.NullHandling;
+import org.apache.drill.exec.expr.annotations.Output;
+import org.apache.drill.exec.expr.annotations.Param;
+import org.apache.drill.exec.expr.annotations.Workspace;
+import org.apache.drill.exec.expr.holders.Float4Holder;
+import org.apache.drill.exec.expr.holders.VarBinaryHolder;
+
+@FunctionTemplate(names = {"convert_fromFLOAT_OB", "convert_fromFLOAT_OBD"},
+    scope = FunctionScope.SIMPLE, nulls = NullHandling.NULL_IF_NULL)
+public class OrderedBytesFloatConvertFrom implements DrillSimpleFunc {
+
+  @Param VarBinaryHolder in;
+  @Output Float4Holder out;
+  @Workspace byte[] bytes;
+  @Workspace org.apache.hadoop.hbase.util.PositionedByteRange br;
+
+  @Override
+  public void setup() {
+    bytes = new byte[5];
+    br = new org.apache.hadoop.hbase.util.SimplePositionedByteRange();
+  }
+
+  @Override
+  public void eval() {
+    org.apache.drill.exec.util.ByteBufUtil.checkBufferLength(in.buffer, in.start, in.end, 5);
+    in.buffer.getBytes(in.start, bytes, 0, 5);
+    br.set(bytes);
+    out.value = org.apache.hadoop.hbase.util.OrderedBytes.decodeFloat32(br);
+  }
+}

http://git-wip-us.apache.org/repos/asf/drill/blob/95623912/contrib/storage-hbase/src/main/java/org/apache/drill/exec/expr/fn/impl/conv/OrderedBytesFloatConvertTo.java
----------------------------------------------------------------------
diff --git a/contrib/storage-hbase/src/main/java/org/apache/drill/exec/expr/fn/impl/conv/OrderedBytesFloatConvertTo.java b/contrib/storage-hbase/src/main/java/org/apache/drill/exec/expr/fn/impl/conv/OrderedBytesFloatConvertTo.java
new file mode 100644
index 0000000..e41469c
--- /dev/null
+++ b/contrib/storage-hbase/src/main/java/org/apache/drill/exec/expr/fn/impl/conv/OrderedBytesFloatConvertTo.java
@@ -0,0 +1,63 @@
+/*******************************************************************************
+
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ ******************************************************************************/
+package org.apache.drill.exec.expr.fn.impl.conv;
+
+import io.netty.buffer.DrillBuf;
+
+import javax.inject.Inject;
+
+import org.apache.drill.exec.expr.DrillSimpleFunc;
+import org.apache.drill.exec.expr.annotations.FunctionTemplate;
+import org.apache.drill.exec.expr.annotations.FunctionTemplate.FunctionScope;
+import org.apache.drill.exec.expr.annotations.FunctionTemplate.NullHandling;
+import org.apache.drill.exec.expr.annotations.Output;
+import org.apache.drill.exec.expr.annotations.Param;
+import org.apache.drill.exec.expr.annotations.Workspace;
+import org.apache.drill.exec.expr.holders.Float4Holder;
+import org.apache.drill.exec.expr.holders.VarBinaryHolder;
+
+@FunctionTemplate(name = "convert_toFLOAT_OB", scope = FunctionScope.SIMPLE, nulls = NullHandling.NULL_IF_NULL)
+public class OrderedBytesFloatConvertTo implements DrillSimpleFunc {
+
+  @Param Float4Holder in;
+  @Output VarBinaryHolder out;
+  @Inject DrillBuf buffer;
+  @Workspace byte[] bytes;
+  @Workspace org.apache.hadoop.hbase.util.PositionedByteRange br;
+
+  @Override
+  public void setup() {
+    buffer = buffer.reallocIfNeeded(5);
+    bytes = new byte[5];
+    br = new org.apache.hadoop.hbase.util.SimplePositionedByteRange();
+  }
+
+  @Override
+  public void eval() {
+    buffer.clear();
+    br.set(bytes);
+    org.apache.hadoop.hbase.util.OrderedBytes.encodeFloat32(br, in.value,
+            org.apache.hadoop.hbase.util.Order.ASCENDING);
+
+    buffer.setBytes(0, bytes, 0, 5);
+    out.buffer = buffer;
+    out.start = 0;
+    out.end = 5;
+  }
+}

http://git-wip-us.apache.org/repos/asf/drill/blob/95623912/contrib/storage-hbase/src/main/java/org/apache/drill/exec/expr/fn/impl/conv/OrderedBytesFloatDescConvertTo.java
----------------------------------------------------------------------
diff --git a/contrib/storage-hbase/src/main/java/org/apache/drill/exec/expr/fn/impl/conv/OrderedBytesFloatDescConvertTo.java b/contrib/storage-hbase/src/main/java/org/apache/drill/exec/expr/fn/impl/conv/OrderedBytesFloatDescConvertTo.java
new file mode 100644
index 0000000..5c40e79
--- /dev/null
+++ b/contrib/storage-hbase/src/main/java/org/apache/drill/exec/expr/fn/impl/conv/OrderedBytesFloatDescConvertTo.java
@@ -0,0 +1,63 @@
+/*******************************************************************************
+
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ ******************************************************************************/
+package org.apache.drill.exec.expr.fn.impl.conv;
+
+import io.netty.buffer.DrillBuf;
+
+import javax.inject.Inject;
+
+import org.apache.drill.exec.expr.DrillSimpleFunc;
+import org.apache.drill.exec.expr.annotations.FunctionTemplate;
+import org.apache.drill.exec.expr.annotations.FunctionTemplate.FunctionScope;
+import org.apache.drill.exec.expr.annotations.FunctionTemplate.NullHandling;
+import org.apache.drill.exec.expr.annotations.Output;
+import org.apache.drill.exec.expr.annotations.Param;
+import org.apache.drill.exec.expr.annotations.Workspace;
+import org.apache.drill.exec.expr.holders.Float4Holder;
+import org.apache.drill.exec.expr.holders.VarBinaryHolder;
+
+@FunctionTemplate(name = "convert_toFLOAT_OBD", scope = FunctionScope.SIMPLE, nulls = NullHandling.NULL_IF_NULL)
+public class OrderedBytesFloatDescConvertTo implements DrillSimpleFunc {
+
+  @Param Float4Holder in;
+  @Output VarBinaryHolder out;
+  @Inject DrillBuf buffer;
+  @Workspace byte[] bytes;
+  @Workspace org.apache.hadoop.hbase.util.PositionedByteRange br;
+
+  @Override
+  public void setup() {
+    buffer = buffer.reallocIfNeeded(5);
+    bytes = new byte[5];
+    br = new org.apache.hadoop.hbase.util.SimplePositionedByteRange();
+  }
+
+  @Override
+  public void eval() {
+    buffer.clear();
+    br.set(bytes);
+    org.apache.hadoop.hbase.util.OrderedBytes.encodeFloat32(br, in.value,
+            org.apache.hadoop.hbase.util.Order.DESCENDING);
+
+    buffer.setBytes(0, bytes, 0, 5);
+    out.buffer = buffer;
+    out.start = 0;
+    out.end = 5;
+  }
+}

http://git-wip-us.apache.org/repos/asf/drill/blob/95623912/contrib/storage-hbase/src/main/java/org/apache/drill/exec/expr/fn/impl/conv/OrderedBytesIntConvertFrom.java
----------------------------------------------------------------------
diff --git a/contrib/storage-hbase/src/main/java/org/apache/drill/exec/expr/fn/impl/conv/OrderedBytesIntConvertFrom.java b/contrib/storage-hbase/src/main/java/org/apache/drill/exec/expr/fn/impl/conv/OrderedBytesIntConvertFrom.java
new file mode 100644
index 0000000..6c15947
--- /dev/null
+++ b/contrib/storage-hbase/src/main/java/org/apache/drill/exec/expr/fn/impl/conv/OrderedBytesIntConvertFrom.java
@@ -0,0 +1,53 @@
+/*******************************************************************************
+
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ ******************************************************************************/
+package org.apache.drill.exec.expr.fn.impl.conv;
+
+import org.apache.drill.exec.expr.DrillSimpleFunc;
+import org.apache.drill.exec.expr.annotations.FunctionTemplate;
+import org.apache.drill.exec.expr.annotations.FunctionTemplate.FunctionScope;
+import org.apache.drill.exec.expr.annotations.FunctionTemplate.NullHandling;
+import org.apache.drill.exec.expr.annotations.Output;
+import org.apache.drill.exec.expr.annotations.Param;
+import org.apache.drill.exec.expr.annotations.Workspace;
+import org.apache.drill.exec.expr.holders.IntHolder;
+import org.apache.drill.exec.expr.holders.VarBinaryHolder;
+
+@FunctionTemplate(names = {"convert_fromINT_OB", "convert_fromINT_OBD"},
+    scope = FunctionScope.SIMPLE, nulls = NullHandling.NULL_IF_NULL)
+public class OrderedBytesIntConvertFrom implements DrillSimpleFunc {
+
+  @Param VarBinaryHolder in;
+  @Output IntHolder out;
+  @Workspace byte[] bytes;
+  @Workspace org.apache.hadoop.hbase.util.PositionedByteRange br;
+
+  @Override
+  public void setup() {
+    bytes = new byte[5];
+    br = new org.apache.hadoop.hbase.util.SimplePositionedByteRange();
+  }
+
+  @Override
+  public void eval() {
+    org.apache.drill.exec.util.ByteBufUtil.checkBufferLength(in.buffer, in.start, in.end, 5);
+    in.buffer.getBytes(in.start, bytes, 0, 5);
+    br.set(bytes);
+    out.value = org.apache.hadoop.hbase.util.OrderedBytes.decodeInt32(br);
+  }
+}

http://git-wip-us.apache.org/repos/asf/drill/blob/95623912/contrib/storage-hbase/src/main/java/org/apache/drill/exec/expr/fn/impl/conv/OrderedBytesIntConvertTo.java
----------------------------------------------------------------------
diff --git a/contrib/storage-hbase/src/main/java/org/apache/drill/exec/expr/fn/impl/conv/OrderedBytesIntConvertTo.java b/contrib/storage-hbase/src/main/java/org/apache/drill/exec/expr/fn/impl/conv/OrderedBytesIntConvertTo.java
new file mode 100644
index 0000000..d703318
--- /dev/null
+++ b/contrib/storage-hbase/src/main/java/org/apache/drill/exec/expr/fn/impl/conv/OrderedBytesIntConvertTo.java
@@ -0,0 +1,63 @@
+/*******************************************************************************
+
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ ******************************************************************************/
+package org.apache.drill.exec.expr.fn.impl.conv;
+
+import io.netty.buffer.DrillBuf;
+
+import javax.inject.Inject;
+
+import org.apache.drill.exec.expr.DrillSimpleFunc;
+import org.apache.drill.exec.expr.annotations.FunctionTemplate;
+import org.apache.drill.exec.expr.annotations.FunctionTemplate.FunctionScope;
+import org.apache.drill.exec.expr.annotations.FunctionTemplate.NullHandling;
+import org.apache.drill.exec.expr.annotations.Output;
+import org.apache.drill.exec.expr.annotations.Param;
+import org.apache.drill.exec.expr.annotations.Workspace;
+import org.apache.drill.exec.expr.holders.IntHolder;
+import org.apache.drill.exec.expr.holders.VarBinaryHolder;
+
+@FunctionTemplate(name = "convert_toINT_OB", scope = FunctionScope.SIMPLE, nulls = NullHandling.NULL_IF_NULL)
+public class OrderedBytesIntConvertTo implements DrillSimpleFunc {
+
+  @Param IntHolder in;
+  @Output VarBinaryHolder out;
+  @Inject DrillBuf buffer;
+  @Workspace byte[] bytes;
+  @Workspace org.apache.hadoop.hbase.util.PositionedByteRange br;
+
+  @Override
+  public void setup() {
+    buffer = buffer.reallocIfNeeded(5);
+    bytes = new byte[5];
+    br = new org.apache.hadoop.hbase.util.SimplePositionedByteRange();
+  }
+
+  @Override
+  public void eval() {
+    buffer.clear();
+    br.set(bytes);
+    org.apache.hadoop.hbase.util.OrderedBytes.encodeInt32(br, in.value,
+            org.apache.hadoop.hbase.util.Order.ASCENDING);
+
+    buffer.setBytes(0, bytes, 0, 5);
+    out.buffer = buffer;
+    out.start = 0;
+    out.end = 5;
+  }
+}

http://git-wip-us.apache.org/repos/asf/drill/blob/95623912/contrib/storage-hbase/src/main/java/org/apache/drill/exec/expr/fn/impl/conv/OrderedBytesIntDescConvertTo.java
----------------------------------------------------------------------
diff --git a/contrib/storage-hbase/src/main/java/org/apache/drill/exec/expr/fn/impl/conv/OrderedBytesIntDescConvertTo.java b/contrib/storage-hbase/src/main/java/org/apache/drill/exec/expr/fn/impl/conv/OrderedBytesIntDescConvertTo.java
new file mode 100644
index 0000000..6ed4fbf
--- /dev/null
+++ b/contrib/storage-hbase/src/main/java/org/apache/drill/exec/expr/fn/impl/conv/OrderedBytesIntDescConvertTo.java
@@ -0,0 +1,63 @@
+/*******************************************************************************
+
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ ******************************************************************************/
+package org.apache.drill.exec.expr.fn.impl.conv;
+
+import io.netty.buffer.DrillBuf;
+
+import javax.inject.Inject;
+
+import org.apache.drill.exec.expr.DrillSimpleFunc;
+import org.apache.drill.exec.expr.annotations.FunctionTemplate;
+import org.apache.drill.exec.expr.annotations.FunctionTemplate.FunctionScope;
+import org.apache.drill.exec.expr.annotations.FunctionTemplate.NullHandling;
+import org.apache.drill.exec.expr.annotations.Output;
+import org.apache.drill.exec.expr.annotations.Param;
+import org.apache.drill.exec.expr.annotations.Workspace;
+import org.apache.drill.exec.expr.holders.IntHolder;
+import org.apache.drill.exec.expr.holders.VarBinaryHolder;
+
+@FunctionTemplate(name = "convert_toINT_OBD", scope = FunctionScope.SIMPLE, nulls = NullHandling.NULL_IF_NULL)
+public class OrderedBytesIntDescConvertTo implements DrillSimpleFunc {
+
+  @Param IntHolder in;
+  @Output VarBinaryHolder out;
+  @Inject DrillBuf buffer;
+  @Workspace byte[] bytes;
+  @Workspace org.apache.hadoop.hbase.util.PositionedByteRange br;
+
+  @Override
+  public void setup() {
+    buffer = buffer.reallocIfNeeded(5);
+    bytes = new byte[5];
+    br = new org.apache.hadoop.hbase.util.SimplePositionedByteRange();
+  }
+
+  @Override
+  public void eval() {
+    buffer.clear();
+    br.set(bytes);
+    org.apache.hadoop.hbase.util.OrderedBytes.encodeInt32(br, in.value,
+            org.apache.hadoop.hbase.util.Order.DESCENDING);
+
+    buffer.setBytes(0, bytes, 0, 5);
+    out.buffer = buffer;
+    out.start = 0;
+    out.end = 5;
+  }
+}
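An aside for reviewers: the four encode functions above each write a fixed-width value (the OrderedBytes int32/float32 encodings are 5 bytes: a type-header byte plus a 4-byte payload, which is why the buffers are sized to 5). The property they rely on is that numeric order survives as unsigned lexicographic byte order. Below is a minimal sketch of that property in plain Java, without the HBase dependency; the class and method names are invented for illustration and are not part of this patch.

```java
// Illustration only: why sign-bit-flipped big-endian ints sort correctly
// as unsigned byte arrays. This is NOT the HBase OrderedBytes format,
// which additionally prepends a type-header byte.
public class OrderAsBytesDemo {

    // Flip the sign bit, then write big-endian: unsigned lexicographic
    // order of the result matches signed numeric order of the input.
    static byte[] encode(int v) {
        int flipped = v ^ 0x80000000;
        return new byte[] {
            (byte) (flipped >>> 24), (byte) (flipped >>> 16),
            (byte) (flipped >>> 8),  (byte) flipped
        };
    }

    // Unsigned lexicographic comparison (assumes equal lengths).
    static int compareUnsigned(byte[] a, byte[] b) {
        for (int i = 0; i < a.length; i++) {
            int cmp = (a[i] & 0xFF) - (b[i] & 0xFF);
            if (cmp != 0) {
                return cmp;
            }
        }
        return 0;
    }

    public static void main(String[] args) {
        int[] vals = { Integer.MIN_VALUE, -1000, -1, 0, 1, 1000, Integer.MAX_VALUE };
        for (int i = 0; i + 1 < vals.length; i++) {
            if (compareUnsigned(encode(vals[i]), encode(vals[i + 1])) >= 0) {
                throw new AssertionError("order not preserved at index " + i);
            }
        }
        System.out.println("numeric order preserved as unsigned byte order");
    }
}
```

The real OrderedBytes.encodeInt32/encodeFloat32 also handle the type header and NULL ordering; this sketch only shows why row keys encoded this way can be range-pruned with plain byte comparisons.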

http://git-wip-us.apache.org/repos/asf/drill/blob/95623912/contrib/storage-hbase/src/main/java/org/apache/drill/exec/store/hbase/CompareFunctionsProcessor.java
----------------------------------------------------------------------
diff --git a/contrib/storage-hbase/src/main/java/org/apache/drill/exec/store/hbase/CompareFunctionsProcessor.java b/contrib/storage-hbase/src/main/java/org/apache/drill/exec/store/hbase/CompareFunctionsProcessor.java
index 87eb42e..2527e8d 100644
--- a/contrib/storage-hbase/src/main/java/org/apache/drill/exec/store/hbase/CompareFunctionsProcessor.java
+++ b/contrib/storage-hbase/src/main/java/org/apache/drill/exec/store/hbase/CompareFunctionsProcessor.java
@@ -38,6 +38,10 @@ import org.apache.drill.common.expression.ValueExpressions.QuotedString;
 import org.apache.drill.common.expression.ValueExpressions.TimeExpression;
 import org.apache.drill.common.expression.ValueExpressions.TimeStampExpression;
 import org.apache.drill.common.expression.visitors.AbstractExprVisitor;
+import org.apache.hadoop.hbase.util.Order;
+import org.apache.hadoop.hbase.util.PositionedByteRange;
+import org.apache.hadoop.hbase.util.SimplePositionedByteRange;
+
 import org.apache.hadoop.hbase.HConstants;
 import org.apache.hadoop.hbase.filter.Filter;
 import org.apache.hadoop.hbase.filter.PrefixFilter;
@@ -52,6 +56,7 @@ class CompareFunctionsProcessor extends AbstractExprVisitor<Boolean, LogicalExpr
   private boolean isEqualityFn;
   private SchemaPath path;
   private String functionName;
+  private boolean sortOrderAscending;
 
   // Fields for row-key prefix comparison
   // If the query is on row-key prefix, we cannot use a standard template to identify startRow, stopRow and filter
@@ -93,6 +98,7 @@ class CompareFunctionsProcessor extends AbstractExprVisitor<Boolean, LogicalExpr
     this.isEqualityFn = COMPARE_FUNCTIONS_TRANSPOSE_MAP.containsKey(functionName)
         && COMPARE_FUNCTIONS_TRANSPOSE_MAP.get(functionName).equals(functionName);
     this.isRowKeyPrefixComparison = false;
+    this.sortOrderAscending = true;
   }
 
   public byte[] getValue() {
@@ -127,6 +133,10 @@ class CompareFunctionsProcessor extends AbstractExprVisitor<Boolean, LogicalExpr
   return rowKeyPrefixFilter;
   }
 
+  public boolean isSortOrderAscending() {
+    return sortOrderAscending;
+  }
+
   @Override
   public Boolean visitCastExpression(CastExpression e, LogicalExpression valueArg) throws RuntimeException {
     if (e.getInput() instanceof CastExpression || e.getInput() instanceof SchemaPath) {
@@ -240,6 +250,66 @@ class CompareFunctionsProcessor extends AbstractExprVisitor<Boolean, LogicalExpr
             bb.writeByte(((BooleanExpression)valueArg).getBoolean() ? 1 : 0);
           }
           break;
+        case "DOUBLE_OB":
+        case "DOUBLE_OBD":
+          if (valueArg instanceof DoubleExpression) {
+            bb = newByteBuf(9, true);
+            PositionedByteRange br = new SimplePositionedByteRange(bb.array(), 0, 9);
+            if (encodingType.endsWith("_OBD")) {
+              org.apache.hadoop.hbase.util.OrderedBytes.encodeFloat64(br,
+                  ((DoubleExpression)valueArg).getDouble(), Order.DESCENDING);
+              this.sortOrderAscending = false;
+            } else {
+              org.apache.hadoop.hbase.util.OrderedBytes.encodeFloat64(br,
+                  ((DoubleExpression)valueArg).getDouble(), Order.ASCENDING);
+            }
+          }
+          break;
+        case "FLOAT_OB":
+        case "FLOAT_OBD":
+          if (valueArg instanceof FloatExpression) {
+            bb = newByteBuf(5, true);
+            PositionedByteRange br = new SimplePositionedByteRange(bb.array(), 0, 5);
+            if (encodingType.endsWith("_OBD")) {
+              org.apache.hadoop.hbase.util.OrderedBytes.encodeFloat32(br,
+                  ((FloatExpression)valueArg).getFloat(), Order.DESCENDING);
+              this.sortOrderAscending = false;
+            } else {
+              org.apache.hadoop.hbase.util.OrderedBytes.encodeFloat32(br,
+                        ((FloatExpression)valueArg).getFloat(), Order.ASCENDING);
+            }
+          }
+          break;
+        case "BIGINT_OB":
+        case "BIGINT_OBD":
+          if (valueArg instanceof LongExpression) {
+            bb = newByteBuf(9, true);
+            PositionedByteRange br = new SimplePositionedByteRange(bb.array(), 0, 9);
+            if (encodingType.endsWith("_OBD")) {
+              org.apache.hadoop.hbase.util.OrderedBytes.encodeInt64(br,
+                        ((LongExpression)valueArg).getLong(), Order.DESCENDING);
+              this.sortOrderAscending = false;
+            } else {
+              org.apache.hadoop.hbase.util.OrderedBytes.encodeInt64(br,
+                  ((LongExpression)valueArg).getLong(), Order.ASCENDING);
+            }
+          }
+          break;
+        case "INT_OB":
+        case "INT_OBD":
+          if (valueArg instanceof IntExpression) {
+            bb = newByteBuf(5, true);
+            PositionedByteRange br = new SimplePositionedByteRange(bb.array(), 0, 5);
+            if (encodingType.endsWith("_OBD")) {
+              org.apache.hadoop.hbase.util.OrderedBytes.encodeInt32(br,
+                  ((IntExpression)valueArg).getInt(), Order.DESCENDING);
+              this.sortOrderAscending = false;
+            } else {
+              org.apache.hadoop.hbase.util.OrderedBytes.encodeInt32(br,
+                        ((IntExpression)valueArg).getInt(), Order.ASCENDING);
+            }
+          }
+          break;
         case "UTF8":
           // let visitSchemaPath() handle this.
           return e.getInput().accept(this, valueArg);
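A note on the *_OBD branches above: they record sortOrderAscending = false because, to my understanding, OrderedBytes DESCENDING output is the bytewise complement of the ASCENDING output, which reverses the byte-level sort order and is why the comparison operators get transposed in HBaseFilterBuilder. A small sketch of that complement property in plain Java; the class and helpers are hypothetical, not from this patch.

```java
// Illustration only: complementing every byte of an order-preserving
// encoding reverses its unsigned lexicographic sort order. Invented
// names; not the actual HBase OrderedBytes implementation.
public class DescendingOrderDemo {

    // Complement every byte; reverses unsigned lexicographic order.
    static byte[] complement(byte[] asc) {
        byte[] desc = new byte[asc.length];
        for (int i = 0; i < asc.length; i++) {
            desc[i] = (byte) ~asc[i];
        }
        return desc;
    }

    // Unsigned lexicographic comparison (assumes equal lengths).
    static int compareUnsigned(byte[] a, byte[] b) {
        for (int i = 0; i < a.length; i++) {
            int cmp = (a[i] & 0xFF) - (b[i] & 0xFF);
            if (cmp != 0) {
                return cmp;
            }
        }
        return 0;
    }

    public static void main(String[] args) {
        byte[] asc10 = { (byte) 0x80, 0, 0, 10 };  // ascending-style encoding of 10
        byte[] asc20 = { (byte) 0x80, 0, 0, 20 };  // ascending-style encoding of 20
        // Ascending: 10 sorts before 20; after complement the order flips,
        // so a "row_key > literal" predicate must scan with a LESS comparison.
        if (compareUnsigned(asc10, asc20) >= 0
            || compareUnsigned(complement(asc10), complement(asc20)) <= 0) {
            throw new AssertionError("complement did not reverse the order");
        }
        System.out.println("descending encoding reverses byte order");
    }
}
```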

http://git-wip-us.apache.org/repos/asf/drill/blob/95623912/contrib/storage-hbase/src/main/java/org/apache/drill/exec/store/hbase/HBaseFilterBuilder.java
----------------------------------------------------------------------
diff --git a/contrib/storage-hbase/src/main/java/org/apache/drill/exec/store/hbase/HBaseFilterBuilder.java b/contrib/storage-hbase/src/main/java/org/apache/drill/exec/store/hbase/HBaseFilterBuilder.java
index 623846f..0e25fa6 100644
--- a/contrib/storage-hbase/src/main/java/org/apache/drill/exec/store/hbase/HBaseFilterBuilder.java
+++ b/contrib/storage-hbase/src/main/java/org/apache/drill/exec/store/hbase/HBaseFilterBuilder.java
@@ -155,6 +155,7 @@ public class HBaseFilterBuilder extends AbstractExprVisitor<HBaseScanSpec, Void,
     String functionName = processor.getFunctionName();
     SchemaPath field = processor.getPath();
     byte[] fieldValue = processor.getValue();
+    boolean sortOrderAscending = processor.isSortOrderAscending();
     boolean isRowKey = field.getAsUnescapedPath().equals(ROW_KEY);
     if (!(isRowKey
         || (!field.getRootSegment().isLastPath()
@@ -191,29 +192,59 @@ public class HBaseFilterBuilder extends AbstractExprVisitor<HBaseScanSpec, Void,
       compareOp = CompareOp.NOT_EQUAL;
       break;
     case "greater_than_or_equal_to":
-      compareOp = CompareOp.GREATER_OR_EQUAL;
-      if (isRowKey) {
-        startRow = fieldValue;
+      if (sortOrderAscending) {
+        compareOp = CompareOp.GREATER_OR_EQUAL;
+        if (isRowKey) {
+          startRow = fieldValue;
+        }
+      } else {
+        compareOp = CompareOp.LESS_OR_EQUAL;
+        if (isRowKey) {
+          // stopRow should be just greater than 'value'
+          stopRow = Arrays.copyOf(fieldValue, fieldValue.length+1);
+        }
       }
       break;
     case "greater_than":
-      compareOp = CompareOp.GREATER;
-      if (isRowKey) {
-        // startRow should be just greater than 'value'
-        startRow = Arrays.copyOf(fieldValue, fieldValue.length+1);
+      if (sortOrderAscending) {
+        compareOp = CompareOp.GREATER;
+        if (isRowKey) {
+          // startRow should be just greater than 'value'
+          startRow = Arrays.copyOf(fieldValue, fieldValue.length+1);
+        }
+      } else {
+        compareOp = CompareOp.LESS;
+        if (isRowKey) {
+          stopRow = fieldValue;
+        }
       }
       break;
     case "less_than_or_equal_to":
-      compareOp = CompareOp.LESS_OR_EQUAL;
-      if (isRowKey) {
-        // stopRow should be just greater than 'value'
-        stopRow = Arrays.copyOf(fieldValue, fieldValue.length+1);
+      if (sortOrderAscending) {
+        compareOp = CompareOp.LESS_OR_EQUAL;
+        if (isRowKey) {
+          // stopRow should be just greater than 'value'
+          stopRow = Arrays.copyOf(fieldValue, fieldValue.length+1);
+        }
+      } else {
+        compareOp = CompareOp.GREATER_OR_EQUAL;
+        if (isRowKey) {
+          startRow = fieldValue;
+        }
       }
       break;
     case "less_than":
-      compareOp = CompareOp.LESS;
-      if (isRowKey) {
-        stopRow = fieldValue;
+      if (sortOrderAscending) {
+        compareOp = CompareOp.LESS;
+        if (isRowKey) {
+          stopRow = fieldValue;
+        }
+      } else {
+        compareOp = CompareOp.GREATER;
+        if (isRowKey) {
+          // startRow should be just greater than 'value'
+          startRow = Arrays.copyOf(fieldValue, fieldValue.length+1);
+        }
       }
       break;
     case "isnull":
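On the "should be just greater than 'value'" comments in the hunk above: Arrays.copyOf(fieldValue, fieldValue.length+1) zero-pads the extra slot, i.e. appends a single 0x00 byte, which is the smallest byte string strictly greater than the key in unsigned lexicographic order. A standalone sketch (hypothetical class name, not part of the patch):

```java
import java.util.Arrays;

// Illustration only: appending 0x00 yields the immediate successor of a
// key under unsigned lexicographic ordering, making it usable as an
// exclusive scan bound.
public class ExclusiveBoundDemo {

    static byte[] justGreaterThan(byte[] key) {
        // The new trailing element defaults to (byte) 0x00.
        return Arrays.copyOf(key, key.length + 1);
    }

    // Unsigned lexicographic comparison; a shorter prefix sorts first.
    static int compareUnsigned(byte[] a, byte[] b) {
        int n = Math.min(a.length, b.length);
        for (int i = 0; i < n; i++) {
            int cmp = (a[i] & 0xFF) - (b[i] & 0xFF);
            if (cmp != 0) {
                return cmp;
            }
        }
        return a.length - b.length;
    }

    public static void main(String[] args) {
        byte[] key = { 0x12, 0x34 };
        byte[] bound = justGreaterThan(key);
        if (compareUnsigned(bound, key) <= 0) {
            throw new AssertionError("bound is not greater than key");
        }
        System.out.println("key + 0x00 is the immediate successor of key");
    }
}
```

This is why the ascending "greater_than" case can reuse the same trick for startRow that the descending "greater_than_or_equal_to" case uses for stopRow.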

http://git-wip-us.apache.org/repos/asf/drill/blob/95623912/contrib/storage-hbase/src/test/java/org/apache/drill/hbase/HBaseTestsSuite.java
----------------------------------------------------------------------
diff --git a/contrib/storage-hbase/src/test/java/org/apache/drill/hbase/HBaseTestsSuite.java b/contrib/storage-hbase/src/test/java/org/apache/drill/hbase/HBaseTestsSuite.java
index a5dbc6f..2063503 100644
--- a/contrib/storage-hbase/src/test/java/org/apache/drill/hbase/HBaseTestsSuite.java
+++ b/contrib/storage-hbase/src/test/java/org/apache/drill/hbase/HBaseTestsSuite.java
@@ -53,6 +53,14 @@ public class HBaseTestsSuite {
   protected static final String TEST_TABLE_COMPOSITE_DATE = "TestTableCompositeDate";
   protected static final String TEST_TABLE_COMPOSITE_TIME = "TestTableCompositeTime";
   protected static final String TEST_TABLE_COMPOSITE_INT = "TestTableCompositeInt";
+  protected static final String TEST_TABLE_DOUBLE_OB = "TestTableDoubleOB";
+  protected static final String TEST_TABLE_FLOAT_OB = "TestTableFloatOB";
+  protected static final String TEST_TABLE_BIGINT_OB = "TestTableBigIntOB";
+  protected static final String TEST_TABLE_INT_OB = "TestTableIntOB";
+  protected static final String TEST_TABLE_DOUBLE_OB_DESC = "TestTableDoubleOBDesc";
+  protected static final String TEST_TABLE_FLOAT_OB_DESC = "TestTableFloatOBDesc";
+  protected static final String TEST_TABLE_BIGINT_OB_DESC = "TestTableBigIntOBDesc";
+  protected static final String TEST_TABLE_INT_OB_DESC = "TestTableIntOBDesc";
 
   private static Configuration conf;
 
@@ -138,10 +146,18 @@ public class HBaseTestsSuite {
   }
 
   private static boolean tablesExist() throws IOException {
-    return admin.tableExists(TEST_TABLE_1) && admin.tableExists(TEST_TABLE_3) &&
-           admin.tableExists(TEST_TABLE_COMPOSITE_DATE) &&
-           admin.tableExists(TEST_TABLE_COMPOSITE_TIME) &&
-           admin.tableExists(TEST_TABLE_COMPOSITE_INT);
+    return admin.tableExists(TEST_TABLE_1) && admin.tableExists(TEST_TABLE_3)
+           && admin.tableExists(TEST_TABLE_COMPOSITE_DATE)
+           && admin.tableExists(TEST_TABLE_COMPOSITE_TIME)
+           && admin.tableExists(TEST_TABLE_COMPOSITE_INT)
+           && admin.tableExists(TEST_TABLE_DOUBLE_OB)
+           && admin.tableExists(TEST_TABLE_FLOAT_OB)
+           && admin.tableExists(TEST_TABLE_BIGINT_OB)
+           && admin.tableExists(TEST_TABLE_INT_OB)
+           && admin.tableExists(TEST_TABLE_DOUBLE_OB_DESC)
+           && admin.tableExists(TEST_TABLE_FLOAT_OB_DESC)
+           && admin.tableExists(TEST_TABLE_BIGINT_OB_DESC)
+           && admin.tableExists(TEST_TABLE_INT_OB_DESC);
   }
 
   private static void createTestTables() throws Exception {
@@ -155,6 +171,14 @@ public class HBaseTestsSuite {
     TestTableGenerator.generateHBaseDatasetCompositeKeyDate(admin, TEST_TABLE_COMPOSITE_DATE, 1);
     TestTableGenerator.generateHBaseDatasetCompositeKeyTime(admin, TEST_TABLE_COMPOSITE_TIME, 1);
     TestTableGenerator.generateHBaseDatasetCompositeKeyInt(admin, TEST_TABLE_COMPOSITE_INT, 1);
+    TestTableGenerator.generateHBaseDatasetDoubleOB(admin, TEST_TABLE_DOUBLE_OB, 1);
+    TestTableGenerator.generateHBaseDatasetFloatOB(admin, TEST_TABLE_FLOAT_OB, 1);
+    TestTableGenerator.generateHBaseDatasetBigIntOB(admin, TEST_TABLE_BIGINT_OB, 1);
+    TestTableGenerator.generateHBaseDatasetIntOB(admin, TEST_TABLE_INT_OB, 1);
+    TestTableGenerator.generateHBaseDatasetDoubleOBDesc(admin, TEST_TABLE_DOUBLE_OB_DESC, 1);
+    TestTableGenerator.generateHBaseDatasetFloatOBDesc(admin, TEST_TABLE_FLOAT_OB_DESC, 1);
+    TestTableGenerator.generateHBaseDatasetBigIntOBDesc(admin, TEST_TABLE_BIGINT_OB_DESC, 1);
+    TestTableGenerator.generateHBaseDatasetIntOBDesc(admin, TEST_TABLE_INT_OB_DESC, 1);
   }
 
   private static void cleanupTestTables() throws IOException {
@@ -168,6 +192,22 @@ public class HBaseTestsSuite {
     admin.deleteTable(TEST_TABLE_COMPOSITE_TIME);
     admin.disableTable(TEST_TABLE_COMPOSITE_INT);
     admin.deleteTable(TEST_TABLE_COMPOSITE_INT);
+    admin.disableTable(TEST_TABLE_DOUBLE_OB);
+    admin.deleteTable(TEST_TABLE_DOUBLE_OB);
+    admin.disableTable(TEST_TABLE_FLOAT_OB);
+    admin.deleteTable(TEST_TABLE_FLOAT_OB);
+    admin.disableTable(TEST_TABLE_BIGINT_OB);
+    admin.deleteTable(TEST_TABLE_BIGINT_OB);
+    admin.disableTable(TEST_TABLE_INT_OB);
+    admin.deleteTable(TEST_TABLE_INT_OB);
+    admin.disableTable(TEST_TABLE_DOUBLE_OB_DESC);
+    admin.deleteTable(TEST_TABLE_DOUBLE_OB_DESC);
+    admin.disableTable(TEST_TABLE_FLOAT_OB_DESC);
+    admin.deleteTable(TEST_TABLE_FLOAT_OB_DESC);
+    admin.disableTable(TEST_TABLE_BIGINT_OB_DESC);
+    admin.deleteTable(TEST_TABLE_BIGINT_OB_DESC);
+    admin.disableTable(TEST_TABLE_INT_OB_DESC);
+    admin.deleteTable(TEST_TABLE_INT_OB_DESC);
   }
 
   public static int getZookeeperPort() {

http://git-wip-us.apache.org/repos/asf/drill/blob/95623912/contrib/storage-hbase/src/test/java/org/apache/drill/hbase/TestHBaseFilterPushDown.java
----------------------------------------------------------------------
diff --git a/contrib/storage-hbase/src/test/java/org/apache/drill/hbase/TestHBaseFilterPushDown.java b/contrib/storage-hbase/src/test/java/org/apache/drill/hbase/TestHBaseFilterPushDown.java
index 5c3a463..05fb0b7 100644
--- a/contrib/storage-hbase/src/test/java/org/apache/drill/hbase/TestHBaseFilterPushDown.java
+++ b/contrib/storage-hbase/src/test/java/org/apache/drill/hbase/TestHBaseFilterPushDown.java
@@ -226,6 +226,234 @@ public class TestHBaseFilterPushDown extends BaseHBaseTest {
   }
 
   @Test
+  public void testFilterPushDownDoubleOB() throws Exception {
+    setColumnWidths(new int[] {8, 25});
+    runHBaseSQLVerifyCount("SELECT\n"
+        + " convert_from(t.row_key, 'DOUBLE_OB') rk,\n"
+        + " convert_from(t.`f`.`c`, 'UTF8') val\n"
+        + "FROM\n"
+        + "  hbase.`TestTableDoubleOB` t\n"
+        + "WHERE\n"
+        + "  CONVERT_FROM(row_key, 'DOUBLE_OB') > cast(95.54 as DOUBLE)"
+        , 6);
+  }
+
+  @Test
+  public void testFilterPushDownDoubleOBPlan() throws Exception {
+    setColumnWidths(new int[] {8, 2000});
+    runHBaseSQLVerifyCount("EXPLAIN PLAN FOR\n"
+        + "SELECT\n"
+        + " convert_from(t.row_key, 'DOUBLE_OB') rk,\n"
+        + " convert_from(t.`f`.`c`, 'UTF8') val\n"
+        + "FROM\n"
+        + "  hbase.`TestTableDoubleOB` t\n"
+        + "WHERE\n"
+        + "  CONVERT_FROM(row_key, 'DOUBLE_OB') > cast(95.54 as DOUBLE)"
+        , 1);
+  }
+
+  @Test
+  public void testFilterPushDownDoubleOBDesc() throws Exception {
+    setColumnWidths(new int[] {8, 25});
+    runHBaseSQLVerifyCount("SELECT\n"
+        + " convert_from(t.row_key, 'DOUBLE_OBD') rk,\n"
+        + " convert_from(t.`f`.`c`, 'UTF8') val\n"
+        + "FROM\n"
+        + "  hbase.`TestTableDoubleOBDesc` t\n"
+        + "WHERE\n"
+        + "  CONVERT_FROM(row_key, 'DOUBLE_OBD') > cast(95.54 as DOUBLE)"
+        , 6);
+  }
+
+  @Test
+  public void testFilterPushDownDoubleOBDescPlan() throws Exception {
+    setColumnWidths(new int[] {8, 2000});
+    runHBaseSQLVerifyCount("EXPLAIN PLAN FOR\n"
+        + "SELECT\n"
+        + " convert_from(t.row_key, 'DOUBLE_OBD') rk,\n"
+        + " convert_from(t.`f`.`c`, 'UTF8') val\n"
+        + "FROM\n"
+        + "  hbase.`TestTableDoubleOBDesc` t\n"
+        + "WHERE\n"
+        + "  CONVERT_FROM(row_key, 'DOUBLE_OBD') > cast(95.54 as DOUBLE)"
+        , 1);
+  }
+
+  @Test
+  public void testFilterPushDownIntOB() throws Exception {
+    setColumnWidths(new int[] {15, 25});
+    runHBaseSQLVerifyCount("SELECT\n"
+        + " convert_from(t.row_key, 'INT_OB') rk,\n"
+        + " convert_from(t.`f`.`c`, 'UTF8') val\n"
+        + "FROM\n"
+        + "  hbase.`TestTableIntOB` t\n"
+        + "WHERE\n"
+        + "  CONVERT_FROM(row_key, 'INT_OB') >= cast(-32 as INT) AND"
+        + "  CONVERT_FROM(row_key, 'INT_OB') < cast(59 as INT)"
+        , 91);
+  }
+
+  @Test
+  public void testFilterPushDownIntOBDesc() throws Exception {
+    setColumnWidths(new int[] {15, 25});
+    runHBaseSQLVerifyCount("SELECT\n"
+        + " convert_from(t.row_key, 'INT_OBD') rk,\n"
+        + " convert_from(t.`f`.`c`, 'UTF8') val\n"
+        + "FROM\n"
+        + "  hbase.`TestTableIntOBDesc` t\n"
+        + "WHERE\n"
+        + "  CONVERT_FROM(row_key, 'INT_OBD') >= cast(-32 as INT) AND"
+        + "  CONVERT_FROM(row_key, 'INT_OBD') < cast(59 as INT)"
+        , 91);
+  }
+
+  @Test
+  public void testFilterPushDownIntOBPlan() throws Exception {
+    setColumnWidths(new int[] {15, 2000});
+    runHBaseSQLVerifyCount("EXPLAIN PLAN FOR\n"
+        + "SELECT\n"
+        + " convert_from(t.row_key, 'INT_OB') rk,\n"
+        + " convert_from(t.`f`.`c`, 'UTF8') val\n"
+        + "FROM\n"
+        + "  hbase.`TestTableIntOB` t\n"
+        + "WHERE\n"
+        + "  CONVERT_FROM(row_key, 'INT_OB') > cast(-23 as INT) AND"
+        + "  CONVERT_FROM(row_key, 'INT_OB') < cast(14 as INT)"
+        , 1);
+  }
+
+  @Test
+  public void testFilterPushDownIntOBDescPlan() throws Exception {
+    setColumnWidths(new int[] {15, 2000});
+    runHBaseSQLVerifyCount("EXPLAIN PLAN FOR\n"
+        + "SELECT\n"
+        + " convert_from(t.row_key, 'INT_OBD') rk,\n"
+        + " convert_from(t.`f`.`c`, 'UTF8') val\n"
+        + "FROM\n"
+        + "  hbase.`TestTableIntOBDesc` t\n"
+        + "WHERE\n"
+        + "  CONVERT_FROM(row_key, 'INT_OBD') > cast(-23 as INT) AND"
+        + "  CONVERT_FROM(row_key, 'INT_OBD') < cast(14 as INT)"
+        , 1);
+  }
+
+  @Test
+  public void testFilterPushDownBigIntOB() throws Exception {
+    setColumnWidths(new int[] {15, 25});
+    runHBaseSQLVerifyCount("SELECT\n"
+        + " convert_from(t.row_key, 'BIGINT_OB') rk,\n"
+        + " convert_from(t.`f`.`c`, 'UTF8') val\n"
+        + "FROM\n"
+        + "  hbase.`TestTableBigIntOB` t\n"
+        + "WHERE\n"
+        + "  CONVERT_FROM(row_key, 'BIGINT_OB') > cast(1438034423063 as BIGINT) AND"
+        + "  CONVERT_FROM(row_key, 'BIGINT_OB') <= cast(1438034423097 as BIGINT)"
+        , 34);
+  }
+
+  @Test
+  public void testFilterPushDownBigIntOBPlan() throws Exception {
+    setColumnWidths(new int[] {15, 2000});
+    runHBaseSQLVerifyCount("EXPLAIN PLAN FOR\n"
+        + "SELECT\n"
+        + " convert_from(t.row_key, 'BIGINT_OB') rk,\n"
+        + " convert_from(t.`f`.`c`, 'UTF8') val\n"
+        + "FROM\n"
+        + "  hbase.`TestTableBigIntOB` t\n"
+        + "WHERE\n"
+        + "  CONVERT_FROM(row_key, 'BIGINT_OB') > cast(1438034423063 as BIGINT) AND"
+        + "  CONVERT_FROM(row_key, 'BIGINT_OB') < cast(1438034423097 as BIGINT)"
+        , 1);
+  }
+
+  @Test
+  public void testFilterPushDownFloatOB() throws Exception {
+    setColumnWidths(new int[] {8, 25});
+    runHBaseSQLVerifyCount("SELECT\n"
+        + " convert_from(t.row_key, 'FLOAT_OB') rk,\n"
+        + " convert_from(t.`f`.`c`, 'UTF8') val\n"
+        + "FROM\n"
+        + "  hbase.`TestTableFloatOB` t\n"
+        + "WHERE\n"
+        + "  CONVERT_FROM(row_key, 'FLOAT_OB') > cast(95.74 as FLOAT) AND"
+        + "  CONVERT_FROM(row_key, 'FLOAT_OB') < cast(99.5 as FLOAT)"
+        , 5);
+  }
+
+  @Test
+  public void testFilterPushDownFloatOBPlan() throws Exception {
+    setColumnWidths(new int[] {8, 2000});
+    runHBaseSQLVerifyCount("EXPLAIN PLAN FOR\n"
+        + "SELECT\n"
+        + " convert_from(t.row_key, 'FLOAT_OB') rk,\n"
+        + " convert_from(t.`f`.`c`, 'UTF8') val\n"
+        + "FROM\n"
+        + "  hbase.`TestTableFloatOB` t\n"
+        + "WHERE\n"
+        + "  CONVERT_FROM(row_key, 'FLOAT_OB') > cast(95.54 as FLOAT) AND"
+        + "  CONVERT_FROM(row_key, 'FLOAT_OB') < cast(99.77 as FLOAT)"
+        , 1);
+  }
+
+  @Test
+  public void testFilterPushDownBigIntOBDesc() throws Exception {
+    setColumnWidths(new int[] {15, 25});
+    runHBaseSQLVerifyCount("SELECT\n"
+        + " convert_from(t.row_key, 'BIGINT_OBD') rk,\n"
+        + " convert_from(t.`f`.`c`, 'UTF8') val\n"
+        + "FROM\n"
+        + "  hbase.`TestTableBigIntOBDesc` t\n"
+        + "WHERE\n"
+        + "  CONVERT_FROM(row_key, 'BIGINT_OBD') > cast(1438034423063 as BIGINT) AND"
+        + "  CONVERT_FROM(row_key, 'BIGINT_OBD') <= cast(1438034423097 as BIGINT)"
+        , 34);
+  }
+
+  @Test
+  public void testFilterPushDownBigIntOBDescPlan() throws Exception {
+    setColumnWidths(new int[] {15, 2000});
+    runHBaseSQLVerifyCount("EXPLAIN PLAN FOR\n"
+        + "SELECT\n"
+        + " convert_from(t.row_key, 'BIGINT_OBD') rk,\n"
+        + " convert_from(t.`f`.`c`, 'UTF8') val\n"
+        + "FROM\n"
+        + "  hbase.`TestTableBigIntOBDesc` t\n"
+        + "WHERE\n"
+        + "  CONVERT_FROM(row_key, 'BIGINT_OBD') > cast(1438034423063 as BIGINT) AND"
+        + "  CONVERT_FROM(row_key, 'BIGINT_OBD') < cast(1438034423097 as BIGINT)"
+        , 1);
+  }
+
+  @Test
+  public void testFilterPushDownFloatOBDesc() throws Exception {
+    setColumnWidths(new int[] {8, 25});
+    runHBaseSQLVerifyCount("SELECT\n"
+        + " convert_from(t.row_key, 'FLOAT_OBD') rk,\n"
+        + " convert_from(t.`f`.`c`, 'UTF8') val\n"
+        + "FROM\n"
+        + "  hbase.`TestTableFloatOBDesc` t\n"
+        + "WHERE\n"
+        + "  CONVERT_FROM(row_key, 'FLOAT_OBD') > cast(95.74 as FLOAT) AND"
+        + "  CONVERT_FROM(row_key, 'FLOAT_OBD') < cast(99.5 as FLOAT)"
+        , 5);
+  }
+
+  @Test
+  public void testFilterPushDownFloatOBDescPlan() throws Exception {
+    setColumnWidths(new int[] {8, 2000});
+    runHBaseSQLVerifyCount("EXPLAIN PLAN FOR\n"
+        + "SELECT\n"
+        + " convert_from(t.row_key, 'FLOAT_OBD') rk,\n"
+        + " convert_from(t.`f`.`c`, 'UTF8') val\n"
+        + "FROM\n"
+        + "  hbase.`TestTableFloatOBDesc` t\n"
+        + "WHERE\n"
+        + "  CONVERT_FROM(row_key, 'FLOAT_OBD') > cast(95.54 as FLOAT) AND"
+        + "  CONVERT_FROM(row_key, 'FLOAT_OBD') < cast(99.77 as FLOAT)"
+        , 1);
+  }
+
+  @Test
   public void testFilterPushDownRowKeyLike() throws Exception {
     setColumnWidths(new int[] {8, 22});
     final String sql = "SELECT\n"

http://git-wip-us.apache.org/repos/asf/drill/blob/95623912/contrib/storage-hbase/src/test/java/org/apache/drill/hbase/TestOrderedBytesConvertFunctions.java
----------------------------------------------------------------------
diff --git a/contrib/storage-hbase/src/test/java/org/apache/drill/hbase/TestOrderedBytesConvertFunctions.java b/contrib/storage-hbase/src/test/java/org/apache/drill/hbase/TestOrderedBytesConvertFunctions.java
new file mode 100644
index 0000000..96c3668
--- /dev/null
+++ b/contrib/storage-hbase/src/test/java/org/apache/drill/hbase/TestOrderedBytesConvertFunctions.java
@@ -0,0 +1,150 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.drill.hbase;
+
+import static org.apache.drill.TestBuilder.listOf;
+import static org.apache.drill.TestBuilder.mapOf;
+import static org.junit.Assert.assertArrayEquals;
+import static org.junit.Assert.assertEquals;
+import static org.junit.Assert.assertNotNull;
+import static org.junit.Assert.assertTrue;
+import static org.junit.Assert.fail;
+import io.netty.buffer.DrillBuf;
+
+import java.util.ArrayList;
+import java.util.List;
+
+import mockit.Injectable;
+
+import org.apache.drill.BaseTestQuery;
+import org.apache.drill.TestBuilder;
+import org.apache.drill.exec.compile.ClassTransformer;
+import org.apache.drill.exec.compile.ClassTransformer.ScalarReplacementOption;
+import org.apache.drill.exec.expr.fn.impl.DateUtility;
+import org.apache.drill.exec.proto.UserBitShared.QueryType;
+import org.apache.drill.exec.record.RecordBatchLoader;
+import org.apache.drill.exec.rpc.RpcException;
+import org.apache.drill.exec.rpc.user.QueryDataBatch;
+import org.apache.drill.exec.rpc.user.UserServer;
+import org.apache.drill.exec.server.Drillbit;
+import org.apache.drill.exec.server.DrillbitContext;
+import org.apache.drill.exec.server.options.OptionManager;
+import org.apache.drill.exec.server.options.OptionValue;
+import org.apache.drill.exec.server.options.OptionValue.OptionType;
+import org.apache.drill.exec.util.ByteBufUtil.HadoopWritables;
+import org.apache.drill.exec.util.VectorUtil;
+import org.apache.drill.exec.vector.ValueVector;
+import org.apache.drill.exec.vector.VarCharVector;
+import org.joda.time.DateTime;
+import org.junit.BeforeClass;
+import org.junit.Ignore;
+import org.junit.Test;
+
+import com.google.common.base.Charsets;
+import com.google.common.io.Resources;
+
+public class TestOrderedBytesConvertFunctions extends BaseTestQuery {
+
+  private static final String CONVERSION_TEST_PHYSICAL_PLAN = "functions/conv/conversionTestWithPhysicalPlan.json";
+  private static final float DELTA = (float) 0.0001;
+
+  String textFileContent;
+
+  @Test
+  public void testOrderedBytesDouble() throws Throwable {
+    verifyPhysicalPlan("convert_to(4.9e-324, 'DOUBLE_OB')", new byte[] {0x31, (byte)0x80, 0, 0, 0, 0, 0, 0, 0x01});
+  }
+
+  @Test
+  public void testOrderedBytesDoubleConvertFrom() throws Throwable {
+    verifyPhysicalPlan("convert_from(binary_string('\\x31\\x80\\x00\\x00\\x00\\x00\\x00\\x00\\x01'), 'DOUBLE_OB')", new Double(4.9e-324));
+  }
+
+  protected <T> void verifyPhysicalPlan(String expression, T expectedResults) throws Throwable {
+    expression = expression.replace("\\", "\\\\\\\\"); // "\\\\\\\\" => Java => "\\\\" => JsonParser => "\\" => AntlrParser "\"
+
+    if (textFileContent == null) {
+      textFileContent = Resources.toString(Resources.getResource(CONVERSION_TEST_PHYSICAL_PLAN), Charsets.UTF_8);
+    }
+    String planString = textFileContent.replace("__CONVERT_EXPRESSION__", expression);
+
+    verifyResults(expression, expectedResults, getRunResult(QueryType.PHYSICAL, planString));
+  }
+
+  protected Object[] getRunResult(QueryType queryType, String planString) throws Exception {
+    List<QueryDataBatch> resultList = testRunAndReturn(queryType, planString);
+
+    List<Object> res = new ArrayList<Object>();
+    RecordBatchLoader loader = new RecordBatchLoader(getAllocator());
+    for(QueryDataBatch result : resultList) {
+      if (result.getData() != null) {
+        loader.load(result.getHeader().getDef(), result.getData());
+        ValueVector v = loader.iterator().next().getValueVector();
+        for (int j = 0; j < v.getAccessor().getValueCount(); j++) {
+          if  (v instanceof VarCharVector) {
+            res.add(new String(((VarCharVector) v).getAccessor().get(j)));
+          } else {
+            res.add(v.getAccessor().getObject(j));
+          }
+        }
+        loader.clear();
+        result.release();
+      }
+    }
+
+    return res.toArray();
+  }
+
+  protected <T> void verifyResults(String expression, T expectedResults, Object[] actualResults) throws Throwable {
+    String testName = String.format("Expression: %s.", expression);
+    assertEquals(testName, 1, actualResults.length);
+    assertNotNull(testName, actualResults[0]);
+    if (expectedResults.getClass().isArray()) {
+      assertArraysEquals(testName, expectedResults, actualResults[0]);
+    } else {
+      assertEquals(testName, expectedResults, actualResults[0]);
+    }
+  }
+
+  protected void assertArraysEquals(Object expected, Object actual) {
+    assertArraysEquals(null, expected, actual);
+  }
+
+  protected void assertArraysEquals(String message, Object expected, Object actual) {
+    if (expected instanceof byte[] && actual instanceof byte[]) {
+      assertArrayEquals(message, (byte[]) expected, (byte[]) actual);
+    } else if (expected instanceof Object[] && actual instanceof Object[]) {
+      assertArrayEquals(message, (Object[]) expected, (Object[]) actual);
+    } else if (expected instanceof char[] && actual instanceof char[]) {
+      assertArrayEquals(message, (char[]) expected, (char[]) actual);
+    } else if (expected instanceof short[] && actual instanceof short[]) {
+      assertArrayEquals(message, (short[]) expected, (short[]) actual);
+    } else if (expected instanceof int[] && actual instanceof int[]) {
+      assertArrayEquals(message, (int[]) expected, (int[]) actual);
+    } else if (expected instanceof long[] && actual instanceof long[]) {
+      assertArrayEquals(message, (long[]) expected, (long[]) actual);
+    } else if (expected instanceof float[] && actual instanceof float[]) {
+      assertArrayEquals(message, (float[]) expected, (float[]) actual, DELTA);
+    } else if (expected instanceof double[] && actual instanceof double[]) {
+      assertArrayEquals(message, (double[]) expected, (double[]) actual, DELTA);
+    } else {
+      fail(String.format("%s: Error comparing arrays of type '%s' and '%s'", message,
+          expected.getClass().getName(), (actual == null ? "null" : actual.getClass().getName())));
+    }
+  }
+}

http://git-wip-us.apache.org/repos/asf/drill/blob/95623912/contrib/storage-hbase/src/test/java/org/apache/drill/hbase/TestTableGenerator.java
----------------------------------------------------------------------
diff --git a/contrib/storage-hbase/src/test/java/org/apache/drill/hbase/TestTableGenerator.java b/contrib/storage-hbase/src/test/java/org/apache/drill/hbase/TestTableGenerator.java
index 07ae697..e738bba 100644
--- a/contrib/storage-hbase/src/test/java/org/apache/drill/hbase/TestTableGenerator.java
+++ b/contrib/storage-hbase/src/test/java/org/apache/drill/hbase/TestTableGenerator.java
@@ -331,4 +331,277 @@ public class TestTableGenerator {
     table.flushCommits();
     table.close();
   }
+
+  public static void generateHBaseDatasetDoubleOB(HBaseAdmin admin, String tableName, int numberRegions) throws Exception {
+    if (admin.tableExists(tableName)) {
+      admin.disableTable(tableName);
+      admin.deleteTable(tableName);
+    }
+
+    HTableDescriptor desc = new HTableDescriptor(tableName);
+    desc.addFamily(new HColumnDescriptor(FAMILY_F));
+
+    if (numberRegions > 1) {
+      admin.createTable(desc, Arrays.copyOfRange(SPLIT_KEYS, 0, numberRegions-1));
+    } else {
+      admin.createTable(desc);
+    }
+
+    HTable table = new HTable(admin.getConfiguration(), tableName);
+
+    for (double i = 0.5; i <= 100.00; i += 0.75) {
+      byte[] bytes = new byte[9];
+      org.apache.hadoop.hbase.util.PositionedByteRange br =
+              new org.apache.hadoop.hbase.util.SimplePositionedByteRange(bytes, 0, 9);
+      org.apache.hadoop.hbase.util.OrderedBytes.encodeFloat64(br, i,
+              org.apache.hadoop.hbase.util.Order.ASCENDING);
+      Put p = new Put(bytes);
+      p.add(FAMILY_F, COLUMN_C, String.format("value %03f", i).getBytes());
+      table.put(p);
+    }
+
+    table.flushCommits();
+    table.close();
+
+    admin.flush(tableName);
+  }
+
+  public static void generateHBaseDatasetFloatOB(HBaseAdmin admin, String tableName, int numberRegions) throws Exception {
+    if (admin.tableExists(tableName)) {
+      admin.disableTable(tableName);
+      admin.deleteTable(tableName);
+    }
+
+    HTableDescriptor desc = new HTableDescriptor(tableName);
+    desc.addFamily(new HColumnDescriptor(FAMILY_F));
+
+    if (numberRegions > 1) {
+      admin.createTable(desc, Arrays.copyOfRange(SPLIT_KEYS, 0, numberRegions-1));
+    } else {
+      admin.createTable(desc);
+    }
+
+    HTable table = new HTable(admin.getConfiguration(), tableName);
+
+    for (float i = (float)0.5; i <= 100.00; i += 0.75) {
+      byte[] bytes = new byte[5];
+      org.apache.hadoop.hbase.util.PositionedByteRange br =
+              new org.apache.hadoop.hbase.util.SimplePositionedByteRange(bytes, 0, 5);
+      org.apache.hadoop.hbase.util.OrderedBytes.encodeFloat32(br, i,
+              org.apache.hadoop.hbase.util.Order.ASCENDING);
+      Put p = new Put(bytes);
+      p.add(FAMILY_F, COLUMN_C, String.format("value %03f", i).getBytes());
+      table.put(p);
+    }
+
+    table.flushCommits();
+    table.close();
+
+    admin.flush(tableName);
+  }
+
+  public static void generateHBaseDatasetBigIntOB(HBaseAdmin admin, String tableName, int numberRegions) throws Exception {
+    if (admin.tableExists(tableName)) {
+      admin.disableTable(tableName);
+      admin.deleteTable(tableName);
+    }
+
+    HTableDescriptor desc = new HTableDescriptor(tableName);
+    desc.addFamily(new HColumnDescriptor(FAMILY_F));
+
+    if (numberRegions > 1) {
+      admin.createTable(desc, Arrays.copyOfRange(SPLIT_KEYS, 0, numberRegions-1));
+    } else {
+      admin.createTable(desc);
+    }
+
+    HTable table = new HTable(admin.getConfiguration(), tableName);
+    long startTime = (long)1438034423 * 1000;
+    for (long i = startTime; i <= startTime + 100; i ++) {
+      byte[] bytes = new byte[9];
+      org.apache.hadoop.hbase.util.PositionedByteRange br =
+              new org.apache.hadoop.hbase.util.SimplePositionedByteRange(bytes, 0, 9);
+      org.apache.hadoop.hbase.util.OrderedBytes.encodeInt64(br, i,
+              org.apache.hadoop.hbase.util.Order.ASCENDING);
+      Put p = new Put(bytes);
+      p.add(FAMILY_F, COLUMN_C, String.format("value %d", i).getBytes());
+      table.put(p);
+    }
+
+    table.flushCommits();
+    table.close();
+
+    admin.flush(tableName);
+  }
+
+  public static void generateHBaseDatasetIntOB(HBaseAdmin admin, String tableName, int numberRegions) throws Exception {
+    if (admin.tableExists(tableName)) {
+      admin.disableTable(tableName);
+      admin.deleteTable(tableName);
+    }
+
+    HTableDescriptor desc = new HTableDescriptor(tableName);
+    desc.addFamily(new HColumnDescriptor(FAMILY_F));
+
+    if (numberRegions > 1) {
+      admin.createTable(desc, Arrays.copyOfRange(SPLIT_KEYS, 0, numberRegions-1));
+    } else {
+      admin.createTable(desc);
+    }
+
+    HTable table = new HTable(admin.getConfiguration(), tableName);
+
+    for (int i = -49; i <= 100; i ++) {
+      byte[] bytes = new byte[5];
+      org.apache.hadoop.hbase.util.PositionedByteRange br =
+              new org.apache.hadoop.hbase.util.SimplePositionedByteRange(bytes, 0, 5);
+      org.apache.hadoop.hbase.util.OrderedBytes.encodeInt32(br, i,
+              org.apache.hadoop.hbase.util.Order.ASCENDING);
+      Put p = new Put(bytes);
+      p.add(FAMILY_F, COLUMN_C, String.format("value %d", i).getBytes());
+      table.put(p);
+    }
+
+    table.flushCommits();
+    table.close();
+
+    admin.flush(tableName);
+  }
+
+  public static void generateHBaseDatasetDoubleOBDesc(HBaseAdmin admin, String tableName, int numberRegions) throws Exception {
+    if (admin.tableExists(tableName)) {
+      admin.disableTable(tableName);
+      admin.deleteTable(tableName);
+    }
+
+    HTableDescriptor desc = new HTableDescriptor(tableName);
+    desc.addFamily(new HColumnDescriptor(FAMILY_F));
+
+    if (numberRegions > 1) {
+      admin.createTable(desc, Arrays.copyOfRange(SPLIT_KEYS, 0, numberRegions-1));
+    } else {
+      admin.createTable(desc);
+    }
+
+    HTable table = new HTable(admin.getConfiguration(), tableName);
+
+    for (double i = 0.5; i <= 100.00; i += 0.75) {
+      byte[] bytes = new byte[9];
+      org.apache.hadoop.hbase.util.PositionedByteRange br =
+              new org.apache.hadoop.hbase.util.SimplePositionedByteRange(bytes, 0, 9);
+      org.apache.hadoop.hbase.util.OrderedBytes.encodeFloat64(br, i,
+              org.apache.hadoop.hbase.util.Order.DESCENDING);
+      Put p = new Put(bytes);
+      p.add(FAMILY_F, COLUMN_C, String.format("value %03f", i).getBytes());
+      table.put(p);
+    }
+
+    table.flushCommits();
+    table.close();
+
+    admin.flush(tableName);
+  }
+
+  public static void generateHBaseDatasetFloatOBDesc(HBaseAdmin admin, String tableName, int numberRegions) throws Exception {
+    if (admin.tableExists(tableName)) {
+      admin.disableTable(tableName);
+      admin.deleteTable(tableName);
+    }
+
+    HTableDescriptor desc = new HTableDescriptor(tableName);
+    desc.addFamily(new HColumnDescriptor(FAMILY_F));
+
+    if (numberRegions > 1) {
+      admin.createTable(desc, Arrays.copyOfRange(SPLIT_KEYS, 0, numberRegions-1));
+    } else {
+      admin.createTable(desc);
+    }
+
+    HTable table = new HTable(admin.getConfiguration(), tableName);
+
+    for (float i = (float)0.5; i <= 100.00; i += 0.75) {
+      byte[] bytes = new byte[5];
+      org.apache.hadoop.hbase.util.PositionedByteRange br =
+              new org.apache.hadoop.hbase.util.SimplePositionedByteRange(bytes, 0, 5);
+      org.apache.hadoop.hbase.util.OrderedBytes.encodeFloat32(br, i,
+              org.apache.hadoop.hbase.util.Order.DESCENDING);
+      Put p = new Put(bytes);
+      p.add(FAMILY_F, COLUMN_C, String.format("value %03f", i).getBytes());
+      table.put(p);
+    }
+
+    table.flushCommits();
+    table.close();
+
+    admin.flush(tableName);
+  }
+
+  public static void generateHBaseDatasetBigIntOBDesc(HBaseAdmin admin, String tableName, int numberRegions) throws Exception {
+    if (admin.tableExists(tableName)) {
+      admin.disableTable(tableName);
+      admin.deleteTable(tableName);
+    }
+
+    HTableDescriptor desc = new HTableDescriptor(tableName);
+    desc.addFamily(new HColumnDescriptor(FAMILY_F));
+
+    if (numberRegions > 1) {
+      admin.createTable(desc, Arrays.copyOfRange(SPLIT_KEYS, 0, numberRegions-1));
+    } else {
+      admin.createTable(desc);
+    }
+
+    HTable table = new HTable(admin.getConfiguration(), tableName);
+    long startTime = (long)1438034423 * 1000;
+    for (long i = startTime; i <= startTime + 100; i ++) {
+      byte[] bytes = new byte[9];
+      org.apache.hadoop.hbase.util.PositionedByteRange br =
+              new org.apache.hadoop.hbase.util.SimplePositionedByteRange(bytes, 0, 9);
+      org.apache.hadoop.hbase.util.OrderedBytes.encodeInt64(br, i,
+              org.apache.hadoop.hbase.util.Order.DESCENDING);
+      Put p = new Put(bytes);
+      p.add(FAMILY_F, COLUMN_C, String.format("value %d", i).getBytes());
+      table.put(p);
+    }
+
+    table.flushCommits();
+    table.close();
+
+    admin.flush(tableName);
+  }
+
+  public static void generateHBaseDatasetIntOBDesc(HBaseAdmin admin, String tableName, int numberRegions) throws Exception {
+    if (admin.tableExists(tableName)) {
+      admin.disableTable(tableName);
+      admin.deleteTable(tableName);
+    }
+
+    HTableDescriptor desc = new HTableDescriptor(tableName);
+    desc.addFamily(new HColumnDescriptor(FAMILY_F));
+
+    if (numberRegions > 1) {
+      admin.createTable(desc, Arrays.copyOfRange(SPLIT_KEYS, 0, numberRegions-1));
+    } else {
+      admin.createTable(desc);
+    }
+
+    HTable table = new HTable(admin.getConfiguration(), tableName);
+
+    for (int i = -49; i <= 100; i ++) {
+      byte[] bytes = new byte[5];
+      org.apache.hadoop.hbase.util.PositionedByteRange br =
+              new org.apache.hadoop.hbase.util.SimplePositionedByteRange(bytes, 0, 5);
+      org.apache.hadoop.hbase.util.OrderedBytes.encodeInt32(br, i,
+              org.apache.hadoop.hbase.util.Order.DESCENDING);
+      Put p = new Put(bytes);
+      p.add(FAMILY_F, COLUMN_C, String.format("value %d", i).getBytes());
+      table.put(p);
+    }
+
+    table.flushCommits();
+    table.close();
+
+    admin.flush(tableName);
+  }
 }

