drill-commits mailing list archives

From bridg...@apache.org
Subject [1/3] drill git commit: DRILL-2586
Date Tue, 31 Mar 2015 01:14:50 GMT
Repository: drill
Updated Branches:
  refs/heads/gh-pages f53226a3e -> 85e056237


http://git-wip-us.apache.org/repos/asf/drill/blob/85e05623/_docs/interfaces/odbc-linux/004-odbc-driver-conf.md
----------------------------------------------------------------------
diff --git a/_docs/interfaces/odbc-linux/004-odbc-driver-conf.md b/_docs/interfaces/odbc-linux/004-odbc-driver-conf.md
index 26df71e..2717910 100644
--- a/_docs/interfaces/odbc-linux/004-odbc-driver-conf.md
+++ b/_docs/interfaces/odbc-linux/004-odbc-driver-conf.md
@@ -2,8 +2,6 @@
 title: "Driver Configuration Options"
 parent: "Using the MapR ODBC Driver on Linux and Mac OS X"
 ---
-[Previous](/docs/configuring-odbc-connections-for-linux-and-mac-os-x)<code>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;</code>[Back to Table of Contents](/docs)<code>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;</code>[Next](/docs/configuring-odbc-connections-for-linux-and-mac-os-x)
-
 You can use various configuration options to control the behavior of the MapR
 Drill ODBC Driver. You can use these options in a connection string or in the
 `odbc.ini` configuration file for the Mac OS X version of the driver.

http://git-wip-us.apache.org/repos/asf/drill/blob/85e05623/_docs/interfaces/odbc-linux/005-odbc-connect-str.md
----------------------------------------------------------------------
diff --git a/_docs/interfaces/odbc-linux/005-odbc-connect-str.md b/_docs/interfaces/odbc-linux/005-odbc-connect-str.md
index b8810fa..70f3858 100644
--- a/_docs/interfaces/odbc-linux/005-odbc-connect-str.md
+++ b/_docs/interfaces/odbc-linux/005-odbc-connect-str.md
@@ -2,8 +2,6 @@
 title: "Configuring ODBC Connections for Linux and Mac OS X"
 parent: "Using the MapR ODBC Driver on Linux and Mac OS X"
 ---
-[Previous](/docs/driver-configuration-options)<code>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;</code>[Back to Table of Contents](/docs)<code>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;</code>[Next](/docs/advanced-properties)
-
 You can use a connection string to connect to your data source. For a list of
 all the properties that you can use in connection strings, see [Driver
 Configuration

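As a hedged illustration of the connection-string text above, a direct-to-Drillbit connection string might look like the following. The property names follow MapR Drill ODBC driver conventions, but the host, port, and authentication values are placeholders, and exact property names can vary by driver version:

```
DRIVER=MapR Drill ODBC Driver;ConnectionType=Direct;HOST=localhost;PORT=31010;AuthenticationType=No Authentication
```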
http://git-wip-us.apache.org/repos/asf/drill/blob/85e05623/_docs/interfaces/odbc-linux/006-odbc-adv-prop.md
----------------------------------------------------------------------
diff --git a/_docs/interfaces/odbc-linux/006-odbc-adv-prop.md b/_docs/interfaces/odbc-linux/006-odbc-adv-prop.md
index 6d58eb8..1528746 100644
--- a/_docs/interfaces/odbc-linux/006-odbc-adv-prop.md
+++ b/_docs/interfaces/odbc-linux/006-odbc-adv-prop.md
@@ -2,8 +2,6 @@
 title: "Advanced Properties"
 parent: "Using the MapR ODBC Driver on Linux and Mac OS X"
 ---
-[Previous](/docs/configuring-odbc-connections-for-linux-and-mac-os-x)<code>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;</code>[Back to Table of Contents](/docs)<code>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;</code>[Next](/docs/testing-the-odbc-connection-on-linux-and-mac-os-x)
-
 When you use advanced properties, you must separate them using a semi-colon
 (;).
 
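To illustrate the semicolon-separated format described above, a hypothetical advanced-properties value might look like the following. The property names shown reflect typical MapR Drill ODBC defaults and may differ by driver version:

```
HandshakeTimeout=5;QueryTimeout=180;TimestampTZDisplayTimezone=utc;ExcludedSchemas=sys,INFORMATION_SCHEMA
```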

http://git-wip-us.apache.org/repos/asf/drill/blob/85e05623/_docs/interfaces/odbc-linux/007-odbc-connections-test.md
----------------------------------------------------------------------
diff --git a/_docs/interfaces/odbc-linux/007-odbc-connections-test.md b/_docs/interfaces/odbc-linux/007-odbc-connections-test.md
index e615e8b..bc7749a 100644
--- a/_docs/interfaces/odbc-linux/007-odbc-connections-test.md
+++ b/_docs/interfaces/odbc-linux/007-odbc-connections-test.md
@@ -2,8 +2,6 @@
 title: "Testing the ODBC Connection on Linux and Mac OS X"
 parent: "Using the MapR ODBC Driver on Linux and Mac OS X"
 ---
-[Previous](/docs/advanced-properties)<code>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;</code>[Back to Table of Contents](/docs)<code>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;</code>[Next](/docs/using-the-mapr-odbc-driver-on-windows)
-
 To test the connection, you can use an ODBC-enabled client application. For a
 basic connection test, you can also use the test utilities that are packaged
 with your driver manager installation.

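As a sketch of the basic test described above, with iODBC as the driver manager you might run its bundled test utility against a configured DSN. The DSN name and credentials below are placeholders:

```
iodbctest "DSN=MapR Drill;UID=someuser;PWD=somepassword"
```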
http://git-wip-us.apache.org/repos/asf/drill/blob/85e05623/_docs/interfaces/odbc-win/001-install-odbc-win.md
----------------------------------------------------------------------
diff --git a/_docs/interfaces/odbc-win/001-install-odbc-win.md b/_docs/interfaces/odbc-win/001-install-odbc-win.md
index 6b7d7a4..7ff770d 100644
--- a/_docs/interfaces/odbc-win/001-install-odbc-win.md
+++ b/_docs/interfaces/odbc-win/001-install-odbc-win.md
@@ -2,8 +2,6 @@
 title: "Step 1. Install the MapR Drill ODBC Driver on Windows"
 parent: "Using the MapR ODBC Driver on Windows"
 ---
-[Previous](/docs/using-the-mapr-odbc-driver-on-windows)<code>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;</code>[Back to Table of Contents](/docs)<code>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;</code>[Next](/docs/step-2-configure-odbc-connections-to-drill-data-sources)
-
 The MapR Drill ODBC Driver installer is available for 32-bit and 64-bit
 applications. Both versions of the driver can be installed on a 64-bit
 machine.

http://git-wip-us.apache.org/repos/asf/drill/blob/85e05623/_docs/interfaces/odbc-win/002-conf-odbc-win.md
----------------------------------------------------------------------
diff --git a/_docs/interfaces/odbc-win/002-conf-odbc-win.md b/_docs/interfaces/odbc-win/002-conf-odbc-win.md
index a81b79c..5fe40b2 100644
--- a/_docs/interfaces/odbc-win/002-conf-odbc-win.md
+++ b/_docs/interfaces/odbc-win/002-conf-odbc-win.md
@@ -2,8 +2,6 @@
 title: "Step 2. Configure ODBC Connections to Drill Data Sources"
 parent: "Using the MapR ODBC Driver on Windows"
 ---
-[Previous](/docs/step-1-install-the-mapr-drill-odbc-driver-on-windows)<code>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;</code>[Back to Table of Contents](/docs)<code>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;</code>[Next](/docs/step-3-connect-to-drill-data-sources-from-a-bi-tool)
-
 Complete one of the following steps to create an ODBC connection to Drill data
 sources:
 

http://git-wip-us.apache.org/repos/asf/drill/blob/85e05623/_docs/interfaces/odbc-win/003-connect-odbc-win.md
----------------------------------------------------------------------
diff --git a/_docs/interfaces/odbc-win/003-connect-odbc-win.md b/_docs/interfaces/odbc-win/003-connect-odbc-win.md
index 4325b29..3a887c3 100644
--- a/_docs/interfaces/odbc-win/003-connect-odbc-win.md
+++ b/_docs/interfaces/odbc-win/003-connect-odbc-win.md
@@ -2,8 +2,6 @@
 title: "Step 3. Connect to Drill Data Sources from a BI Tool"
 parent: "Using the MapR ODBC Driver on Windows"
 ---
-[Previous](/docs/step-2-configure-odbc-connections-to-drill-data-sources)<code>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;</code>[Back to Table of Contents](/docs)<code>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;</code>[Next](/docs/tableau-examples)
-
 After you create the ODBC DSN, you can use ODBC to directly connect to data
 that is defined by a schema, such as Hive, and data that is self-describing.
 Examples of self-describing data include HBase, Parquet, JSON, CSV, and TSV.

http://git-wip-us.apache.org/repos/asf/drill/blob/85e05623/_docs/interfaces/odbc-win/004-tableau-examples.md
----------------------------------------------------------------------
diff --git a/_docs/interfaces/odbc-win/004-tableau-examples.md b/_docs/interfaces/odbc-win/004-tableau-examples.md
index 87eee83..e543d63 100644
--- a/_docs/interfaces/odbc-win/004-tableau-examples.md
+++ b/_docs/interfaces/odbc-win/004-tableau-examples.md
@@ -2,8 +2,6 @@
 title: "Tableau Examples"
 parent: "Using the MapR ODBC Driver on Windows"
 ---
-[Previous](/docs/step-3-connect-to-drill-data-sources-from-a-bi-tool)<code>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;</code>[Back to Table of Contents](/docs)<code>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;</code>[Next](/docs/using-drill-explorer-to-browse-data-and-create-views)
-
 You can generate reports in Tableau using ODBC connections to Drill data
 sources. Each example in this section takes you through the steps to create a
 DSN to a Drill data source and then access the data in Tableau 8.1.

http://git-wip-us.apache.org/repos/asf/drill/blob/85e05623/_docs/interfaces/odbc-win/005-browse-view.md
----------------------------------------------------------------------
diff --git a/_docs/interfaces/odbc-win/005-browse-view.md b/_docs/interfaces/odbc-win/005-browse-view.md
index a7e6f9c..98bb511 100644
--- a/_docs/interfaces/odbc-win/005-browse-view.md
+++ b/_docs/interfaces/odbc-win/005-browse-view.md
@@ -2,8 +2,6 @@
 title: "Using Drill Explorer to Browse Data and Create Views"
 parent: "Using the MapR ODBC Driver on Windows"
 ---
-[Previous](/docs/tableau-examples)<code>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;</code>[Back to Table of Contents](/docs)<code>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;</code>[Next](/docs/query-data)
-
 Drill Explorer is a simple user interface that is embedded within the ODBC
 DSN. Drill Explorer enables users to understand the metadata and data before
 visualizing the data in a BI tool. Use Drill Explorer to browse Drill data

http://git-wip-us.apache.org/repos/asf/drill/blob/85e05623/_docs/query/001-get-started.md
----------------------------------------------------------------------
diff --git a/_docs/query/001-get-started.md b/_docs/query/001-get-started.md
index f60b50e..8fce9d9 100644
--- a/_docs/query/001-get-started.md
+++ b/_docs/query/001-get-started.md
@@ -2,8 +2,6 @@
 title: "Getting Started Tutorial"
 parent: "Query Data"
 ---
-[Previous](/docs/query-data)<code>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;</code>[Back to Table of Contents](/docs)<code>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;</code>[Next](/docs/lesson-1-connect-to-data-sources)
-
 ## Goal
 
 This tutorial covers how to query a file and a directory on your local file

http://git-wip-us.apache.org/repos/asf/drill/blob/85e05623/_docs/sql-ref/001-data-types.md
----------------------------------------------------------------------
diff --git a/_docs/sql-ref/001-data-types.md b/_docs/sql-ref/001-data-types.md
index 7e370e4..cf353c7 100644
--- a/_docs/sql-ref/001-data-types.md
+++ b/_docs/sql-ref/001-data-types.md
@@ -9,11 +9,11 @@ After Drill reads schema-less data into SQL tables, you need to cast data types
 Differences in casting depend on the data source. The following list describes how Drill treats data types from various data sources:
 
 * HBase  
-  No implicit casting to SQL types. Convert data to appropriate types as shown in ["Querying HBase."](/docs/querying-hbase/)
+  Does not implicitly cast input to SQL types. Convert data to appropriate types as shown in ["Querying HBase."](/docs/querying-hbase/)
 * Hive  
   Implicitly casts Hive types to SQL types as shown in the Hive [type mapping example](/docs/hive-to-drill-data-type-mapping#type-mapping-example)
 * JSON  
-  Implicitly casts JSON data to its [corresponding SQL types](/docs/json-data-model#data-type-mapping) or to VARCHAR if Drillis in all text mode. 
+  Implicitly casts JSON data to its [corresponding SQL types](/docs/json-data-model#data-type-mapping) or to VARCHAR if Drill is in all text mode. 
 * MapR-DB  
   Implicitly casts MapR-DB data to SQL types when you use [the maprdb format](/docs/mapr-db-format) for reading MapR-DB data. The dfs storage plugin defines the format when you install Drill from the mapr-drill package on a MapR node.
 * Parquet  
@@ -24,11 +24,13 @@ Differences in casting depend on the data source. The following list describes h
 ## Implicit Casting
 
 
-In general, Drill implicitly casts (promotes) one type to another type based on the order of precedence shown in the following table. Drill also considers the performance cost of implicit casting to one type versus another. Drill usually implicitly casts a type from a lower precedence to a type having higher precedence. For instance, NULL can be promoted to any other type; SMALLINT can be promoted into INT. INT is not promoted to SMALLINT due to possible precision loss.
+Generally, Drill performs implicit casting based on the order of precedence shown in the implicit casting preference table. Drill usually implicitly casts a type from a lower precedence to a type having higher precedence. For instance, NULL can be promoted to any other type; SMALLINT can be promoted into INT. INT is not promoted to SMALLINT due to possible precision loss. Drill might deviate from these precedence rules for performance reasons.
 
-Under certain circumstances, such as queries involving  substr and concat functions, Drill reverses the order of precedence and allows a cast to VARCHAR from a type of higher precedence than VARCHAR, such as BIGINT. 
+Under certain circumstances, such as queries involving substr and concat functions, Drill reverses the order of precedence and allows a cast to VARCHAR from a type of higher precedence than VARCHAR, such as BIGINT. 
 
-The following table lists data types top to bottom, in descending precedence. Drill implicitly casts to more data types than are currently supported for explicit casting.
+The following table lists data types top to bottom, in descending order of precedence. Drill implicitly casts to more data types than are currently supported for explicit casting.
+
+### Implicit Casting Precedence
 
 <table>
   <tr>
@@ -41,102 +43,152 @@ The following table lists data types top to bottom, in descending precedence. Dr
     <td>1</td>
     <td>INTERVAL</td>
     <td>13</td>
-    <td>BIGINT</td>
+    <td>UINT4</td>
   </tr>
   <tr>
     <td>2</td>
     <td>INTERVALYEAR</td>
     <td>14</td>
-    <td>UINT4</td>
+    <td>INT</td>
   </tr>
   <tr>
     <td>3</td>
    <td>INTERVALDAY</td>
     <td>15</td>
-    <td>INT</td>
+    <td>UINT2</td>
   </tr>
   <tr>
     <td>4</td>
     <td>TIMESTAMPTZ</td>
     <td>16</td>
-    <td>UINT2</td>
+    <td>SMALLINT</td>
   </tr>
   <tr>
     <td>5</td>
     <td>TIMETZ</td>
     <td>17</td>
-    <td>SMALLINT</td>
+    <td>UINT1</td>
   </tr>
   <tr>
     <td>6</td>
     <td>TIMESTAMP</td>
     <td>18</td>
-    <td>UINT1</td>
+    <td>VAR16CHAR</td>
   </tr>
   <tr>
     <td>7</td>
     <td>DATE</td>
     <td>19</td>
-    <td>VAR16CHAR</td>
+    <td>FIXED16CHAR</td>
   </tr>
   <tr>
     <td>8</td>
     <td>TIME</td>
     <td>20</td>
-    <td>FIXED16CHAR</td>
+    <td>VARCHAR</td>
   </tr>
   <tr>
     <td>9</td>
-    <td>FLOAT8</td>
+    <td>DOUBLE</td>
     <td>21</td>
-    <td>VARCHAR</td>
+    <td>CHAR</td>
   </tr>
   <tr>
     <td>10</td>
     <td>DECIMAL</td>
     <td>22</td>
-    <td>FIXEDCHAR</td>
+    <td>VARBINARY*</td>
   </tr>
   <tr>
     <td>11</td>
-    <td>MONEY</td>
+    <td>UINT8</td>
     <td>23</td>
-    <td>VARBINARY</td>
+    <td>FIXEDBINARY*</td>
   </tr>
   <tr>
     <td>12</td>
-    <td>UINT8</td>
+    <td>BIGINT</td>
     <td>24</td>
-    <td>FIXEDBINARY</td>
-  </tr>
-  <tr>
-    <td></td>
-    <td></td>
-    <td>25</td>
     <td>NULL</td>
   </tr>
 </table>
 
+\* The Drill Parquet reader supports these types.
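The precedence rules above can be seen in a simple query. In this hedged sketch, which reuses the dummy-file convention used elsewhere in these docs (the file path is a placeholder), the integer literal is implicitly promoted to the higher-precedence floating-point type before the addition:

```sql
SELECT 2 + 3.14159 FROM dfs.`/tmp/dummy.json`;
```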
+
 ## Explicit Casting
 
-Drill supports a number of functions to cast and convert compatible data types:
+In a textual file, such as CSV, Drill interprets every field as a VARCHAR, as previously mentioned. To handle textual data, you can use the following functions to cast and convert compatible data types:
 
-* CAST  
+* [CAST](/docs/data-type-fmt#cast)  
   Casts data from one data type to another.
-* CONVERT_TO and CONVERT_FROM  
+* [CONVERT_TO and CONVERT_FROM](/docs/data-type-fmt#convert-to-and-convert-from)  
   Converts data, including binary data, from one data type to another.
-* TO_CHAR
+* [TO_CHAR]()  
   Converts a TIMESTAMP, INTERVAL, INTEGER, DOUBLE, or DECIMAL to a string.
-* TO_DATE
+* [TO_DATE]()  
   Converts a string to DATE.
-* TO_NUMBER
+* [TO_NUMBER]()  
   Converts a string to a DECIMAL.
-* TO_TIMESTAMP
+* [TO_TIMESTAMP]()  
   Converts a string to TIMESTAMP.
 
+If the SELECT statement includes a WHERE clause that compares a column of an unknown data type, cast both the value of the column and the comparison value in the WHERE clause.
+
+## Supported Data Types for Casting
+You can use the following data types in queries that cast or convert data types:
+
+* BIGINT  
+  8-byte signed integer. The range is -9,223,372,036,854,775,808 to 9,223,372,036,854,775,807.
+
+* BOOLEAN  
+  True or false  
+
+* DATE  
+  Years, months, and days in YYYY-MM-DD format
+
+* DECIMAL(p,s), DEC(p,s), or NUMERIC(p,s)  
+  38-digit precision number, precision is p, and scale is s. Example: DECIMAL(6,2) has 4 digits before the decimal point and 2 digits after the decimal point. 
+
+* FLOAT  
+  4-byte single precision floating point number
+
+* DOUBLE, DOUBLE PRECISION  
+  8-byte double precision floating point number. 
+
+* INTEGER or INT  
+  4-byte signed integer. The range is -2,147,483,648 to 2,147,483,647.
+
+* INTERVAL  
+  Integer fields representing a period of time in years, months, days, hours, minutes, seconds, and optional milliseconds, using ISO 8601 format.
+
+* INTERVALDAY  
+  A simple version of the interval type expressing a period of time in days, hours, minutes, and seconds only.
+
+* INTERVALYEAR  
+  A simple version of interval representing a period of time in years and months only.
+
+* SMALLINT  
+  2-byte signed integer. The range is -32,768 to 32,767. Supported in Drill 0.9 and later. See DRILL-2135.
+
+* TIME  
+  Hours, minutes, seconds in the form HH:mm:ss, 24-hour based
+
+* TIMESTAMP  
+  JDBC timestamp in year, month, date, hour, minute, second, and optional milliseconds: yyyy-MM-dd HH:mm:ss.SSS
+
+* CHARACTER VARYING, CHARACTER, CHAR, or VARCHAR  
+  Character string optionally declared with a length that indicates the maximum number of characters to use. For example, CHAR(30) casts data to a 30-character string maximum. The default limit is 1 character. The maximum character limit is 255.
+
+You specify a DECIMAL using a precision and scale. The precision (p) is the total number of digits required to represent the number. The scale (s) is the number of decimal digits to the right of the decimal point. Subtract s from p to determine the maximum number of digits to the left of the decimal point. Scale is a value from 0 through p. Scale is specified only if precision is specified. The default scale is 0.
+
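To make the precision and scale arithmetic concrete, consider this hedged example, which reuses the dummy-file convention (the path is a placeholder). DECIMAL(9,2) allows up to 9 - 2 = 7 digits to the left of the decimal point and 2 digits to the right:

```sql
SELECT CAST('1234567.89' AS DECIMAL(9,2)) FROM dfs.`/tmp/dummy.json`;
```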
+For more information about and examples of casting, see [CAST]().
+
+### Explicit Type Casting Maps
+
 The following tables show data types that Drill can cast to/from other data types. Not all types are available for explicit casting in the current release.
 
-### Explicit Type Casting: Numeric and Character types
+#### Numerical and Character Data Types
 
 <table>
   <tr>
@@ -150,17 +202,15 @@ The following tables show data types that Drill can cast to/from other data type
     <th></th>
     <th></th>
     <th></th>
-    <th></th>
   </tr>
   <tr>
     <td>From:</td>
     <td>SMALLINT</td>
     <td>INT</td>
-    <td>BIGINT/UINT</td>
+    <td>BIGINT</td>
     <td>DECIMAL</td>
-    <td>FLOAT4</td>
-    <td>FLOAT8</td>
-    <td>FIXEDCHAR</td>
+    <td>FLOAT</td>
+    <td>CHAR</td>
     <td>FIXEDBINARY</td>
     <td>VARCHAR</td>
     <td>VARBINARY</td>
@@ -176,7 +226,6 @@ The following tables show data types that Drill can cast to/from other data type
     <td>yes</td>
     <td>yes</td>
     <td>yes</td>
-    <td>yes</td>
   </tr>
   <tr>
     <td>INT</td>
@@ -189,11 +238,9 @@ The following tables show data types that Drill can cast to/from other data type
     <td>yes</td>
     <td>yes</td>
     <td>yes</td>
-    <td>yes</td>
   </tr>
   <tr>
-    <td>BIGINT/UINT</td>
-    <td>yes</td>
+    <td>BIGINT</td>
     <td>yes</td>
     <td>yes</td>
     <td>yes</td>
@@ -215,37 +262,33 @@ The following tables show data types that Drill can cast to/from other data type
     <td>yes</td>
     <td>yes</td>
     <td>yes</td>
-    <td>yes</td>
   </tr>
   <tr>
-    <td>FLOAT8</td>
+    <td>DOUBLE</td>
     <td>yes</td>
     <td>yes</td>
     <td>yes</td>
     <td>yes</td>
     <td>yes</td>
-    <td>no</td>
     <td>yes</td>
     <td>no</td>
     <td>yes</td>
     <td>no</td>
   </tr>
   <tr>
-    <td>FLOAT4</td>
+    <td>FLOAT</td>
     <td>yes</td>
     <td>yes</td>
     <td>yes</td>
     <td>yes</td>
     <td>no</td>
-    <td>no</td>
     <td>yes</td>
     <td>no</td>
     <td>yes</td>
     <td>no</td>
   </tr>
   <tr>
-    <td>FIXEDCHAR</td>
-    <td>yes</td>
+    <td>CHAR</td>
     <td>yes</td>
     <td>yes</td>
     <td>yes</td>
@@ -257,8 +300,7 @@ The following tables show data types that Drill can cast to/from other data type
     <td>yes</td>
   </tr>
   <tr>
-    <td>FIXEDBINARY</td>
-    <td>yes</td>
+    <td>FIXEDBINARY*</td>
     <td>yes</td>
     <td>yes</td>
     <td>yes</td>
@@ -270,8 +312,7 @@ The following tables show data types that Drill can cast to/from other data type
     <td>yes</td>
   </tr>
   <tr>
-    <td>VARCHAR</td>
-    <td>yes</td>
+    <td>VARCHAR**</td>
     <td>yes</td>
     <td>yes</td>
     <td>yes</td>
@@ -283,8 +324,7 @@ The following tables show data types that Drill can cast to/from other data type
     <td>yes</td>
   </tr>
   <tr>
-    <td>VARBINARY</td>
-    <td>yes</td>
+    <td>VARBINARY*</td>
     <td>yes</td>
     <td>yes</td>
     <td>yes</td>
@@ -297,7 +337,11 @@ The following tables show data types that Drill can cast to/from other data type
   </tr>
 </table>
 
-### Explicit Type Casting: Date/Time types
+\* For use with CONVERT_TO/FROM to cast binary data coming to/from sources such as MapR-DB/HBase.
+
+\*\* You cannot convert a character string having a decimal point to an INT or BIGINT.
+
+#### Date and Time Data Types
 
 <table>
   <tr>
@@ -321,7 +365,7 @@ The following tables show data types that Drill can cast to/from other data type
     <td>INTERVALDAY</td>
   </tr>
   <tr>
-    <td>FIXEDCHAR</td>
+    <td>CHAR</td>
     <td>Yes</td>
     <td>Yes</td>
     <td>Yes</td>
@@ -432,44 +476,3 @@ The following tables show data types that Drill can cast to/from other data type
   </tr>
 </table>
 
-### Using CAST
-
-Embed a CAST function in a query using this syntax:
-
-    cast <expression> AS <data type> 
-
-* expression  
-  An entity that has single data value, such as a column name, of the data type you want to cast to a different type
-* data type  
-  The target data type, such as INTEGER or DATE
-
-Example: Inspect INTEGER data and cast the data to the DECIMAL type
-
-    SELECT c_row, c_int FROM mydata WHERE c_row = 9;
-
-    c_row | c_int
-    ------+------------
-        9 | -2147483648
-    (1 row)
-
-    SELECT c_row, CAST(c_int AS DECIMAL(28,8)) FROM my_data WHERE c_row = 9;
-
-    c_row | c_int
-    ------+---------------------
-    9     | -2147483648.00000000
-    (1 row)
-
-If the SELECT statement includes a WHERE clause that compares a column of an unknown data type, cast both the value of the column and the comparison value in the WHERE clause. For example:
-
-    SELECT c_row, CAST(c_int AS DECIMAL(28,8)) FROM mydata WHERE CAST(c_int AS CECIMAL(28,8)) > -3.0
-
-Do not use CAST to handle binary data conversions. Use CONVERT_TO and CONVERT_FROM for these conversions.
-
-### Using CONVERT_TO and CONVERT_FROM
-
-CONVERT_TO converts an SQL data type to complex types, including Hbase byte arrays, JSON and Parquet arrays and mapsTo query HBase data in Drill, convert every column of an HBase table to/from byte arrays from/to an [SQL data type](/docs/data-types/) that Drill supports when writing/reading data. For examples of how to use these functions, see ["Convert and Cast Functions".](/docs/sql-functions#convert-and-cast-functions)
-
-CONVERT_FROM converts from complex types, including Hbase byte arrays, JSON and Parquet arrays and maps to an SQL data type.
-
-## Handling Textual Data
-In a textual file, such as CSV, Drill interprets every field as a VARCHAR, as previously mentioned. In addition to using the CAST function, you can also use [to_char](link), [to_date](line), [to_number](link), and [to_timestamp](link). If the SELECT statement includes a WHERE clause that compares a column of an unknown data type, cast both the value of the column and the comparison value in the WHERE clause.

http://git-wip-us.apache.org/repos/asf/drill/blob/85e05623/_docs/sql-ref/data-types/001-date.md
----------------------------------------------------------------------
diff --git a/_docs/sql-ref/data-types/001-date.md b/_docs/sql-ref/data-types/001-date.md
index 87f93ba..8e3b53c 100644
--- a/_docs/sql-ref/data-types/001-date.md
+++ b/_docs/sql-ref/data-types/001-date.md
@@ -6,9 +6,9 @@ Using familiar date and time formats, listed in the [SQL data types table](/docs
 
 DATE, TIME, and TIMESTAMP store values in Coordinated Universal Time (UTC). Currently, Drill does not support casting a TIMESTAMP with time zone, but you can use the TO_TIMESTAMP function (link to example) in a query to use time stamp data having a time zone.
 
+Before running a query, you can check the formatting of your dates and times as shown in the following examples. The examples refer to a dummy JSON file in the FROM clause. The dummy JSON file has the following contents:
 
-Before running a query, you can check the formatting of your dates and times. First, create a dummy JSON file to use in the FROM clause for testing queries as shown in the following examples. 
-    {"dummy" : "data"}. 
+    {"dummy" : "data"}
 
 Next, use the following literals in a SELECT statement. 
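For example, a hedged check of DATE and TIME literal formatting against the dummy file (the path is a placeholder):

```sql
SELECT date '2015-03-31', time '12:20:30' FROM dfs.`/tmp/dummy.json`;
```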
 
@@ -99,28 +99,6 @@ You can run the dummy query described earlier to check the formatting of the fie
     +------------+
     1 row selected (0.076 seconds)
 
-To cast INTERVAL data use the following syntax:
-
-    CAST (column_name AS INTERVAL)
-    CAST (column_name AS INTERVAL DAY)
-    CAST (column_name AS INTERVAL YEAR)
-
-## Interval Example
-A JSON file contains the following objects:
-
-    { "INTERVALYEAR_col":"P1Y", "INTERVALDAY_col":"P1D", "INTERVAL_col":"P1Y1M1DT1H1M" }
-    { "INTERVALYEAR_col":"P2Y", "INTERVALDAY_col":"P2D", "INTERVAL_col":"P2Y2M2DT2H2M" }
-    { "INTERVALYEAR_col":"P3Y", "INTERVALDAY_col":"P3D", "INTERVAL_col":"P3Y3M3DT3H3M" }
-
-The following CTAS statement shows how to cast text from a JSON file to INTERVAL data types in a Parquet table:
-
-    CREATE TABLE dfs.tmp.parquet_intervals AS 
-    (SELECT cast (INTERVAL_col as interval),
-           cast( INTERVALYEAR_col as interval year) INTERVALYEAR_col, 
-           cast( INTERVALDAY_col as interval day) INTERVALDAY_col 
-    FROM `/user/root/intervals.json`);
-
-<!-- Text and include output -->
-
+For information about casting interval data, see the ["CAST"](/docs/data-type-fmt#cast) function.
 
 

http://git-wip-us.apache.org/repos/asf/drill/blob/85e05623/_docs/sql-ref/functions/001-math.md
----------------------------------------------------------------------
diff --git a/_docs/sql-ref/functions/001-math.md b/_docs/sql-ref/functions/001-math.md
index 829611d..d077016 100644
--- a/_docs/sql-ref/functions/001-math.md
+++ b/_docs/sql-ref/functions/001-math.md
@@ -53,6 +53,11 @@ Exceptions are the LSHIFT and RSHIFT functions, which take all types except the
     <td>Converts x radians to degrees.</td>
   </tr>
   <tr>
+    <td>E()</td>
+    <td>FLOAT8</td>
+    <td>Returns 2.718281828459045.</td>
+  </tr>
+  <tr>
     <td>EXP(x)</td>
     <td>FLOAT8</td>
     <td>Returns e (Euler's number) to the power of x.</td>
@@ -65,7 +70,17 @@ Exceptions are the LSHIFT and RSHIFT functions, which take all types except the
   <tr>
     <td>LOG(x)</td>
     <td>FLOAT8</td>
-    <td>Returns the log value of x.</td>
+    <td>Returns the natural log (base e) of x.</td>
+  </tr>
+  <tr>
+    <td>LOG(x, y)</td>
+    <td>FLOAT8</td>
+    <td>Returns log base x to the y power.</td>
+  </tr>
+  <tr>
+    <td>LOG10(x)</td>
+    <td>FLOAT8</td>
+    <td>Returns the common log of x.</td>
   </tr>
   <tr>
     <td>LSHIFT(x, y)</td>
@@ -73,11 +88,36 @@ Exceptions are the LSHIFT and RSHIFT functions, which take all types except the
     <td>Shifts the binary x by y times to the left.</td>
   </tr>
   <tr>
+    <td>MOD(x, y)</td>
+    <td>FLOAT8</td>
+    <td>Returns the remainder of x divided by y. Requires a cast to DECIMAL for consistent results when x and y are FLOAT or DOUBLE.</td>
+  </tr>
+  <tr>
+    <td>NEGATIVE(x)</td>
+    <td>Same as input</td>
+    <td>Returns x as a negative number.</td>
+  </tr>
+  <tr>
+    <td>PI</td>
+    <td>FLOAT8</td>
+    <td>Returns pi.</td>
+  </tr>
+  <tr>
+    <td>POW(x, y)</td>
+    <td>FLOAT8</td>
+    <td>Returns the value of x to the y power.</td>
+  </tr>
+  <tr>
     <td>RADIANS</td>
     <td>FLOAT8</td>
    <td>Converts x degrees to radians.</td>
   </tr>
   <tr>
+    <td>RAND</td>
+    <td>FLOAT8</td>
+    <td>Returns a random number between 0 and 1.</td>
+  </tr>
+  <tr>
     <td>ROUND(x)</td>
     <td>Same as input</td>
     <td>Rounds to the nearest integer.</td>
@@ -103,9 +143,9 @@ Exceptions are the LSHIFT and RSHIFT functions, which take all types except the
     <td>Returns the square root of x.</td>
   </tr>
   <tr>
     <td>TRUNC(x)</td>
     <td>Same as input</td>
-    <td>Truncates x toward zero.</td>
+    <td>Truncates x toward zero, returning the integer part.</td>
   </tr>
   <tr>
     <td>TRUNC(x, y)</td>
@@ -123,7 +163,7 @@ Examples in this section use the following files:
 
 Download the `input2.json` file from the [Drill source code](https://github.com/apache/drill/tree/master/exec/java-exec/src/test/resources/jsoninput) page. On the Mac, for example, right-click input2.json and choose Save Link As, and then click Save.
 
-Create the a dummy JSON file having the following contents:
+The following examples refer to a dummy JSON file in the FROM clause. The dummy JSON file has the following contents:
 
     {"dummy" : "data"}
 
@@ -207,7 +247,7 @@ Open input2.json and change the first float value from 17.4 to 3.14159. Get valu
 * Rounded to the nearest integer.
 * Rounded to the fourth decimal place.
 
-        SELECT ROUND(`float`) FROM dfs.`/Users/khahn/Documents/test_files_source/input2.json`;
+        SELECT ROUND(`float`) FROM dfs.`/Users/drill/input2.json`;
 
         +------------+
         |   EXPR$0   |

http://git-wip-us.apache.org/repos/asf/drill/blob/85e05623/_docs/sql-ref/functions/002-data-type-fmt.md
----------------------------------------------------------------------
diff --git a/_docs/sql-ref/functions/002-data-type-fmt.md b/_docs/sql-ref/functions/002-data-type-fmt.md
new file mode 100644
index 0000000..59f58be
--- /dev/null
+++ b/_docs/sql-ref/functions/002-data-type-fmt.md
@@ -0,0 +1,343 @@
+---
+title: "Casting/Converting Data Types"
+parent: "SQL Functions"
+---
+Drill supports the following functions for casting and converting data types:
+
+* [CAST](/docs/data-type-fmt#cast)
+* [CONVERT TO/FROM](/docs/data-type-fmt#convert-to-and-convert-from)
+* [Other data type conversion functions](/docs/data-type-fmt#other-data-type-conversion-functions)
+
+# CAST
+
+The CAST function converts an expression, such as a column value or literal, from one data type to another.
+
+## Syntax
+
+    CAST (<expression> AS <data type>)
+
+*expression*
+
+An entity that evaluates to one or more values, such as a column name or literal
+
+*data type*
+
+The target data type, such as INTEGER or DATE, to which to cast the expression
+
+## Usage Notes
+
+If the SELECT statement includes a WHERE clause that compares a column of an unknown data type, cast both the value of the column and the comparison value in the WHERE clause. For example:
+
+    SELECT c_row, CAST(c_int AS DECIMAL(28,8)) FROM mydata WHERE CAST(c_int AS DECIMAL(28,8)) > -3.0
+
+Do not use the CAST function for converting binary data types to other types. Although CAST works for converting VARBINARY to VARCHAR, CAST does not work in other cases for converting binary data. Use CONVERT_TO and CONVERT_FROM for converting to or from binary data. 
+
+Refer to the following tables for information about the data types to use for casting:
+
+* [Supported Data Types for Casting](/docs/supported-data-types-for-casting)
+* [Explicit Type Casting Maps](/docs/explicit-type-casting-maps)
+
+
+## Examples
+
+The following examples refer to a dummy JSON file in the FROM clause. The dummy JSON file has the following contents:
+
+    {"dummy" : "data"}
+
+### Casting a character string to a number
+You cannot cast a character string that includes a decimal point to an INT or BIGINT. For example, if you have "1200.50" in a JSON file, attempting to select and cast the string to an INT fails. As a workaround, cast to a float or decimal type, and then to an integer type. 
+
+The following example shows how to cast a character string to a DECIMAL having two decimal places.
+
+    SELECT CAST('1' as DECIMAL(28, 2)) FROM dfs.`/Users/drill/dummy.json`;
+    +------------+
+    |   EXPR$0   |
+    +------------+
+    | 1.00       |
+    +------------+
+
+### Casting a number to a character string
+The first example shows that Drill uses a default limit of 1 character if you omit the VARCHAR limit: the result is truncated to 1 character. The second example casts the same number to a VARCHAR having a limit of 3 characters: the result is the 3-character string 456. The third example shows that you can use CHAR as an alias for VARCHAR. You can also use CHARACTER or CHARACTER VARYING.
+
+    SELECT CAST(456 as VARCHAR) FROM dfs.`/Users/drill/dummy.json`;
+    +------------+
+    |   EXPR$0   |
+    +------------+
+    | 4          |
+    +------------+
+    1 row selected (0.063 seconds)
+
+    SELECT CAST(456 as VARCHAR(3)) FROM dfs.`/Users/drill/dummy.json`;
+    +------------+
+    |   EXPR$0   |
+    +------------+
+    | 456        |
+    +------------+
+    1 row selected (0.08 seconds)
+
+    SELECT CAST(456 as CHAR(3)) FROM dfs.`/Users/drill/dummy.json`;
+    +------------+
+    |   EXPR$0   |
+    +------------+
+    | 456        |
+    +------------+
+    1 row selected (0.093 seconds)
+
+### Casting from One Numerical Type to Another
+
+Cast an integer to a decimal.
+
+    SELECT CAST(-2147483648 AS DECIMAL(28,8)) FROM dfs.`/Users/drill/dummy.json`;
+    +------------+
+    |   EXPR$0   |
+    +------------+
+    | -2.147483648E9 |
+    +------------+
+    1 row selected (0.08 seconds)
+
+## Casting Intervals
+
+To cast INTERVAL data, use the following syntax:
+
+    CAST (column_name AS INTERVAL)
+    CAST (column_name AS INTERVAL DAY)
+    CAST (column_name AS INTERVAL YEAR)
+
+A JSON file contains the following objects:
+
+    { "INTERVALYEAR_col":"P1Y", "INTERVALDAY_col":"P1D", "INTERVAL_col":"P1Y1M1DT1H1M" }
+    { "INTERVALYEAR_col":"P2Y", "INTERVALDAY_col":"P2D", "INTERVAL_col":"P2Y2M2DT2H2M" }
+    { "INTERVALYEAR_col":"P3Y", "INTERVALDAY_col":"P3D", "INTERVAL_col":"P3Y3M3DT3H3M" }
+
+The following CTAS statement shows how to cast text from a JSON file to INTERVAL data types in a Parquet table:
+
+    CREATE TABLE dfs.tmp.parquet_intervals AS 
+    (SELECT CAST(INTERVAL_col AS interval) INTERVAL_col,
+            CAST(INTERVALYEAR_col AS interval year) INTERVALYEAR_col, 
+            CAST(INTERVALDAY_col AS interval day) INTERVALDAY_col 
+    FROM `/user/root/intervals.json`);
+
+<!-- Text and include output -->
+
+# CONVERT_TO and CONVERT_FROM
+
+The CONVERT_TO and CONVERT_FROM functions encode and decode
+data, respectively.
+
+## Syntax  
+
+    CONVERT_TO (expression, type)
+    CONVERT_FROM (expression, type)
+
+You can use the CONVERT functions to convert any compatible data type to any other type. HBase stores data as encoded byte arrays (VARBINARY data). To query HBase data in Drill, convert each column of an HBase table from byte arrays to an SQL data type that Drill supports when reading data, and back to byte arrays when writing data. The CONVERT functions are more efficient than CAST when your data sources return binary data. 
+
+## Usage Notes
+Use the CONVERT_TO function to change the data type to bytes when sending data back to HBase from a Drill query. CONVERT_TO converts an SQL data type to complex types, including HBase byte arrays and JSON and Parquet arrays and maps. CONVERT_FROM converts complex types, including HBase byte arrays and JSON and Parquet arrays and maps, to an SQL data type. 
+
+## Example
+
+A common use case for CONVERT_FROM is to convert complex data embedded in
+an HBase column to a readable type. The following example converts VARBINARY data in col1 from an HBase or MapR-DB table to JSON data. 
+
+    SELECT CONVERT_FROM(col1, 'JSON') 
+    FROM hbase.table1
+    ...
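+
+Conversely, use CONVERT_TO when writing data back to HBase. A minimal sketch, assuming col1 holds UTF-8 text (the column and table names are illustrative):
+
+    SELECT CONVERT_TO(col1, 'UTF8') 
+    FROM hbase.table1
+    ...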
+
+
+# Other Data Type Conversions
+In addition to the CAST, CONVERT_TO, and CONVERT_FROM functions, Drill supports data type conversion functions to perform the following conversions:
+
+* A timestamp, integer, decimal, or double to a character string
+* A character string to a date
+* A character string to a number
+* A character string to a timestamp with time zone
+* A decimal type to a timestamp with time zone
+
+# TO_CHAR
+
+TO_CHAR converts a date, time, timestamp, timestamp with time zone, or numerical expression to a character string.
+
+## Syntax
+
+    TO_CHAR (expression, 'format');
+
+*expression* is a float, integer, decimal, date, time, or timestamp expression. 
+
+*'format'* is a format specifier, enclosed in single quotation marks, that sets a pattern for the output formatting. 
+
+## Usage Notes
+For information about specifying a format, refer to one of the following format specifier documents:
+
+* [Java DecimalFormat class](http://docs.oracle.com/javase/7/docs/api/java/text/DecimalFormat.html) format specifiers 
+* [Java DateTimeFormat class](http://joda-time.sourceforge.net/apidocs/org/joda/time/format/DateTimeFormat.html)
+
+
+## Examples
+
+Convert a float to a character string.
+
+    SELECT TO_CHAR(125.789383, '#,###.###') FROM dfs.`/Users/drill/dummy.json`;
+    +------------+
+    |   EXPR$0   |
+    +------------+
+    | 125.789    |
+    +------------+
+
+Convert an integer to a character string.
+
+    SELECT TO_CHAR(125, '#,###.###') FROM dfs.`/Users/drill/dummy.json`;
+    +------------+
+    |   EXPR$0   |
+    +------------+
+    | 125        |
+    +------------+
+    1 row selected (0.083 seconds)
+
+Convert a date to a character string.
+
+    SELECT to_char((cast('2008-2-23' as date)), 'yyyy-MMM-dd') FROM dfs.`/Users/drill/dummy.json`;
+    +------------+
+    |   EXPR$0   |
+    +------------+
+    | 2008-Feb-23 |
+    +------------+
+
+Convert a time to a string.
+
+    SELECT to_char(cast('12:20:30' as time), 'HH mm ss') FROM dfs.`/Users/drill/dummy.json`;
+    +------------+
+    |   EXPR$0   |
+    +------------+
+    | 12 20 30   |
+    +------------+
+    1 row selected (0.07 seconds)
+
+
+Convert a timestamp to a string.
+
+    SELECT to_char(cast('2015-2-23 12:00:00' as timestamp), 'yyyy MMM dd HH:mm:ss') FROM dfs.`/Users/drill/dummy.json`;
+    +------------+
+    |   EXPR$0   |
+    +------------+
+    | 2015 Feb 23 12:00:00 |
+    +------------+
+    1 row selected (0.075 seconds)
+
+# TO_DATE
+Converts a character string or a UNIX epoch timestamp to a date.
+
+## Syntax
+
+    TO_DATE (expression[, 'format']);
+
+*expression* is a character string enclosed in single quotation marks or a UNIX epoch timestamp not enclosed in single quotation marks. 
+
+*'format'* is a format specifier, enclosed in single quotation marks, that sets a pattern for the output formatting. Use this option only when the expression is a character string. 
+
+## Usage Notes
+Specify a format using patterns defined in [Java DateTimeFormat class](http://joda-time.sourceforge.net/apidocs/org/joda/time/format/DateTimeFormat.html).
+
+
+## Examples
+The first example converts a character string to a date. The second example extracts the year to verify that Drill recognizes the date as a date type.
+
+    SELECT TO_DATE('2015-FEB-23', 'yyyy-MMM-dd') FROM dfs.`/Users/drill/dummy.json`;
+    +------------+
+    |   EXPR$0   |
+    +------------+
+    | 2015-02-23 |
+    +------------+
+    1 row selected (0.077 seconds)
+
+    SELECT EXTRACT(year FROM mydate) AS myyear FROM (SELECT TO_DATE('2015-FEB-23', 'yyyy-MMM-dd') AS mydate FROM dfs.`/Users/drill/dummy.json`);
+
+    +------------+
+    |   myyear   |
+    +------------+
+    | 2015       |
+    +------------+
+    1 row selected (0.128 seconds)
+
+# TO_NUMBER
+
+TO_NUMBER converts a character string to a formatted number using a format specification.
+
+## Syntax
+
+    TO_NUMBER ('string', 'format');
+
+*'string'* is a character string enclosed in single quotation marks. 
+
+*'format'* is one or more [Java DecimalFormat class](http://docs.oracle.com/javase/7/docs/api/java/text/DecimalFormat.html) format specifiers, enclosed in single quotation marks, that set a pattern for the output formatting.
+
+
+## Usage Notes
+The output of TO_NUMBER is a numeric data type. You can use the following [Java DecimalFormat class](http://docs.oracle.com/javase/7/docs/api/java/text/DecimalFormat.html) format specifiers to set the output formatting. 
+
+* #  
+  Digit placeholder. 
+
+* 0  
+  Digit placeholder. If a value has a digit in the position where the '0' appears in the format string, that digit appears in the output; otherwise, a '0' appears in that position in the output.
+
+* .  
+  Decimal point. Make the first '.' character in the format string the location of the decimal separator in the value; ignore any additional '.' characters.
+
+* ,  
+  Comma grouping separator. 
+
+* E  
+  Exponent. Separates the mantissa and exponent in scientific notation. 
+
+## Examples
+
+    SELECT TO_NUMBER('987,966', '######') FROM dfs.`/Users/drill/dummy.json`;
+    +------------+
+    |   EXPR$0   |
+    +------------+
+    | 987.0      |
+    +------------+
+
+    SELECT TO_NUMBER('987.966', '###.###') FROM dfs.`/Users/drill/dummy.json`;
+    +------------+
+    |   EXPR$0   |
+    +------------+
+    | 987.966    |
+    +------------+
+    1 row selected (0.063 seconds)
+
+    SELECT TO_NUMBER('12345', '##0.##E0') FROM dfs.`/Users/drill/dummy.json`;
+    +------------+
+    |   EXPR$0   |
+    +------------+
+    | 12345.0    |
+    +------------+
+    1 row selected (0.069 seconds)
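+
+Unlike the '######' example above, a pattern that includes the ',' grouping specifier accepts the grouping separator while parsing. A sketch, with the result omitted because it is not verified here:
+
+    SELECT TO_NUMBER('987,966', '#,###') FROM dfs.`/Users/drill/dummy.json`;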
+
+# TO_TIME
+
+Converts a character string to a time using a format pattern.
+
+## Syntax
+
+    TO_TIME ('string', 'format');
+
+## Example
+
+    SELECT to_time('12:20:30', 'HH:mm:ss') FROM dfs.`/Users/drill/dummy.json`;
+    +------------+
+    |   EXPR$0   |
+    +------------+
+    | 12:20:30   |
+    +------------+
+    1 row selected (0.067 seconds)
+
+
+# TO_TIMESTAMP
+
+Converts a character string to a timestamp using a format pattern.
+
+## Example
+
+    SELECT to_timestamp('2008-2-23 12:00:00', 'yyyy-MM-dd HH:mm:ss') FROM dfs.`/Users/drill/dummy.json`;
+    +------------+
+    |   EXPR$0   |
+    +------------+
+    | 2008-02-23 12:00:00.0 |
+    +------------+
+
+
+
+
+<!-- Apache Drill    
+Apache DrillDRILL-1141
+ISNUMERIC should be implemented as a SQL function
+SELECT count(columns[0]) as number FROM dfs.`bla` WHERE ISNUMERIC(columns[0])=1
+ -->
\ No newline at end of file

http://git-wip-us.apache.org/repos/asf/drill/blob/85e05623/_docs/sql-ref/nested/001-flatten.md
----------------------------------------------------------------------
diff --git a/_docs/sql-ref/nested/001-flatten.md b/_docs/sql-ref/nested/001-flatten.md
index 0a6b7aa..579d716 100644
--- a/_docs/sql-ref/nested/001-flatten.md
+++ b/_docs/sql-ref/nested/001-flatten.md
@@ -1,5 +1,5 @@
 ---
-title: "FLATTEN Function"
+title: "FLATTEN"
 parent: "Nested Data Functions"
 ---
 FLATTEN separates the elements in a repeated field into individual records.

http://git-wip-us.apache.org/repos/asf/drill/blob/85e05623/_docs/sql-ref/nested/002-kvgen.md
----------------------------------------------------------------------
diff --git a/_docs/sql-ref/nested/002-kvgen.md b/_docs/sql-ref/nested/002-kvgen.md
index 97c3e76..ba7da19 100644
--- a/_docs/sql-ref/nested/002-kvgen.md
+++ b/_docs/sql-ref/nested/002-kvgen.md
@@ -1,8 +1,8 @@
 ---
-title: "KVGEN Function"
+title: "KVGEN"
 parent: "Nested Data Functions"
 ---
-Return a list of the keys that exist in the map.
+Returns a list of the keys that exist in the map.
 
 ## Syntax
 
@@ -12,7 +12,60 @@ Return a list of the keys that exist in the map.
 
 ## Usage Notes
 
-KVGEN stands for _key-value generation_. This function is useful when complex
+Use KVGEN (key-value generation) to query maps in which arbitrary keys, rather than a fixed schema,
+represent the data. For example, suppose you store statistics
+about the number of interactions between the users of a social network
+in a JSON document store. User records appear as follows:
+
+	{
+	   user_id : 12345,
+	   user_interactions : {
+	       "12633" : 10,
+	       "25678" : 25,
+	       "11111" : 5
+	   }
+	}
+
+This record summarizes the interactions of one user with others. The record contains the user_id and a map between other
+users' ids and the number of interactions recorded between them. A complete dataset stores a summary of interactions for each user, so user 12345 also appears under user_interactions in the records for users 12633 and 25678. For example:
+
+	{
+	        user_id: 12633,
+	        user_interactions : {
+	            "12345" : 10,
+	            "27569" : 104,
+	            "93033" : 52
+	    }
+	}
+	{       user_id: 25678,
+	        user_interactions : {
+	            "12345" : 25,
+	            "37886" : 14,
+	            "87394" : 5
+	    }
+	}
+
+To list the users who interact most, you need to use a subquery; otherwise, Drill operates on the data before the flattening and key generation occur:
+
+    SELECT t.flat_interactions.key, t.flat_interactions.`value`
+    FROM (SELECT FLATTEN(KVGEN(user_interactions)) AS flat_interactions
+          FROM dfs.`/Users/drill/user_table.json`) AS t
+    ORDER BY t.flat_interactions.`value` DESC;
+
+	+------------+------------+
+	|   EXPR$0   |   EXPR$1   |
+	+------------+------------+
+	| 27569      | 104        |
+	| 93033      | 52         |
+	| 25678      | 25         |
+	| 12345      | 25         |
+	| 37886      | 14         |
+	| 12633      | 10         |
+	| 12345      | 10         |
+	| 11111      | 5          |
+	| 87394      | 5          |
+	+------------+------------+
+	9 rows selected (0.093 seconds)
+
+
+KVGEN is useful when complex
 data files contain arbitrary maps that consist of relatively "unknown" column
 names. Instead of having to specify columns in the map to access the data, you
 can use KVGEN to return a list of the keys that exist in the map. KVGEN turns
@@ -42,6 +95,8 @@ this data would return:
     {"key": "c", "value": "valC"}
     {"key": "d", "value": "valD"}
 
+## Example: Different Data Type Values
+
 Assume that a JSON file called `kvgendata.json` includes multiple records that
 look like this one:
 
@@ -112,7 +167,7 @@ look like this one:
 
 A SELECT * query against this specific record returns the following row:
 
-    0: jdbc:drill:zk=local> select * from dfs.yelp.`kvgendata.json` where rownum=1;
+    0: jdbc:drill:zk=local> select * from dfs.`kvgendata.json` where rownum=1;
  
 	+------------+---------------+------------+------------+------------+------------+
 	|   rownum   | bigintegercol | varcharcol |  boolcol   | float8col  |  complex   |
@@ -154,4 +209,5 @@ distinct rows:
 	+------------+
 	9 rows selected (0.151 seconds)
 
+
 For more examples of KVGEN and FLATTEN, see the examples in the section, ["JSON Data Model"](/docs/json-data-model).
\ No newline at end of file

http://git-wip-us.apache.org/repos/asf/drill/blob/85e05623/_docs/sql-ref/nested/003-repeated-cnt.md
----------------------------------------------------------------------
diff --git a/_docs/sql-ref/nested/003-repeated-cnt.md b/_docs/sql-ref/nested/003-repeated-cnt.md
index 531c8ad..2aad994 100644
--- a/_docs/sql-ref/nested/003-repeated-cnt.md
+++ b/_docs/sql-ref/nested/003-repeated-cnt.md
@@ -1,5 +1,5 @@
 ---
-title: "REPEATED_COUNT Function"
+title: "REPEATED_COUNT"
 parent: "Nested Data Functions"
 ---
 This function counts the values in an array. 

http://git-wip-us.apache.org/repos/asf/drill/blob/85e05623/_docs/sql-ref/nested/004-repeated-contains.md
----------------------------------------------------------------------
diff --git a/_docs/sql-ref/nested/004-repeated-contains.md b/_docs/sql-ref/nested/004-repeated-contains.md
index cd12760..1c0f8f1 100644
--- a/_docs/sql-ref/nested/004-repeated-contains.md
+++ b/_docs/sql-ref/nested/004-repeated-contains.md
@@ -1,5 +1,5 @@
 ---
-title: "REPEATED_CONTAINS Function"
+title: "REPEATED_CONTAINS"
 parent: "Nested Data Functions"
 ---
REPEATED_CONTAINS searches for a keyword in an array. 

http://git-wip-us.apache.org/repos/asf/drill/blob/85e05623/_docs/tutorial/001-install-sandbox.md
----------------------------------------------------------------------
diff --git a/_docs/tutorial/001-install-sandbox.md b/_docs/tutorial/001-install-sandbox.md
index a4e7e5b..26360ff 100644
--- a/_docs/tutorial/001-install-sandbox.md
+++ b/_docs/tutorial/001-install-sandbox.md
@@ -2,8 +2,6 @@
 title: "Installing the Apache Drill Sandbox"
 parent: "Apache Drill Tutorial"
 ---
-[Previous](/docs/apache-drill-tutorial)<code>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;</code>[Back to Table of Contents](/docs)<code>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;</code>[Next](/docs/installing-the-mapr-sandbox-with-apache-drill-on-vmware-player-vmware-fusion)
-
 ## Prerequisites
 
 The MapR Sandbox with Apache Drill runs on VMware Player and VirtualBox, free

http://git-wip-us.apache.org/repos/asf/drill/blob/85e05623/_docs/tutorial/002-get2kno-sb.md
----------------------------------------------------------------------
diff --git a/_docs/tutorial/002-get2kno-sb.md b/_docs/tutorial/002-get2kno-sb.md
index 27e98f6..ee2b5dd 100644
--- a/_docs/tutorial/002-get2kno-sb.md
+++ b/_docs/tutorial/002-get2kno-sb.md
@@ -2,8 +2,6 @@
 title: "Getting to Know the Drill Sandbox"
 parent: "Apache Drill Tutorial"
 ---
-[Previous](/docs/installing-the-mapr-sandbox-with-apache-drill-on-virtualbox)<code>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;</code>[Back to Table of Contents](/docs)<code>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;</code>[Next](/docs/lession-1-learn-about-the-data-set)
-
 This section covers key information about the Apache Drill tutorial. After [installing the Drill sandbox](/docs/installing-the-apache-drill-sandbox) and starting the sandbox, you can open another terminal window (Linux) or Command Prompt (Windows) and use the secure shell (ssh) to connect to the VM, assuming ssh is installed. Use the following login name and password: mapr/mapr. For
 example:
 

http://git-wip-us.apache.org/repos/asf/drill/blob/85e05623/_docs/tutorial/003-lesson1.md
----------------------------------------------------------------------
diff --git a/_docs/tutorial/003-lesson1.md b/_docs/tutorial/003-lesson1.md
index eca2121..d92a8be 100644
--- a/_docs/tutorial/003-lesson1.md
+++ b/_docs/tutorial/003-lesson1.md
@@ -2,8 +2,6 @@
 title: "Lession 1: Learn about the Data Set"
 parent: "Apache Drill Tutorial"
 ---
-[Previous](/docs/getting-to-know-the-drill-sandbox)<code>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;</code>[Back to Table of Contents](/docs)<code>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;</code>[Next](/docs/lession-2-run-queries-with-ansi-sql)
-
 ## Goal
 
 This lesson is simply about discovering what data is available, in what

http://git-wip-us.apache.org/repos/asf/drill/blob/85e05623/_docs/tutorial/004-lesson2.md
----------------------------------------------------------------------
diff --git a/_docs/tutorial/004-lesson2.md b/_docs/tutorial/004-lesson2.md
index eca22bb..8dfdca1 100644
--- a/_docs/tutorial/004-lesson2.md
+++ b/_docs/tutorial/004-lesson2.md
@@ -2,8 +2,6 @@
 title: "Lession 2: Run Queries with ANSI SQL"
 parent: "Apache Drill Tutorial"
 ---
-[Previous](/docs/lession-1-learn-about-the-data-set)<code>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;</code>[Back to Table of Contents](/docs)<code>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;</code>[Next](/docs/lession-3-run-queries-on-complex-data-types)
-
 ## Goal
 
 This lesson shows how to do some standard SQL analysis in Apache Drill: for

http://git-wip-us.apache.org/repos/asf/drill/blob/85e05623/_docs/tutorial/005-lesson3.md
----------------------------------------------------------------------
diff --git a/_docs/tutorial/005-lesson3.md b/_docs/tutorial/005-lesson3.md
index df61a40..e913952 100644
--- a/_docs/tutorial/005-lesson3.md
+++ b/_docs/tutorial/005-lesson3.md
@@ -2,8 +2,6 @@
 title: "Lession 3: Run Queries on Complex Data Types"
 parent: "Apache Drill Tutorial"
 ---
-[Previous](/docs/lession-2-run-queries-with-ansi-sql)<code>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;</code>[Back to Table of Contents](/docs)<code>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;</code>[Next](/docs/summary)
-
 ## Goal
 
 This lesson focuses on queries that exercise functions and operators on self-

http://git-wip-us.apache.org/repos/asf/drill/blob/85e05623/_docs/tutorial/006-summary.md
----------------------------------------------------------------------
diff --git a/_docs/tutorial/006-summary.md b/_docs/tutorial/006-summary.md
index d77c4ef..6ebae37 100644
--- a/_docs/tutorial/006-summary.md
+++ b/_docs/tutorial/006-summary.md
@@ -2,8 +2,6 @@
 title: "Summary"
 parent: "Apache Drill Tutorial"
 ---
-[Previous](/docs/lession-3-run-queries-on-complex-data-types)<code>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;</code>[Back to Table of Contents](/docs)<code>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;</code>[Next](/docs/analyzing-yelp-json-data-with-apache-drill)
-
 This tutorial introduced Apache Drill and its ability to run ANSI SQL queries
 against various data sources, including Hive tables, MapR-DB/HBase tables, and
 file system directories. The tutorial also showed how to work with and

http://git-wip-us.apache.org/repos/asf/drill/blob/85e05623/_docs/tutorial/install-sandbox/001-install-mapr-vm.md
----------------------------------------------------------------------
diff --git a/_docs/tutorial/install-sandbox/001-install-mapr-vm.md b/_docs/tutorial/install-sandbox/001-install-mapr-vm.md
index 27705a1..73daa6d 100644
--- a/_docs/tutorial/install-sandbox/001-install-mapr-vm.md
+++ b/_docs/tutorial/install-sandbox/001-install-mapr-vm.md
@@ -2,8 +2,6 @@
 title: "Installing the MapR Sandbox with Apache Drill on VMware Player/VMware Fusion"
 parent: "Installing the Apache Drill Sandbox"
 ---
-[Previous](/docs/installing-the-apache-drill-sandbox)<code>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;</code>[Back to Table of Contents](/docs)<code>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;</code>[Next](/docs/installing-the-mapr-sandbox-with-apache-drill-on-virtualbox)
-
 Complete the following steps to install the MapR Sandbox with Apache Drill on
 VMware Player or VMware Fusion:
 

http://git-wip-us.apache.org/repos/asf/drill/blob/85e05623/_docs/tutorial/install-sandbox/002-install-mapr-vb.md
----------------------------------------------------------------------
diff --git a/_docs/tutorial/install-sandbox/002-install-mapr-vb.md b/_docs/tutorial/install-sandbox/002-install-mapr-vb.md
index 06fb4d6..e72abf9 100644
--- a/_docs/tutorial/install-sandbox/002-install-mapr-vb.md
+++ b/_docs/tutorial/install-sandbox/002-install-mapr-vb.md
@@ -2,8 +2,6 @@
 title: "Installing the MapR Sandbox with Apache Drill on VirtualBox"
 parent: "Installing the Apache Drill Sandbox"
 ---
-[Previous](/docs/installing-the-mapr-sandbox-with-apache-drill-on-vmware-player-vmware-fusion)<code>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;</code>[Back to Table of Contents](/docs)<code>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;</code>[Next](/docs/getting-to-know-the-drill-sandbox)
-
 The MapR Sandbox for Apache Drill on VirtualBox comes with NAT port forwarding
 enabled, which allows you to access the sandbox using localhost as hostname.
 


Mime
View raw message