JReport is an embeddable BI solution that empowers users to analyze data and create reports and dashboards. JReport accesses data from Hadoop systems through Apache Drill. By visualizing data through Drill, users can perform their own reporting and data discovery for agile, on-the-fly decision-making.

You can use JReport 13.1 and the Apache Drill JDBC Driver to easily extract data and visualize it, creating reports and dashboards that you can embed into your own applications. Complete the following simple steps to use Apache Drill with JReport:

  1. Install the Drill JDBC Driver with JReport.
  2. Create a new JReport catalog to manage the Drill connection.
  3. Use JReport Designer to query the data and create a report.

    Step 1: Install the Drill JDBC Driver with JReport

    Drill provides standard JDBC connectivity to integrate with JReport. JReport 13.1 requires Drill 1.0 or later. For general instructions on installing the Drill JDBC driver, see Using JDBC.
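    A JDBC client such as JReport reaches Drill through a URL of the form jdbc:drill:zk=&lt;zk-quorum&gt;/&lt;drill-directory&gt;/&lt;cluster-id&gt;. As a minimal sketch of assembling such a URL (the hostnames and cluster ID below are placeholders, not values from this document):

    ```python
    # Hypothetical ZooKeeper quorum and cluster ID; substitute your own values.
    zk_nodes = ["zk1.example.com:2181", "zk2.example.com:2181", "zk3.example.com:2181"]
    cluster_id = "drillbits1"

    # Drill JDBC URL format: jdbc:drill:zk=<quorum>/<drill-directory>/<cluster-id>
    url = "jdbc:drill:zk={}/drill/{}".format(",".join(zk_nodes), cluster_id)
    print(url)
    ```

    A JDBC client passes a URL of this shape to its connection call once the drill-jdbc-all JAR is on its classpath.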
    1. Locate the JDBC driver in the Drill installation directory on any node where Drill is installed on the cluster:

       <drill-home>/jars/jdbc-driver/drill-jdbc-all-<drill-version>.jar

       For example:

       /opt/mapr/drill/drill-1.0.0/jars/jdbc-driver/drill-jdbc-all-1.0.0.jar

    2. Copy the Drill JDBC driver into the JReport lib folder:

       %REPORTHOME%\lib\

       For example, on Windows, copy the Drill JDBC driver jar file into:

       C:\JReport\Designer\lib\drill-jdbc-all-1.0.0.jar

    3. Add the location of the JAR file to the JReport CLASSPATH variable. On Windows, edit the C:\JReport\Designer\bin\setenv.bat file.

    4. Verify that the JReport system can resolve the hostnames of the ZooKeeper nodes of the Drill cluster. You can do this by configuring DNS for all of the systems. Alternatively, you can edit the hosts file on the JReport system to include the hostnames and IP addresses of all of the ZooKeeper nodes used with the Drill cluster. On Linux systems, the hosts file is located at /etc/hosts. On Windows systems, the hosts file is located at %WINDIR%\system32\drivers\etc\hosts.
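    A hosts file entry set of this kind might look like the following (the addresses and hostnames are placeholders for your own ZooKeeper nodes):

    ```
    10.10.100.113  zk-node1.example.com  zk-node1
    10.10.100.114  zk-node2.example.com  zk-node2
    10.10.100.115  zk-node3.example.com  zk-node3
    ```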

    1. In the Catalog Browser, right-click Queries and select Add Query…
    2. Define a JReport query by using the Query Editor. You can also import your own SQL statements.
    3. Click OK to close the Query Editor, and click the Save Catalog button to save your progress to the catalog file.

       Note: If the report returns errors, you may need to edit the query and add the schema in front of the table name: select column from schema.table_name. You can do this by clicking the SQL button on the Query Editor.

    4. Use JReport Designer to query the data and create a report.
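    For example, assuming a hypothetical dfs.tmp workspace and a students table (placeholder names), the schema-qualified form of such a query is:

    ```sql
    -- schema.table_name form; dfs.tmp and students are placeholders
    SELECT name FROM dfs.tmp.`students`;
    ```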

    On Mac OS X, copy the following configuration files in /opt/mapr/drillodbc.

    Depending on the driver manager you use, the user DSN in one of these files will be effective.

    Note

    The System and User DSN use different ini files in different locations on OS X.

    Step 1: Set Environment Variables


    CAST Usage Notes

    Use CONVERT_TO and CONVERT_FROM instead of the CAST function for converting binary data types.
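    For non-binary data, CAST works as usual. For example, a hypothetical literal conversion (the value is illustrative only):

    ```sql
    SELECT CAST('2.35' AS FLOAT) FROM (VALUES(1));
    ```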

    See the following tables for information about the data types to use for casting:


    CONVERT_TO and CONVERT_FROM

    The CONVERT_TO and CONVERT_FROM functions convert binary data to/from Drill internal types based on the little or big endian encoding of the data.

    CONVERT_TO and CONVERT_FROM Syntax

    CONVERT_TO (column, type)
    CONVERT_FROM(column, type)

    column is the name of a column Drill reads.

    type is one of the encoding types listed in the CONVERT_TO/FROM Data Types table.

    CONVERT_TO and CONVERT_FROM Usage Notes

    CONVERT_FROM and CONVERT_TO methods transform a known binary representation/encoding to a Drill internal format. Use CONVERT_TO and CONVERT_FROM instead of the CAST function for converting binary data types. CONVERT_TO/FROM functions work for data in a binary representation and are more efficient to use than CAST.


    Drill can optimize scans of HBase tables when you use the *_BE encoded types shown in the "CONVERT_TO and CONVERT_FROM Data Types" section on big endian-encoded data. You need to use the HBase storage plugin and query data as described in "Querying HBase". To write Parquet binary data, convert SQL data to binary data and store the data in a Parquet table while creating a table as a selection (CTAS).
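    The *_BE suffix denotes big-endian byte order. A small sketch in plain Python (not Drill code) illustrating the difference between big- and little-endian encodings of an INT:

    ```python
    import struct

    value = 100

    # Big-endian: most significant byte first (what the *_BE types denote).
    big_endian = struct.pack(">i", value)
    # Little-endian: least significant byte first.
    little_endian = struct.pack("<i", value)

    print(big_endian.hex())     # 00000064
    print(little_endian.hex())  # 64000000

    # Decoding must use the byte order the data was written with.
    assert struct.unpack(">i", big_endian)[0] == value
    ```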

    CONVERT_TO also converts an SQL data type to complex types, including HBase byte arrays, JSON and Parquet arrays, and maps. CONVERT_FROM converts from complex types, including HBase arrays, JSON and Parquet arrays and maps to an SQL data type.

    You can use STRING_BINARY and BINARY_STRING custom Drill functions with CONVERT_TO and CONVERT_FROM to get meaningful results.
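    As a sketch (assuming the INT_BE encoding type from the data types table), STRING_BINARY can render the bytes produced by CONVERT_TO in a readable escaped form, here the four big-endian bytes of the integer 1:

    ```sql
    SELECT STRING_BINARY(CONVERT_TO(1, 'INT_BE')) AS int_bytes FROM (VALUES(1));
    ```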

    Conversion of Data Types Examples

    This example shows how to use the CONVERT_FROM function to convert HBase data to a SQL type. The example summarizes and continues the "Query HBase" example. The "Query HBase" example stores the following data in the students table on the Drill Sandbox:

    USE maprdb;
     
     SELECT * FROM students;
     +-------------+---------------------+---------------------------------------------------------------------------+
     4 rows selected (1.335 seconds)
     
    You use the CONVERT_FROM function to decode the binary data, selecting a data type to use from the list of supported types. JSON supports strings. To convert bytes to strings, use the UTF8 type:

    SELECT CONVERT_FROM(row_key, 'UTF8') AS studentid, 
            CONVERT_FROM(students.account.name, 'UTF8') AS name, 
            CONVERT_FROM(students.address.state, 'UTF8') AS state, 
     +------------+------------+------------+------------------+------------+
     4 rows selected (0.504 seconds)
     
    This example converts VARCHAR data to a JSON map:

    SELECT CONVERT_FROM('{x:100, y:215.6}' ,'JSON') AS MYCOL FROM (VALUES(1));
     +----------------------+
     |        MYCOL         |
     +----------------------+
     1 row selected (0.163 seconds)
     
    This example uses a list of BIGINT data as input and returns a repeated list of vectors:

    SELECT CONVERT_FROM('[ [1, 2], [3, 4], [5]]' ,'JSON') AS MYCOL1 FROM (VALUES(1));
     +------------+
     |   mycol1   |
     
     
    1. Copy/paste the dfs storage plugin definition to a newly created plugin called myplugin.
    2. Change the root location to "/mapr/demo.mapr.com/tables". After this change, you can read a table in the tables directory. You can write a converted version of the table in the tmp directory because the writable property is true.

      {
         "type": "file",
         "enabled": true,
       +-------------+-------------+-------------+-------------+-------------+
       4 rows selected (0.12 seconds)
       
    3. Use CONVERT_FROM to read the Parquet data:

      SELECT CONVERT_FROM(id, 'UTF8') AS id, 
              CONVERT_FROM(name, 'UTF8') AS name, 
              CONVERT_FROM(state, 'UTF8') AS state, 
      

      Using a UDF

      Use a Hive UDF just as you would use a Drill custom function. For example, to query using a Hive UDF named MY_UPPER, the SELECT statement looks something like this:

      SELECT MY_UPPER('abc') from (VALUES(1));
       +---------+
       | EXPR$0  |