From: dyozie
To: dev@hawq.incubator.apache.org
Reply-To: dev@hawq.incubator.apache.org
Subject: [GitHub] incubator-hawq-docs pull request #23: HAWQ-1095 - enhance database api docs
Date: Tue, 18 Oct 2016 23:53:17 +0000 (UTC)

Github user dyozie commented on a diff in the pull request:

    https://github.com/apache/incubator-hawq-docs/pull/23#discussion_r83974424

--- Diff: clientaccess/g-database-application-interfaces.html.md.erb ---
@@ -1,8 +1,96 @@
 ---
-title: ODBC/JDBC Application Interfaces
+title: HAWQ Database Drivers and APIs
 ---
+You may want to connect your existing Business Intelligence (BI) or Analytics applications with HAWQ. The database application programming interfaces most commonly used with HAWQ are the Postgres, ODBC, and JDBC APIs.
-You may want to deploy your existing Business Intelligence (BI) or Analytics applications with HAWQ.
-The most commonly used database application programming interfaces with HAWQ are the ODBC and JDBC APIs.
+HAWQ provides the following connectivity tools for connecting to the database:
+
+ - ODBC driver
+ - JDBC driver
+ - `libpq` - PostgreSQL C API
+
+## HAWQ Drivers
+
+ODBC and JDBC drivers for HAWQ are available as a separate download from [Pivotal Network](https://network.pivotal.io/products/pivotal-hdb).
+
+### ODBC Driver
+
+The ODBC API specifies a standard set of C interfaces for accessing database management systems. For additional information on using the ODBC API, refer to the [ODBC Programmer's Reference](https://msdn.microsoft.com/en-us/library/ms714177(v=vs.85).aspx) documentation.
+
+HAWQ supports the DataDirect ODBC Driver. Installation instructions for this driver are provided on the Pivotal Network driver download page. Refer to [HAWQ ODBC Driver](http://media.datadirect.com/download/docs/odbc/allodbc/#page/odbc%2Fthe-greenplum-wire-protocol-driver.html%23) for HAWQ-specific ODBC driver information.
+
+#### Connection Data Source
+The information required by the HAWQ ODBC driver to connect to a database is typically stored in a named data source. Depending on your platform, you may use [GUI](http://media.datadirect.com/download/docs/odbc/allodbc/index.html#page/odbc%2FData_Source_Configuration_through_a_GUI_14.html%23) or [command line](http://media.datadirect.com/download/docs/odbc/allodbc/index.html#page/odbc%2FData_Source_Configuration_in_the_UNIX_2fLinux_odbc_13.html%23) tools to create your data source definition. On Linux, ODBC data sources are typically defined in a file named `odbc.ini`.
+
+Commonly specified HAWQ ODBC data source connection properties include:
+
+| Property Name      | Value Description                                   |
+|--------------------|-----------------------------------------------------|
+| Database           | name of the database to which you want to connect   |
+| Driver             | full path to the ODBC driver library file            |
+| HostName           | HAWQ master host name                                 |
+| MaxLongVarcharSize | maximum size of columns of type long varchar          |
+| Password           | password used to connect to the specified database    |
+| PortNumber         | HAWQ master database port number                      |
--- End diff --

Let's initial-capitalize the second column.

---
If your project is set up for it, you can reply to this email and have your reply appear on GitHub as well. If your project does not have this feature enabled and wishes so, or if the feature is enabled but not working, please contact infrastructure at infrastructure@apache.org or file a JIRA ticket with INFRA.
---
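
For readers configuring the connection data source described in the diff above, the properties in the table map onto a named stanza in `odbc.ini`. The sketch below is a minimal, hypothetical example only: the data source name, database, host, password, and in particular the driver library path (which depends on where the DataDirect driver is installed) are placeholder values, not values taken from the HAWQ documentation.

    [ODBC Data Sources]
    # Registers the data source name with a free-form description
    HAWQSample=DataDirect ODBC driver for HAWQ (placeholder)

    [HAWQSample]
    # Placeholder path: point this at the installed DataDirect driver library
    Driver=/opt/datadirect/lib/ddgreenplum.so
    Database=sampledb
    HostName=hawq-master.example.com
    PortNumber=5432
    Password=changeme
    MaxLongVarcharSize=8190

With a stanza like this in place, an ODBC application selects the data source by name (`HAWQSample` here), and the driver reads the remaining connection details, such as the master host and port, from `odbc.ini`.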