incubator-hcatalog-commits mailing list archives

From ga...@apache.org
Subject svn commit: r1091509 [2/8] - in /incubator/hcatalog/trunk: ./ bin/ ivy/ src/ src/docs/ src/docs/src/ src/docs/src/documentation/ src/docs/src/documentation/classes/ src/docs/src/documentation/conf/ src/docs/src/documentation/content/ src/docs/src/docum...
Date Tue, 12 Apr 2011 17:30:12 GMT
Added: incubator/hcatalog/trunk/src/docs/src/documentation/content/xdocs/loadstore.xml
URL: http://svn.apache.org/viewvc/incubator/hcatalog/trunk/src/docs/src/documentation/content/xdocs/loadstore.xml?rev=1091509&view=auto
==============================================================================
--- incubator/hcatalog/trunk/src/docs/src/documentation/content/xdocs/loadstore.xml (added)
+++ incubator/hcatalog/trunk/src/docs/src/documentation/content/xdocs/loadstore.xml Tue Apr 12 17:30:08 2011
@@ -0,0 +1,276 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<!--
+  Licensed to the Apache Software Foundation (ASF) under one or more
+  contributor license agreements.  See the NOTICE file distributed with
+  this work for additional information regarding copyright ownership.
+  The ASF licenses this file to You under the Apache License, Version 2.0
+  (the "License"); you may not use this file except in compliance with
+  the License.  You may obtain a copy of the License at
+
+      http://www.apache.org/licenses/LICENSE-2.0
+
+  Unless required by applicable law or agreed to in writing, software
+  distributed under the License is distributed on an "AS IS" BASIS,
+  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+  See the License for the specific language governing permissions and
+  limitations under the License.
+-->
+<!DOCTYPE document PUBLIC "-//APACHE//DTD Documentation V2.0//EN" "http://forrest.apache.org/dtd/document-v20.dtd">
+
+<document>
+  <header>
+    <title>HCatalog Load and Store Interfaces</title>
+  </header>
+  <body>
+ 
+ <!-- ==================================================================== --> 
+  <section>
+  <title>Set Up</title>
+  
+<p>The HCatLoader and HCatStorer interfaces are used with Pig scripts to read and write data in HCatalog-managed tables. If you run your Pig script using the "pig" command (the bin/pig Perl script), no setup is required. </p>
+<source>
+$ pig mypig.script
+</source>    
+    
+   <p> If you run your Pig script using the "java" command (java -cp pig.jar...), then the HCatalog jar must be included in the classpath of the java command line (using the -cp option). Additionally, the following properties are required on the command line: </p>
+    <ul>
+		<li>-Dhcat.metastore.uri=thrift://&lt;hcatalog server hostname&gt;:9080 </li>
+		<li>-Dhcat.metastore.principal=&lt;hcatalog server kerberos principal&gt; </li>
+	</ul>
+	
+<source>
+$ java -cp pig.jar:hcatalog.jar
+     -Dhcat.metastore.uri=thrift://&lt;hcatalog server hostname&gt;:9080 
+     -Dhcat.metastore.principal=&lt;hcatalog server kerberos principal&gt; myscript.pig
+</source>
+<p></p>
+<p><strong>Authentication</strong></p>
+<table>
+	<tr>
+	<td><p>If a failure results in a message like "2010-11-03 16:17:28,225 WARN hive.metastore ... - Unable to connect metastore with URI thrift://..." in /tmp/&lt;username&gt;/hive.log, then make sure you have run "kinit &lt;username&gt;@FOO.COM" to obtain a Kerberos ticket so you can authenticate to the HCatalog server. </p></td>
+	</tr>
+</table>
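+<p>For example, you might obtain a ticket before launching the script (the principal and script name below are placeholders for your own):</p>
+<source>
+$ kinit &lt;username&gt;@FOO.COM
+$ pig mypig.script
+</source>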
+
+</section>
+  
+      
+<!-- ==================================================================== -->
+     <section>
+		<title>HCatLoader</title>
+		<p>HCatLoader is used with Pig scripts to read data from HCatalog-managed tables.</p>  
+<section> 
+<title>Usage</title>
+<p>HCatLoader is accessed via a Pig load statement.</p>	
+<source>
+A = LOAD 'dbname.tablename' USING org.apache.hcatalog.pig.HCatLoader(); 
+</source>
+
+    <p><strong>Assumptions</strong></p>	  
+    <p>You must specify the database name and table name using this format: 'dbname.tablename'. Both the database and table must be created prior to running your Pig script. The Hive metastore lets you create tables without specifying a database; if you created tables this way, then the database name is 'default' and the string becomes 'default.tablename'. </p>
+    <p>If the table is partitioned, you can indicate which partitions to scan by immediately following the load statement with a partition filter statement 
+    (see <a href="#Examples">Examples</a>). </p>
+ </section>   
+<section> 
+<title>HCatalog Data Types</title>
+<p>Restrictions apply to the types of columns HCatLoader can read.</p>
+<p>HCatLoader can read <strong>only</strong> the data types listed in the table below, which shows the Pig data type corresponding to each HCatalog data type.</p>
+<p>(Note: HCatalog does not support type Boolean.)</p>
+   <table>
+        <tr>
+            <td>
+               <p><strong>HCatalog Data Type</strong></p>
+            </td>
+            <td>
+               <p><strong>Pig Data Type</strong></p>
+            </td>
+    </tr>
+    <tr>
+            <td>
+               <p>primitives (int, long, float, double, string) </p>
+            </td>
+            <td>
+               <p>int, long, float, double <br></br> string maps to chararray</p>
+            </td>
+    </tr>
+    <tr>
+            <td>
+               <p>map (key type should be string, value type can be a primitive listed above)</p>
+            </td>
+            <td>
+               <p>map </p>
+            </td>
+    </tr>
+    <tr>
+            <td>
+               <p>List&lt;primitive&gt; or List&lt;map&gt; where map is of the type noted above </p>
+            </td>
+            <td>
+               <p>bag, with the primitive or map type as the field in each tuple of the bag </p>
+            </td>
+    </tr>
+    <tr>
+            <td>
+               <p>struct&lt;primitive fields&gt; </p>
+            </td>
+            <td>
+               <p>tuple </p>
+            </td>
+    </tr>
+    <tr>
+            <td>
+               <p>List&lt;struct&lt;primitive fields&gt;&gt; </p>
+            </td>
+            <td>
+               <p>bag, where each tuple in the bag maps to struct &lt;primitive fields&gt; </p>
+            </td>
+    </tr>
+ </table>
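+<p>As an illustration of these mappings, the following sketch loads a hypothetical table and inspects the schema Pig infers; the table and column names are invented for this example:</p>
+<source>
+grunt&gt; A = LOAD 'default.users' USING org.apache.hcatalog.pig.HCatLoader();
+grunt&gt; DESCRIBE A;
+-- An HCatalog string column appears as chararray,
+-- a List&lt;int&gt; column appears as a bag of single-int tuples,
+-- and a struct column appears as a tuple.
+</source>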
+</section> 
+
+<section> 
+<title>Examples</title>
+<p>This load statement will load all partitions of the specified table.</p>
+<source>
+/* myscript.pig */
+A = LOAD 'dbname.tablename' USING org.apache.hcatalog.pig.HCatLoader(); 
+...
+...
+</source>
+<p>If only some partitions of the specified table are needed, include a partition filter statement <strong>immediately</strong> following the load statement. 
+The filter statement can include conditions on partition as well as non-partition columns.</p>
+<source>
+/* myscript.pig */
+A = LOAD 'dbname.tablename' USING  org.apache.hcatalog.pig.HCatLoader();
+ 
+B = filter A by datestamp == '20100819' and age &lt; 30; -- datestamp is a partition column; age is not
+ 
+C = filter A by datestamp == '20100819' and country == 'US'; -- datestamp and country are partition columns
+...
+...
+</source>
+
+<p>Certain combinations of conditions on partition and non-partition columns are not allowed in filter statements.
+For example, the following script results in this error message:  <br></br> <br></br>
+<code>ERROR 1112: Unsupported query: You have an partition column (datestamp ) in a construction like: (pcond and ...) or ( pcond and ...) where pcond is a condition on a partition column.</code> <br></br> <br></br>
+A workaround is to restructure the filter condition by splitting it into multiple filter conditions, with the first condition immediately following the load statement.
+</p>
+
+<source>
+/* This script produces an ERROR */
+
+A = LOAD 'default.search_austria' USING org.apache.hcatalog.pig.HCatLoader();
+B = FILTER A BY
+    (   (datestamp &lt; '20091103' AND browser &lt; 50)
+     OR (action == 'click' and browser &gt; 100)
+    );
+...
+...
+</source>
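+<p>One possible restructuring of the failing script above splits the OR into two separate filters and combines the results (a sketch only; here the two branches are mutually exclusive on browser, so the UNION introduces no duplicates):</p>
+<source>
+/* Workaround sketch: split the filter */
+
+A = LOAD 'default.search_austria' USING org.apache.hcatalog.pig.HCatLoader();
+B1 = FILTER A BY datestamp &lt; '20091103';
+B1 = FILTER B1 BY browser &lt; 50;
+B2 = FILTER A BY action == 'click' AND browser &gt; 100;
+B  = UNION B1, B2;
+...
+</source>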
+
+</section> 
+</section> 
+	
+<!-- ==================================================================== -->	
+	<section>
+		<title>HCatStorer</title>
+		<p>HCatStorer is used with Pig scripts to write data to HCatalog-managed tables.</p>	
+
+	
+	<section>
+	<title>Usage</title>
+	
+<p>HCatStorer is accessed via a Pig store statement.</p>	
+
+<source>
+A = LOAD ...
+B = FOREACH A ...
+...
+...
+my_processed_data = ...
+
+STORE my_processed_data INTO 'dbname.tablename' 
+    USING org.apache.hcatalog.pig.HCatStorer('month=12,date=25,hour=0300','a:int,b:chararray,c:map[]');
+</source>
+
+<p><strong>Assumptions</strong></p>
+
+<p>You must specify the database name and table name using this format: 'dbname.tablename'. Both the database and table must be created prior to running your Pig script. The Hive metastore lets you create tables without specifying a database; if you created tables this way, then the database name is 'default' and the string becomes 'default.tablename'. </p>
+
+<p>The USING clause takes up to two string arguments: </p>	
+<ul>
+<li>The first string argument specifies the key/value pairs for the partition. This argument is mandatory. In the above example, month, date, and hour are the columns on which the table is partitioned. 
+The values for partition keys should NOT be quoted, even if the partition key is defined to be of string type. 
+</li>
+<li>The second string argument is the Pig schema for the data that will be written. This argument is optional, and if no schema is specified, a schema will be computed by Pig. If a schema is provided, it must match with the schema computed by Pig. (See also: <a href="inputoutput.html#Partition+Schema+Semantics">Partition Schema Semantics</a>.)</li>
+</ul>
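+<p>For example, even if a partition key of string type is used, its value is written without quotes (a sketch with an invented table name and partition key):</p>
+<source>
+STORE my_processed_data INTO 'default.web_logs' 
+    USING org.apache.hcatalog.pig.HCatStorer('date=20100819');
+</source>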
+<p></p>
+<p></p>
+
+	</section>
+	
+    <section>
+	<title>HCatalog Data Types</title>
+	<p>Restrictions apply to the types of columns HCatStorer can write.</p>
+<p>HCatStorer can write <strong>only</strong> the data types listed in the table below, which shows the Pig data type corresponding to each HCatalog data type.</p>
+<p>(Note: HCatalog does not support type Boolean.)</p>
+   <table>
+        <tr>
+            <td>
+               <p><strong>HCatalog Data Type</strong></p>
+            </td>
+            <td>
+               <p><strong>Pig Data Type</strong></p>
+            </td>
+    </tr>
+    <tr>
+            <td>
+               <p>primitives (int, long, float, double, string) </p>
+            </td>
+            <td>
+               <p>int, long, float, double, string <br></br><br></br>
+               <strong>Note:</strong> HCatStorer does NOT support writing table columns of type smallint or tinyint. 
+               To be able to write from Pig using the HCatalog storer, table columns must be of type int or bigint.
+               </p>
+            </td>
+    </tr>
+    <tr>
+            <td>
+               <p>map (key type should be string, value type can be a primitive listed above)</p>
+            </td>
+            <td>
+               <p>map </p>
+            </td>
+    </tr>
+    <tr>
+            <td>
+               <p>List&lt;primitive&gt; or List&lt;map&gt; where map is of the type noted above </p>
+            </td>
+            <td>
+               <p>bag, with the primitive or map type as the field in each tuple of the bag </p>
+            </td>
+    </tr>
+    <tr>
+            <td>
+               <p>struct&lt;primitive fields&gt; </p>
+            </td>
+            <td>
+               <p>tuple </p>
+            </td>
+    </tr>
+    <tr>
+            <td>
+               <p>List&lt;struct&lt;primitive fields&gt;&gt; </p>
+            </td>
+            <td>
+               <p>bag, where each tuple in the bag maps to struct &lt;primitive fields&gt; </p>
+            </td>
+    </tr>
+ </table>
+	</section>
+	
+		</section>
+	
+  </body>
+</document>

Added: incubator/hcatalog/trunk/src/docs/src/documentation/content/xdocs/site.xml
URL: http://svn.apache.org/viewvc/incubator/hcatalog/trunk/src/docs/src/documentation/content/xdocs/site.xml?rev=1091509&view=auto
==============================================================================
--- incubator/hcatalog/trunk/src/docs/src/documentation/content/xdocs/site.xml (added)
+++ incubator/hcatalog/trunk/src/docs/src/documentation/content/xdocs/site.xml Tue Apr 12 17:30:08 2011
@@ -0,0 +1,49 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<!--
+  Licensed to the Apache Software Foundation (ASF) under one or more
+  contributor license agreements.  See the NOTICE file distributed with
+  this work for additional information regarding copyright ownership.
+  The ASF licenses this file to You under the Apache License, Version 2.0
+  (the "License"); you may not use this file except in compliance with
+  the License.  You may obtain a copy of the License at
+
+      http://www.apache.org/licenses/LICENSE-2.0
+
+  Unless required by applicable law or agreed to in writing, software
+  distributed under the License is distributed on an "AS IS" BASIS,
+  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+  See the License for the specific language governing permissions and
+  limitations under the License.
+-->
+<!--
+Forrest site.xml
+
+This file contains an outline of the site's information content.  It is used to:
+- Generate the website menus (though these can be overridden - see docs)
+- Provide semantic, location-independent aliases for internal 'site:' URIs, eg
+<link href="site:changes"> links to changes.html (or ../changes.html if in
+  subdir).
+- Provide aliases for external URLs in the external-refs section.  Eg, <link
+  href="ext:cocoon"> links to http://cocoon.apache.org/ 
+
+See http://forrest.apache.org/docs/linking.html for more info
+-->
+<!-- The label attribute of the outer "site" element will only show 
+  in the linkmap (linkmap.html).
+  Use elements project-name and group-name in skinconfig to change name of 
+  your site or project that is usually shown at the top of page.
+  No matter what you configure for the href attribute, Forrest will
+  always use index.html when you request http://yourHost/
+  See FAQ: "How can I use a start-up-page other than index.html?"
+-->
+<site label="HCatalog" href="" xmlns="http://apache.org/forrest/linkmap/1.0" tab="">
+
+  <docs label="HCatalog"> 
+    <index label="Overview" href="index.html" />
+    <index label="Pig Load &amp; Store " href="loadstore.html" />
+    <index label="MapReduce Input &amp; Output " href="inputoutput.html" />
+    <index label="Cmd Line Interface " href="cli.html" />
+    <index label="Supported data formats" href="supportedformats.html" />
+    </docs>  
+
+</site>

Added: incubator/hcatalog/trunk/src/docs/src/documentation/content/xdocs/supportedformats.xml
URL: http://svn.apache.org/viewvc/incubator/hcatalog/trunk/src/docs/src/documentation/content/xdocs/supportedformats.xml?rev=1091509&view=auto
==============================================================================
--- incubator/hcatalog/trunk/src/docs/src/documentation/content/xdocs/supportedformats.xml (added)
+++ incubator/hcatalog/trunk/src/docs/src/documentation/content/xdocs/supportedformats.xml Tue Apr 12 17:30:08 2011
@@ -0,0 +1,29 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<!--
+  Licensed to the Apache Software Foundation (ASF) under one or more
+  contributor license agreements.  See the NOTICE file distributed with
+  this work for additional information regarding copyright ownership.
+  The ASF licenses this file to You under the Apache License, Version 2.0
+  (the "License"); you may not use this file except in compliance with
+  the License.  You may obtain a copy of the License at
+
+      http://www.apache.org/licenses/LICENSE-2.0
+
+  Unless required by applicable law or agreed to in writing, software
+  distributed under the License is distributed on an "AS IS" BASIS,
+  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+  See the License for the specific language governing permissions and
+  limitations under the License.
+-->
+<!DOCTYPE document PUBLIC "-//APACHE//DTD Documentation V2.0//EN" "http://forrest.apache.org/dtd/document-v20.dtd">
+
+<document>
+  <header>
+    <title>Supported storage formats</title>
+  </header>
+  <body>
+  <p>HCatalog can read PigStorage-, ULT- (Yahoo proprietary), and RCFile-formatted files. The input drivers for these formats are PigStorageInputDriver, ULTInputDriver, and RCFileInputDriver, respectively. HCatalog currently produces only RCFile-formatted output, using RCFileOutputDriver. </p>
+
+<p>Hive and HCatalog applications can interoperate (each can read the output of the other) as long as they use a common format. Currently, the only common format is RCFile.</p>
+ </body>
+</document>

Added: incubator/hcatalog/trunk/src/docs/src/documentation/content/xdocs/tabs.xml
URL: http://svn.apache.org/viewvc/incubator/hcatalog/trunk/src/docs/src/documentation/content/xdocs/tabs.xml?rev=1091509&view=auto
==============================================================================
--- incubator/hcatalog/trunk/src/docs/src/documentation/content/xdocs/tabs.xml (added)
+++ incubator/hcatalog/trunk/src/docs/src/documentation/content/xdocs/tabs.xml Tue Apr 12 17:30:08 2011
@@ -0,0 +1,35 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<!--
+  Licensed to the Apache Software Foundation (ASF) under one or more
+  contributor license agreements.  See the NOTICE file distributed with
+  this work for additional information regarding copyright ownership.
+  The ASF licenses this file to You under the Apache License, Version 2.0
+  (the "License"); you may not use this file except in compliance with
+  the License.  You may obtain a copy of the License at
+
+      http://www.apache.org/licenses/LICENSE-2.0
+
+  Unless required by applicable law or agreed to in writing, software
+  distributed under the License is distributed on an "AS IS" BASIS,
+  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+  See the License for the specific language governing permissions and
+  limitations under the License.
+-->
+<!DOCTYPE tabs PUBLIC "-//APACHE//DTD Cocoon Documentation Tab V1.1//EN" "http://forrest.apache.org/dtd/tab-cocoon-v11.dtd">
+<tabs software="The Forresters"
+  title="The Forresters"
+  copyright="The Apache Software Foundation"
+  xmlns:xlink="http://www.w3.org/1999/xlink">
+<!-- The rules for tabs are:
+    @dir will always have '/@indexfile' added.
+    @indexfile gets appended to @dir if the tab is selected. Defaults to 'index.html'
+    @href is not modified unless it is root-relative and obviously specifies a
+    directory (ends in '/'), in which case /index.html will be added
+    If @id's are present, site.xml entries with a matching @tab will be in that tab.
+
+   Tabs can be embedded to a depth of two. The second level of tabs will only 
+    be displayed when their parent tab is selected.    
+  -->
+  <tab label="HCatalog 0.1.0 Documentation" dir="" type="visible" /> 
+
+</tabs>

Added: incubator/hcatalog/trunk/src/docs/src/documentation/resources/images/ellipse-2.svg
URL: http://svn.apache.org/viewvc/incubator/hcatalog/trunk/src/docs/src/documentation/resources/images/ellipse-2.svg?rev=1091509&view=auto
==============================================================================
--- incubator/hcatalog/trunk/src/docs/src/documentation/resources/images/ellipse-2.svg (added)
+++ incubator/hcatalog/trunk/src/docs/src/documentation/resources/images/ellipse-2.svg Tue Apr 12 17:30:08 2011
@@ -0,0 +1,30 @@
+<?xml version="1.0" standalone="no"?>
+<!--
+  Licensed to the Apache Software Foundation (ASF) under one or more
+  contributor license agreements.  See the NOTICE file distributed with
+  this work for additional information regarding copyright ownership.
+  The ASF licenses this file to You under the Apache License, Version 2.0
+  (the "License"); you may not use this file except in compliance with
+  the License.  You may obtain a copy of the License at
+
+      http://www.apache.org/licenses/LICENSE-2.0
+
+  Unless required by applicable law or agreed to in writing, software
+  distributed under the License is distributed on an "AS IS" BASIS,
+  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+  See the License for the specific language governing permissions and
+  limitations under the License.
+-->
+<!DOCTYPE svg PUBLIC "-//W3C//DTD SVG 1.0//EN"
+"http://www.w3.org/TR/2001/REC-SVG-20010904/DTD/svg10.dtd">
+<svg xmlns="http://www.w3.org/2000/svg" version="1.0"
+   width="300" height="150"
+   viewBox="0 0 1500 1000"
+>
+  <desc>Ellipse</desc>
+  <rect x="1" y="1" width="1495" height="995"
+      fill="none" stroke="blue" stroke-width="5"/>
+  <ellipse transform="translate(200 200) rotate(-45)" 
+      rx="200" ry="100"
+      fill="none" stroke="red" stroke-width="20"/>
+</svg>

Added: incubator/hcatalog/trunk/src/docs/src/documentation/sitemap.xmap
URL: http://svn.apache.org/viewvc/incubator/hcatalog/trunk/src/docs/src/documentation/sitemap.xmap?rev=1091509&view=auto
==============================================================================
--- incubator/hcatalog/trunk/src/docs/src/documentation/sitemap.xmap (added)
+++ incubator/hcatalog/trunk/src/docs/src/documentation/sitemap.xmap Tue Apr 12 17:30:08 2011
@@ -0,0 +1,66 @@
+<?xml version="1.0"?>
+<!--
+  Licensed to the Apache Software Foundation (ASF) under one or more
+  contributor license agreements.  See the NOTICE file distributed with
+  this work for additional information regarding copyright ownership.
+  The ASF licenses this file to You under the Apache License, Version 2.0
+  (the "License"); you may not use this file except in compliance with
+  the License.  You may obtain a copy of the License at
+
+      http://www.apache.org/licenses/LICENSE-2.0
+
+  Unless required by applicable law or agreed to in writing, software
+  distributed under the License is distributed on an "AS IS" BASIS,
+  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+  See the License for the specific language governing permissions and
+  limitations under the License.
+-->
+<map:sitemap xmlns:map="http://apache.org/cocoon/sitemap/1.0">
+  <map:components>
+    <map:actions>
+      <map:action logger="sitemap.action.sourcetype" name="sourcetype" src="org.apache.forrest.sourcetype.SourceTypeAction">
+        <sourcetype name="hello-v1.0">
+          <document-declaration public-id="-//Acme//DTD Hello Document V1.0//EN" />
+        </sourcetype>
+      </map:action>
+    </map:actions>
+    <map:selectors default="parameter">
+      <map:selector logger="sitemap.selector.parameter" name="parameter" src="org.apache.cocoon.selection.ParameterSelector" />
+    </map:selectors>
+  </map:components>
+  <map:resources>
+    <map:resource name="transform-to-document">
+      <map:act type="sourcetype" src="{src}">
+        <map:select type="parameter">
+          <map:parameter name="parameter-selector-test" value="{sourcetype}" />
+          <map:when test="hello-v1.0">
+            <map:generate src="{properties:content.xdocs}{../../1}.xml" />
+            <map:transform src="{properties:resources.stylesheets}/hello2document.xsl" />
+            <map:serialize type="xml-document"/>
+          </map:when>
+        </map:select>
+      </map:act>
+    </map:resource>
+  </map:resources>
+  <map:pipelines>
+    <map:pipeline>
+      <map:match pattern="old_site/*.html">
+        <map:select type="exists">
+          <map:when test="{properties:content}{1}.html">
+            <map:read src="{properties:content}{1}.html" mime-type="text/html"/>
+<!--
+          Use this instead if you want JTidy to clean up your HTML
+          <map:generate type="html" src="{properties:content}/{0}" />
+          <map:serialize type="html"/>
+        -->
+          </map:when>
+        </map:select>
+      </map:match>
+      <map:match pattern="**.xml">
+        <map:call resource="transform-to-document">
+          <map:parameter name="src" value="{properties:content.xdocs}{1}.xml" />
+        </map:call>
+      </map:match>
+    </map:pipeline>
+  </map:pipelines>
+</map:sitemap>

Added: incubator/hcatalog/trunk/src/docs/src/documentation/skinconf.xml
URL: http://svn.apache.org/viewvc/incubator/hcatalog/trunk/src/docs/src/documentation/skinconf.xml?rev=1091509&view=auto
==============================================================================
--- incubator/hcatalog/trunk/src/docs/src/documentation/skinconf.xml (added)
+++ incubator/hcatalog/trunk/src/docs/src/documentation/skinconf.xml Tue Apr 12 17:30:08 2011
@@ -0,0 +1,433 @@
+<?xml version="1.0"?>
+<!--
+  Licensed to the Apache Software Foundation (ASF) under one or more
+  contributor license agreements.  See the NOTICE file distributed with
+  this work for additional information regarding copyright ownership.
+  The ASF licenses this file to You under the Apache License, Version 2.0
+  (the "License"); you may not use this file except in compliance with
+  the License.  You may obtain a copy of the License at
+
+      http://www.apache.org/licenses/LICENSE-2.0
+
+  Unless required by applicable law or agreed to in writing, software
+  distributed under the License is distributed on an "AS IS" BASIS,
+  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+  See the License for the specific language governing permissions and
+  limitations under the License.
+-->
+<!--
+Skin configuration file. This file contains details of your project,
+which will be used to configure the chosen Forrest skin.
+-->
+
+
+<!DOCTYPE skinconfig PUBLIC "-//APACHE//DTD Skin Configuration V0.8-1//EN" "http://forrest.apache.org/dtd/skinconfig-v08-1.dtd">
+
+<skinconfig>
+<!-- To enable lucene search add provider="lucene" (default is google).
+    Add box-location="alt" to move the search box to an alternate location
+    (if the skin supports it) and box-location="all" to show it in all
+    available locations on the page.  Remove the <search> element to show
+    no search box. @domain will enable sitesearch for the specific domain with google.
+    In other words google will search the @domain for the query string.
+  -->
+
+<!-- Disable the print link? If enabled, invalid HTML 4.0.1 -->
+  <disable-print-link>true</disable-print-link>
+<!-- Disable the PDF link? -->
+  <disable-pdf-link>false</disable-pdf-link>
+<!-- Disable the POD link? -->
+  <disable-pod-link>true</disable-pod-link>
+<!-- Disable the Text link? FIXME: NOT YET IMPLEMENTED. -->
+  <disable-txt-link>true</disable-txt-link>
+<!-- Disable the xml source link? -->
+<!-- The xml source link makes it possible to access the xml rendition
+    of the source from the html page, and to have it generated statically.
+    This can be used to enable other sites and services to reuse the
+    xml format for their uses. Keep this disabled if you don't want other
+    sites to easily reuse your pages.-->
+  <disable-xml-link>true</disable-xml-link>
+<!-- Disable navigation icons on all external links? -->
+  <disable-external-link-image>true</disable-external-link-image>
+<!-- Disable w3c compliance links? 
+    Use e.g. align="center" to move the compliance links logos to 
+    an alternate location default is left.
+    (if the skin supports it) -->
+  <disable-compliance-links>true</disable-compliance-links>
+<!-- Render mailto: links unrecognisable by spam harvesters? -->
+  <obfuscate-mail-links>true</obfuscate-mail-links>
+  <obfuscate-mail-value>.at.</obfuscate-mail-value>
+<!-- Disable the javascript facility to change the font size -->
+  <disable-font-script>true</disable-font-script>
+
+<!-- mandatory project logo
+       default skin: renders it at the top -->
+  <project-name>HCatalog</project-name>
+  <project-description>A table abstraction on top of data for use with Java MapReduce programs, Pig scripts, and Hive queries.</project-description>
+  <project-url></project-url>
+  <project-logo>images/hcat-box.jpg</project-logo>
+
+<!-- Alternative static image:
+  <project-logo>images/project-logo.gif</project-logo> -->
+
+<!-- optional group logo
+       default skin: renders it at the top-left corner -->
+  <group-name>HCatalog</group-name>
+  <group-description></group-description>
+  <group-url></group-url>
+  <group-logo>images/hcat.jpg</group-logo>
+
+<!-- Alternative static image:
+  <group-logo>images/group-logo.gif</group-logo> -->
+
+<!-- optional host logo (e.g. sourceforge logo)
+       default skin: renders it at the bottom-left corner -->
+  <host-url></host-url>
+  <host-logo></host-logo>
+<!-- relative url of a favicon file, normally favicon.ico -->
+  <favicon-url></favicon-url>
+  
+  
+<!-- The following are used to construct a copyright statement -->
+  <disable-copyright-footer>false</disable-copyright-footer>
+
+<!-- @inception enable automatic generation of a date-range to current date -->
+  <year inception="true">2011</year>
+  <vendor>Yahoo! Inc.</vendor>
+ 
+    
+<!-- The optional copyright-link URL will be used as a link in the
+    copyright statement 
+  <copyright-link>http://www.example.org/</copyright-link>
+  -->
+
+  
+<!-- Some skins use this to form a 'breadcrumb trail' of links.
+    Use location="alt" to move the trail to an alternate location
+    (if the skin supports it).
+    Omit the location attribute to display the trail in the default location.
+    Use location="none" to not display the trail (if the skin supports it).
+    For some skins just set the attributes to blank.
+    
+    NOTE: If a breadcrumb entry points at a local file the href must
+    be complete, that is it must point to the file itself, not to a 
+    directory.
+  -->
+
+<!-- Configure the TOC, i.e. the Table of Contents.
+  @max-depth
+   how many "section" levels need to be included in the
+   generated Table of Contents (TOC). 
+  @min-sections
+   Minimum required to create a TOC.
+  @location ("page","menu","page,menu", "none")
+   Where to show the TOC.
+  -->
+  <toc max-depth="1" min-sections="1" location="page"/>
+
+<!-- Heading types can be clean|underlined|boxed  
+-->
+  <headings type="clean"/>
+
+<!-- The optional feedback element will be used to construct a
+    feedback link in the footer with the page pathname appended:
+    <a href="@href">{@to}</a>
+
+  <feedback to="webmaster@"
+    href="mailto:webmaster@?subject=Feedback&#160;" >
+    Send feedback about the website to:
+  </feedback>
+ -->
+ 
+<!-- Optional message of the day (MOTD).
+    Note: This is only implemented in the pelt skin.
+    Note: Beware issue FOR-677 if you use an absolute path uri.
+    If the optional <motd> element is used, then messages will be appended
+    depending on the URI string pattern.
+    motd-option : Each option will match a pattern and apply its text.
+      The "pattern" attribute specifies the pattern to be matched.
+      This can be a specific page, or a general pattern to match a set of pages,
+      e.g. everything in the "samples" directory.
+      The @starts-with=true anchors the string to the start, otherwise contains 
+    motd-title : This text will be added in brackets after the <html><title>
+      and this can be empty.
+    motd-page : This text will be added in a panel on the face of the page,
+      with the "motd-page-url" being the hyperlink "More".
+    Values for the "location" attribute are:
+      page : on the face of the page, e.g. in the spare space of the toc
+      alt : at the bottom of the left-hand navigation panel
+      both : both
+    -->
+<!--
+  <motd>
+    <motd-option pattern="samples/sample.html">
+      <motd-title>sample</motd-title>
+      <motd-page location="both">
+        This is an example of a Message of the day (MOTD).
+      </motd-page>
+      <motd-page-url>faq.html</motd-page-url>
+    </motd-option>
+    <motd-option pattern="samples/faq.html">
+      <motd-page location="page">
+        How to enable this MOTD is on this page.
+      </motd-page>
+      <motd-page-url>http://forrest.apache.org/docs/faq.html</motd-page-url>
+    </motd-option>
+  </motd>
+-->
+<!--
+    extra-css - here you can define custom CSS rules that either
+    A) override the fallback styles or
+    B) add style definitions for new classes that you may have
+       used in your documentation.
+    -->
+  
+  <extra-css>
+<!--Example of case B:
+        Define the style of a new class that you may have used
+        in the class attribute of a <p> node,
+        e.g. <p class="quote"/>
+    -->
+    p.quote {
+      margin-left: 2em;
+      padding: .5em;
+      background-color: #f0f0f0;
+      font-family: monospace;
+    }
+    <!--Example:
+        To override the colours of links only in the footer.
+    -->
+    #footer a { color: #0F3660; }
+    #footer a:visited { color: #009999; }
+    
+<!--Headers and Code -->
+
+    #content h1 {
+      margin-bottom: .5em;
+      font-size: 200%; color: black;
+      font-family: arial;
+    }
+    h2, .h3 { font-size: 195%; color: black; font-family: arial; }
+    h3, .h4 { font-size: 140%; color: black; font-family: arial; margin-bottom: 0.5em; }
+    h4, .h5 { font-size: 125%; color: black; font-style: italic; font-weight: bold; font-family: arial; }
+    h5, h6 { font-size: 110%; color: #363636; font-weight: bold; }
+   
+    pre.code {
+      margin-left: 0em;
+      padding: 0.5em;
+      background-color: rgb(241,239,231);
+      font-family: monospace;
+    }   
+    
+  </extra-css>
+  
+  
+  <colors>
+<!-- These values are used for the generated CSS files.
+    They essentially "override" the default colors defined in the chosen skin.
+    There are four duplicate "groups" of colors below, denoted by comments:
+      Color group: Forrest, Krysalis, Collabnet, and Lenya using Pelt.
+    They are provided for example only. To customize the colors of any skin,
+    uncomment one of these groups of color elements and change the values
+    of the particular color elements that you wish to change.
+    Note that by default, all color groups are commented-out which means that
+    the default colors provided by the skin are being used.
+  -->
+  
+<!-- Color group: Forrest 
+
+    Example colors similar to forrest.apache.org
+    Some of the element names are obscure, so comments are added to show how
+    the "pelt" skin uses them, other skins might use these elements in a different way.
+    Tip: temporarily change the value of an element to red (#ff0000) and see the effect.
+     pelt: breadtrail: the strip at the top of the page and the second strip under the tabs
+     pelt: header: top strip containing project and group logos
+     pelt: heading|subheading: section headings within the content
+     pelt: navstrip: the strip under the tabs which contains the published date
+     pelt: menu: the left-hand navigation panel
+     pelt: toolbox: the selected menu item
+     pelt: searchbox: the background of the searchbox
+     pelt: border: line border around selected menu item
+     pelt: body: any remaining parts, e.g. the bottom of the page
+     pelt: footer: the second from bottom strip containing credit logos and published date
+     pelt: feedback: the optional bottom strip containing feedback link
+  -->
+<!--
+    <color name="breadtrail" value="#cedfef" font="#0F3660" link="#0F3660" vlink="#0F3660" hlink="#000066"/>
+    <color name="header" value="#294563"/>
+    <color name="tab-selected" value="#4a6d8c" link="#0F3660" vlink="#0F3660" hlink="#000066"/>
+    <color name="tab-unselected" value="#b5c7e7" link="#0F3660" vlink="#0F3660" hlink="#000066"/>
+    <color name="subtab-selected" value="#4a6d8c" link="#0F3660" vlink="#0F3660" hlink="#000066"/>
+    <color name="subtab-unselected" value="#4a6d8c" link="#0F3660" vlink="#0F3660" hlink="#000066"/>
+    <color name="heading" value="#294563"/>
+    <color name="subheading" value="#4a6d8c"/>
+    <color name="published" value="#4C6C8F" font="#FFFFFF"/>
+    <color name="feedback" value="#4C6C8F" font="#FFFFFF" align="center"/>
+    <color name="navstrip" value="#4a6d8c" font="#ffffff" link="#0F3660" vlink="#0F3660" hlink="#000066"/>
+    <color name="menu" value="#4a6d8c" font="#cedfef" link="#ffffff" vlink="#ffffff" hlink="#ffcf00"/>    
+    <color name="toolbox" value="#4a6d8c"/>
+    <color name="border" value="#294563"/>
+    <color name="dialog" value="#4a6d8c"/>
+    <color name="searchbox" value="#4a6d8c" font="#000000"/>
+    <color name="body" value="#ffffff" link="#0F3660" vlink="#009999" hlink="#000066"/>
+    <color name="table" value="#7099C5"/>    
+    <color name="table-cell" value="#f0f0ff"/>    
+    <color name="highlight" value="#ffff00"/>
+    <color name="fixme" value="#cc6600"/>
+    <color name="note" value="#006699"/>
+    <color name="warning" value="#990000"/>
+    <color name="code" value="#CFDCED"/>
+    <color name="footer" value="#cedfef"/>
+-->
+
+
+<!-- Color group: Krysalis -->
+<!--
+    <color name="header"    value="#FFFFFF"/>
+
+    <color name="tab-selected" value="#a5b6c6" link="#000000" vlink="#000000" hlink="#000000"/>
+    <color name="tab-unselected" value="#F7F7F7"  link="#000000" vlink="#000000" hlink="#000000"/>
+    <color name="subtab-selected" value="#a5b6c6"  link="#000000" vlink="#000000" hlink="#000000"/>
+    <color name="subtab-unselected" value="#a5b6c6"  link="#000000" vlink="#000000" hlink="#000000"/>
+
+    <color name="heading" value="#a5b6c6"/>
+    <color name="subheading" value="#CFDCED"/>
+        
+    <color name="navstrip" value="#CFDCED" font="#000000" link="#000000" vlink="#000000" hlink="#000000"/>
+    <color name="toolbox" value="#a5b6c6"/>
+    <color name="border" value="#a5b6c6"/>
+        
+    <color name="menu" value="#F7F7F7" link="#000000" vlink="#000000" hlink="#000000"/>    
+    <color name="dialog" value="#F7F7F7"/>
+            
+    <color name="body"    value="#ffffff" link="#0F3660" vlink="#009999" hlink="#000066"/>
+    
+    <color name="table" value="#a5b6c6"/>    
+    <color name="table-cell" value="#ffffff"/>    
+    <color name="highlight" value="#ffff00"/>
+    <color name="fixme" value="#cc6600"/>
+    <color name="note" value="#006699"/>
+    <color name="warning" value="#990000"/>
+    <color name="code" value="#a5b6c6"/>
+        
+    <color name="footer" value="#a5b6c6"/>
+-->
+
+
+<!-- Color group: Collabnet -->
+<!--
+    <color name="header"    value="#003366"/>
+
+    <color name="tab-selected" value="#dddddd" link="#555555" vlink="#555555" hlink="#555555"/>
+    <color name="tab-unselected" value="#999999" link="#ffffff" vlink="#ffffff" hlink="#ffffff"/>
+    <color name="subtab-selected" value="#cccccc" link="#000000" vlink="#000000" hlink="#000000"/>
+    <color name="subtab-unselected" value="#cccccc" link="#555555" vlink="#555555" hlink="#555555"/>
+
+    <color name="heading" value="#003366"/>
+    <color name="subheading" value="#888888"/>
+    
+    <color name="navstrip" value="#dddddd" font="#555555"/>
+    <color name="toolbox" value="#dddddd" font="#555555"/>
+    <color name="border" value="#999999"/>
+    
+    <color name="menu" value="#ffffff"/>    
+    <color name="dialog" value="#eeeeee"/>
+            
+    <color name="body"      value="#ffffff"/>
+    
+    <color name="table" value="#ccc"/>    
+    <color name="table-cell" value="#ffffff"/>   
+    <color name="highlight" value="#ffff00"/>
+    <color name="fixme" value="#cc6600"/>
+    <color name="note" value="#006699"/>
+    <color name="warning" value="#990000"/>
+    <color name="code" value="#003366"/>
+        
+    <color name="footer" value="#ffffff"/>
+-->
+
+
+<!-- Color group: Lenya using pelt-->
+<!--
+    <color name="header" value="#ffffff"/>
+
+    <color name="tab-selected" value="#E5E4D9" link="#000000" vlink="#000000" hlink="#000000"/>
+    <color name="tab-unselected" value="#F5F4E9" link="#000000" vlink="#000000" hlink="#000000"/>
+    <color name="subtab-selected" value="#000000" link="#000000" vlink="#000000" hlink="#000000"/>
+    <color name="subtab-unselected" value="#E5E4D9" link="#000000" vlink="#000000" hlink="#000000"/>
+
+    <color name="heading" value="#E5E4D9"/>
+    <color name="subheading" value="#000000"/>
+    <color name="published" value="#000000"/>
+    <color name="navstrip" value="#E5E4D9" font="#000000"/>
+    <color name="toolbox" value="#CFDCED" font="#000000"/>
+    <color name="border" value="#999999"/>
+
+    <color name="menu" value="#E5E4D9" font="#000000" link="#000000" vlink="#000000" hlink="#000000"/>
+    <color name="dialog" value="#CFDCED"/>
+    <color name="body" value="#ffffff" />
+
+    <color name="table" value="#ccc"/>
+    <color name="table-cell" value="#ffffff"/>
+    <color name="highlight" value="#ffff00"/>
+    <color name="fixme" value="#cc6600"/>
+    <color name="note" value="#006699"/>
+    <color name="warning" value="#990000"/>
+    <color name="code" value="#003366"/>
+
+    <color name="footer" value="#E5E4D9"/>
+-->
+  </colors>
+  
+  
+<!-- Settings specific to PDF output. -->
+  <pdf>
+<!-- 
+       Supported page sizes are a0, a1, a2, a3, a4, a5, executive,
+       folio, legal, ledger, letter, quarto, tabloid (default letter).
+       Supported page orientations are portrait, landscape (default
+       portrait).
+       Supported text alignments are left, right, justify (default left).
+    -->
+    <page size="letter" orientation="portrait" text-align="left"/>
+<!--
+       Pattern of the page numbering in the footer - default is "Page x".
+       The first occurrence of the digit '1' represents the current page number,
+       the second occurrence represents the total page count,
+       and anything else is considered the static part of the numbering pattern.
+       Examples (x is the current page number, y the total page count):
+       <page-numbering-format>none</page-numbering-format> Does not display the page numbering
+       <page-numbering-format>1</page-numbering-format> Displays "x"
+       <page-numbering-format>p1.</page-numbering-format> Displays "px."
+       <page-numbering-format>Page 1/1</page-numbering-format> Displays "Page x/y"
+       <page-numbering-format>(1-1)</page-numbering-format> Displays "(x-y)"
+    -->
+    <page-numbering-format>Page 1</page-numbering-format>
+<!--
+       Margins can be specified for top, bottom, inner, and outer
+       edges. If double-sided="false", the inner edge is always left
+       and the outer is always right. If double-sided="true", the
+       inner edge will be left on odd pages, right on even pages,
+       the outer edge vice versa.
+       Specified below are the default settings.
+    -->
+    <margins double-sided="false">
+      <top>1in</top>
+      <bottom>1in</bottom>
+      <inner>1.25in</inner>
+      <outer>1in</outer>
+    </margins>
+<!--
+      Print the URL text next to all links going outside the file
+    -->
+    <show-external-urls>false</show-external-urls>
+<!--
+      Disable the copyright footer on each page of the PDF.
+      A footer is composed for each page. By default, a "credit" with role=pdf
+      will be used, as explained below. Otherwise a copyright statement
+      will be generated. The latter can be disabled.
+    -->
+    <disable-copyright-footer>false</disable-copyright-footer>
+  </pdf>
+
+</skinconfig>

Added: incubator/hcatalog/trunk/src/java/org/apache/hcatalog/cli/HCatCli.java
URL: http://svn.apache.org/viewvc/incubator/hcatalog/trunk/src/java/org/apache/hcatalog/cli/HCatCli.java?rev=1091509&view=auto
==============================================================================
--- incubator/hcatalog/trunk/src/java/org/apache/hcatalog/cli/HCatCli.java (added)
+++ incubator/hcatalog/trunk/src/java/org/apache/hcatalog/cli/HCatCli.java Tue Apr 12 17:30:08 2011
@@ -0,0 +1,300 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.hcatalog.cli;
+
+import java.io.BufferedReader;
+import java.io.FileNotFoundException;
+import java.io.FileReader;
+import java.io.IOException;
+import java.io.OutputStream;
+import java.io.PrintStream;
+import java.io.PrintWriter;
+import java.io.UnsupportedEncodingException;
+import java.util.ArrayList;
+
+import org.apache.commons.cli.CommandLine;
+import org.apache.commons.cli.GnuParser;
+import org.apache.commons.cli.HelpFormatter;
+import org.apache.commons.cli.Option;
+import org.apache.commons.cli.OptionBuilder;
+import org.apache.commons.cli.Options;
+import org.apache.commons.cli.ParseException;
+import org.apache.commons.cli.Parser;
+import org.apache.commons.lang.StringUtils;
+import org.apache.hadoop.fs.permission.FsPermission;
+import org.apache.hadoop.hive.cli.CliSessionState;
+import org.apache.hadoop.hive.conf.HiveConf;
+import org.apache.hadoop.hive.conf.HiveConf.ConfVars;
+import org.apache.hadoop.hive.ql.Driver;
+import org.apache.hadoop.hive.ql.processors.SetProcessor;
+import org.apache.hadoop.hive.ql.session.SessionState;
+import org.apache.hcatalog.cli.SemanticAnalysis.HCatSemanticAnalyzer;
+import org.apache.hcatalog.common.HCatConstants;
+import org.apache.hcatalog.common.HCatUtil;
+
+public class HCatCli {
+
+  @SuppressWarnings("static-access")
+  public static void main(String[] args) {
+
+    SessionState.initHiveLog4j();
+
+    CliSessionState ss = new CliSessionState(new HiveConf(SessionState.class));
+    ss.in = System.in;
+    try {
+      ss.out = new PrintStream(System.out, true, "UTF-8");
+      ss.err = new PrintStream(System.err, true, "UTF-8");
+    } catch (UnsupportedEncodingException e) {
+      System.exit(1);
+    }
+
+    HiveConf conf = ss.getConf();
+
+    HiveConf.setVar(conf, ConfVars.SEMANTIC_ANALYZER_HOOK, HCatSemanticAnalyzer.class.getName());
+
+    SessionState.start(ss);
+
+    Options options = new Options();
+
+    // -e 'quoted-query-string'
+    options.addOption(OptionBuilder
+        .hasArg()
+        .withArgName("exec")
+        .withDescription("hcat command given from command line")
+        .create('e'));
+
+    // -f <query-file>
+    options.addOption(OptionBuilder
+        .hasArg()
+        .withArgName("file")
+        .withDescription("hcat commands in file")
+        .create('f'));
+
+    // -g
+    options.addOption(OptionBuilder
+        .hasArg()
+        .withArgName("group")
+        .withDescription("group for the db/table specified in CREATE statement")
+        .create('g'));
+
+    // -p
+    options.addOption(OptionBuilder
+        .hasArg()
+        .withArgName("perms")
+        .withDescription("permissions for the db/table specified in CREATE statement")
+        .create('p'));
+
+    // [-h|--help]
+    options.addOption(new Option("h", "help", false, "Print help information"));
+
+    Parser parser = new GnuParser();
+    CommandLine cmdLine = null;
+
+    try {
+      cmdLine  = parser.parse(options,args);
+
+    } catch (ParseException e) {
+      printUsage(options, ss.err);
+      System.exit(1);
+    }
+    // -e
+    String execString = (String) cmdLine.getOptionValue('e');
+    // -f
+    String fileName = (String) cmdLine.getOptionValue('f');
+    // -h
+    if (cmdLine.hasOption('h')) {
+      printUsage(options,ss.out);
+      System.exit(0);
+    }
+
+    if (execString != null && fileName != null) {
+      System.err.println("The '-e' and '-f' options cannot be specified simultaneously");
+      printUsage(options,ss.err);
+      System.exit(1);
+    }
+
+    // -p
+    String perms = (String)cmdLine.getOptionValue('p');
+    if(perms != null){
+      validatePermissions(ss, conf, perms);
+    }
+
+    // -g
+    String grp = (String) cmdLine.getOptionValue('g');
+    if(grp != null){
+      conf.set(HCatConstants.HCAT_GROUP, grp);
+    }
+
+    if (execString != null) {
+      System.exit(processLine(execString));
+    }
+
+    try {
+      if (fileName != null) {
+        System.exit(processFile(fileName));
+      }
+    } catch (FileNotFoundException e) {
+      ss.err.println("Input file not found. (" + e.getMessage() + ")");
+      System.exit(1);
+    } catch (IOException e) {
+      ss.err.println("Could not open input file for reading. (" + e.getMessage() + ")");
+      System.exit(1);
+    }
+
+    // -h
+    printUsage(options, ss.err);
+    System.exit(1);
+  }
+
+  private static int processLine(String line) {
+    int ret = 0;
+
+    String command = "";
+    for (String oneCmd : line.split(";")) {
+
+      if (StringUtils.endsWith(oneCmd, "\\")) {
+        command += StringUtils.chop(oneCmd) + ";";
+        continue;
+      } else {
+        command += oneCmd;
+      }
+      if (StringUtils.isBlank(command)) {
+        continue;
+      }
+
+      ret = processCmd(command);
+      command = "";
+    }
+    return ret;
+  }
+
+  private static int processFile(String fileName) throws IOException {
+    FileReader fileReader = null;
+    BufferedReader reader = null;
+    try {
+      fileReader = new FileReader(fileName);
+      reader = new BufferedReader(fileReader);
+      String line;
+      StringBuilder qsb = new StringBuilder();
+
+      while ((line = reader.readLine()) != null) {
+        qsb.append(line).append('\n');
+      }
+
+      return processLine(qsb.toString());
+    } finally {
+      if (fileReader != null) {
+        fileReader.close();
+      }
+      if(reader != null) {
+        reader.close();
+      }
+    }
+  }
+
+  private static int processCmd(String cmd){
+
+    SessionState ss = SessionState.get();
+    long start = System.currentTimeMillis();
+
+    cmd = cmd.trim();
+    String firstToken = cmd.split("\\s+")[0].trim();
+
+    if(firstToken.equalsIgnoreCase("set")){
+      return new SetProcessor().run(cmd.substring(firstToken.length()).trim()).getResponseCode();
+    }
+
+    Driver driver = new HCatDriver();
+
+    int ret = driver.run(cmd).getResponseCode();
+
+    if (ret != 0) {
+      driver.close();
+      System.exit(ret);
+    }
+
+    ArrayList<String> res = new ArrayList<String>();
+    try {
+      while (driver.getResults(res)) {
+        for (String r : res) {
+          ss.out.println(r);
+        }
+        res.clear();
+      }
+    } catch (IOException e) {
+      ss.err.println("Failed with exception " + e.getClass().getName() + ":"
+          + e.getMessage() + "\n" + org.apache.hadoop.util.StringUtils.stringifyException(e));
+      ret = 1;
+    }
+
+    int cret = driver.close();
+    if (ret == 0) {
+      ret = cret;
+    }
+
+    long end = System.currentTimeMillis();
+    if (end > start) {
+      double timeTaken = (end - start) / 1000.0;
+      ss.err.println("Time taken: " + timeTaken + " seconds");
+    }
+    return ret;
+  }
+
+  private static void printUsage(Options options, OutputStream os) {
+    PrintWriter pw = new PrintWriter(os);
+    new HelpFormatter().printHelp(pw, 2 * HelpFormatter.DEFAULT_WIDTH,
+        "hcat { -e \"<query>\" | -f \"<filepath>\" } [ -g \"<group>\" ] [ -p \"<perms>\" ]",
+        null,options, HelpFormatter.DEFAULT_LEFT_PAD,HelpFormatter.DEFAULT_DESC_PAD,
+        null, false);
+    pw.flush();
+  }
+
+  private static void validatePermissions(CliSessionState ss, HiveConf conf, String perms) {
+    perms = perms.trim();
+    FsPermission fp = null;
+
+    if (perms.matches("^\\s*([rwx-]{9})\\s*$")){
+      fp = FsPermission.valueOf("d"+perms);
+    } else if (perms.matches("^\\s*([0-7]{3})\\s*$")){
+      fp = new FsPermission(Short.decode("0"+perms));
+    } else {
+      ss.err.println("Invalid permission specification: "+perms);
+      System.exit(1);
+    }
+
+    if (!HCatUtil.validateMorePermissive(fp.getUserAction(),fp.getGroupAction())){
+      ss.err.println("Invalid permission specification: "+perms+" : user permissions must be more permissive than group permissions");
+      System.exit(1);
+    }
+    if (!HCatUtil.validateMorePermissive(fp.getGroupAction(),fp.getOtherAction())){
+      ss.err.println("Invalid permission specification: "+perms+" : group permissions must be more permissive than other permissions");
+      System.exit(1);
+    }
+    if ( (!HCatUtil.validateExecuteBitPresentIfReadOrWrite(fp.getUserAction())) ||
+        (!HCatUtil.validateExecuteBitPresentIfReadOrWrite(fp.getGroupAction())) ||
+        (!HCatUtil.validateExecuteBitPresentIfReadOrWrite(fp.getOtherAction())) ) {
+      ss.err.println("Invalid permission specification: "+perms+" : execute permission must be set if read or write permission is specified");
+      System.exit(1);
+    }
+
+    conf.set(HCatConstants.HCAT_PERMS, "d"+fp.toString());
+
+  }
+
+
+}
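The permission rules enforced by validatePermissions above (user permissions at least as permissive as group, group at least as permissive as other, and execute implied whenever read or write is set) can be sketched as a standalone re-implementation. This is a hypothetical illustration using plain ints rather than the Hadoop FsPermission/HCatUtil API; the class name `PermRules` is invented, and the "more permissive" check is assumed to be a bit-superset test.

```java
// Hypothetical standalone sketch of the symbolic-permission rules checked by
// HCatCli.validatePermissions. Not the Hadoop API; r=4, w=2, x=1 per class.
public class PermRules {

    // Parse a 9-char symbolic string such as "rwxr-x---" into
    // {user, group, other} 3-bit values. Returns null if malformed.
    static int[] parse(String perms) {
        if (perms == null || perms.length() != 9) {
            return null;
        }
        int[] bits = new int[3];
        for (int i = 0; i < 9; i++) {
            char c = perms.charAt(i);
            char expected = "rwx".charAt(i % 3);  // positions cycle r, w, x
            if (c == expected) {
                bits[i / 3] |= 4 >> (i % 3);
            } else if (c != '-') {
                return null;  // wrong letter or stray character
            }
        }
        return bits;
    }

    // "More permissive" assumed to mean: every bit granted to the weaker
    // class is also granted to the stronger one (a superset check).
    static boolean morePermissive(int stronger, int weaker) {
        return (stronger | weaker) == stronger;
    }

    // If read (4) or write (2) is set, execute (1) must also be set.
    static boolean executeIfReadOrWrite(int bits) {
        return (bits & 6) == 0 || (bits & 1) == 1;
    }

    static boolean valid(String perms) {
        int[] b = parse(perms);
        if (b == null) {
            return false;
        }
        return morePermissive(b[0], b[1]) && morePermissive(b[1], b[2])
            && executeIfReadOrWrite(b[0]) && executeIfReadOrWrite(b[1])
            && executeIfReadOrWrite(b[2]);
    }

    public static void main(String[] args) {
        System.out.println(valid("rwxr-x---"));  // true: user ⊇ group ⊇ other
        System.out.println(valid("rw-r-----"));  // false: read/write without execute
    }
}
```

The real code rejects an invalid string by printing an error and exiting; this sketch just returns false so the rules can be exercised in isolation.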

Added: incubator/hcatalog/trunk/src/java/org/apache/hcatalog/cli/HCatDriver.java
URL: http://svn.apache.org/viewvc/incubator/hcatalog/trunk/src/java/org/apache/hcatalog/cli/HCatDriver.java?rev=1091509&view=auto
==============================================================================
--- incubator/hcatalog/trunk/src/java/org/apache/hcatalog/cli/HCatDriver.java (added)
+++ incubator/hcatalog/trunk/src/java/org/apache/hcatalog/cli/HCatDriver.java Tue Apr 12 17:30:08 2011
@@ -0,0 +1,129 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.hcatalog.cli;
+
+import org.apache.hadoop.conf.Configuration;
+import org.apache.hadoop.fs.FileSystem;
+import org.apache.hadoop.fs.Path;
+import org.apache.hadoop.fs.permission.FsPermission;
+import org.apache.hadoop.hive.metastore.MetaStoreUtils;
+import org.apache.hadoop.hive.metastore.Warehouse;
+import org.apache.hadoop.hive.ql.Driver;
+import org.apache.hadoop.hive.ql.metadata.Hive;
+import org.apache.hadoop.hive.ql.metadata.HiveException;
+import org.apache.hadoop.hive.ql.metadata.Table;
+import org.apache.hadoop.hive.ql.processors.CommandProcessorResponse;
+import org.apache.hadoop.hive.ql.session.SessionState;
+import org.apache.hcatalog.common.HCatConstants;
+
+public class HCatDriver extends Driver {
+
+  @Override
+  public CommandProcessorResponse run(String command) {
+
+    int ret = super.run(command).getResponseCode();
+
+    SessionState ss = SessionState.get();
+
+    if (ret == 0){
+      // Only attempt to do this, if cmd was successful.
+      ret = setFSPermsNGrp(ss);
+    }
+    // reset conf vars
+    ss.getConf().set(HCatConstants.HCAT_CREATE_DB_NAME, "");
+    ss.getConf().set(HCatConstants.HCAT_CREATE_TBL_NAME, "");
+
+    return new CommandProcessorResponse(ret);
+  }
+
+  private int setFSPermsNGrp(SessionState ss) {
+
+    Configuration conf =ss.getConf();
+
+    String tblName = conf.get(HCatConstants.HCAT_CREATE_TBL_NAME,"");
+    String dbName = conf.get(HCatConstants.HCAT_CREATE_DB_NAME, "");
+    String grp = conf.get(HCatConstants.HCAT_GROUP,null);
+    String permsStr = conf.get(HCatConstants.HCAT_PERMS,null);
+
+    if(tblName.isEmpty() && dbName.isEmpty()){
+      // it wasn't a create db/table command
+      return 0;
+    }
+
+    if(null == grp && null == permsStr) {
+      // neither a group nor perms were specified.
+      return 0;
+    }
+
+    // permsStr may be null when only a group was specified.
+    FsPermission perms = (permsStr == null) ? null : FsPermission.valueOf(permsStr);
+    if(!tblName.isEmpty()){
+      Hive db = null;
+      try{
+        db = Hive.get();
+        Table tbl =  db.getTable(tblName);
+        Path tblPath = tbl.getPath();
+
+        FileSystem fs = tblPath.getFileSystem(conf);
+        if(null != perms){
+          fs.setPermission(tblPath, perms);
+        }
+        if(null != grp){
+          fs.setOwner(tblPath, null, grp);
+        }
+        return 0;
+
+      } catch (Exception e){
+          ss.err.println(String.format("Failed to set permissions/groups on TABLE: <%s> %s",tblName,e.getMessage()));
+          try {  // We need to drop the table.
+            if(null != db){ db.dropTable(tblName); }
+          } catch (HiveException he) {
+            ss.err.println(String.format("Failed to drop TABLE <%s> after failing to set permissions/groups on it. %s",tblName,he.getMessage()));
+          }
+          return 1;
+      }
+    }
+    else{
+      // looks like a db operation
+      if (dbName.isEmpty() || dbName.equals(MetaStoreUtils.DEFAULT_DATABASE_NAME)){
+        // We don't set perms or groups for the default dir.
+        return 0;
+      }
+      else{
+        try{
+          Path dbPath = new Warehouse(conf).getDefaultDatabasePath(dbName);
+          FileSystem fs = dbPath.getFileSystem(conf);
+          if(perms != null){
+            fs.setPermission(dbPath, perms);
+          }
+          if(null != grp){
+            fs.setOwner(dbPath, null, grp);
+          }
+          return 0;
+        } catch (Exception e){
+          ss.err.println(String.format("Failed to set permissions and/or group on DB: <%s> %s", dbName, e.getMessage()));
+          try {
+            Hive.get().dropDatabase(dbName);
+          } catch (Exception e1) {
+            ss.err.println(String.format("Failed to drop DB <%s> after failing to set permissions/group on it. %s", dbName, e1.getMessage()));
+          }
+          return 1;
+        }
+      }
+    }
+  }
+}

Added: incubator/hcatalog/trunk/src/java/org/apache/hcatalog/cli/SemanticAnalysis/AddPartitionHook.java
URL: http://svn.apache.org/viewvc/incubator/hcatalog/trunk/src/java/org/apache/hcatalog/cli/SemanticAnalysis/AddPartitionHook.java?rev=1091509&view=auto
==============================================================================
--- incubator/hcatalog/trunk/src/java/org/apache/hcatalog/cli/SemanticAnalysis/AddPartitionHook.java (added)
+++ incubator/hcatalog/trunk/src/java/org/apache/hcatalog/cli/SemanticAnalysis/AddPartitionHook.java Tue Apr 12 17:30:08 2011
@@ -0,0 +1,83 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.hcatalog.cli.SemanticAnalysis;
+
+import java.util.Map;
+
+import org.apache.hadoop.hive.ql.metadata.HiveException;
+import org.apache.hadoop.hive.ql.parse.ASTNode;
+import org.apache.hadoop.hive.ql.parse.AbstractSemanticAnalyzerHook;
+import org.apache.hadoop.hive.ql.parse.HiveSemanticAnalyzerHookContext;
+import org.apache.hadoop.hive.ql.parse.SemanticException;
+import org.apache.hcatalog.common.HCatConstants;
+
+public class AddPartitionHook extends AbstractSemanticAnalyzerHook{
+
+  private String tblName, inDriver, outDriver;
+
+  @Override
+  public ASTNode preAnalyze(HiveSemanticAnalyzerHookContext context, ASTNode ast)
+      throws SemanticException {
+    Map<String, String> tblProps;
+    tblName = ast.getChild(0).getText();
+    try {
+      tblProps = context.getHive().getTable(tblName).getParameters();
+    } catch (HiveException he) {
+      throw new SemanticException(he);
+    }
+
+    inDriver = tblProps.get(HCatConstants.HCAT_ISD_CLASS);
+    outDriver = tblProps.get(HCatConstants.HCAT_OSD_CLASS);
+
+    if(inDriver == null  || outDriver == null){
+      throw new SemanticException("Operation not supported. Partitions can be added only to a table created through HCatalog. " +
+          "It seems table " + tblName + " was not created through HCatalog.");
+    }
+    return ast;
+  }
+
+//  @Override
+//  public void postAnalyze(HiveSemanticAnalyzerHookContext context,
+//      List<Task<? extends Serializable>> rootTasks) throws SemanticException {
+//
+//    try {
+//      Hive db = context.getHive();
+//      Table tbl = db.getTable(MetaStoreUtils.DEFAULT_DATABASE_NAME, tblName);
+//      for(Task<? extends Serializable> task : rootTasks){
+//        System.err.println("PArt spec: "+((DDLWork)task.getWork()).getAddPartitionDesc().getPartSpec());
+//        Partition part = db.getPartition(tbl,((DDLWork)task.getWork()).getAddPartitionDesc().getPartSpec(),false);
+//        Map<String,String> partParams = part.getParameters();
+//        if(partParams == null){
+//          System.err.println("Part map null ");
+//          partParams = new HashMap<String, String>();
+//        }
+//        partParams.put(InitializeInput.HOWL_ISD_CLASS, inDriver);
+//        partParams.put(InitializeInput.HOWL_OSD_CLASS, outDriver);
+//        part.getTPartition().setParameters(partParams);
+//        db.alterPartition(tblName, part);
+//      }
+//    } catch (HiveException he) {
+//      throw new SemanticException(he);
+//    } catch (InvalidOperationException e) {
+//      throw new SemanticException(e);
+//    }
+//  }
+}
+
+
+

Added: incubator/hcatalog/trunk/src/java/org/apache/hcatalog/cli/SemanticAnalysis/AlterTableFileFormatHook.java
URL: http://svn.apache.org/viewvc/incubator/hcatalog/trunk/src/java/org/apache/hcatalog/cli/SemanticAnalysis/AlterTableFileFormatHook.java?rev=1091509&view=auto
==============================================================================
--- incubator/hcatalog/trunk/src/java/org/apache/hcatalog/cli/SemanticAnalysis/AlterTableFileFormatHook.java (added)
+++ incubator/hcatalog/trunk/src/java/org/apache/hcatalog/cli/SemanticAnalysis/AlterTableFileFormatHook.java Tue Apr 12 17:30:08 2011
@@ -0,0 +1,117 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.hcatalog.cli.SemanticAnalysis;
+
+import java.io.Serializable;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+
+import org.apache.hadoop.hive.metastore.api.InvalidOperationException;
+import org.apache.hadoop.hive.ql.exec.Task;
+import org.apache.hadoop.hive.ql.io.RCFileInputFormat;
+import org.apache.hadoop.hive.ql.io.RCFileOutputFormat;
+import org.apache.hadoop.hive.ql.metadata.Hive;
+import org.apache.hadoop.hive.ql.metadata.HiveException;
+import org.apache.hadoop.hive.ql.metadata.Partition;
+import org.apache.hadoop.hive.ql.metadata.Table;
+import org.apache.hadoop.hive.ql.parse.ASTNode;
+import org.apache.hadoop.hive.ql.parse.AbstractSemanticAnalyzerHook;
+import org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer;
+import org.apache.hadoop.hive.ql.parse.HiveParser;
+import org.apache.hadoop.hive.ql.parse.HiveSemanticAnalyzerHookContext;
+import org.apache.hadoop.hive.ql.parse.SemanticException;
+import org.apache.hadoop.hive.ql.plan.DDLWork;
+import org.apache.hcatalog.common.HCatConstants;
+import org.apache.hcatalog.rcfile.RCFileInputDriver;
+import org.apache.hcatalog.rcfile.RCFileOutputDriver;
+
+public class AlterTableFileFormatHook extends AbstractSemanticAnalyzerHook {
+
+  private String inDriver, outDriver, tableName;
+
+  @Override
+  public ASTNode preAnalyze(HiveSemanticAnalyzerHookContext context, ASTNode ast) throws SemanticException {
+
+    String inputFormat = null, outputFormat = null;
+    tableName = BaseSemanticAnalyzer.unescapeIdentifier(((ASTNode)ast.getChild(0)).getChild(0).getText());
+    ASTNode child =  (ASTNode)((ASTNode)ast.getChild(1)).getChild(0);
+
+    switch (child.getToken().getType()) {
+    case HiveParser.TOK_TABLEFILEFORMAT:
+      inputFormat  = BaseSemanticAnalyzer.unescapeSQLString(((ASTNode) child.getChild(0)).getToken().getText());
+      outputFormat = BaseSemanticAnalyzer.unescapeSQLString(((ASTNode) child.getChild(1)).getToken().getText());
+      inDriver     = BaseSemanticAnalyzer.unescapeSQLString(((ASTNode) child.getChild(2)).getToken().getText());
+      outDriver    = BaseSemanticAnalyzer.unescapeSQLString(((ASTNode) child.getChild(3)).getToken().getText());
+      break;
+
+    case HiveParser.TOK_TBLSEQUENCEFILE:
+      throw new SemanticException("Operation not supported. HCatalog doesn't support Sequence File by default yet. " +
+      "You may specify it through INPUT/OUTPUT storage drivers.");
+
+    case HiveParser.TOK_TBLTEXTFILE:
+      throw new SemanticException("Operation not supported. HCatalog doesn't support Text File by default yet. " +
+      "You may specify it through INPUT/OUTPUT storage drivers.");
+
+    case HiveParser.TOK_TBLRCFILE:
+      inputFormat = RCFileInputFormat.class.getName();
+      outputFormat = RCFileOutputFormat.class.getName();
+      inDriver = RCFileInputDriver.class.getName();
+      outDriver = RCFileOutputDriver.class.getName();
+      break;
+    }
+
+    if(inputFormat == null || outputFormat == null || inDriver == null || outDriver == null){
+      throw new SemanticException("File format specification in the ALTER TABLE FILE FORMAT command is incomplete or incorrect.");
+    }
+    return ast;
+  }
+
+  @Override
+  public void postAnalyze(HiveSemanticAnalyzerHookContext context,
+      List<Task<? extends Serializable>> rootTasks) throws SemanticException {
+
+    Map<String,String> partSpec = ((DDLWork)rootTasks.get(rootTasks.size()-1).getWork()).getAlterTblDesc().getPartSpec();
+    Map<String, String> howlProps = new HashMap<String, String>(2);
+    howlProps.put(HCatConstants.HCAT_ISD_CLASS, inDriver);
+    howlProps.put(HCatConstants.HCAT_OSD_CLASS, outDriver);
+
+    try {
+      Hive db = context.getHive();
+      Table tbl = db.getTable(tableName);
+      if(partSpec == null){
+        // File format is for table; not for partition.
+        tbl.getTTable().getParameters().putAll(howlProps);
+        db.alterTable(tableName, tbl);
+      }else{
+        Partition part = db.getPartition(tbl,partSpec,false);
+        Map<String,String> partParams = part.getParameters();
+        if(partParams == null){
+          partParams = new HashMap<String, String>();
+        }
+        partParams.putAll(howlProps);
+        part.getTPartition().setParameters(partParams);
+        db.alterPartition(tableName, part);
+      }
+    } catch (HiveException he) {
+      throw new SemanticException(he);
+    } catch (InvalidOperationException e) {
+      throw new SemanticException(e);
+    }
+  }
+}
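
Roughly, the statements this hook is written to accept look like the following. The four-clause form is assumed from the TOK_TABLEFILEFORMAT branch above (which reads InputFormat, OutputFormat, InputDriver, OutputDriver as its four children), and the shorthand from the TOK_TBLRCFILE branch; the table name is illustrative:

```sql
-- Hypothetical DDL against the Howl-extended grammar; table name illustrative.
ALTER TABLE mytable SET FILEFORMAT
  INPUTFORMAT  'org.apache.hadoop.hive.ql.io.RCFileInputFormat'
  OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.RCFileOutputFormat'
  INPUTDRIVER  'org.apache.hcatalog.rcfile.RCFileInputDriver'
  OUTPUTDRIVER 'org.apache.hcatalog.rcfile.RCFileOutputDriver';

-- Shorthand handled by the TOK_TBLRCFILE branch:
ALTER TABLE mytable SET FILEFORMAT RCFILE;
```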

Added: incubator/hcatalog/trunk/src/java/org/apache/hcatalog/cli/SemanticAnalysis/CreateDatabaseHook.java
URL: http://svn.apache.org/viewvc/incubator/hcatalog/trunk/src/java/org/apache/hcatalog/cli/SemanticAnalysis/CreateDatabaseHook.java?rev=1091509&view=auto
==============================================================================
--- incubator/hcatalog/trunk/src/java/org/apache/hcatalog/cli/SemanticAnalysis/CreateDatabaseHook.java (added)
+++ incubator/hcatalog/trunk/src/java/org/apache/hcatalog/cli/SemanticAnalysis/CreateDatabaseHook.java Tue Apr 12 17:30:08 2011
@@ -0,0 +1,83 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.hcatalog.cli.SemanticAnalysis;
+
+import java.io.Serializable;
+import java.util.List;
+
+import org.apache.hadoop.hive.ql.exec.Task;
+import org.apache.hadoop.hive.ql.metadata.Hive;
+import org.apache.hadoop.hive.ql.metadata.HiveException;
+import org.apache.hadoop.hive.ql.parse.ASTNode;
+import org.apache.hadoop.hive.ql.parse.AbstractSemanticAnalyzerHook;
+import org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer;
+import org.apache.hadoop.hive.ql.parse.HiveParser;
+import org.apache.hadoop.hive.ql.parse.HiveSemanticAnalyzerHookContext;
+import org.apache.hadoop.hive.ql.parse.SemanticException;
+import org.apache.hcatalog.common.HCatConstants;
+
+final class CreateDatabaseHook  extends AbstractSemanticAnalyzerHook{
+
+  String databaseName;
+
+  @Override
+  public ASTNode preAnalyze(HiveSemanticAnalyzerHookContext context, ASTNode ast)
+  throws SemanticException {
+
+    Hive db;
+    try {
+      db = context.getHive();
+    } catch (HiveException e) {
+      throw new SemanticException("Couldn't get Hive DB instance in semantic analysis phase.", e);
+    }
+
+    // Analyze and create tbl properties object
+    int numCh = ast.getChildCount();
+
+    databaseName = BaseSemanticAnalyzer.unescapeIdentifier(ast.getChild(0).getText());
+
+    for (int num = 1; num < numCh; num++) {
+      ASTNode child = (ASTNode) ast.getChild(num);
+
+      switch (child.getToken().getType()) {
+
+      case HiveParser.TOK_QUERY: // CTAS
+        throw new SemanticException("Operation not supported. Create db as Select is not a valid operation.");
+
+      case HiveParser.TOK_IFNOTEXISTS:
+        try {
+          List<String> dbs = db.getDatabasesByPattern(databaseName);
+          if (dbs != null && dbs.size() > 0) { // db exists
+            return ast;
+          }
+        } catch (HiveException e) {
+          throw new SemanticException(e);
+        }
+        break;
+      }
+    }
+
+    return ast;
+  }
+
+  @Override
+  public void postAnalyze(HiveSemanticAnalyzerHookContext context,
+      List<Task<? extends Serializable>> rootTasks) throws SemanticException {
+    context.getConf().set(HCatConstants.HCAT_CREATE_DB_NAME, databaseName);
+  }
+}
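
The IF NOT EXISTS branch above short-circuits when the database already exists, so a statement like the following (database name hypothetical) passes preAnalyze whether or not the database is present:

```sql
-- mytestdb is a hypothetical name; with IF NOT EXISTS the hook returns the
-- AST untouched even when the database already exists.
CREATE DATABASE IF NOT EXISTS mytestdb;
```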

Added: incubator/hcatalog/trunk/src/java/org/apache/hcatalog/cli/SemanticAnalysis/CreateTableHook.java
URL: http://svn.apache.org/viewvc/incubator/hcatalog/trunk/src/java/org/apache/hcatalog/cli/SemanticAnalysis/CreateTableHook.java?rev=1091509&view=auto
==============================================================================
--- incubator/hcatalog/trunk/src/java/org/apache/hcatalog/cli/SemanticAnalysis/CreateTableHook.java (added)
+++ incubator/hcatalog/trunk/src/java/org/apache/hcatalog/cli/SemanticAnalysis/CreateTableHook.java Tue Apr 12 17:30:08 2011
@@ -0,0 +1,211 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.hcatalog.cli.SemanticAnalysis;
+
+import java.io.Serializable;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+
+import org.apache.hadoop.conf.Configuration;
+import org.apache.hadoop.fs.Path;
+import org.apache.hadoop.fs.permission.FsAction;
+import org.apache.hadoop.hive.metastore.Warehouse;
+import org.apache.hadoop.hive.metastore.api.FieldSchema;
+import org.apache.hadoop.hive.metastore.api.MetaException;
+import org.apache.hadoop.hive.ql.exec.DDLTask;
+import org.apache.hadoop.hive.ql.exec.Task;
+import org.apache.hadoop.hive.ql.io.RCFileInputFormat;
+import org.apache.hadoop.hive.ql.io.RCFileOutputFormat;
+import org.apache.hadoop.hive.ql.metadata.Hive;
+import org.apache.hadoop.hive.ql.metadata.HiveException;
+import org.apache.hadoop.hive.ql.parse.ASTNode;
+import org.apache.hadoop.hive.ql.parse.AbstractSemanticAnalyzerHook;
+import org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer;
+import org.apache.hadoop.hive.ql.parse.HiveParser;
+import org.apache.hadoop.hive.ql.parse.HiveSemanticAnalyzerHookContext;
+import org.apache.hadoop.hive.ql.parse.SemanticException;
+import org.apache.hadoop.hive.ql.plan.CreateTableDesc;
+import org.apache.hcatalog.common.AuthUtils;
+import org.apache.hcatalog.common.HCatConstants;
+import org.apache.hcatalog.common.HCatException;
+import org.apache.hcatalog.rcfile.RCFileInputDriver;
+import org.apache.hcatalog.rcfile.RCFileOutputDriver;
+
+final class CreateTableHook  extends AbstractSemanticAnalyzerHook{
+
+  private String inStorageDriver, outStorageDriver, tableName;
+
+  @Override
+  public ASTNode preAnalyze(HiveSemanticAnalyzerHookContext context, ASTNode ast)
+  throws SemanticException {
+
+    Hive db;
+    try {
+      db = context.getHive();
+    } catch (HiveException e) {
+      throw new SemanticException("Couldn't get Hive DB instance in semantic analysis phase.", e);
+    }
+
+    // Analyze and create tbl properties object
+    int numCh = ast.getChildCount();
+
+    String inputFormat = null, outputFormat = null;
+    tableName = BaseSemanticAnalyzer.unescapeIdentifier(ast.getChild(0).getText());
+
+    for (int num = 1; num < numCh; num++) {
+      ASTNode child = (ASTNode) ast.getChild(num);
+
+      switch (child.getToken().getType()) {
+
+      case HiveParser.TOK_QUERY: // CTAS
+        throw new SemanticException("Operation not supported. Create table as Select is not a valid operation.");
+
+      case HiveParser.TOK_TABLEBUCKETS:
+        throw new SemanticException("Operation not supported. HCatalog doesn't allow Clustered By in create table.");
+
+      case HiveParser.TOK_TBLSEQUENCEFILE:
+        throw new SemanticException("Operation not supported. HCatalog doesn't support Sequence File by default yet. " +
+        "You may specify it through INPUT/OUTPUT storage drivers.");
+
+      case HiveParser.TOK_TBLTEXTFILE:
+        throw new SemanticException("Operation not supported. HCatalog doesn't support Text File by default yet. " +
+        "You may specify it through INPUT/OUTPUT storage drivers.");
+
+      case HiveParser.TOK_LIKETABLE:
+
+        String likeTableName;
+        if (child.getChildCount() > 0 && (likeTableName = BaseSemanticAnalyzer.unescapeIdentifier(child.getChild(0).getText())) != null) {
+
+          throw new SemanticException("Operation not supported. CREATE TABLE LIKE is not supported.");
+//          Map<String, String> tblProps;
+//          try {
+//            tblProps = db.getTable(MetaStoreUtils.DEFAULT_DATABASE_NAME, likeTableName).getParameters();
+//          } catch (HiveException he) {
+//            throw new SemanticException(he);
+//          }
+//          if(!(tblProps.containsKey(InitializeInput.HOWL_ISD_CLASS) && tblProps.containsKey(InitializeInput.HOWL_OSD_CLASS))){
+//            throw new SemanticException("Operation not supported. Table "+likeTableName+" should have been created through Howl. Seems like its not.");
+//          }
+//          return ast;
+        }
+        break;
+
+      case HiveParser.TOK_IFNOTEXISTS:
+        try {
+          List<String> tables = db.getTablesByPattern(tableName);
+          if (tables != null && tables.size() > 0) { // table exists
+            return ast;
+          }
+        } catch (HiveException e) {
+          throw new SemanticException(e);
+        }
+        break;
+
+      case HiveParser.TOK_TABLEPARTCOLS:
+        List<FieldSchema> partCols = BaseSemanticAnalyzer.getColumns((ASTNode) child.getChild(0), false);
+        for(FieldSchema fs : partCols){
+          if(!fs.getType().equalsIgnoreCase("string")){
+            throw new SemanticException("Operation not supported. HCatalog only supports partition columns of type string. " +
+                "For column: "+fs.getName()+" Found type: "+fs.getType());
+          }
+        }
+        break;
+
+      case HiveParser.TOK_TABLEFILEFORMAT:
+        if(child.getChildCount() < 4) {
+          throw new SemanticException("Incomplete specification of File Format. You must provide InputFormat, OutputFormat, InputDriver, OutputDriver.");
+        }
+        inputFormat      = BaseSemanticAnalyzer.unescapeSQLString(child.getChild(0).getText());
+        outputFormat     = BaseSemanticAnalyzer.unescapeSQLString(child.getChild(1).getText());
+        inStorageDriver  = BaseSemanticAnalyzer.unescapeSQLString(child.getChild(2).getText());
+        outStorageDriver = BaseSemanticAnalyzer.unescapeSQLString(child.getChild(3).getText());
+        break;
+
+      case HiveParser.TOK_TBLRCFILE:
+        inputFormat      = RCFileInputFormat.class.getName();
+        outputFormat     = RCFileOutputFormat.class.getName();
+        inStorageDriver  = RCFileInputDriver.class.getName();
+        outStorageDriver = RCFileOutputDriver.class.getName();
+        break;
+
+      }
+    }
+
+    if(inputFormat == null || outputFormat == null || inStorageDriver == null || outStorageDriver == null){
+      throw new SemanticException("STORED AS specification is either incomplete or incorrect.");
+    }
+
+    return ast;
+  }
+
+  @Override
+  public void postAnalyze(HiveSemanticAnalyzerHookContext context, List<Task<? extends Serializable>> rootTasks) throws SemanticException {
+
+    if(rootTasks.size() == 0){
+      // No DDL task is created when CREATE TABLE IF NOT EXISTS finds the table already exists.
+      return;
+    }
+    CreateTableDesc desc = ((DDLTask)rootTasks.get(rootTasks.size()-1)).getWork().getCreateTblDesc();
+
+    if(desc == null){
+      // desc is null for CREATE TABLE LIKE, whose info is carried in
+      // CreateTableLikeDesc instead. Howl disallows CTLT in the pre-hook,
+      // so desc should never be null here; guard anyway before dereferencing it.
+      return;
+    }
+
+    // First check whether the user is allowed to create the table.
+    authorize(context, desc.getLocation());
+    Map<String,String> tblProps = desc.getTblProps();
+    if(tblProps == null) {
+      // tblProps will be null if the user didn't specify TBLPROPERTIES in the CREATE TABLE command.
+      tblProps = new HashMap<String, String>();
+    }
+    tblProps.put(HCatConstants.HCAT_ISD_CLASS, inStorageDriver);
+    tblProps.put(HCatConstants.HCAT_OSD_CLASS, outStorageDriver);
+    desc.setTblProps(tblProps);
+    context.getConf().set(HCatConstants.HCAT_CREATE_TBL_NAME, tableName);
+  }
+
+  private void authorize(HiveSemanticAnalyzerHookContext context, String loc) throws SemanticException{
+
+    Path tblDir;
+    Configuration conf = context.getConf();
+    try {
+      Warehouse wh = new Warehouse(conf);
+      if (loc == null || loc.isEmpty()){
+        tblDir = wh.getDnsPath(wh.getDefaultTablePath(context.getHive().getCurrentDatabase(), tableName).getParent());
+      }
+      else{
+        tblDir = wh.getDnsPath(new Path(loc));
+      }
+
+      try {
+        AuthUtils.authorize(tblDir, FsAction.WRITE, conf);
+      } catch (HCatException e) {
+        throw new SemanticException(e);
+      }
+    }
+    catch (MetaException e) {
+      throw new SemanticException(e);
+    } catch (HiveException e) {
+      throw new SemanticException(e);
+    }
+  }
+}
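
Taken together, the switch in preAnalyze admits roughly the following shape of CREATE TABLE (all names hypothetical), while rejecting CTAS, CLUSTERED BY, CREATE TABLE LIKE, and the SEQUENCEFILE/TEXTFILE shorthands:

```sql
-- Hypothetical DDL that satisfies every check in preAnalyze:
CREATE TABLE web_logs (userid STRING, url STRING)
PARTITIONED BY (ds STRING)   -- partition columns must be of type STRING
STORED AS RCFILE;            -- TOK_TBLRCFILE fills in RCFile formats and Howl drivers
```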

Added: incubator/hcatalog/trunk/src/java/org/apache/hcatalog/cli/SemanticAnalysis/HCatSemanticAnalyzer.java
URL: http://svn.apache.org/viewvc/incubator/hcatalog/trunk/src/java/org/apache/hcatalog/cli/SemanticAnalysis/HCatSemanticAnalyzer.java?rev=1091509&view=auto
==============================================================================
--- incubator/hcatalog/trunk/src/java/org/apache/hcatalog/cli/SemanticAnalysis/HCatSemanticAnalyzer.java (added)
+++ incubator/hcatalog/trunk/src/java/org/apache/hcatalog/cli/SemanticAnalysis/HCatSemanticAnalyzer.java Tue Apr 12 17:30:08 2011
@@ -0,0 +1,207 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.hcatalog.cli.SemanticAnalysis;
+
+import java.io.Serializable;
+import java.util.List;
+
+import org.apache.hadoop.fs.Path;
+import org.apache.hadoop.fs.permission.FsAction;
+import org.apache.hadoop.hive.metastore.Warehouse;
+import org.apache.hadoop.hive.metastore.api.MetaException;
+import org.apache.hadoop.hive.ql.exec.Task;
+import org.apache.hadoop.hive.ql.metadata.HiveException;
+import org.apache.hadoop.hive.ql.metadata.InvalidTableException;
+import org.apache.hadoop.hive.ql.metadata.Table;
+import org.apache.hadoop.hive.ql.parse.ASTNode;
+import org.apache.hadoop.hive.ql.parse.AbstractSemanticAnalyzerHook;
+import org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer;
+import org.apache.hadoop.hive.ql.parse.HiveParser;
+import org.apache.hadoop.hive.ql.parse.HiveSemanticAnalyzerHookContext;
+import org.apache.hadoop.hive.ql.parse.SemanticException;
+import org.apache.hcatalog.common.AuthUtils;
+import org.apache.hcatalog.common.ErrorType;
+import org.apache.hcatalog.common.HCatException;
+
+public class HCatSemanticAnalyzer extends AbstractSemanticAnalyzerHook {
+
+  private AbstractSemanticAnalyzerHook hook;
+  private ASTNode ast;
+
+  @Override
+  public ASTNode preAnalyze(HiveSemanticAnalyzerHookContext context, ASTNode ast)
+      throws SemanticException {
+
+    this.ast = ast;
+    switch (ast.getToken().getType()) {
+
+    // Howl intercepts the following tokens and handles them specially.
+    case HiveParser.TOK_CREATETABLE:
+      hook = new CreateTableHook();
+      return hook.preAnalyze(context, ast);
+
+    case HiveParser.TOK_CREATEDATABASE:
+      hook = new CreateDatabaseHook();
+      return hook.preAnalyze(context, ast);
+
+    // Database commands for which Howl uses the same implementation as stock Hive.
+    case HiveParser.TOK_SHOWDATABASES:
+    case HiveParser.TOK_DROPDATABASE:
+    case HiveParser.TOK_SWITCHDATABASE:
+      return ast;
+
+    // Howl will allow these operations to be performed since they are DDL statements.
+    case HiveParser.TOK_DROPTABLE:
+    case HiveParser.TOK_DESCTABLE:
+    case HiveParser.TOK_ALTERTABLE_ADDCOLS:
+    case HiveParser.TOK_ALTERTABLE_RENAME:
+    case HiveParser.TOK_ALTERTABLE_DROPPARTS:
+    case HiveParser.TOK_ALTERTABLE_PROPERTIES:
+    case HiveParser.TOK_ALTERTABLE_SERIALIZER:
+    case HiveParser.TOK_ALTERTABLE_SERDEPROPERTIES:
+    case HiveParser.TOK_SHOW_TABLESTATUS:
+    case HiveParser.TOK_SHOWTABLES:
+    case HiveParser.TOK_SHOWPARTITIONS:
+      return ast;
+
+    case HiveParser.TOK_ALTERTABLE_ADDPARTS:
+      hook = new AddPartitionHook();
+      return hook.preAnalyze(context, ast);
+
+    case HiveParser.TOK_ALTERTABLE_PARTITION:
+      if (((ASTNode)ast.getChild(1)).getToken().getType() == HiveParser.TOK_ALTERTABLE_FILEFORMAT) {
+        hook = new AlterTableFileFormatHook();
+        return hook.preAnalyze(context, ast);
+      } else {
+        return ast;
+      }
+
+    // In all other cases, throw an exception. This is a whitelist of allowed operations.
+    default:
+      throw new SemanticException("Operation not supported.");
+
+    }
+  }
+
+  @Override
+  public void postAnalyze(HiveSemanticAnalyzerHookContext context,
+      List<Task<? extends Serializable>> rootTasks) throws SemanticException {
+
+    try{
+
+      switch (ast.getToken().getType()) {
+
+      case HiveParser.TOK_DESCTABLE:
+        authorize(getFullyQualifiedName((ASTNode) ast.getChild(0).getChild(0)), context, FsAction.READ, false);
+        break;
+
+      case HiveParser.TOK_SHOWPARTITIONS:
+        authorize(BaseSemanticAnalyzer.unescapeIdentifier(ast.getChild(0).getText()), context, FsAction.READ, false);
+        break;
+
+      case HiveParser.TOK_ALTERTABLE_ADDPARTS:
+      case HiveParser.TOK_DROPTABLE:
+      case HiveParser.TOK_ALTERTABLE_ADDCOLS:
+      case HiveParser.TOK_ALTERTABLE_RENAME:
+      case HiveParser.TOK_ALTERTABLE_DROPPARTS:
+      case HiveParser.TOK_ALTERTABLE_PROPERTIES:
+      case HiveParser.TOK_ALTERTABLE_SERIALIZER:
+      case HiveParser.TOK_ALTERTABLE_SERDEPROPERTIES:
+        authorize(BaseSemanticAnalyzer.unescapeIdentifier(ast.getChild(0).getText()), context, FsAction.WRITE, false);
+        break;
+
+      case HiveParser.TOK_ALTERTABLE_PARTITION:
+        authorize(BaseSemanticAnalyzer.unescapeIdentifier(((ASTNode)ast.getChild(0)).getChild(0).getText()), context, FsAction.WRITE, false);
+        break;
+
+      case HiveParser.TOK_SWITCHDATABASE:
+        authorize(BaseSemanticAnalyzer.unescapeIdentifier(ast.getChild(0).getText()), context, FsAction.READ, true);
+        break;
+
+      case HiveParser.TOK_DROPDATABASE:
+        authorize(BaseSemanticAnalyzer.unescapeIdentifier(ast.getChild(0).getText()), context, FsAction.WRITE, true);
+        break;
+
+      case HiveParser.TOK_CREATEDATABASE:
+      case HiveParser.TOK_SHOWDATABASES:
+      case HiveParser.TOK_SHOW_TABLESTATUS:
+      case HiveParser.TOK_SHOWTABLES:
+        // No checks for SHOW TABLES/DATABASES, SHOW TABLE STATUS, or CREATE DATABASE; these are always allowed.
+
+      case HiveParser.TOK_CREATETABLE:
+        // No checks for CREATE TABLE, since it's not possible to compute the
+        // location easily here. It is handled specially in the CreateTableHook post-hook.
+        break;
+
+      default:
+        throw new HCatException(ErrorType.ERROR_INTERNAL_EXCEPTION, "Unexpected token: "+ast.getToken());
+      }
+    } catch(HCatException e){
+      throw new SemanticException(e);
+    } catch (MetaException e) {
+      throw new SemanticException(e);
+    } catch (HiveException e) {
+      throw new SemanticException(e);
+    }
+
+    if(hook != null){
+      hook.postAnalyze(context, rootTasks);
+    }
+  }
+
+  private void authorize(String name, HiveSemanticAnalyzerHookContext cntxt, FsAction action, boolean isDBOp)
+                                                      throws MetaException, HiveException, HCatException{
+
+
+    Warehouse wh = new Warehouse(cntxt.getConf());
+    if(!isDBOp){
+      // Do validations for table path.
+      Table tbl;
+      try{
+        tbl = cntxt.getHive().getTable(name);
+      }
+      catch(InvalidTableException ite){
+        // Table itself doesn't exist in metastore, nothing to validate.
+        return;
+      }
+      Path path = tbl.getPath();
+      if(path != null){
+        AuthUtils.authorize(wh.getDnsPath(path), action, cntxt.getConf());
+      } else{
+        // This can happen if a table exists in the metastore for the given
+        // name but has no path associated with it; there is nothing to check,
+        // so allow whatever the default Hive behavior is.
+        return;
+      }
+    } else{
+      // Else, its a DB operation.
+      AuthUtils.authorize(wh.getDefaultDatabasePath(name), action, cntxt.getConf());
+    }
+  }
+
+
+  private String getFullyQualifiedName(ASTNode ast) {
+    // Copied verbatim from DDLSemanticAnalyzer, since it's private there.
+    if (ast.getChildCount() == 0) {
+      return ast.getText();
+    }
+
+    return getFullyQualifiedName((ASTNode) ast.getChild(0)) + "."
+        + getFullyQualifiedName((ASTNode) ast.getChild(1));
+  }
+}
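
A hook of this kind is normally wired in through Hive's semantic-analyzer hook property. The property name below is assumed from stock Hive configuration and should be verified against the Hive release in use:

```xml
<!-- Assumed wiring via hive-site.xml; confirm that hive.semantic.analyzer.hook
     is supported by the target Hive version. -->
<property>
  <name>hive.semantic.analyzer.hook</name>
  <value>org.apache.hcatalog.cli.SemanticAnalysis.HCatSemanticAnalyzer</value>
</property>
```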


