From: "Edward Capriolo (JIRA)"
To: hive-dev@hadoop.apache.org
Reply-To: dev@hive.apache.org
Date: Sat, 25 Dec 2010 11:14:46 -0500 (EST)
Subject: [jira] Resolved: (HIVE-1391) Various issues when using MS-SQL2005 as the Hive Metastore
Message-ID: <14821387.17981293293686141.JavaMail.jira@thor>

     [ https://issues.apache.org/jira/browse/HIVE-1391?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Edward Capriolo resolved HIVE-1391.
-----------------------------------
    Resolution: Won't Fix

This is not an attempt to pass the buck. We do not officially support any particular RDBMS; however, we de facto support Derby and MySQL.

http://www.datanucleus.org/products/accessplatform/rdbms/support.html

DataNucleus claims to support MS-SQL 2005, so maybe the issue should be taken up there. At the very least, the fact that the metastore works with two of their other supported databases shows we are doing something correct.
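For comparison, a minimal sketch of the equivalent hive-site.xml entries for a MySQL-backed metastore (the host, port, database name, and credentials are placeholders, and the MySQL JDBC driver jar is assumed to be on Hive's classpath):

<!-- Sketch only: placeholder connection details for a MySQL-backed metastore -->
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://mysqlhost:3306/hivemeta</value>
  <description>JDBC connect string for a JDBC metastore</description>
</property>
<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
  <description>Driver class name for a JDBC metastore</description>
</property>
<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>hiveuser</value>
  <description>username to use against metastore database</description>
</property>
<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>hivepassword</value>
  <description>password to use against metastore database</description>
</property>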
> Various issues when using MS-SQL2005 as the Hive Metastore
> ----------------------------------------------------------
>
>                 Key: HIVE-1391
>                 URL: https://issues.apache.org/jira/browse/HIVE-1391
>             Project: Hive
>          Issue Type: Bug
>          Components: Metastore
>    Affects Versions: 0.5.0
>            Reporter: Alex Rovner
>         Attachments: hive-trace.txt
>
>
> When I tried to use MS-SQL2005 as the Hive metastore I encountered numerous issues.
>
> My configuration:
>
> <property>
>   <name>javax.jdo.option.ConnectionURL</name>
>   <value>jdbc:sqlserver://cwdbint05:1445;DatabaseName=HiveMetastore;</value>
>   <description>JDBC connect string for a JDBC metastore</description>
> </property>
> <property>
>   <name>javax.jdo.option.ConnectionDriverName</name>
>   <value>com.microsoft.sqlserver.jdbc.SQLServerDriver</value>
>   <description>Driver class name for a JDBC metastore</description>
> </property>
> <property>
>   <name>javax.jdo.option.ConnectionUserName</name>
>   <value>HiveUser</value>
>   <description>username to use against metastore database</description>
> </property>
> <property>
>   <name>javax.jdo.option.ConnectionPassword</name>
>   <value>XXXXXXXXXXXXXX</value>
>   <description>password to use against metastore database</description>
> </property>
> <property>
>   <name>datanucleus.autoCreateSchema</name>
>   <value>true</value>
>   <description>creates necessary schema on startup if one doesn't exist. Set this to false after creating it once.</description>
> </property>
>
> The Hive user has full rights to the HiveMetastore DB.
>
> ---------------------------------------------------------------------
>
> When launching Hive on the command line and executing "show tables;" I got the following:
>
> FAILED: Error in metadata: javax.jdo.JDOFatalInternalException: Error creating transactional connection factory
> NestedThrowables:
> java.lang.reflect.InvocationTargetException
> FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask
>
> When launching Hive through the Java API (org.apache.commons.cli.CommandLine), the auto-create kicked in but failed with the following (full stack trace attached to the ticket):
>
> [2010-06-04 09:22:11,817] ERROR (Log4JLogger.java:115) - Error thrown executing ALTER TABLE COLUMNS ADD TYPE_NAME varchar(128) NOT NULL : Cannot find the object "COLUMNS" because it does not exist or you do not have permissions.
> com.microsoft.sqlserver.jdbc.SQLServerException: Cannot find the object "COLUMNS" because it does not exist or you do not have permissions.
>   at com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDatabaseError(SQLServerException.java:196)
>   at com.microsoft.sqlserver.jdbc.SQLServerStatement.getNextResult(SQLServerStatement.java:1454)
>   at com.microsoft.sqlserver.jdbc.SQLServerStatement.doExecuteStatement(SQLServerStatement.java:786)
>   at com.microsoft.sqlserver.jdbc.SQLServerStatement$StmtExecCmd.doExecute(SQLServerStatement.java:685)
>   at com.microsoft.sqlserver.jdbc.TDSCommand.execute(IOBuffer.java:4026)
>   at com.microsoft.sqlserver.jdbc.SQLServerConnection.executeCommand(SQLServerConnection.java:1416)
>   at com.microsoft.sqlserver.jdbc.SQLServerStatement.executeCommand(SQLServerStatement.java:185)
>   at com.microsoft.sqlserver.jdbc.SQLServerStatement.executeStatement(SQLServerStatement.java:160)
>   at com.microsoft.sqlserver.jdbc.SQLServerStatement.execute(SQLServerStatement.java:658)
>   at org.datanucleus.store.rdbms.table.AbstractTable.executeDdlStatement(AbstractTable.java:730)
>   at org.datanucleus.store.rdbms.table.AbstractTable.executeDdlStatementList(AbstractTable.java:681)
>   at org.datanucleus.store.rdbms.table.TableImpl.validateColumns(TableImpl.java:261)
>   at org.datanucleus.store.rdbms.RDBMSManager$ClassAdder.performTablesValidation(RDBMSManager.java:2794)
>   at org.datanucleus.store.rdbms.RDBMSManager$ClassAdder.addClassTablesAndValidate(RDBMSManager.java:2595)
>   at org.datanucleus.store.rdbms.RDBMSManager$ClassAdder.run(RDBMSManager.java:2241)
>   at org.datanucleus.store.rdbms.AbstractSchemaTransaction.execute(AbstractSchemaTransaction.java:113)
>   at org.datanucleus.store.rdbms.RDBMSManager.addClasses(RDBMSManager.java:994)
>   at org.datanucleus.store.rdbms.RDBMSManager.addClasses(RDBMSManager.java:960)
>   at org.datanucleus.store.AbstractStoreManager.addClass(AbstractStoreManager.java:691)
>   at org.datanucleus.store.mapped.MappedStoreManager.getDatastoreClass(MappedStoreManager.java:358)
>   at org.datanucleus.store.rdbms.RDBMSManager.getExtent(RDBMSManager.java:1344)
>   at org.datanucleus.ObjectManagerImpl.getExtent(ObjectManagerImpl.java:3736)
>   at org.datanucleus.store.rdbms.query.JDOQLQueryCompiler.compileCandidates(JDOQLQueryCompiler.java:411)
>   at org.datanucleus.store.rdbms.query.QueryCompiler.executionCompile(QueryCompiler.java:312)
>   at org.datanucleus.store.rdbms.query.JDOQLQueryCompiler.compile(JDOQLQueryCompiler.java:225)
>   at org.datanucleus.store.rdbms.query.JDOQLQuery.compileInternal(JDOQLQuery.java:174)
>   at org.datanucleus.store.query.Query.executeQuery(Query.java:1443)
>   at org.datanucleus.store.rdbms.query.JDOQLQuery.executeQuery(JDOQLQuery.java:244)
>   at org.datanucleus.store.query.Query.executeWithArray(Query.java:1357)
>   at org.datanucleus.jdo.JDOQuery.execute(JDOQuery.java:265)
>   at org.apache.hadoop.hive.metastore.ObjectStore.getMTable(ObjectStore.java:551)
>   at org.apache.hadoop.hive.metastore.ObjectStore.getTable(ObjectStore.java:494)
>   at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_table(HiveMetaStore.java:397)
>   at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:353)
>   at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
>   at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:308)
>   at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:293)
>   at com.contextweb.hive.lookup.LookupDumper.createTable(LookupDumper.java:179)
>   at com.contextweb.hive.lookup.LookupDumper.execute(LookupDumper.java:122)
>   at com.contextweb.hive.cli.QueryCommand.runImpl(QueryCommand.java:142)
>   at com.contextweb.hive.cli.HiveSessionAwareCommand.run(HiveSessionAwareCommand.java:48)
>   at com.contextweb.hive.cli.Launcher.exec(Launcher.java:78)
>   at com.contextweb.hive.cli.Launcher.main(Launcher.java:46)
>
> At this point I looked at what Hive had done to my DB and saw that it had created the following tables:
>
> DBS
> NUCLEUS_TABLES
> SEQUENCE_TABLE
>
> The table COLUMNS does not exist, so the ALTER statement is failing (makes sense). I went ahead and created the table with the needed column:
>
> CREATE TABLE COLUMNS (TYPE_NAME varchar(128) NOT NULL)
>
> When I ran Hive again with the CLI, the auto-create managed to complete creation this time, but during the run it failed with the following:
>
> [2010-06-04 09:54:38,787] INFO (SemanticAnalyzer.java:5399) - Creating tablelookup_CampaignId positin=22
> FAILED: Error in metadata: javax.jdo.JDODataStoreException: Add request failed : INSERT INTO COLUMNS (SD_ID,COMMENT,"COLUMN_NAME",TYPE_NAME,INTEGER_IDX) VALUES (?,?,?,?,?)
> NestedThrowables:
> java.sql.BatchUpdateException: Invalid column name 'COLUMN_NAME'.
> [2010-06-04 09:54:39,158] ERROR (SessionState.java:248) - FAILED: Error in metadata: javax.jdo.JDODataStoreException: Add request failed : INSERT INTO COLUMNS (SD_ID,COMMENT,"COLUMN_NAME",TYPE_NAME,INTEGER_IDX) VALUES (?,?,?,?,?)
> NestedThrowables:
> java.sql.BatchUpdateException: Invalid column name 'COLUMN_NAME'.
> org.apache.hadoop.hive.ql.metadata.HiveException: javax.jdo.JDODataStoreException: Add request failed : INSERT INTO COLUMNS (SD_ID,COMMENT,"COLUMN_NAME",TYPE_NAME,INTEGER_IDX) VALUES (?,?,?,?,?)
> NestedThrowables:
> java.sql.BatchUpdateException: Invalid column name 'COLUMN_NAME'.
>   at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:281)
>   at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:1281)
>   at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:119)
>   at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:99)
>   at com.contextweb.hive.session.HiveQuery.exec(HiveQuery.java:112)
>   at com.contextweb.hive.lookup.LookupDumper.createTable(LookupDumper.java:201)
>   at com.contextweb.hive.lookup.LookupDumper.execute(LookupDumper.java:122)
>   at com.contextweb.hive.cli.QueryCommand.runImpl(QueryCommand.java:142)
>   at com.contextweb.hive.cli.HiveSessionAwareCommand.run(HiveSessionAwareCommand.java:48)
>   at com.contextweb.hive.cli.Launcher.exec(Launcher.java:78)
>   at com.contextweb.hive.cli.Launcher.main(Launcher.java:46)
> Caused by: javax.jdo.JDODataStoreException: Add request failed : INSERT INTO COLUMNS (SD_ID,COMMENT,"COLUMN_NAME",TYPE_NAME,INTEGER_IDX) VALUES (?,?,?,?,?)
> NestedThrowables:
> java.sql.BatchUpdateException: Invalid column name 'COLUMN_NAME'.
>   at org.datanucleus.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:289)
>   at org.datanucleus.jdo.JDOPersistenceManager.jdoMakePersistent(JDOPersistenceManager.java:673)
>   at org.datanucleus.jdo.JDOPersistenceManager.makePersistent(JDOPersistenceManager.java:693)
>   at org.apache.hadoop.hive.metastore.ObjectStore.createTable(ObjectStore.java:458)
>   at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_table(HiveMetaStore.java:321)
>   at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:254)
>   at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:275)
>   ... 10 more
> Caused by: java.sql.BatchUpdateException: Invalid column name 'COLUMN_NAME'.
>   at com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.executeBatch(SQLServerPreparedStatement.java:1132)
>   at org.datanucleus.store.rdbms.SQLController.processConnectionStatement(SQLController.java:573)
>   at org.datanucleus.store.rdbms.SQLController.executeStatementUpdate(SQLController.java:366)
>   at org.datanucleus.store.rdbms.scostore.RDBMSJoinListStoreSpecialization.internalAdd(RDBMSJoinListStoreSpecialization.java:425)
>   at org.datanucleus.store.mapped.scostore.JoinListStore.internalAdd(JoinListStore.java:239)
>   at org.datanucleus.store.mapped.scostore.AbstractListStore.addAll(AbstractListStore.java:128)
>   at org.datanucleus.store.mapped.mapping.CollectionMapping.postInsert(CollectionMapping.java:157)
>   at org.datanucleus.store.rdbms.request.InsertRequest.execute(InsertRequest.java:515)
>   at org.datanucleus.store.rdbms.RDBMSPersistenceHandler.insertTable(RDBMSPersistenceHandler.java:200)
>   at org.datanucleus.store.rdbms.RDBMSPersistenceHandler.insertObject(RDBMSPersistenceHandler.java:179)
>   at org.datanucleus.state.JDOStateManagerImpl.internalMakePersistent(JDOStateManagerImpl.java:3097)
>   at org.datanucleus.state.JDOStateManagerImpl.makePersistent(JDOStateManagerImpl.java:3073)
>   at org.datanucleus.ObjectManagerImpl.persistObjectInternal(ObjectManagerImpl.java:1280)
>   at org.datanucleus.store.mapped.mapping.PersistenceCapableMapping.setObjectAsValue(PersistenceCapableMapping.java:604)
>   at org.datanucleus.store.mapped.mapping.PersistenceCapableMapping.setObject(PersistenceCapableMapping.java:364)
>   at org.datanucleus.store.rdbms.fieldmanager.ParameterSetter.storeObjectField(ParameterSetter.java:197)
>   at org.datanucleus.state.AbstractStateManager.providedObjectField(AbstractStateManager.java:1011)
>   at org.apache.hadoop.hive.metastore.model.MTable.jdoProvideField(MTable.java)
>   at org.apache.hadoop.hive.metastore.model.MTable.jdoProvideFields(MTable.java)
>   at org.datanucleus.state.JDOStateManagerImpl.provideFields(JDOStateManagerImpl.java:2627)
>   at org.datanucleus.store.rdbms.request.InsertRequest.execute(InsertRequest.java:294)
>   at org.datanucleus.store.rdbms.RDBMSPersistenceHandler.insertTable(RDBMSPersistenceHandler.java:200)
>   at org.datanucleus.store.rdbms.RDBMSPersistenceHandler.insertObject(RDBMSPersistenceHandler.java:179)
>   at org.datanucleus.state.JDOStateManagerImpl.internalMakePersistent(JDOStateManagerImpl.java:3097)
>   at org.datanucleus.state.JDOStateManagerImpl.makePersistent(JDOStateManagerImpl.java:3073)
>   at org.datanucleus.ObjectManagerImpl.persistObjectInternal(ObjectManagerImpl.java:1280)
>   at org.datanucleus.ObjectManagerImpl.persistObject(ObjectManagerImpl.java:1157)
>   at org.datanucleus.jdo.JDOPersistenceManager.jdoMakePersistent(JDOPersistenceManager.java:668)
>   ... 15 more
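For reference, the failing INSERT above names five columns on COLUMNS. A minimal sketch of a definition covering just those columns is shown here; the lengths for TYPE_NAME and COLUMN_NAME come from the ALTER statements in this report (the reporter's manual ALTER for COLUMN_NAME follows below), while the SD_ID, COMMENT, and INTEGER_IDX types are assumptions, and the real metastore schema also carries other columns, keys, and indexes that are not shown:

-- Sketch only: columns taken from the failing INSERT, not an official metastore schema script
CREATE TABLE COLUMNS (
  SD_ID bigint NOT NULL,              -- assumed type: id of the owning storage descriptor
  COMMENT varchar(256) NULL,          -- assumed type and length
  COLUMN_NAME varchar(256) NOT NULL,  -- length taken from the manual ALTER TABLE below
  TYPE_NAME varchar(128) NOT NULL,    -- length taken from the failed auto-create ALTER TABLE above
  INTEGER_IDX int NOT NULL            -- assumed type: position of the column within the table
)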
> Seems like the auto-create forgot to create the column "COLUMN_NAME". I again ran the command manually in my DB:
>
> ALTER TABLE COLUMNS ADD COLUMN_NAME varchar(256) NOT NULL
>
> At this point I was able to run Hive through the CLI successfully, but running "show tables;" from the command line still gives me:
>
> FAILED: Error in metadata: javax.jdo.JDOFatalInternalException: Error creating transactional connection factory
> NestedThrowables:
> java.lang.reflect.InvocationTargetException
> FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask
>
> Please contact me if you need further information.

--
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.