hadoop-hive-dev mailing list archives

From "Alex Rovner (JIRA)" <j...@apache.org>
Subject [jira] Updated: (HIVE-1391) Various issues when using MS-SQL2005 as the Hive Metastore
Date Fri, 04 Jun 2010 14:00:59 GMT

     [ https://issues.apache.org/jira/browse/HIVE-1391?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Alex Rovner updated HIVE-1391:
------------------------------

    Description: 
When I tried to use MS SQL Server 2005 as the Hive metastore, I encountered numerous issues.

My configuration:

<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:sqlserver://cwdbint05:1445;DatabaseName=HiveMetastore;</value>
  <description>JDBC connect string for a JDBC metastore</description>
</property>

<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.microsoft.sqlserver.jdbc.SQLServerDriver</value>
  <description>Driver class name for a JDBC metastore</description>
</property>

<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>HiveUser</value>
  <description>username to use against metastore database</description>
</property>

<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>XXXXXXXXXXXXXX</value>
  <description>password to use against metastore database</description>
</property>

<property>
  <name>datanucleus.autoCreateSchema</name>
  <value>true</value>
  <description>creates necessary schema on a startup if one doesn't exist. set this
to false, after creating it once</description>
</property>


The Hive user has full rights on the HiveMetastore DB.
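
For what it's worth, a quick way to double-check those grants, assuming the login maps to a database user literally named HiveUser, is something along these lines (run from a session that is allowed to impersonate):

USE HiveMetastore;
EXECUTE AS USER = 'HiveUser';
-- effective database-level permissions for HiveUser
SELECT * FROM fn_my_permissions(NULL, 'DATABASE');
REVERT;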
---------------------------------------------------------------------

When launching Hive from the command line and executing "show tables;", I got the following:

FAILED: Error in metadata: javax.jdo.JDOFatalInternalException: Error creating transactional
connection factory
NestedThrowables:
java.lang.reflect.InvocationTargetException
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask


When launching Hive through the Java API (org.apache.commons.cli.CommandLine), the auto-create kicked in but failed with the following (full stack trace attached to the ticket):

[2010-06-04 09:22:11,817] ERROR (Log4JLogger.java:115) - Error thrown executing ALTER TABLE
COLUMNS ADD TYPE_NAME varchar(128) NOT NULL : Cannot find the object "COLUMNS" because it
does not exist or you do not have permissions.
com.microsoft.sqlserver.jdbc.SQLServerException: Cannot find the object "COLUMNS" because
it does not exist or you do not have permissions.
	at com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDatabaseError(SQLServerException.java:196)
	at com.microsoft.sqlserver.jdbc.SQLServerStatement.getNextResult(SQLServerStatement.java:1454)
	at com.microsoft.sqlserver.jdbc.SQLServerStatement.doExecuteStatement(SQLServerStatement.java:786)
	at com.microsoft.sqlserver.jdbc.SQLServerStatement$StmtExecCmd.doExecute(SQLServerStatement.java:685)
	at com.microsoft.sqlserver.jdbc.TDSCommand.execute(IOBuffer.java:4026)
	at com.microsoft.sqlserver.jdbc.SQLServerConnection.executeCommand(SQLServerConnection.java:1416)
	at com.microsoft.sqlserver.jdbc.SQLServerStatement.executeCommand(SQLServerStatement.java:185)
	at com.microsoft.sqlserver.jdbc.SQLServerStatement.executeStatement(SQLServerStatement.java:160)
	at com.microsoft.sqlserver.jdbc.SQLServerStatement.execute(SQLServerStatement.java:658)
	at org.datanucleus.store.rdbms.table.AbstractTable.executeDdlStatement(AbstractTable.java:730)
	at org.datanucleus.store.rdbms.table.AbstractTable.executeDdlStatementList(AbstractTable.java:681)
	at org.datanucleus.store.rdbms.table.TableImpl.validateColumns(TableImpl.java:261)
	at org.datanucleus.store.rdbms.RDBMSManager$ClassAdder.performTablesValidation(RDBMSManager.java:2794)
	at org.datanucleus.store.rdbms.RDBMSManager$ClassAdder.addClassTablesAndValidate(RDBMSManager.java:2595)
	at org.datanucleus.store.rdbms.RDBMSManager$ClassAdder.run(RDBMSManager.java:2241)
	at org.datanucleus.store.rdbms.AbstractSchemaTransaction.execute(AbstractSchemaTransaction.java:113)
	at org.datanucleus.store.rdbms.RDBMSManager.addClasses(RDBMSManager.java:994)
	at org.datanucleus.store.rdbms.RDBMSManager.addClasses(RDBMSManager.java:960)
	at org.datanucleus.store.AbstractStoreManager.addClass(AbstractStoreManager.java:691)
	at org.datanucleus.store.mapped.MappedStoreManager.getDatastoreClass(MappedStoreManager.java:358)
	at org.datanucleus.store.rdbms.RDBMSManager.getExtent(RDBMSManager.java:1344)
	at org.datanucleus.ObjectManagerImpl.getExtent(ObjectManagerImpl.java:3736)
	at org.datanucleus.store.rdbms.query.JDOQLQueryCompiler.compileCandidates(JDOQLQueryCompiler.java:411)
	at org.datanucleus.store.rdbms.query.QueryCompiler.executionCompile(QueryCompiler.java:312)
	at org.datanucleus.store.rdbms.query.JDOQLQueryCompiler.compile(JDOQLQueryCompiler.java:225)
	at org.datanucleus.store.rdbms.query.JDOQLQuery.compileInternal(JDOQLQuery.java:174)
	at org.datanucleus.store.query.Query.executeQuery(Query.java:1443)
	at org.datanucleus.store.rdbms.query.JDOQLQuery.executeQuery(JDOQLQuery.java:244)
	at org.datanucleus.store.query.Query.executeWithArray(Query.java:1357)
	at org.datanucleus.jdo.JDOQuery.execute(JDOQuery.java:265)
	at org.apache.hadoop.hive.metastore.ObjectStore.getMTable(ObjectStore.java:551)
	at org.apache.hadoop.hive.metastore.ObjectStore.getTable(ObjectStore.java:494)
	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_table(HiveMetaStore.java:397)
	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:353)
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:308)
	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:293)
	at com.contextweb.hive.lookup.LookupDumper.createTable(LookupDumper.java:179)
	at com.contextweb.hive.lookup.LookupDumper.execute(LookupDumper.java:122)
	at com.contextweb.hive.cli.QueryCommand.runImpl(QueryCommand.java:142)
	at com.contextweb.hive.cli.HiveSessionAwareCommand.run(HiveSessionAwareCommand.java:48)
	at com.contextweb.hive.cli.Launcher.exec(Launcher.java:78)
	at com.contextweb.hive.cli.Launcher.main(Launcher.java:46)


At this point I looked at what Hive had done to my DB and saw that it had created the following tables:
DBS
NUCLEUS_TABLES
SEQUENCE_TABLE

The table COLUMNS does not exist, so the ALTER statement fails (makes sense).
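
For anyone reproducing this, a standard INFORMATION_SCHEMA query along the lines of the sketch below should confirm which metastore tables auto-create actually produced (the database name is the one from my ConnectionURL):

SELECT TABLE_NAME
FROM HiveMetastore.INFORMATION_SCHEMA.TABLES
WHERE TABLE_TYPE = 'BASE TABLE'   -- user tables only
ORDER BY TABLE_NAME;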

So I went ahead and created the table with the needed column:
CREATE TABLE COLUMNS (TYPE_NAME varchar(128) NOT NULL)

When I ran Hive again with the CLI, the auto-create managed to complete schema creation this time, but the run then failed with the following:

[2010-06-04 09:54:38,787] INFO  (SemanticAnalyzer.java:5399) - Creating tablelookup_CampaignId
positin=22
FAILED: Error in metadata: javax.jdo.JDODataStoreException: Add request failed : INSERT INTO
COLUMNS (SD_ID,COMMENT,"COLUMN_NAME",TYPE_NAME,INTEGER_IDX) VALUES (?,?,?,?,?) 
NestedThrowables:
java.sql.BatchUpdateException: Invalid column name 'COLUMN_NAME'.
[2010-06-04 09:54:39,158] ERROR (SessionState.java:248) - FAILED: Error in metadata: javax.jdo.JDODataStoreException:
Add request failed : INSERT INTO COLUMNS (SD_ID,COMMENT,"COLUMN_NAME",TYPE_NAME,INTEGER_IDX)
VALUES (?,?,?,?,?) 
NestedThrowables:
java.sql.BatchUpdateException: Invalid column name 'COLUMN_NAME'.
org.apache.hadoop.hive.ql.metadata.HiveException: javax.jdo.JDODataStoreException: Add request
failed : INSERT INTO COLUMNS (SD_ID,COMMENT,"COLUMN_NAME",TYPE_NAME,INTEGER_IDX) VALUES (?,?,?,?,?)

NestedThrowables:
java.sql.BatchUpdateException: Invalid column name 'COLUMN_NAME'.
	at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:281)
	at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:1281)
	at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:119)
	at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:99)
	at com.contextweb.hive.session.HiveQuery.exec(HiveQuery.java:112)
	at com.contextweb.hive.lookup.LookupDumper.createTable(LookupDumper.java:201)
	at com.contextweb.hive.lookup.LookupDumper.execute(LookupDumper.java:122)
	at com.contextweb.hive.cli.QueryCommand.runImpl(QueryCommand.java:142)
	at com.contextweb.hive.cli.HiveSessionAwareCommand.run(HiveSessionAwareCommand.java:48)
	at com.contextweb.hive.cli.Launcher.exec(Launcher.java:78)
	at com.contextweb.hive.cli.Launcher.main(Launcher.java:46)
Caused by: javax.jdo.JDODataStoreException: Add request failed : INSERT INTO COLUMNS (SD_ID,COMMENT,"COLUMN_NAME",TYPE_NAME,INTEGER_IDX)
VALUES (?,?,?,?,?) 
NestedThrowables:
java.sql.BatchUpdateException: Invalid column name 'COLUMN_NAME'.
	at org.datanucleus.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:289)
	at org.datanucleus.jdo.JDOPersistenceManager.jdoMakePersistent(JDOPersistenceManager.java:673)
	at org.datanucleus.jdo.JDOPersistenceManager.makePersistent(JDOPersistenceManager.java:693)
	at org.apache.hadoop.hive.metastore.ObjectStore.createTable(ObjectStore.java:458)
	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_table(HiveMetaStore.java:321)
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:254)
	at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:275)
	... 10 more
Caused by: java.sql.BatchUpdateException: Invalid column name 'COLUMN_NAME'.
	at com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.executeBatch(SQLServerPreparedStatement.java:1132)
	at org.datanucleus.store.rdbms.SQLController.processConnectionStatement(SQLController.java:573)
	at org.datanucleus.store.rdbms.SQLController.executeStatementUpdate(SQLController.java:366)
	at org.datanucleus.store.rdbms.scostore.RDBMSJoinListStoreSpecialization.internalAdd(RDBMSJoinListStoreSpecialization.java:425)
	at org.datanucleus.store.mapped.scostore.JoinListStore.internalAdd(JoinListStore.java:239)
	at org.datanucleus.store.mapped.scostore.AbstractListStore.addAll(AbstractListStore.java:128)
	at org.datanucleus.store.mapped.mapping.CollectionMapping.postInsert(CollectionMapping.java:157)
	at org.datanucleus.store.rdbms.request.InsertRequest.execute(InsertRequest.java:515)
	at org.datanucleus.store.rdbms.RDBMSPersistenceHandler.insertTable(RDBMSPersistenceHandler.java:200)
	at org.datanucleus.store.rdbms.RDBMSPersistenceHandler.insertObject(RDBMSPersistenceHandler.java:179)
	at org.datanucleus.state.JDOStateManagerImpl.internalMakePersistent(JDOStateManagerImpl.java:3097)
	at org.datanucleus.state.JDOStateManagerImpl.makePersistent(JDOStateManagerImpl.java:3073)
	at org.datanucleus.ObjectManagerImpl.persistObjectInternal(ObjectManagerImpl.java:1280)
	at org.datanucleus.store.mapped.mapping.PersistenceCapableMapping.setObjectAsValue(PersistenceCapableMapping.java:604)
	at org.datanucleus.store.mapped.mapping.PersistenceCapableMapping.setObject(PersistenceCapableMapping.java:364)
	at org.datanucleus.store.rdbms.fieldmanager.ParameterSetter.storeObjectField(ParameterSetter.java:197)
	at org.datanucleus.state.AbstractStateManager.providedObjectField(AbstractStateManager.java:1011)
	at org.apache.hadoop.hive.metastore.model.MTable.jdoProvideField(MTable.java)
	at org.apache.hadoop.hive.metastore.model.MTable.jdoProvideFields(MTable.java)
	at org.datanucleus.state.JDOStateManagerImpl.provideFields(JDOStateManagerImpl.java:2627)
	at org.datanucleus.store.rdbms.request.InsertRequest.execute(InsertRequest.java:294)
	at org.datanucleus.store.rdbms.RDBMSPersistenceHandler.insertTable(RDBMSPersistenceHandler.java:200)
	at org.datanucleus.store.rdbms.RDBMSPersistenceHandler.insertObject(RDBMSPersistenceHandler.java:179)
	at org.datanucleus.state.JDOStateManagerImpl.internalMakePersistent(JDOStateManagerImpl.java:3097)
	at org.datanucleus.state.JDOStateManagerImpl.makePersistent(JDOStateManagerImpl.java:3073)
	at org.datanucleus.ObjectManagerImpl.persistObjectInternal(ObjectManagerImpl.java:1280)
	at org.datanucleus.ObjectManagerImpl.persistObject(ObjectManagerImpl.java:1157)
	at org.datanucleus.jdo.JDOPersistenceManager.jdoMakePersistent(JDOPersistenceManager.java:668)
	... 15 more


It looks like the auto-create forgot to create the column "COLUMN_NAME".

So I again fixed it manually, running the following in my DB:
ALTER TABLE COLUMNS ADD COLUMN_NAME varchar(256) NOT NULL
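
Putting together the failing INSERT and the two ALTER statements above, the COLUMNS table that auto-create seems to expect would look roughly like the sketch below. The lengths for COLUMN_NAME and TYPE_NAME come from those ALTER statements; the types for SD_ID, COMMENT and INTEGER_IDX are only my guesses, not the official schema:

CREATE TABLE COLUMNS (
  SD_ID       bigint        NOT NULL,  -- guessed type
  COMMENT     varchar(256)  NULL,      -- guessed type/length
  COLUMN_NAME varchar(256)  NOT NULL,  -- length from my manual ALTER above
  TYPE_NAME   varchar(128)  NOT NULL,  -- length from the auto-create ALTER in the log
  INTEGER_IDX int           NOT NULL   -- guessed type
)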

At this point I was able to run Hive through the CLI successfully, but running "show tables;" from the command line still gives me:
FAILED: Error in metadata: javax.jdo.JDOFatalInternalException: Error creating transactional
connection factory
NestedThrowables:
java.lang.reflect.InvocationTargetException
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask

Please contact me if you need further information.



> Various issues when using MS-SQL2005 as the Hive Metastore
> ----------------------------------------------------------
>
>                 Key: HIVE-1391
>                 URL: https://issues.apache.org/jira/browse/HIVE-1391
>             Project: Hadoop Hive
>          Issue Type: Bug
>          Components: Metastore
>    Affects Versions: 0.5.0
>            Reporter: Alex Rovner
>         Attachments: hive-trace.txt
>

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.

