hive-issues mailing list archives

From "Alina GHERMAN (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (HIVE-6425) Unable to create external table with 3000+ columns
Date Tue, 15 Dec 2015 15:03:46 GMT

    [ https://issues.apache.org/jira/browse/HIVE-6425?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15058173#comment-15058173 ]

Alina GHERMAN commented on HIVE-6425:
-------------------------------------

In fact, the limit was not on the number of columns but on the number of characters in the
SERDEPROPERTIES value for "hbase.columns.mapping", which can hold at most 4000 characters.
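To illustrate where that string lives, here is a hypothetical sketch of the kind of table definition that triggers the problem (table and column names are invented for the example): the whole mapping is a single comma-separated property value, so it grows with every column and can easily exceed 4000 characters for a 3000+ column table.

```sql
-- Hypothetical example: "hbase.columns.mapping" is one long string,
-- one entry per column, stored as a single SERDE_PARAMS row in the metastore.
CREATE EXTERNAL TABLE hbase_wide_table (rowkey STRING, col1 STRING, col2 STRING /* ... 3000+ more ... */)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES (
  "hbase.columns.mapping" = ":key,cf:col1,cf:col2"  -- grows with each added column
)
TBLPROPERTIES ("hbase.table.name" = "wide_table");
```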

So this bug is caused by the underlying metastore error: ERROR: value too long for type character varying(4000)
Workaround: https://support.pivotal.io/hc/en-us/articles/203422043-ERROR-value-too-long-for-type-character-varying-4000-
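The workaround in the linked article amounts to widening the PARAM_VALUE column in the metastore database. A minimal sketch, assuming a PostgreSQL-backed metastore (back up the metastore database before altering its schema):

```sql
-- Assumption: the metastore is on PostgreSQL and uses the stock Hive schema.
-- Widen the column that stores SERDE properties such as "hbase.columns.mapping".
ALTER TABLE "SERDE_PARAMS" ALTER COLUMN "PARAM_VALUE" TYPE TEXT;
-- "TABLE_PARAMS"."PARAM_VALUE" has the same varchar(4000) limit and may need
-- the same treatment if table-level properties grow large.
ALTER TABLE "TABLE_PARAMS" ALTER COLUMN "PARAM_VALUE" TYPE TEXT;
```

Note that hand-editing the metastore schema is unsupported, so this should be treated as a stopgap rather than a fix.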






> Unable to create external table with 3000+ columns
> --------------------------------------------------
>
>                 Key: HIVE-6425
>                 URL: https://issues.apache.org/jira/browse/HIVE-6425
>             Project: Hive
>          Issue Type: Bug
>          Components: Metastore
>    Affects Versions: 0.10.0
>         Environment: Linux, CDH 4.2.0
>            Reporter: Anurag
>              Labels: patch
>         Attachments: Hive_Script.txt
>
>
> While creating an external table in Hive to a table in HBase with 3000+ columns, Hive shows an error:
> FAILED: Error in metadata: MetaException(message:javax.jdo.JDODataStoreException: Put request failed : INSERT INTO "SERDE_PARAMS" ("PARAM_VALUE","SERDE_ID","PARAM_KEY") VALUES (?,?,?)
> NestedThrowables:
> org.datanucleus.store.rdbms.exceptions.MappedDatastoreException: INSERT INTO "SERDE_PARAMS" ("PARAM_VALUE","SERDE_ID","PARAM_KEY") VALUES (?,?,?) )
> FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
