phoenix-dev mailing list archives

From "Junegunn Choi (JIRA)" <j...@apache.org>
Subject [jira] [Reopened] (PHOENIX-930) duplicated columns cause query exception and drop table exception
Date Tue, 31 May 2016 16:32:12 GMT

     [ https://issues.apache.org/jira/browse/PHOENIX-930?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Junegunn Choi reopened PHOENIX-930:
-----------------------------------
      Assignee: Junegunn Choi

Reopening as it's reproducible with the current master.

{code:sql}
create table a (id integer primary key, dupe integer, dupe integer);
drop table a;
{code}

I can confirm that the suggested patch, which is fairly straightforward, prevents the
anomaly. Let me revise the patch to include a test case.
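
Roughly what I have in mind for the test (a sketch only; the class name, JDBC URL and setup below are placeholders, not the final patch):

{code:java}
// Sketch: CREATE TABLE with a duplicated column must fail with
// ColumnAlreadyExistsException and must not leave a half-created table behind.
import java.sql.Connection;
import java.sql.DriverManager;

import org.apache.phoenix.schema.ColumnAlreadyExistsException;
import org.apache.phoenix.schema.TableNotFoundException;
import org.junit.Test;

import static org.junit.Assert.fail;

public class DuplicateColumnIT {  // hypothetical class name

    // Placeholder connection URL; the real test would use the usual
    // Phoenix integration-test base class and test cluster URL.
    private static final String URL = "jdbc:phoenix:localhost";

    @Test
    public void testCreateTableWithDuplicateColumnFails() throws Exception {
        try (Connection conn = DriverManager.getConnection(URL)) {
            try {
                conn.createStatement().execute(
                    "CREATE TABLE A (ID INTEGER PRIMARY KEY, DUPE INTEGER, DUPE INTEGER)");
                fail("CREATE TABLE with a duplicated column should have failed");
            } catch (ColumnAlreadyExistsException e) {
                // expected: the duplicate is rejected before the table is created
            }
            // The table must not exist afterwards, so querying it should fail with
            // TableNotFoundException rather than ArrayIndexOutOfBoundsException.
            try {
                conn.createStatement().executeQuery("SELECT * FROM A");
                fail("table A should not have been created");
            } catch (TableNotFoundException e) {
                // expected
            }
        }
    }
}
{code}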

> duplicated columns cause query exception and drop table exception
> -----------------------------------------------------------------
>
>                 Key: PHOENIX-930
>                 URL: https://issues.apache.org/jira/browse/PHOENIX-930
>             Project: Phoenix
>          Issue Type: Bug
>    Affects Versions: 3.0.0
>            Reporter: wangkai
>            Assignee: Junegunn Choi
>         Attachments: PHOENIX-930
>
>
> When I create a table like this: "create table test (id varchar not null primary key, f.name varchar, f.email varchar, f.email varchar)", it throws an org.apache.phoenix.schema.ColumnAlreadyExistsException, but the table is still created successfully.
> Then when I run a query like "select * from test", an exception is thrown:
> Caused by: java.lang.ArrayIndexOutOfBoundsException: 3
>         at org.apache.phoenix.schema.PTableImpl.init(PTableImpl.java:283)
>         at org.apache.phoenix.schema.PTableImpl.<init>(PTableImpl.java:216)
>         at org.apache.phoenix.schema.PTableImpl.makePTable(PTableImpl.java:209)
>         at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:443)
>         at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.buildTable(MetaDataEndpointImpl.java:254)
>         at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.doGetTable(MetaDataEndpointImpl.java:1077)
>         at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:1023)
>         ... 10 more
> Then when I try to drop the table with "drop table test", an exception is also thrown:
> Caused by: java.lang.ArrayIndexOutOfBoundsException: 3
>         at org.apache.phoenix.schema.PTableImpl.init(PTableImpl.java:283)
>         at org.apache.phoenix.schema.PTableImpl.<init>(PTableImpl.java:216)
>         at org.apache.phoenix.schema.PTableImpl.makePTable(PTableImpl.java:209)
>         at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:443)
>         at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.buildTable(MetaDataEndpointImpl.java:254)
>         at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.doGetTable(MetaDataEndpointImpl.java:1077)
>         at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:1023)
>         ... 10 more
> So I have to drop SYSTEM.CATALOG and SYSTEM.SEQUENCE from the HBase shell…
> The ArrayIndexOutOfBoundsException is thrown because the position of the f.email column in the CATALOG table is not correct. I think it would be better to check the columns for duplicates before creating the table.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
