phoenix-dev mailing list archives

From "James Taylor (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (PHOENIX-4646) The data exceeds the max capacity for the data type error for valid scenarios.
Date Thu, 08 Mar 2018 22:31:01 GMT

    [ https://issues.apache.org/jira/browse/PHOENIX-4646?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16392023#comment-16392023 ]

James Taylor commented on PHOENIX-4646:
---------------------------------------

I think it's legal to assign a CHAR(10) into a CHAR(5), for example, but I suspect Phoenix
would throw an exception in this case. If the trimmed CHAR(10) fits into the CHAR(5), it seems
like a similar situation to the VARCHAR case (with an equivalent fix needed). I believe we're
even supposed to silently truncate the CHAR(10) regardless of trailing nulls, but we don't do
that now (PHOENIX-1145). We also don't support multi-byte characters in CHAR, which is
non-standard (but done on purpose so we can calculate a constant offset at compile time if in
the row key).
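
For illustration only, a minimal sketch of the kind of CHAR-aware check being discussed: trim trailing padding before comparing against the target length, analogous to the VARCHAR fix. This is not Phoenix's actual API; the class and method names below are made up.
{noformat}
// Hypothetical sketch, not Phoenix code: decide whether a CHAR value fits a
// narrower CHAR column by ignoring trailing pad characters first.
public final class CharFitSketch {
    static boolean fitsAfterTrim(String charValue, int targetMaxLength) {
        int end = charValue.length();
        while (end > 0 && charValue.charAt(end - 1) == ' ') {
            end--; // drop trailing CHAR padding
        }
        return end <= targetMaxLength;
    }

    public static void main(String[] args) {
        System.out.println(fitsAfterTrim("abc       ", 5)); // true: trimmed "abc" fits CHAR(5)
        System.out.println(fitsAfterTrim("abcdefghij", 5)); // false: still exceeds CHAR(5)
    }
}
{noformat}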
 

> The data exceeds the max capacity for the data type error for valid scenarios.
> ------------------------------------------------------------------------------
>
>                 Key: PHOENIX-4646
>                 URL: https://issues.apache.org/jira/browse/PHOENIX-4646
>             Project: Phoenix
>          Issue Type: Bug
>    Affects Versions: 4.14.0
>            Reporter: Sergey Soldatov
>            Assignee: Sergey Soldatov
>            Priority: Major
>             Fix For: 4.14.0
>
>         Attachments: PHOENIX-4646.patch
>
>
> Here is an example:
> {noformat}
> create table test_trim_source(name varchar(160) primary key, id varchar(120), address varchar(160));
> create table test_trim_target(name varchar(160) primary key, id varchar(10), address varchar(10));
> upsert into test_trim_source values('test','test','test');
> upsert into test_trim_target select * from test_trim_source;
> {noformat}
> It fails with 
> {noformat}
> Error: ERROR 206 (22003): The data exceeds the max capacity for the data type. value='test' columnName=ID (state=22003,code=206)
> java.sql.SQLException: ERROR 206 (22003): The data exceeds the max capacity for the data type. value='test' columnName=ID
> 	at org.apache.phoenix.exception.SQLExceptionCode$Factory$1.newException(SQLExceptionCode.java:489)
> 	at org.apache.phoenix.exception.SQLExceptionInfo.buildException(SQLExceptionInfo.java:150)
> 	at org.apache.phoenix.util.ServerUtil.parseRemoteException(ServerUtil.java:165)
> 	at org.apache.phoenix.util.ServerUtil.parseServerExceptionOrNull(ServerUtil.java:149)
> 	at org.apache.phoenix.util.ServerUtil.parseServerException(ServerUtil.java:116)
> 	at org.apache.phoenix.iterate.BaseResultIterators.getIterators(BaseResultIterators.java:1261)
> 	at org.apache.phoenix.iterate.BaseResultIterators.getIterators(BaseResultIterators.java:1203)
> 	at org.apache.phoenix.iterate.RoundRobinResultIterator.getIterators(RoundRobinResultIterator.java:176)
> 	at org.apache.phoenix.iterate.RoundRobinResultIterator.next(RoundRobinResultIterator.java:91)
> 	at org.apache.phoenix.compile.UpsertCompiler$ClientUpsertSelectMutationPlan.execute(UpsertCompiler.java:1300)
> 	at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:398)
> 	at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:381)
> 	at org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
> 	at org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:380)
> 	at org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:368)
> 	at org.apache.phoenix.jdbc.PhoenixStatement.execute(PhoenixStatement.java:1794)
> 	at sqlline.Commands.execute(Commands.java:822)
> 	at sqlline.Commands.sql(Commands.java:732)
> 	at sqlline.SqlLine.dispatch(SqlLine.java:813)
> 	at sqlline.SqlLine.begin(SqlLine.java:686)
> 	at sqlline.SqlLine.start(SqlLine.java:398)
> 	at sqlline.SqlLine.main(SqlLine.java:291)
> Caused by: java.sql.SQLException: ERROR 206 (22003): The data exceeds the max capacity for the data type. value='test' columnName=ID
> 	at org.apache.phoenix.exception.SQLExceptionCode$Factory$1.newException(SQLExceptionCode.java:489)
> 	at org.apache.phoenix.exception.SQLExceptionInfo.buildException(SQLExceptionInfo.java:150)
> 	at org.apache.phoenix.compile.UpsertCompiler.upsertSelect(UpsertCompiler.java:235)
> 	at org.apache.phoenix.compile.UpsertCompiler$UpsertingParallelIteratorFactory.mutate(UpsertCompiler.java:284)
> 	at org.apache.phoenix.compile.MutatingParallelIteratorFactory.newIterator(MutatingParallelIteratorFactory.java:59)
> 	at org.apache.phoenix.iterate.ParallelIterators$1.call(ParallelIterators.java:121)
> 	at org.apache.phoenix.iterate.ParallelIterators$1.call(ParallelIterators.java:113)
> 	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> 	at org.apache.phoenix.job.JobManager$InstrumentedJobFutureTask.run(JobManager.java:183)
> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> 	at java.lang.Thread.run(Thread.java:745)
> {noformat} 
> The problem is that PVarchar.isSizeCompatible ignores the actual length of the value whenever the source column has a declared max size for it.
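
As a rough, hypothetical illustration of the check described above (the real PVarchar.isSizeCompatible signature and surrounding code differ; the names below are simplified):
{noformat}
// Simplified sketch, not Phoenix code: srcMaxLength is the source column's
// declared length (may be null), valueLength is the length of the value
// actually being upserted, targetMaxLength is the target column's length.
public final class SizeCompatSketch {

    // Behavior described in the report: once the source column declares a max
    // length, the actual value length is never consulted, so VARCHAR(120) ->
    // VARCHAR(10) fails even for the 4-character value 'test'.
    static boolean isSizeCompatibleBuggy(Integer srcMaxLength, int valueLength, int targetMaxLength) {
        if (srcMaxLength != null) {
            return srcMaxLength <= targetMaxLength;
        }
        return valueLength <= targetMaxLength;
    }

    // Intended behavior: the actual value length decides whether the data fits.
    static boolean isSizeCompatibleFixed(int valueLength, int targetMaxLength) {
        return valueLength <= targetMaxLength;
    }

    public static void main(String[] args) {
        System.out.println(isSizeCompatibleBuggy(120, "test".length(), 10)); // false -> ERROR 206
        System.out.println(isSizeCompatibleFixed("test".length(), 10));      // true
    }
}
{noformat}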



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
