phoenix-dev mailing list archives

From "Pedro Boado (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (PHOENIX-3163) Split during global index creation may cause ERROR 201 error
Date Sun, 13 May 2018 22:43:00 GMT

    [ https://issues.apache.org/jira/browse/PHOENIX-3163?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16473653#comment-16473653 ]

Pedro Boado commented on PHOENIX-3163:
--------------------------------------

Hi [~sergey.soldatov] 

I've just noticed that SkipScanAfterManualSplitIT.testManualSplit has started failing on branch
4.x-HBase-1.2.
{code:java}
2018-05-13 23:41:03,729 DEBUG [B.defaultRpcServer.handler=1,queue=0,port=43819] org.apache.hadoop.hbase.ipc.CallRunner(115):
B.defaultRpcServer.handler=1,queue=0,port=43819: callId: 465 service: ClientService methodName:
Scan size: 595 connection: 127.0.0.1:57462
org.apache.hadoop.hbase.NotServingRegionException: Region T000002,\x01,1526251248024.34b289cddcb2b99d8e776602b796d731.
is not online on xps,43819,1526251221783
	at org.apache.hadoop.hbase.regionserver.HRegionServer.getRegionByEncodedName(HRegionServer.java:2942)
	at org.apache.hadoop.hbase.regionserver.RSRpcServices.getRegion(RSRpcServices.java:1072)
	at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2410)
	at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:33648)
	at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2188)
	at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:112)
	at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:133)
	at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:108)
	at java.lang.Thread.run(Thread.java:745)
2018-05-13 23:41:03,729 DEBUG [B.defaultRpcServer.handler=2,queue=0,port=43819] org.apache.hadoop.hbase.ipc.CallRunner(115):
B.defaultRpcServer.handler=2,queue=0,port=43819: callId: 467 service: ClientService methodName:
Scan size: 586 connection: 127.0.0.1:57462
org.apache.hadoop.hbase.NotServingRegionException: Region T000002,\x02,1526251248024.9201bdc4f44225f390edb40ab1548a82.
is not online on xps,43819,1526251221783
	at org.apache.hadoop.hbase.regionserver.HRegionServer.getRegionByEncodedName(HRegionServer.java:2942)
	at org.apache.hadoop.hbase.regionserver.RSRpcServices.getRegion(RSRpcServices.java:1072)
	at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2410)
	at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:33648)
	at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2188)
	at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:112)
	at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:133)
	at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:108)
	at java.lang.Thread.run(Thread.java:745)
2018-05-13 23:41:03,729 DEBUG [B.defaultRpcServer.handler=0,queue=0,port=43819] org.apache.hadoop.hbase.ipc.CallRunner(115):
B.defaultRpcServer.handler=0,queue=0,port=43819: callId: 466 service: ClientService methodName:
Scan size: 585 connection: 127.0.0.1:57462
org.apache.hadoop.hbase.NotServingRegionException: Region T000002,,1526251248024.9bb19fa73f91248dd407192c4ce512fe.
is not online on xps,43819,1526251221783
	at org.apache.hadoop.hbase.regionserver.HRegionServer.getRegionByEncodedName(HRegionServer.java:2942)
	at org.apache.hadoop.hbase.regionserver.RSRpcServices.getRegion(RSRpcServices.java:1072)
	at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2410)
	at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:33648)
	at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2188)
	at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:112)
	at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:133)
	at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:108)
	at java.lang.Thread.run(Thread.java:745)
2018-05-13 23:41:03,729 DEBUG [B.defaultRpcServer.handler=3,queue=0,port=43819] org.apache.hadoop.hbase.ipc.CallRunner(115):
B.defaultRpcServer.handler=3,queue=0,port=43819: callId: 468 service: ClientService methodName:
Scan size: 595 connection: 127.0.0.1:57462
org.apache.hadoop.hbase.NotServingRegionException: Region T000002,\x03,1526251248024.6944b7e5e33cdcbdcc674c745ad8c1a5.
is not online on xps,43819,1526251221783
	at org.apache.hadoop.hbase.regionserver.HRegionServer.getRegionByEncodedName(HRegionServer.java:2942)
	at org.apache.hadoop.hbase.regionserver.RSRpcServices.getRegion(RSRpcServices.java:1072)
	at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2410)
	at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:33648)
	at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2188)
	at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:112)
	at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:133)
	at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:108)
	at java.lang.Thread.run(Thread.java:745)
2018-05-13 23:41:03,790 DEBUG [RS:0;xps:43819-splits-1526251252838] org.apache.hadoop.hbase.coprocessor.CoprocessorHost(182):
Loading coprocessor class org.apache.phoenix.hbase.index.Indexer with path null and priority
805306366
2018-05-13 23:41:03,826 DEBUG [RS:0;xps:43819-splits-1526251252838] org.apache.phoenix.hbase.index.Indexer(244):
Setting up recovery writter with failure policy: class org.apache.phoenix.hbase.index.write.recovery.StoreFailuresInCachePolicy
2018-05-13 23:41:03,826 INFO  [RS:0;xps:43819-splits-1526251252838] org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost(367):
Loaded coprocessor org.apache.phoenix.hbase.index.Indexer from HTD of T000002 successfully.
2018-05-13 23:41:03,826 DEBUG [RS:0;xps:43819-splits-1526251252838] org.apache.hadoop.hbase.coprocessor.CoprocessorHost(182):
Loading coprocessor class org.apache.phoenix.coprocessor.ServerCachingEndpointImpl with path
null and priority 805306366
2018-05-13 23:41:03,826 DEBUG [RS:0;xps:43819-splits-1526251252838] org.apache.hadoop.hbase.regionserver.HRegion(7836):
Registered coprocessor service: region=T000002,\x01sp,1526251261881.bff1498b104381411dbcc5c1b104e0e8.
service=ServerCachingService
2018-05-13 23:41:03,826 INFO  [RS:0;xps:43819-splits-1526251252838] org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost(367):
Loaded coprocessor org.apache.phoenix.coprocessor.ServerCachingEndpointImpl from HTD of T000002
successfully.
2018-05-13 23:41:03,826 DEBUG [RS:0;xps:43819-splits-1526251252838] org.apache.hadoop.hbase.coprocessor.CoprocessorHost(182):
Loading coprocessor class org.apache.phoenix.coprocessor.GroupedAggregateRegionObserver with
path null and priority 805306366
2018-05-13 23:41:03,827 INFO  [RS:0;xps:43819-splits-1526251252838] org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost(367):
Loaded coprocessor org.apache.phoenix.coprocessor.GroupedAggregateRegionObserver from HTD
of T000002 successfully.
2018-05-13 23:41:03,827 DEBUG [RS:0;xps:43819-splits-1526251252838] org.apache.hadoop.hbase.coprocessor.CoprocessorHost(182):
Loading coprocessor class org.apache.phoenix.coprocessor.UngroupedAggregateRegionObserver
with path null and priority 805306366
2018-05-13 23:41:03,847 DEBUG [B.defaultRpcServer.handler=4,queue=0,port=43819] org.apache.hadoop.hbase.ipc.CallRunner(115):
B.defaultRpcServer.handler=4,queue=0,port=43819: callId: 472 service: ClientService methodName:
Scan size: 595 connection: 127.0.0.1:57462
org.apache.hadoop.hbase.DoNotRetryIOException: ERROR 1108 (XCL08): Cache of region boundaries
are out of date. tableName=T000002
	at org.apache.phoenix.coprocessor.BaseScannerRegionObserver.throwIfScanOutOfRegion(BaseScannerRegionObserver.java:175)
	at org.apache.phoenix.coprocessor.BaseScannerRegionObserver.preScannerOpen(BaseScannerRegionObserver.java:203)
	at org.apache.phoenix.coprocessor.UngroupedAggregateRegionObserver.preScannerOpen(UngroupedAggregateRegionObserver.java:327)
	at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost$50.call(RegionCoprocessorHost.java:1300)
	at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost$RegionOperation.call(RegionCoprocessorHost.java:1673)
	at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost.execOperation(RegionCoprocessorHost.java:1749)
	at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost.execOperationWithResult(RegionCoprocessorHost.java:1722)
	at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost.preScannerOpen(RegionCoprocessorHost.java:1295)
	at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2428)
	at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:33648)
	at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2188)
	at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:112)
	at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:133)
	at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:108)
	at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.phoenix.schema.StaleRegionBoundaryCacheException: ERROR 1108 (XCL08):
Cache of region boundaries are out of date. tableName=T000002
	at org.apache.phoenix.coprocessor.BaseScannerRegionObserver.throwIfScanOutOfRegion(BaseScannerRegionObserver.java:174)
	... 14 more
2018-05-13 23:41:03,851 DEBUG [B.defaultRpcServer.handler=0,queue=0,port=43819] org.apache.phoenix.coprocessor.UngroupedAggregateRegionObserver(528):
Starting ungrouped coprocessor scan {"timeRange":[0,9223372036854775807],"batch":-1,"startRow":"\\x00
c","stopRow":"\\x00 c\\x00","loadColumnFamiliesOnDemand":true,"totalColumns":1,"cacheBlocks":true,"families":{"0":["ALL"]},"maxResultSize":2097152,"maxVersions":1,"filter":"FilterList
AND (2/2): [FirstKeyOnlyFilter, SkipScanFilter [[\\x00 c]]]","caching":2147483647} {ENCODED
=> ed285ee58edd90670656b3753dd6a3fd, NAME => 'T000002,,1526251252848.ed285ee58edd90670656b3753dd6a3fd.',
STARTKEY => '', ENDKEY => '\x00jj'}
2018-05-13 23:41:03,851 DEBUG [B.defaultRpcServer.handler=0,queue=0,port=43819] org.apache.phoenix.coprocessor.UngroupedAggregateRegionObserver(822):
Finished scanning 0 rows for ungrouped coprocessor scan {"timeRange":[0,9223372036854775807],"batch":-1,"startRow":"\\x00
c","stopRow":"\\x00 c\\x00","loadColumnFamiliesOnDemand":true,"totalColumns":1,"cacheBlocks":true,"families":{"0":["ALL"]},"maxResultSize":2097152,"maxVersions":1,"filter":"FilterList
AND (2/2): [FirstKeyOnlyFilter, SkipScanFilter [[\\x00 c]]]","caching":2147483647}
2018-05-13 23:41:03,852 DEBUG [B.defaultRpcServer.handler=2,queue=0,port=43819] org.apache.hadoop.hbase.ipc.CallRunner(115):
B.defaultRpcServer.handler=2,queue=0,port=43819: callId: 475 service: ClientService methodName:
Scan size: 595 connection: 127.0.0.1:57462
org.apache.hadoop.hbase.DoNotRetryIOException: ERROR 1108 (XCL08): Cache of region boundaries
are out of date. tableName=T000002
	at org.apache.phoenix.coprocessor.BaseScannerRegionObserver.throwIfScanOutOfRegion(BaseScannerRegionObserver.java:175)
	at org.apache.phoenix.coprocessor.BaseScannerRegionObserver.preScannerOpen(BaseScannerRegionObserver.java:203)
	at org.apache.phoenix.coprocessor.UngroupedAggregateRegionObserver.preScannerOpen(UngroupedAggregateRegionObserver.java:327)
	at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost$50.call(RegionCoprocessorHost.java:1300)
	at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost$RegionOperation.call(RegionCoprocessorHost.java:1673)
	at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost.execOperation(RegionCoprocessorHost.java:1749)
	at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost.execOperationWithResult(RegionCoprocessorHost.java:1722)
	at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost.preScannerOpen(RegionCoprocessorHost.java:1295)
	at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2428)
	at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:33648)
	at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2188)
	at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:112)
	at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:133)
	at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:108)
	at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.phoenix.schema.StaleRegionBoundaryCacheException: ERROR 1108 (XCL08):
Cache of region boundaries are out of date. tableName=T000002
	at org.apache.phoenix.coprocessor.BaseScannerRegionObserver.throwIfScanOutOfRegion(BaseScannerRegionObserver.java:174)
	... 14 more
2018-05-13 23:41:03,853 DEBUG [B.defaultRpcServer.handler=3,queue=0,port=43819] org.apache.phoenix.coprocessor.UngroupedAggregateRegionObserver(528):
Starting ungrouped coprocessor scan {"timeRange":[0,9223372036854775807],"batch":-1,"startRow":"\\x02
a","stopRow":"\\x02 a\\x00","loadColumnFamiliesOnDemand":true,"totalColumns":1,"cacheBlocks":true,"families":{"0":["ALL"]},"maxResultSize":2097152,"maxVersions":1,"filter":"FilterList
AND (2/2): [FirstKeyOnlyFilter, SkipScanFilter [[\\x02 a]]]","caching":2147483647} {ENCODED
=> c5c9386b8657bed3abfba259adc54cf5, NAME => 'T000002,\x02,1526251257604.c5c9386b8657bed3abfba259adc54cf5.',
STARTKEY => '\x02', ENDKEY => '\x02jh'}
2018-05-13 23:41:03,854 DEBUG [B.defaultRpcServer.handler=3,queue=0,port=43819] org.apache.phoenix.coprocessor.UngroupedAggregateRegionObserver(822):
Finished scanning 0 rows for ungrouped coprocessor scan {"timeRange":[0,9223372036854775807],"batch":-1,"startRow":"\\x02
a","stopRow":"\\x02 a\\x00","loadColumnFamiliesOnDemand":true,"totalColumns":1,"cacheBlocks":true,"families":{"0":["ALL"]},"maxResultSize":2097152,"maxVersions":1,"filter":"FilterList
AND (2/2): [FirstKeyOnlyFilter, SkipScanFilter [[\\x02 a]]]","caching":2147483647}
2018-05-13 23:41:03,866 INFO  [RS:0;xps:43819-splits-1526251252838] org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost(367):
Loaded coprocessor org.apache.phoenix.coprocessor.UngroupedAggregateRegionObserver from HTD
of T000002 successfully.
2018-05-13 23:41:03,867 DEBUG [RS:0;xps:43819-splits-1526251252838] org.apache.hadoop.hbase.coprocessor.CoprocessorHost(182):
Loading coprocessor class org.apache.phoenix.coprocessor.ScanRegionObserver with path null
and priority 805306366
2018-05-13 23:41:03,867 INFO  [RS:0;xps:43819-splits-1526251252838] org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost(367):
Loaded coprocessor org.apache.phoenix.coprocessor.ScanRegionObserver from HTD of T000002 successfully.
2018-05-13 23:41:03,867 DEBUG [RS:0;xps:43819-splits-1526251252838] org.apache.hadoop.hbase.regionserver.MetricsRegionSourceImpl(70):
Creating new MetricsRegionSourceImpl for table T000002 bff1498b104381411dbcc5c1b104e0e8
2018-05-13 23:41:03,867 DEBUG [RS:0;xps:43819-splits-1526251252838] org.apache.hadoop.hbase.regionserver.HRegion(738):
Instantiated T000002,\x01sp,1526251261881.bff1498b104381411dbcc5c1b104e0e8.
2018-05-13 23:41:03,877 DEBUG [main] org.apache.phoenix.util.ReadOnlyProps(317): Creating
new ReadOnlyProps due to phoenix.query.force.rowkeyorder with true!=false
2018-05-13 23:41:03,877 INFO  [StoreOpener-730c649272e9520771a3b38973dea323-1] org.apache.hadoop.hbase.io.hfile.CacheConfig(236):
Created cacheConfig for 0: blockCache=LruBlockCache{blockCount=0, currentSize=3061960, freeSize=2981814584,
maxSize=2984876544, heapSize=3061960, minSize=2835632640, minFactor=0.95, multiSize=1417816320,
multiFactor=0.5, singleSize=708908160, singleFactor=0.25}, cacheDataOnRead=true, cacheDataOnWrite=false,
cacheIndexesOnWrite=false, cacheBloomsOnWrite=false, cacheEvictOnClose=false, cacheDataCompressed=false,
prefetchOnOpen=false
2018-05-13 23:41:03,877 INFO  [StoreOpener-730c649272e9520771a3b38973dea323-1] org.apache.hadoop.hbase.regionserver.compactions.CompactionConfiguration(104):
size [134217728, 9223372036854775807, 9223372036854775807); files [3, 10); ratio 1.200000;
off-peak ratio 5.000000; throttle point 2684354560; major period 604800000, major jitter 0.500000,
min locality to compact 0.000000
2018-05-13 23:41:03,878 INFO  [StoreOpener-bff1498b104381411dbcc5c1b104e0e8-1] org.apache.hadoop.hbase.io.hfile.CacheConfig(236):
Created cacheConfig for 0: blockCache=LruBlockCache{blockCount=0, currentSize=3061960, freeSize=2981814584,
maxSize=2984876544, heapSize=3061960, minSize=2835632640, minFactor=0.95, multiSize=1417816320,
multiFactor=0.5, singleSize=708908160, singleFactor=0.25}, cacheDataOnRead=true, cacheDataOnWrite=false,
cacheIndexesOnWrite=false, cacheBloomsOnWrite=false, cacheEvictOnClose=false, cacheDataCompressed=false,
prefetchOnOpen=false
2018-05-13 23:41:03,878 INFO  [StoreOpener-bff1498b104381411dbcc5c1b104e0e8-1] org.apache.hadoop.hbase.regionserver.compactions.CompactionConfiguration(104):
size [134217728, 9223372036854775807, 9223372036854775807); files [3, 10); ratio 1.200000;
off-peak ratio 5.000000; throttle point 2684354560; major period 604800000, major jitter 0.500000,
min locality to compact 0.000000

java.lang.AssertionError: 
Expected :2
Actual   :0
{code}
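For context, the StaleRegionBoundaryCacheException above is the server's signal that the client's cached region boundaries no longer match the post-split layout; the client is expected to invalidate its cache and retry the scan. The pattern is roughly the following (an illustrative sketch only, with hypothetical names, not Phoenix's actual client code):
{code:java}
import java.util.concurrent.atomic.AtomicInteger;
import java.util.function.Supplier;

// Hypothetical stand-in for the server-side "your boundary cache is stale" signal.
class StaleRegionBoundaryCacheException extends RuntimeException {}

public class StaleCacheRetryDemo {
    // Runs an action, refreshing the (hypothetical) boundary cache whenever the
    // server reports that the cached split points are out of date, up to a limit.
    static <T> T withBoundaryRetry(int maxAttempts,
                                   Runnable refreshCache,
                                   Supplier<T> action) {
        for (int attempt = 1; ; attempt++) {
            try {
                return action.get();
            } catch (StaleRegionBoundaryCacheException e) {
                if (attempt >= maxAttempts) throw e;
                refreshCache.run(); // re-read region boundaries, then retry the scan
            }
        }
    }

    public static void main(String[] args) {
        AtomicInteger refreshes = new AtomicInteger();
        AtomicInteger calls = new AtomicInteger();
        // Simulate a scan that hits stale boundaries twice before succeeding.
        String result = withBoundaryRetry(5,
            refreshes::incrementAndGet,
            () -> {
                if (calls.incrementAndGet() <= 2) {
                    throw new StaleRegionBoundaryCacheException();
                }
                return "scan-ok";
            });
        System.out.println(result + " after " + refreshes.get() + " refreshes");
    }
}
{code}
The assertion failure suggests the retry path did not pick up the rows from the newly split regions in this run, rather than the stale-cache signal itself being wrong.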

Thanks!

> Split during global index creation may cause ERROR 201 error
> ------------------------------------------------------------
>
>                 Key: PHOENIX-3163
>                 URL: https://issues.apache.org/jira/browse/PHOENIX-3163
>             Project: Phoenix
>          Issue Type: Bug
>    Affects Versions: 4.8.0
>            Reporter: Sergey Soldatov
>            Assignee: Sergey Soldatov
>            Priority: Major
>             Fix For: 4.14.0, 5.0.0
>
>         Attachments: PHOENIX-3163_v1.patch, PHOENIX-3163_v3.patch, PHOENIX-3163_v4.patch,
PHOENIX-3163_v5.patch, PHOENIX-3163_v6.patch
>
>
> When we create a global index and a split happens in the meantime, there is a chance of failing
with ERROR 201:
> {noformat}
> 2016-08-08 15:55:17,248 INFO  [Thread-6] org.apache.phoenix.iterate.BaseResultIterators(878):
Failed to execute task during cancel
> java.util.concurrent.ExecutionException: java.sql.SQLException: ERROR 201 (22000): Illegal
data.
> 	at java.util.concurrent.FutureTask.report(FutureTask.java:122)
> 	at java.util.concurrent.FutureTask.get(FutureTask.java:192)
> 	at org.apache.phoenix.iterate.BaseResultIterators.close(BaseResultIterators.java:872)
> 	at org.apache.phoenix.iterate.BaseResultIterators.getIterators(BaseResultIterators.java:809)
> 	at org.apache.phoenix.iterate.BaseResultIterators.getIterators(BaseResultIterators.java:713)
> 	at org.apache.phoenix.iterate.RoundRobinResultIterator.getIterators(RoundRobinResultIterator.java:176)
> 	at org.apache.phoenix.iterate.RoundRobinResultIterator.next(RoundRobinResultIterator.java:91)
> 	at org.apache.phoenix.compile.UpsertCompiler$2.execute(UpsertCompiler.java:815)
> 	at org.apache.phoenix.compile.DelegateMutationPlan.execute(DelegateMutationPlan.java:31)
> 	at org.apache.phoenix.compile.PostIndexDDLCompiler$1.execute(PostIndexDDLCompiler.java:124)
> 	at org.apache.phoenix.query.ConnectionQueryServicesImpl.updateData(ConnectionQueryServicesImpl.java:2823)
> 	at org.apache.phoenix.schema.MetaDataClient.buildIndex(MetaDataClient.java:1079)
> 	at org.apache.phoenix.schema.MetaDataClient.createIndex(MetaDataClient.java:1382)
> 	at org.apache.phoenix.compile.CreateIndexCompiler$1.execute(CreateIndexCompiler.java:85)
> 	at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:343)
> 	at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:331)
> 	at org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
> 	at org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:330)
> 	at org.apache.phoenix.jdbc.PhoenixStatement.execute(PhoenixStatement.java:1440)
> 	at org.apache.phoenix.hbase.index.write.TestIndexWriter$1.run(TestIndexWriter.java:93)
> Caused by: java.sql.SQLException: ERROR 201 (22000): Illegal data.
> 	at org.apache.phoenix.exception.SQLExceptionCode$Factory$1.newException(SQLExceptionCode.java:441)
> 	at org.apache.phoenix.exception.SQLExceptionInfo.buildException(SQLExceptionInfo.java:145)
> 	at org.apache.phoenix.schema.types.PDataType.newIllegalDataException(PDataType.java:287)
> 	at org.apache.phoenix.schema.types.PUnsignedSmallint$UnsignedShortCodec.decodeShort(PUnsignedSmallint.java:146)
> 	at org.apache.phoenix.schema.types.PSmallint.toObject(PSmallint.java:104)
> 	at org.apache.phoenix.schema.types.PSmallint.toObject(PSmallint.java:28)
> 	at org.apache.phoenix.schema.types.PDataType.toObject(PDataType.java:980)
> 	at org.apache.phoenix.schema.types.PUnsignedSmallint.toObject(PUnsignedSmallint.java:102)
> 	at org.apache.phoenix.schema.types.PDataType.toObject(PDataType.java:980)
> 	at org.apache.phoenix.schema.types.PDataType.toObject(PDataType.java:992)
> 	at org.apache.phoenix.schema.types.PDataType.coerceBytes(PDataType.java:830)
> 	at org.apache.phoenix.schema.types.PDecimal.coerceBytes(PDecimal.java:342)
> 	at org.apache.phoenix.schema.types.PDataType.coerceBytes(PDataType.java:810)
> 	at org.apache.phoenix.expression.CoerceExpression.evaluate(CoerceExpression.java:149)
> 	at org.apache.phoenix.compile.ExpressionProjector.getValue(ExpressionProjector.java:69)
> 	at org.apache.phoenix.jdbc.PhoenixResultSet.getBytes(PhoenixResultSet.java:308)
> 	at org.apache.phoenix.compile.UpsertCompiler.upsertSelect(UpsertCompiler.java:197)
> 	at org.apache.phoenix.compile.UpsertCompiler.access$000(UpsertCompiler.java:115)
> 	at org.apache.phoenix.compile.UpsertCompiler$UpsertingParallelIteratorFactory.mutate(UpsertCompiler.java:259)
> 	at org.apache.phoenix.compile.MutatingParallelIteratorFactory.newIterator(MutatingParallelIteratorFactory.java:59)
> 	at org.apache.phoenix.iterate.ParallelIterators$1.call(ParallelIterators.java:112)
> 	at org.apache.phoenix.iterate.ParallelIterators$1.call(ParallelIterators.java:103)
> 	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> 	at org.apache.phoenix.job.JobManager$InstrumentedJobFutureTask.run(JobManager.java:183)
> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> 	at java.lang.Thread.run(Thread.java:745)
> java.sql.SQLException: ERROR 201 (22000): Illegal data. ERROR 201 (22000): Illegal data.
> 	at org.apache.phoenix.exception.SQLExceptionCode$Factory$1.newException(SQLExceptionCode.java:441)
> 	at org.apache.phoenix.exception.SQLExceptionInfo.buildException(SQLExceptionInfo.java:145)
> 	at org.apache.phoenix.util.ServerUtil.parseRemoteException(ServerUtil.java:129)
> 	at org.apache.phoenix.util.ServerUtil.parseServerExceptionOrNull(ServerUtil.java:118)
> 	at org.apache.phoenix.util.ServerUtil.parseServerException(ServerUtil.java:107)
> 	at org.apache.phoenix.iterate.BaseResultIterators.getIterators(BaseResultIterators.java:768)
> 	at org.apache.phoenix.iterate.BaseResultIterators.getIterators(BaseResultIterators.java:713)
> 	at org.apache.phoenix.iterate.RoundRobinResultIterator.getIterators(RoundRobinResultIterator.java:176)
> 	at org.apache.phoenix.iterate.RoundRobinResultIterator.next(RoundRobinResultIterator.java:91)
> 	at org.apache.phoenix.compile.UpsertCompiler$2.execute(UpsertCompiler.java:815)
> 	at org.apache.phoenix.compile.DelegateMutationPlan.execute(DelegateMutationPlan.java:31)
> 	at org.apache.phoenix.compile.PostIndexDDLCompiler$1.execute(PostIndexDDLCompiler.java:124)
> 	at org.apache.phoenix.query.ConnectionQueryServicesImpl.updateData(ConnectionQueryServicesImpl.java:2823)
> 	at org.apache.phoenix.schema.MetaDataClient.buildIndex(MetaDataClient.java:1079)
> 	at org.apache.phoenix.schema.MetaDataClient.createIndex(MetaDataClient.java:1382)
> 	at org.apache.phoenix.compile.CreateIndexCompiler$1.execute(CreateIndexCompiler.java:85)
> 	at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:343)
> 	at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:331)
> 	at org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
> 	at org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:330)
> 	at org.apache.phoenix.jdbc.PhoenixStatement.execute(PhoenixStatement.java:1440)
> 	at org.apache.phoenix.hbase.index.write.TestIndexWriter$1.run(TestIndexWriter.java:93)
> Caused by: java.sql.SQLException: ERROR 201 (22000): Illegal data.
> 	at org.apache.phoenix.exception.SQLExceptionCode$Factory$1.newException(SQLExceptionCode.java:441)
> 	at org.apache.phoenix.exception.SQLExceptionInfo.buildException(SQLExceptionInfo.java:145)
> 	at org.apache.phoenix.schema.types.PDataType.newIllegalDataException(PDataType.java:287)
> 	at org.apache.phoenix.schema.types.PUnsignedSmallint$UnsignedShortCodec.decodeShort(PUnsignedSmallint.java:146)
> 	at org.apache.phoenix.schema.types.PSmallint.toObject(PSmallint.java:104)
> 	at org.apache.phoenix.schema.types.PSmallint.toObject(PSmallint.java:28)
> 	at org.apache.phoenix.schema.types.PDataType.toObject(PDataType.java:980)
> 	at org.apache.phoenix.schema.types.PUnsignedSmallint.toObject(PUnsignedSmallint.java:102)
> 	at org.apache.phoenix.schema.types.PDataType.toObject(PDataType.java:980)
> 	at org.apache.phoenix.schema.types.PDataType.toObject(PDataType.java:992)
> 	at org.apache.phoenix.schema.types.PDataType.coerceBytes(PDataType.java:830)
> 	at org.apache.phoenix.schema.types.PDecimal.coerceBytes(PDecimal.java:342)
> 	at org.apache.phoenix.schema.types.PDataType.coerceBytes(PDataType.java:810)
> 	at org.apache.phoenix.expression.CoerceExpression.evaluate(CoerceExpression.java:149)
> 	at org.apache.phoenix.compile.ExpressionProjector.getValue(ExpressionProjector.java:69)
> 	at org.apache.phoenix.jdbc.PhoenixResultSet.getBytes(PhoenixResultSet.java:308)
> 	at org.apache.phoenix.compile.UpsertCompiler.upsertSelect(UpsertCompiler.java:197)
> 	at org.apache.phoenix.compile.UpsertCompiler.access$000(UpsertCompiler.java:115)
> 	at org.apache.phoenix.compile.UpsertCompiler$UpsertingParallelIteratorFactory.mutate(UpsertCompiler.java:259)
> 	at org.apache.phoenix.compile.MutatingParallelIteratorFactory.newIterator(MutatingParallelIteratorFactory.java:59)
> 	at org.apache.phoenix.iterate.ParallelIterators$1.call(ParallelIterators.java:112)
> 	at org.apache.phoenix.iterate.ParallelIterators$1.call(ParallelIterators.java:103)
> 	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> 	at org.apache.phoenix.job.JobManager$InstrumentedJobFutureTask.run(JobManager.java:183)
> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> 	at java.lang.Thread.run(Thread.java:745)
> {noformat}
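The Caused by chain above bottoms out in PUnsignedSmallint's UnsignedShortCodec rejecting the bytes it was handed: when a mid-build split leaves the scan positioned on the wrong cells, the upsert-select ends up asking an unsigned codec to decode bytes that are not a valid unsigned value. An illustrative decoder (a sketch of that class of check, not Phoenix's actual codec) shows why such bytes surface as "Illegal data":
{code:java}
import java.nio.ByteBuffer;

public class UnsignedSmallintDecodeDemo {
    // Illustrative: an unsigned SMALLINT occupies two big-endian bytes and must
    // decode to a non-negative value; anything else is treated as illegal data.
    static short decodeUnsignedShort(byte[] bytes, int offset) {
        if (bytes.length - offset < 2) {
            throw new IllegalArgumentException("illegal data: not enough bytes");
        }
        short v = ByteBuffer.wrap(bytes, offset, 2).getShort();
        if (v < 0) {
            throw new IllegalArgumentException("illegal data: value < 0");
        }
        return v;
    }

    public static void main(String[] args) {
        // Valid encoding: 0x0102 decodes to 258.
        System.out.println(decodeUnsignedShort(new byte[] {0x01, 0x02}, 0));
        try {
            // Bytes belonging to some other column (high bit set) decode to a
            // negative short, which the unsigned codec rejects -- the same class
            // of failure the stack trace reports as ERROR 201 (22000).
            decodeUnsignedShort(new byte[] {(byte) 0xFF, 0x00}, 0);
        } catch (IllegalArgumentException e) {
            System.out.println("rejected: " + e.getMessage());
        }
    }
}
{code}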



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
