phoenix-dev mailing list archives

From "James Taylor (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (PHOENIX-3623) Integrate Omid with Phoenix
Date Fri, 25 May 2018 14:43:00 GMT

    [ https://issues.apache.org/jira/browse/PHOENIX-3623?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16490822#comment-16490822 ] 

James Taylor commented on PHOENIX-3623:
---------------------------------------

Here's a complete test run. It took 7 hours! First the failures:
{code:java}
[INFO] --- maven-failsafe-plugin:2.20:integration-test (ParallelStatsEnabledTest) @ phoenix-core ---
[INFO] 
[INFO] -------------------------------------------------------
[INFO] T E S T S
[INFO] -------------------------------------------------------

[INFO] Results:
[INFO] 
[ERROR] Errors: 
[ERROR] SaltedViewIT.testSaltedUpdatableViewWithLocalIndex:43->BaseViewIT.testUpdatableViewWithIndex:81->BaseViewIT.testUpdatableViewIndex:175 » Commit
[ERROR] ViewIT.testNonSaltedUpdatableViewWithLocalIndex:140->BaseViewIT.testUpdatableViewWithIndex:81->BaseViewIT.testUpdatableViewIndex:175 » Commit
[ERROR] ViewIT.testViewUsesTableGlobalIndex:505->testViewUsesTableIndex:547 » PhoenixIO
[ERROR] ViewIT.testViewUsesTableLocalIndex:510->testViewUsesTableIndex:535 » SQL java....
[INFO] 
[ERROR] Tests run: 158, Failures: 0, Errors: 4, Skipped: 0


[INFO] --- maven-failsafe-plugin:2.20:integration-test (ParallelStatsDisabledTest) @ phoenix-core ---
[INFO] 
[INFO] -------------------------------------------------------
[INFO] T E S T S
[INFO] -------------------------------------------------------
[INFO] Results:
[INFO] 
[ERROR] Failures: 
[ERROR] GlobalImmutableTxIndexIT>BaseIndexIT.testCreateIndexAfterUpsertStartedTxnl:266->BaseIndexIT.testCreateIndexAfterUpsertStarted:342 expected:<4> but was:<3>
[ERROR] GlobalImmutableTxIndexIT>BaseIndexIT.testCreateIndexAfterUpsertStartedTxnl:266->BaseIndexIT.testCreateIndexAfterUpsertStarted:342 expected:<4> but was:<3>
[ERROR] GlobalImmutableTxIndexIT>BaseIndexIT.testCreateIndexAfterUpsertStarted:258->BaseIndexIT.testCreateIndexAfterUpsertStarted:342 expected:<4> but was:<3>
[ERROR] GlobalImmutableTxIndexIT>BaseIndexIT.testCreateIndexAfterUpsertStarted:258->BaseIndexIT.testCreateIndexAfterUpsertStarted:342 expected:<4> but was:<3>
[ERROR] LocalImmutableTxIndexIT>BaseIndexIT.testCreateIndexAfterUpsertStartedTxnl:266->BaseIndexIT.testCreateIndexAfterUpsertStarted:342 expected:<4> but was:<0>
[ERROR] LocalImmutableTxIndexIT>BaseIndexIT.testCreateIndexAfterUpsertStartedTxnl:266->BaseIndexIT.testCreateIndexAfterUpsertStarted:342 expected:<4> but was:<0>
[ERROR] LocalImmutableTxIndexIT>BaseIndexIT.testCreateIndexAfterUpsertStarted:258->BaseIndexIT.testCreateIndexAfterUpsertStarted:342 expected:<4> but was:<0>
[ERROR] LocalImmutableTxIndexIT>BaseIndexIT.testCreateIndexAfterUpsertStarted:258->BaseIndexIT.testCreateIndexAfterUpsertStarted:342 expected:<4> but was:<0>
[ERROR] LocalImmutableTxIndexIT>BaseIndexIT.testDeleteFromAllPKColumnIndex:203 expected:<3> but was:<0>
[ERROR] LocalImmutableTxIndexIT>BaseIndexIT.testDeleteFromAllPKColumnIndex:203 expected:<3> but was:<0>
[ERROR] LocalImmutableTxIndexIT>BaseIndexIT.testDeleteFromNonPKColumnIndex:384 expected:<3> but was:<0>
[ERROR] LocalImmutableTxIndexIT>BaseIndexIT.testDeleteFromNonPKColumnIndex:384 expected:<3> but was:<0>
[ERROR] LocalImmutableTxIndexIT>BaseIndexIT.testGroupByCount:432
[ERROR] LocalImmutableTxIndexIT>BaseIndexIT.testGroupByCount:432
[ERROR] LocalImmutableTxIndexIT>BaseIndexIT.testInClauseWithIndexOnColumnOfUsignedIntType:477
[ERROR] LocalImmutableTxIndexIT>BaseIndexIT.testInClauseWithIndexOnColumnOfUsignedIntType:477
[ERROR] LocalImmutableTxIndexIT>BaseIndexIT.testIndexWithDecimalCol:1048
[ERROR] LocalImmutableTxIndexIT>BaseIndexIT.testIndexWithDecimalCol:1048
[ERROR] LocalImmutableTxIndexIT>BaseIndexIT.testIndexWithNullableDateCol:558
[ERROR] LocalImmutableTxIndexIT>BaseIndexIT.testIndexWithNullableDateCol:558
[ERROR] LocalImmutableTxIndexIT>BaseIndexIT.testIndexWithNullableFixedWithCols:153
[ERROR] LocalImmutableTxIndexIT>BaseIndexIT.testIndexWithNullableFixedWithCols:153
[ERROR] LocalImmutableTxIndexIT>BaseIndexIT.testSelectDistinctOnTableWithSecondaryImmutableIndex:452
[ERROR] LocalImmutableTxIndexIT>BaseIndexIT.testSelectDistinctOnTableWithSecondaryImmutableIndex:452
[ERROR] LocalMutableTxIndexIT>BaseIndexIT.testDeleteFromAllPKColumnIndex:203 expected:<3> but was:<0>
[ERROR] LocalMutableTxIndexIT>BaseIndexIT.testDeleteFromAllPKColumnIndex:203 expected:<3> but was:<0>
[ERROR] LocalMutableTxIndexIT>BaseIndexIT.testDeleteFromNonPKColumnIndex:384 expected:<3> but was:<0>
[ERROR] LocalMutableTxIndexIT>BaseIndexIT.testDeleteFromNonPKColumnIndex:384 expected:<3> but was:<0>
[ERROR] LocalMutableTxIndexIT>BaseIndexIT.testGroupByCount:432
[ERROR] LocalMutableTxIndexIT>BaseIndexIT.testGroupByCount:432
[ERROR] LocalMutableTxIndexIT>BaseIndexIT.testInClauseWithIndexOnColumnOfUsignedIntType:477
[ERROR] LocalMutableTxIndexIT>BaseIndexIT.testInClauseWithIndexOnColumnOfUsignedIntType:477
[ERROR] LocalMutableTxIndexIT>BaseIndexIT.testIndexWithDecimalCol:1048
[ERROR] LocalMutableTxIndexIT>BaseIndexIT.testIndexWithDecimalCol:1048
[ERROR] LocalMutableTxIndexIT>BaseIndexIT.testIndexWithNullableDateCol:558
[ERROR] LocalMutableTxIndexIT>BaseIndexIT.testIndexWithNullableDateCol:558
[ERROR] LocalMutableTxIndexIT>BaseIndexIT.testIndexWithNullableFixedWithCols:153
[ERROR] LocalMutableTxIndexIT>BaseIndexIT.testIndexWithNullableFixedWithCols:153
[ERROR] LocalMutableTxIndexIT>BaseIndexIT.testSelectDistinctOnTableWithSecondaryImmutableIndex:452
[ERROR] LocalMutableTxIndexIT>BaseIndexIT.testSelectDistinctOnTableWithSecondaryImmutableIndex:452
[ERROR] MutableIndexIT.testCoveredColumnUpdates:144
[ERROR] MutableIndexIT.testCoveredColumnUpdates:144
[ERROR] FlappingTransactionIT.testInflightDeleteNotSeen:193 expected:<2> but was:<1>
[ERROR] FlappingTransactionIT.testInflightUpdateNotSeen:140
[ERROR] ParameterizedTransactionIT.testCreateTableToBeTransactional:379
[ERROR] ParameterizedTransactionIT.testCreateTableToBeTransactional:379
[ERROR] ParameterizedTransactionIT.testCreateTableToBeTransactional:379
[ERROR] ParameterizedTransactionIT.testCreateTableToBeTransactional:379
[ERROR] ParameterizedTransactionIT.testNonTxToTxTableFailure:349
[ERROR] ParameterizedTransactionIT.testNonTxToTxTableFailure:349
[ERROR] ParameterizedTransactionIT.testNonTxToTxTableFailure:349
[ERROR] ParameterizedTransactionIT.testNonTxToTxTableFailure:349
[ERROR] Errors: 
[ERROR] AlterTableWithViewsIT.testMakeBaseTableTransactional:784 » SQL ERROR 1093 (44A...
[ERROR] AlterTableWithViewsIT.testMakeBaseTableTransactional:784 » SQL ERROR 1093 (44A...
[ERROR] AlterTableWithViewsIT.testMakeBaseTableTransactional:784 » SQL ERROR 1093 (44A...
[ERROR] AlterTableWithViewsIT.testMakeBaseTableTransactional:784 » SQL ERROR 1093 (44A...
[ERROR] GlobalImmutableTxIndexIT>BaseIndexIT.testIndexWithDecimalCol:1033 » UncheckedExecution
[ERROR] LocalImmutableTxIndexIT>BaseIndexIT.testIndexWithCaseSensitiveCols:915 » Commit
[ERROR] LocalImmutableTxIndexIT>BaseIndexIT.testIndexWithCaseSensitiveCols:915 » Commit
[ERROR] LocalImmutableTxIndexIT>BaseIndexIT.testMultipleUpdatesAcrossRegions:834 » Commit
[ERROR] LocalImmutableTxIndexIT>BaseIndexIT.testMultipleUpdatesAcrossRegions:834 » Commit
[ERROR] LocalImmutableTxIndexIT>BaseIndexIT.testReturnedTimestamp:1186 » Commit org.ap...
[ERROR] LocalImmutableTxIndexIT>BaseIndexIT.testReturnedTimestamp:1186 » Commit org.ap...
[ERROR] LocalImmutableTxIndexIT>BaseIndexIT.testSelectAllAndAliasWithIndex:620 » Commit
[ERROR] LocalImmutableTxIndexIT>BaseIndexIT.testSelectAllAndAliasWithIndex:620 » Commit
[ERROR] LocalImmutableTxIndexIT>BaseIndexIT.testSelectCF:700 » Commit org.apache.hadoo...
[ERROR] LocalImmutableTxIndexIT>BaseIndexIT.testSelectCF:700 » Commit org.apache.hadoo...
[ERROR] LocalImmutableTxIndexIT>BaseIndexIT.testUpsertAfterIndexDrop:760 » Commit org....
[ERROR] LocalImmutableTxIndexIT>BaseIndexIT.testUpsertAfterIndexDrop:760 » Commit org....
[ERROR] LocalMutableTxIndexIT>BaseIndexIT.testCreateIndexAfterUpsertStartedTxnl:266->BaseIndexIT.testCreateIndexAfterUpsertStarted:337 » Commit
[ERROR] LocalMutableTxIndexIT>BaseIndexIT.testCreateIndexAfterUpsertStartedTxnl:266->BaseIndexIT.testCreateIndexAfterUpsertStarted:337 » Commit
[ERROR] LocalMutableTxIndexIT>BaseIndexIT.testCreateIndexAfterUpsertStarted:258->BaseIndexIT.testCreateIndexAfterUpsertStarted:337 » Commit
[ERROR] LocalMutableTxIndexIT>BaseIndexIT.testCreateIndexAfterUpsertStarted:258->BaseIndexIT.testCreateIndexAfterUpsertStarted:337 » Commit
[ERROR] LocalMutableTxIndexIT>BaseIndexIT.testIndexWithCaseSensitiveCols:915 » Commit ...
[ERROR] LocalMutableTxIndexIT>BaseIndexIT.testIndexWithCaseSensitiveCols:915 » Commit ...
[ERROR] LocalMutableTxIndexIT>BaseIndexIT.testMultipleUpdatesAcrossRegions:834 » Commit
[ERROR] LocalMutableTxIndexIT>BaseIndexIT.testMultipleUpdatesAcrossRegions:834 » Commit
[ERROR] LocalMutableTxIndexIT>BaseIndexIT.testReturnedTimestamp:1186 » Commit org.apac...
[ERROR] LocalMutableTxIndexIT>BaseIndexIT.testReturnedTimestamp:1186 » Commit org.apac...
[ERROR] LocalMutableTxIndexIT>BaseIndexIT.testSelectAllAndAliasWithIndex:620 » Commit ...
[ERROR] LocalMutableTxIndexIT>BaseIndexIT.testSelectAllAndAliasWithIndex:620 » Commit ...
[ERROR] LocalMutableTxIndexIT>BaseIndexIT.testSelectCF:700 » Commit org.apache.hadoop....
[ERROR] LocalMutableTxIndexIT>BaseIndexIT.testSelectCF:700 » Commit org.apache.hadoop....
[ERROR] LocalMutableTxIndexIT>BaseIndexIT.testUpsertAfterIndexDrop:760 » Commit org.ap...
[ERROR] LocalMutableTxIndexIT>BaseIndexIT.testUpsertAfterIndexDrop:760 » Commit org.ap...
[ERROR] MutableIndexIT.testCompoundIndexKey:354 » Commit org.apache.hadoop.hbase.clien...
[ERROR] MutableIndexIT.testCompoundIndexKey:354 » Commit org.apache.hadoop.hbase.clien...
[ERROR] MutableIndexIT.testCoveredColumns:245 » Commit org.apache.hadoop.hbase.client....
[ERROR] MutableIndexIT.testCoveredColumns:245 » Commit org.apache.hadoop.hbase.client....
[ERROR] MutableIndexIT.testIndexHalfStoreFileReader:664 » SQL java.util.concurrent.Exe...
[ERROR] MutableIndexIT.testIndexHalfStoreFileReader:664 » SQL java.util.concurrent.Exe...
[ERROR] MutableIndexIT.testMultipleUpdatesToSingleRow:476 » Commit org.apache.hadoop.h...
[ERROR] MutableIndexIT.testMultipleUpdatesToSingleRow:476 » Commit org.apache.hadoop.h...
[ERROR] MutableIndexIT.testUpsertingNullForIndexedColumns:544 » Commit org.apache.hado...
[ERROR] MutableIndexIT.testUpsertingNullForIndexedColumns:544 » Commit org.apache.hado...
[ERROR] MutableRollbackIT.testCheckpointAndRollback:475 » Commit java.lang.IllegalArgu...
[ERROR] MutableRollbackIT.testCheckpointAndRollback:457 » Commit org.apache.hadoop.hba...
[ERROR] MutableRollbackIT.testMultiRollbackOfUncommittedExistingRowKeyIndexUpdate:354 » Commit
[ERROR] MutableRollbackIT.testRollbackOfUncommittedExistingRowKeyIndexUpdate:219 » Commit
[ERROR] RollbackIT.testRollbackOfUncommittedKeyValueIndexInsert:85 » Commit org.apache...
[ERROR] RollbackIT.testRollbackOfUncommittedKeyValueIndexInsert:85 » Commit org.apache...
[ERROR] RollbackIT.testRollbackOfUncommittedRowKeyIndexInsert:128 » Commit org.apache....
[ERROR] RollbackIT.testRollbackOfUncommittedRowKeyIndexInsert:128 » Commit org.apache....
[ERROR] FlappingTransactionIT.testExternalTxContext:241 » SQL ERROR 1092 (44A23): Cann...
[ERROR] ParameterizedTransactionIT.testNonTxToTxTable:282 » SQL ERROR 1093 (44A24): Ca...
[ERROR] ParameterizedTransactionIT.testNonTxToTxTable:282 » SQL ERROR 1093 (44A24): Ca...
[ERROR] ParameterizedTransactionIT.testNonTxToTxTable:282 » SQL ERROR 1093 (44A24): Ca...
[ERROR] ParameterizedTransactionIT.testNonTxToTxTable:282 » SQL ERROR 1093 (44A24): Ca...
[ERROR] TxCheckpointIT.testCheckpointForDeleteAndUpsert:335 » SQL ERROR 1092 (44A23): ...
[ERROR] TxCheckpointIT.testCheckpointForDeleteAndUpsert:335 » SQL ERROR 1092 (44A23): ...
[ERROR] TxCheckpointIT.testCheckpointForDeleteAndUpsert:335 » SQL ERROR 1092 (44A23): ...
[ERROR] TxCheckpointIT.testCheckpointForDeleteAndUpsert:335 » SQL ERROR 1092 (44A23): ...
[ERROR] TxCheckpointIT.testCheckpointForDeleteAndUpsert:330 » Commit org.apache.hadoop...
[ERROR] TxCheckpointIT.testCheckpointForDeleteAndUpsert:330 » Commit org.apache.hadoop...
[ERROR] TxCheckpointIT.testCheckpointForDeleteAndUpsert:330 » Commit org.apache.hadoop...
[ERROR] TxCheckpointIT.testCheckpointForDeleteAndUpsert:330 » Commit org.apache.hadoop...
[ERROR] TxCheckpointIT.testCheckpointForUpsertSelect:240->upsertRows:271 » SQL ERROR 1...
[ERROR] TxCheckpointIT.testCheckpointForUpsertSelect:240->upsertRows:271 » SQL ERROR 1...
[ERROR] TxCheckpointIT.testCheckpointForUpsertSelect:240->upsertRows:271 » SQL ERROR 1...
[ERROR] TxCheckpointIT.testCheckpointForUpsertSelect:240->upsertRows:271 » SQL ERROR 1...
[ERROR] TxCheckpointIT.testCheckpointForUpsertSelect:238 » Commit org.apache.hadoop.hb...
[ERROR] TxCheckpointIT.testCheckpointForUpsertSelect:238 » Commit org.apache.hadoop.hb...
[ERROR] TxCheckpointIT.testCheckpointForUpsertSelect:238 » Commit org.apache.hadoop.hb...
[ERROR] TxCheckpointIT.testCheckpointForUpsertSelect:238 » Commit org.apache.hadoop.hb...
[ERROR] TxCheckpointIT.testRollbackOfUncommittedDeleteMultiCol:132->testRollbackOfUncommittedDelete:148 » Commit
[ERROR] TxCheckpointIT.testRollbackOfUncommittedDeleteMultiCol:132->testRollbackOfUncommittedDelete:148 » Commit
[ERROR] TxCheckpointIT.testRollbackOfUncommittedDeleteMultiCol:132->testRollbackOfUncommittedDelete:148 » Commit
[ERROR] TxCheckpointIT.testRollbackOfUncommittedDeleteMultiCol:132->testRollbackOfUncommittedDelete:148 » Commit
[ERROR] TxCheckpointIT.testRollbackOfUncommittedDeleteSingleCol:123->testRollbackOfUncommittedDelete:148 » Commit
[ERROR] TxCheckpointIT.testRollbackOfUncommittedDeleteSingleCol:123->testRollbackOfUncommittedDelete:148 » Commit
[ERROR] TxCheckpointIT.testRollbackOfUncommittedDeleteSingleCol:123->testRollbackOfUncommittedDelete:148 » Commit
[ERROR] TxCheckpointIT.testRollbackOfUncommittedDeleteSingleCol:123->testRollbackOfUncommittedDelete:148 » Commit
[ERROR] TxCheckpointIT.testUpsertSelectDoesntSeeUpsertedData:108 » Commit org.apache.h...
[ERROR] TxCheckpointIT.testUpsertSelectDoesntSeeUpsertedData:108 » Commit org.apache.h...
[ERROR] TxCheckpointIT.testUpsertSelectDoesntSeeUpsertedData:108 » Commit org.apache.h...
[ERROR] TxCheckpointIT.testUpsertSelectDoesntSeeUpsertedData:108 » Commit org.apache.h...
[INFO] 
[ERROR] Tests run: 3477, Failures: 52, Errors: 84, Skipped: 3


[INFO] --- maven-failsafe-plugin:2.20:integration-test (HBaseManagedTimeTests) @ phoenix-core ---
[INFO] 
[INFO] -------------------------------------------------------
[INFO] T E S T S
[INFO] -------------------------------------------------------
[INFO] 
[INFO] Results:
[ERROR] Failures: 
[ERROR] IndexToolIT.testSecondaryIndex:204
[ERROR] IndexToolIT.testSecondaryIndex:204
[ERROR] IndexToolIT.testSecondaryIndex:204
[ERROR] IndexToolIT.testSecondaryIndex:204
[ERROR] ImmutableIndexIT.testDeleteFromNonPK:233 expected:<3> but was:<0>
[ERROR] ImmutableIndexIT.testDeleteFromNonPK:233 expected:<3> but was:<0>
[ERROR] ImmutableIndexIT.testDeleteFromPartialPK:191 expected:<3> but was:<0>
[ERROR] ImmutableIndexIT.testDeleteFromPartialPK:191 expected:<3> but was:<0>
[ERROR] ImmutableIndexIT.testDropIfImmutableKeyValueColumn:145 expected:<3> but was:<0>
[ERROR] ImmutableIndexIT.testDropIfImmutableKeyValueColumn:145 expected:<3> but was:<0>
[ERROR] PartialCommitIT.testDeleteFailure:202->testPartialCommit:280 org.apache.phoenix.exception.PhoenixIOException: java.lang.NullPointerException
[ERROR] PartialCommitIT.testOrderOfMutationsIsPredicatable:216->testPartialCommit:280 org.apache.phoenix.exception.PhoenixIOException: java.lang.NullPointerException
[ERROR] PartialCommitIT.testStatementOrderMaintainedInConnection:228->testPartialCommit:280 org.apache.phoenix.exception.PhoenixIOException: java.lang.NullPointerException
[ERROR] PartialCommitIT.testUpsertFailure:176->testPartialCommit:280 org.apache.phoenix.exception.PhoenixIOException: java.lang.NullPointerException
[ERROR] Errors: 
[ERROR] IndexToolIT.testSecondaryIndex:196 » Commit org.apache.hadoop.hbase.client.Ret...
[ERROR] IndexToolIT.testSecondaryIndex:196 » Commit org.apache.hadoop.hbase.client.Ret...
[ERROR] IndexToolIT.testSecondaryIndex:196 » Commit org.apache.hadoop.hbase.client.Ret...
[ERROR] IndexToolIT.testSecondaryIndex:196 » Commit org.apache.hadoop.hbase.client.Ret...
[ERROR] IndexToolIT.testSecondaryIndex:185 » PhoenixIO org.apache.phoenix.exception.Ph...
[ERROR] IndexToolIT.testSecondaryIndex:185 » PhoenixIO org.apache.phoenix.exception.Ph...
[ERROR] IndexToolIT.testSecondaryIndex:185 » PhoenixIO org.apache.phoenix.exception.Ph...
[ERROR] IndexToolIT.testSecondaryIndex:185 » PhoenixIO org.apache.phoenix.exception.Ph...
[ERROR] IndexToolIT.testSecondaryIndex:185 » PhoenixIO org.apache.phoenix.exception.Ph...
[ERROR] IndexToolIT.testSecondaryIndex:185 » PhoenixIO org.apache.phoenix.exception.Ph...
[ERROR] IndexToolIT.testSecondaryIndex:185 » PhoenixIO org.apache.phoenix.exception.Ph...
[ERROR] IndexToolIT.testSecondaryIndex:185 » PhoenixIO org.apache.phoenix.exception.Ph...
[ERROR] MutableIndexFailureIT.testIndexWriteFailure:273->initializeTable:406 » Commit ...
[ERROR] MutableIndexFailureIT.testIndexWriteFailure:273->initializeTable:406 » Commit ...
[ERROR] MutableIndexFailureIT.testIndexWriteFailure:273->initializeTable:406 » Commit ...
[ERROR] MutableIndexFailureIT.testIndexWriteFailure:273->initializeTable:406 » Commit ...
[ERROR] TxWriteFailureIT.testDataTableWriteFailure:115->helpTestWriteFailure:149 » Commit
[ERROR] TxWriteFailureIT.testDataTableWriteFailure:115->helpTestWriteFailure:149 » Commit
[ERROR] PartialCommitIT.testNoFailure:170->testPartialCommit:274 » NullPointer
[ERROR] PartialCommitIT.testUpsertSelectFailure:192->testPartialCommit:274 » NullPointer
[INFO] 
[ERROR] Tests run: 640, Failures: 14, Errors: 20, Skipped: 44
{code}
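For anyone wanting to reproduce a run like the above: the failsafe executions shown are driven through Maven, so something along these lines should work (a sketch assuming a checkout of the Phoenix source tree and the standard Maven Failsafe setup; these exact commands are not taken from this run):
{code}
# Full build plus integration tests for the phoenix-core module
# (as noted above, expect a multi-hour run).
mvn verify -pl phoenix-core -am

# Re-run a single failing IT class in isolation via failsafe's
# "it.test" property, e.g. to reproduce the MutableIndexIT failures.
mvn verify -pl phoenix-core -am -Dit.test=MutableIndexIT
{code}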
Then the complete output:
{code:java}

[INFO] -------------------------------------------------------
[INFO] T E S T S
[INFO] -------------------------------------------------------
[INFO] Running org.apache.phoenix.memory.MemoryManagerTest
[INFO] Running org.apache.phoenix.trace.TraceSpanReceiverTest
[INFO] Running org.apache.phoenix.cache.JodaTimezoneCacheTest
[INFO] Running org.apache.phoenix.metrics.MetricTypeTest
[INFO] Running org.apache.phoenix.iterate.MergeSortResultIteratorTest
[INFO] Running org.apache.phoenix.cache.TenantCacheTest
[INFO] Running org.apache.phoenix.iterate.SpoolingResultIteratorTest
[INFO] Running org.apache.phoenix.iterate.ConcatResultIteratorTest
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.483 s - in org.apache.phoenix.metrics.MetricTypeTest
[INFO] Running org.apache.phoenix.iterate.RowKeyOrderedAggregateResultIteratorTest
[INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.615 s - in org.apache.phoenix.cache.TenantCacheTest
[INFO] Running org.apache.phoenix.iterate.AggregateResultScannerTest
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.773 s - in org.apache.phoenix.iterate.MergeSortResultIteratorTest
[INFO] Running org.apache.phoenix.iterate.OrderedResultIteratorTest
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.006 s - in org.apache.phoenix.iterate.OrderedResultIteratorTest
[INFO] Running org.apache.phoenix.util.TenantIdByteConversionTest
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.829 s - in org.apache.phoenix.iterate.ConcatResultIteratorTest
[INFO] Running org.apache.phoenix.util.DateUtilTest
[INFO] Tests run: 18, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.223 s - in org.apache.phoenix.util.TenantIdByteConversionTest
[INFO] Running org.apache.phoenix.util.LogUtilTest
[INFO] Tests run: 22, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.368 s - in org.apache.phoenix.util.DateUtilTest
[INFO] Running org.apache.phoenix.util.ColumnInfoTest
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.71 s - in org.apache.phoenix.trace.TraceSpanReceiverTest
[INFO] Running org.apache.phoenix.util.PropertiesUtilTest
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.724 s - in org.apache.phoenix.iterate.SpoolingResultIteratorTest
[INFO] Running org.apache.phoenix.util.SequenceUtilTest
[INFO] Tests run: 18, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.02 s - in org.apache.phoenix.util.SequenceUtilTest
[INFO] Running org.apache.phoenix.util.MetaDataUtilTest
[INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.895 s - in org.apache.phoenix.cache.JodaTimezoneCacheTest
[INFO] Running org.apache.phoenix.util.ScanUtilTest
[INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.328 s - in org.apache.phoenix.util.PropertiesUtilTest
[INFO] Tests run: 20, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.141 s - in org.apache.phoenix.util.ScanUtilTest
[INFO] Running org.apache.phoenix.util.ByteUtilTest
[INFO] Running org.apache.phoenix.util.PrefixByteEncoderDecoderTest
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.006 s - in org.apache.phoenix.util.PrefixByteEncoderDecoderTest
[INFO] Running org.apache.phoenix.util.PhoenixRuntimeTest
[INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.317 s - in org.apache.phoenix.util.MetaDataUtilTest
[INFO] Running org.apache.phoenix.util.json.JsonUpsertExecutorTest
[INFO] Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.16 s - in org.apache.phoenix.memory.MemoryManagerTest
[INFO] Running org.apache.phoenix.util.csv.StringToArrayConverterTest
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.212 s - in org.apache.phoenix.util.ByteUtilTest
[INFO] Running org.apache.phoenix.util.csv.CsvUpsertExecutorTest
[INFO] Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.202 s - in org.apache.phoenix.util.ColumnInfoTest
[INFO] Running org.apache.phoenix.util.IndexUtilTest
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.007 s - in org.apache.phoenix.util.IndexUtilTest
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.422 s - in org.apache.phoenix.util.LogUtilTest
[INFO] Running org.apache.phoenix.util.Base62EncoderTest
[INFO] Running org.apache.phoenix.util.PhoenixEncodeDecodeTest
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.031 s - in org.apache.phoenix.util.Base62EncoderTest
[INFO] Running org.apache.phoenix.util.QualifierEncodingSchemeTest
[INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.005 s - in org.apache.phoenix.util.QualifierEncodingSchemeTest
[INFO] Running org.apache.phoenix.util.QueryUtilTest
[INFO] Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.799 s - in org.apache.phoenix.util.QueryUtilTest
[INFO] Running org.apache.phoenix.util.JDBCUtilTest
[INFO] Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.009 s - in org.apache.phoenix.util.JDBCUtilTest
[INFO] Running org.apache.phoenix.util.StringUtilTest
[INFO] Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.006 s - in org.apache.phoenix.util.StringUtilTest
[INFO] Running org.apache.phoenix.util.LikeExpressionTest
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.005 s - in org.apache.phoenix.util.LikeExpressionTest
[INFO] Running org.apache.phoenix.util.PhoenixContextExecutorTest
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.001 s - in org.apache.phoenix.util.PhoenixContextExecutorTest
[INFO] Running org.apache.phoenix.parse.QueryParserTest
[INFO] Tests run: 64, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.194 s - in org.apache.phoenix.parse.QueryParserTest
[INFO] Running org.apache.phoenix.parse.CastParseNodeTest
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.002 s - in org.apache.phoenix.parse.CastParseNodeTest
[INFO] Running org.apache.phoenix.parse.BuiltInFunctionInfoTest
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.011 s - in org.apache.phoenix.parse.BuiltInFunctionInfoTest
[INFO] Running org.apache.phoenix.parse.CursorParserTest
[INFO] Tests run: 22, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.059 s - in org.apache.phoenix.parse.CursorParserTest
[INFO] Running org.apache.phoenix.jdbc.SecureUserConnectionsTest
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.034 s - in org.apache.phoenix.iterate.AggregateResultScannerTest
[INFO] Running org.apache.phoenix.jdbc.PhoenixPreparedStatementTest
[INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.179 s - in org.apache.phoenix.iterate.RowKeyOrderedAggregateResultIteratorTest
[INFO] Running org.apache.phoenix.jdbc.PhoenixResultSetMetadataTest
[INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.941 s - in org.apache.phoenix.util.csv.StringToArrayConverterTest
[INFO] Running org.apache.phoenix.jdbc.ReadOnlyPropertiesTest
[INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.006 s - in org.apache.phoenix.jdbc.ReadOnlyPropertiesTest
[INFO] Running org.apache.phoenix.jdbc.PhoenixDriverTest
[INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.873 s - in org.apache.phoenix.util.PhoenixEncodeDecodeTest
[INFO] Running org.apache.phoenix.jdbc.PhoenixEmbeddedDriverTest
[INFO] Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.418 s - in org.apache.phoenix.util.json.JsonUpsertExecutorTest
[INFO] Running org.apache.phoenix.schema.RowKeySchemaTest
[INFO] Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.889 s - in org.apache.phoenix.jdbc.PhoenixPreparedStatementTest
[INFO] Running org.apache.phoenix.schema.RowKeyValueAccessorTest
[INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.26 s - in org.apache.phoenix.jdbc.PhoenixEmbeddedDriverTest
[INFO] Running org.apache.phoenix.schema.ValueBitSetTest
[INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.006 s - in org.apache.phoenix.schema.ValueBitSetTest
[INFO] Running org.apache.phoenix.schema.types.PDataTypeTest
[INFO] Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.76 s - in org.apache.phoenix.util.csv.CsvUpsertExecutorTest
[INFO] Running org.apache.phoenix.schema.types.PrimitiveShortPhoenixArrayToStringTest
[INFO] Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.008 s - in org.apache.phoenix.schema.types.PrimitiveShortPhoenixArrayToStringTest
[INFO] Running org.apache.phoenix.schema.types.PVarcharArrayToStringTest
[INFO] Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.008 s - in org.apache.phoenix.schema.types.PVarcharArrayToStringTest
[INFO] Running org.apache.phoenix.schema.types.PDateArrayToStringTest
[INFO] Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.03 s - in org.apache.phoenix.schema.types.PDateArrayToStringTest
[INFO] Running org.apache.phoenix.schema.types.PrimitiveIntPhoenixArrayToStringTest
[INFO] Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.003 s - in org.apache.phoenix.schema.types.PrimitiveIntPhoenixArrayToStringTest
[INFO] Running org.apache.phoenix.schema.types.PrimitiveFloatPhoenixArrayToStringTest
[INFO] Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.005 s - in org.apache.phoenix.schema.types.PrimitiveFloatPhoenixArrayToStringTest
[INFO] Running org.apache.phoenix.schema.types.PrimitiveBytePhoenixArrayToStringTest
[INFO] Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.004 s - in org.apache.phoenix.schema.types.PrimitiveBytePhoenixArrayToStringTest
[INFO] Running org.apache.phoenix.schema.types.PrimitiveDoublePhoenixArrayToStringTest
[INFO] Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.003 s - in org.apache.phoenix.schema.types.PrimitiveDoublePhoenixArrayToStringTest
[INFO] Running org.apache.phoenix.schema.types.PrimitiveBooleanPhoenixArrayToStringTest
[INFO] Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.003 s - in org.apache.phoenix.schema.types.PrimitiveBooleanPhoenixArrayToStringTest
[INFO] Running org.apache.phoenix.schema.types.PrimitiveLongPhoenixArrayToStringTest
[INFO] Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.004 s - in org.apache.phoenix.schema.types.PrimitiveLongPhoenixArrayToStringTest
[INFO] Running org.apache.phoenix.schema.types.PDataTypeForArraysTest
[INFO] Tests run: 36, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.604 s - in org.apache.phoenix.schema.types.PDataTypeTest
[INFO] Running org.apache.phoenix.schema.MutationTest
[INFO] Tests run: 69, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.176 s - in org.apache.phoenix.schema.types.PDataTypeForArraysTest
[INFO] Running org.apache.phoenix.schema.PMetaDataImplTest
[INFO] Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.005 s - in org.apache.phoenix.schema.PMetaDataImplTest
[INFO] Running org.apache.phoenix.schema.SchemaUtilTest
[INFO] Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.002 s - in org.apache.phoenix.schema.SchemaUtilTest
[INFO] Running org.apache.phoenix.schema.PCharPadTest
[INFO] Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.002 s - in org.apache.phoenix.schema.PCharPadTest
[INFO] Running org.apache.phoenix.schema.ImmutableStorageSchemeTest
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.851 s - in org.apache.phoenix.schema.RowKeyValueAccessorTest
[INFO] Running org.apache.phoenix.schema.SaltingUtilTest
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.002 s - in org.apache.phoenix.schema.SaltingUtilTest
[INFO] Running org.apache.phoenix.schema.SequenceAllocationTest
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.002 s - in org.apache.phoenix.schema.SequenceAllocationTest
[INFO] Running org.apache.phoenix.schema.SystemSplitPolicyTest
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.006 s - in org.apache.phoenix.schema.SystemSplitPolicyTest
[INFO] Running org.apache.phoenix.schema.stats.StatisticsScannerTest
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.826 s - in org.apache.phoenix.jdbc.PhoenixResultSetMetadataTest
[INFO] Running org.apache.phoenix.schema.SortOrderTest
[INFO] Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.014 s - in org.apache.phoenix.schema.SortOrderTest
[INFO] Running org.apache.phoenix.compile.ViewCompilerTest
[INFO] Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.256 s - in org.apache.phoenix.schema.RowKeySchemaTest
[INFO] Running org.apache.phoenix.compile.StatementHintsCompilationTest
[WARNING] Tests run: 7, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 2.331 s - in org.apache.phoenix.jdbc.PhoenixDriverTest
[INFO] Running org.apache.phoenix.compile.CursorCompilerTest
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.464 s - in org.apache.phoenix.schema.MutationTest
[INFO] Running org.apache.phoenix.compile.JoinQueryCompilerTest
[INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.253 s - in org.apache.phoenix.schema.stats.StatisticsScannerTest
[INFO] Running org.apache.phoenix.compile.SaltedScanRangesTest
[INFO] Tests run: 22, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.466 s - in org.apache.phoenix.schema.ImmutableStorageSchemeTest
[INFO] Running org.apache.phoenix.compile.SelectStatementRewriterTest
[INFO] Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.08 s - in org.apache.phoenix.compile.SaltedScanRangesTest
[INFO] Running org.apache.phoenix.compile.QueryCompilerTest
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.803 s - in org.apache.phoenix.compile.ViewCompilerTest
[INFO] Running org.apache.phoenix.compile.ScanRangesIntersectTest
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.002 s - in org.apache.phoenix.compile.ScanRangesIntersectTest
[INFO] Running org.apache.phoenix.compile.WhereCompilerTest
[INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.687 s - in org.apache.phoenix.compile.StatementHintsCompilationTest
[INFO] Running org.apache.phoenix.compile.WhereOptimizerTest
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.058 s - in org.apache.phoenix.compile.CursorCompilerTest
[INFO] Running org.apache.phoenix.compile.CreateTableCompilerTest
[INFO] Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.769 s - in org.apache.phoenix.compile.SelectStatementRewriterTest
[INFO] Running org.apache.phoenix.compile.QueryOptimizerTest
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.379 s - in org.apache.phoenix.compile.JoinQueryCompilerTest
[INFO] Running org.apache.phoenix.compile.LimitCompilerTest
[INFO] Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.084 s - in org.apache.phoenix.util.PhoenixRuntimeTest
[INFO] Running org.apache.phoenix.compile.HavingCompilerTest
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.739 s - in org.apache.phoenix.compile.CreateTableCompilerTest
[INFO] Running org.apache.phoenix.compile.TenantSpecificViewIndexCompileTest
[WARNING] Tests run: 45, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 2.091 s - in org.apache.phoenix.compile.WhereCompilerTest
[INFO] Running org.apache.phoenix.compile.ScanRangesTest
[INFO] Tests run: 34, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.16 s - in org.apache.phoenix.compile.ScanRangesTest
[INFO] Running org.apache.phoenix.compile.QueryMetaDataTest
[INFO] Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.802 s - in org.apache.phoenix.compile.HavingCompilerTest
[INFO] Running org.apache.phoenix.hbase.index.write.recovery.TestPerRegionIndexWriteCache
[INFO] Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.981 s - in org.apache.phoenix.compile.LimitCompilerTest
[INFO] Running org.apache.phoenix.hbase.index.write.TestParalleWriterIndexCommitter
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.454 s - in org.apache.phoenix.hbase.index.write.TestParalleWriterIndexCommitter
[INFO] Running org.apache.phoenix.hbase.index.write.TestParalleIndexWriter
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.009 s - in org.apache.phoenix.hbase.index.write.TestParalleIndexWriter
[INFO] Running org.apache.phoenix.hbase.index.write.TestWALRecoveryCaching
[WARNING] Tests run: 1, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 0.005 s - in org.apache.phoenix.hbase.index.write.TestWALRecoveryCaching
[INFO] Running org.apache.phoenix.hbase.index.write.TestIndexWriter
[INFO] Tests run: 32, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.911 s - in org.apache.phoenix.compile.QueryMetaDataTest
[INFO] Running org.apache.phoenix.hbase.index.parallel.TestThreadPoolBuilder
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.006 s - in org.apache.phoenix.hbase.index.parallel.TestThreadPoolBuilder
[INFO] Running org.apache.phoenix.hbase.index.parallel.TestThreadPoolManager
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.006 s - in org.apache.phoenix.hbase.index.parallel.TestThreadPoolManager
[INFO] Running org.apache.phoenix.hbase.index.util.TestIndexManagementUtil
[INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.084 s - in org.apache.phoenix.hbase.index.write.TestIndexWriter
[INFO] Running org.apache.phoenix.hbase.index.covered.CoveredColumnsTest
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.002 s - in org.apache.phoenix.hbase.index.covered.CoveredColumnsTest
[INFO] Running org.apache.phoenix.hbase.index.covered.NonTxIndexBuilderTest
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.087 s - in org.apache.phoenix.hbase.index.util.TestIndexManagementUtil
[INFO] Running org.apache.phoenix.hbase.index.covered.update.TestIndexUpdateManager
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.408 s - in org.apache.phoenix.hbase.index.covered.update.TestIndexUpdateManager
[INFO] Running org.apache.phoenix.hbase.index.covered.LocalTableStateTest
[INFO] Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.046 s - in org.apache.phoenix.compile.TenantSpecificViewIndexCompileTest
[INFO] Running org.apache.phoenix.hbase.index.covered.TestColumnTracker
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.003 s - in org.apache.phoenix.hbase.index.covered.TestColumnTracker
[INFO] Running org.apache.phoenix.hbase.index.covered.TestCoveredColumnIndexCodec
[INFO] Tests run: 110, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.886 s - in org.apache.phoenix.compile.WhereOptimizerTest
[INFO] Running org.apache.phoenix.hbase.index.covered.filter.TestApplyAndFilterDeletesFilter
[INFO] Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.005 s - in org.apache.phoenix.hbase.index.covered.filter.TestApplyAndFilterDeletesFilter
[INFO] Running org.apache.phoenix.hbase.index.covered.filter.TestNewerTimestampFilter
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.004 s - in org.apache.phoenix.hbase.index.covered.filter.TestNewerTimestampFilter
[INFO] Running org.apache.phoenix.hbase.index.covered.TestCoveredIndexSpecifierBuilder
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.004 s - in org.apache.phoenix.hbase.index.covered.TestCoveredIndexSpecifierBuilder
[INFO] Running org.apache.phoenix.hbase.index.covered.data.TestIndexMemStore
[INFO] Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.364 s - in org.apache.phoenix.hbase.index.covered.LocalTableStateTest
[INFO] Running org.apache.phoenix.hbase.index.covered.data.TestLocalTable
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.002 s - in org.apache.phoenix.hbase.index.covered.data.TestLocalTable
[INFO] Running org.apache.phoenix.index.automated.MRJobSubmitterTest
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.009 s - in org.apache.phoenix.hbase.index.covered.data.TestIndexMemStore
[INFO] Running org.apache.phoenix.index.IndexMaintainerTest
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.132 s - in org.apache.phoenix.hbase.index.covered.TestCoveredColumnIndexCodec
[INFO] Running org.apache.phoenix.execute.CorrelatePlanTest
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.578 s - in org.apache.phoenix.hbase.index.write.recovery.TestPerRegionIndexWriteCache
[INFO] Running org.apache.phoenix.execute.DescVarLengthFastByteComparisonsTest
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.003 s - in org.apache.phoenix.execute.DescVarLengthFastByteComparisonsTest
[INFO] Running org.apache.phoenix.execute.UnnestArrayPlanTest
[INFO] Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.3 s - in org.apache.phoenix.index.automated.MRJobSubmitterTest
[INFO] Running org.apache.phoenix.execute.MutationStateTest
[INFO] Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.558 s - in org.apache.phoenix.execute.UnnestArrayPlanTest
[INFO] Running org.apache.phoenix.execute.LiteralResultIteratorPlanTest
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.004 s - in org.apache.phoenix.execute.LiteralResultIteratorPlanTest
[INFO] Running org.apache.phoenix.expression.ArrayFillFunctionTest
[INFO] Tests run: 14, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.048 s - in org.apache.phoenix.expression.ArrayFillFunctionTest
[INFO] Running org.apache.phoenix.expression.ColumnExpressionTest
[INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.002 s - in org.apache.phoenix.expression.ColumnExpressionTest
[INFO] Running org.apache.phoenix.expression.LnLogFunctionTest
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.438 s - in org.apache.phoenix.execute.MutationStateTest
[INFO] Running org.apache.phoenix.expression.RegexpSplitFunctionTest
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.043 s - in org.apache.phoenix.expression.LnLogFunctionTest
[INFO] Running org.apache.phoenix.expression.util.regex.PatternPerformanceTest
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.002 s - in org.apache.phoenix.expression.util.regex.PatternPerformanceTest
[INFO] Running org.apache.phoenix.expression.ArrayAppendFunctionTest
[INFO] Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.827 s - in org.apache.phoenix.execute.CorrelatePlanTest
[INFO] Running org.apache.phoenix.expression.RoundFloorCeilExpressionsTest
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.025 s - in org.apache.phoenix.hbase.index.covered.NonTxIndexBuilderTest
[INFO] Running org.apache.phoenix.expression.ArrayConstructorExpressionTest
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.003 s - in org.apache.phoenix.expression.ArrayConstructorExpressionTest
[INFO] Running org.apache.phoenix.expression.RegexpSubstrFunctionTest
[INFO] Tests run: 25, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.456 s - in org.apache.phoenix.expression.ArrayAppendFunctionTest
[INFO] Running org.apache.phoenix.expression.SignFunctionTest
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.044 s - in org.apache.phoenix.expression.SignFunctionTest
[INFO] Running org.apache.phoenix.expression.ArrayConcatFunctionTest
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.591 s - in org.apache.phoenix.expression.RegexpSplitFunctionTest
[INFO] Running org.apache.phoenix.expression.SortOrderExpressionTest
[INFO] Tests run: 24, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.064 s - in org.apache.phoenix.expression.SortOrderExpressionTest
[INFO] Running org.apache.phoenix.expression.function.ExternalSqlTypeIdFunctionTest
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.002 s - in org.apache.phoenix.expression.function.ExternalSqlTypeIdFunctionTest
[INFO] Running org.apache.phoenix.expression.function.LowerFunctionTest
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.017 s - in org.apache.phoenix.expression.function.LowerFunctionTest
[INFO] Running org.apache.phoenix.expression.function.UpperFunctionTest
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.002 s - in org.apache.phoenix.expression.function.UpperFunctionTest
[INFO] Running org.apache.phoenix.expression.function.CollationKeyFunctionTest
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.264 s - in org.apache.phoenix.expression.RegexpSubstrFunctionTest
[INFO] Running org.apache.phoenix.expression.function.InstrFunctionTest
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.005 s - in org.apache.phoenix.expression.function.InstrFunctionTest
[INFO] Running org.apache.phoenix.expression.function.BuiltinFunctionConstructorTest
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.087 s - in org.apache.phoenix.expression.function.BuiltinFunctionConstructorTest
[INFO] Running org.apache.phoenix.expression.ArrayPrependFunctionTest
[INFO] Tests run: 37, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.549 s - in org.apache.phoenix.expression.ArrayPrependFunctionTest
[INFO] Running org.apache.phoenix.expression.StringToArrayFunctionTest
[INFO] Tests run: 20, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.04 s - in org.apache.phoenix.expression.StringToArrayFunctionTest
[INFO] Running org.apache.phoenix.expression.ArithmeticOperationTest
[INFO] Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.048 s - in org.apache.phoenix.expression.ArithmeticOperationTest
[INFO] Running org.apache.phoenix.expression.OctetLengthFunctionTest
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.095 s - in org.apache.phoenix.expression.OctetLengthFunctionTest
[INFO] Running org.apache.phoenix.expression.GetSetByteBitFunctionTest
[INFO] Tests run: 35, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.48 s - in org.apache.phoenix.expression.ArrayConcatFunctionTest
[INFO] Running org.apache.phoenix.expression.ArrayToStringFunctionTest
[INFO] Tests run: 24, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.013 s - in org.apache.phoenix.expression.ArrayToStringFunctionTest
[INFO] Running org.apache.phoenix.expression.DeterminismTest
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.001 s - in org.apache.phoenix.expression.DeterminismTest
[INFO] Running org.apache.phoenix.expression.ExpFunctionTest
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.007 s - in org.apache.phoenix.expression.ExpFunctionTest
[INFO] Running org.apache.phoenix.expression.ArrayRemoveFunctionTest
[INFO] Tests run: 15, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.015 s - in org.apache.phoenix.expression.ArrayRemoveFunctionTest
[INFO] Running org.apache.phoenix.expression.CbrtFunctionTest
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.006 s - in org.apache.phoenix.expression.CbrtFunctionTest
[INFO] Running org.apache.phoenix.expression.PowerFunctionTest
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.074 s - in org.apache.phoenix.expression.PowerFunctionTest
[INFO] Running org.apache.phoenix.expression.CoerceExpressionTest
[INFO] Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.003 s - in org.apache.phoenix.expression.CoerceExpressionTest
[INFO] Running org.apache.phoenix.expression.ILikeExpressionTest
[INFO] Tests run: 23, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.224 s - in org.apache.phoenix.expression.RoundFloorCeilExpressionsTest
[INFO] Running org.apache.phoenix.expression.SqrtFunctionTest
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.007 s - in org.apache.phoenix.expression.SqrtFunctionTest
[INFO] Running org.apache.phoenix.expression.LikeExpressionTest
[INFO] Tests run: 49, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.077 s - in org.apache.phoenix.compile.QueryOptimizerTest
[INFO] Running org.apache.phoenix.expression.RegexpReplaceFunctionTest
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.378 s - in org.apache.phoenix.expression.ILikeExpressionTest
[INFO] Running org.apache.phoenix.expression.NullValueTest
[INFO] Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.362 s - in org.apache.phoenix.expression.LikeExpressionTest
[INFO] Running org.apache.phoenix.expression.AbsFunctionTest
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.009 s - in org.apache.phoenix.expression.AbsFunctionTest
[INFO] Running org.apache.phoenix.filter.DistinctPrefixFilterTest
[INFO] Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.019 s - in org.apache.phoenix.filter.DistinctPrefixFilterTest
[INFO] Running org.apache.phoenix.filter.SkipScanFilterTest
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.369 s - in org.apache.phoenix.expression.RegexpReplaceFunctionTest
[INFO] Running org.apache.phoenix.filter.SkipScanBigFilterTest
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.315 s - in org.apache.phoenix.expression.GetSetByteBitFunctionTest
[INFO] Running org.apache.phoenix.filter.SkipScanFilterIntersectTest
[INFO] Tests run: 19, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.023 s - in org.apache.phoenix.filter.SkipScanFilterIntersectTest
[INFO] Running org.apache.phoenix.query.KeyRangeCoalesceTest
[INFO] Tests run: 20, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.229 s - in org.apache.phoenix.filter.SkipScanFilterTest
[INFO] Running org.apache.phoenix.query.ScannerLeaseRenewalTest
[INFO] Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.134 s - in org.apache.phoenix.query.KeyRangeCoalesceTest
[INFO] Running org.apache.phoenix.query.EncodedColumnQualifierCellsListTest
[INFO] Tests run: 22, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.039 s - in org.apache.phoenix.query.EncodedColumnQualifierCellsListTest
[INFO] Running org.apache.phoenix.query.KeyRangeClipTest
[INFO] Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.576 s - in org.apache.phoenix.expression.function.CollationKeyFunctionTest
[INFO] Running org.apache.phoenix.query.PhoenixStatsCacheRemovalListenerTest
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.004 s - in org.apache.phoenix.query.PhoenixStatsCacheRemovalListenerTest
[INFO] Running org.apache.phoenix.query.ParallelIteratorsSplitTest
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.819 s - in org.apache.phoenix.expression.NullValueTest
[INFO] Running org.apache.phoenix.query.KeyRangeUnionTest
[INFO] Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.087 s - in org.apache.phoenix.query.KeyRangeUnionTest
[INFO] Running org.apache.phoenix.query.KeyRangeIntersectTest
[INFO] Tests run: 24, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.274 s - in org.apache.phoenix.index.IndexMaintainerTest
[INFO] Running org.apache.phoenix.query.OrderByTest
[INFO] Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.095 s - in org.apache.phoenix.query.KeyRangeIntersectTest
[INFO] Running org.apache.phoenix.query.QueryPlanTest
[INFO] Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.206 s - in org.apache.phoenix.jdbc.SecureUserConnectionsTest
[INFO] Running org.apache.phoenix.query.ConnectionlessTest
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.924 s - in org.apache.phoenix.filter.SkipScanBigFilterTest
[INFO] Running org.apache.phoenix.query.HBaseFactoryProviderTest
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.002 s - in org.apache.phoenix.query.HBaseFactoryProviderTest
[INFO] Running org.apache.phoenix.query.ConnectionQueryServicesImplTest
[INFO] Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.92 s - in org.apache.phoenix.query.KeyRangeClipTest
[INFO] Running org.apache.phoenix.query.KeyRangeMoreTest
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.369 s - in org.apache.phoenix.query.ConnectionQueryServicesImplTest
[INFO] Running org.apache.phoenix.query.PropertyPolicyProviderTest
[INFO] Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.027 s - in org.apache.phoenix.query.ParallelIteratorsSplitTest
[INFO] Running org.apache.phoenix.mapreduce.CsvToKeyValueMapperTest
[INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.778 s - in org.apache.phoenix.query.OrderByTest
[INFO] Running org.apache.phoenix.mapreduce.CsvBulkImportUtilTest
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.042 s - in org.apache.phoenix.mapreduce.CsvToKeyValueMapperTest
[INFO] Running org.apache.phoenix.mapreduce.util.PhoenixConfigurationUtilTest
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.661 s - in org.apache.phoenix.query.PropertyPolicyProviderTest
[INFO] Running org.apache.phoenix.mapreduce.util.IndexColumnNamesTest
[INFO] Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.821 s - in org.apache.phoenix.query.QueryPlanTest
[INFO] Running org.apache.phoenix.mapreduce.util.ColumnInfoToStringEncoderDecoderTest
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.599 s - in org.apache.phoenix.mapreduce.util.IndexColumnNamesTest
[INFO] Running org.apache.phoenix.mapreduce.FormatToBytesWritableMapperTest
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.095 s - in org.apache.phoenix.mapreduce.util.ColumnInfoToStringEncoderDecoderTest
[INFO] Running org.apache.phoenix.mapreduce.bulkload.TestTableRowkeyPair
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.002 s - in org.apache.phoenix.mapreduce.bulkload.TestTableRowkeyPair
[INFO] Running org.apache.phoenix.mapreduce.index.IndexScrutinyTableOutputTest
[INFO] Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.197 s - in org.apache.phoenix.mapreduce.CsvBulkImportUtilTest
[INFO] Running org.apache.phoenix.mapreduce.BulkLoadToolTest
[INFO] Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.052 s - in org.apache.phoenix.mapreduce.BulkLoadToolTest
[INFO] Running org.apache.hadoop.hbase.regionserver.wal.ReadWriteKeyValuesWithCodecTest
[INFO] Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.26 s - in org.apache.phoenix.mapreduce.util.PhoenixConfigurationUtilTest
[INFO] Running org.apache.hadoop.hbase.regionserver.wal.IndexedWALEditCodecTest
[INFO] Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.181 s - in org.apache.phoenix.mapreduce.FormatToBytesWritableMapperTest
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.009 s - in org.apache.hadoop.hbase.regionserver.wal.IndexedWALEditCodecTest
[INFO] Running org.apache.hadoop.hbase.ipc.PhoenixIndexRpcSchedulerTest
[INFO] Running org.apache.hadoop.hbase.regionserver.PhoenixRpcSchedulerFactoryTest
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.016 s - in org.apache.hadoop.hbase.regionserver.PhoenixRpcSchedulerFactoryTest
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.701 s - in org.apache.hadoop.hbase.ipc.PhoenixIndexRpcSchedulerTest
[INFO] Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.821 s - in org.apache.phoenix.mapreduce.index.IndexScrutinyTableOutputTest
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.873 s - in org.apache.hadoop.hbase.regionserver.wal.ReadWriteKeyValuesWithCodecTest
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.094 s - in org.apache.phoenix.query.ConnectionlessTest
[WARNING] Tests run: 186, Failures: 0, Errors: 0, Skipped: 2, Time elapsed: 12.209 s - in org.apache.phoenix.compile.QueryCompilerTest
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.246 s - in org.apache.phoenix.query.ScannerLeaseRenewalTest
[INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.479 s - in org.apache.phoenix.query.KeyRangeMoreTest
[INFO] 
[INFO] Results:
[INFO] 
[WARNING] Tests run: 1640, Failures: 0, Errors: 0, Skipped: 5
[INFO] 
[INFO] 
[INFO] --- maven-source-plugin:2.2.1:jar-no-fork (attach-sources) @ phoenix-core ---
[INFO] Building jar: /Users/jtaylor/dev/apache/phoenix-omid/phoenix-core/target/phoenix-core-4.14.0-HBase-1.3-sources.jar
[INFO] 
[INFO] --- maven-jar-plugin:2.4:test-jar (default) @ phoenix-core ---
[INFO] Building jar: /Users/jtaylor/dev/apache/phoenix-omid/phoenix-core/target/phoenix-core-4.14.0-HBase-1.3-tests.jar
[INFO] 
[INFO] --- maven-jar-plugin:2.4:jar (default-jar) @ phoenix-core ---
[INFO] Building jar: /Users/jtaylor/dev/apache/phoenix-omid/phoenix-core/target/phoenix-core-4.14.0-HBase-1.3.jar
[INFO] 
[INFO] --- maven-site-plugin:3.2:attach-descriptor (attach-descriptor) @ phoenix-core ---
[INFO] 
[INFO] --- maven-assembly-plugin:2.5.2:single (core) @ phoenix-core ---
[INFO] Reading assembly descriptor: src/build/phoenix-core.xml
[WARNING] Artifact: org.apache.phoenix:phoenix-core:jar:4.14.0-HBase-1.3 references the same file as the assembly destination file. Moving it to a temporary location for inclusion.
[INFO] Building jar: /Users/jtaylor/dev/apache/phoenix-omid/phoenix-core/target/phoenix-core-4.14.0-HBase-1.3.jar
[INFO] 
[INFO] --- maven-failsafe-plugin:2.20:integration-test (ParallelStatsEnabledTest) @ phoenix-core ---
[INFO] 
[INFO] -------------------------------------------------------
[INFO] T E S T S
[INFO] -------------------------------------------------------
[INFO] Running org.apache.phoenix.end2end.KeyOnlyIT
[INFO] Running org.apache.phoenix.coprocessor.StatisticsCollectionRunTrackerIT
[INFO] Running org.apache.phoenix.end2end.MultiCfQueryExecIT
[INFO] Running org.apache.phoenix.end2end.ExplainPlanWithStatsEnabledIT
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.654 s - in org.apache.phoenix.end2end.KeyOnlyIT
[INFO] Running org.apache.phoenix.end2end.ParallelIteratorsIT
[INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 20.165 s - in org.apache.phoenix.coprocessor.StatisticsCollectionRunTrackerIT
[INFO] Running org.apache.phoenix.end2end.QueryWithTableSampleIT
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.613 s - in org.apache.phoenix.end2end.ParallelIteratorsIT
[INFO] Running org.apache.phoenix.end2end.ReadIsolationLevelIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.614 s - in org.apache.phoenix.end2end.ReadIsolationLevelIT
[INFO] Running org.apache.phoenix.end2end.SaltedViewIT
[INFO] Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 32.359 s - in org.apache.phoenix.end2end.MultiCfQueryExecIT
[INFO] Running org.apache.phoenix.end2end.TenantSpecificTablesDDLIT
[INFO] Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 55.917 s - in org.apache.phoenix.end2end.QueryWithTableSampleIT
[INFO] Running org.apache.phoenix.end2end.TenantSpecificTablesDMLIT
[INFO] Tests run: 28, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 69.445 s - in org.apache.phoenix.end2end.ExplainPlanWithStatsEnabledIT
[INFO] Running org.apache.phoenix.end2end.TransactionalViewIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.512 s - in org.apache.phoenix.end2end.TransactionalViewIT
[INFO] Running org.apache.phoenix.end2end.ViewIT
[INFO] Tests run: 16, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 158.354 s - in org.apache.phoenix.end2end.TenantSpecificTablesDDLIT
[INFO] Running org.apache.phoenix.end2end.index.ImmutableIndexWithStatsIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.595 s - in org.apache.phoenix.end2end.index.ImmutableIndexWithStatsIT
[INFO] Tests run: 16, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 147.538 s - in org.apache.phoenix.end2end.TenantSpecificTablesDMLIT
[ERROR] Tests run: 4, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 559.818 s <<< FAILURE! - in org.apache.phoenix.end2end.SaltedViewIT
[ERROR] testSaltedUpdatableViewWithLocalIndex[transactional = true](org.apache.phoenix.end2end.SaltedViewIT) Time elapsed: 539.673 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,49541,1527229173308, 
at org.apache.phoenix.end2end.SaltedViewIT.testSaltedUpdatableViewWithLocalIndex(SaltedViewIT.java:43)
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,49541,1527229173308, 
at org.apache.phoenix.end2end.SaltedViewIT.testSaltedUpdatableViewWithLocalIndex(SaltedViewIT.java:43)

[ERROR] Tests run: 56, Failures: 0, Errors: 3, Skipped: 0, Time elapsed: 828.065 s <<< FAILURE! - in org.apache.phoenix.end2end.ViewIT
[ERROR] testViewUsesTableLocalIndex[transactional = true](org.apache.phoenix.end2end.ViewIT) Time elapsed: 11.197 s <<< ERROR!
java.sql.SQLException: 
java.util.concurrent.ExecutionException: java.lang.Exception: org.apache.hadoop.hbase.DoNotRetryIOException: org.apache.hadoop.hbase.DoNotRetryIOException: org.apache.thrift.transport.TTransportException: Cannot read. Remote side has closed. Tried to read 2 bytes, but only got 1 bytes. (This is often indicative of an internal error on the server side. Please check your server logs.)
at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:109)
at org.apache.phoenix.coprocessor.ServerCachingEndpointImpl.addServerCache(ServerCachingEndpointImpl.java:80)
at org.apache.phoenix.coprocessor.generated.ServerCachingProtos$ServerCachingService.callMethod(ServerCachingProtos.java:8414)
at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:8086)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.execServiceOnRegion(RSRpcServices.java:2068)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.execService(RSRpcServices.java:2050)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:34954)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2339)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:123)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:188)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:168)
Caused by: org.apache.thrift.transport.TTransportException: Cannot read. Remote side has closed. Tried to read 2 bytes, but only got 1 bytes. (This is often indicative of an internal error on the server side. Please check your server logs.)
at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86)
at org.apache.thrift.protocol.TBinaryProtocol.readAll(TBinaryProtocol.java:378)
at org.apache.thrift.protocol.TBinaryProtocol.readI16(TBinaryProtocol.java:278)
at org.apache.thrift.protocol.TBinaryProtocol.readFieldBegin(TBinaryProtocol.java:229)
at org.apache.tephra.distributed.thrift.TTransaction$TTransactionStandardScheme.read(TTransaction.java:1028)
at org.apache.tephra.distributed.thrift.TTransaction$TTransactionStandardScheme.read(TTransaction.java:1021)
at org.apache.tephra.distributed.thrift.TTransaction.read(TTransaction.java:921)
at org.apache.thrift.TDeserializer.deserialize(TDeserializer.java:69)
at org.apache.tephra.TransactionCodec.decode(TransactionCodec.java:51)
at org.apache.phoenix.transaction.TephraTransactionContext.<init>(TephraTransactionContext.java:71)
at org.apache.phoenix.transaction.TephraTransactionProvider.getTransactionContext(TephraTransactionProvider.java:64)
at org.apache.phoenix.transaction.TransactionFactory.getTransactionContext(TransactionFactory.java:70)
at org.apache.phoenix.index.IndexMetaDataCacheFactory.newCache(IndexMetaDataCacheFactory.java:55)
at org.apache.phoenix.cache.TenantCacheImpl.addServerCache(TenantCacheImpl.java:113)
at org.apache.phoenix.coprocessor.ServerCachingEndpointImpl.addServerCache(ServerCachingEndpointImpl.java:76)
... 9 more

at org.apache.phoenix.end2end.ViewIT.testViewUsesTableIndex(ViewIT.java:535)
at org.apache.phoenix.end2end.ViewIT.testViewUsesTableLocalIndex(ViewIT.java:510)
Caused by: java.util.concurrent.ExecutionException: 
java.lang.Exception: org.apache.hadoop.hbase.DoNotRetryIOException: org.apache.hadoop.hbase.DoNotRetryIOException: org.apache.thrift.transport.TTransportException: Cannot read. Remote side has closed. Tried to read 2 bytes, but only got 1 bytes. (This is often indicative of an internal error on the server side. Please check your server logs.)
at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:109)
at org.apache.phoenix.coprocessor.ServerCachingEndpointImpl.addServerCache(ServerCachingEndpointImpl.java:80)
at org.apache.phoenix.coprocessor.generated.ServerCachingProtos$ServerCachingService.callMethod(ServerCachingProtos.java:8414)
at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:8086)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.execServiceOnRegion(RSRpcServices.java:2068)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.execService(RSRpcServices.java:2050)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:34954)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2339)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:123)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:188)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:168)
Caused by: org.apache.thrift.transport.TTransportException: Cannot read. Remote side has closed. Tried to read 2 bytes, but only got 1 bytes. (This is often indicative of an internal error on the server side. Please check your server logs.)
at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86)
at org.apache.thrift.protocol.TBinaryProtocol.readAll(TBinaryProtocol.java:378)
at org.apache.thrift.protocol.TBinaryProtocol.readI16(TBinaryProtocol.java:278)
at org.apache.thrift.protocol.TBinaryProtocol.readFieldBegin(TBinaryProtocol.java:229)
at org.apache.tephra.distributed.thrift.TTransaction$TTransactionStandardScheme.read(TTransaction.java:1028)
at org.apache.tephra.distributed.thrift.TTransaction$TTransactionStandardScheme.read(TTransaction.java:1021)
at org.apache.tephra.distributed.thrift.TTransaction.read(TTransaction.java:921)
at org.apache.thrift.TDeserializer.deserialize(TDeserializer.java:69)
at org.apache.tephra.TransactionCodec.decode(TransactionCodec.java:51)
at org.apache.phoenix.transaction.TephraTransactionContext.<init>(TephraTransactionContext.java:71)
at org.apache.phoenix.transaction.TephraTransactionProvider.getTransactionContext(TephraTransactionProvider.java:64)
at org.apache.phoenix.transaction.TransactionFactory.getTransactionContext(TransactionFactory.java:70)
at org.apache.phoenix.index.IndexMetaDataCacheFactory.newCache(IndexMetaDataCacheFactory.java:55)
at org.apache.phoenix.cache.TenantCacheImpl.addServerCache(TenantCacheImpl.java:113)
at org.apache.phoenix.coprocessor.ServerCachingEndpointImpl.addServerCache(ServerCachingEndpointImpl.java:76)
... 9 more

at org.apache.phoenix.end2end.ViewIT.testViewUsesTableIndex(ViewIT.java:535)
at org.apache.phoenix.end2end.ViewIT.testViewUsesTableLocalIndex(ViewIT.java:510)
Caused by: java.lang.Exception: 
org.apache.hadoop.hbase.DoNotRetryIOException: org.apache.hadoop.hbase.DoNotRetryIOException: org.apache.thrift.transport.TTransportException: Cannot read. Remote side has closed. Tried to read 2 bytes, but only got 1 bytes. (This is often indicative of an internal error on the server side. Please check your server logs.)
at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:109)
at org.apache.phoenix.coprocessor.ServerCachingEndpointImpl.addServerCache(ServerCachingEndpointImpl.java:80)
at org.apache.phoenix.coprocessor.generated.ServerCachingProtos$ServerCachingService.callMethod(ServerCachingProtos.java:8414)
at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:8086)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.execServiceOnRegion(RSRpcServices.java:2068)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.execService(RSRpcServices.java:2050)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:34954)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2339)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:123)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:188)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:168)
Caused by: org.apache.thrift.transport.TTransportException: Cannot read. Remote side has closed. Tried to read 2 bytes, but only got 1 bytes. (This is often indicative of an internal error on the server side. Please check your server logs.)
at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86)
at org.apache.thrift.protocol.TBinaryProtocol.readAll(TBinaryProtocol.java:378)
at org.apache.thrift.protocol.TBinaryProtocol.readI16(TBinaryProtocol.java:278)
at org.apache.thrift.protocol.TBinaryProtocol.readFieldBegin(TBinaryProtocol.java:229)
at org.apache.tephra.distributed.thrift.TTransaction$TTransactionStandardScheme.read(TTransaction.java:1028)
at org.apache.tephra.distributed.thrift.TTransaction$TTransactionStandardScheme.read(TTransaction.java:1021)
at org.apache.tephra.distributed.thrift.TTransaction.read(TTransaction.java:921)
at org.apache.thrift.TDeserializer.deserialize(TDeserializer.java:69)
at org.apache.tephra.TransactionCodec.decode(TransactionCodec.java:51)
at org.apache.phoenix.transaction.TephraTransactionContext.<init>(TephraTransactionContext.java:71)
at org.apache.phoenix.transaction.TephraTransactionProvider.getTransactionContext(TephraTransactionProvider.java:64)
at org.apache.phoenix.transaction.TransactionFactory.getTransactionContext(TransactionFactory.java:70)
at org.apache.phoenix.index.IndexMetaDataCacheFactory.newCache(IndexMetaDataCacheFactory.java:55)
at org.apache.phoenix.cache.TenantCacheImpl.addServerCache(TenantCacheImpl.java:113)
at org.apache.phoenix.coprocessor.ServerCachingEndpointImpl.addServerCache(ServerCachingEndpointImpl.java:76)
... 9 more

Caused by: org.apache.hadoop.hbase.DoNotRetryIOException: 
org.apache.hadoop.hbase.DoNotRetryIOException: org.apache.thrift.transport.TTransportException: Cannot read. Remote side has closed. Tried to read 2 bytes, but only got 1 bytes. (This is often indicative of an internal error on the server side. Please check your server logs.)
at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:109)
at org.apache.phoenix.coprocessor.ServerCachingEndpointImpl.addServerCache(ServerCachingEndpointImpl.java:80)
at org.apache.phoenix.coprocessor.generated.ServerCachingProtos$ServerCachingService.callMethod(ServerCachingProtos.java:8414)
at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:8086)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.execServiceOnRegion(RSRpcServices.java:2068)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.execService(RSRpcServices.java:2050)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:34954)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2339)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:123)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:188)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:168)
Caused by: org.apache.thrift.transport.TTransportException: Cannot read. Remote side has closed. Tried to read 2 bytes, but only got 1 bytes. (This is often indicative of an internal error on the server side. Please check your server logs.)
at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86)
at org.apache.thrift.protocol.TBinaryProtocol.readAll(TBinaryProtocol.java:378)
at org.apache.thrift.protocol.TBinaryProtocol.readI16(TBinaryProtocol.java:278)
at org.apache.thrift.protocol.TBinaryProtocol.readFieldBegin(TBinaryProtocol.java:229)
at org.apache.tephra.distributed.thrift.TTransaction$TTransactionStandardScheme.read(TTransaction.java:1028)
at org.apache.tephra.distributed.thrift.TTransaction$TTransactionStandardScheme.read(TTransaction.java:1021)
at org.apache.tephra.distributed.thrift.TTransaction.read(TTransaction.java:921)
at org.apache.thrift.TDeserializer.deserialize(TDeserializer.java:69)
at org.apache.tephra.TransactionCodec.decode(TransactionCodec.java:51)
at org.apache.phoenix.transaction.TephraTransactionContext.<init>(TephraTransactionContext.java:71)
at org.apache.phoenix.transaction.TephraTransactionProvider.getTransactionContext(TephraTransactionProvider.java:64)
at org.apache.phoenix.transaction.TransactionFactory.getTransactionContext(TransactionFactory.java:70)
at org.apache.phoenix.index.IndexMetaDataCacheFactory.newCache(IndexMetaDataCacheFactory.java:55)
at org.apache.phoenix.cache.TenantCacheImpl.addServerCache(TenantCacheImpl.java:113)
at org.apache.phoenix.coprocessor.ServerCachingEndpointImpl.addServerCache(ServerCachingEndpointImpl.java:76)
... 9 more

Caused by: org.apache.hadoop.hbase.ipc.RemoteWithExtrasException: 
org.apache.hadoop.hbase.DoNotRetryIOException: org.apache.thrift.transport.TTransportException: Cannot read. Remote side has closed. Tried to read 2 bytes, but only got 1 bytes. (This is often indicative of an internal error on the server side. Please check your server logs.)
at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:109)
at org.apache.phoenix.coprocessor.ServerCachingEndpointImpl.addServerCache(ServerCachingEndpointImpl.java:80)
at org.apache.phoenix.coprocessor.generated.ServerCachingProtos$ServerCachingService.callMethod(ServerCachingProtos.java:8414)
at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:8086)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.execServiceOnRegion(RSRpcServices.java:2068)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.execService(RSRpcServices.java:2050)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:34954)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2339)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:123)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:188)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:168)
Caused by: org.apache.thrift.transport.TTransportException: Cannot read. Remote side has closed. Tried to read 2 bytes, but only got 1 bytes. (This is often indicative of an internal error on the server side. Please check your server logs.)
at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86)
at org.apache.thrift.protocol.TBinaryProtocol.readAll(TBinaryProtocol.java:378)
at org.apache.thrift.protocol.TBinaryProtocol.readI16(TBinaryProtocol.java:278)
at org.apache.thrift.protocol.TBinaryProtocol.readFieldBegin(TBinaryProtocol.java:229)
at org.apache.tephra.distributed.thrift.TTransaction$TTransactionStandardScheme.read(TTransaction.java:1028)
at org.apache.tephra.distributed.thrift.TTransaction$TTransactionStandardScheme.read(TTransaction.java:1021)
at org.apache.tephra.distributed.thrift.TTransaction.read(TTransaction.java:921)
at org.apache.thrift.TDeserializer.deserialize(TDeserializer.java:69)
at org.apache.tephra.TransactionCodec.decode(TransactionCodec.java:51)
at org.apache.phoenix.transaction.TephraTransactionContext.<init>(TephraTransactionContext.java:71)
at org.apache.phoenix.transaction.TephraTransactionProvider.getTransactionContext(TephraTransactionProvider.java:64)
at org.apache.phoenix.transaction.TransactionFactory.getTransactionContext(TransactionFactory.java:70)
at org.apache.phoenix.index.IndexMetaDataCacheFactory.newCache(IndexMetaDataCacheFactory.java:55)
at org.apache.phoenix.cache.TenantCacheImpl.addServerCache(TenantCacheImpl.java:113)
at org.apache.phoenix.coprocessor.ServerCachingEndpointImpl.addServerCache(ServerCachingEndpointImpl.java:76)
... 9 more


[ERROR] testViewUsesTableGlobalIndex[transactional = true](org.apache.phoenix.end2end.ViewIT) Time elapsed: 9.573 s <<< ERROR!
org.apache.phoenix.exception.PhoenixIOException: 
org.apache.phoenix.exception.PhoenixIOException: org.apache.hadoop.hbase.DoNotRetryIOException: S_T000210.I_T000212,,1527229504676.c3a079d8038434a97b349bef12a3f5f9.: Negative column qualifier-2146712960 not allowed 
at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:120)
at org.apache.phoenix.util.ServerUtil.throwIOException(ServerUtil.java:86)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:212)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.BaseScannerRegionObserver$RegionScannerHolder.nextRaw(BaseScannerRegionObserver.java:288)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2629)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2833)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:34950)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2339)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:123)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:188)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:168)
Caused by: java.lang.IllegalArgumentException: Negative column qualifier-2146712960 not allowed 
at org.apache.phoenix.util.EncodedColumnsUtil.isReservedColumnQualifier(EncodedColumnsUtil.java:196)
at org.apache.phoenix.schema.PTable$QualifierEncodingScheme.getReservedQualifier(PTable.java:494)
at org.apache.phoenix.schema.PTable$QualifierEncodingScheme.access$400(PTable.java:249)
at org.apache.phoenix.schema.PTable$QualifierEncodingScheme$3.decode(PTable.java:340)
at org.apache.phoenix.filter.MultiEncodedCQKeyValueComparisonFilter.filterKeyValue(MultiEncodedCQKeyValueComparisonFilter.java:229)
at org.apache.hadoop.hbase.filter.FilterList.filterKeyValue(FilterList.java:264)
at org.apache.hadoop.hbase.regionserver.ScanQueryMatcher.match(ScanQueryMatcher.java:437)
at org.apache.hadoop.hbase.regionserver.StoreScanner.next(StoreScanner.java:525)
at org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:147)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.populateResult(HRegion.java:5921)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextInternal(HRegion.java:6084)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5858)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5844)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:116)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:96)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:175)
... 10 more

at org.apache.phoenix.end2end.ViewIT.testViewUsesTableIndex(ViewIT.java:547)
at org.apache.phoenix.end2end.ViewIT.testViewUsesTableGlobalIndex(ViewIT.java:505)
Caused by: java.util.concurrent.ExecutionException: 
org.apache.phoenix.exception.PhoenixIOException: org.apache.hadoop.hbase.DoNotRetryIOException: S_T000210.I_T000212,,1527229504676.c3a079d8038434a97b349bef12a3f5f9.: Negative column qualifier-2146712960 not allowed 
at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:120)
at org.apache.phoenix.util.ServerUtil.throwIOException(ServerUtil.java:86)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:212)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.BaseScannerRegionObserver$RegionScannerHolder.nextRaw(BaseScannerRegionObserver.java:288)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2629)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2833)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:34950)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2339)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:123)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:188)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:168)
Caused by: java.lang.IllegalArgumentException: Negative column qualifier-2146712960 not allowed 
at org.apache.phoenix.util.EncodedColumnsUtil.isReservedColumnQualifier(EncodedColumnsUtil.java:196)
at org.apache.phoenix.schema.PTable$QualifierEncodingScheme.getReservedQualifier(PTable.java:494)
at org.apache.phoenix.schema.PTable$QualifierEncodingScheme.access$400(PTable.java:249)
at org.apache.phoenix.schema.PTable$QualifierEncodingScheme$3.decode(PTable.java:340)
at org.apache.phoenix.filter.MultiEncodedCQKeyValueComparisonFilter.filterKeyValue(MultiEncodedCQKeyValueComparisonFilter.java:229)
at org.apache.hadoop.hbase.filter.FilterList.filterKeyValue(FilterList.java:264)
at org.apache.hadoop.hbase.regionserver.ScanQueryMatcher.match(ScanQueryMatcher.java:437)
at org.apache.hadoop.hbase.regionserver.StoreScanner.next(StoreScanner.java:525)
at org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:147)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.populateResult(HRegion.java:5921)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextInternal(HRegion.java:6084)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5858)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5844)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:116)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:96)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:175)
... 10 more

at org.apache.phoenix.end2end.ViewIT.testViewUsesTableIndex(ViewIT.java:547)
at org.apache.phoenix.end2end.ViewIT.testViewUsesTableGlobalIndex(ViewIT.java:505)
Caused by: org.apache.phoenix.exception.PhoenixIOException: 
org.apache.hadoop.hbase.DoNotRetryIOException: S_T000210.I_T000212,,1527229504676.c3a079d8038434a97b349bef12a3f5f9.: Negative column qualifier-2146712960 not allowed 
at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:120)
at org.apache.phoenix.util.ServerUtil.throwIOException(ServerUtil.java:86)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:212)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.BaseScannerRegionObserver$RegionScannerHolder.nextRaw(BaseScannerRegionObserver.java:288)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2629)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2833)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:34950)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2339)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:123)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:188)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:168)
Caused by: java.lang.IllegalArgumentException: Negative column qualifier-2146712960 not allowed 
at org.apache.phoenix.util.EncodedColumnsUtil.isReservedColumnQualifier(EncodedColumnsUtil.java:196)
at org.apache.phoenix.schema.PTable$QualifierEncodingScheme.getReservedQualifier(PTable.java:494)
at org.apache.phoenix.schema.PTable$QualifierEncodingScheme.access$400(PTable.java:249)
at org.apache.phoenix.schema.PTable$QualifierEncodingScheme$3.decode(PTable.java:340)
at org.apache.phoenix.filter.MultiEncodedCQKeyValueComparisonFilter.filterKeyValue(MultiEncodedCQKeyValueComparisonFilter.java:229)
at org.apache.hadoop.hbase.filter.FilterList.filterKeyValue(FilterList.java:264)
at org.apache.hadoop.hbase.regionserver.ScanQueryMatcher.match(ScanQueryMatcher.java:437)
at org.apache.hadoop.hbase.regionserver.StoreScanner.next(StoreScanner.java:525)
at org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:147)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.populateResult(HRegion.java:5921)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextInternal(HRegion.java:6084)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5858)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5844)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:116)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:96)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:175)
... 10 more

Caused by: org.apache.phoenix.exception.PhoenixIOException: 
org.apache.hadoop.hbase.DoNotRetryIOException: S_T000210.I_T000212,,1527229504676.c3a079d8038434a97b349bef12a3f5f9.: Negative column qualifier-2146712960 not allowed 
at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:120)
at org.apache.phoenix.util.ServerUtil.throwIOException(ServerUtil.java:86)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:212)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.BaseScannerRegionObserver$RegionScannerHolder.nextRaw(BaseScannerRegionObserver.java:288)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2629)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2833)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:34950)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2339)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:123)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:188)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:168)
Caused by: java.lang.IllegalArgumentException: Negative column qualifier-2146712960 not allowed 
at org.apache.phoenix.util.EncodedColumnsUtil.isReservedColumnQualifier(EncodedColumnsUtil.java:196)
at org.apache.phoenix.schema.PTable$QualifierEncodingScheme.getReservedQualifier(PTable.java:494)
at org.apache.phoenix.schema.PTable$QualifierEncodingScheme.access$400(PTable.java:249)
at org.apache.phoenix.schema.PTable$QualifierEncodingScheme$3.decode(PTable.java:340)
at org.apache.phoenix.filter.MultiEncodedCQKeyValueComparisonFilter.filterKeyValue(MultiEncodedCQKeyValueComparisonFilter.java:229)
at org.apache.hadoop.hbase.filter.FilterList.filterKeyValue(FilterList.java:264)
at org.apache.hadoop.hbase.regionserver.ScanQueryMatcher.match(ScanQueryMatcher.java:437)
at org.apache.hadoop.hbase.regionserver.StoreScanner.next(StoreScanner.java:525)
at org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:147)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.populateResult(HRegion.java:5921)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextInternal(HRegion.java:6084)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5858)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5844)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:116)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:96)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:175)
... 10 more

Caused by: org.apache.hadoop.hbase.DoNotRetryIOException: 
org.apache.hadoop.hbase.DoNotRetryIOException: S_T000210.I_T000212,,1527229504676.c3a079d8038434a97b349bef12a3f5f9.: Negative column qualifier-2146712960 not allowed 
at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:120)
at org.apache.phoenix.util.ServerUtil.throwIOException(ServerUtil.java:86)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:212)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.BaseScannerRegionObserver$RegionScannerHolder.nextRaw(BaseScannerRegionObserver.java:288)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2629)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2833)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:34950)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2339)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:123)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:188)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:168)
Caused by: java.lang.IllegalArgumentException: Negative column qualifier-2146712960 not allowed 
at org.apache.phoenix.util.EncodedColumnsUtil.isReservedColumnQualifier(EncodedColumnsUtil.java:196)
at org.apache.phoenix.schema.PTable$QualifierEncodingScheme.getReservedQualifier(PTable.java:494)
at org.apache.phoenix.schema.PTable$QualifierEncodingScheme.access$400(PTable.java:249)
at org.apache.phoenix.schema.PTable$QualifierEncodingScheme$3.decode(PTable.java:340)
at org.apache.phoenix.filter.MultiEncodedCQKeyValueComparisonFilter.filterKeyValue(MultiEncodedCQKeyValueComparisonFilter.java:229)
at org.apache.hadoop.hbase.filter.FilterList.filterKeyValue(FilterList.java:264)
at org.apache.hadoop.hbase.regionserver.ScanQueryMatcher.match(ScanQueryMatcher.java:437)
at org.apache.hadoop.hbase.regionserver.StoreScanner.next(StoreScanner.java:525)
at org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:147)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.populateResult(HRegion.java:5921)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextInternal(HRegion.java:6084)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5858)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5844)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:116)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:96)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:175)
... 10 more

Caused by: org.apache.hadoop.hbase.ipc.RemoteWithExtrasException: 
org.apache.hadoop.hbase.DoNotRetryIOException: S_T000210.I_T000212,,1527229504676.c3a079d8038434a97b349bef12a3f5f9.: Negative column qualifier-2146712960 not allowed 
at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:120)
at org.apache.phoenix.util.ServerUtil.throwIOException(ServerUtil.java:86)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:212)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.BaseScannerRegionObserver$RegionScannerHolder.nextRaw(BaseScannerRegionObserver.java:288)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2629)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2833)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:34950)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2339)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:123)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:188)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:168)
Caused by: java.lang.IllegalArgumentException: Negative column qualifier-2146712960 not allowed 
at org.apache.phoenix.util.EncodedColumnsUtil.isReservedColumnQualifier(EncodedColumnsUtil.java:196)
at org.apache.phoenix.schema.PTable$QualifierEncodingScheme.getReservedQualifier(PTable.java:494)
at org.apache.phoenix.schema.PTable$QualifierEncodingScheme.access$400(PTable.java:249)
at org.apache.phoenix.schema.PTable$QualifierEncodingScheme$3.decode(PTable.java:340)
at org.apache.phoenix.filter.MultiEncodedCQKeyValueComparisonFilter.filterKeyValue(MultiEncodedCQKeyValueComparisonFilter.java:229)
at org.apache.hadoop.hbase.filter.FilterList.filterKeyValue(FilterList.java:264)
at org.apache.hadoop.hbase.regionserver.ScanQueryMatcher.match(ScanQueryMatcher.java:437)
at org.apache.hadoop.hbase.regionserver.StoreScanner.next(StoreScanner.java:525)
at org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:147)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.populateResult(HRegion.java:5921)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextInternal(HRegion.java:6084)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5858)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5844)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:116)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:96)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:175)
... 10 more


[ERROR] testNonSaltedUpdatableViewWithLocalIndex[transactional = true](org.apache.phoenix.end2end.ViewIT) Time elapsed: 540.726 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,50688,1527229277265, 
at org.apache.phoenix.end2end.ViewIT.testNonSaltedUpdatableViewWithLocalIndex(ViewIT.java:140)
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,50688,1527229277265, 
at org.apache.phoenix.end2end.ViewIT.testNonSaltedUpdatableViewWithLocalIndex(ViewIT.java:140)

[INFO] 
[INFO] Results:
[INFO] 
[ERROR] Errors: 
[ERROR] SaltedViewIT.testSaltedUpdatableViewWithLocalIndex:43->BaseViewIT.testUpdatableViewWithIndex:81->BaseViewIT.testUpdatableViewIndex:175 » Commit
[ERROR] ViewIT.testNonSaltedUpdatableViewWithLocalIndex:140->BaseViewIT.testUpdatableViewWithIndex:81->BaseViewIT.testUpdatableViewIndex:175 » Commit
[ERROR] ViewIT.testViewUsesTableGlobalIndex:505->testViewUsesTableIndex:547 » PhoenixIO
[ERROR] ViewIT.testViewUsesTableLocalIndex:510->testViewUsesTableIndex:535 » SQL java....
[INFO] 
[ERROR] Tests run: 158, Failures: 0, Errors: 4, Skipped: 0
[INFO] 
[INFO] 
[INFO] --- maven-failsafe-plugin:2.20:integration-test (ParallelStatsDisabledTest) @ phoenix-core ---
[INFO] 
[INFO] -------------------------------------------------------
[INFO] T E S T S
[INFO] -------------------------------------------------------
[INFO] Running org.apache.phoenix.end2end.AggregateIT
[INFO] Running org.apache.phoenix.end2end.AlterMultiTenantTableWithViewsIT
[INFO] Running org.apache.phoenix.end2end.AbsFunctionEnd2EndIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.436 s - in org.apache.phoenix.end2end.AbsFunctionEnd2EndIT
[INFO] Running org.apache.phoenix.end2end.AlterSessionIT
[INFO] Running org.apache.phoenix.end2end.AggregateQueryIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.539 s - in org.apache.phoenix.end2end.AlterSessionIT
[INFO] Running org.apache.phoenix.end2end.AlterTableIT
[INFO] Tests run: 22, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 82.266 s - in org.apache.phoenix.end2end.AggregateIT
[INFO] Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 95.134 s - in org.apache.phoenix.end2end.AlterMultiTenantTableWithViewsIT
[INFO] Running org.apache.phoenix.end2end.AppendOnlySchemaIT
[INFO] Running org.apache.phoenix.end2end.AlterTableWithViewsIT
[INFO] Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 31.55 s - in org.apache.phoenix.end2end.AppendOnlySchemaIT
[INFO] Running org.apache.phoenix.end2end.ArithmeticQueryIT
[INFO] Tests run: 26, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 63.985 s - in org.apache.phoenix.end2end.ArithmeticQueryIT
[INFO] Running org.apache.phoenix.end2end.Array1IT
[INFO] Tests run: 42, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 266.607 s - in org.apache.phoenix.end2end.AggregateQueryIT
[INFO] Tests run: 27, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 66.405 s - in org.apache.phoenix.end2end.Array1IT
[INFO] Running org.apache.phoenix.end2end.Array2IT
[INFO] Running org.apache.phoenix.end2end.Array3IT
[INFO] Tests run: 52, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 337.827 s - in org.apache.phoenix.end2end.AlterTableIT
[INFO] Tests run: 27, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 57.07 s - in org.apache.phoenix.end2end.Array2IT
[INFO] Running org.apache.phoenix.end2end.ArrayConcatFunctionIT
[INFO] Tests run: 26, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 59.04 s - in org.apache.phoenix.end2end.Array3IT
[INFO] Running org.apache.phoenix.end2end.ArrayFillFunctionIT
[INFO] Running org.apache.phoenix.end2end.ArrayAppendFunctionIT
[INFO] Tests run: 31, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 75.023 s - in org.apache.phoenix.end2end.ArrayConcatFunctionIT
[INFO] Tests run: 26, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 75.171 s - in org.apache.phoenix.end2end.ArrayFillFunctionIT
[INFO] Running org.apache.phoenix.end2end.ArrayPrependFunctionIT
[INFO] Tests run: 35, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 84.289 s - in org.apache.phoenix.end2end.ArrayAppendFunctionIT
[INFO] Running org.apache.phoenix.end2end.ArrayRemoveFunctionIT
[INFO] Running org.apache.phoenix.end2end.ArrayToStringFunctionIT
[INFO] Tests run: 34, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 81.762 s - in org.apache.phoenix.end2end.ArrayPrependFunctionIT
[INFO] Tests run: 28, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 72.811 s - in org.apache.phoenix.end2end.ArrayRemoveFunctionIT
[ERROR] Tests run: 52, Failures: 0, Errors: 4, Skipped: 0, Time elapsed: 439.499 s <<< FAILURE! - in org.apache.phoenix.end2end.AlterTableWithViewsIT
[ERROR] testMakeBaseTableTransactional[AlterTableWithViewsIT_multiTenant=false, columnEncoded=false](org.apache.phoenix.end2end.AlterTableWithViewsIT) Time elapsed: 2.456 s <<< ERROR!
java.sql.SQLException: ERROR 1093 (44A24): Cannot alter table from non transactional to transactional for OMID. tableName=NONTXNTBL_T0000451
at org.apache.phoenix.end2end.AlterTableWithViewsIT.testMakeBaseTableTransactional(AlterTableWithViewsIT.java:784)

[ERROR] testMakeBaseTableTransactional[AlterTableWithViewsIT_multiTenant=false, columnEncoded=true](org.apache.phoenix.end2end.AlterTableWithViewsIT) Time elapsed: 2.517 s <<< ERROR!
java.sql.SQLException: ERROR 1093 (44A24): Cannot alter table from non transactional to transactional for OMID. tableName=NONTXNTBL_T0000601
at org.apache.phoenix.end2end.AlterTableWithViewsIT.testMakeBaseTableTransactional(AlterTableWithViewsIT.java:784)

[ERROR] testMakeBaseTableTransactional[AlterTableWithViewsIT_multiTenant=true, columnEncoded=false](org.apache.phoenix.end2end.AlterTableWithViewsIT) Time elapsed: 4.807 s <<< ERROR!
java.sql.SQLException: ERROR 1093 (44A24): Cannot alter table from non transactional to transactional for OMID. tableName=NONTXNTBL_T0000750
at org.apache.phoenix.end2end.AlterTableWithViewsIT.testMakeBaseTableTransactional(AlterTableWithViewsIT.java:784)

[ERROR] testMakeBaseTableTransactional[AlterTableWithViewsIT_multiTenant=true, columnEncoded=true](org.apache.phoenix.end2end.AlterTableWithViewsIT) Time elapsed: 4.845 s <<< ERROR!
java.sql.SQLException: ERROR 1093 (44A24): Cannot alter table from non transactional to transactional for OMID. tableName=NONTXNTBL_T0000900
at org.apache.phoenix.end2end.AlterTableWithViewsIT.testMakeBaseTableTransactional(AlterTableWithViewsIT.java:784)

[INFO] Running org.apache.phoenix.end2end.ArraysWithNullsIT
[INFO] Running org.apache.phoenix.end2end.AutoCommitIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.275 s - in org.apache.phoenix.end2end.AutoCommitIT
[INFO] Running org.apache.phoenix.end2end.BinaryRowKeyIT
[INFO] Running org.apache.phoenix.end2end.AutoPartitionViewsIT
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.797 s - in org.apache.phoenix.end2end.BinaryRowKeyIT
[INFO] Running org.apache.phoenix.end2end.CSVCommonsLoaderIT
[INFO] Tests run: 36, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 104.443 s - in org.apache.phoenix.end2end.ArrayToStringFunctionIT
[INFO] Tests run: 16, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 36.299 s - in org.apache.phoenix.end2end.ArraysWithNullsIT
[INFO] Running org.apache.phoenix.end2end.CastAndCoerceIT
[INFO] Tests run: 16, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 34.431 s - in org.apache.phoenix.end2end.CSVCommonsLoaderIT
[INFO] Running org.apache.phoenix.end2end.CbrtFunctionEnd2EndIT
[INFO] Running org.apache.phoenix.end2end.CaseStatementIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.156 s - in org.apache.phoenix.end2end.CbrtFunctionEnd2EndIT
[INFO] Running org.apache.phoenix.end2end.CoalesceFunctionIT
[INFO] Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 66.715 s - in org.apache.phoenix.end2end.AutoPartitionViewsIT
[INFO] Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 29.585 s - in org.apache.phoenix.end2end.CoalesceFunctionIT
[INFO] Running org.apache.phoenix.end2end.CollationKeyFunctionIT
[INFO] Running org.apache.phoenix.end2end.ColumnEncodedBytesPropIT
[INFO] Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 27.799 s - in org.apache.phoenix.end2end.CollationKeyFunctionIT
[INFO] Running org.apache.phoenix.end2end.ColumnProjectionOptimizationIT
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 22.002 s - in org.apache.phoenix.end2end.ColumnEncodedBytesPropIT
[INFO] Running org.apache.phoenix.end2end.ConcurrentMutationsIT
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 16.75 s - in org.apache.phoenix.end2end.ColumnProjectionOptimizationIT
[INFO] Running org.apache.phoenix.end2end.ConvertTimezoneFunctionIT
[INFO] Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 18.111 s - in org.apache.phoenix.end2end.ConvertTimezoneFunctionIT
[INFO] Running org.apache.phoenix.end2end.CountDistinctApproximateHyperLogLogIT
[INFO] Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.724 s - in org.apache.phoenix.end2end.CountDistinctApproximateHyperLogLogIT
[INFO] Running org.apache.phoenix.end2end.CreateSchemaIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.52 s - in org.apache.phoenix.end2end.CreateSchemaIT
[INFO] Running org.apache.phoenix.end2end.CreateTableIT
[INFO] Tests run: 22, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 82.009 s - in org.apache.phoenix.end2end.CreateTableIT
[WARNING] Tests run: 15, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 140.272 s - in org.apache.phoenix.end2end.ConcurrentMutationsIT
[INFO] Running org.apache.phoenix.end2end.CursorWithRowValueConstructorIT
[INFO] Running org.apache.phoenix.end2end.CustomEntityDataIT
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.858 s - in org.apache.phoenix.end2end.CustomEntityDataIT
[INFO] Running org.apache.phoenix.end2end.DateArithmeticIT
[INFO] Tests run: 49, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 290.628 s - in org.apache.phoenix.end2end.CastAndCoerceIT
[INFO] Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 20.442 s - in org.apache.phoenix.end2end.DateArithmeticIT
[INFO] Running org.apache.phoenix.end2end.DecodeFunctionIT
[INFO] Tests run: 17, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 47.997 s - in org.apache.phoenix.end2end.CursorWithRowValueConstructorIT
[INFO] Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.57 s - in org.apache.phoenix.end2end.DecodeFunctionIT
[INFO] Running org.apache.phoenix.end2end.DeleteIT
[INFO] Running org.apache.phoenix.end2end.DateTimeIT
[INFO] Running org.apache.phoenix.end2end.DefaultColumnValueIT
[INFO] Tests run: 63, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 374.692 s - in org.apache.phoenix.end2end.CaseStatementIT
[INFO] Tests run: 24, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 76.674 s - in org.apache.phoenix.end2end.DefaultColumnValueIT
[INFO] Running org.apache.phoenix.end2end.DerivedTableIT
[INFO] Running org.apache.phoenix.end2end.DisableLocalIndexIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.572 s - in org.apache.phoenix.end2end.DisableLocalIndexIT
[INFO] Running org.apache.phoenix.end2end.DistinctCountIT
[INFO] Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 25.133 s - in org.apache.phoenix.end2end.DistinctCountIT
[INFO] Running org.apache.phoenix.end2end.DistinctPrefixFilterIT
[INFO] Tests run: 32, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 168.122 s - in org.apache.phoenix.end2end.DeleteIT
[INFO] Tests run: 18, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 64.853 s - in org.apache.phoenix.end2end.DerivedTableIT
[INFO] Running org.apache.phoenix.end2end.DynamicColumnIT
[INFO] Running org.apache.phoenix.end2end.DropTableIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.269 s - in org.apache.phoenix.end2end.DropTableIT
[INFO] Running org.apache.phoenix.end2end.DynamicFamilyIT
[INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.574 s - in org.apache.phoenix.end2end.DynamicFamilyIT
[INFO] Running org.apache.phoenix.end2end.DynamicUpsertIT
[INFO] Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.788 s - in org.apache.phoenix.end2end.DynamicUpsertIT
[INFO] Running org.apache.phoenix.end2end.EncodeFunctionIT
[INFO] Tests run: 56, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 213.517 s - in org.apache.phoenix.end2end.DateTimeIT
[INFO] Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 54.245 s - in org.apache.phoenix.end2end.DynamicColumnIT
[INFO] Running org.apache.phoenix.end2end.ExecuteStatementsIT
[INFO] Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 79.287 s - in org.apache.phoenix.end2end.DistinctPrefixFilterIT
[INFO] Running org.apache.phoenix.end2end.ExpFunctionEnd2EndIT
[INFO] Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 20.453 s - in org.apache.phoenix.end2end.EncodeFunctionIT
[INFO] Running org.apache.phoenix.end2end.ExplainPlanWithStatsDisabledIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 17.873 s - in org.apache.phoenix.end2end.ExecuteStatementsIT
[INFO] Running org.apache.phoenix.end2end.ExtendedQueryExecIT
[INFO] Running org.apache.phoenix.end2end.EvaluationOfORIT
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.889 s - in org.apache.phoenix.end2end.ExpFunctionEnd2EndIT
[INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.165 s - in org.apache.phoenix.end2end.ExtendedQueryExecIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.262 s - in org.apache.phoenix.end2end.EvaluationOfORIT
[INFO] Running org.apache.phoenix.end2end.FlappingAlterTableIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 17.389 s - in org.apache.phoenix.end2end.FlappingAlterTableIT
[INFO] Running org.apache.phoenix.end2end.FunkyNamesIT
[INFO] Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 38.984 s - in org.apache.phoenix.end2end.ExplainPlanWithStatsDisabledIT
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.817 s - in org.apache.phoenix.end2end.FunkyNamesIT
[INFO] Running org.apache.phoenix.end2end.FirstValueFunctionIT
[INFO] Running org.apache.phoenix.end2end.FirstValuesFunctionIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.281 s - in org.apache.phoenix.end2end.FirstValuesFunctionIT
[INFO] Running org.apache.phoenix.end2end.ImmutableTablePropertiesIT
[INFO] Running org.apache.phoenix.end2end.GroupByIT
[INFO] Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.871 s - in org.apache.phoenix.end2end.FirstValueFunctionIT
[INFO] Running org.apache.phoenix.end2end.InListIT
[INFO] Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.608 s - in org.apache.phoenix.end2end.ImmutableTablePropertiesIT
[INFO] Running org.apache.phoenix.end2end.InQueryIT
[INFO] Running org.apache.phoenix.end2end.GetSetByteBitFunctionEnd2EndIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.288 s - in org.apache.phoenix.end2end.GetSetByteBitFunctionEnd2EndIT
[INFO] Running org.apache.phoenix.end2end.InstrFunctionIT
[INFO] Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 18.101 s - in org.apache.phoenix.end2end.InstrFunctionIT
[INFO] Running org.apache.phoenix.end2end.IntArithmeticIT
[INFO] Tests run: 63, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 372.94 s - in org.apache.phoenix.end2end.InQueryIT
[INFO] Tests run: 63, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 405.633 s - in org.apache.phoenix.end2end.GroupByIT
[INFO] Running org.apache.phoenix.end2end.IsNullIT
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.771 s - in org.apache.phoenix.end2end.IsNullIT
[INFO] Running org.apache.phoenix.end2end.LastValuesFunctionIT
[INFO] Running org.apache.phoenix.end2end.LastValueFunctionIT
[INFO] Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 24.941 s - in org.apache.phoenix.end2end.LastValueFunctionIT
[INFO] Running org.apache.phoenix.end2end.LikeExpressionIT
[INFO] Tests run: 70, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 415.616 s - in org.apache.phoenix.end2end.IntArithmeticIT
[INFO] Tests run: 18, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 40.731 s - in org.apache.phoenix.end2end.LastValuesFunctionIT
[INFO] Running org.apache.phoenix.end2end.MD5FunctionIT
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.796 s - in org.apache.phoenix.end2end.MD5FunctionIT
[INFO] Running org.apache.phoenix.end2end.MapReduceIT
[INFO] Running org.apache.phoenix.end2end.LnLogFunctionEnd2EndIT
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 17.606 s - in org.apache.phoenix.end2end.MapReduceIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.639 s - in org.apache.phoenix.end2end.LnLogFunctionEnd2EndIT
[INFO] Running org.apache.phoenix.end2end.MetaDataEndPointIT
[INFO] Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 36.417 s - in org.apache.phoenix.end2end.LikeExpressionIT
[INFO] Running org.apache.phoenix.end2end.MinMaxAggregateFunctionIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.278 s - in org.apache.phoenix.end2end.MinMaxAggregateFunctionIT
[INFO] Running org.apache.phoenix.end2end.ModulusExpressionIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.499 s - in org.apache.phoenix.end2end.MetaDataEndPointIT
[INFO] Running org.apache.phoenix.end2end.MutationStateIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.588 s - in org.apache.phoenix.end2end.MutationStateIT
[INFO] Running org.apache.phoenix.end2end.NamespaceSchemaMappingIT
[INFO] Running org.apache.phoenix.end2end.MappingTableDataTypeIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 11.86 s - in org.apache.phoenix.end2end.NamespaceSchemaMappingIT
[INFO] Running org.apache.phoenix.end2end.NativeHBaseTypesIT
[INFO] Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 25.339 s - in org.apache.phoenix.end2end.ModulusExpressionIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.675 s - in org.apache.phoenix.end2end.MappingTableDataTypeIT
[INFO] Running org.apache.phoenix.end2end.NotQueryWithLocalImmutableIndexesIT
[INFO] Running org.apache.phoenix.end2end.NotQueryWithGlobalImmutableIndexesIT
[INFO] Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 60.369 s - in org.apache.phoenix.end2end.NativeHBaseTypesIT
[INFO] Running org.apache.phoenix.end2end.NthValueFunctionIT
[INFO] Tests run: 14, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 31.994 s - in org.apache.phoenix.end2end.NthValueFunctionIT
[INFO] Running org.apache.phoenix.end2end.NullIT
[INFO] Tests run: 28, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 638.128 s - in org.apache.phoenix.end2end.InListIT
[INFO] Running org.apache.phoenix.end2end.NumericArithmeticIT
[INFO] Tests run: 44, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 170.734 s - in org.apache.phoenix.end2end.NotQueryWithGlobalImmutableIndexesIT
[INFO] Tests run: 21, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 47.786 s - in org.apache.phoenix.end2end.NumericArithmeticIT
[INFO] Running org.apache.phoenix.end2end.OnDuplicateKeyIT
[INFO] Running org.apache.phoenix.end2end.OctetLengthFunctionEnd2EndIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.266 s - in org.apache.phoenix.end2end.OctetLengthFunctionEnd2EndIT
[INFO] Running org.apache.phoenix.end2end.OrderByIT
[INFO] Tests run: 33, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 278.832 s - in org.apache.phoenix.end2end.NotQueryWithLocalImmutableIndexesIT
[INFO] Running org.apache.phoenix.end2end.PartialScannerResultsDisabledIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 18.764 s - in org.apache.phoenix.end2end.PartialScannerResultsDisabledIT
[INFO] Running org.apache.phoenix.end2end.PercentileIT
[INFO] Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 107.279 s - in org.apache.phoenix.end2end.OrderByIT
[INFO] Tests run: 56, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 241.603 s - in org.apache.phoenix.end2end.NullIT
[INFO] Running org.apache.phoenix.end2end.PhoenixRuntimeIT
[INFO] Tests run: 18, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 40.901 s - in org.apache.phoenix.end2end.PercentileIT
[INFO] Running org.apache.phoenix.end2end.PowerFunctionEnd2EndIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.64 s - in org.apache.phoenix.end2end.PowerFunctionEnd2EndIT
[INFO] Running org.apache.phoenix.end2end.PrimitiveTypeIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 22.705 s - in org.apache.phoenix.end2end.PhoenixRuntimeIT
[INFO] Running org.apache.phoenix.end2end.ProductMetricsIT
[INFO] Running org.apache.phoenix.end2end.PointInTimeQueryIT
[INFO] Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 20.383 s - in org.apache.phoenix.end2end.PrimitiveTypeIT
[INFO] Running org.apache.phoenix.end2end.QueryDatabaseMetaDataIT
[INFO] Tests run: 48, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 306.109 s - in org.apache.phoenix.end2end.OnDuplicateKeyIT
[INFO] Tests run: 55, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 125.837 s - in org.apache.phoenix.end2end.ProductMetricsIT
[INFO] Tests run: 18, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 100.217 s - in org.apache.phoenix.end2end.QueryDatabaseMetaDataIT
[INFO] Running org.apache.phoenix.end2end.QueryExecWithoutSCNIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.273 s - in org.apache.phoenix.end2end.QueryExecWithoutSCNIT
[INFO] Running org.apache.phoenix.end2end.QueryWithOffsetIT
[INFO] Running org.apache.phoenix.end2end.QueryIT
[INFO] Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 18.624 s - in org.apache.phoenix.end2end.QueryWithOffsetIT
[INFO] Running org.apache.phoenix.end2end.RTrimFunctionIT
[INFO] Running org.apache.phoenix.end2end.QueryMoreIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.552 s - in org.apache.phoenix.end2end.RTrimFunctionIT
[INFO] Running org.apache.phoenix.end2end.RangeScanIT
[INFO] Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 45.417 s - in org.apache.phoenix.end2end.QueryMoreIT
[INFO] Running org.apache.phoenix.end2end.ReadOnlyIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.674 s - in org.apache.phoenix.end2end.ReadOnlyIT
[INFO] Running org.apache.phoenix.end2end.RegexpReplaceFunctionIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.511 s - in org.apache.phoenix.end2end.RegexpReplaceFunctionIT
[INFO] Running org.apache.phoenix.end2end.RegexpSplitFunctionIT
[INFO] Tests run: 42, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 252.456 s - in org.apache.phoenix.end2end.PointInTimeQueryIT
[INFO] Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 28.025 s - in org.apache.phoenix.end2end.RegexpSplitFunctionIT
[INFO] Running org.apache.phoenix.end2end.RegexpSubstrFunctionIT
[INFO] Running org.apache.phoenix.end2end.ReverseFunctionIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.55 s - in org.apache.phoenix.end2end.RegexpSubstrFunctionIT
[INFO] Running org.apache.phoenix.end2end.ReverseScanIT
[INFO] Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 11.338 s - in org.apache.phoenix.end2end.ReverseFunctionIT
[INFO] Running org.apache.phoenix.end2end.RoundFloorCeilFuncIT
[INFO] Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.678 s - in org.apache.phoenix.end2end.ReverseScanIT
[INFO] Running org.apache.phoenix.end2end.RowTimestampIT
[INFO] Tests run: 16, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 49.499 s - in org.apache.phoenix.end2end.RowTimestampIT
[INFO] Tests run: 33, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 79.439 s - in org.apache.phoenix.end2end.RoundFloorCeilFuncIT
[INFO] Running org.apache.phoenix.end2end.RowValueConstructorIT
[INFO] Running org.apache.phoenix.end2end.SequenceBulkAllocationIT
[INFO] Tests run: 42, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 247.886 s - in org.apache.phoenix.end2end.QueryIT
[INFO] Tests run: 56, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.775 s - in org.apache.phoenix.end2end.SequenceBulkAllocationIT
[INFO] Running org.apache.phoenix.end2end.SerialIteratorsIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.284 s - in org.apache.phoenix.end2end.SerialIteratorsIT
[INFO] Running org.apache.phoenix.end2end.ServerExceptionIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.269 s - in org.apache.phoenix.end2end.ServerExceptionIT
[INFO] Running org.apache.phoenix.end2end.SetPropertyOnEncodedTableIT
[INFO] Running org.apache.phoenix.end2end.SequenceIT
[INFO] Tests run: 55, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 38.394 s - in org.apache.phoenix.end2end.SequenceIT
[INFO] Running org.apache.phoenix.end2end.SetPropertyOnNonEncodedTableIT
[INFO] Tests run: 47, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 125.126 s - in org.apache.phoenix.end2end.RowValueConstructorIT
[INFO] Tests run: 56, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 331.843 s - in org.apache.phoenix.end2end.RangeScanIT
[INFO] Running org.apache.phoenix.end2end.SignFunctionEnd2EndIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.09 s - in org.apache.phoenix.end2end.SignFunctionEnd2EndIT
[INFO] Running org.apache.phoenix.end2end.SkipScanQueryIT
[INFO] Running org.apache.phoenix.end2end.SkipScanAfterManualSplitIT
[INFO] Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 29.918 s - in org.apache.phoenix.end2end.SkipScanQueryIT
[INFO] Running org.apache.phoenix.end2end.SortMergeJoinMoreIT
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 29.262 s - in org.apache.phoenix.end2end.SkipScanAfterManualSplitIT
[INFO] Running org.apache.phoenix.end2end.SortOrderIT
[INFO] Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 43.623 s - in org.apache.phoenix.end2end.SortMergeJoinMoreIT
[INFO] Tests run: 32, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 195.596 s - in org.apache.phoenix.end2end.SetPropertyOnEncodedTableIT
[INFO] Running org.apache.phoenix.end2end.SpooledTmpFileDeleteIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.412 s - in org.apache.phoenix.end2end.SpooledTmpFileDeleteIT
[INFO] Running org.apache.phoenix.end2end.StatementHintsIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.287 s - in org.apache.phoenix.end2end.StatementHintsIT
[INFO] Running org.apache.phoenix.end2end.StddevIT
[INFO] Running org.apache.phoenix.end2end.SqrtFunctionEnd2EndIT
[INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.126 s - in org.apache.phoenix.end2end.StddevIT
[INFO] Running org.apache.phoenix.end2end.StoreNullsIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.082 s - in org.apache.phoenix.end2end.SqrtFunctionEnd2EndIT
[INFO] Running org.apache.phoenix.end2end.StoreNullsPropIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.261 s - in org.apache.phoenix.end2end.StoreNullsPropIT
[INFO] Running org.apache.phoenix.end2end.StringIT
[INFO] Tests run: 32, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 196.197 s - in org.apache.phoenix.end2end.SetPropertyOnNonEncodedTableIT
[INFO] Running org.apache.phoenix.end2end.StringToArrayFunctionIT
[INFO] Tests run: 15, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 61.311 s - in org.apache.phoenix.end2end.StringIT
[INFO] Running org.apache.phoenix.end2end.TenantIdTypeIT
[INFO] Tests run: 46, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 153.384 s - in org.apache.phoenix.end2end.SortOrderIT
[INFO] Running org.apache.phoenix.end2end.TenantSpecificViewIndexIT
[INFO] Tests run: 22, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 63.412 s - in org.apache.phoenix.end2end.StringToArrayFunctionIT
[INFO] Running org.apache.phoenix.end2end.TenantSpecificViewIndexSaltedIT
[INFO] Tests run: 15, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 68.522 s - in org.apache.phoenix.end2end.TenantIdTypeIT
[INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 37.748 s - in org.apache.phoenix.end2end.TenantSpecificViewIndexSaltedIT
[INFO] Running org.apache.phoenix.end2end.TimezoneOffsetFunctionIT
[INFO] Running org.apache.phoenix.end2end.ToCharFunctionIT
[INFO] Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.878 s - in org.apache.phoenix.end2end.TimezoneOffsetFunctionIT
[INFO] Running org.apache.phoenix.end2end.ToDateFunctionIT
[INFO] Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 87.817 s - in org.apache.phoenix.end2end.TenantSpecificViewIndexIT
[INFO] Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.903 s - in org.apache.phoenix.end2end.ToDateFunctionIT
[INFO] Running org.apache.phoenix.end2end.TopNIT
[INFO] Running org.apache.phoenix.end2end.ToNumberFunctionIT
[INFO] Tests run: 18, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.071 s - in org.apache.phoenix.end2end.ToNumberFunctionIT
[INFO] Running org.apache.phoenix.end2end.TruncateFunctionIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.29 s - in org.apache.phoenix.end2end.TruncateFunctionIT
[INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.132 s - in org.apache.phoenix.end2end.TopNIT
[INFO] Running org.apache.phoenix.end2end.UnionAllIT
[INFO] Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 34.06 s - in org.apache.phoenix.end2end.ToCharFunctionIT
[INFO] Running org.apache.phoenix.end2end.UpgradeIT
[INFO] Running org.apache.phoenix.end2end.UngroupedIT
[INFO] Tests run: 32, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 231.595 s - in org.apache.phoenix.end2end.StoreNullsIT
[INFO] Running org.apache.phoenix.end2end.UpperLowerFunctionIT
[INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.016 s - in org.apache.phoenix.end2end.UpperLowerFunctionIT
[INFO] Running org.apache.phoenix.end2end.UpsertBigValuesIT
[INFO] Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 18.182 s - in org.apache.phoenix.end2end.UpsertBigValuesIT
[INFO] Running org.apache.phoenix.end2end.UpsertSelectAutoCommitIT
[INFO] Tests run: 17, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 90.753 s - in org.apache.phoenix.end2end.UnionAllIT
[INFO] Running org.apache.phoenix.end2end.UpsertSelectIT
[INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 35.399 s - in org.apache.phoenix.end2end.UpsertSelectAutoCommitIT
[INFO] Running org.apache.phoenix.end2end.UpsertValuesIT
[INFO] Tests run: 16, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 36.146 s - in org.apache.phoenix.end2end.UpsertValuesIT
[INFO] Running org.apache.phoenix.end2end.UseSchemaIT
[INFO] Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 28.41 s - in org.apache.phoenix.end2end.UseSchemaIT
[INFO] Running org.apache.phoenix.end2end.VariableLengthPKIT
[INFO] Tests run: 26, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 112.348 s - in org.apache.phoenix.end2end.UpsertSelectIT
[INFO] Tests run: 35, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 207.128 s - in org.apache.phoenix.end2end.UngroupedIT
[INFO] Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 226.046 s - in org.apache.phoenix.end2end.UpgradeIT
[INFO] Running org.apache.phoenix.end2end.index.AsyncIndexDisabledIT
[INFO] Running org.apache.phoenix.end2end.index.ChildViewsUseParentViewIndexIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.517 s - in org.apache.phoenix.end2end.index.AsyncIndexDisabledIT
[INFO] Running org.apache.phoenix.end2end.index.DropMetadataIT
[INFO] Running org.apache.phoenix.end2end.index.DropColumnIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.315 s - in org.apache.phoenix.end2end.index.ChildViewsUseParentViewIndexIT
[INFO] Running org.apache.phoenix.end2end.index.GlobalImmutableNonTxIndexIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.719 s - in org.apache.phoenix.end2end.index.DropMetadataIT
[INFO] Running org.apache.phoenix.end2end.index.GlobalImmutableTxIndexIT
[INFO] Tests run: 50, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 122.115 s - in org.apache.phoenix.end2end.VariableLengthPKIT
[INFO] Running org.apache.phoenix.end2end.index.GlobalIndexOptimizationIT
[INFO] Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 34.203 s - in org.apache.phoenix.end2end.index.GlobalIndexOptimizationIT
[INFO] Running org.apache.phoenix.end2end.index.GlobalMutableNonTxIndexIT
[INFO] Tests run: 40, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 181.193 s - in org.apache.phoenix.end2end.index.GlobalImmutableNonTxIndexIT
[ERROR] Tests run: 40, Failures: 4, Errors: 1, Skipped: 0, Time elapsed: 193.7 s <<< FAILURE! - in org.apache.phoenix.end2end.index.GlobalImmutableTxIndexIT
[ERROR] testIndexWithDecimalCol[GlobalImmutableTxIndexIT_localIndex=false,mutable=false,transactional=true,columnEncoded=false](org.apache.phoenix.end2end.index.GlobalImmutableTxIndexIT) Time elapsed: 0.399 s <<< ERROR!
com.google.common.util.concurrent.UncheckedExecutionException: org.jboss.netty.channel.ChannelException: Failed to bind to: 0.0.0.0/0.0.0.0:282
Caused by: org.jboss.netty.channel.ChannelException: Failed to bind to: 0.0.0.0/0.0.0.0:282
Caused by: java.net.SocketException: Permission denied

[ERROR] testCreateIndexAfterUpsertStarted[GlobalImmutableTxIndexIT_localIndex=false,mutable=false,transactional=true,columnEncoded=false](org.apache.phoenix.end2end.index.GlobalImmutableTxIndexIT) Time elapsed: 4.857 s <<< FAILURE!
java.lang.AssertionError: expected:<4> but was:<3>

[ERROR] testCreateIndexAfterUpsertStartedTxnl[GlobalImmutableTxIndexIT_localIndex=false,mutable=false,transactional=true,columnEncoded=false](org.apache.phoenix.end2end.index.GlobalImmutableTxIndexIT) Time elapsed: 4.817 s <<< FAILURE!
java.lang.AssertionError: expected:<4> but was:<3>

[ERROR] testCreateIndexAfterUpsertStarted[GlobalImmutableTxIndexIT_localIndex=false,mutable=false,transactional=true,columnEncoded=true](org.apache.phoenix.end2end.index.GlobalImmutableTxIndexIT) Time elapsed: 4.764 s <<< FAILURE!
java.lang.AssertionError: expected:<4> but was:<3>

[ERROR] testCreateIndexAfterUpsertStartedTxnl[GlobalImmutableTxIndexIT_localIndex=false,mutable=false,transactional=true,columnEncoded=true](org.apache.phoenix.end2end.index.GlobalImmutableTxIndexIT) Time elapsed: 4.742 s <<< FAILURE!
java.lang.AssertionError: expected:<4> but was:<3>

[INFO] Running org.apache.phoenix.end2end.index.GlobalMutableTxIndexIT
[INFO] Running org.apache.phoenix.end2end.index.IndexMaintenanceIT
[INFO] Tests run: 40, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 180.768 s - in org.apache.phoenix.end2end.index.GlobalMutableNonTxIndexIT
[INFO] Tests run: 16, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 319.265 s - in org.apache.phoenix.end2end.index.DropColumnIT
[INFO] Tests run: 14, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 93.095 s - in org.apache.phoenix.end2end.index.IndexMaintenanceIT
[INFO] Running org.apache.phoenix.end2end.index.IndexWithTableSchemaChangeIT
[INFO] Running org.apache.phoenix.end2end.index.IndexMetadataIT
[INFO] Running org.apache.phoenix.end2end.index.IndexUsageIT
[INFO] Tests run: 40, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 199.404 s - in org.apache.phoenix.end2end.index.GlobalMutableTxIndexIT
[INFO] Tests run: 16, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 89.732 s - in org.apache.phoenix.end2end.index.IndexMetadataIT
[INFO] Running org.apache.phoenix.end2end.index.LocalImmutableTxIndexIT
[INFO] Running org.apache.phoenix.end2end.index.LocalImmutableNonTxIndexIT
[INFO] Tests run: 18, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 171.955 s - in org.apache.phoenix.end2end.index.IndexWithTableSchemaChangeIT
[INFO] Running org.apache.phoenix.end2end.index.LocalMutableNonTxIndexIT
[INFO] Tests run: 37, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 234.964 s - in org.apache.phoenix.end2end.index.IndexUsageIT
[INFO] Running org.apache.phoenix.end2end.index.LocalMutableTxIndexIT
[INFO] Tests run: 40, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 339.768 s - in org.apache.phoenix.end2end.index.LocalImmutableNonTxIndexIT
[INFO] Running org.apache.phoenix.end2end.index.MutableIndexIT
[INFO] Tests run: 40, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 339.887 s - in org.apache.phoenix.end2end.index.LocalMutableNonTxIndexIT
[INFO] Running org.apache.phoenix.end2end.index.SaltedIndexIT
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.075 s - in org.apache.phoenix.end2end.index.SaltedIndexIT
[INFO] Running org.apache.phoenix.end2end.index.ViewIndexIT
[WARNING] Tests run: 14, Failures: 0, Errors: 0, Skipped: 2, Time elapsed: 81.349 s - in org.apache.phoenix.end2end.index.ViewIndexIT
[INFO] Running org.apache.phoenix.end2end.index.txn.MutableRollbackIT
[ERROR] Tests run: 6, Failures: 0, Errors: 4, Skipped: 0, Time elapsed: 1,649.021 s <<< FAILURE! - in org.apache.phoenix.end2end.index.txn.MutableRollbackIT
[ERROR] testCheckpointAndRollback[MutableRollbackIT_localIndex=false](org.apache.phoenix.end2end.index.txn.MutableRollbackIT) Time elapsed: 4.774 s <<< ERROR!
org.apache.phoenix.execute.CommitException: java.lang.IllegalArgumentException: Timestamp not allowed in transactional user operations
at org.apache.phoenix.end2end.index.txn.MutableRollbackIT.testCheckpointAndRollback(MutableRollbackIT.java:475)
Caused by: java.lang.IllegalArgumentException: Timestamp not allowed in transactional user operations
at org.apache.phoenix.end2end.index.txn.MutableRollbackIT.testCheckpointAndRollback(MutableRollbackIT.java:475)

[ERROR] testRollbackOfUncommittedExistingRowKeyIndexUpdate[MutableRollbackIT_localIndex=true](org.apache.phoenix.end2end.index.txn.MutableRollbackIT) Time elapsed: 548.915 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,61745,1527234378458, 
at org.apache.phoenix.end2end.index.txn.MutableRollbackIT.testRollbackOfUncommittedExistingRowKeyIndexUpdate(MutableRollbackIT.java:219)
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,61745,1527234378458, 
at org.apache.phoenix.end2end.index.txn.MutableRollbackIT.testRollbackOfUncommittedExistingRowKeyIndexUpdate(MutableRollbackIT.java:219)

[ERROR] testCheckpointAndRollback[MutableRollbackIT_localIndex=true](org.apache.phoenix.end2end.index.txn.MutableRollbackIT) Time elapsed: 540.451 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,61745,1527234378458, 
at org.apache.phoenix.end2end.index.txn.MutableRollbackIT.testCheckpointAndRollback(MutableRollbackIT.java:457)
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,61745,1527234378458, 
at org.apache.phoenix.end2end.index.txn.MutableRollbackIT.testCheckpointAndRollback(MutableRollbackIT.java:457)

[ERROR] testMultiRollbackOfUncommittedExistingRowKeyIndexUpdate[MutableRollbackIT_localIndex=true](org.apache.phoenix.end2end.index.txn.MutableRollbackIT) Time elapsed: 540.236 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,61745,1527234378458, 
at org.apache.phoenix.end2end.index.txn.MutableRollbackIT.testMultiRollbackOfUncommittedExistingRowKeyIndexUpdate(MutableRollbackIT.java:354)
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,61745,1527234378458, 
at org.apache.phoenix.end2end.index.txn.MutableRollbackIT.testMultiRollbackOfUncommittedExistingRowKeyIndexUpdate(MutableRollbackIT.java:354)

[INFO] Running org.apache.phoenix.end2end.index.txn.RollbackIT
[ERROR] Tests run: 8, Failures: 0, Errors: 4, Skipped: 0, Time elapsed: 2,180.311 s <<< FAILURE! - in org.apache.phoenix.end2end.index.txn.RollbackIT
[ERROR] testRollbackOfUncommittedRowKeyIndexInsert[RollbackIT_localIndex=true,mutable=false](org.apache.phoenix.end2end.index.txn.RollbackIT) Time elapsed: 539.807 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,61745,1527234378458, 
at org.apache.phoenix.end2end.index.txn.RollbackIT.testRollbackOfUncommittedRowKeyIndexInsert(RollbackIT.java:128)
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,61745,1527234378458, 
at org.apache.phoenix.end2end.index.txn.RollbackIT.testRollbackOfUncommittedRowKeyIndexInsert(RollbackIT.java:128)

[ERROR] testRollbackOfUncommittedKeyValueIndexInsert[RollbackIT_localIndex=true,mutable=false](org.apache.phoenix.end2end.index.txn.RollbackIT) Time elapsed: 540.445 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,61745,1527234378458, 
at org.apache.phoenix.end2end.index.txn.RollbackIT.testRollbackOfUncommittedKeyValueIndexInsert(RollbackIT.java:85)
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,61745,1527234378458, 
at org.apache.phoenix.end2end.index.txn.RollbackIT.testRollbackOfUncommittedKeyValueIndexInsert(RollbackIT.java:85)

[ERROR] testRollbackOfUncommittedRowKeyIndexInsert[RollbackIT_localIndex=true,mutable=true](org.apache.phoenix.end2end.index.txn.RollbackIT) Time elapsed: 540.347 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,61745,1527234378458, 
at org.apache.phoenix.end2end.index.txn.RollbackIT.testRollbackOfUncommittedRowKeyIndexInsert(RollbackIT.java:128)
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,61745,1527234378458, 
at org.apache.phoenix.end2end.index.txn.RollbackIT.testRollbackOfUncommittedRowKeyIndexInsert(RollbackIT.java:128)

[ERROR] testRollbackOfUncommittedKeyValueIndexInsert[RollbackIT_localIndex=true,mutable=true](org.apache.phoenix.end2end.index.txn.RollbackIT) Time elapsed: 540.402 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,61745,1527234378458, 
at org.apache.phoenix.end2end.index.txn.RollbackIT.testRollbackOfUncommittedKeyValueIndexInsert(RollbackIT.java:85)
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,61745,1527234378458, 
at org.apache.phoenix.end2end.index.txn.RollbackIT.testRollbackOfUncommittedKeyValueIndexInsert(RollbackIT.java:85)

[INFO] Running org.apache.phoenix.end2end.join.HashJoinCacheIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.681 s - in org.apache.phoenix.end2end.join.HashJoinCacheIT
[INFO] Running org.apache.phoenix.end2end.join.HashJoinGlobalIndexIT
[INFO] Tests run: 33, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 324.487 s - in org.apache.phoenix.end2end.join.HashJoinGlobalIndexIT
[INFO] Running org.apache.phoenix.end2end.join.HashJoinLocalIndexIT
[INFO] Tests run: 34, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 590.536 s - in org.apache.phoenix.end2end.join.HashJoinLocalIndexIT
[INFO] Running org.apache.phoenix.end2end.join.HashJoinMoreIT
[INFO] Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 52.79 s - in org.apache.phoenix.end2end.join.HashJoinMoreIT
[INFO] Running org.apache.phoenix.end2end.join.HashJoinNoIndexIT
[ERROR] Tests run: 120, Failures: 2, Errors: 10, Skipped: 0, Time elapsed: 5,120.941 s <<< FAILURE! - in org.apache.phoenix.end2end.index.MutableIndexIT
[ERROR] testMultipleUpdatesToSingleRow[MutableIndexIT_localIndex=true,transactional=OMID,columnEncoded=false](org.apache.phoenix.end2end.index.MutableIndexIT) Time elapsed: 540.015 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,62716,1527234175780, 
at org.apache.phoenix.end2end.index.MutableIndexIT.testMultipleUpdatesToSingleRow(MutableIndexIT.java:476)
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,62716,1527234175780, 
at org.apache.phoenix.end2end.index.MutableIndexIT.testMultipleUpdatesToSingleRow(MutableIndexIT.java:476)

[ERROR] testCompoundIndexKey[MutableIndexIT_localIndex=true,transactional=OMID,columnEncoded=false](org.apache.phoenix.end2end.index.MutableIndexIT) Time elapsed: 540.121 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,62716,1527234175780, 
at org.apache.phoenix.end2end.index.MutableIndexIT.testCompoundIndexKey(MutableIndexIT.java:354)
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,62716,1527234175780, 
at org.apache.phoenix.end2end.index.MutableIndexIT.testCompoundIndexKey(MutableIndexIT.java:354)

[ERROR] testCoveredColumns[MutableIndexIT_localIndex=true,transactional=OMID,columnEncoded=false](org.apache.phoenix.end2end.index.MutableIndexIT) Time elapsed: 540.442 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,62716,1527234175780, 
at org.apache.phoenix.end2end.index.MutableIndexIT.testCoveredColumns(MutableIndexIT.java:245)
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,62716,1527234175780, 
at org.apache.phoenix.end2end.index.MutableIndexIT.testCoveredColumns(MutableIndexIT.java:245)

[ERROR] testCoveredColumnUpdates[MutableIndexIT_localIndex=true,transactional=OMID,columnEncoded=false](org.apache.phoenix.end2end.index.MutableIndexIT) Time elapsed: 9.083 s <<< FAILURE!
java.lang.AssertionError
at org.apache.phoenix.end2end.index.MutableIndexIT.testCoveredColumnUpdates(MutableIndexIT.java:144)

[ERROR] testUpsertingNullForIndexedColumns[MutableIndexIT_localIndex=true,transactional=OMID,columnEncoded=false](org.apache.phoenix.end2end.index.MutableIndexIT) Time elapsed: 539.702 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,62716,1527234175780, 
at org.apache.phoenix.end2end.index.MutableIndexIT.testUpsertingNullForIndexedColumns(MutableIndexIT.java:544)
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,62716,1527234175780, 
at org.apache.phoenix.end2end.index.MutableIndexIT.testUpsertingNullForIndexedColumns(MutableIndexIT.java:544)

[ERROR] testIndexHalfStoreFileReader[MutableIndexIT_localIndex=true,transactional=OMID,columnEncoded=false](org.apache.phoenix.end2end.index.MutableIndexIT) Time elapsed: 8.965 s <<< ERROR!
java.sql.SQLException: 
java.util.concurrent.ExecutionException: java.lang.Exception: org.apache.hadoop.hbase.DoNotRetryIOException: org.apache.hadoop.hbase.DoNotRetryIOException: org.apache.thrift.transport.TTransportException: Cannot read. Remote side has closed. Tried to read 2 bytes, but only got 1 bytes. (This is often indicative of an internal error on the server side. Please check your server logs.)
at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:109)
at org.apache.phoenix.coprocessor.ServerCachingEndpointImpl.addServerCache(ServerCachingEndpointImpl.java:80)
at org.apache.phoenix.coprocessor.generated.ServerCachingProtos$ServerCachingService.callMethod(ServerCachingProtos.java:8414)
at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:8086)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.execServiceOnRegion(RSRpcServices.java:2068)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.execService(RSRpcServices.java:2050)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:34954)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2339)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:123)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:188)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:168)
Caused by: org.apache.thrift.transport.TTransportException: Cannot read. Remote side has closed. Tried to read 2 bytes, but only got 1 bytes. (This is often indicative of an internal error on the server side. Please check your server logs.)
at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86)
at org.apache.thrift.protocol.TBinaryProtocol.readAll(TBinaryProtocol.java:378)
at org.apache.thrift.protocol.TBinaryProtocol.readI16(TBinaryProtocol.java:278)
at org.apache.thrift.protocol.TBinaryProtocol.readFieldBegin(TBinaryProtocol.java:229)
at org.apache.tephra.distributed.thrift.TTransaction$TTransactionStandardScheme.read(TTransaction.java:1028)
at org.apache.tephra.distributed.thrift.TTransaction$TTransactionStandardScheme.read(TTransaction.java:1021)
at org.apache.tephra.distributed.thrift.TTransaction.read(TTransaction.java:921)
at org.apache.thrift.TDeserializer.deserialize(TDeserializer.java:69)
at org.apache.tephra.TransactionCodec.decode(TransactionCodec.java:51)
at org.apache.phoenix.transaction.TephraTransactionContext.<init>(TephraTransactionContext.java:71)
at org.apache.phoenix.transaction.TephraTransactionProvider.getTransactionContext(TephraTransactionProvider.java:64)
at org.apache.phoenix.transaction.TransactionFactory.getTransactionContext(TransactionFactory.java:70)
at org.apache.phoenix.index.IndexMetaDataCacheFactory.newCache(IndexMetaDataCacheFactory.java:55)
at org.apache.phoenix.cache.TenantCacheImpl.addServerCache(TenantCacheImpl.java:113)
at org.apache.phoenix.coprocessor.ServerCachingEndpointImpl.addServerCache(ServerCachingEndpointImpl.java:76)
... 9 more

at org.apache.phoenix.end2end.index.MutableIndexIT.testIndexHalfStoreFileReader(MutableIndexIT.java:664)
Caused by: java.util.concurrent.ExecutionException: 
java.lang.Exception: org.apache.hadoop.hbase.DoNotRetryIOException: org.apache.hadoop.hbase.DoNotRetryIOException: org.apache.thrift.transport.TTransportException: Cannot read. Remote side has closed. Tried to read 2 bytes, but only got 1 bytes. (This is often indicative of an internal error on the server side. Please check your server logs.)
at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:109)
at org.apache.phoenix.coprocessor.ServerCachingEndpointImpl.addServerCache(ServerCachingEndpointImpl.java:80)
at org.apache.phoenix.coprocessor.generated.ServerCachingProtos$ServerCachingService.callMethod(ServerCachingProtos.java:8414)
at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:8086)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.execServiceOnRegion(RSRpcServices.java:2068)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.execService(RSRpcServices.java:2050)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:34954)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2339)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:123)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:188)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:168)
Caused by: org.apache.thrift.transport.TTransportException: Cannot read. Remote side has closed. Tried to read 2 bytes, but only got 1 bytes. (This is often indicative of an internal error on the server side. Please check your server logs.)
at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86)
at org.apache.thrift.protocol.TBinaryProtocol.readAll(TBinaryProtocol.java:378)
at org.apache.thrift.protocol.TBinaryProtocol.readI16(TBinaryProtocol.java:278)
at org.apache.thrift.protocol.TBinaryProtocol.readFieldBegin(TBinaryProtocol.java:229)
at org.apache.tephra.distributed.thrift.TTransaction$TTransactionStandardScheme.read(TTransaction.java:1028)
at org.apache.tephra.distributed.thrift.TTransaction$TTransactionStandardScheme.read(TTransaction.java:1021)
at org.apache.tephra.distributed.thrift.TTransaction.read(TTransaction.java:921)
at org.apache.thrift.TDeserializer.deserialize(TDeserializer.java:69)
at org.apache.tephra.TransactionCodec.decode(TransactionCodec.java:51)
at org.apache.phoenix.transaction.TephraTransactionContext.<init>(TephraTransactionContext.java:71)
at org.apache.phoenix.transaction.TephraTransactionProvider.getTransactionContext(TephraTransactionProvider.java:64)
at org.apache.phoenix.transaction.TransactionFactory.getTransactionContext(TransactionFactory.java:70)
at org.apache.phoenix.index.IndexMetaDataCacheFactory.newCache(IndexMetaDataCacheFactory.java:55)
at org.apache.phoenix.cache.TenantCacheImpl.addServerCache(TenantCacheImpl.java:113)
at org.apache.phoenix.coprocessor.ServerCachingEndpointImpl.addServerCache(ServerCachingEndpointImpl.java:76)
... 9 more

at org.apache.phoenix.end2end.index.MutableIndexIT.testIndexHalfStoreFileReader(MutableIndexIT.java:664)
Caused by: java.lang.Exception: 
org.apache.hadoop.hbase.DoNotRetryIOException: org.apache.hadoop.hbase.DoNotRetryIOException: org.apache.thrift.transport.TTransportException: Cannot read. Remote side has closed. Tried to read 2 bytes, but only got 1 bytes. (This is often indicative of an internal error on the server side. Please check your server logs.)
at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:109)
at org.apache.phoenix.coprocessor.ServerCachingEndpointImpl.addServerCache(ServerCachingEndpointImpl.java:80)
at org.apache.phoenix.coprocessor.generated.ServerCachingProtos$ServerCachingService.callMethod(ServerCachingProtos.java:8414)
at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:8086)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.execServiceOnRegion(RSRpcServices.java:2068)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.execService(RSRpcServices.java:2050)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:34954)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2339)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:123)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:188)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:168)
Caused by: org.apache.thrift.transport.TTransportException: Cannot read. Remote side has closed. Tried to read 2 bytes, but only got 1 bytes. (This is often indicative of an internal error on the server side. Please check your server logs.)
at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86)
at org.apache.thrift.protocol.TBinaryProtocol.readAll(TBinaryProtocol.java:378)
at org.apache.thrift.protocol.TBinaryProtocol.readI16(TBinaryProtocol.java:278)
at org.apache.thrift.protocol.TBinaryProtocol.readFieldBegin(TBinaryProtocol.java:229)
at org.apache.tephra.distributed.thrift.TTransaction$TTransactionStandardScheme.read(TTransaction.java:1028)
at org.apache.tephra.distributed.thrift.TTransaction$TTransactionStandardScheme.read(TTransaction.java:1021)
at org.apache.tephra.distributed.thrift.TTransaction.read(TTransaction.java:921)
at org.apache.thrift.TDeserializer.deserialize(TDeserializer.java:69)
at org.apache.tephra.TransactionCodec.decode(TransactionCodec.java:51)
at org.apache.phoenix.transaction.TephraTransactionContext.<init>(TephraTransactionContext.java:71)
at org.apache.phoenix.transaction.TephraTransactionProvider.getTransactionContext(TephraTransactionProvider.java:64)
at org.apache.phoenix.transaction.TransactionFactory.getTransactionContext(TransactionFactory.java:70)
at org.apache.phoenix.index.IndexMetaDataCacheFactory.newCache(IndexMetaDataCacheFactory.java:55)
at org.apache.phoenix.cache.TenantCacheImpl.addServerCache(TenantCacheImpl.java:113)
at org.apache.phoenix.coprocessor.ServerCachingEndpointImpl.addServerCache(ServerCachingEndpointImpl.java:76)
... 9 more

Caused by: org.apache.hadoop.hbase.DoNotRetryIOException: 
org.apache.hadoop.hbase.DoNotRetryIOException: org.apache.thrift.transport.TTransportException: Cannot read. Remote side has closed. Tried to read 2 bytes, but only got 1 bytes. (This is often indicative of an internal error on the server side. Please check your server logs.)
at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:109)
at org.apache.phoenix.coprocessor.ServerCachingEndpointImpl.addServerCache(ServerCachingEndpointImpl.java:80)
at org.apache.phoenix.coprocessor.generated.ServerCachingProtos$ServerCachingService.callMethod(ServerCachingProtos.java:8414)
at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:8086)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.execServiceOnRegion(RSRpcServices.java:2068)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.execService(RSRpcServices.java:2050)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:34954)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2339)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:123)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:188)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:168)
Caused by: org.apache.thrift.transport.TTransportException: Cannot read. Remote side has closed. Tried to read 2 bytes, but only got 1 bytes. (This is often indicative of an internal error on the server side. Please check your server logs.)
at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86)
at org.apache.thrift.protocol.TBinaryProtocol.readAll(TBinaryProtocol.java:378)
at org.apache.thrift.protocol.TBinaryProtocol.readI16(TBinaryProtocol.java:278)
at org.apache.thrift.protocol.TBinaryProtocol.readFieldBegin(TBinaryProtocol.java:229)
at org.apache.tephra.distributed.thrift.TTransaction$TTransactionStandardScheme.read(TTransaction.java:1028)
at org.apache.tephra.distributed.thrift.TTransaction$TTransactionStandardScheme.read(TTransaction.java:1021)
at org.apache.tephra.distributed.thrift.TTransaction.read(TTransaction.java:921)
at org.apache.thrift.TDeserializer.deserialize(TDeserializer.java:69)
at org.apache.tephra.TransactionCodec.decode(TransactionCodec.java:51)
at org.apache.phoenix.transaction.TephraTransactionContext.<init>(TephraTransactionContext.java:71)
at org.apache.phoenix.transaction.TephraTransactionProvider.getTransactionContext(TephraTransactionProvider.java:64)
at org.apache.phoenix.transaction.TransactionFactory.getTransactionContext(TransactionFactory.java:70)
at org.apache.phoenix.index.IndexMetaDataCacheFactory.newCache(IndexMetaDataCacheFactory.java:55)
at org.apache.phoenix.cache.TenantCacheImpl.addServerCache(TenantCacheImpl.java:113)
at org.apache.phoenix.coprocessor.ServerCachingEndpointImpl.addServerCache(ServerCachingEndpointImpl.java:76)
... 9 more

Caused by: org.apache.hadoop.hbase.ipc.RemoteWithExtrasException: 
org.apache.hadoop.hbase.DoNotRetryIOException: org.apache.thrift.transport.TTransportException: Cannot read. Remote side has closed. Tried to read 2 bytes, but only got 1 bytes. (This is often indicative of an internal error on the server side. Please check your server logs.)
at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:109)
at org.apache.phoenix.coprocessor.ServerCachingEndpointImpl.addServerCache(ServerCachingEndpointImpl.java:80)
at org.apache.phoenix.coprocessor.generated.ServerCachingProtos$ServerCachingService.callMethod(ServerCachingProtos.java:8414)
at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:8086)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.execServiceOnRegion(RSRpcServices.java:2068)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.execService(RSRpcServices.java:2050)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:34954)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2339)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:123)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:188)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:168)
Caused by: org.apache.thrift.transport.TTransportException: Cannot read. Remote side has closed. Tried to read 2 bytes, but only got 1 bytes. (This is often indicative of an internal error on the server side. Please check your server logs.)
at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86)
at org.apache.thrift.protocol.TBinaryProtocol.readAll(TBinaryProtocol.java:378)
at org.apache.thrift.protocol.TBinaryProtocol.readI16(TBinaryProtocol.java:278)
at org.apache.thrift.protocol.TBinaryProtocol.readFieldBegin(TBinaryProtocol.java:229)
at org.apache.tephra.distributed.thrift.TTransaction$TTransactionStandardScheme.read(TTransaction.java:1028)
at org.apache.tephra.distributed.thrift.TTransaction$TTransactionStandardScheme.read(TTransaction.java:1021)
at org.apache.tephra.distributed.thrift.TTransaction.read(TTransaction.java:921)
at org.apache.thrift.TDeserializer.deserialize(TDeserializer.java:69)
at org.apache.tephra.TransactionCodec.decode(TransactionCodec.java:51)
at org.apache.phoenix.transaction.TephraTransactionContext.<init>(TephraTransactionContext.java:71)
at org.apache.phoenix.transaction.TephraTransactionProvider.getTransactionContext(TephraTransactionProvider.java:64)
at org.apache.phoenix.transaction.TransactionFactory.getTransactionContext(TransactionFactory.java:70)
at org.apache.phoenix.index.IndexMetaDataCacheFactory.newCache(IndexMetaDataCacheFactory.java:55)
at org.apache.phoenix.cache.TenantCacheImpl.addServerCache(TenantCacheImpl.java:113)
at org.apache.phoenix.coprocessor.ServerCachingEndpointImpl.addServerCache(ServerCachingEndpointImpl.java:76)
... 9 more


[ERROR] testMultipleUpdatesToSingleRow[MutableIndexIT_localIndex=true,transactional=OMID,columnEncoded=true](org.apache.phoenix.end2end.index.MutableIndexIT) Time elapsed: 539.877 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,62716,1527234175780, 
at org.apache.phoenix.end2end.index.MutableIndexIT.testMultipleUpdatesToSingleRow(MutableIndexIT.java:476)
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,62716,1527234175780, 
at org.apache.phoenix.end2end.index.MutableIndexIT.testMultipleUpdatesToSingleRow(MutableIndexIT.java:476)

[ERROR] testCompoundIndexKey[MutableIndexIT_localIndex=true,transactional=OMID,columnEncoded=true](org.apache.phoenix.end2end.index.MutableIndexIT) Time elapsed: 540.302 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,62716,1527234175780, 
at org.apache.phoenix.end2end.index.MutableIndexIT.testCompoundIndexKey(MutableIndexIT.java:354)
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,62716,1527234175780, 
at org.apache.phoenix.end2end.index.MutableIndexIT.testCompoundIndexKey(MutableIndexIT.java:354)

[ERROR] testCoveredColumns[MutableIndexIT_localIndex=true,transactional=OMID,columnEncoded=true](org.apache.phoenix.end2end.index.MutableIndexIT) Time elapsed: 559.924 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,62716,1527234175780, 
at org.apache.phoenix.end2end.index.MutableIndexIT.testCoveredColumns(MutableIndexIT.java:245)
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,62716,1527234175780, 
at org.apache.phoenix.end2end.index.MutableIndexIT.testCoveredColumns(MutableIndexIT.java:245)

[ERROR] testCoveredColumnUpdates[MutableIndexIT_localIndex=true,transactional=OMID,columnEncoded=true](org.apache.phoenix.end2end.index.MutableIndexIT) Time elapsed: 8.947 s <<< FAILURE!
java.lang.AssertionError
at org.apache.phoenix.end2end.index.MutableIndexIT.testCoveredColumnUpdates(MutableIndexIT.java:144)

[ERROR] testUpsertingNullForIndexedColumns[MutableIndexIT_localIndex=true,transactional=OMID,columnEncoded=true](org.apache.phoenix.end2end.index.MutableIndexIT) Time elapsed: 539.969 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,62716,1527234175780, 
at org.apache.phoenix.end2end.index.MutableIndexIT.testUpsertingNullForIndexedColumns(MutableIndexIT.java:544)
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,62716,1527234175780, 
at org.apache.phoenix.end2end.index.MutableIndexIT.testUpsertingNullForIndexedColumns(MutableIndexIT.java:544)

[ERROR] testIndexHalfStoreFileReader[MutableIndexIT_localIndex=true,transactional=OMID,columnEncoded=true](org.apache.phoenix.end2end.index.MutableIndexIT) Time elapsed: 8.85 s <<< ERROR!
java.sql.SQLException: 
java.util.concurrent.ExecutionException: java.lang.Exception: org.apache.hadoop.hbase.DoNotRetryIOException: org.apache.hadoop.hbase.DoNotRetryIOException: org.apache.thrift.transport.TTransportException: Cannot read. Remote side has closed. Tried to read 2 bytes, but only got 1 bytes. (This is often indicative of an internal error on the server side. Please check your server logs.)
at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:109)
at org.apache.phoenix.coprocessor.ServerCachingEndpointImpl.addServerCache(ServerCachingEndpointImpl.java:80)
at org.apache.phoenix.coprocessor.generated.ServerCachingProtos$ServerCachingService.callMethod(ServerCachingProtos.java:8414)
at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:8086)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.execServiceOnRegion(RSRpcServices.java:2068)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.execService(RSRpcServices.java:2050)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:34954)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2339)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:123)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:188)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:168)
Caused by: org.apache.thrift.transport.TTransportException: Cannot read. Remote side has closed. Tried to read 2 bytes, but only got 1 bytes. (This is often indicative of an internal error on the server side. Please check your server logs.)
at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86)
at org.apache.thrift.protocol.TBinaryProtocol.readAll(TBinaryProtocol.java:378)
at org.apache.thrift.protocol.TBinaryProtocol.readI16(TBinaryProtocol.java:278)
at org.apache.thrift.protocol.TBinaryProtocol.readFieldBegin(TBinaryProtocol.java:229)
at org.apache.tephra.distributed.thrift.TTransaction$TTransactionStandardScheme.read(TTransaction.java:1028)
at org.apache.tephra.distributed.thrift.TTransaction$TTransactionStandardScheme.read(TTransaction.java:1021)
at org.apache.tephra.distributed.thrift.TTransaction.read(TTransaction.java:921)
at org.apache.thrift.TDeserializer.deserialize(TDeserializer.java:69)
at org.apache.tephra.TransactionCodec.decode(TransactionCodec.java:51)
at org.apache.phoenix.transaction.TephraTransactionContext.<init>(TephraTransactionContext.java:71)
at org.apache.phoenix.transaction.TephraTransactionProvider.getTransactionContext(TephraTransactionProvider.java:64)
at org.apache.phoenix.transaction.TransactionFactory.getTransactionContext(TransactionFactory.java:70)
at org.apache.phoenix.index.IndexMetaDataCacheFactory.newCache(IndexMetaDataCacheFactory.java:55)
at org.apache.phoenix.cache.TenantCacheImpl.addServerCache(TenantCacheImpl.java:113)
at org.apache.phoenix.coprocessor.ServerCachingEndpointImpl.addServerCache(ServerCachingEndpointImpl.java:76)
... 9 more

at org.apache.phoenix.end2end.index.MutableIndexIT.testIndexHalfStoreFileReader(MutableIndexIT.java:664)
Caused by: java.util.concurrent.ExecutionException: 
java.lang.Exception: org.apache.hadoop.hbase.DoNotRetryIOException: org.apache.hadoop.hbase.DoNotRetryIOException: org.apache.thrift.transport.TTransportException: Cannot read. Remote side has closed. Tried to read 2 bytes, but only got 1 bytes. (This is often indicative of an internal error on the server side. Please check your server logs.)
... [server-side stack trace identical to the one above; elided]

at org.apache.phoenix.end2end.index.MutableIndexIT.testIndexHalfStoreFileReader(MutableIndexIT.java:664)
Caused by: java.lang.Exception: 
org.apache.hadoop.hbase.DoNotRetryIOException: org.apache.hadoop.hbase.DoNotRetryIOException: org.apache.thrift.transport.TTransportException: Cannot read. Remote side has closed. Tried to read 2 bytes, but only got 1 bytes. (This is often indicative of an internal error on the server side. Please check your server logs.)
... [server-side stack trace identical to the one above; elided]

Caused by: org.apache.hadoop.hbase.DoNotRetryIOException: 
org.apache.hadoop.hbase.DoNotRetryIOException: org.apache.thrift.transport.TTransportException: Cannot read. Remote side has closed. Tried to read 2 bytes, but only got 1 bytes. (This is often indicative of an internal error on the server side. Please check your server logs.)
... [server-side stack trace identical to the one above; elided]

Caused by: org.apache.hadoop.hbase.ipc.RemoteWithExtrasException: 
org.apache.hadoop.hbase.DoNotRetryIOException: org.apache.thrift.transport.TTransportException: Cannot read. Remote side has closed. Tried to read 2 bytes, but only got 1 bytes. (This is often indicative of an internal error on the server side. Please check your server logs.)
... [server-side stack trace identical to the one above; elided]


[INFO] Running org.apache.phoenix.end2end.join.SortMergeJoinGlobalIndexIT
[INFO] Tests run: 33, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 187.11 s - in org.apache.phoenix.end2end.join.HashJoinNoIndexIT
[INFO] Running org.apache.phoenix.end2end.join.SortMergeJoinLocalIndexIT
[INFO] Tests run: 34, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 334.252 s - in org.apache.phoenix.end2end.join.SortMergeJoinGlobalIndexIT
[INFO] Running org.apache.phoenix.end2end.join.SortMergeJoinNoIndexIT
[INFO] Tests run: 34, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 193.338 s - in org.apache.phoenix.end2end.join.SortMergeJoinNoIndexIT
[INFO] Running org.apache.phoenix.end2end.join.SubqueryIT
[INFO] Tests run: 34, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 592.179 s - in org.apache.phoenix.end2end.join.SortMergeJoinLocalIndexIT
[INFO] Running org.apache.phoenix.end2end.join.SubqueryUsingSortMergeJoinIT
[INFO] Tests run: 24, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 229.948 s - in org.apache.phoenix.end2end.join.SubqueryIT
[INFO] Running org.apache.phoenix.end2end.salted.SaltedTableIT
[INFO] Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.695 s - in org.apache.phoenix.end2end.salted.SaltedTableIT
[INFO] Running org.apache.phoenix.end2end.salted.SaltedTableUpsertSelectIT
[INFO] Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 25.657 s - in org.apache.phoenix.end2end.salted.SaltedTableUpsertSelectIT
[INFO] Running org.apache.phoenix.end2end.salted.SaltedTableVarLengthRowKeyIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.254 s - in org.apache.phoenix.end2end.salted.SaltedTableVarLengthRowKeyIT
[INFO] Running org.apache.phoenix.iterate.PhoenixQueryTimeoutIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.738 s - in org.apache.phoenix.iterate.PhoenixQueryTimeoutIT
[INFO] Running org.apache.phoenix.iterate.RoundRobinResultIteratorIT
[INFO] Tests run: 15, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 164.61 s - in org.apache.phoenix.end2end.join.SubqueryUsingSortMergeJoinIT
[INFO] Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 49.609 s - in org.apache.phoenix.iterate.RoundRobinResultIteratorIT
[INFO] Running org.apache.phoenix.replication.SystemCatalogWALEntryFilterIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.096 s - in org.apache.phoenix.replication.SystemCatalogWALEntryFilterIT
[INFO] Running org.apache.phoenix.trace.PhoenixTableMetricsWriterIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.297 s - in org.apache.phoenix.trace.PhoenixTableMetricsWriterIT
[INFO] Running org.apache.phoenix.trace.PhoenixTracingEndToEndIT
[INFO] Running org.apache.phoenix.rpc.UpdateCacheIT
[INFO] Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 26.63 s - in org.apache.phoenix.rpc.UpdateCacheIT
[INFO] Running org.apache.phoenix.tx.FlappingTransactionIT
[ERROR] Tests run: 4, Failures: 2, Errors: 1, Skipped: 0, Time elapsed: 10.69 s <<< FAILURE! - in org.apache.phoenix.tx.FlappingTransactionIT
[ERROR] testInflightUpdateNotSeen(org.apache.phoenix.tx.FlappingTransactionIT) Time elapsed: 2.411 s <<< FAILURE!
java.lang.AssertionError
at org.apache.phoenix.tx.FlappingTransactionIT.testInflightUpdateNotSeen(FlappingTransactionIT.java:140)

[ERROR] testExternalTxContext(org.apache.phoenix.tx.FlappingTransactionIT) Time elapsed: 3.431 s <<< ERROR!
java.sql.SQLException: ERROR 1092 (44A23): Cannot mix transaction providers: TEPHRA and OMID
at org.apache.phoenix.tx.FlappingTransactionIT.testExternalTxContext(FlappingTransactionIT.java:241)

[ERROR] testInflightDeleteNotSeen(org.apache.phoenix.tx.FlappingTransactionIT) Time elapsed: 2.407 s <<< FAILURE!
java.lang.AssertionError: expected:<2> but was:<1>
at org.apache.phoenix.tx.FlappingTransactionIT.testInflightDeleteNotSeen(FlappingTransactionIT.java:193)

[INFO] Running org.apache.phoenix.tx.ParameterizedTransactionIT
[INFO] Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 89.253 s - in org.apache.phoenix.trace.PhoenixTracingEndToEndIT
[INFO] Running org.apache.phoenix.tx.TransactionIT
[INFO] Tests run: 16, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 92.215 s - in org.apache.phoenix.tx.TransactionIT
[INFO] Running org.apache.phoenix.tx.TxCheckpointIT
[ERROR] Tests run: 52, Failures: 8, Errors: 4, Skipped: 0, Time elapsed: 180.632 s <<< FAILURE! - in org.apache.phoenix.tx.ParameterizedTransactionIT
[ERROR] testCreateTableToBeTransactional[TransactionIT_mutable=false,columnEncoded=false](org.apache.phoenix.tx.ParameterizedTransactionIT) Time elapsed: 2.286 s <<< FAILURE!
java.lang.AssertionError
at org.apache.phoenix.tx.ParameterizedTransactionIT.testCreateTableToBeTransactional(ParameterizedTransactionIT.java:379)

[ERROR] testNonTxToTxTableFailure[TransactionIT_mutable=false,columnEncoded=false](org.apache.phoenix.tx.ParameterizedTransactionIT) Time elapsed: 8.788 s <<< FAILURE!
java.lang.AssertionError
at org.apache.phoenix.tx.ParameterizedTransactionIT.testNonTxToTxTableFailure(ParameterizedTransactionIT.java:349)

[ERROR] testNonTxToTxTable[TransactionIT_mutable=false,columnEncoded=false](org.apache.phoenix.tx.ParameterizedTransactionIT) Time elapsed: 4.531 s <<< ERROR!
java.sql.SQLException: ERROR 1093 (44A24): Cannot alter table from non transactional to transactional for OMID. tableName=T001887
at org.apache.phoenix.tx.ParameterizedTransactionIT.testNonTxToTxTable(ParameterizedTransactionIT.java:282)

[ERROR] testCreateTableToBeTransactional[TransactionIT_mutable=false,columnEncoded=true](org.apache.phoenix.tx.ParameterizedTransactionIT) Time elapsed: 2.275 s <<< FAILURE!
java.lang.AssertionError
at org.apache.phoenix.tx.ParameterizedTransactionIT.testCreateTableToBeTransactional(ParameterizedTransactionIT.java:379)

[ERROR] testNonTxToTxTableFailure[TransactionIT_mutable=false,columnEncoded=true](org.apache.phoenix.tx.ParameterizedTransactionIT) Time elapsed: 8.769 s <<< FAILURE!
java.lang.AssertionError
at org.apache.phoenix.tx.ParameterizedTransactionIT.testNonTxToTxTableFailure(ParameterizedTransactionIT.java:349)

[ERROR] testNonTxToTxTable[TransactionIT_mutable=false,columnEncoded=true](org.apache.phoenix.tx.ParameterizedTransactionIT) Time elapsed: 4.554 s <<< ERROR!
java.sql.SQLException: ERROR 1093 (44A24): Cannot alter table from non transactional to transactional for OMID. tableName=T001905
at org.apache.phoenix.tx.ParameterizedTransactionIT.testNonTxToTxTable(ParameterizedTransactionIT.java:282)

[ERROR] testCreateTableToBeTransactional[TransactionIT_mutable=true,columnEncoded=false](org.apache.phoenix.tx.ParameterizedTransactionIT) Time elapsed: 2.3 s <<< FAILURE!
java.lang.AssertionError
at org.apache.phoenix.tx.ParameterizedTransactionIT.testCreateTableToBeTransactional(ParameterizedTransactionIT.java:379)

[ERROR] testNonTxToTxTableFailure[TransactionIT_mutable=true,columnEncoded=false](org.apache.phoenix.tx.ParameterizedTransactionIT) Time elapsed: 8.765 s <<< FAILURE!
java.lang.AssertionError
at org.apache.phoenix.tx.ParameterizedTransactionIT.testNonTxToTxTableFailure(ParameterizedTransactionIT.java:349)

[ERROR] testNonTxToTxTable[TransactionIT_mutable=true,columnEncoded=false](org.apache.phoenix.tx.ParameterizedTransactionIT) Time elapsed: 4.518 s <<< ERROR!
java.sql.SQLException: ERROR 1093 (44A24): Cannot alter table from non transactional to transactional for OMID. tableName=T001923
at org.apache.phoenix.tx.ParameterizedTransactionIT.testNonTxToTxTable(ParameterizedTransactionIT.java:282)

[ERROR] testCreateTableToBeTransactional[TransactionIT_mutable=true,columnEncoded=true](org.apache.phoenix.tx.ParameterizedTransactionIT) Time elapsed: 2.288 s <<< FAILURE!
java.lang.AssertionError
at org.apache.phoenix.tx.ParameterizedTransactionIT.testCreateTableToBeTransactional(ParameterizedTransactionIT.java:379)

[ERROR] testNonTxToTxTableFailure[TransactionIT_mutable=true,columnEncoded=true](org.apache.phoenix.tx.ParameterizedTransactionIT) Time elapsed: 8.774 s <<< FAILURE!
java.lang.AssertionError
at org.apache.phoenix.tx.ParameterizedTransactionIT.testNonTxToTxTableFailure(ParameterizedTransactionIT.java:349)

[ERROR] testNonTxToTxTable[TransactionIT_mutable=true,columnEncoded=true](org.apache.phoenix.tx.ParameterizedTransactionIT) Time elapsed: 4.558 s <<< ERROR!
java.sql.SQLException: ERROR 1093 (44A24): Cannot alter table from non transactional to transactional for OMID. tableName=T001941
at org.apache.phoenix.tx.ParameterizedTransactionIT.testNonTxToTxTable(ParameterizedTransactionIT.java:282)

[INFO] Running org.apache.phoenix.util.IndexScrutinyIT
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 17.75 s - in org.apache.phoenix.util.IndexScrutinyIT
[ERROR] Tests run: 40, Failures: 20, Errors: 12, Skipped: 0, Time elapsed: 6,819.976 s <<< FAILURE! - in org.apache.phoenix.end2end.index.LocalImmutableTxIndexIT
[ERROR] testIndexWithDecimalCol[LocalImmutableTxIndexIT_localIndex=true,mutable=false,transactional=true,columnEncoded=false](org.apache.phoenix.end2end.index.LocalImmutableTxIndexIT) Time elapsed: 9.542 s <<< FAILURE!
java.lang.AssertionError

[ERROR] testIndexWithNullableFixedWithCols[LocalImmutableTxIndexIT_localIndex=true,mutable=false,transactional=true,columnEncoded=false](org.apache.phoenix.end2end.index.LocalImmutableTxIndexIT) Time elapsed: 8.861 s <<< FAILURE!
java.lang.AssertionError

[ERROR] testIndexWithNullableDateCol[LocalImmutableTxIndexIT_localIndex=true,mutable=false,transactional=true,columnEncoded=false](org.apache.phoenix.end2end.index.LocalImmutableTxIndexIT) Time elapsed: 8.85 s <<< FAILURE!
java.lang.AssertionError

[ERROR] testIndexWithCaseSensitiveCols[LocalImmutableTxIndexIT_localIndex=true,mutable=false,transactional=true,columnEncoded=false](org.apache.phoenix.end2end.index.LocalImmutableTxIndexIT) Time elapsed: 560.021 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,60855,1527233706989, 
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,60855,1527233706989, 

[ERROR] testSelectAllAndAliasWithIndex[LocalImmutableTxIndexIT_localIndex=true,mutable=false,transactional=true,columnEncoded=false](org.apache.phoenix.end2end.index.LocalImmutableTxIndexIT) Time elapsed: 539.918 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,60855,1527233706989, 
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,60855,1527233706989, 

[ERROR] testDeleteFromAllPKColumnIndex[LocalImmutableTxIndexIT_localIndex=true,mutable=false,transactional=true,columnEncoded=false](org.apache.phoenix.end2end.index.LocalImmutableTxIndexIT) Time elapsed: 9.021 s <<< FAILURE!
java.lang.AssertionError: expected:<3> but was:<0>

[ERROR] testMultipleUpdatesAcrossRegions[LocalImmutableTxIndexIT_localIndex=true,mutable=false,transactional=true,columnEncoded=false](org.apache.phoenix.end2end.index.LocalImmutableTxIndexIT) Time elapsed: 540.286 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,60855,1527233706989, 
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,60855,1527233706989, 

[ERROR] testSelectDistinctOnTableWithSecondaryImmutableIndex[LocalImmutableTxIndexIT_localIndex=true,mutable=false,transactional=true,columnEncoded=false](org.apache.phoenix.end2end.index.LocalImmutableTxIndexIT) Time elapsed: 9.03 s <<< FAILURE!
java.lang.AssertionError

[ERROR] testCreateIndexAfterUpsertStarted[LocalImmutableTxIndexIT_localIndex=true,mutable=false,transactional=true,columnEncoded=false](org.apache.phoenix.end2end.index.LocalImmutableTxIndexIT) Time elapsed: 9.09 s <<< FAILURE!
java.lang.AssertionError: expected:<4> but was:<0>

[ERROR] testCreateIndexAfterUpsertStartedTxnl[LocalImmutableTxIndexIT_localIndex=true,mutable=false,transactional=true,columnEncoded=false](org.apache.phoenix.end2end.index.LocalImmutableTxIndexIT) Time elapsed: 9.149 s <<< FAILURE!
java.lang.AssertionError: expected:<4> but was:<0>

[ERROR] testInClauseWithIndexOnColumnOfUsignedIntType[LocalImmutableTxIndexIT_localIndex=true,mutable=false,transactional=true,columnEncoded=false](org.apache.phoenix.end2end.index.LocalImmutableTxIndexIT) Time elapsed: 8.985 s <<< FAILURE!
java.lang.AssertionError

[ERROR] testDeleteFromNonPKColumnIndex[LocalImmutableTxIndexIT_localIndex=true,mutable=false,transactional=true,columnEncoded=false](org.apache.phoenix.end2end.index.LocalImmutableTxIndexIT) Time elapsed: 8.954 s <<< FAILURE!
java.lang.AssertionError: expected:<3> but was:<0>

[ERROR] testGroupByCount[LocalImmutableTxIndexIT_localIndex=true,mutable=false,transactional=true,columnEncoded=false](org.apache.phoenix.end2end.index.LocalImmutableTxIndexIT) Time elapsed: 9.008 s <<< FAILURE!
java.lang.AssertionError

[ERROR] testSelectCF[LocalImmutableTxIndexIT_localIndex=true,mutable=false,transactional=true,columnEncoded=false](org.apache.phoenix.end2end.index.LocalImmutableTxIndexIT) Time elapsed: 540.677 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,60855,1527233706989, 
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,60855,1527233706989, 

[ERROR] testUpsertAfterIndexDrop[LocalImmutableTxIndexIT_localIndex=true,mutable=false,transactional=true,columnEncoded=false](org.apache.phoenix.end2end.index.LocalImmutableTxIndexIT) Time elapsed: 539.942 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,60855,1527233706989, 
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,60855,1527233706989, 

[ERROR] testReturnedTimestamp[LocalImmutableTxIndexIT_localIndex=true,mutable=false,transactional=true,columnEncoded=false](org.apache.phoenix.end2end.index.LocalImmutableTxIndexIT) Time elapsed: 540.666 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,60855,1527233706989, 
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,60855,1527233706989, 

[ERROR] testIndexWithDecimalCol[LocalImmutableTxIndexIT_localIndex=true,mutable=false,transactional=true,columnEncoded=true](org.apache.phoenix.end2end.index.LocalImmutableTxIndexIT) Time elapsed: 8.943 s <<< FAILURE!
java.lang.AssertionError

[ERROR] testIndexWithNullableFixedWithCols[LocalImmutableTxIndexIT_localIndex=true,mutable=false,transactional=true,columnEncoded=true](org.apache.phoenix.end2end.index.LocalImmutableTxIndexIT) Time elapsed: 8.92 s <<< FAILURE!
java.lang.AssertionError

[ERROR] testIndexWithNullableDateCol[LocalImmutableTxIndexIT_localIndex=true,mutable=false,transactional=true,columnEncoded=true](org.apache.phoenix.end2end.index.LocalImmutableTxIndexIT) Time elapsed: 8.922 s <<< FAILURE!
java.lang.AssertionError

[ERROR] testIndexWithCaseSensitiveCols[LocalImmutableTxIndexIT_localIndex=true,mutable=false,transactional=true,columnEncoded=true](org.apache.phoenix.end2end.index.LocalImmutableTxIndexIT) Time elapsed: 539.916 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,60855,1527233706989, 
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,60855,1527233706989, 

[ERROR] testSelectAllAndAliasWithIndex[LocalImmutableTxIndexIT_localIndex=true,mutable=false,transactional=true,columnEncoded=true](org.apache.phoenix.end2end.index.LocalImmutableTxIndexIT) Time elapsed: 540.002 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,60855,1527233706989, 
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,60855,1527233706989, 

[ERROR] testDeleteFromAllPKColumnIndex[LocalImmutableTxIndexIT_localIndex=true,mutable=false,transactional=true,columnEncoded=true](org.apache.phoenix.end2end.index.LocalImmutableTxIndexIT) Time elapsed: 8.92 s <<< FAILURE!
java.lang.AssertionError: expected:<3> but was:<0>

[ERROR] testMultipleUpdatesAcrossRegions[LocalImmutableTxIndexIT_localIndex=true,mutable=false,transactional=true,columnEncoded=true](org.apache.phoenix.end2end.index.LocalImmutableTxIndexIT) Time elapsed: 540.281 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,60855,1527233706989, 
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,60855,1527233706989, 

[ERROR] testSelectDistinctOnTableWithSecondaryImmutableIndex[LocalImmutableTxIndexIT_localIndex=true,mutable=false,transactional=true,columnEncoded=true](org.apache.phoenix.end2end.index.LocalImmutableTxIndexIT) Time elapsed: 8.948 s <<< FAILURE!
java.lang.AssertionError

[ERROR] testCreateIndexAfterUpsertStarted[LocalImmutableTxIndexIT_localIndex=true,mutable=false,transactional=true,columnEncoded=true](org.apache.phoenix.end2end.index.LocalImmutableTxIndexIT) Time elapsed: 9.025 s <<< FAILURE!
java.lang.AssertionError: expected:<4> but was:<0>

[ERROR] testCreateIndexAfterUpsertStartedTxnl[LocalImmutableTxIndexIT_localIndex=true,mutable=false,transactional=true,columnEncoded=true](org.apache.phoenix.end2end.index.LocalImmutableTxIndexIT) Time elapsed: 8.952 s <<< FAILURE!
java.lang.AssertionError: expected:<4> but was:<0>

[ERROR] testInClauseWithIndexOnColumnOfUsignedIntType[LocalImmutableTxIndexIT_localIndex=true,mutable=false,transactional=true,columnEncoded=true](org.apache.phoenix.end2end.index.LocalImmutableTxIndexIT) Time elapsed: 8.843 s <<< FAILURE!
java.lang.AssertionError

[ERROR] testDeleteFromNonPKColumnIndex[LocalImmutableTxIndexIT_localIndex=true,mutable=false,transactional=true,columnEncoded=true](org.apache.phoenix.end2end.index.LocalImmutableTxIndexIT) Time elapsed: 8.856 s <<< FAILURE!
java.lang.AssertionError: expected:<3> but was:<0>

[ERROR] testGroupByCount[LocalImmutableTxIndexIT_localIndex=true,mutable=false,transactional=true,columnEncoded=true](org.apache.phoenix.end2end.index.LocalImmutableTxIndexIT) Time elapsed: 8.913 s <<< FAILURE!
java.lang.AssertionError

[ERROR] testSelectCF[LocalImmutableTxIndexIT_localIndex=true,mutable=false,transactional=true,columnEncoded=true](org.apache.phoenix.end2end.index.LocalImmutableTxIndexIT) Time elapsed: 559.61 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,60855,1527233706989, 
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,60855,1527233706989, 

[ERROR] testUpsertAfterIndexDrop[LocalImmutableTxIndexIT_localIndex=true,mutable=false,transactional=true,columnEncoded=true](org.apache.phoenix.end2end.index.LocalImmutableTxIndexIT) Time elapsed: 559.771 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,60855,1527233706989, 
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,60855,1527233706989, 

[ERROR] testReturnedTimestamp[LocalImmutableTxIndexIT_localIndex=true,mutable=false,transactional=true,columnEncoded=true](org.apache.phoenix.end2end.index.LocalImmutableTxIndexIT) Time elapsed: 559.459 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,60855,1527233706989, 
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,60855,1527233706989, 

[ERROR] Tests run: 40, Failures: 16, Errors: 16, Skipped: 0, Time elapsed: 8,925.293 s <<< FAILURE! - in org.apache.phoenix.end2end.index.LocalMutableTxIndexIT
[ERROR] testIndexWithDecimalCol[LocalMutableTxIndexIT_localIndex=true,mutable=true,transactional=true,columnEncoded=false](org.apache.phoenix.end2end.index.LocalMutableTxIndexIT) Time elapsed: 9.673 s <<< FAILURE!
java.lang.AssertionError

[ERROR] testIndexWithNullableFixedWithCols[LocalMutableTxIndexIT_localIndex=true,mutable=true,transactional=true,columnEncoded=false](org.apache.phoenix.end2end.index.LocalMutableTxIndexIT) Time elapsed: 8.943 s <<< FAILURE!
java.lang.AssertionError

[ERROR] testIndexWithNullableDateCol[LocalMutableTxIndexIT_localIndex=true,mutable=true,transactional=true,columnEncoded=false](org.apache.phoenix.end2end.index.LocalMutableTxIndexIT) Time elapsed: 8.936 s <<< FAILURE!
java.lang.AssertionError

[ERROR] testIndexWithCaseSensitiveCols[LocalMutableTxIndexIT_localIndex=true,mutable=true,transactional=true,columnEncoded=false](org.apache.phoenix.end2end.index.LocalMutableTxIndexIT) Time elapsed: 539.691 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,64465,1527233976950, 
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,64465,1527233976950, 

[ERROR] testSelectAllAndAliasWithIndex[LocalMutableTxIndexIT_localIndex=true,mutable=true,transactional=true,columnEncoded=false](org.apache.phoenix.end2end.index.LocalMutableTxIndexIT) Time elapsed: 540.115 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,64465,1527233976950, 
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,64465,1527233976950, 

[ERROR] testDeleteFromAllPKColumnIndex[LocalMutableTxIndexIT_localIndex=true,mutable=true,transactional=true,columnEncoded=false](org.apache.phoenix.end2end.index.LocalMutableTxIndexIT) Time elapsed: 9.129 s <<< FAILURE!
java.lang.AssertionError: expected:<3> but was:<0>

[ERROR] testMultipleUpdatesAcrossRegions[LocalMutableTxIndexIT_localIndex=true,mutable=true,transactional=true,columnEncoded=false](org.apache.phoenix.end2end.index.LocalMutableTxIndexIT) Time elapsed: 539.98 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,64465,1527233976950, 
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,64465,1527233976950, 

[ERROR] testSelectDistinctOnTableWithSecondaryImmutableIndex[LocalMutableTxIndexIT_localIndex=true,mutable=true,transactional=true,columnEncoded=false](org.apache.phoenix.end2end.index.LocalMutableTxIndexIT) Time elapsed: 9.055 s <<< FAILURE!
java.lang.AssertionError

[ERROR] testCreateIndexAfterUpsertStarted[LocalMutableTxIndexIT_localIndex=true,mutable=true,transactional=true,columnEncoded=false](org.apache.phoenix.end2end.index.LocalMutableTxIndexIT) Time elapsed: 560.272 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,64465,1527233976950, 
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,64465,1527233976950, 

[ERROR] testCreateIndexAfterUpsertStartedTxnl[LocalMutableTxIndexIT_localIndex=true,mutable=true,transactional=true,columnEncoded=false](org.apache.phoenix.end2end.index.LocalMutableTxIndexIT) Time elapsed: 540.338 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,64465,1527233976950, 
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,64465,1527233976950, 

[ERROR] testInClauseWithIndexOnColumnOfUsignedIntType[LocalMutableTxIndexIT_localIndex=true,mutable=true,transactional=true,columnEncoded=false](org.apache.phoenix.end2end.index.LocalMutableTxIndexIT) Time elapsed: 9.07 s <<< FAILURE!
java.lang.AssertionError

[ERROR] testDeleteFromNonPKColumnIndex[LocalMutableTxIndexIT_localIndex=true,mutable=true,transactional=true,columnEncoded=false](org.apache.phoenix.end2end.index.LocalMutableTxIndexIT) Time elapsed: 9.011 s <<< FAILURE!
java.lang.AssertionError: expected:<3> but was:<0>

[ERROR] testGroupByCount[LocalMutableTxIndexIT_localIndex=true,mutable=true,transactional=true,columnEncoded=false](org.apache.phoenix.end2end.index.LocalMutableTxIndexIT) Time elapsed: 9.047 s <<< FAILURE!
java.lang.AssertionError

[ERROR] testSelectCF[LocalMutableTxIndexIT_localIndex=true,mutable=true,transactional=true,columnEncoded=false](org.apache.phoenix.end2end.index.LocalMutableTxIndexIT) Time elapsed: 559.537 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,64465,1527233976950, 
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,64465,1527233976950, 

[ERROR] testUpsertAfterIndexDrop[LocalMutableTxIndexIT_localIndex=true,mutable=true,transactional=true,columnEncoded=false](org.apache.phoenix.end2end.index.LocalMutableTxIndexIT) Time elapsed: 540.323 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,64465,1527233976950, 
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,64465,1527233976950, 

[ERROR] testReturnedTimestamp[LocalMutableTxIndexIT_localIndex=true,mutable=true,transactional=true,columnEncoded=false](org.apache.phoenix.end2end.index.LocalMutableTxIndexIT) Time elapsed: 540.396 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,64465,1527233976950, 
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,64465,1527233976950, 

[ERROR] testIndexWithDecimalCol[LocalMutableTxIndexIT_localIndex=true,mutable=true,transactional=true,columnEncoded=true](org.apache.phoenix.end2end.index.LocalMutableTxIndexIT) Time elapsed: 8.484 s <<< FAILURE!
java.lang.AssertionError

[ERROR] testIndexWithNullableFixedWithCols[LocalMutableTxIndexIT_localIndex=true,mutable=true,transactional=true,columnEncoded=true](org.apache.phoenix.end2end.index.LocalMutableTxIndexIT) Time elapsed: 9.016 s <<< FAILURE!
java.lang.AssertionError

[ERROR] testIndexWithNullableDateCol[LocalMutableTxIndexIT_localIndex=true,mutable=true,transactional=true,columnEncoded=true](org.apache.phoenix.end2end.index.LocalMutableTxIndexIT) Time elapsed: 8.934 s <<< FAILURE!
java.lang.AssertionError

[ERROR] testIndexWithCaseSensitiveCols[LocalMutableTxIndexIT_localIndex=true,mutable=true,transactional=true,columnEncoded=true](org.apache.phoenix.end2end.index.LocalMutableTxIndexIT) Time elapsed: 540.488 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,64465,1527233976950, 
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,64465,1527233976950, 

[ERROR] testSelectAllAndAliasWithIndex[LocalMutableTxIndexIT_localIndex=true,mutable=true,transactional=true,columnEncoded=true](org.apache.phoenix.end2end.index.LocalMutableTxIndexIT) Time elapsed: 540.148 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,64465,1527233976950, 
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,64465,1527233976950, 

[ERROR] testDeleteFromAllPKColumnIndex[LocalMutableTxIndexIT_localIndex=true,mutable=true,transactional=true,columnEncoded=true](org.apache.phoenix.end2end.index.LocalMutableTxIndexIT) Time elapsed: 8.939 s <<< FAILURE!
java.lang.AssertionError: expected:<3> but was:<0>

[ERROR] testMultipleUpdatesAcrossRegions[LocalMutableTxIndexIT_localIndex=true,mutable=true,transactional=true,columnEncoded=true](org.apache.phoenix.end2end.index.LocalMutableTxIndexIT) Time elapsed: 539.96 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,64465,1527233976950, 
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,64465,1527233976950, 

[ERROR] testSelectDistinctOnTableWithSecondaryImmutableIndex[LocalMutableTxIndexIT_localIndex=true,mutable=true,transactional=true,columnEncoded=true](org.apache.phoenix.end2end.index.LocalMutableTxIndexIT) Time elapsed: 8.412 s <<< FAILURE!
java.lang.AssertionError

[ERROR] testCreateIndexAfterUpsertStarted[LocalMutableTxIndexIT_localIndex=true,mutable=true,transactional=true,columnEncoded=true](org.apache.phoenix.end2end.index.LocalMutableTxIndexIT) Time elapsed: 540.225 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,64465,1527233976950, 
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,64465,1527233976950, 

[ERROR] testCreateIndexAfterUpsertStartedTxnl[LocalMutableTxIndexIT_localIndex=true,mutable=true,transactional=true,columnEncoded=true](org.apache.phoenix.end2end.index.LocalMutableTxIndexIT) Time elapsed: 540.399 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,64465,1527233976950, 
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,64465,1527233976950, 

[ERROR] testInClauseWithIndexOnColumnOfUsignedIntType[LocalMutableTxIndexIT_localIndex=true,mutable=true,transactional=true,columnEncoded=true](org.apache.phoenix.end2end.index.LocalMutableTxIndexIT) Time elapsed: 8.917 s <<< FAILURE!
java.lang.AssertionError

[ERROR] testDeleteFromNonPKColumnIndex[LocalMutableTxIndexIT_localIndex=true,mutable=true,transactional=true,columnEncoded=true](org.apache.phoenix.end2end.index.LocalMutableTxIndexIT) Time elapsed: 8.891 s <<< FAILURE!
java.lang.AssertionError: expected:<3> but was:<0>

[ERROR] testGroupByCount[LocalMutableTxIndexIT_localIndex=true,mutable=true,transactional=true,columnEncoded=true](org.apache.phoenix.end2end.index.LocalMutableTxIndexIT) Time elapsed: 8.874 s <<< FAILURE!
java.lang.AssertionError

[ERROR] testSelectCF[LocalMutableTxIndexIT_localIndex=true,mutable=true,transactional=true,columnEncoded=true](org.apache.phoenix.end2end.index.LocalMutableTxIndexIT) Time elapsed: 539.894 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,64465,1527233976950, 
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,64465,1527233976950, 

[ERROR] testUpsertAfterIndexDrop[LocalMutableTxIndexIT_localIndex=true,mutable=true,transactional=true,columnEncoded=true](org.apache.phoenix.end2end.index.LocalMutableTxIndexIT) Time elapsed: 540.094 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,64465,1527233976950, 
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,64465,1527233976950, 

[ERROR] testReturnedTimestamp[LocalMutableTxIndexIT_localIndex=true,mutable=true,transactional=true,columnEncoded=true](org.apache.phoenix.end2end.index.LocalMutableTxIndexIT) Time elapsed: 560.124 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,64465,1527233976950, 
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,64465,1527233976950, 

[ERROR] Tests run: 40, Failures: 0, Errors: 28, Skipped: 0, Time elapsed: 11,011.38 s <<< FAILURE! - in org.apache.phoenix.tx.TxCheckpointIT
[ERROR] testCheckpointForUpsertSelect[TxCheckpointIT_localIndex=false,mutable=false,columnEncoded=false](org.apache.phoenix.tx.TxCheckpointIT) Time elapsed: 5.846 s <<< ERROR!
java.sql.SQLException: ERROR 1092 (44A23): Cannot mix transaction providers: TEPHRA and OMID
at org.apache.phoenix.tx.TxCheckpointIT.upsertRows(TxCheckpointIT.java:271)
at org.apache.phoenix.tx.TxCheckpointIT.testCheckpointForUpsertSelect(TxCheckpointIT.java:240)

[ERROR] testCheckpointForDeleteAndUpsert[TxCheckpointIT_localIndex=false,mutable=false,columnEncoded=false](org.apache.phoenix.tx.TxCheckpointIT) Time elapsed: 7.074 s <<< ERROR!
java.sql.SQLException: ERROR 1092 (44A23): Cannot mix transaction providers: TEPHRA and OMID
at org.apache.phoenix.tx.TxCheckpointIT.testCheckpointForDeleteAndUpsert(TxCheckpointIT.java:335)

[ERROR] testCheckpointForUpsertSelect[TxCheckpointIT_localIndex=false,mutable=false,columnEncoded=true](org.apache.phoenix.tx.TxCheckpointIT) Time elapsed: 4.84 s <<< ERROR!
java.sql.SQLException: ERROR 1092 (44A23): Cannot mix transaction providers: TEPHRA and OMID
at org.apache.phoenix.tx.TxCheckpointIT.upsertRows(TxCheckpointIT.java:271)
at org.apache.phoenix.tx.TxCheckpointIT.testCheckpointForUpsertSelect(TxCheckpointIT.java:240)

[ERROR] testCheckpointForDeleteAndUpsert[TxCheckpointIT_localIndex=false,mutable=false,columnEncoded=true](org.apache.phoenix.tx.TxCheckpointIT) Time elapsed: 7.053 s <<< ERROR!
java.sql.SQLException: ERROR 1092 (44A23): Cannot mix transaction providers: TEPHRA and OMID
at org.apache.phoenix.tx.TxCheckpointIT.testCheckpointForDeleteAndUpsert(TxCheckpointIT.java:335)

[ERROR] testCheckpointForUpsertSelect[TxCheckpointIT_localIndex=false,mutable=true,columnEncoded=false](org.apache.phoenix.tx.TxCheckpointIT) Time elapsed: 4.748 s <<< ERROR!
java.sql.SQLException: ERROR 1092 (44A23): Cannot mix transaction providers: TEPHRA and OMID
at org.apache.phoenix.tx.TxCheckpointIT.upsertRows(TxCheckpointIT.java:271)
at org.apache.phoenix.tx.TxCheckpointIT.testCheckpointForUpsertSelect(TxCheckpointIT.java:240)

[ERROR] testCheckpointForDeleteAndUpsert[TxCheckpointIT_localIndex=false,mutable=true,columnEncoded=false](org.apache.phoenix.tx.TxCheckpointIT) Time elapsed: 7.017 s <<< ERROR!
java.sql.SQLException: ERROR 1092 (44A23): Cannot mix transaction providers: TEPHRA and OMID
at org.apache.phoenix.tx.TxCheckpointIT.testCheckpointForDeleteAndUpsert(TxCheckpointIT.java:335)

[ERROR] testCheckpointForUpsertSelect[TxCheckpointIT_localIndex=false,mutable=true,columnEncoded=true](org.apache.phoenix.tx.TxCheckpointIT) Time elapsed: 4.722 s <<< ERROR!
java.sql.SQLException: ERROR 1092 (44A23): Cannot mix transaction providers: TEPHRA and OMID
at org.apache.phoenix.tx.TxCheckpointIT.upsertRows(TxCheckpointIT.java:271)
at org.apache.phoenix.tx.TxCheckpointIT.testCheckpointForUpsertSelect(TxCheckpointIT.java:240)

[ERROR] testCheckpointForDeleteAndUpsert[TxCheckpointIT_localIndex=false,mutable=true,columnEncoded=true](org.apache.phoenix.tx.TxCheckpointIT) Time elapsed: 7.012 s <<< ERROR!
java.sql.SQLException: ERROR 1092 (44A23): Cannot mix transaction providers: TEPHRA and OMID
at org.apache.phoenix.tx.TxCheckpointIT.testCheckpointForDeleteAndUpsert(TxCheckpointIT.java:335)

[ERROR] testUpsertSelectDoesntSeeUpsertedData[TxCheckpointIT_localIndex=true,mutable=false,columnEncoded=false](org.apache.phoenix.tx.TxCheckpointIT) Time elapsed: 540.038 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,60659,1527240562158, 
at org.apache.phoenix.tx.TxCheckpointIT.testUpsertSelectDoesntSeeUpsertedData(TxCheckpointIT.java:108)
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,60659,1527240562158, 
at org.apache.phoenix.tx.TxCheckpointIT.testUpsertSelectDoesntSeeUpsertedData(TxCheckpointIT.java:108)

[ERROR] testRollbackOfUncommittedDeleteMultiCol[TxCheckpointIT_localIndex=true,mutable=false,columnEncoded=false](org.apache.phoenix.tx.TxCheckpointIT) Time elapsed: 539.821 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,60659,1527240562158, 
at org.apache.phoenix.tx.TxCheckpointIT.testRollbackOfUncommittedDelete(TxCheckpointIT.java:148)
at org.apache.phoenix.tx.TxCheckpointIT.testRollbackOfUncommittedDeleteMultiCol(TxCheckpointIT.java:132)
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,60659,1527240562158, 
at org.apache.phoenix.tx.TxCheckpointIT.testRollbackOfUncommittedDelete(TxCheckpointIT.java:148)
at org.apache.phoenix.tx.TxCheckpointIT.testRollbackOfUncommittedDeleteMultiCol(TxCheckpointIT.java:132)

[ERROR] testCheckpointForUpsertSelect[TxCheckpointIT_localIndex=true,mutable=false,columnEncoded=false](org.apache.phoenix.tx.TxCheckpointIT) Time elapsed: 559.574 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,60659,1527240562158, 
at org.apache.phoenix.tx.TxCheckpointIT.testCheckpointForUpsertSelect(TxCheckpointIT.java:238)
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,60659,1527240562158, 
at org.apache.phoenix.tx.TxCheckpointIT.testCheckpointForUpsertSelect(TxCheckpointIT.java:238)

[ERROR] testRollbackOfUncommittedDeleteSingleCol[TxCheckpointIT_localIndex=true,mutable=false,columnEncoded=false](org.apache.phoenix.tx.TxCheckpointIT) Time elapsed: 540.045 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,60659,1527240562158, 
at org.apache.phoenix.tx.TxCheckpointIT.testRollbackOfUncommittedDelete(TxCheckpointIT.java:148)
at org.apache.phoenix.tx.TxCheckpointIT.testRollbackOfUncommittedDeleteSingleCol(TxCheckpointIT.java:123)
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,60659,1527240562158, 
at org.apache.phoenix.tx.TxCheckpointIT.testRollbackOfUncommittedDelete(TxCheckpointIT.java:148)
at org.apache.phoenix.tx.TxCheckpointIT.testRollbackOfUncommittedDeleteSingleCol(TxCheckpointIT.java:123)

[ERROR] testCheckpointForDeleteAndUpsert[TxCheckpointIT_localIndex=true,mutable=false,columnEncoded=false](org.apache.phoenix.tx.TxCheckpointIT) Time elapsed: 542.26 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,60659,1527240562158, 
at org.apache.phoenix.tx.TxCheckpointIT.testCheckpointForDeleteAndUpsert(TxCheckpointIT.java:330)
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,60659,1527240562158, 
at org.apache.phoenix.tx.TxCheckpointIT.testCheckpointForDeleteAndUpsert(TxCheckpointIT.java:330)

[ERROR] testUpsertSelectDoesntSeeUpsertedData[TxCheckpointIT_localIndex=true,mutable=false,columnEncoded=true](org.apache.phoenix.tx.TxCheckpointIT) Time elapsed: 539.608 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,60659,1527240562158, 
at org.apache.phoenix.tx.TxCheckpointIT.testUpsertSelectDoesntSeeUpsertedData(TxCheckpointIT.java:108)
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,60659,1527240562158, 
at org.apache.phoenix.tx.TxCheckpointIT.testUpsertSelectDoesntSeeUpsertedData(TxCheckpointIT.java:108)

[ERROR] testRollbackOfUncommittedDeleteMultiCol[TxCheckpointIT_localIndex=true,mutable=false,columnEncoded=true](org.apache.phoenix.tx.TxCheckpointIT) Time elapsed: 539.827 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,60659,1527240562158, 
at org.apache.phoenix.tx.TxCheckpointIT.testRollbackOfUncommittedDelete(TxCheckpointIT.java:148)
at org.apache.phoenix.tx.TxCheckpointIT.testRollbackOfUncommittedDeleteMultiCol(TxCheckpointIT.java:132)
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,60659,1527240562158, 
at org.apache.phoenix.tx.TxCheckpointIT.testRollbackOfUncommittedDelete(TxCheckpointIT.java:148)
at org.apache.phoenix.tx.TxCheckpointIT.testRollbackOfUncommittedDeleteMultiCol(TxCheckpointIT.java:132)

[ERROR] testCheckpointForUpsertSelect[TxCheckpointIT_localIndex=true,mutable=false,columnEncoded=true](org.apache.phoenix.tx.TxCheckpointIT) Time elapsed: 539.634 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,60659,1527240562158, 
at org.apache.phoenix.tx.TxCheckpointIT.testCheckpointForUpsertSelect(TxCheckpointIT.java:238)
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,60659,1527240562158, 
at org.apache.phoenix.tx.TxCheckpointIT.testCheckpointForUpsertSelect(TxCheckpointIT.java:238)

[ERROR] testRollbackOfUncommittedDeleteSingleCol[TxCheckpointIT_localIndex=true,mutable=false,columnEncoded=true](org.apache.phoenix.tx.TxCheckpointIT) Time elapsed: 539.529 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,60659,1527240562158, 
at org.apache.phoenix.tx.TxCheckpointIT.testRollbackOfUncommittedDelete(TxCheckpointIT.java:148)
at org.apache.phoenix.tx.TxCheckpointIT.testRollbackOfUncommittedDeleteSingleCol(TxCheckpointIT.java:123)
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,60659,1527240562158, 
at org.apache.phoenix.tx.TxCheckpointIT.testRollbackOfUncommittedDelete(TxCheckpointIT.java:148)
at org.apache.phoenix.tx.TxCheckpointIT.testRollbackOfUncommittedDeleteSingleCol(TxCheckpointIT.java:123)

[ERROR] testCheckpointForDeleteAndUpsert[TxCheckpointIT_localIndex=true,mutable=false,columnEncoded=true](org.apache.phoenix.tx.TxCheckpointIT) Time elapsed: 541.972 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,60659,1527240562158, 
at org.apache.phoenix.tx.TxCheckpointIT.testCheckpointForDeleteAndUpsert(TxCheckpointIT.java:330)
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,60659,1527240562158, 
at org.apache.phoenix.tx.TxCheckpointIT.testCheckpointForDeleteAndUpsert(TxCheckpointIT.java:330)

[ERROR] testUpsertSelectDoesntSeeUpsertedData[TxCheckpointIT_localIndex=true,mutable=true,columnEncoded=false](org.apache.phoenix.tx.TxCheckpointIT) Time elapsed: 539.687 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,60659,1527240562158, 
at org.apache.phoenix.tx.TxCheckpointIT.testUpsertSelectDoesntSeeUpsertedData(TxCheckpointIT.java:108)
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,60659,1527240562158, 
at org.apache.phoenix.tx.TxCheckpointIT.testUpsertSelectDoesntSeeUpsertedData(TxCheckpointIT.java:108)

[ERROR] testRollbackOfUncommittedDeleteMultiCol[TxCheckpointIT_localIndex=true,mutable=true,columnEncoded=false](org.apache.phoenix.tx.TxCheckpointIT) Time elapsed: 539.337 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,60659,1527240562158, 
at org.apache.phoenix.tx.TxCheckpointIT.testRollbackOfUncommittedDelete(TxCheckpointIT.java:148)
at org.apache.phoenix.tx.TxCheckpointIT.testRollbackOfUncommittedDeleteMultiCol(TxCheckpointIT.java:132)
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,60659,1527240562158, 
at org.apache.phoenix.tx.TxCheckpointIT.testRollbackOfUncommittedDelete(TxCheckpointIT.java:148)
at org.apache.phoenix.tx.TxCheckpointIT.testRollbackOfUncommittedDeleteMultiCol(TxCheckpointIT.java:132)

[ERROR] testCheckpointForUpsertSelect[TxCheckpointIT_localIndex=true,mutable=true,columnEncoded=false](org.apache.phoenix.tx.TxCheckpointIT) Time elapsed: 560.06 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,60659,1527240562158, 
at org.apache.phoenix.tx.TxCheckpointIT.testCheckpointForUpsertSelect(TxCheckpointIT.java:238)
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,60659,1527240562158, 
at org.apache.phoenix.tx.TxCheckpointIT.testCheckpointForUpsertSelect(TxCheckpointIT.java:238)

[ERROR] testRollbackOfUncommittedDeleteSingleCol[TxCheckpointIT_localIndex=true,mutable=true,columnEncoded=false](org.apache.phoenix.tx.TxCheckpointIT) Time elapsed: 559.455 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,60659,1527240562158, 
at org.apache.phoenix.tx.TxCheckpointIT.testRollbackOfUncommittedDelete(TxCheckpointIT.java:148)
at org.apache.phoenix.tx.TxCheckpointIT.testRollbackOfUncommittedDeleteSingleCol(TxCheckpointIT.java:123)
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,60659,1527240562158, 
at org.apache.phoenix.tx.TxCheckpointIT.testRollbackOfUncommittedDelete(TxCheckpointIT.java:148)
at org.apache.phoenix.tx.TxCheckpointIT.testRollbackOfUncommittedDeleteSingleCol(TxCheckpointIT.java:123)

[ERROR] testCheckpointForDeleteAndUpsert[TxCheckpointIT_localIndex=true,mutable=true,columnEncoded=false](org.apache.phoenix.tx.TxCheckpointIT) Time elapsed: 541.871 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,60659,1527240562158, 
at org.apache.phoenix.tx.TxCheckpointIT.testCheckpointForDeleteAndUpsert(TxCheckpointIT.java:330)
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,60659,1527240562158, 
at org.apache.phoenix.tx.TxCheckpointIT.testCheckpointForDeleteAndUpsert(TxCheckpointIT.java:330)

[ERROR] testUpsertSelectDoesntSeeUpsertedData[TxCheckpointIT_localIndex=true,mutable=true,columnEncoded=true](org.apache.phoenix.tx.TxCheckpointIT) Time elapsed: 558.797 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,60659,1527240562158, 
at org.apache.phoenix.tx.TxCheckpointIT.testUpsertSelectDoesntSeeUpsertedData(TxCheckpointIT.java:108)
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,60659,1527240562158, 
at org.apache.phoenix.tx.TxCheckpointIT.testUpsertSelectDoesntSeeUpsertedData(TxCheckpointIT.java:108)

[ERROR] testRollbackOfUncommittedDeleteMultiCol[TxCheckpointIT_localIndex=true,mutable=true,columnEncoded=true](org.apache.phoenix.tx.TxCheckpointIT) Time elapsed: 539.857 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,60659,1527240562158, 
at org.apache.phoenix.tx.TxCheckpointIT.testRollbackOfUncommittedDelete(TxCheckpointIT.java:148)
at org.apache.phoenix.tx.TxCheckpointIT.testRollbackOfUncommittedDeleteMultiCol(TxCheckpointIT.java:132)
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,60659,1527240562158, 
at org.apache.phoenix.tx.TxCheckpointIT.testRollbackOfUncommittedDelete(TxCheckpointIT.java:148)
at org.apache.phoenix.tx.TxCheckpointIT.testRollbackOfUncommittedDeleteMultiCol(TxCheckpointIT.java:132)

[ERROR] testCheckpointForUpsertSelect[TxCheckpointIT_localIndex=true,mutable=true,columnEncoded=true](org.apache.phoenix.tx.TxCheckpointIT) Time elapsed: 559.292 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,60659,1527240562158, 
at org.apache.phoenix.tx.TxCheckpointIT.testCheckpointForUpsertSelect(TxCheckpointIT.java:238)
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,60659,1527240562158, 
at org.apache.phoenix.tx.TxCheckpointIT.testCheckpointForUpsertSelect(TxCheckpointIT.java:238)

[ERROR] testRollbackOfUncommittedDeleteSingleCol[TxCheckpointIT_localIndex=true,mutable=true,columnEncoded=true](org.apache.phoenix.tx.TxCheckpointIT) Time elapsed: 540.041 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,60659,1527240562158, 
at org.apache.phoenix.tx.TxCheckpointIT.testRollbackOfUncommittedDelete(TxCheckpointIT.java:148)
at org.apache.phoenix.tx.TxCheckpointIT.testRollbackOfUncommittedDeleteSingleCol(TxCheckpointIT.java:123)
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,60659,1527240562158, 
at org.apache.phoenix.tx.TxCheckpointIT.testRollbackOfUncommittedDelete(TxCheckpointIT.java:148)
at org.apache.phoenix.tx.TxCheckpointIT.testRollbackOfUncommittedDeleteSingleCol(TxCheckpointIT.java:123)

[ERROR] testCheckpointForDeleteAndUpsert[TxCheckpointIT_localIndex=true,mutable=true,columnEncoded=true](org.apache.phoenix.tx.TxCheckpointIT) Time elapsed: 542.043 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,60659,1527240562158, 
at org.apache.phoenix.tx.TxCheckpointIT.testCheckpointForDeleteAndUpsert(TxCheckpointIT.java:330)
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,60659,1527240562158, 
at org.apache.phoenix.tx.TxCheckpointIT.testCheckpointForDeleteAndUpsert(TxCheckpointIT.java:330)

[INFO] 
[INFO] Results:
[INFO] 
[ERROR] Failures: 
[ERROR] GlobalImmutableTxIndexIT>BaseIndexIT.testCreateIndexAfterUpsertStartedTxnl:266->BaseIndexIT.testCreateIndexAfterUpsertStarted:342 expected:<4> but was:<3>
[ERROR] GlobalImmutableTxIndexIT>BaseIndexIT.testCreateIndexAfterUpsertStartedTxnl:266->BaseIndexIT.testCreateIndexAfterUpsertStarted:342 expected:<4> but was:<3>
[ERROR] GlobalImmutableTxIndexIT>BaseIndexIT.testCreateIndexAfterUpsertStarted:258->BaseIndexIT.testCreateIndexAfterUpsertStarted:342 expected:<4> but was:<3>
[ERROR] GlobalImmutableTxIndexIT>BaseIndexIT.testCreateIndexAfterUpsertStarted:258->BaseIndexIT.testCreateIndexAfterUpsertStarted:342 expected:<4> but was:<3>
[ERROR] LocalImmutableTxIndexIT>BaseIndexIT.testCreateIndexAfterUpsertStartedTxnl:266->BaseIndexIT.testCreateIndexAfterUpsertStarted:342 expected:<4> but was:<0>
[ERROR] LocalImmutableTxIndexIT>BaseIndexIT.testCreateIndexAfterUpsertStartedTxnl:266->BaseIndexIT.testCreateIndexAfterUpsertStarted:342 expected:<4> but was:<0>
[ERROR] LocalImmutableTxIndexIT>BaseIndexIT.testCreateIndexAfterUpsertStarted:258->BaseIndexIT.testCreateIndexAfterUpsertStarted:342 expected:<4> but was:<0>
[ERROR] LocalImmutableTxIndexIT>BaseIndexIT.testCreateIndexAfterUpsertStarted:258->BaseIndexIT.testCreateIndexAfterUpsertStarted:342 expected:<4> but was:<0>
[ERROR] LocalImmutableTxIndexIT>BaseIndexIT.testDeleteFromAllPKColumnIndex:203 expected:<3> but was:<0>
[ERROR] LocalImmutableTxIndexIT>BaseIndexIT.testDeleteFromAllPKColumnIndex:203 expected:<3> but was:<0>
[ERROR] LocalImmutableTxIndexIT>BaseIndexIT.testDeleteFromNonPKColumnIndex:384 expected:<3> but was:<0>
[ERROR] LocalImmutableTxIndexIT>BaseIndexIT.testDeleteFromNonPKColumnIndex:384 expected:<3> but was:<0>
[ERROR] LocalImmutableTxIndexIT>BaseIndexIT.testGroupByCount:432
[ERROR] LocalImmutableTxIndexIT>BaseIndexIT.testGroupByCount:432
[ERROR] LocalImmutableTxIndexIT>BaseIndexIT.testInClauseWithIndexOnColumnOfUsignedIntType:477
[ERROR] LocalImmutableTxIndexIT>BaseIndexIT.testInClauseWithIndexOnColumnOfUsignedIntType:477
[ERROR] LocalImmutableTxIndexIT>BaseIndexIT.testIndexWithDecimalCol:1048
[ERROR] LocalImmutableTxIndexIT>BaseIndexIT.testIndexWithDecimalCol:1048
[ERROR] LocalImmutableTxIndexIT>BaseIndexIT.testIndexWithNullableDateCol:558
[ERROR] LocalImmutableTxIndexIT>BaseIndexIT.testIndexWithNullableDateCol:558
[ERROR] LocalImmutableTxIndexIT>BaseIndexIT.testIndexWithNullableFixedWithCols:153
[ERROR] LocalImmutableTxIndexIT>BaseIndexIT.testIndexWithNullableFixedWithCols:153
[ERROR] LocalImmutableTxIndexIT>BaseIndexIT.testSelectDistinctOnTableWithSecondaryImmutableIndex:452
[ERROR] LocalImmutableTxIndexIT>BaseIndexIT.testSelectDistinctOnTableWithSecondaryImmutableIndex:452
[ERROR] LocalMutableTxIndexIT>BaseIndexIT.testDeleteFromAllPKColumnIndex:203 expected:<3> but was:<0>
[ERROR] LocalMutableTxIndexIT>BaseIndexIT.testDeleteFromAllPKColumnIndex:203 expected:<3> but was:<0>
[ERROR] LocalMutableTxIndexIT>BaseIndexIT.testDeleteFromNonPKColumnIndex:384 expected:<3> but was:<0>
[ERROR] LocalMutableTxIndexIT>BaseIndexIT.testDeleteFromNonPKColumnIndex:384 expected:<3> but was:<0>
[ERROR] LocalMutableTxIndexIT>BaseIndexIT.testGroupByCount:432
[ERROR] LocalMutableTxIndexIT>BaseIndexIT.testGroupByCount:432
[ERROR] LocalMutableTxIndexIT>BaseIndexIT.testInClauseWithIndexOnColumnOfUsignedIntType:477
[ERROR] LocalMutableTxIndexIT>BaseIndexIT.testInClauseWithIndexOnColumnOfUsignedIntType:477
[ERROR] LocalMutableTxIndexIT>BaseIndexIT.testIndexWithDecimalCol:1048
[ERROR] LocalMutableTxIndexIT>BaseIndexIT.testIndexWithDecimalCol:1048
[ERROR] LocalMutableTxIndexIT>BaseIndexIT.testIndexWithNullableDateCol:558
[ERROR] LocalMutableTxIndexIT>BaseIndexIT.testIndexWithNullableDateCol:558
[ERROR] LocalMutableTxIndexIT>BaseIndexIT.testIndexWithNullableFixedWithCols:153
[ERROR] LocalMutableTxIndexIT>BaseIndexIT.testIndexWithNullableFixedWithCols:153
[ERROR] LocalMutableTxIndexIT>BaseIndexIT.testSelectDistinctOnTableWithSecondaryImmutableIndex:452
[ERROR] LocalMutableTxIndexIT>BaseIndexIT.testSelectDistinctOnTableWithSecondaryImmutableIndex:452
[ERROR] MutableIndexIT.testCoveredColumnUpdates:144
[ERROR] MutableIndexIT.testCoveredColumnUpdates:144
[ERROR] FlappingTransactionIT.testInflightDeleteNotSeen:193 expected:<2> but was:<1>
[ERROR] FlappingTransactionIT.testInflightUpdateNotSeen:140
[ERROR] ParameterizedTransactionIT.testCreateTableToBeTransactional:379
[ERROR] ParameterizedTransactionIT.testCreateTableToBeTransactional:379
[ERROR] ParameterizedTransactionIT.testCreateTableToBeTransactional:379
[ERROR] ParameterizedTransactionIT.testCreateTableToBeTransactional:379
[ERROR] ParameterizedTransactionIT.testNonTxToTxTableFailure:349
[ERROR] ParameterizedTransactionIT.testNonTxToTxTableFailure:349
[ERROR] ParameterizedTransactionIT.testNonTxToTxTableFailure:349
[ERROR] ParameterizedTransactionIT.testNonTxToTxTableFailure:349
[ERROR] Errors: 
[ERROR] AlterTableWithViewsIT.testMakeBaseTableTransactional:784 » SQL ERROR 1093 (44A...
[ERROR] AlterTableWithViewsIT.testMakeBaseTableTransactional:784 » SQL ERROR 1093 (44A...
[ERROR] AlterTableWithViewsIT.testMakeBaseTableTransactional:784 » SQL ERROR 1093 (44A...
[ERROR] AlterTableWithViewsIT.testMakeBaseTableTransactional:784 » SQL ERROR 1093 (44A...
[ERROR] GlobalImmutableTxIndexIT>BaseIndexIT.testIndexWithDecimalCol:1033 » UncheckedExecution
[ERROR] LocalImmutableTxIndexIT>BaseIndexIT.testIndexWithCaseSensitiveCols:915 » Commit
[ERROR] LocalImmutableTxIndexIT>BaseIndexIT.testIndexWithCaseSensitiveCols:915 » Commit
[ERROR] LocalImmutableTxIndexIT>BaseIndexIT.testMultipleUpdatesAcrossRegions:834 » Commit
[ERROR] LocalImmutableTxIndexIT>BaseIndexIT.testMultipleUpdatesAcrossRegions:834 » Commit
[ERROR] LocalImmutableTxIndexIT>BaseIndexIT.testReturnedTimestamp:1186 » Commit org.ap...
[ERROR] LocalImmutableTxIndexIT>BaseIndexIT.testReturnedTimestamp:1186 » Commit org.ap...
[ERROR] LocalImmutableTxIndexIT>BaseIndexIT.testSelectAllAndAliasWithIndex:620 » Commit
[ERROR] LocalImmutableTxIndexIT>BaseIndexIT.testSelectAllAndAliasWithIndex:620 » Commit
[ERROR] LocalImmutableTxIndexIT>BaseIndexIT.testSelectCF:700 » Commit org.apache.hadoo...
[ERROR] LocalImmutableTxIndexIT>BaseIndexIT.testSelectCF:700 » Commit org.apache.hadoo...
[ERROR] LocalImmutableTxIndexIT>BaseIndexIT.testUpsertAfterIndexDrop:760 » Commit org....
[ERROR] LocalImmutableTxIndexIT>BaseIndexIT.testUpsertAfterIndexDrop:760 » Commit org....
[ERROR] LocalMutableTxIndexIT>BaseIndexIT.testCreateIndexAfterUpsertStartedTxnl:266->BaseIndexIT.testCreateIndexAfterUpsertStarted:337 » Commit
[ERROR] LocalMutableTxIndexIT>BaseIndexIT.testCreateIndexAfterUpsertStartedTxnl:266->BaseIndexIT.testCreateIndexAfterUpsertStarted:337 » Commit
[ERROR] LocalMutableTxIndexIT>BaseIndexIT.testCreateIndexAfterUpsertStarted:258->BaseIndexIT.testCreateIndexAfterUpsertStarted:337 » Commit
[ERROR] LocalMutableTxIndexIT>BaseIndexIT.testCreateIndexAfterUpsertStarted:258->BaseIndexIT.testCreateIndexAfterUpsertStarted:337 » Commit
[ERROR] LocalMutableTxIndexIT>BaseIndexIT.testIndexWithCaseSensitiveCols:915 » Commit ...
[ERROR] LocalMutableTxIndexIT>BaseIndexIT.testIndexWithCaseSensitiveCols:915 » Commit ...
[ERROR] LocalMutableTxIndexIT>BaseIndexIT.testMultipleUpdatesAcrossRegions:834 » Commit
[ERROR] LocalMutableTxIndexIT>BaseIndexIT.testMultipleUpdatesAcrossRegions:834 » Commit
[ERROR] LocalMutableTxIndexIT>BaseIndexIT.testReturnedTimestamp:1186 » Commit org.apac...
[ERROR] LocalMutableTxIndexIT>BaseIndexIT.testReturnedTimestamp:1186 » Commit org.apac...
[ERROR] LocalMutableTxIndexIT>BaseIndexIT.testSelectAllAndAliasWithIndex:620 » Commit ...
[ERROR] LocalMutableTxIndexIT>BaseIndexIT.testSelectAllAndAliasWithIndex:620 » Commit ...
[ERROR] LocalMutableTxIndexIT>BaseIndexIT.testSelectCF:700 » Commit org.apache.hadoop....
[ERROR] LocalMutableTxIndexIT>BaseIndexIT.testSelectCF:700 » Commit org.apache.hadoop....
[ERROR] LocalMutableTxIndexIT>BaseIndexIT.testUpsertAfterIndexDrop:760 » Commit org.ap...
[ERROR] LocalMutableTxIndexIT>BaseIndexIT.testUpsertAfterIndexDrop:760 » Commit org.ap...
[ERROR] MutableIndexIT.testCompoundIndexKey:354 » Commit org.apache.hadoop.hbase.clien...
[ERROR] MutableIndexIT.testCompoundIndexKey:354 » Commit org.apache.hadoop.hbase.clien...
[ERROR] MutableIndexIT.testCoveredColumns:245 » Commit org.apache.hadoop.hbase.client....
[ERROR] MutableIndexIT.testCoveredColumns:245 » Commit org.apache.hadoop.hbase.client....
[ERROR] MutableIndexIT.testIndexHalfStoreFileReader:664 » SQL java.util.concurrent.Exe...
[ERROR] MutableIndexIT.testIndexHalfStoreFileReader:664 » SQL java.util.concurrent.Exe...
[ERROR] MutableIndexIT.testMultipleUpdatesToSingleRow:476 » Commit org.apache.hadoop.h...
[ERROR] MutableIndexIT.testMultipleUpdatesToSingleRow:476 » Commit org.apache.hadoop.h...
[ERROR] MutableIndexIT.testUpsertingNullForIndexedColumns:544 » Commit org.apache.hado...
[ERROR] MutableIndexIT.testUpsertingNullForIndexedColumns:544 » Commit org.apache.hado...
[ERROR] MutableRollbackIT.testCheckpointAndRollback:475 » Commit java.lang.IllegalArgu...
[ERROR] MutableRollbackIT.testCheckpointAndRollback:457 » Commit org.apache.hadoop.hba...
[ERROR] MutableRollbackIT.testMultiRollbackOfUncommittedExistingRowKeyIndexUpdate:354 » Commit
[ERROR] MutableRollbackIT.testRollbackOfUncommittedExistingRowKeyIndexUpdate:219 » Commit
[ERROR] RollbackIT.testRollbackOfUncommittedKeyValueIndexInsert:85 » Commit org.apache...
[ERROR] RollbackIT.testRollbackOfUncommittedKeyValueIndexInsert:85 » Commit org.apache...
[ERROR] RollbackIT.testRollbackOfUncommittedRowKeyIndexInsert:128 » Commit org.apache....
[ERROR] RollbackIT.testRollbackOfUncommittedRowKeyIndexInsert:128 » Commit org.apache....
[ERROR] FlappingTransactionIT.testExternalTxContext:241 » SQL ERROR 1092 (44A23): Cann...
[ERROR] ParameterizedTransactionIT.testNonTxToTxTable:282 » SQL ERROR 1093 (44A24): Ca...
[ERROR] ParameterizedTransactionIT.testNonTxToTxTable:282 » SQL ERROR 1093 (44A24): Ca...
[ERROR] ParameterizedTransactionIT.testNonTxToTxTable:282 » SQL ERROR 1093 (44A24): Ca...
[ERROR] ParameterizedTransactionIT.testNonTxToTxTable:282 » SQL ERROR 1093 (44A24): Ca...
[ERROR] TxCheckpointIT.testCheckpointForDeleteAndUpsert:335 » SQL ERROR 1092 (44A23): ...
[ERROR] TxCheckpointIT.testCheckpointForDeleteAndUpsert:335 » SQL ERROR 1092 (44A23): ...
[ERROR] TxCheckpointIT.testCheckpointForDeleteAndUpsert:335 » SQL ERROR 1092 (44A23): ...
[ERROR] TxCheckpointIT.testCheckpointForDeleteAndUpsert:335 » SQL ERROR 1092 (44A23): ...
[ERROR] TxCheckpointIT.testCheckpointForDeleteAndUpsert:330 » Commit org.apache.hadoop...
[ERROR] TxCheckpointIT.testCheckpointForDeleteAndUpsert:330 » Commit org.apache.hadoop...
[ERROR] TxCheckpointIT.testCheckpointForDeleteAndUpsert:330 » Commit org.apache.hadoop...
[ERROR] TxCheckpointIT.testCheckpointForDeleteAndUpsert:330 » Commit org.apache.hadoop...
[ERROR] TxCheckpointIT.testCheckpointForUpsertSelect:240->upsertRows:271 » SQL ERROR 1...
[ERROR] TxCheckpointIT.testCheckpointForUpsertSelect:240->upsertRows:271 » SQL ERROR 1...
[ERROR] TxCheckpointIT.testCheckpointForUpsertSelect:240->upsertRows:271 » SQL ERROR 1...
[ERROR] TxCheckpointIT.testCheckpointForUpsertSelect:240->upsertRows:271 » SQL ERROR 1...
[ERROR] TxCheckpointIT.testCheckpointForUpsertSelect:238 » Commit org.apache.hadoop.hb...
[ERROR] TxCheckpointIT.testCheckpointForUpsertSelect:238 » Commit org.apache.hadoop.hb...
[ERROR] TxCheckpointIT.testCheckpointForUpsertSelect:238 » Commit org.apache.hadoop.hb...
[ERROR] TxCheckpointIT.testCheckpointForUpsertSelect:238 » Commit org.apache.hadoop.hb...
[ERROR] TxCheckpointIT.testRollbackOfUncommittedDeleteMultiCol:132->testRollbackOfUncommittedDelete:148 » Commit
[ERROR] TxCheckpointIT.testRollbackOfUncommittedDeleteMultiCol:132->testRollbackOfUncommittedDelete:148 » Commit
[ERROR] TxCheckpointIT.testRollbackOfUncommittedDeleteMultiCol:132->testRollbackOfUncommittedDelete:148 » Commit
[ERROR] TxCheckpointIT.testRollbackOfUncommittedDeleteMultiCol:132->testRollbackOfUncommittedDelete:148 » Commit
[ERROR] TxCheckpointIT.testRollbackOfUncommittedDeleteSingleCol:123->testRollbackOfUncommittedDelete:148 » Commit
[ERROR] TxCheckpointIT.testRollbackOfUncommittedDeleteSingleCol:123->testRollbackOfUncommittedDelete:148 » Commit
[ERROR] TxCheckpointIT.testRollbackOfUncommittedDeleteSingleCol:123->testRollbackOfUncommittedDelete:148 » Commit
[ERROR] TxCheckpointIT.testRollbackOfUncommittedDeleteSingleCol:123->testRollbackOfUncommittedDelete:148 » Commit
[ERROR] TxCheckpointIT.testUpsertSelectDoesntSeeUpsertedData:108 » Commit org.apache.h...
[ERROR] TxCheckpointIT.testUpsertSelectDoesntSeeUpsertedData:108 » Commit org.apache.h...
[ERROR] TxCheckpointIT.testUpsertSelectDoesntSeeUpsertedData:108 » Commit org.apache.h...
[ERROR] TxCheckpointIT.testUpsertSelectDoesntSeeUpsertedData:108 » Commit org.apache.h...
[INFO] 
[ERROR] Tests run: 3477, Failures: 52, Errors: 84, Skipped: 3
[INFO] 
[INFO] 
[INFO] --- maven-failsafe-plugin:2.20:integration-test (HBaseManagedTimeTests) @ phoenix-core ---
[INFO] 
[INFO] -------------------------------------------------------
[INFO] T E S T S
[INFO] -------------------------------------------------------
[INFO] 
[INFO] Results:
[INFO] 
[INFO] Tests run: 0, Failures: 0, Errors: 0, Skipped: 0
[INFO] 
[INFO] 
[INFO] --- maven-failsafe-plugin:2.20:integration-test (NeedTheirOwnClusterTests) @ phoenix-core ---
[INFO] 
[INFO] -------------------------------------------------------
[INFO] T E S T S
[INFO] -------------------------------------------------------
[INFO] Running org.apache.hadoop.hbase.regionserver.wal.WALReplayWithIndexWritesAndCompressedWALIT
[INFO] Running org.apache.phoenix.end2end.ChangePermissionsIT
[INFO] Running org.apache.hadoop.hbase.regionserver.wal.WALRecoveryRegionPostOpenIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 26.239 s - in org.apache.hadoop.hbase.regionserver.wal.WALReplayWithIndexWritesAndCompressedWALIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.009 s - in org.apache.hadoop.hbase.regionserver.wal.WALRecoveryRegionPostOpenIT
[INFO] Running org.apache.phoenix.end2end.ColumnEncodedImmutableNonTxStatsCollectorIT
[INFO] Running org.apache.phoenix.end2end.ColumnEncodedImmutableTxStatsCollectorIT
[INFO] Running org.apache.phoenix.end2end.ColumnEncodedMutableNonTxStatsCollectorIT
[WARNING] Tests run: 26, Failures: 0, Errors: 0, Skipped: 4, Time elapsed: 102.873 s - in org.apache.phoenix.end2end.ColumnEncodedImmutableNonTxStatsCollectorIT
[WARNING] Tests run: 26, Failures: 0, Errors: 0, Skipped: 4, Time elapsed: 107.985 s - in org.apache.phoenix.end2end.ColumnEncodedImmutableTxStatsCollectorIT
[WARNING] Tests run: 26, Failures: 0, Errors: 0, Skipped: 4, Time elapsed: 102.941 s - in org.apache.phoenix.end2end.ColumnEncodedMutableNonTxStatsCollectorIT
[INFO] Running org.apache.phoenix.end2end.ConnectionUtilIT
[INFO] Running org.apache.phoenix.end2end.ColumnEncodedMutableTxStatsCollectorIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.211 s - in org.apache.phoenix.end2end.ConnectionUtilIT
[INFO] Running org.apache.phoenix.end2end.ContextClassloaderIT
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.417 s - in org.apache.phoenix.end2end.ContextClassloaderIT
[INFO] Running org.apache.phoenix.end2end.CountDistinctCompressionIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.511 s - in org.apache.phoenix.end2end.CountDistinctCompressionIT
[INFO] Running org.apache.phoenix.end2end.CostBasedDecisionIT
[INFO] Running org.apache.phoenix.end2end.CsvBulkLoadToolIT
[WARNING] Tests run: 26, Failures: 0, Errors: 0, Skipped: 4, Time elapsed: 108.703 s - in org.apache.phoenix.end2end.ColumnEncodedMutableTxStatsCollectorIT
[INFO] Running org.apache.phoenix.end2end.DropSchemaIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 18.726 s - in org.apache.phoenix.end2end.DropSchemaIT
[INFO] Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 80.412 s - in org.apache.phoenix.end2end.CsvBulkLoadToolIT
[INFO] Running org.apache.phoenix.end2end.FlappingLocalIndexIT
[INFO] Running org.apache.phoenix.end2end.IndexExtendedIT
[INFO] Tests run: 20, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 290.59 s - in org.apache.phoenix.end2end.CostBasedDecisionIT
[INFO] Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 136.026 s - in org.apache.phoenix.end2end.FlappingLocalIndexIT
[INFO] Tests run: 32, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 140.516 s - in org.apache.phoenix.end2end.IndexExtendedIT
[INFO] Running org.apache.phoenix.end2end.IndexScrutinyToolIT
[INFO] Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 573.308 s - in org.apache.phoenix.end2end.ChangePermissionsIT
[INFO] Running org.apache.phoenix.end2end.IndexToolForPartialBuildIT
[INFO] Running org.apache.phoenix.end2end.IndexToolForPartialBuildWithNamespaceEnabledIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.821 s - in org.apache.phoenix.end2end.IndexToolForPartialBuildIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 19.014 s - in org.apache.phoenix.end2end.IndexToolForPartialBuildWithNamespaceEnabledIT
[INFO] Running org.apache.phoenix.end2end.IndexToolIT
[INFO] Running org.apache.phoenix.end2end.MigrateSystemTablesToSystemNamespaceIT
[INFO] Running org.apache.phoenix.end2end.LocalIndexSplitMergeIT
[INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 244.722 s - in org.apache.phoenix.end2end.MigrateSystemTablesToSystemNamespaceIT
[INFO] Running org.apache.phoenix.end2end.NonColumnEncodedImmutableNonTxStatsCollectorIT
[INFO] Tests run: 33, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 345.414 s - in org.apache.phoenix.end2end.IndexScrutinyToolIT
[INFO] Running org.apache.phoenix.end2end.NonColumnEncodedImmutableTxStatsCollectorIT
[WARNING] Tests run: 26, Failures: 0, Errors: 0, Skipped: 4, Time elapsed: 101.538 s - in org.apache.phoenix.end2end.NonColumnEncodedImmutableNonTxStatsCollectorIT
[INFO] Running org.apache.phoenix.end2end.PartialResultServerConfigurationIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 35.103 s - in org.apache.phoenix.end2end.PartialResultServerConfigurationIT
[WARNING] Tests run: 26, Failures: 0, Errors: 0, Skipped: 4, Time elapsed: 109.193 s - in org.apache.phoenix.end2end.NonColumnEncodedImmutableTxStatsCollectorIT
[INFO] Running org.apache.phoenix.end2end.PhoenixDriverIT
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 459.453 s - in org.apache.phoenix.end2end.LocalIndexSplitMergeIT
[INFO] Running org.apache.phoenix.end2end.QueryLoggerIT
[INFO] Running org.apache.phoenix.end2end.QueryTimeoutIT
[INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 51.396 s - in org.apache.phoenix.end2end.PhoenixDriverIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.71 s - in org.apache.phoenix.end2end.QueryTimeoutIT
[INFO] Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 50.185 s - in org.apache.phoenix.end2end.QueryLoggerIT
[INFO] Running org.apache.phoenix.end2end.QueryWithLimitIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.898 s - in org.apache.phoenix.end2end.QueryWithLimitIT
[INFO] Running org.apache.phoenix.end2end.RebuildIndexConnectionPropsIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.365 s - in org.apache.phoenix.end2end.RebuildIndexConnectionPropsIT
[INFO] Running org.apache.phoenix.end2end.RegexBulkLoadToolIT
[INFO] Running org.apache.phoenix.end2end.RenewLeaseIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 26.351 s - in org.apache.phoenix.end2end.RenewLeaseIT
[INFO] Running org.apache.phoenix.end2end.SpillableGroupByIT
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.392 s - in org.apache.phoenix.end2end.SpillableGroupByIT
[INFO] Running org.apache.phoenix.end2end.SystemCatalogCreationOnConnectionIT
[INFO] Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 71.29 s - in org.apache.phoenix.end2end.RegexBulkLoadToolIT
[INFO] Running org.apache.phoenix.end2end.SysTableNamespaceMappedStatsCollectorIT
[INFO] Running org.apache.phoenix.end2end.SystemTablePermissionsIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 84.779 s - in org.apache.phoenix.end2end.SystemTablePermissionsIT
[INFO] Running org.apache.phoenix.end2end.TableDDLPermissionsIT
[WARNING] Tests run: 26, Failures: 0, Errors: 0, Skipped: 4, Time elapsed: 100.177 s - in org.apache.phoenix.end2end.SysTableNamespaceMappedStatsCollectorIT
[INFO] Running org.apache.phoenix.end2end.TableSnapshotReadsMapReduceIT
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 18.233 s - in org.apache.phoenix.end2end.TableSnapshotReadsMapReduceIT
[INFO] Running org.apache.phoenix.end2end.UpdateCacheAcrossDifferentClientsIT
[INFO] Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 32.654 s - in org.apache.phoenix.end2end.UpdateCacheAcrossDifferentClientsIT
[INFO] Running org.apache.phoenix.end2end.UserDefinedFunctionsIT
[INFO] Tests run: 16, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 116.485 s - in org.apache.phoenix.end2end.UserDefinedFunctionsIT
[INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 349.765 s - in org.apache.phoenix.end2end.TableDDLPermissionsIT
[INFO] Running org.apache.phoenix.end2end.index.ImmutableIndexIT
[INFO] Running org.apache.phoenix.end2end.index.LocalIndexIT
[INFO] Tests run: 14, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 607.245 s - in org.apache.phoenix.end2end.SystemCatalogCreationOnConnectionIT
[INFO] Running org.apache.phoenix.end2end.index.MutableIndexFailureIT
[ERROR] Tests run: 40, Failures: 6, Errors: 0, Skipped: 16, Time elapsed: 343.573 s <<< FAILURE! - in org.apache.phoenix.end2end.index.ImmutableIndexIT
[ERROR] testDeleteFromPartialPK[ImmutableIndexIT_localIndex=true,transactional=true,columnEncoded=false](org.apache.phoenix.end2end.index.ImmutableIndexIT) Time elapsed: 8.443 s <<< FAILURE!
java.lang.AssertionError: expected:<3> but was:<0>
at org.apache.phoenix.end2end.index.ImmutableIndexIT.testDeleteFromPartialPK(ImmutableIndexIT.java:191)

[ERROR] testDropIfImmutableKeyValueColumn[ImmutableIndexIT_localIndex=true,transactional=true,columnEncoded=false](org.apache.phoenix.end2end.index.ImmutableIndexIT) Time elapsed: 8.958 s <<< FAILURE!
java.lang.AssertionError: expected:<3> but was:<0>
at org.apache.phoenix.end2end.index.ImmutableIndexIT.testDropIfImmutableKeyValueColumn(ImmutableIndexIT.java:145)

[ERROR] testDeleteFromNonPK[ImmutableIndexIT_localIndex=true,transactional=true,columnEncoded=false](org.apache.phoenix.end2end.index.ImmutableIndexIT) Time elapsed: 8.975 s <<< FAILURE!
java.lang.AssertionError: expected:<3> but was:<0>
at org.apache.phoenix.end2end.index.ImmutableIndexIT.testDeleteFromNonPK(ImmutableIndexIT.java:233)

[ERROR] testDeleteFromPartialPK[ImmutableIndexIT_localIndex=true,transactional=true,columnEncoded=true](org.apache.phoenix.end2end.index.ImmutableIndexIT) Time elapsed: 8.875 s <<< FAILURE!
java.lang.AssertionError: expected:<3> but was:<0>
at org.apache.phoenix.end2end.index.ImmutableIndexIT.testDeleteFromPartialPK(ImmutableIndexIT.java:191)

[ERROR] testDropIfImmutableKeyValueColumn[ImmutableIndexIT_localIndex=true,transactional=true,columnEncoded=true](org.apache.phoenix.end2end.index.ImmutableIndexIT) Time elapsed: 8.962 s <<< FAILURE!
java.lang.AssertionError: expected:<3> but was:<0>
at org.apache.phoenix.end2end.index.ImmutableIndexIT.testDropIfImmutableKeyValueColumn(ImmutableIndexIT.java:145)

[ERROR] testDeleteFromNonPK[ImmutableIndexIT_localIndex=true,transactional=true,columnEncoded=true](org.apache.phoenix.end2end.index.ImmutableIndexIT) Time elapsed: 8.891 s <<< FAILURE!
java.lang.AssertionError: expected:<3> but was:<0>
at org.apache.phoenix.end2end.index.ImmutableIndexIT.testDeleteFromNonPK(ImmutableIndexIT.java:233)

[INFO] Running org.apache.phoenix.end2end.index.MutableIndexRebuilderIT
[INFO] Tests run: 36, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 358.051 s - in org.apache.phoenix.end2end.index.LocalIndexIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 39.828 s - in org.apache.phoenix.end2end.index.MutableIndexRebuilderIT
[INFO] Running org.apache.phoenix.end2end.index.MutableIndexReplicationIT
[ERROR] Tests run: 21, Failures: 0, Errors: 4, Skipped: 0, Time elapsed: 273.775 s <<< FAILURE! - in org.apache.phoenix.end2end.index.MutableIndexFailureIT
[ERROR] testIndexWriteFailure[MutableIndexFailureIT_transactional=true,localIndex=false,isNamespaceMapped=false,disableIndexOnWriteFailure=true,failRebuildTask=false,throwIndexWriteFailure=null](org.apache.phoenix.end2end.index.MutableIndexFailureIT) Time elapsed: 11.232 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,59289,1527253586021, 
at org.apache.phoenix.end2end.index.MutableIndexFailureIT.initializeTable(MutableIndexFailureIT.java:406)
at org.apache.phoenix.end2end.index.MutableIndexFailureIT.testIndexWriteFailure(MutableIndexFailureIT.java:273)
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,59289,1527253586021, 
at org.apache.phoenix.end2end.index.MutableIndexFailureIT.initializeTable(MutableIndexFailureIT.java:406)
at org.apache.phoenix.end2end.index.MutableIndexFailureIT.testIndexWriteFailure(MutableIndexFailureIT.java:273)

[ERROR] testIndexWriteFailure[MutableIndexFailureIT_transactional=true,localIndex=false,isNamespaceMapped=true,disableIndexOnWriteFailure=true,failRebuildTask=false,throwIndexWriteFailure=null](org.apache.phoenix.end2end.index.MutableIndexFailureIT) Time elapsed: 11.279 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,59289,1527253586021, 
at org.apache.phoenix.end2end.index.MutableIndexFailureIT.initializeTable(MutableIndexFailureIT.java:406)
at org.apache.phoenix.end2end.index.MutableIndexFailureIT.testIndexWriteFailure(MutableIndexFailureIT.java:273)
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,59289,1527253586021, 
at org.apache.phoenix.end2end.index.MutableIndexFailureIT.initializeTable(MutableIndexFailureIT.java:406)
at org.apache.phoenix.end2end.index.MutableIndexFailureIT.testIndexWriteFailure(MutableIndexFailureIT.java:273)

[ERROR] testIndexWriteFailure[MutableIndexFailureIT_transactional=true,localIndex=true,isNamespaceMapped=false,disableIndexOnWriteFailure=true,failRebuildTask=false,throwIndexWriteFailure=null](org.apache.phoenix.end2end.index.MutableIndexFailureIT) Time elapsed: 9.173 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,59282,1527253585944, 
at org.apache.phoenix.end2end.index.MutableIndexFailureIT.initializeTable(MutableIndexFailureIT.java:406)
at org.apache.phoenix.end2end.index.MutableIndexFailureIT.testIndexWriteFailure(MutableIndexFailureIT.java:273)
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,59282,1527253585944, 
at org.apache.phoenix.end2end.index.MutableIndexFailureIT.initializeTable(MutableIndexFailureIT.java:406)
at org.apache.phoenix.end2end.index.MutableIndexFailureIT.testIndexWriteFailure(MutableIndexFailureIT.java:273)

[ERROR] testIndexWriteFailure[MutableIndexFailureIT_transactional=true,localIndex=true,isNamespaceMapped=true,disableIndexOnWriteFailure=null,failRebuildTask=false,throwIndexWriteFailure=null](org.apache.phoenix.end2end.index.MutableIndexFailureIT) Time elapsed: 10.039 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,59294,1527253586103, 
at org.apache.phoenix.end2end.index.MutableIndexFailureIT.initializeTable(MutableIndexFailureIT.java:406)
at org.apache.phoenix.end2end.index.MutableIndexFailureIT.testIndexWriteFailure(MutableIndexFailureIT.java:273)
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,59294,1527253586103, 
at org.apache.phoenix.end2end.index.MutableIndexFailureIT.initializeTable(MutableIndexFailureIT.java:406)
at org.apache.phoenix.end2end.index.MutableIndexFailureIT.testIndexWriteFailure(MutableIndexFailureIT.java:273)

[INFO] Running org.apache.phoenix.end2end.index.PartialIndexRebuilderIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.813 s - in org.apache.phoenix.end2end.index.MutableIndexReplicationIT
[INFO] Running org.apache.phoenix.end2end.index.txn.TxWriteFailureIT
[INFO] Running org.apache.phoenix.execute.PartialCommitIT
[ERROR] Tests run: 12, Failures: 4, Errors: 2, Skipped: 0, Time elapsed: 0.729 s <<< FAILURE! - in org.apache.phoenix.execute.PartialCommitIT
[ERROR] testOrderOfMutationsIsPredicatable[PartialCommitIT_transactional=true](org.apache.phoenix.execute.PartialCommitIT) Time elapsed: 0.076 s <<< FAILURE!
java.lang.AssertionError: org.apache.phoenix.exception.PhoenixIOException: java.lang.NullPointerException
at org.apache.phoenix.execute.PartialCommitIT.testPartialCommit(PartialCommitIT.java:280)
at org.apache.phoenix.execute.PartialCommitIT.testOrderOfMutationsIsPredicatable(PartialCommitIT.java:216)

[ERROR] testNoFailure[PartialCommitIT_transactional=true](org.apache.phoenix.execute.PartialCommitIT) Time elapsed: 0.069 s <<< ERROR!
java.lang.NullPointerException
at org.apache.phoenix.execute.PartialCommitIT.testPartialCommit(PartialCommitIT.java:274)
at org.apache.phoenix.execute.PartialCommitIT.testNoFailure(PartialCommitIT.java:170)

[ERROR] testDeleteFailure[PartialCommitIT_transactional=true](org.apache.phoenix.execute.PartialCommitIT) Time elapsed: 0.06 s <<< FAILURE!
java.lang.AssertionError: org.apache.phoenix.exception.PhoenixIOException: java.lang.NullPointerException
at org.apache.phoenix.execute.PartialCommitIT.testPartialCommit(PartialCommitIT.java:280)
at org.apache.phoenix.execute.PartialCommitIT.testDeleteFailure(PartialCommitIT.java:202)

[ERROR] testUpsertSelectFailure[PartialCommitIT_transactional=true](org.apache.phoenix.execute.PartialCommitIT) Time elapsed: 0.123 s <<< ERROR!
java.lang.NullPointerException
at org.apache.phoenix.execute.PartialCommitIT.testPartialCommit(PartialCommitIT.java:274)
at org.apache.phoenix.execute.PartialCommitIT.testUpsertSelectFailure(PartialCommitIT.java:192)

[ERROR] testUpsertFailure[PartialCommitIT_transactional=true](org.apache.phoenix.execute.PartialCommitIT) Time elapsed: 0.063 s <<< FAILURE!
java.lang.AssertionError: org.apache.phoenix.exception.PhoenixIOException: java.lang.NullPointerException
at org.apache.phoenix.execute.PartialCommitIT.testPartialCommit(PartialCommitIT.java:280)
at org.apache.phoenix.execute.PartialCommitIT.testUpsertFailure(PartialCommitIT.java:176)

[ERROR] testStatementOrderMaintainedInConnection[PartialCommitIT_transactional=true](org.apache.phoenix.execute.PartialCommitIT) Time elapsed: 0.04 s <<< FAILURE!
java.lang.AssertionError: org.apache.phoenix.exception.PhoenixIOException: java.lang.NullPointerException
at org.apache.phoenix.execute.PartialCommitIT.testPartialCommit(PartialCommitIT.java:280)
at org.apache.phoenix.execute.PartialCommitIT.testStatementOrderMaintainedInConnection(PartialCommitIT.java:228)

[INFO] Running org.apache.phoenix.execute.UpsertSelectOverlappingBatchesIT
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 67.185 s - in org.apache.phoenix.execute.UpsertSelectOverlappingBatchesIT
[INFO] Running org.apache.phoenix.hbase.index.FailForUnsupportedHBaseVersionsIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 18.608 s - in org.apache.phoenix.hbase.index.FailForUnsupportedHBaseVersionsIT
[INFO] Running org.apache.phoenix.iterate.RoundRobinResultIteratorWithStatsIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.767 s - in org.apache.phoenix.iterate.RoundRobinResultIteratorWithStatsIT
[INFO] Tests run: 25, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 250.882 s - in org.apache.phoenix.end2end.index.PartialIndexRebuilderIT
[INFO] Running org.apache.phoenix.iterate.ScannerLeaseRenewalIT
[INFO] Running org.apache.phoenix.monitoring.PhoenixMetricsIT
[INFO] Tests run: 22, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 63.781 s - in org.apache.phoenix.monitoring.PhoenixMetricsIT
[INFO] Running org.apache.phoenix.rpc.PhoenixClientRpcIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.935 s - in org.apache.phoenix.rpc.PhoenixClientRpcIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 162.948 s - in org.apache.phoenix.iterate.ScannerLeaseRenewalIT
[INFO] Running org.apache.phoenix.rpc.PhoenixServerRpcIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.72 s - in org.apache.phoenix.rpc.PhoenixServerRpcIT
[ERROR] Tests run: 8, Failures: 0, Errors: 2, Skipped: 0, Time elapsed: 1,099.021 s <<< FAILURE! - in org.apache.phoenix.end2end.index.txn.TxWriteFailureIT
[ERROR] testDataTableWriteFailure[TxWriteFailureIT_localIndex=true,mutable=false](org.apache.phoenix.end2end.index.txn.TxWriteFailureIT) Time elapsed: 539.765 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,63422,1527253916491, 
at org.apache.phoenix.end2end.index.txn.TxWriteFailureIT.helpTestWriteFailure(TxWriteFailureIT.java:149)
at org.apache.phoenix.end2end.index.txn.TxWriteFailureIT.testDataTableWriteFailure(TxWriteFailureIT.java:115)
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,63422,1527253916491, 
at org.apache.phoenix.end2end.index.txn.TxWriteFailureIT.helpTestWriteFailure(TxWriteFailureIT.java:149)
at org.apache.phoenix.end2end.index.txn.TxWriteFailureIT.testDataTableWriteFailure(TxWriteFailureIT.java:115)

[ERROR] testDataTableWriteFailure[TxWriteFailureIT_localIndex=true,mutable=true](org.apache.phoenix.end2end.index.txn.TxWriteFailureIT) Time elapsed: 539.393 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,63422,1527253916491, 
at org.apache.phoenix.end2end.index.txn.TxWriteFailureIT.helpTestWriteFailure(TxWriteFailureIT.java:149)
at org.apache.phoenix.end2end.index.txn.TxWriteFailureIT.testDataTableWriteFailure(TxWriteFailureIT.java:115)
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,63422,1527253916491, 
at org.apache.phoenix.end2end.index.txn.TxWriteFailureIT.helpTestWriteFailure(TxWriteFailureIT.java:149)
at org.apache.phoenix.end2end.index.txn.TxWriteFailureIT.testDataTableWriteFailure(TxWriteFailureIT.java:115)

[ERROR] Tests run: 64, Failures: 4, Errors: 12, Skipped: 0, Time elapsed: 2,828.588 s <<< FAILURE! - in org.apache.phoenix.end2end.IndexToolIT
[ERROR] testSecondaryIndex[transactional = true , mutable = false , localIndex = false, directApi = false, useSnapshot = false](org.apache.phoenix.end2end.IndexToolIT) Time elapsed: 8.72 s <<< FAILURE!
java.lang.AssertionError
at org.apache.phoenix.end2end.IndexToolIT.testSecondaryIndex(IndexToolIT.java:204)

[ERROR] testSecondaryIndex[transactional = true , mutable = false , localIndex = false, directApi = false, useSnapshot = true](org.apache.phoenix.end2end.IndexToolIT) Time elapsed: 14.262 s <<< FAILURE!
java.lang.AssertionError
at org.apache.phoenix.end2end.IndexToolIT.testSecondaryIndex(IndexToolIT.java:204)

[ERROR] testSecondaryIndex[transactional = true , mutable = false , localIndex = false, directApi = true, useSnapshot = false](org.apache.phoenix.end2end.IndexToolIT) Time elapsed: 7.805 s <<< FAILURE!
java.lang.AssertionError
at org.apache.phoenix.end2end.IndexToolIT.testSecondaryIndex(IndexToolIT.java:204)

[ERROR] testSecondaryIndex[transactional = true , mutable = false , localIndex = false, directApi = true, useSnapshot = true](org.apache.phoenix.end2end.IndexToolIT) Time elapsed: 12.631 s <<< FAILURE!
java.lang.AssertionError
at org.apache.phoenix.end2end.IndexToolIT.testSecondaryIndex(IndexToolIT.java:204)

[ERROR] testSecondaryIndex[transactional = true , mutable = false , localIndex = true, directApi = false, useSnapshot = false](org.apache.phoenix.end2end.IndexToolIT) Time elapsed: 544.483 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,64120,1527252285438, 
at org.apache.phoenix.end2end.IndexToolIT.testSecondaryIndex(IndexToolIT.java:196)
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,64120,1527252285438, 
at org.apache.phoenix.end2end.IndexToolIT.testSecondaryIndex(IndexToolIT.java:196)

[ERROR] testSecondaryIndex[transactional = true , mutable = false , localIndex = true, directApi = false, useSnapshot = true](org.apache.phoenix.end2end.IndexToolIT) Time elapsed: 547.028 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,64120,1527252285438, 
at org.apache.phoenix.end2end.IndexToolIT.testSecondaryIndex(IndexToolIT.java:196)
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,64120,1527252285438, 
at org.apache.phoenix.end2end.IndexToolIT.testSecondaryIndex(IndexToolIT.java:196)

[ERROR] testSecondaryIndex[transactional = true , mutable = false , localIndex = true, directApi = true, useSnapshot = false](org.apache.phoenix.end2end.IndexToolIT) Time elapsed: 542.76 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,64120,1527252285438, 
at org.apache.phoenix.end2end.IndexToolIT.testSecondaryIndex(IndexToolIT.java:196)
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,64120,1527252285438, 
at org.apache.phoenix.end2end.IndexToolIT.testSecondaryIndex(IndexToolIT.java:196)

[ERROR] testSecondaryIndex[transactional = true , mutable = false , localIndex = true, directApi = true, useSnapshot = true](org.apache.phoenix.end2end.IndexToolIT) Time elapsed: 545.717 s <<< ERROR!
org.apache.phoenix.execute.CommitException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,64120,1527252285438, 
at org.apache.phoenix.end2end.IndexToolIT.testSecondaryIndex(IndexToolIT.java:196)
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: IOException: 1 time, servers with issues: 10.0.0.3,64120,1527252285438, 
at org.apache.phoenix.end2end.IndexToolIT.testSecondaryIndex(IndexToolIT.java:196)

[ERROR] testSecondaryIndex[transactional = true , mutable = true , localIndex = false, directApi = false, useSnapshot = false](org.apache.phoenix.end2end.IndexToolIT) Time elapsed: 4.678 s <<< ERROR!
org.apache.phoenix.exception.PhoenixIOException: 
org.apache.phoenix.exception.PhoenixIOException: org.apache.hadoop.hbase.DoNotRetryIOException: T000148.T000149,\x80\x00\x00\x01,1527255003814.9a83456ecd58418cd9477e946ddf5847.: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:120)
at org.apache.phoenix.util.ServerUtil.throwIOException(ServerUtil.java:86)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:212)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.BaseScannerRegionObserver$RegionScannerHolder.nextRaw(BaseScannerRegionObserver.java:288)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2629)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2833)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:34950)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2339)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:123)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:188)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:168)
Caused by: org.apache.phoenix.schema.PTable$QualifierEncodingScheme$InvalidQualifierBytesException: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.schema.PTable$QualifierEncodingScheme$3.decode(PTable.java:343)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:117)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:114)
at com.google.common.collect.Iterables.removeIfFromRandomAccessList(Iterables.java:201)
at com.google.common.collect.Iterables.removeIf(Iterables.java:186)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter.filterRowCells(EncodedQualifiersColumnProjectionFilter.java:114)
at org.apache.hadoop.hbase.filter.FilterList.filterRowCells(FilterList.java:325)
at org.apache.hadoop.hbase.filter.FilterWrapper.filterRowCellsWithRet(FilterWrapper.java:163)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextInternal(HRegion.java:6105)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5858)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5844)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:116)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:96)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:175)
... 10 more

at org.apache.phoenix.end2end.IndexToolIT.testSecondaryIndex(IndexToolIT.java:185)
Caused by: java.util.concurrent.ExecutionException: 
org.apache.phoenix.exception.PhoenixIOException: org.apache.hadoop.hbase.DoNotRetryIOException: T000148.T000149,\x80\x00\x00\x01,1527255003814.9a83456ecd58418cd9477e946ddf5847.: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:120)
at org.apache.phoenix.util.ServerUtil.throwIOException(ServerUtil.java:86)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:212)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.BaseScannerRegionObserver$RegionScannerHolder.nextRaw(BaseScannerRegionObserver.java:288)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2629)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2833)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:34950)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2339)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:123)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:188)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:168)
Caused by: org.apache.phoenix.schema.PTable$QualifierEncodingScheme$InvalidQualifierBytesException: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.schema.PTable$QualifierEncodingScheme$3.decode(PTable.java:343)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:117)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:114)
at com.google.common.collect.Iterables.removeIfFromRandomAccessList(Iterables.java:201)
at com.google.common.collect.Iterables.removeIf(Iterables.java:186)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter.filterRowCells(EncodedQualifiersColumnProjectionFilter.java:114)
at org.apache.hadoop.hbase.filter.FilterList.filterRowCells(FilterList.java:325)
at org.apache.hadoop.hbase.filter.FilterWrapper.filterRowCellsWithRet(FilterWrapper.java:163)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextInternal(HRegion.java:6105)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5858)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5844)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:116)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:96)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:175)
... 10 more

at org.apache.phoenix.end2end.IndexToolIT.testSecondaryIndex(IndexToolIT.java:185)
Caused by: org.apache.phoenix.exception.PhoenixIOException: 
org.apache.hadoop.hbase.DoNotRetryIOException: T000148.T000149,\x80\x00\x00\x01,1527255003814.9a83456ecd58418cd9477e946ddf5847.: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:120)
at org.apache.phoenix.util.ServerUtil.throwIOException(ServerUtil.java:86)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:212)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.BaseScannerRegionObserver$RegionScannerHolder.nextRaw(BaseScannerRegionObserver.java:288)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2629)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2833)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:34950)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2339)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:123)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:188)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:168)
Caused by: org.apache.phoenix.schema.PTable$QualifierEncodingScheme$InvalidQualifierBytesException: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.schema.PTable$QualifierEncodingScheme$3.decode(PTable.java:343)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:117)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:114)
at com.google.common.collect.Iterables.removeIfFromRandomAccessList(Iterables.java:201)
at com.google.common.collect.Iterables.removeIf(Iterables.java:186)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter.filterRowCells(EncodedQualifiersColumnProjectionFilter.java:114)
at org.apache.hadoop.hbase.filter.FilterList.filterRowCells(FilterList.java:325)
at org.apache.hadoop.hbase.filter.FilterWrapper.filterRowCellsWithRet(FilterWrapper.java:163)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextInternal(HRegion.java:6105)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5858)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5844)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:116)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:96)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:175)
... 10 more

Caused by: org.apache.phoenix.exception.PhoenixIOException: 
org.apache.hadoop.hbase.DoNotRetryIOException: T000148.T000149,\x80\x00\x00\x01,1527255003814.9a83456ecd58418cd9477e946ddf5847.: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:120)
at org.apache.phoenix.util.ServerUtil.throwIOException(ServerUtil.java:86)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:212)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.BaseScannerRegionObserver$RegionScannerHolder.nextRaw(BaseScannerRegionObserver.java:288)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2629)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2833)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:34950)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2339)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:123)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:188)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:168)
Caused by: org.apache.phoenix.schema.PTable$QualifierEncodingScheme$InvalidQualifierBytesException: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.schema.PTable$QualifierEncodingScheme$3.decode(PTable.java:343)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:117)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:114)
at com.google.common.collect.Iterables.removeIfFromRandomAccessList(Iterables.java:201)
at com.google.common.collect.Iterables.removeIf(Iterables.java:186)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter.filterRowCells(EncodedQualifiersColumnProjectionFilter.java:114)
at org.apache.hadoop.hbase.filter.FilterList.filterRowCells(FilterList.java:325)
at org.apache.hadoop.hbase.filter.FilterWrapper.filterRowCellsWithRet(FilterWrapper.java:163)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextInternal(HRegion.java:6105)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5858)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5844)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:116)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:96)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:175)
... 10 more

Caused by: org.apache.hadoop.hbase.DoNotRetryIOException: 
org.apache.hadoop.hbase.DoNotRetryIOException: T000148.T000149,\x80\x00\x00\x01,1527255003814.9a83456ecd58418cd9477e946ddf5847.: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:120)
at org.apache.phoenix.util.ServerUtil.throwIOException(ServerUtil.java:86)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:212)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.BaseScannerRegionObserver$RegionScannerHolder.nextRaw(BaseScannerRegionObserver.java:288)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2629)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2833)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:34950)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2339)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:123)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:188)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:168)
Caused by: org.apache.phoenix.schema.PTable$QualifierEncodingScheme$InvalidQualifierBytesException: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.schema.PTable$QualifierEncodingScheme$3.decode(PTable.java:343)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:117)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:114)
at com.google.common.collect.Iterables.removeIfFromRandomAccessList(Iterables.java:201)
at com.google.common.collect.Iterables.removeIf(Iterables.java:186)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter.filterRowCells(EncodedQualifiersColumnProjectionFilter.java:114)
at org.apache.hadoop.hbase.filter.FilterList.filterRowCells(FilterList.java:325)
at org.apache.hadoop.hbase.filter.FilterWrapper.filterRowCellsWithRet(FilterWrapper.java:163)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextInternal(HRegion.java:6105)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5858)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5844)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:116)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:96)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:175)
... 10 more

Caused by: org.apache.hadoop.hbase.ipc.RemoteWithExtrasException: 
org.apache.hadoop.hbase.DoNotRetryIOException: T000148.T000149,\x80\x00\x00\x01,1527255003814.9a83456ecd58418cd9477e946ddf5847.: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:120)
at org.apache.phoenix.util.ServerUtil.throwIOException(ServerUtil.java:86)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:212)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.BaseScannerRegionObserver$RegionScannerHolder.nextRaw(BaseScannerRegionObserver.java:288)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2629)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2833)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:34950)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2339)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:123)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:188)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:168)
Caused by: org.apache.phoenix.schema.PTable$QualifierEncodingScheme$InvalidQualifierBytesException: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.schema.PTable$QualifierEncodingScheme$3.decode(PTable.java:343)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:117)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:114)
at com.google.common.collect.Iterables.removeIfFromRandomAccessList(Iterables.java:201)
at com.google.common.collect.Iterables.removeIf(Iterables.java:186)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter.filterRowCells(EncodedQualifiersColumnProjectionFilter.java:114)
at org.apache.hadoop.hbase.filter.FilterList.filterRowCells(FilterList.java:325)
at org.apache.hadoop.hbase.filter.FilterWrapper.filterRowCellsWithRet(FilterWrapper.java:163)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextInternal(HRegion.java:6105)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5858)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5844)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:116)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:96)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:175)
... 10 more


[ERROR] testSecondaryIndex[transactional = true , mutable = true , localIndex = false, directApi = false, useSnapshot = true](org.apache.phoenix.end2end.IndexToolIT) Time elapsed: 4.645 s <<< ERROR!
org.apache.phoenix.exception.PhoenixIOException: 
org.apache.phoenix.exception.PhoenixIOException: org.apache.hadoop.hbase.DoNotRetryIOException: T000154.T000155,\x80\x00\x00\x01,1527255021056.3c0fd749481a0f04ed70324e9d1c78f2.: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:120)
at org.apache.phoenix.util.ServerUtil.throwIOException(ServerUtil.java:86)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:212)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.BaseScannerRegionObserver$RegionScannerHolder.nextRaw(BaseScannerRegionObserver.java:288)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2629)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2833)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:34950)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2339)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:123)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:188)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:168)
Caused by: org.apache.phoenix.schema.PTable$QualifierEncodingScheme$InvalidQualifierBytesException: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.schema.PTable$QualifierEncodingScheme$3.decode(PTable.java:343)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:117)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:114)
at com.google.common.collect.Iterables.removeIfFromRandomAccessList(Iterables.java:201)
at com.google.common.collect.Iterables.removeIf(Iterables.java:186)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter.filterRowCells(EncodedQualifiersColumnProjectionFilter.java:114)
at org.apache.hadoop.hbase.filter.FilterList.filterRowCells(FilterList.java:325)
at org.apache.hadoop.hbase.filter.FilterWrapper.filterRowCellsWithRet(FilterWrapper.java:163)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextInternal(HRegion.java:6105)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5858)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5844)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:116)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:96)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:175)
... 10 more

at org.apache.phoenix.end2end.IndexToolIT.testSecondaryIndex(IndexToolIT.java:185)
Caused by: java.util.concurrent.ExecutionException: 
org.apache.phoenix.exception.PhoenixIOException: org.apache.hadoop.hbase.DoNotRetryIOException: T000154.T000155,\x80\x00\x00\x01,1527255021056.3c0fd749481a0f04ed70324e9d1c78f2.: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:120)
at org.apache.phoenix.util.ServerUtil.throwIOException(ServerUtil.java:86)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:212)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.BaseScannerRegionObserver$RegionScannerHolder.nextRaw(BaseScannerRegionObserver.java:288)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2629)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2833)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:34950)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2339)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:123)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:188)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:168)
Caused by: org.apache.phoenix.schema.PTable$QualifierEncodingScheme$InvalidQualifierBytesException: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.schema.PTable$QualifierEncodingScheme$3.decode(PTable.java:343)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:117)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:114)
at com.google.common.collect.Iterables.removeIfFromRandomAccessList(Iterables.java:201)
at com.google.common.collect.Iterables.removeIf(Iterables.java:186)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter.filterRowCells(EncodedQualifiersColumnProjectionFilter.java:114)
at org.apache.hadoop.hbase.filter.FilterList.filterRowCells(FilterList.java:325)
at org.apache.hadoop.hbase.filter.FilterWrapper.filterRowCellsWithRet(FilterWrapper.java:163)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextInternal(HRegion.java:6105)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5858)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5844)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:116)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:96)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:175)
... 10 more

at org.apache.phoenix.end2end.IndexToolIT.testSecondaryIndex(IndexToolIT.java:185)
Caused by: org.apache.phoenix.exception.PhoenixIOException: 
org.apache.hadoop.hbase.DoNotRetryIOException: T000154.T000155,\x80\x00\x00\x01,1527255021056.3c0fd749481a0f04ed70324e9d1c78f2.: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:120)
at org.apache.phoenix.util.ServerUtil.throwIOException(ServerUtil.java:86)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:212)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.BaseScannerRegionObserver$RegionScannerHolder.nextRaw(BaseScannerRegionObserver.java:288)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2629)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2833)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:34950)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2339)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:123)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:188)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:168)
Caused by: org.apache.phoenix.schema.PTable$QualifierEncodingScheme$InvalidQualifierBytesException: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.schema.PTable$QualifierEncodingScheme$3.decode(PTable.java:343)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:117)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:114)
at com.google.common.collect.Iterables.removeIfFromRandomAccessList(Iterables.java:201)
at com.google.common.collect.Iterables.removeIf(Iterables.java:186)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter.filterRowCells(EncodedQualifiersColumnProjectionFilter.java:114)
at org.apache.hadoop.hbase.filter.FilterList.filterRowCells(FilterList.java:325)
at org.apache.hadoop.hbase.filter.FilterWrapper.filterRowCellsWithRet(FilterWrapper.java:163)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextInternal(HRegion.java:6105)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5858)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5844)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:116)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:96)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:175)
... 10 more

Caused by: org.apache.phoenix.exception.PhoenixIOException: 
org.apache.hadoop.hbase.DoNotRetryIOException: T000154.T000155,\x80\x00\x00\x01,1527255021056.3c0fd749481a0f04ed70324e9d1c78f2.: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:120)
at org.apache.phoenix.util.ServerUtil.throwIOException(ServerUtil.java:86)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:212)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.BaseScannerRegionObserver$RegionScannerHolder.nextRaw(BaseScannerRegionObserver.java:288)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2629)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2833)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:34950)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2339)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:123)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:188)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:168)
Caused by: org.apache.phoenix.schema.PTable$QualifierEncodingScheme$InvalidQualifierBytesException: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.schema.PTable$QualifierEncodingScheme$3.decode(PTable.java:343)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:117)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:114)
at com.google.common.collect.Iterables.removeIfFromRandomAccessList(Iterables.java:201)
at com.google.common.collect.Iterables.removeIf(Iterables.java:186)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter.filterRowCells(EncodedQualifiersColumnProjectionFilter.java:114)
at org.apache.hadoop.hbase.filter.FilterList.filterRowCells(FilterList.java:325)
at org.apache.hadoop.hbase.filter.FilterWrapper.filterRowCellsWithRet(FilterWrapper.java:163)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextInternal(HRegion.java:6105)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5858)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5844)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:116)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:96)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:175)
... 10 more

Caused by: org.apache.hadoop.hbase.DoNotRetryIOException: 
org.apache.hadoop.hbase.DoNotRetryIOException: T000154.T000155,\x80\x00\x00\x01,1527255021056.3c0fd749481a0f04ed70324e9d1c78f2.: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:120)
at org.apache.phoenix.util.ServerUtil.throwIOException(ServerUtil.java:86)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:212)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.BaseScannerRegionObserver$RegionScannerHolder.nextRaw(BaseScannerRegionObserver.java:288)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2629)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2833)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:34950)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2339)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:123)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:188)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:168)
Caused by: org.apache.phoenix.schema.PTable$QualifierEncodingScheme$InvalidQualifierBytesException: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.schema.PTable$QualifierEncodingScheme$3.decode(PTable.java:343)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:117)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:114)
at com.google.common.collect.Iterables.removeIfFromRandomAccessList(Iterables.java:201)
at com.google.common.collect.Iterables.removeIf(Iterables.java:186)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter.filterRowCells(EncodedQualifiersColumnProjectionFilter.java:114)
at org.apache.hadoop.hbase.filter.FilterList.filterRowCells(FilterList.java:325)
at org.apache.hadoop.hbase.filter.FilterWrapper.filterRowCellsWithRet(FilterWrapper.java:163)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextInternal(HRegion.java:6105)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5858)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5844)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:116)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:96)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:175)
... 10 more

Caused by: org.apache.hadoop.hbase.ipc.RemoteWithExtrasException: 
org.apache.hadoop.hbase.DoNotRetryIOException: T000154.T000155,\x80\x00\x00\x01,1527255021056.3c0fd749481a0f04ed70324e9d1c78f2.: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:120)
at org.apache.phoenix.util.ServerUtil.throwIOException(ServerUtil.java:86)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:212)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.BaseScannerRegionObserver$RegionScannerHolder.nextRaw(BaseScannerRegionObserver.java:288)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2629)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2833)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:34950)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2339)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:123)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:188)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:168)
Caused by: org.apache.phoenix.schema.PTable$QualifierEncodingScheme$InvalidQualifierBytesException: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.schema.PTable$QualifierEncodingScheme$3.decode(PTable.java:343)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:117)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:114)
at com.google.common.collect.Iterables.removeIfFromRandomAccessList(Iterables.java:201)
at com.google.common.collect.Iterables.removeIf(Iterables.java:186)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter.filterRowCells(EncodedQualifiersColumnProjectionFilter.java:114)
at org.apache.hadoop.hbase.filter.FilterList.filterRowCells(FilterList.java:325)
at org.apache.hadoop.hbase.filter.FilterWrapper.filterRowCellsWithRet(FilterWrapper.java:163)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextInternal(HRegion.java:6105)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5858)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5844)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:116)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:96)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:175)
... 10 more


[ERROR] testSecondaryIndex[transactional = true , mutable = true , localIndex = false, directApi = true, useSnapshot = false](org.apache.phoenix.end2end.IndexToolIT) Time elapsed: 4.65 s <<< ERROR!
org.apache.phoenix.exception.PhoenixIOException: 
org.apache.phoenix.exception.PhoenixIOException: org.apache.hadoop.hbase.DoNotRetryIOException: T000160.T000161,\x80\x00\x00\x01,1527255032799.8e80974085cdf0496884392d33fd0a0f.: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:120)
at org.apache.phoenix.util.ServerUtil.throwIOException(ServerUtil.java:86)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:212)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.BaseScannerRegionObserver$RegionScannerHolder.nextRaw(BaseScannerRegionObserver.java:288)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2629)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2833)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:34950)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2339)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:123)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:188)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:168)
Caused by: org.apache.phoenix.schema.PTable$QualifierEncodingScheme$InvalidQualifierBytesException: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.schema.PTable$QualifierEncodingScheme$3.decode(PTable.java:343)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:117)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:114)
at com.google.common.collect.Iterables.removeIfFromRandomAccessList(Iterables.java:201)
at com.google.common.collect.Iterables.removeIf(Iterables.java:186)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter.filterRowCells(EncodedQualifiersColumnProjectionFilter.java:114)
at org.apache.hadoop.hbase.filter.FilterList.filterRowCells(FilterList.java:325)
at org.apache.hadoop.hbase.filter.FilterWrapper.filterRowCellsWithRet(FilterWrapper.java:163)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextInternal(HRegion.java:6105)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5858)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5844)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:116)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:96)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:175)
... 10 more

at org.apache.phoenix.end2end.IndexToolIT.testSecondaryIndex(IndexToolIT.java:185)
Caused by: java.util.concurrent.ExecutionException: 
org.apache.phoenix.exception.PhoenixIOException: org.apache.hadoop.hbase.DoNotRetryIOException: T000160.T000161,\x80\x00\x00\x01,1527255032799.8e80974085cdf0496884392d33fd0a0f.: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:120)
at org.apache.phoenix.util.ServerUtil.throwIOException(ServerUtil.java:86)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:212)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.BaseScannerRegionObserver$RegionScannerHolder.nextRaw(BaseScannerRegionObserver.java:288)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2629)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2833)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:34950)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2339)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:123)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:188)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:168)
Caused by: org.apache.phoenix.schema.PTable$QualifierEncodingScheme$InvalidQualifierBytesException: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.schema.PTable$QualifierEncodingScheme$3.decode(PTable.java:343)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:117)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:114)
at com.google.common.collect.Iterables.removeIfFromRandomAccessList(Iterables.java:201)
at com.google.common.collect.Iterables.removeIf(Iterables.java:186)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter.filterRowCells(EncodedQualifiersColumnProjectionFilter.java:114)
at org.apache.hadoop.hbase.filter.FilterList.filterRowCells(FilterList.java:325)
at org.apache.hadoop.hbase.filter.FilterWrapper.filterRowCellsWithRet(FilterWrapper.java:163)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextInternal(HRegion.java:6105)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5858)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5844)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:116)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:96)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:175)
... 10 more

at org.apache.phoenix.end2end.IndexToolIT.testSecondaryIndex(IndexToolIT.java:185)
Caused by: org.apache.phoenix.exception.PhoenixIOException: 
org.apache.hadoop.hbase.DoNotRetryIOException: T000160.T000161,\x80\x00\x00\x01,1527255032799.8e80974085cdf0496884392d33fd0a0f.: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:120)
at org.apache.phoenix.util.ServerUtil.throwIOException(ServerUtil.java:86)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:212)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.BaseScannerRegionObserver$RegionScannerHolder.nextRaw(BaseScannerRegionObserver.java:288)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2629)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2833)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:34950)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2339)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:123)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:188)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:168)
Caused by: org.apache.phoenix.schema.PTable$QualifierEncodingScheme$InvalidQualifierBytesException: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.schema.PTable$QualifierEncodingScheme$3.decode(PTable.java:343)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:117)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:114)
at com.google.common.collect.Iterables.removeIfFromRandomAccessList(Iterables.java:201)
at com.google.common.collect.Iterables.removeIf(Iterables.java:186)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter.filterRowCells(EncodedQualifiersColumnProjectionFilter.java:114)
at org.apache.hadoop.hbase.filter.FilterList.filterRowCells(FilterList.java:325)
at org.apache.hadoop.hbase.filter.FilterWrapper.filterRowCellsWithRet(FilterWrapper.java:163)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextInternal(HRegion.java:6105)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5858)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5844)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:116)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:96)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:175)
... 10 more

Caused by: org.apache.phoenix.exception.PhoenixIOException: 
org.apache.hadoop.hbase.DoNotRetryIOException: T000160.T000161,\x80\x00\x00\x01,1527255032799.8e80974085cdf0496884392d33fd0a0f.: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:120)
at org.apache.phoenix.util.ServerUtil.throwIOException(ServerUtil.java:86)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:212)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.BaseScannerRegionObserver$RegionScannerHolder.nextRaw(BaseScannerRegionObserver.java:288)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2629)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2833)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:34950)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2339)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:123)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:188)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:168)
Caused by: org.apache.phoenix.schema.PTable$QualifierEncodingScheme$InvalidQualifierBytesException: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.schema.PTable$QualifierEncodingScheme$3.decode(PTable.java:343)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:117)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:114)
at com.google.common.collect.Iterables.removeIfFromRandomAccessList(Iterables.java:201)
at com.google.common.collect.Iterables.removeIf(Iterables.java:186)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter.filterRowCells(EncodedQualifiersColumnProjectionFilter.java:114)
at org.apache.hadoop.hbase.filter.FilterList.filterRowCells(FilterList.java:325)
at org.apache.hadoop.hbase.filter.FilterWrapper.filterRowCellsWithRet(FilterWrapper.java:163)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextInternal(HRegion.java:6105)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5858)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5844)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:116)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:96)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:175)
... 10 more

Caused by: org.apache.hadoop.hbase.DoNotRetryIOException: 
org.apache.hadoop.hbase.DoNotRetryIOException: T000160.T000161,\x80\x00\x00\x01,1527255032799.8e80974085cdf0496884392d33fd0a0f.: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:120)
at org.apache.phoenix.util.ServerUtil.throwIOException(ServerUtil.java:86)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:212)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.BaseScannerRegionObserver$RegionScannerHolder.nextRaw(BaseScannerRegionObserver.java:288)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2629)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2833)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:34950)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2339)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:123)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:188)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:168)
Caused by: org.apache.phoenix.schema.PTable$QualifierEncodingScheme$InvalidQualifierBytesException: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.schema.PTable$QualifierEncodingScheme$3.decode(PTable.java:343)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:117)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:114)
at com.google.common.collect.Iterables.removeIfFromRandomAccessList(Iterables.java:201)
at com.google.common.collect.Iterables.removeIf(Iterables.java:186)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter.filterRowCells(EncodedQualifiersColumnProjectionFilter.java:114)
at org.apache.hadoop.hbase.filter.FilterList.filterRowCells(FilterList.java:325)
at org.apache.hadoop.hbase.filter.FilterWrapper.filterRowCellsWithRet(FilterWrapper.java:163)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextInternal(HRegion.java:6105)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5858)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5844)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:116)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:96)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:175)
... 10 more

Caused by: org.apache.hadoop.hbase.ipc.RemoteWithExtrasException: 
org.apache.hadoop.hbase.DoNotRetryIOException: T000160.T000161,\x80\x00\x00\x01,1527255032799.8e80974085cdf0496884392d33fd0a0f.: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:120)
at org.apache.phoenix.util.ServerUtil.throwIOException(ServerUtil.java:86)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:212)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.BaseScannerRegionObserver$RegionScannerHolder.nextRaw(BaseScannerRegionObserver.java:288)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2629)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2833)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:34950)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2339)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:123)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:188)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:168)
Caused by: org.apache.phoenix.schema.PTable$QualifierEncodingScheme$InvalidQualifierBytesException: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.schema.PTable$QualifierEncodingScheme$3.decode(PTable.java:343)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:117)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:114)
at com.google.common.collect.Iterables.removeIfFromRandomAccessList(Iterables.java:201)
at com.google.common.collect.Iterables.removeIf(Iterables.java:186)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter.filterRowCells(EncodedQualifiersColumnProjectionFilter.java:114)
at org.apache.hadoop.hbase.filter.FilterList.filterRowCells(FilterList.java:325)
at org.apache.hadoop.hbase.filter.FilterWrapper.filterRowCellsWithRet(FilterWrapper.java:163)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextInternal(HRegion.java:6105)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5858)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5844)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:116)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:96)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:175)
... 10 more


[ERROR] testSecondaryIndex[transactional = true , mutable = true , localIndex = false, directApi = true, useSnapshot = true](org.apache.phoenix.end2end.IndexToolIT) Time elapsed: 4.668 s <<< ERROR!
org.apache.phoenix.exception.PhoenixIOException: 
org.apache.phoenix.exception.PhoenixIOException: org.apache.hadoop.hbase.DoNotRetryIOException: T000166.T000167,\x80\x00\x00\x01,1527255049752.8ca7ecf7eb682ae2547d876330eccfbb.: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:120)
at org.apache.phoenix.util.ServerUtil.throwIOException(ServerUtil.java:86)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:212)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.BaseScannerRegionObserver$RegionScannerHolder.nextRaw(BaseScannerRegionObserver.java:288)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2629)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2833)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:34950)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2339)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:123)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:188)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:168)
Caused by: org.apache.phoenix.schema.PTable$QualifierEncodingScheme$InvalidQualifierBytesException: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.schema.PTable$QualifierEncodingScheme$3.decode(PTable.java:343)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:117)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:114)
at com.google.common.collect.Iterables.removeIfFromRandomAccessList(Iterables.java:201)
at com.google.common.collect.Iterables.removeIf(Iterables.java:186)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter.filterRowCells(EncodedQualifiersColumnProjectionFilter.java:114)
at org.apache.hadoop.hbase.filter.FilterList.filterRowCells(FilterList.java:325)
at org.apache.hadoop.hbase.filter.FilterWrapper.filterRowCellsWithRet(FilterWrapper.java:163)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextInternal(HRegion.java:6105)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5858)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5844)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:116)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:96)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:175)
... 10 more

at org.apache.phoenix.end2end.IndexToolIT.testSecondaryIndex(IndexToolIT.java:185)
Caused by: java.util.concurrent.ExecutionException: 
org.apache.phoenix.exception.PhoenixIOException: org.apache.hadoop.hbase.DoNotRetryIOException: T000166.T000167,\x80\x00\x00\x01,1527255049752.8ca7ecf7eb682ae2547d876330eccfbb.: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:120)
at org.apache.phoenix.util.ServerUtil.throwIOException(ServerUtil.java:86)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:212)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.BaseScannerRegionObserver$RegionScannerHolder.nextRaw(BaseScannerRegionObserver.java:288)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2629)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2833)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:34950)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2339)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:123)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:188)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:168)
Caused by: org.apache.phoenix.schema.PTable$QualifierEncodingScheme$InvalidQualifierBytesException: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.schema.PTable$QualifierEncodingScheme$3.decode(PTable.java:343)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:117)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:114)
at com.google.common.collect.Iterables.removeIfFromRandomAccessList(Iterables.java:201)
at com.google.common.collect.Iterables.removeIf(Iterables.java:186)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter.filterRowCells(EncodedQualifiersColumnProjectionFilter.java:114)
at org.apache.hadoop.hbase.filter.FilterList.filterRowCells(FilterList.java:325)
at org.apache.hadoop.hbase.filter.FilterWrapper.filterRowCellsWithRet(FilterWrapper.java:163)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextInternal(HRegion.java:6105)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5858)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5844)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:116)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:96)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:175)
... 10 more

at org.apache.phoenix.end2end.IndexToolIT.testSecondaryIndex(IndexToolIT.java:185)
Caused by: org.apache.phoenix.exception.PhoenixIOException: 
org.apache.hadoop.hbase.DoNotRetryIOException: T000166.T000167,\x80\x00\x00\x01,1527255049752.8ca7ecf7eb682ae2547d876330eccfbb.: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:120)
at org.apache.phoenix.util.ServerUtil.throwIOException(ServerUtil.java:86)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:212)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.BaseScannerRegionObserver$RegionScannerHolder.nextRaw(BaseScannerRegionObserver.java:288)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2629)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2833)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:34950)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2339)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:123)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:188)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:168)
Caused by: org.apache.phoenix.schema.PTable$QualifierEncodingScheme$InvalidQualifierBytesException: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.schema.PTable$QualifierEncodingScheme$3.decode(PTable.java:343)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:117)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:114)
at com.google.common.collect.Iterables.removeIfFromRandomAccessList(Iterables.java:201)
at com.google.common.collect.Iterables.removeIf(Iterables.java:186)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter.filterRowCells(EncodedQualifiersColumnProjectionFilter.java:114)
at org.apache.hadoop.hbase.filter.FilterList.filterRowCells(FilterList.java:325)
at org.apache.hadoop.hbase.filter.FilterWrapper.filterRowCellsWithRet(FilterWrapper.java:163)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextInternal(HRegion.java:6105)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5858)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5844)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:116)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:96)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:175)
... 10 more

Caused by: org.apache.phoenix.exception.PhoenixIOException: 
org.apache.hadoop.hbase.DoNotRetryIOException: T000166.T000167,\x80\x00\x00\x01,1527255049752.8ca7ecf7eb682ae2547d876330eccfbb.: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:120)
at org.apache.phoenix.util.ServerUtil.throwIOException(ServerUtil.java:86)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:212)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.BaseScannerRegionObserver$RegionScannerHolder.nextRaw(BaseScannerRegionObserver.java:288)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2629)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2833)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:34950)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2339)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:123)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:188)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:168)
Caused by: org.apache.phoenix.schema.PTable$QualifierEncodingScheme$InvalidQualifierBytesException: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.schema.PTable$QualifierEncodingScheme$3.decode(PTable.java:343)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:117)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:114)
at com.google.common.collect.Iterables.removeIfFromRandomAccessList(Iterables.java:201)
at com.google.common.collect.Iterables.removeIf(Iterables.java:186)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter.filterRowCells(EncodedQualifiersColumnProjectionFilter.java:114)
at org.apache.hadoop.hbase.filter.FilterList.filterRowCells(FilterList.java:325)
at org.apache.hadoop.hbase.filter.FilterWrapper.filterRowCellsWithRet(FilterWrapper.java:163)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextInternal(HRegion.java:6105)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5858)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5844)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:116)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:96)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:175)
... 10 more

Caused by: org.apache.hadoop.hbase.DoNotRetryIOException: 
org.apache.hadoop.hbase.DoNotRetryIOException: T000166.T000167,\x80\x00\x00\x01,1527255049752.8ca7ecf7eb682ae2547d876330eccfbb.: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:120)
at org.apache.phoenix.util.ServerUtil.throwIOException(ServerUtil.java:86)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:212)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.BaseScannerRegionObserver$RegionScannerHolder.nextRaw(BaseScannerRegionObserver.java:288)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2629)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2833)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:34950)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2339)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:123)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:188)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:168)
Caused by: org.apache.phoenix.schema.PTable$QualifierEncodingScheme$InvalidQualifierBytesException: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.schema.PTable$QualifierEncodingScheme$3.decode(PTable.java:343)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:117)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:114)
at com.google.common.collect.Iterables.removeIfFromRandomAccessList(Iterables.java:201)
at com.google.common.collect.Iterables.removeIf(Iterables.java:186)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter.filterRowCells(EncodedQualifiersColumnProjectionFilter.java:114)
at org.apache.hadoop.hbase.filter.FilterList.filterRowCells(FilterList.java:325)
at org.apache.hadoop.hbase.filter.FilterWrapper.filterRowCellsWithRet(FilterWrapper.java:163)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextInternal(HRegion.java:6105)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5858)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5844)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:116)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:96)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:175)
... 10 more

Caused by: org.apache.hadoop.hbase.ipc.RemoteWithExtrasException: 
org.apache.hadoop.hbase.DoNotRetryIOException: T000166.T000167,\x80\x00\x00\x01,1527255049752.8ca7ecf7eb682ae2547d876330eccfbb.: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:120)
at org.apache.phoenix.util.ServerUtil.throwIOException(ServerUtil.java:86)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:212)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.BaseScannerRegionObserver$RegionScannerHolder.nextRaw(BaseScannerRegionObserver.java:288)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2629)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2833)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:34950)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2339)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:123)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:188)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:168)
Caused by: org.apache.phoenix.schema.PTable$QualifierEncodingScheme$InvalidQualifierBytesException: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.schema.PTable$QualifierEncodingScheme$3.decode(PTable.java:343)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:117)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:114)
at com.google.common.collect.Iterables.removeIfFromRandomAccessList(Iterables.java:201)
at com.google.common.collect.Iterables.removeIf(Iterables.java:186)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter.filterRowCells(EncodedQualifiersColumnProjectionFilter.java:114)
at org.apache.hadoop.hbase.filter.FilterList.filterRowCells(FilterList.java:325)
at org.apache.hadoop.hbase.filter.FilterWrapper.filterRowCellsWithRet(FilterWrapper.java:163)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextInternal(HRegion.java:6105)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5858)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5844)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:116)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:96)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:175)
... 10 more


[ERROR] testSecondaryIndex[transactional = true , mutable = true , localIndex = true, directApi = false, useSnapshot = false](org.apache.phoenix.end2end.IndexToolIT) Time elapsed: 8.836 s <<< ERROR!
org.apache.phoenix.exception.PhoenixIOException: 
org.apache.phoenix.exception.PhoenixIOException: org.apache.hadoop.hbase.DoNotRetryIOException: T000172.T000173,\x80\x00\x00\x01,1527255066017.5123ee046abcf6286acbbc7f024c0823.: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:120)
at org.apache.phoenix.util.ServerUtil.throwIOException(ServerUtil.java:86)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:212)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.BaseScannerRegionObserver$RegionScannerHolder.nextRaw(BaseScannerRegionObserver.java:288)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2629)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2833)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:34950)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2339)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:123)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:188)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:168)
Caused by: org.apache.phoenix.schema.PTable$QualifierEncodingScheme$InvalidQualifierBytesException: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.schema.PTable$QualifierEncodingScheme$3.decode(PTable.java:343)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:117)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:114)
at com.google.common.collect.Iterables.removeIfFromRandomAccessList(Iterables.java:201)
at com.google.common.collect.Iterables.removeIf(Iterables.java:186)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter.filterRowCells(EncodedQualifiersColumnProjectionFilter.java:114)
at org.apache.hadoop.hbase.filter.FilterList.filterRowCells(FilterList.java:325)
at org.apache.hadoop.hbase.filter.FilterWrapper.filterRowCellsWithRet(FilterWrapper.java:163)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextInternal(HRegion.java:6105)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5858)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5844)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:116)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:96)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:175)
... 10 more

at org.apache.phoenix.end2end.IndexToolIT.testSecondaryIndex(IndexToolIT.java:185)
Caused by: java.util.concurrent.ExecutionException: 
org.apache.phoenix.exception.PhoenixIOException: org.apache.hadoop.hbase.DoNotRetryIOException: T000172.T000173,\x80\x00\x00\x01,1527255066017.5123ee046abcf6286acbbc7f024c0823.: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:120)
at org.apache.phoenix.util.ServerUtil.throwIOException(ServerUtil.java:86)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:212)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.BaseScannerRegionObserver$RegionScannerHolder.nextRaw(BaseScannerRegionObserver.java:288)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2629)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2833)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:34950)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2339)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:123)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:188)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:168)
Caused by: org.apache.phoenix.schema.PTable$QualifierEncodingScheme$InvalidQualifierBytesException: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.schema.PTable$QualifierEncodingScheme$3.decode(PTable.java:343)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:117)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:114)
at com.google.common.collect.Iterables.removeIfFromRandomAccessList(Iterables.java:201)
at com.google.common.collect.Iterables.removeIf(Iterables.java:186)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter.filterRowCells(EncodedQualifiersColumnProjectionFilter.java:114)
at org.apache.hadoop.hbase.filter.FilterList.filterRowCells(FilterList.java:325)
at org.apache.hadoop.hbase.filter.FilterWrapper.filterRowCellsWithRet(FilterWrapper.java:163)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextInternal(HRegion.java:6105)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5858)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5844)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:116)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:96)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:175)
... 10 more

at org.apache.phoenix.end2end.IndexToolIT.testSecondaryIndex(IndexToolIT.java:185)
Caused by: org.apache.phoenix.exception.PhoenixIOException: 
org.apache.hadoop.hbase.DoNotRetryIOException: T000172.T000173,\x80\x00\x00\x01,1527255066017.5123ee046abcf6286acbbc7f024c0823.: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:120)
at org.apache.phoenix.util.ServerUtil.throwIOException(ServerUtil.java:86)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:212)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.BaseScannerRegionObserver$RegionScannerHolder.nextRaw(BaseScannerRegionObserver.java:288)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2629)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2833)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:34950)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2339)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:123)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:188)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:168)
Caused by: org.apache.phoenix.schema.PTable$QualifierEncodingScheme$InvalidQualifierBytesException: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.schema.PTable$QualifierEncodingScheme$3.decode(PTable.java:343)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:117)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:114)
at com.google.common.collect.Iterables.removeIfFromRandomAccessList(Iterables.java:201)
at com.google.common.collect.Iterables.removeIf(Iterables.java:186)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter.filterRowCells(EncodedQualifiersColumnProjectionFilter.java:114)
at org.apache.hadoop.hbase.filter.FilterList.filterRowCells(FilterList.java:325)
at org.apache.hadoop.hbase.filter.FilterWrapper.filterRowCellsWithRet(FilterWrapper.java:163)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextInternal(HRegion.java:6105)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5858)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5844)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:116)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:96)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:175)
... 10 more

Caused by: org.apache.phoenix.exception.PhoenixIOException: 
org.apache.hadoop.hbase.DoNotRetryIOException: T000172.T000173,\x80\x00\x00\x01,1527255066017.5123ee046abcf6286acbbc7f024c0823.: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:120)
at org.apache.phoenix.util.ServerUtil.throwIOException(ServerUtil.java:86)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:212)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.BaseScannerRegionObserver$RegionScannerHolder.nextRaw(BaseScannerRegionObserver.java:288)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2629)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2833)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:34950)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2339)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:123)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:188)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:168)
Caused by: org.apache.phoenix.schema.PTable$QualifierEncodingScheme$InvalidQualifierBytesException: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.schema.PTable$QualifierEncodingScheme$3.decode(PTable.java:343)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:117)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:114)
at com.google.common.collect.Iterables.removeIfFromRandomAccessList(Iterables.java:201)
at com.google.common.collect.Iterables.removeIf(Iterables.java:186)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter.filterRowCells(EncodedQualifiersColumnProjectionFilter.java:114)
at org.apache.hadoop.hbase.filter.FilterList.filterRowCells(FilterList.java:325)
at org.apache.hadoop.hbase.filter.FilterWrapper.filterRowCellsWithRet(FilterWrapper.java:163)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextInternal(HRegion.java:6105)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5858)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5844)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:116)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:96)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:175)
... 10 more

Caused by: org.apache.hadoop.hbase.DoNotRetryIOException: 
org.apache.hadoop.hbase.DoNotRetryIOException: T000172.T000173,\x80\x00\x00\x01,1527255066017.5123ee046abcf6286acbbc7f024c0823.: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:120)
at org.apache.phoenix.util.ServerUtil.throwIOException(ServerUtil.java:86)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:212)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.BaseScannerRegionObserver$RegionScannerHolder.nextRaw(BaseScannerRegionObserver.java:288)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2629)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2833)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:34950)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2339)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:123)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:188)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:168)
Caused by: org.apache.phoenix.schema.PTable$QualifierEncodingScheme$InvalidQualifierBytesException: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.schema.PTable$QualifierEncodingScheme$3.decode(PTable.java:343)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:117)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:114)
at com.google.common.collect.Iterables.removeIfFromRandomAccessList(Iterables.java:201)
at com.google.common.collect.Iterables.removeIf(Iterables.java:186)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter.filterRowCells(EncodedQualifiersColumnProjectionFilter.java:114)
at org.apache.hadoop.hbase.filter.FilterList.filterRowCells(FilterList.java:325)
at org.apache.hadoop.hbase.filter.FilterWrapper.filterRowCellsWithRet(FilterWrapper.java:163)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextInternal(HRegion.java:6105)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5858)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5844)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:116)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:96)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:175)
... 10 more

Caused by: org.apache.hadoop.hbase.ipc.RemoteWithExtrasException: 
org.apache.hadoop.hbase.DoNotRetryIOException: T000172.T000173,\x80\x00\x00\x01,1527255066017.5123ee046abcf6286acbbc7f024c0823.: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:120)
at org.apache.phoenix.util.ServerUtil.throwIOException(ServerUtil.java:86)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:212)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.BaseScannerRegionObserver$RegionScannerHolder.nextRaw(BaseScannerRegionObserver.java:288)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2629)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2833)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:34950)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2339)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:123)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:188)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:168)
Caused by: org.apache.phoenix.schema.PTable$QualifierEncodingScheme$InvalidQualifierBytesException: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.schema.PTable$QualifierEncodingScheme$3.decode(PTable.java:343)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:117)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:114)
at com.google.common.collect.Iterables.removeIfFromRandomAccessList(Iterables.java:201)
at com.google.common.collect.Iterables.removeIf(Iterables.java:186)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter.filterRowCells(EncodedQualifiersColumnProjectionFilter.java:114)
at org.apache.hadoop.hbase.filter.FilterList.filterRowCells(FilterList.java:325)
at org.apache.hadoop.hbase.filter.FilterWrapper.filterRowCellsWithRet(FilterWrapper.java:163)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextInternal(HRegion.java:6105)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5858)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5844)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:116)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:96)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:175)
... 10 more


[ERROR] testSecondaryIndex[transactional = true , mutable = true , localIndex = true, directApi = false, useSnapshot = true](org.apache.phoenix.end2end.IndexToolIT) Time elapsed: 8.824 s <<< ERROR!
org.apache.phoenix.exception.PhoenixIOException: 
org.apache.phoenix.exception.PhoenixIOException: org.apache.hadoop.hbase.DoNotRetryIOException: T000178.T000179,\x80\x00\x00\x01,1527255089614.4e9d1b6d8153301889974709bf14f508.: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:120)
at org.apache.phoenix.util.ServerUtil.throwIOException(ServerUtil.java:86)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:212)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.BaseScannerRegionObserver$RegionScannerHolder.nextRaw(BaseScannerRegionObserver.java:288)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2629)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2833)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:34950)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2339)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:123)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:188)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:168)
Caused by: org.apache.phoenix.schema.PTable$QualifierEncodingScheme$InvalidQualifierBytesException: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.schema.PTable$QualifierEncodingScheme$3.decode(PTable.java:343)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:117)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:114)
at com.google.common.collect.Iterables.removeIfFromRandomAccessList(Iterables.java:201)
at com.google.common.collect.Iterables.removeIf(Iterables.java:186)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter.filterRowCells(EncodedQualifiersColumnProjectionFilter.java:114)
at org.apache.hadoop.hbase.filter.FilterList.filterRowCells(FilterList.java:325)
at org.apache.hadoop.hbase.filter.FilterWrapper.filterRowCellsWithRet(FilterWrapper.java:163)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextInternal(HRegion.java:6105)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5858)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5844)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:116)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:96)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:175)
... 10 more

at org.apache.phoenix.end2end.IndexToolIT.testSecondaryIndex(IndexToolIT.java:185)
Caused by: java.util.concurrent.ExecutionException: 
org.apache.phoenix.exception.PhoenixIOException: org.apache.hadoop.hbase.DoNotRetryIOException: T000178.T000179,\x80\x00\x00\x01,1527255089614.4e9d1b6d8153301889974709bf14f508.: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:120)
at org.apache.phoenix.util.ServerUtil.throwIOException(ServerUtil.java:86)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:212)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.BaseScannerRegionObserver$RegionScannerHolder.nextRaw(BaseScannerRegionObserver.java:288)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2629)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2833)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:34950)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2339)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:123)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:188)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:168)
Caused by: org.apache.phoenix.schema.PTable$QualifierEncodingScheme$InvalidQualifierBytesException: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.schema.PTable$QualifierEncodingScheme$3.decode(PTable.java:343)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:117)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:114)
at com.google.common.collect.Iterables.removeIfFromRandomAccessList(Iterables.java:201)
at com.google.common.collect.Iterables.removeIf(Iterables.java:186)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter.filterRowCells(EncodedQualifiersColumnProjectionFilter.java:114)
at org.apache.hadoop.hbase.filter.FilterList.filterRowCells(FilterList.java:325)
at org.apache.hadoop.hbase.filter.FilterWrapper.filterRowCellsWithRet(FilterWrapper.java:163)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextInternal(HRegion.java:6105)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5858)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5844)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:116)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:96)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:175)
... 10 more

at org.apache.phoenix.end2end.IndexToolIT.testSecondaryIndex(IndexToolIT.java:185)
Caused by: org.apache.phoenix.exception.PhoenixIOException: 
org.apache.hadoop.hbase.DoNotRetryIOException: T000178.T000179,\x80\x00\x00\x01,1527255089614.4e9d1b6d8153301889974709bf14f508.: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:120)
at org.apache.phoenix.util.ServerUtil.throwIOException(ServerUtil.java:86)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:212)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.BaseScannerRegionObserver$RegionScannerHolder.nextRaw(BaseScannerRegionObserver.java:288)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2629)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2833)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:34950)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2339)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:123)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:188)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:168)
Caused by: org.apache.phoenix.schema.PTable$QualifierEncodingScheme$InvalidQualifierBytesException: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.schema.PTable$QualifierEncodingScheme$3.decode(PTable.java:343)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:117)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:114)
at com.google.common.collect.Iterables.removeIfFromRandomAccessList(Iterables.java:201)
at com.google.common.collect.Iterables.removeIf(Iterables.java:186)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter.filterRowCells(EncodedQualifiersColumnProjectionFilter.java:114)
at org.apache.hadoop.hbase.filter.FilterList.filterRowCells(FilterList.java:325)
at org.apache.hadoop.hbase.filter.FilterWrapper.filterRowCellsWithRet(FilterWrapper.java:163)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextInternal(HRegion.java:6105)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5858)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5844)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:116)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:96)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:175)
... 10 more

Caused by: org.apache.phoenix.exception.PhoenixIOException: 
org.apache.hadoop.hbase.DoNotRetryIOException: T000178.T000179,\x80\x00\x00\x01,1527255089614.4e9d1b6d8153301889974709bf14f508.: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:120)
at org.apache.phoenix.util.ServerUtil.throwIOException(ServerUtil.java:86)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:212)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.BaseScannerRegionObserver$RegionScannerHolder.nextRaw(BaseScannerRegionObserver.java:288)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2629)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2833)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:34950)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2339)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:123)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:188)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:168)
Caused by: org.apache.phoenix.schema.PTable$QualifierEncodingScheme$InvalidQualifierBytesException: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.schema.PTable$QualifierEncodingScheme$3.decode(PTable.java:343)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:117)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:114)
at com.google.common.collect.Iterables.removeIfFromRandomAccessList(Iterables.java:201)
at com.google.common.collect.Iterables.removeIf(Iterables.java:186)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter.filterRowCells(EncodedQualifiersColumnProjectionFilter.java:114)
at org.apache.hadoop.hbase.filter.FilterList.filterRowCells(FilterList.java:325)
at org.apache.hadoop.hbase.filter.FilterWrapper.filterRowCellsWithRet(FilterWrapper.java:163)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextInternal(HRegion.java:6105)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5858)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5844)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:116)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:96)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:175)
... 10 more

Caused by: org.apache.hadoop.hbase.DoNotRetryIOException: 
org.apache.hadoop.hbase.DoNotRetryIOException: T000178.T000179,\x80\x00\x00\x01,1527255089614.4e9d1b6d8153301889974709bf14f508.: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:120)
at org.apache.phoenix.util.ServerUtil.throwIOException(ServerUtil.java:86)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:212)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.BaseScannerRegionObserver$RegionScannerHolder.nextRaw(BaseScannerRegionObserver.java:288)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2629)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2833)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:34950)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2339)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:123)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:188)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:168)
Caused by: org.apache.phoenix.schema.PTable$QualifierEncodingScheme$InvalidQualifierBytesException: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.schema.PTable$QualifierEncodingScheme$3.decode(PTable.java:343)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:117)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:114)
at com.google.common.collect.Iterables.removeIfFromRandomAccessList(Iterables.java:201)
at com.google.common.collect.Iterables.removeIf(Iterables.java:186)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter.filterRowCells(EncodedQualifiersColumnProjectionFilter.java:114)
at org.apache.hadoop.hbase.filter.FilterList.filterRowCells(FilterList.java:325)
at org.apache.hadoop.hbase.filter.FilterWrapper.filterRowCellsWithRet(FilterWrapper.java:163)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextInternal(HRegion.java:6105)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5858)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5844)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:116)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:96)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:175)
... 10 more

Caused by: org.apache.hadoop.hbase.ipc.RemoteWithExtrasException: 
org.apache.hadoop.hbase.DoNotRetryIOException: T000178.T000179,\x80\x00\x00\x01,1527255089614.4e9d1b6d8153301889974709bf14f508.: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:120)
at org.apache.phoenix.util.ServerUtil.throwIOException(ServerUtil.java:86)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:212)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.BaseScannerRegionObserver$RegionScannerHolder.nextRaw(BaseScannerRegionObserver.java:288)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2629)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2833)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:34950)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2339)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:123)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:188)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:168)
Caused by: org.apache.phoenix.schema.PTable$QualifierEncodingScheme$InvalidQualifierBytesException: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.schema.PTable$QualifierEncodingScheme$3.decode(PTable.java:343)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:117)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:114)
at com.google.common.collect.Iterables.removeIfFromRandomAccessList(Iterables.java:201)
at com.google.common.collect.Iterables.removeIf(Iterables.java:186)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter.filterRowCells(EncodedQualifiersColumnProjectionFilter.java:114)
at org.apache.hadoop.hbase.filter.FilterList.filterRowCells(FilterList.java:325)
at org.apache.hadoop.hbase.filter.FilterWrapper.filterRowCellsWithRet(FilterWrapper.java:163)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextInternal(HRegion.java:6105)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5858)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5844)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:116)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:96)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:175)
... 10 more


[ERROR] testSecondaryIndex[transactional = true , mutable = true , localIndex = true, directApi = true, useSnapshot = false](org.apache.phoenix.end2end.IndexToolIT) Time elapsed: 8.823 s <<< ERROR!
org.apache.phoenix.exception.PhoenixIOException: 
org.apache.phoenix.exception.PhoenixIOException: org.apache.hadoop.hbase.DoNotRetryIOException: T000184.T000185,\x80\x00\x00\x01,1527255109690.abe75121875a310cd2e1af27fbf112d0.: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:120)
at org.apache.phoenix.util.ServerUtil.throwIOException(ServerUtil.java:86)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:212)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.BaseScannerRegionObserver$RegionScannerHolder.nextRaw(BaseScannerRegionObserver.java:288)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2629)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2833)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:34950)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2339)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:123)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:188)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:168)
Caused by: org.apache.phoenix.schema.PTable$QualifierEncodingScheme$InvalidQualifierBytesException: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.schema.PTable$QualifierEncodingScheme$3.decode(PTable.java:343)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:117)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:114)
at com.google.common.collect.Iterables.removeIfFromRandomAccessList(Iterables.java:201)
at com.google.common.collect.Iterables.removeIf(Iterables.java:186)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter.filterRowCells(EncodedQualifiersColumnProjectionFilter.java:114)
at org.apache.hadoop.hbase.filter.FilterList.filterRowCells(FilterList.java:325)
at org.apache.hadoop.hbase.filter.FilterWrapper.filterRowCellsWithRet(FilterWrapper.java:163)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextInternal(HRegion.java:6105)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5858)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5844)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:116)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:96)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:175)
... 10 more

at org.apache.phoenix.end2end.IndexToolIT.testSecondaryIndex(IndexToolIT.java:185)
Caused by: java.util.concurrent.ExecutionException: 
org.apache.phoenix.exception.PhoenixIOException: org.apache.hadoop.hbase.DoNotRetryIOException: T000184.T000185,\x80\x00\x00\x01,1527255109690.abe75121875a310cd2e1af27fbf112d0.: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:120)
at org.apache.phoenix.util.ServerUtil.throwIOException(ServerUtil.java:86)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:212)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.BaseScannerRegionObserver$RegionScannerHolder.nextRaw(BaseScannerRegionObserver.java:288)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2629)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2833)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:34950)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2339)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:123)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:188)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:168)
Caused by: org.apache.phoenix.schema.PTable$QualifierEncodingScheme$InvalidQualifierBytesException: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.schema.PTable$QualifierEncodingScheme$3.decode(PTable.java:343)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:117)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:114)
at com.google.common.collect.Iterables.removeIfFromRandomAccessList(Iterables.java:201)
at com.google.common.collect.Iterables.removeIf(Iterables.java:186)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter.filterRowCells(EncodedQualifiersColumnProjectionFilter.java:114)
at org.apache.hadoop.hbase.filter.FilterList.filterRowCells(FilterList.java:325)
at org.apache.hadoop.hbase.filter.FilterWrapper.filterRowCellsWithRet(FilterWrapper.java:163)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextInternal(HRegion.java:6105)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5858)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5844)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:116)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:96)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:175)
... 10 more

at org.apache.phoenix.end2end.IndexToolIT.testSecondaryIndex(IndexToolIT.java:185)
Caused by: org.apache.phoenix.exception.PhoenixIOException: 
org.apache.hadoop.hbase.DoNotRetryIOException: T000184.T000185,\x80\x00\x00\x01,1527255109690.abe75121875a310cd2e1af27fbf112d0.: Invalid number of qualifier bytes. Expected length: 2. Actual: 6

Caused by: org.apache.phoenix.exception.PhoenixIOException: 
org.apache.hadoop.hbase.DoNotRetryIOException: T000184.T000185,\x80\x00\x00\x01,1527255109690.abe75121875a310cd2e1af27fbf112d0.: Invalid number of qualifier bytes. Expected length: 2. Actual: 6

Caused by: org.apache.hadoop.hbase.DoNotRetryIOException: 
org.apache.hadoop.hbase.DoNotRetryIOException: T000184.T000185,\x80\x00\x00\x01,1527255109690.abe75121875a310cd2e1af27fbf112d0.: Invalid number of qualifier bytes. Expected length: 2. Actual: 6

Caused by: org.apache.hadoop.hbase.ipc.RemoteWithExtrasException: 
org.apache.hadoop.hbase.DoNotRetryIOException: T000184.T000185,\x80\x00\x00\x01,1527255109690.abe75121875a310cd2e1af27fbf112d0.: Invalid number of qualifier bytes. Expected length: 2. Actual: 6


[ERROR] testSecondaryIndex[transactional = true , mutable = true , localIndex = true, directApi = true, useSnapshot = true](org.apache.phoenix.end2end.IndexToolIT) Time elapsed: 8.823 s <<< ERROR!
org.apache.phoenix.exception.PhoenixIOException: 
org.apache.phoenix.exception.PhoenixIOException: org.apache.hadoop.hbase.DoNotRetryIOException: T000190.T000191,\x80\x00\x00\x01,1527255132908.a7bfbcaa10e24b26a74382426fd7faec.: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:120)
at org.apache.phoenix.util.ServerUtil.throwIOException(ServerUtil.java:86)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:212)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:82)
at org.apache.phoenix.coprocessor.BaseScannerRegionObserver$RegionScannerHolder.nextRaw(BaseScannerRegionObserver.java:288)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2629)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2833)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:34950)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2339)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:123)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:188)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:168)
Caused by: org.apache.phoenix.schema.PTable$QualifierEncodingScheme$InvalidQualifierBytesException: Invalid number of qualifier bytes. Expected length: 2. Actual: 6
at org.apache.phoenix.schema.PTable$QualifierEncodingScheme$3.decode(PTable.java:343)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:117)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter$1.apply(EncodedQualifiersColumnProjectionFilter.java:114)
at com.google.common.collect.Iterables.removeIfFromRandomAccessList(Iterables.java:201)
at com.google.common.collect.Iterables.removeIf(Iterables.java:186)
at org.apache.phoenix.filter.EncodedQualifiersColumnProjectionFilter.filterRowCells(EncodedQualifiersColumnProjectionFilter.java:114)
at org.apache.hadoop.hbase.filter.FilterList.filterRowCells(FilterList.java:325)
at org.apache.hadoop.hbase.filter.FilterWrapper.filterRowCellsWithRet(FilterWrapper.java:163)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextInternal(HRegion.java:6105)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5858)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5844)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:116)
at org.apache.hadoop.hbase.regionserver.OmidRegionScanner.nextRaw(OmidRegionScanner.java:96)
at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:175)
... 10 more

at org.apache.phoenix.end2end.IndexToolIT.testSecondaryIndex(IndexToolIT.java:185)
Caused by: java.util.concurrent.ExecutionException: 
org.apache.phoenix.exception.PhoenixIOException: org.apache.hadoop.hbase.DoNotRetryIOException: T000190.T000191,\x80\x00\x00\x01,1527255132908.a7bfbcaa10e24b26a74382426fd7faec.: Invalid number of qualifier bytes. Expected length: 2. Actual: 6

at org.apache.phoenix.end2end.IndexToolIT.testSecondaryIndex(IndexToolIT.java:185)
Caused by: org.apache.phoenix.exception.PhoenixIOException: 
org.apache.hadoop.hbase.DoNotRetryIOException: T000190.T000191,\x80\x00\x00\x01,1527255132908.a7bfbcaa10e24b26a74382426fd7faec.: Invalid number of qualifier bytes. Expected length: 2. Actual: 6

Caused by: org.apache.phoenix.exception.PhoenixIOException: 
org.apache.hadoop.hbase.DoNotRetryIOException: T000190.T000191,\x80\x00\x00\x01,1527255132908.a7bfbcaa10e24b26a74382426fd7faec.: Invalid number of qualifier bytes. Expected length: 2. Actual: 6

Caused by: org.apache.hadoop.hbase.DoNotRetryIOException: 
org.apache.hadoop.hbase.DoNotRetryIOException: T000190.T000191,\x80\x00\x00\x01,1527255132908.a7bfbcaa10e24b26a74382426fd7faec.: Invalid number of qualifier bytes. Expected length: 2. Actual: 6

Caused by: org.apache.hadoop.hbase.ipc.RemoteWithExtrasException: 
org.apache.hadoop.hbase.DoNotRetryIOException: T000190.T000191,\x80\x00\x00\x01,1527255132908.a7bfbcaa10e24b26a74382426fd7faec.: Invalid number of qualifier bytes. Expected length: 2. Actual: 6


[INFO] 
[INFO] Results:
[INFO] 
[ERROR] Failures: 
[ERROR] IndexToolIT.testSecondaryIndex:204
[ERROR] IndexToolIT.testSecondaryIndex:204
[ERROR] IndexToolIT.testSecondaryIndex:204
[ERROR] IndexToolIT.testSecondaryIndex:204
[ERROR] ImmutableIndexIT.testDeleteFromNonPK:233 expected:<3> but was:<0>
[ERROR] ImmutableIndexIT.testDeleteFromNonPK:233 expected:<3> but was:<0>
[ERROR] ImmutableIndexIT.testDeleteFromPartialPK:191 expected:<3> but was:<0>
[ERROR] ImmutableIndexIT.testDeleteFromPartialPK:191 expected:<3> but was:<0>
[ERROR] ImmutableIndexIT.testDropIfImmutableKeyValueColumn:145 expected:<3> but was:<0>
[ERROR] ImmutableIndexIT.testDropIfImmutableKeyValueColumn:145 expected:<3> but was:<0>
[ERROR] PartialCommitIT.testDeleteFailure:202->testPartialCommit:280 org.apache.phoenix.exception.PhoenixIOException: java.lang.NullPointerException
[ERROR] PartialCommitIT.testOrderOfMutationsIsPredicatable:216->testPartialCommit:280 org.apache.phoenix.exception.PhoenixIOException: java.lang.NullPointerException
[ERROR] PartialCommitIT.testStatementOrderMaintainedInConnection:228->testPartialCommit:280 org.apache.phoenix.exception.PhoenixIOException: java.lang.NullPointerException
[ERROR] PartialCommitIT.testUpsertFailure:176->testPartialCommit:280 org.apache.phoenix.exception.PhoenixIOException: java.lang.NullPointerException
[ERROR] Errors: 
[ERROR] IndexToolIT.testSecondaryIndex:196 » Commit org.apache.hadoop.hbase.client.Ret...
[ERROR] IndexToolIT.testSecondaryIndex:196 » Commit org.apache.hadoop.hbase.client.Ret...
[ERROR] IndexToolIT.testSecondaryIndex:196 » Commit org.apache.hadoop.hbase.client.Ret...
[ERROR] IndexToolIT.testSecondaryIndex:196 » Commit org.apache.hadoop.hbase.client.Ret...
[ERROR] IndexToolIT.testSecondaryIndex:185 » PhoenixIO org.apache.phoenix.exception.Ph...
[ERROR] IndexToolIT.testSecondaryIndex:185 » PhoenixIO org.apache.phoenix.exception.Ph...
[ERROR] IndexToolIT.testSecondaryIndex:185 » PhoenixIO org.apache.phoenix.exception.Ph...
[ERROR] IndexToolIT.testSecondaryIndex:185 » PhoenixIO org.apache.phoenix.exception.Ph...
[ERROR] IndexToolIT.testSecondaryIndex:185 » PhoenixIO org.apache.phoenix.exception.Ph...
[ERROR] IndexToolIT.testSecondaryIndex:185 » PhoenixIO org.apache.phoenix.exception.Ph...
[ERROR] IndexToolIT.testSecondaryIndex:185 » PhoenixIO org.apache.phoenix.exception.Ph...
[ERROR] IndexToolIT.testSecondaryIndex:185 » PhoenixIO org.apache.phoenix.exception.Ph...
[ERROR] MutableIndexFailureIT.testIndexWriteFailure:273->initializeTable:406 » Commit ...
[ERROR] MutableIndexFailureIT.testIndexWriteFailure:273->initializeTable:406 » Commit ...
[ERROR] MutableIndexFailureIT.testIndexWriteFailure:273->initializeTable:406 » Commit ...
[ERROR] MutableIndexFailureIT.testIndexWriteFailure:273->initializeTable:406 » Commit ...
[ERROR] TxWriteFailureIT.testDataTableWriteFailure:115->helpTestWriteFailure:149 » Commit
[ERROR] TxWriteFailureIT.testDataTableWriteFailure:115->helpTestWriteFailure:149 » Commit
[ERROR] PartialCommitIT.testNoFailure:170->testPartialCommit:274 » NullPointer
[ERROR] PartialCommitIT.testUpsertSelectFailure:192->testPartialCommit:274 » NullPointer
[INFO] 
[ERROR] Tests run: 640, Failures: 14, Errors: 20, Skipped: 44
[INFO] 
[INFO] 
[INFO] --- maven-failsafe-plugin:2.20:verify (ParallelStatsEnabledTest) @ phoenix-core ---
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Phoenix ..................................... SUCCESS [ 1.462 s]
[INFO] Phoenix Core ....................................... FAILURE [ 07:14 h]
[INFO] Phoenix - Flume .................................... SKIPPED
[INFO] Phoenix - Kafka .................................... SKIPPED
[INFO] Phoenix - Pig ...................................... SKIPPED
[INFO] Phoenix Query Server Client ........................ SKIPPED
[INFO] Phoenix Query Server ............................... SKIPPED
[INFO] Phoenix - Pherf .................................... SKIPPED
[INFO] Phoenix - Spark .................................... SKIPPED
[INFO] Phoenix - Hive ..................................... SKIPPED
[INFO] Phoenix Client ..................................... SKIPPED
[INFO] Phoenix Server ..................................... SKIPPED
[INFO] Phoenix Assembly ................................... SKIPPED
[INFO] Phoenix - Tracing Web Application .................. SKIPPED
[INFO] Phoenix Load Balancer .............................. SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 07:14 h
[INFO] Finished at: 2018-05-25T06:32:52-07:00
[INFO] Final Memory: 68M/942M
[INFO] ------------------------------------------------------------------------
{code}

> Integrate Omid with Phoenix
> ---------------------------
>
>                 Key: PHOENIX-3623
>                 URL: https://issues.apache.org/jira/browse/PHOENIX-3623
>             Project: Phoenix
>          Issue Type: New Feature
>            Reporter: Ohad Shacham
>            Assignee: Ohad Shacham
>            Priority: Major
>
> The purpose of this Jira is to propose a work plan for connecting Omid to Phoenix.
> Each of the following tasks will be handled in a separate sub-Jira. Subtasks 4.* relate to augmenting Omid with features required by Phoenix; therefore, their corresponding Jiras will appear under Omid rather than under Phoenix.
> Each task is completed by a commit.
> Task 1: Add a transaction abstraction layer (TAL) - Currently, Tephra calls are embedded directly in the Phoenix code. Therefore, in order to support both Omid and Tephra, we need to add an abstraction layer that can later be wired to both Tephra and Omid. The first task is to define such an interface.
> Task 2: Implement TAL functionality for Tephra. 
> Task 3: Refactor Phoenix to use TAL instead of direct calls to Tephra.
> Task 4: Implement Omid required features for Phoenix:
> Task 4.1: Add checkpoints to Omid. A checkpoint is a point in a transaction after which the transaction's own writes are no longer visible to its reads. An explanation of this feature can be found in [TEPHRA-96].
> Task 4.2: Add an option to mark a key as non-conflicting. The motivation is to reduce the size of the write set the transaction manager needs at commit time, as well as to reduce the conflict-detection work.
> Task 4.3: Add support for transactions that never abort. Such transactions only cause other in-flight transactions to abort, and abort themselves only in case of a transaction manager failure.
> These transactions are needed for ‘create index’; the scenario was discussed in [TEPHRA-157] and [PHOENIX-2478]. Augmenting Omid with this kind of transaction was also discussed in [OMID-56].
> Task 4.4: Add support for returning multiple versions in a scan. The use case is described in [TEPHRA-134].
> Task 4.5: Change Omid's timestamp mechanism to return real-time-based timestamps while preserving monotonicity.
> Task 5: Implement TAL functionality for Omid.
> Task 6: Implement performance tests and tune Omid for Phoenix use. This task requires an understanding of common Phoenix usage scenarios, as well as defining the trade-off between throughput and latency.
> Could you please review the proposed work plan?
> Also, could you please let me know whether I missed any augmentation needed for Omid in order to support Phoenix operations?
> I opened [OMID-82], a Jira that encapsulates all Omid-related development for Phoenix.
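As a rough illustration of Task 1, the TAL might expose lifecycle and checkpoint operations along the following lines. This is only a hypothetical sketch: the names (TransactionContext, NoopTransactionContext, the method set) are assumptions for illustration, not the actual Phoenix interface.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Hypothetical minimal TAL interface; both a Tephra-backed and an
// Omid-backed implementation would sit behind it (Tasks 2 and 5).
interface TransactionContext {
    void begin();
    void commit() throws Exception;
    void abort();
    // Task 4.1: writes after this point are invisible to the
    // transaction's own subsequent reads.
    void checkpoint();
    boolean isTransactionRunning();
}

// Trivial in-memory stand-in that only records the call sequence,
// useful to show how Phoenix code would drive the abstraction.
class NoopTransactionContext implements TransactionContext {
    private boolean running = false;
    final List<String> calls = new ArrayList<>();

    public void begin() { running = true; calls.add("begin"); }
    public void commit() { running = false; calls.add("commit"); }
    public void abort() { running = false; calls.add("abort"); }
    public void checkpoint() { calls.add("checkpoint"); }
    public boolean isTransactionRunning() { return running; }
}
```

Phoenix code would then call begin/checkpoint/commit through the interface only, never through Tephra or Omid classes directly, which is what Task 3's refactoring would enforce.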



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

