carbondata-commits mailing list archives

From Apache Jenkins Server <jenk...@builds.apache.org>
Subject Build failed in Jenkins: CarbonData-master #155
Date Fri, 04 Nov 2016 19:13:52 GMT
See <https://builds.apache.org/job/CarbonData-master/155/changes>

Changes:

[ravipesala] add interface

------------------------------------------
[...truncated 1419 lines...]
INFO  04-11 19:13:13,467 - [testtable: Graph - MDKeyGentesttable][partitionID:0] All blocklets
have been finished writing
INFO  04-11 19:13:13,467 - [testtable: Graph - MDKeyGentesttable][partitionID:0] Copying <https://builds.apache.org/job/CarbonData-master/ws/processing/target/store/testdb/testtable/Fact/Part0/Segment_0/1/part-0-1-1478286792000.carbondata>
--> <https://builds.apache.org/job/CarbonData-master/ws/processing/target/store/testdb/testtable/Fact/Part0/Segment_0>
INFO  04-11 19:13:13,468 - [testtable: Graph - MDKeyGentesttable][partitionID:0] The configured
block size is 1024 MB, the actual carbon file size is 28 KB, choose the max value 1024 MB
as the block size on HDFS
INFO  04-11 19:13:13,469 - [testtable: Graph - MDKeyGentesttable][partitionID:0] Total copy
time (ms) to copy file <https://builds.apache.org/job/CarbonData-master/ws/processing/target/store/testdb/testtable/Fact/Part0/Segment_0/1/part-0-1-1478286792000.carbondata>
is 2
INFO  04-11 19:13:13,476 - [testtable: Graph - MDKeyGentesttable][partitionID:0] Copying <https://builds.apache.org/job/CarbonData-master/ws/processing/target/store/testdb/testtable/Fact/Part0/Segment_0/1/1-1478286792000.carbonindex>
--> <https://builds.apache.org/job/CarbonData-master/ws/processing/target/store/testdb/testtable/Fact/Part0/Segment_0>
INFO  04-11 19:13:13,476 - [testtable: Graph - MDKeyGentesttable][partitionID:0] The configured
block size is 1024 MB, the actual carbon file size is 707 Byte, choose the max value 1024
MB as the block size on HDFS
INFO  04-11 19:13:13,476 - [testtable: Graph - MDKeyGentesttable][partitionID:0] Total copy
time (ms) to copy file <https://builds.apache.org/job/CarbonData-master/ws/processing/target/store/testdb/testtable/Fact/Part0/Segment_0/1/1-1478286792000.carbonindex>
is 0
INFO  04-11 19:13:13,479 - [testtable: Graph - Carbon Slice Mergertesttable][partitionID:testtable]
Record Procerssed For table: testtable
INFO  04-11 19:13:13,479 - [testtable: Graph - Carbon Slice Mergertesttable][partitionID:testtable]
Summary: Carbon Slice Merger Step: Read: 1: Write: 0
INFO  04-11 19:13:13,479 - main Graph execution is finished.
INFO  04-11 19:13:13,480 - main Graph execution task is over with No error.
INFO  04-11 19:13:13,554 - pool-14-thread-2 ****************************Total Number Rows
In BTREE: 1000
INFO  04-11 19:13:13,554 - pool-15-thread-1 ****************************Total Number Rows
In BTREE: 1000
INFO  04-11 19:13:13,554 - pool-13-thread-2 ****************************Total Number Rows
In BTREE: 1000
INFO  04-11 19:13:13,554 - pool-13-thread-1 ****************************Total Number Rows
In BTREE: 1000
INFO  04-11 19:13:13,554 - pool-15-thread-2 ****************************Total Number Rows
In BTREE: 1000
INFO  04-11 19:13:13,555 - pool-14-thread-2 ****************************Total Number Rows
In BTREE: 1000
INFO  04-11 19:13:13,556 - pool-16-thread-1 ****************************Total Number Rows
In BTREE: 1000
INFO  04-11 19:13:13,570 - pool-14-thread-1 ****************************Total Number Rows
In BTREE: 1000
INFO  04-11 19:13:13,574 - main Table block size not specified for testdb_testtable. Therefore
considering the default value 1024 MB
INFO  04-11 19:13:13,591 - main Dictionary metadata file written successfully for column ColumnIdentifier
[columnId=8d42fe7f-4666-41f0-8b6a-8b5622292bfb] at path <https://builds.apache.org/job/CarbonData-master/ws/processing/target/store/testdb/testtable/Metadata/8d42fe7f-4666-41f0-8b6a-8b5622292bfb.dictmeta>
INFO  04-11 19:13:13,597 - main Dictionary metadata file written successfully for column ColumnIdentifier
[columnId=40e73dde-aec4-4190-8d10-a0dd89f147b8] at path <https://builds.apache.org/job/CarbonData-master/ws/processing/target/store/testdb/testtable/Metadata/40e73dde-aec4-4190-8d10-a0dd89f147b8.dictmeta>
INFO  04-11 19:13:13,598 - main Dictionary metadata file written successfully for column ColumnIdentifier
[columnId=e67e9ad0-5517-4379-b0af-638fc834790b] at path <https://builds.apache.org/job/CarbonData-master/ws/processing/target/store/testdb/testtable/Metadata/e67e9ad0-5517-4379-b0af-638fc834790b.dictmeta>
INFO  04-11 19:13:13,600 - main Dictionary metadata file written successfully for column ColumnIdentifier
[columnId=890a6197-4ce7-4ceb-a8c5-72e941431727] at path <https://builds.apache.org/job/CarbonData-master/ws/processing/target/store/testdb/testtable/Metadata/890a6197-4ce7-4ceb-a8c5-72e941431727.dictmeta>
INFO  04-11 19:13:13,622 - main Dictionary metadata file written successfully for column ColumnIdentifier
[columnId=857081c7-84dd-42ad-b676-8f297e79f15f] at path <https://builds.apache.org/job/CarbonData-master/ws/processing/target/store/testdb/testtable/Metadata/857081c7-84dd-42ad-b676-8f297e79f15f.dictmeta>
INFO  04-11 19:13:13,639 - main Dictionary metadata file written successfully for column ColumnIdentifier
[columnId=3399d8a5-58ed-4f1a-a1f7-fc4609d10985] at path <https://builds.apache.org/job/CarbonData-master/ws/processing/target/store/testdb/testtable/Metadata/3399d8a5-58ed-4f1a-a1f7-fc4609d10985.dictmeta>
INFO  04-11 19:13:13,642 - main ************* Is Columnar Storagetrue
INFO  04-11 19:13:13,651 - main Kettle environment initialized
INFO  04-11 19:13:13,666 - main ** Using csv file **
INFO  04-11 19:13:13,670 - testtable: Graph - CSV Input *****************Started all csv reading***********
INFO  04-11 19:13:13,670 - [pool-18-thread-1][partitionID:PROCESS_BLOCKS;queryID:pool-18-thread-1]
*****************started csv reading by thread***********
INFO  04-11 19:13:13,670 - main Graph execution is started <https://builds.apache.org/job/CarbonData-master/ws/processing/target/store/etl/testdb/testtable/0/1/testtable.ktr>
INFO  04-11 19:13:13,679 - [testtable: Graph - Sort Key: Sort keystesttable][partitionID:0]
Sort size for table: 100000
INFO  04-11 19:13:13,680 - [testtable: Graph - Sort Key: Sort keystesttable][partitionID:0]
Number of intermediate file to be merged: 20
INFO  04-11 19:13:13,680 - [testtable: Graph - Sort Key: Sort keystesttable][partitionID:0]
File Buffer Size: 524288
INFO  04-11 19:13:13,680 - [testtable: Graph - Sort Key: Sort keystesttable][partitionID:0]
temp file location<https://builds.apache.org/job/CarbonData-master/ws/processing/target/store/testdb/testtable/Fact/Part0/Segment_0/1/sortrowtmp>
INFO  04-11 19:13:13,690 - [pool-18-thread-1][partitionID:PROCESS_BLOCKS;queryID:pool-18-thread-1]
Total Number of records processed by this thread is: 1000
INFO  04-11 19:13:13,690 - [pool-18-thread-1][partitionID:PROCESS_BLOCKS;queryID:pool-18-thread-1]
Time taken to processed 1000 Number of records: 20
INFO  04-11 19:13:13,690 - [pool-18-thread-1][partitionID:PROCESS_BLOCKS;queryID:pool-18-thread-1]
*****************Completed csv reading by thread***********
INFO  04-11 19:13:13,691 - testtable: Graph - CSV Input *****************Completed all csv
reading***********
INFO  04-11 19:13:13,704 - [testtable: Graph - Carbon Surrogate Key Generator][partitionID:0]
Level cardinality file written to : <https://builds.apache.org/job/CarbonData-master/ws/processing/target/store/testdb/testtable/Fact/Part0/Segment_0/1/levelmetadata_testtable.metadata>
INFO  04-11 19:13:13,704 - [testtable: Graph - Carbon Surrogate Key Generator][partitionID:0]
Record Procerssed For table: testtable
INFO  04-11 19:13:13,704 - [testtable: Graph - Carbon Surrogate Key Generator][partitionID:0]
Summary: Carbon CSV Based Seq Gen Step : 1000: Write: 1000
INFO  04-11 19:13:13,707 - [testtable: Graph - Sort Key: Sort keystesttable][partitionID:0]
File based sorting will be used
INFO  04-11 19:13:13,721 - [testtable: Graph - Sort Key: Sort keystesttable][partitionID:0]
Record Processed For table: testtable
INFO  04-11 19:13:13,721 - [testtable: Graph - Sort Key: Sort keystesttable][partitionID:0]
Summary: Carbon Sort Key Step: Read: 1000: Write: 1000
INFO  04-11 19:13:13,724 - [testtable: Graph - MDKeyGentesttable][partitionID:0] Initializing
writer executors
INFO  04-11 19:13:13,725 - [testtable: Graph - MDKeyGentesttable][partitionID:0] Blocklet
Size: 120000
INFO  04-11 19:13:13,726 - [testtable: Graph - MDKeyGentesttable][partitionID:0] Total file
size: 1073741824 and dataBlock Size: 966367642
INFO  04-11 19:13:13,727 - [testtable: Graph - MDKeyGentesttable][partitionID:0] Number of
temp file: 1
INFO  04-11 19:13:13,727 - [testtable: Graph - MDKeyGentesttable][partitionID:0] File Buffer
Size: 10485760
INFO  04-11 19:13:13,727 - [testtable: Graph - MDKeyGentesttable][partitionID:0] Started adding
first record from each file
INFO  04-11 19:13:13,737 - [testtable: Graph - MDKeyGentesttable][partitionID:0] Heap Size1
INFO  04-11 19:13:13,750 - pool-23-thread-1 Number Of records processed: 1000
INFO  04-11 19:13:13,750 - [testtable: Graph - MDKeyGentesttable][partitionID:0] Record Procerssed
For table: testtable
INFO  04-11 19:13:13,750 - [testtable: Graph - MDKeyGentesttable][partitionID:0] Finished
Carbon Mdkey Generation Step: Read: 1000: Write: 1000
INFO  04-11 19:13:13,750 - pool-24-thread-1 A new blocklet is added, its data size is: 27912
Byte
INFO  04-11 19:13:13,801 - [testtable: Graph - MDKeyGentesttable][partitionID:0] All blocklets
have been finished writing
INFO  04-11 19:13:13,801 - [testtable: Graph - MDKeyGentesttable][partitionID:0] Copying <https://builds.apache.org/job/CarbonData-master/ws/processing/target/store/testdb/testtable/Fact/Part0/Segment_0/1/part-0-1-1478286793000.carbondata>
--> <https://builds.apache.org/job/CarbonData-master/ws/processing/target/store/testdb/testtable/Fact/Part0/Segment_0>
INFO  04-11 19:13:13,802 - [testtable: Graph - MDKeyGentesttable][partitionID:0] The configured
block size is 1024 MB, the actual carbon file size is 28 KB, choose the max value 1024 MB
as the block size on HDFS
INFO  04-11 19:13:13,802 - [testtable: Graph - MDKeyGentesttable][partitionID:0] Total copy
time (ms) to copy file <https://builds.apache.org/job/CarbonData-master/ws/processing/target/store/testdb/testtable/Fact/Part0/Segment_0/1/part-0-1-1478286793000.carbondata>
is 1
INFO  04-11 19:13:13,803 - [testtable: Graph - MDKeyGentesttable][partitionID:0] Copying <https://builds.apache.org/job/CarbonData-master/ws/processing/target/store/testdb/testtable/Fact/Part0/Segment_0/1/1-1478286793000.carbonindex>
--> <https://builds.apache.org/job/CarbonData-master/ws/processing/target/store/testdb/testtable/Fact/Part0/Segment_0>
INFO  04-11 19:13:13,803 - [testtable: Graph - MDKeyGentesttable][partitionID:0] The configured
block size is 1024 MB, the actual carbon file size is 707 Byte, choose the max value 1024
MB as the block size on HDFS
INFO  04-11 19:13:13,803 - [testtable: Graph - MDKeyGentesttable][partitionID:0] Total copy
time (ms) to copy file <https://builds.apache.org/job/CarbonData-master/ws/processing/target/store/testdb/testtable/Fact/Part0/Segment_0/1/1-1478286793000.carbonindex>
is 0
INFO  04-11 19:13:13,805 - [testtable: Graph - Carbon Slice Mergertesttable][partitionID:testtable]
Record Procerssed For table: testtable
INFO  04-11 19:13:13,805 - [testtable: Graph - Carbon Slice Mergertesttable][partitionID:testtable]
Summary: Carbon Slice Merger Step: Read: 1: Write: 0
INFO  04-11 19:13:13,805 - main Graph execution is finished.
INFO  04-11 19:13:13,805 - main Graph execution task is over with No error.
INFO  04-11 19:13:13,815 - pool-29-thread-1 ****************************Total Number Rows
In BTREE: 1000
INFO  04-11 19:13:13,864 - main Table block size not specified for testdb_testtable. Therefore
considering the default value 1024 MB
INFO  04-11 19:13:13,869 - main Dictionary metadata file written successfully for column ColumnIdentifier
[columnId=121fc6c1-d123-4ca3-a5bd-71987da0c918] at path <https://builds.apache.org/job/CarbonData-master/ws/processing/target/store/testdb/testtable/Metadata/121fc6c1-d123-4ca3-a5bd-71987da0c918.dictmeta>
INFO  04-11 19:13:13,873 - main Dictionary metadata file written successfully for column ColumnIdentifier
[columnId=1575a43e-666e-4910-8a64-ceb082eab468] at path <https://builds.apache.org/job/CarbonData-master/ws/processing/target/store/testdb/testtable/Metadata/1575a43e-666e-4910-8a64-ceb082eab468.dictmeta>
INFO  04-11 19:13:13,875 - main Dictionary metadata file written successfully for column ColumnIdentifier
[columnId=3950c734-59e7-417c-a5f4-b6a90eb156ad] at path <https://builds.apache.org/job/CarbonData-master/ws/processing/target/store/testdb/testtable/Metadata/3950c734-59e7-417c-a5f4-b6a90eb156ad.dictmeta>
INFO  04-11 19:13:13,878 - main Dictionary metadata file written successfully for column ColumnIdentifier
[columnId=697f0c3d-d35b-47b1-8d9d-4d493ad9fec0] at path <https://builds.apache.org/job/CarbonData-master/ws/processing/target/store/testdb/testtable/Metadata/697f0c3d-d35b-47b1-8d9d-4d493ad9fec0.dictmeta>
INFO  04-11 19:13:13,884 - main Dictionary metadata file written successfully for column ColumnIdentifier
[columnId=114d10fe-2c62-456b-8abf-f68bfdf2f1fd] at path <https://builds.apache.org/job/CarbonData-master/ws/processing/target/store/testdb/testtable/Metadata/114d10fe-2c62-456b-8abf-f68bfdf2f1fd.dictmeta>
INFO  04-11 19:13:13,892 - main Dictionary metadata file written successfully for column ColumnIdentifier
[columnId=66e704f9-8cb3-47e5-b6aa-2c4d4dd71f22] at path <https://builds.apache.org/job/CarbonData-master/ws/processing/target/store/testdb/testtable/Metadata/66e704f9-8cb3-47e5-b6aa-2c4d4dd71f22.dictmeta>
INFO  04-11 19:13:13,897 - main ************* Is Columnar Storagetrue
INFO  04-11 19:13:13,908 - main Kettle environment initialized
INFO  04-11 19:13:13,923 - main ** Using csv file **
INFO  04-11 19:13:13,929 - main Graph execution is started <https://builds.apache.org/job/CarbonData-master/ws/processing/target/store/etl/testdb/testtable/0/1/testtable.ktr>
INFO  04-11 19:13:13,929 - testtable: Graph - CSV Input *****************Started all csv reading***********
INFO  04-11 19:13:13,930 - [pool-30-thread-1][partitionID:PROCESS_BLOCKS;queryID:pool-30-thread-1]
*****************started csv reading by thread***********
INFO  04-11 19:13:13,941 - [testtable: Graph - Sort Key: Sort keystesttable][partitionID:0]
Sort size for table: 100000
INFO  04-11 19:13:13,942 - [testtable: Graph - Sort Key: Sort keystesttable][partitionID:0]
Number of intermediate file to be merged: 20
INFO  04-11 19:13:13,942 - [testtable: Graph - Sort Key: Sort keystesttable][partitionID:0]
File Buffer Size: 524288
INFO  04-11 19:13:13,943 - [testtable: Graph - Sort Key: Sort keystesttable][partitionID:0]
temp file location<https://builds.apache.org/job/CarbonData-master/ws/processing/target/store/testdb/testtable/Fact/Part0/Segment_0/1/sortrowtmp>
INFO  04-11 19:13:13,948 - [pool-30-thread-1][partitionID:PROCESS_BLOCKS;queryID:pool-30-thread-1]
Total Number of records processed by this thread is: 1000
INFO  04-11 19:13:13,948 - [pool-30-thread-1][partitionID:PROCESS_BLOCKS;queryID:pool-30-thread-1]
Time taken to processed 1000 Number of records: 18
INFO  04-11 19:13:13,948 - [pool-30-thread-1][partitionID:PROCESS_BLOCKS;queryID:pool-30-thread-1]
*****************Completed csv reading by thread***********
INFO  04-11 19:13:13,948 - testtable: Graph - CSV Input *****************Completed all csv
reading***********
INFO  04-11 19:13:13,957 - [testtable: Graph - Carbon Surrogate Key Generator][partitionID:0]
Level cardinality file written to : <https://builds.apache.org/job/CarbonData-master/ws/processing/target/store/testdb/testtable/Fact/Part0/Segment_0/1/levelmetadata_testtable.metadata>
INFO  04-11 19:13:13,957 - [testtable: Graph - Carbon Surrogate Key Generator][partitionID:0]
Record Procerssed For table: testtable
INFO  04-11 19:13:13,957 - [testtable: Graph - Carbon Surrogate Key Generator][partitionID:0]
Summary: Carbon CSV Based Seq Gen Step : 1000: Write: 1000
INFO  04-11 19:13:13,960 - [testtable: Graph - Sort Key: Sort keystesttable][partitionID:0]
File based sorting will be used
INFO  04-11 19:13:13,963 - [testtable: Graph - Sort Key: Sort keystesttable][partitionID:0]
Record Processed For table: testtable
INFO  04-11 19:13:13,963 - [testtable: Graph - Sort Key: Sort keystesttable][partitionID:0]
Summary: Carbon Sort Key Step: Read: 1000: Write: 1000
INFO  04-11 19:13:13,971 - [testtable: Graph - MDKeyGentesttable][partitionID:0] Initializing
writer executors
INFO  04-11 19:13:13,972 - [testtable: Graph - MDKeyGentesttable][partitionID:0] Blocklet
Size: 120000
INFO  04-11 19:13:13,972 - [testtable: Graph - MDKeyGentesttable][partitionID:0] Total file
size: 1073741824 and dataBlock Size: 966367642
INFO  04-11 19:13:13,973 - [testtable: Graph - MDKeyGentesttable][partitionID:0] Number of
temp file: 1
INFO  04-11 19:13:13,973 - [testtable: Graph - MDKeyGentesttable][partitionID:0] File Buffer
Size: 10485760
INFO  04-11 19:13:13,973 - [testtable: Graph - MDKeyGentesttable][partitionID:0] Started adding
first record from each file
INFO  04-11 19:13:13,977 - [testtable: Graph - MDKeyGentesttable][partitionID:0] Heap Size1
INFO  04-11 19:13:13,993 - pool-35-thread-1 Number Of records processed: 1000
INFO  04-11 19:13:13,993 - [testtable: Graph - MDKeyGentesttable][partitionID:0] Record Procerssed
For table: testtable
INFO  04-11 19:13:13,993 - [testtable: Graph - MDKeyGentesttable][partitionID:0] Finished
Carbon Mdkey Generation Step: Read: 1000: Write: 1000
INFO  04-11 19:13:14,033 - pool-36-thread-1 A new blocklet is added, its data size is: 27912
Byte
INFO  04-11 19:13:14,045 - [testtable: Graph - MDKeyGentesttable][partitionID:0] All blocklets
have been finished writing
INFO  04-11 19:13:14,045 - [testtable: Graph - MDKeyGentesttable][partitionID:0] Copying <https://builds.apache.org/job/CarbonData-master/ws/processing/target/store/testdb/testtable/Fact/Part0/Segment_0/1/part-0-1-1478286793000.carbondata>
--> <https://builds.apache.org/job/CarbonData-master/ws/processing/target/store/testdb/testtable/Fact/Part0/Segment_0>
INFO  04-11 19:13:14,045 - [testtable: Graph - MDKeyGentesttable][partitionID:0] The configured
block size is 1024 MB, the actual carbon file size is 28 KB, choose the max value 1024 MB
as the block size on HDFS
INFO  04-11 19:13:14,045 - [testtable: Graph - MDKeyGentesttable][partitionID:0] Total copy
time (ms) to copy file <https://builds.apache.org/job/CarbonData-master/ws/processing/target/store/testdb/testtable/Fact/Part0/Segment_0/1/part-0-1-1478286793000.carbondata>
is 0
INFO  04-11 19:13:14,046 - [testtable: Graph - MDKeyGentesttable][partitionID:0] Copying <https://builds.apache.org/job/CarbonData-master/ws/processing/target/store/testdb/testtable/Fact/Part0/Segment_0/1/1-1478286793000.carbonindex>
--> <https://builds.apache.org/job/CarbonData-master/ws/processing/target/store/testdb/testtable/Fact/Part0/Segment_0>
INFO  04-11 19:13:14,046 - [testtable: Graph - MDKeyGentesttable][partitionID:0] The configured
block size is 1024 MB, the actual carbon file size is 707 Byte, choose the max value 1024
MB as the block size on HDFS
INFO  04-11 19:13:14,046 - [testtable: Graph - MDKeyGentesttable][partitionID:0] Total copy
time (ms) to copy file <https://builds.apache.org/job/CarbonData-master/ws/processing/target/store/testdb/testtable/Fact/Part0/Segment_0/1/1-1478286793000.carbonindex>
is 0
INFO  04-11 19:13:14,048 - [testtable: Graph - Carbon Slice Mergertesttable][partitionID:testtable]
Record Procerssed For table: testtable
INFO  04-11 19:13:14,048 - [testtable: Graph - Carbon Slice Mergertesttable][partitionID:testtable]
Summary: Carbon Slice Merger Step: Read: 1: Write: 0
INFO  04-11 19:13:14,048 - main Graph execution is finished.
INFO  04-11 19:13:14,049 - main Graph execution task is over with No error.
INFO  04-11 19:13:14,056 - pool-42-thread-1 ****************************Total Number Rows
In BTREE: 1000
INFO  04-11 19:13:14,056 - pool-42-thread-2 ****************************Total Number Rows
In BTREE: 1000
INFO  04-11 19:13:14,056 - pool-43-thread-1 ****************************Total Number Rows
In BTREE: 1000
INFO  04-11 19:13:14,057 - pool-43-thread-2 ****************************Total Number Rows
In BTREE: 1000
INFO  04-11 19:13:14,057 - pool-44-thread-2 ****************************Total Number Rows
In BTREE: 1000
INFO  04-11 19:13:14,060 - pool-44-thread-1 ****************************Total Number Rows
In BTREE: 1000
INFO  04-11 19:13:14,060 - pool-43-thread-1 ****************************Total Number Rows
In BTREE: 1000
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.307 sec - in org.apache.carbondata.carbon.datastore.BlockIndexStoreTest

Results :

Tests run: 12, Failures: 0, Errors: 0, Skipped: 0

[JENKINS] Recording test results
[INFO] 
[INFO] --- maven-jar-plugin:2.5:jar (default-jar) @ carbondata-processing ---
[INFO] Building jar: <https://builds.apache.org/job/CarbonData-master/ws/processing/target/carbondata-processing-0.2.0-incubating-SNAPSHOT.jar>
[INFO] 
[INFO] --- maven-site-plugin:3.4:attach-descriptor (attach-descriptor) @ carbondata-processing
---
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache CarbonData :: Hadoop 0.2.0-incubating-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ carbondata-hadoop ---
[INFO] Deleting <https://builds.apache.org/job/CarbonData-master/ws/hadoop/target>
[INFO] 
[INFO] --- maven-remote-resources-plugin:1.5:process (default) @ carbondata-hadoop ---
[INFO] 
[INFO] --- maven-resources-plugin:2.7:resources (default-resources) @ carbondata-hadoop ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 1 resource
[INFO] Copying 3 resources
[INFO] 
[INFO] --- maven-compiler-plugin:3.2:compile (default-compile) @ carbondata-hadoop ---
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 32 source files to <https://builds.apache.org/job/CarbonData-master/ws/hadoop/target/classes>
[INFO] <https://builds.apache.org/job/CarbonData-master/ws/hadoop/src/main/java/org/apache/carbondata/hadoop/CarbonRecordReader.java>:
Some input files use unchecked or unsafe operations.
[INFO] <https://builds.apache.org/job/CarbonData-master/ws/hadoop/src/main/java/org/apache/carbondata/hadoop/CarbonRecordReader.java>:
Recompile with -Xlint:unchecked for details.
[INFO] -------------------------------------------------------------
[ERROR] COMPILATION ERROR : 
[INFO] -------------------------------------------------------------
[ERROR] <https://builds.apache.org/job/CarbonData-master/ws/hadoop/src/main/java/org/apache/carbondata/hadoop/internal/index/impl/InMemoryBTreeIndexLoader.java>:[26,8]
org.apache.carbondata.hadoop.internal.index.impl.InMemoryBTreeIndexLoader is not abstract
and does not override abstract method load(org.apache.carbondata.hadoop.internal.segment.Segment)
in org.apache.carbondata.hadoop.internal.index.IndexLoader
[ERROR] <https://builds.apache.org/job/CarbonData-master/ws/hadoop/src/main/java/org/apache/carbondata/hadoop/internal/index/impl/InMemoryBTreeIndexLoader.java>:[27,3]
method does not override or implement a method from a supertype
[ERROR] <https://builds.apache.org/job/CarbonData-master/ws/hadoop/src/main/java/org/apache/carbondata/hadoop/api/CarbonTableInputFormat.java>:[61,12]
an enum switch case label must be the unqualified name of an enumeration constant
[INFO] 3 errors 
[INFO] -------------------------------------------------------------
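The first two errors report the same root cause: the "add interface" change gave IndexLoader an abstract load(Segment) method that InMemoryBTreeIndexLoader does not implement, so the class is rejected and its stale @Override-annotated method no longer matches any supertype method. A minimal self-contained sketch of the contract javac is enforcing — the type names come from the log, but the method signature, return type, and Segment body here are stand-ins, not the actual CarbonData source:

```java
// Stand-in for org.apache.carbondata.hadoop.internal.segment.Segment.
interface Segment {
    String id();
}

// Stand-in for org.apache.carbondata.hadoop.internal.index.IndexLoader:
// once this abstract method exists, every concrete implementor must override it.
interface IndexLoader {
    Object load(Segment segment);
}

// Without the load(Segment) override below, javac reports exactly the two
// errors in the log: "is not abstract and does not override abstract method
// load(...)" on the class, and "method does not override or implement a
// method from a supertype" on whatever stale @Override method remains.
public class InMemoryBTreeIndexLoader implements IndexLoader {
    @Override
    public Object load(Segment segment) {
        return "index-for-" + segment.id();
    }

    public static void main(String[] args) {
        IndexLoader loader = new InMemoryBTreeIndexLoader();
        // Segment has a single abstract method, so a lambda can stand in here.
        System.out.println(loader.load(() -> "Segment_0")); // index-for-Segment_0
    }
}
```

The fix on the real class is the same shape: implement the new load(Segment) overload (or declare the class abstract), and delete or re-target the method whose @Override no longer resolves.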
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache CarbonData :: Parent ........................ SUCCESS [  3.417 s]
[INFO] Apache CarbonData :: Common ........................ SUCCESS [  9.466 s]
[INFO] Apache CarbonData :: Core .......................... SUCCESS [ 39.723 s]
[INFO] Apache CarbonData :: Processing .................... SUCCESS [ 12.799 s]
[INFO] Apache CarbonData :: Hadoop ........................ FAILURE [  3.055 s]
[INFO] Apache CarbonData :: Spark ......................... SKIPPED
[INFO] Apache CarbonData :: Assembly ...................... SKIPPED
[INFO] Apache CarbonData :: Examples ...................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:15 min
[INFO] Finished at: 2016-11-04T19:13:22+00:00
[INFO] Final Memory: 62M/779M
[INFO] ------------------------------------------------------------------------
Waiting for Jenkins to finish collecting data
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.2:compile
(default-compile) on project carbondata-hadoop: Compilation failure: Compilation failure:
[ERROR] <https://builds.apache.org/job/CarbonData-master/ws/hadoop/src/main/java/org/apache/carbondata/hadoop/internal/index/impl/InMemoryBTreeIndexLoader.java>:[26,8]
org.apache.carbondata.hadoop.internal.index.impl.InMemoryBTreeIndexLoader is not abstract
and does not override abstract method load(org.apache.carbondata.hadoop.internal.segment.Segment)
in org.apache.carbondata.hadoop.internal.index.IndexLoader
[ERROR] <https://builds.apache.org/job/CarbonData-master/ws/hadoop/src/main/java/org/apache/carbondata/hadoop/internal/index/impl/InMemoryBTreeIndexLoader.java>:[27,3]
method does not override or implement a method from a supertype
[ERROR] <https://builds.apache.org/job/CarbonData-master/ws/hadoop/src/main/java/org/apache/carbondata/hadoop/api/CarbonTableInputFormat.java>:[61,12]
an enum switch case label must be the unqualified name of an enumeration constant
[ERROR] -> [Help 1]
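The third error, at CarbonTableInputFormat.java:[61,12], is javac's complaint when a switch case label qualifies an enum constant with its type name; the JLS requires the bare constant. A hedged sketch of the rule — the enum and method names below are illustrative, not taken from the CarbonData source:

```java
public class EnumSwitchExample {
    // Illustrative enum; not the actual type switched on in CarbonTableInputFormat.
    enum SegmentStatus { SUCCESS, FAILED }

    static String describe(SegmentStatus status) {
        switch (status) {
            // Writing "case SegmentStatus.SUCCESS:" here is what triggers
            // "an enum switch case label must be the unqualified name of an
            // enumeration constant" — inside the switch, the label must be bare.
            case SUCCESS:
                return "segment loaded";
            case FAILED:
                return "segment load failed";
            default:
                return "unknown";
        }
    }

    public static void main(String[] args) {
        System.out.println(describe(SegmentStatus.SUCCESS)); // segment loaded
    }
}
```

So the fix at line 61 is mechanical: drop the enum type qualifier from the case label (and make sure the switch expression's static type is that enum, since switching on a different type produces the same diagnostic).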
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.2:compile
(default-compile) on project carbondata-hadoop: Compilation failure
	at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:212)
	at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
	at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
	at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:116)
	at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:80)
	at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build(SingleThreadedBuilder.java:51)
	at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:128)
	at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:307)
	at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:193)
	at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:106)
	at org.jvnet.hudson.maven3.launcher.Maven32Launcher.main(Maven32Launcher.java:132)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.codehaus.plexus.classworlds.launcher.Launcher.launchStandard(Launcher.java:330)
	at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:238)
	at jenkins.maven3.agent.Maven32Main.launch(Maven32Main.java:186)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at hudson.maven.Maven3Builder.call(Maven3Builder.java:136)
	at hudson.maven.Maven3Builder.call(Maven3Builder.java:71)
	at hudson.remoting.UserRequest.perform(UserRequest.java:153)
	at hudson.remoting.UserRequest.perform(UserRequest.java:50)
	at hudson.remoting.Request$2.run(Request.java:332)
	at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:68)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.maven.plugin.compiler.CompilationFailureException: Compilation failure
	at org.apache.maven.plugin.compiler.AbstractCompilerMojo.execute(AbstractCompilerMojo.java:909)
	at org.apache.maven.plugin.compiler.CompilerMojo.execute(CompilerMojo.java:129)
	at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:134)
	at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:208)
	... 31 more
[ERROR] 
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following
articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :carbondata-hadoop
[JENKINS] Archiving <https://builds.apache.org/job/CarbonData-master/ws/processing/pom.xml>
to org.apache.carbondata/carbondata-processing/0.2.0-incubating-SNAPSHOT/carbondata-processing-0.2.0-incubating-SNAPSHOT.pom
[JENKINS] Archiving <https://builds.apache.org/job/CarbonData-master/ws/processing/target/carbondata-processing-0.2.0-incubating-SNAPSHOT.jar>
to org.apache.carbondata/carbondata-processing/0.2.0-incubating-SNAPSHOT/carbondata-processing-0.2.0-incubating-SNAPSHOT.jar
[JENKINS] Archiving <https://builds.apache.org/job/CarbonData-master/ws/assembly/pom.xml>
to org.apache.carbondata/carbondata-assembly/0.2.0-incubating-SNAPSHOT/carbondata-assembly-0.2.0-incubating-SNAPSHOT.pom
[JENKINS] Archiving <https://builds.apache.org/job/CarbonData-master/ws/pom.xml> to
org.apache.carbondata/carbondata-parent/0.2.0-incubating-SNAPSHOT/carbondata-parent-0.2.0-incubating-SNAPSHOT.pom
[JENKINS] Archiving <https://builds.apache.org/job/CarbonData-master/ws/core/pom.xml>
to org.apache.carbondata/carbondata-core/0.2.0-incubating-SNAPSHOT/carbondata-core-0.2.0-incubating-SNAPSHOT.pom
[JENKINS] Archiving <https://builds.apache.org/job/CarbonData-master/ws/core/target/carbondata-core-0.2.0-incubating-SNAPSHOT.jar>
to org.apache.carbondata/carbondata-core/0.2.0-incubating-SNAPSHOT/carbondata-core-0.2.0-incubating-SNAPSHOT.jar
[JENKINS] Archiving <https://builds.apache.org/job/CarbonData-master/ws/hadoop/pom.xml>
to org.apache.carbondata/carbondata-hadoop/0.2.0-incubating-SNAPSHOT/carbondata-hadoop-0.2.0-incubating-SNAPSHOT.pom
[JENKINS] Archiving <https://builds.apache.org/job/CarbonData-master/ws/examples/pom.xml>
to org.apache.carbondata/carbondata-examples/0.2.0-incubating-SNAPSHOT/carbondata-examples-0.2.0-incubating-SNAPSHOT.pom
[JENKINS] Archiving <https://builds.apache.org/job/CarbonData-master/ws/common/pom.xml>
to org.apache.carbondata/carbondata-common/0.2.0-incubating-SNAPSHOT/carbondata-common-0.2.0-incubating-SNAPSHOT.pom
[JENKINS] Archiving <https://builds.apache.org/job/CarbonData-master/ws/common/target/carbondata-common-0.2.0-incubating-SNAPSHOT.jar>
to org.apache.carbondata/carbondata-common/0.2.0-incubating-SNAPSHOT/carbondata-common-0.2.0-incubating-SNAPSHOT.jar
[JENKINS] Archiving <https://builds.apache.org/job/CarbonData-master/ws/integration/spark/pom.xml>
to org.apache.carbondata/carbondata-spark/0.2.0-incubating-SNAPSHOT/carbondata-spark-0.2.0-incubating-SNAPSHOT.pom
Sending e-mails to: commits@carbondata.incubator.apache.org
channel stopped
