Date: Fri, 2 Jan 2015 23:34:35 +0000 (UTC)
From: "Hive QA (JIRA)"
To: hive-dev@hadoop.apache.org
Reply-To: dev@hive.apache.org
Subject: [jira] [Commented] (HIVE-9244) Upgrade 0.23 hadoop-shims to latest stable hadoop-2.6.0

[ https://issues.apache.org/jira/browse/HIVE-9244?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14263312#comment-14263312 ]

Hive QA commented on HIVE-9244:
-------------------------------

{color:red}Overall{color}: -1 no tests executed

Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12689887/HIVE-9244.1.patch

Test results: http://ec2-174-129-184-35.compute-1.amazonaws.com/jenkins/job/PreCommit-HIVE-TRUNK-Build/2237/testReport
Console output: http://ec2-174-129-184-35.compute-1.amazonaws.com/jenkins/job/PreCommit-HIVE-TRUNK-Build/2237/console
Test logs: http://ec2-174-129-184-35.compute-1.amazonaws.com/logs/PreCommit-HIVE-TRUNK-Build-2237/

Messages:
{noformat}
**** This message was trimmed, see log for full details ****
Query Files Regex: 
Generated /data/hive-ptest/working/apache-svn-trunk-source/itests/qtest/target/generated-test-sources/java/org/apache/hadoop/hive/cli/TestCompareCliDriver.java from template TestCompareCliDriver.vm
Template Path:/data/hive-ptest/working/apache-svn-trunk-source/ql/src/test/templates
Starting Generation of: TestMinimrCliDriver
Include Files: auto_sortmerge_join_16.q,bucket4.q,bucket5.q,bucket6.q,bucket_num_reducers.q,bucket_num_reducers2.q,bucketizedhiveinputformat.q,bucketmapjoin6.q,bucketmapjoin7.q,constprog_partitioner.q,disable_merge_for_bucketing.q,empty_dir_in_table.q,external_table_with_space_in_location_path.q,file_with_header_footer.q,groupby2.q,import_exported_table.q,index_bitmap3.q,index_bitmap_auto.q,infer_bucket_sort_bucketed_table.q,infer_bucket_sort_dyn_part.q,infer_bucket_sort_map_operators.q,infer_bucket_sort_merge.q,infer_bucket_sort_num_buckets.q,infer_bucket_sort_reducers_power_two.q,input16_cc.q,join1.q,leftsemijoin_mr.q,list_bucket_dml_10.q,load_fs2.q,load_hdfs_file_with_space_in_the_name.q,optrstat_groupby.q,parallel_orderby.q,ql_rewrite_gbtoidx.q,ql_rewrite_gbtoidx_cbo_1.q,ql_rewrite_gbtoidx_cbo_2.q,quotedid_smb.q,reduce_deduplicate.q,remote_script.q,root_dir_external_table.q,schemeAuthority.q,schemeAuthority2.q,scriptfile1.q,scriptfile1_win.q,smb_mapjoin_8.q,stats_counter.q,stats_counter_partitioned.q,temp_table_external.q,truncate_column_buckets.q,uber_reduce.q,udf_using.q
Excluded Files: null
Query Files: 
Query Files Regex: 
Generated /data/hive-ptest/working/apache-svn-trunk-source/itests/qtest/target/generated-test-sources/java/org/apache/hadoop/hive/cli/TestMinimrCliDriver.java from template TestCliDriver.vm
Template Path:/data/hive-ptest/working/apache-svn-trunk-source/ql/src/test/templates
Starting Generation of: TestMiniTezCliDriver
Include Files: bucket_map_join_tez1.q,bucket_map_join_tez2.q,dynamic_partition_pruning.q,dynamic_partition_pruning_2.q,mapjoin_decimal.q,lvj_mapjoin.q, mrr.q,tez_bmj_schema_evolution.q,tez_dml.q,tez_fsstat.q,tez_insert_overwrite_local_directory_1.q,tez_join_hash.q,tez_join_tests.q,tez_joins_explain.q,tez_schema_evolution.q,tez_union.q,tez_union2.q,tez_union_decimal.q,tez_union_group_by.q,tez_smb_main.q,tez_smb_1.q,vectorized_dynamic_partition_pruning.q,tez_multi_union.q,alter_merge_2_orc.q,alter_merge_orc.q,alter_merge_stats_orc.q,auto_join0.q,auto_join1.q,bucket2.q,bucket3.q,bucket4.q,cbo_gby.q,cbo_gby_empty.q,cbo_join.q,cbo_limit.q,cbo_semijoin.q,cbo_simple_select.q,cbo_stats.q,cbo_subq_exists.q,cbo_subq_in.q,cbo_subq_not_in.q,cbo_udf_udaf.q,cbo_union.q,cbo_views.q,cbo_windowing.q,correlationoptimizer1.q,count.q,create_merge_compressed.q,cross_join.q,cross_product_check_1.q,cross_product_check_2.q,ctas.q,custom_input_output_format.q,delete_all_non_partitioned.q,delete_all_partitioned.q,delete_orig_table.q,delete_tmp_table.q,delete_where_no_match.q,delete_where_non_partitioned.q,delete_where_partitioned.q,delete_whole_partition.q,disable_merge_for_bucketing.q,dynpart_sort_opt_vectorization.q,dynpart_sort_optimization.q,dynpart_sort_optimization2.q,enforce_order.q,filter_join_breaktask.q,filter_join_breaktask2.q,groupby1.q,groupby2.q,groupby3.q,having.q,identity_project_remove_skip.q,insert1.q,insert_into1.q,insert_into2.q,insert_orig_table.q,insert_values_dynamic_partitioned.q,insert_values_non_partitioned.q,insert_values_orig_table.q,insert_values_partitioned.q,insert_values_tmp_table.q,insert_update_delete.q,join0.q,join1.q,join_nullsafe.q,leftsemijoin.q,limit_pushdown.q,load_dyn_part1.q,load_dyn_part2.q,load_dyn_part3.q,mapjoin_mapjoin.q,mapreduce1.q,mapreduce2.q,merge1.q,merge2.q,metadataonly1.q,metadata_only_queries.q,optimize_nullscan.q,orc_analyze.q,orc_merge1.q,orc_merge2.q,orc_merge3.q,orc_merge4.q,orc_merge5.q,orc_merge6.q,orc_merge7.q,orc_merge_incompat1.q,orc_merge_incompat2.q,orc_vectorization_ppd.q,parallel.q,ptf.q,sample1.q,script_env_var1.q,script_env_var2.q,script_pipe.q,scriptfile1.q,select_dummy_source.q,skewjoin.q,stats_counter.q,stats_counter_partitioned.q,stats_noscan_1.q,subquery_exists.q,subquery_in.q,temp_table.q,transform1.q,transform2.q,transform_ppr1.q,transform_ppr2.q,union2.q,union3.q,union4.q,union5.q,union6.q,union7.q,union8.q,union9.q,update_after_multiple_inserts.q,update_all_non_partitioned.q,update_all_partitioned.q,update_all_types.q,update_orig_table.q,update_tmp_table.q,update_where_no_match.q,update_where_non_partitioned.q,update_where_partitioned.q,update_two_cols.q,vector_between_in.q,vector_bucket.q,vector_cast_constant.q,vector_char_4.q,vector_char_simple.q,vector_coalesce.q,vector_coalesce_2.q,vector_count_distinct.q,vector_data_types.q,vector_decimal_1.q,
vector_decimal_10_0.q,vector_decimal_2.q,vector_decimal_3.q,vector_decimal_4.q,vector_decimal_5.q,vector_decimal_6.q,vector_decimal_aggregate.q,vector_decimal_cast.q,vector_decimal_expressions.q,vector_decimal_mapjoin.q,vector_decimal_math_funcs.q,vector_decimal_precision.q,vector_decimal_trailing.q,vector_decimal_udf.q,vector_decimal_udf2.q,vector_distinct_2.q,vector_elt.q,vector_groupby_3.q,vector_groupby_reduce.q,vector_left_outer_join.q,vector_mapjoin_reduce.q,vector_non_string_partition.q,vector_orderby_5.q,vector_partition_diff_num_cols.q,vector_partitioned_date_time.q,vector_reduce_groupby_decimal.q,vector_string_concat.q,vector_varchar_4.q,vector_varchar_simple.q,vectorization_0.q,vectorization_1.q,vectorization_10.q,vectorization_11.q,vectorization_12.q,vectorization_13.q,vectorization_14.q,vectorization_15.q,vectorization_16.q,vectorization_2.q,vectorization_3.q,vectorization_4.q,vectorization_5.q,vectorization_6.q,vectorization_7.q,vectorization_8.q,vectorization_9.q,vectorization_decimal_date.q,vectorization_div0.q,vectorization_limit.q,vectorization_nested_udf.q,vectorization_not.q,vectorization_part.q,vectorization_part_project.q,vectorization_pushdown.q,vectorization_short_regress.q,vectorized_bucketmapjoin1.q,vectorized_case.q,vectorized_casts.q,vectorized_context.q,vectorized_date_funcs.q,vectorized_distinct_gby.q,vectorized_mapjoin.q,vectorized_math_funcs.q,vectorized_nested_mapjoin.q,vectorized_parquet.q,vectorized_ptf.q,vectorized_rcfile_columnar.q,vectorized_shufflejoin.q,vectorized_string_funcs.q,vectorized_timestamp_funcs.q,auto_sortmerge_join_1.q,auto_sortmerge_join_10.q,auto_sortmerge_join_11.q,auto_sortmerge_join_12.q,auto_sortmerge_join_13.q,auto_sortmerge_join_14.q,auto_sortmerge_join_15.q,auto_sortmerge_join_16.q,auto_sortmerge_join_2.q,auto_sortmerge_join_3.q,auto_sortmerge_join_4.q,auto_sortmerge_join_5.q,auto_sortmerge_join_7.q,auto_sortmerge_join_8.q,auto_sortmerge_join_9.q
Excluded Files: null
Query Files: 
Query Files Regex: 
Generated /data/hive-ptest/working/apache-svn-trunk-source/itests/qtest/target/generated-test-sources/java/org/apache/hadoop/hive/cli/TestMiniTezCliDriver.java from template TestCliDriver.vm
Template Path:/data/hive-ptest/working/apache-svn-trunk-source/ql/src/test/templates
Starting Generation of: TestNegativeMinimrCliDriver
Include Files: cluster_tasklog_retrieval.q,file_with_header_footer_negative.q,local_mapred_error_cache.q,mapreduce_stack_trace.q,mapreduce_stack_trace_hadoop20.q,mapreduce_stack_trace_turnoff.q,mapreduce_stack_trace_turnoff_hadoop20.q,minimr_broken_pipe.q,udf_local_resource.q
Excluded Files: null
Query Files: 
Query Files Regex: 
Generated /data/hive-ptest/working/apache-svn-trunk-source/itests/qtest/target/generated-test-sources/java/org/apache/hadoop/hive/cli/TestNegativeMinimrCliDriver.java from template TestNegativeCliDriver.vm
Template Path:/data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/src/test/templates
Starting Generation of: TestHBaseCliDriver
Include Files: null
Excluded Files: null
Query Files: 
Query Files Regex: null
Generated /data/hive-ptest/working/apache-svn-trunk-source/itests/qtest/target/generated-test-sources/java/org/apache/hadoop/hive/cli/TestHBaseCliDriver.java from template TestHBaseCliDriver.vm
Template Path:/data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/src/test/templates
Starting Generation of: TestHBaseMinimrCliDriver
Include Files: null
Excluded Files: null
Query Files: hbase_bulk.m
Query Files Regex: null
Generated /data/hive-ptest/working/apache-svn-trunk-source/itests/qtest/target/generated-test-sources/java/org/apache/hadoop/hive/cli/TestHBaseMinimrCliDriver.java from template TestHBaseCliDriver.vm
Template Path:/data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/src/test/templates
Starting Generation of: TestHBaseNegativeCliDriver
Include Files: null
Excluded Files: null
Query Files: 
Query Files Regex: null
Generated /data/hive-ptest/working/apache-svn-trunk-source/itests/qtest/target/generated-test-sources/java/org/apache/hadoop/hive/cli/TestHBaseNegativeCliDriver.java from template TestHBaseNegativeCliDriver.vm
Template Path:/data/hive-ptest/working/apache-svn-trunk-source/accumulo-handler/src/test/templates
Starting Generation of: TestAccumuloCliDriver
Include Files: null
Excluded Files: null
Query Files: 
Query Files Regex: null
Generated /data/hive-ptest/working/apache-svn-trunk-source/itests/qtest/target/generated-test-sources/java/org/apache/hadoop/hive/cli/TestAccumuloCliDriver.java from template TestAccumuloCliDriver.vm
Template Path:/data/hive-ptest/working/apache-svn-trunk-source/ql/src/test/templates
Starting Generation of: TestContribCliDriver
Include Files: null
Excluded Files: null
Query Files: 
Query Files Regex: 
Generated /data/hive-ptest/working/apache-svn-trunk-source/itests/qtest/target/generated-test-sources/java/org/apache/hadoop/hive/cli/TestContribCliDriver.java from template TestCliDriver.vm
Template Path:/data/hive-ptest/working/apache-svn-trunk-source/ql/src/test/templates
Starting Generation of: TestContribNegativeCliDriver
Include Files: null
Excluded Files: null
Query Files: 
Query Files Regex: 
Generated /data/hive-ptest/working/apache-svn-trunk-source/itests/qtest/target/generated-test-sources/java/org/apache/hadoop/hive/cli/TestContribNegativeCliDriver.java from template TestNegativeCliDriver.vm
[INFO] Executed tasks
[INFO] 
[INFO] --- build-helper-maven-plugin:1.8:add-test-source (add-test-sources) @ hive-it-qfile ---
[INFO] Test Source directory: /data/hive-ptest/working/apache-svn-trunk-source/itests/qtest/target/generated-test-sources/java added.
[INFO] 
[INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ hive-it-qfile ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/itests/qtest/src/test/resources
[INFO] Copying 3 resources
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-it-qfile ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/itests/qtest/target/tmp
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/itests/qtest/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/itests/qtest/target/tmp/conf
     [copy] Copying 8 files to /data/hive-ptest/working/apache-svn-trunk-source/itests/qtest/target/tmp/conf
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-it-qfile ---
[INFO] Compiling 14 source files to /data/hive-ptest/working/apache-svn-trunk-source/itests/qtest/target/test-classes
[INFO] 
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-it-qfile ---
[INFO] Tests are skipped.
[INFO] 
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-it-qfile ---
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/itests/qtest/target/hive-it-qfile-0.15.0-SNAPSHOT.jar
[INFO] 
[INFO] --- maven-site-plugin:3.3:attach-descriptor (attach-descriptor) @ hive-it-qfile ---
[INFO] 
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-it-qfile ---
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/itests/qtest/target/hive-it-qfile-0.15.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-it-qfile/0.15.0-SNAPSHOT/hive-it-qfile-0.15.0-SNAPSHOT.jar
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/itests/qtest/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-it-qfile/0.15.0-SNAPSHOT/hive-it-qfile-0.15.0-SNAPSHOT.pom
[INFO] 
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Integration - Unit Tests - Hadoop 2 0.15.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-it-unit-hadoop2 ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/itests/hive-unit-hadoop2 (includes = [datanucleus.log, derby.log], excludes = [])
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (enforce-no-snapshots) @ hive-it-unit-hadoop2 ---
[INFO] 
[INFO] --- maven-remote-resources-plugin:1.5:process (default) @ hive-it-unit-hadoop2 ---
[INFO] 
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ hive-it-unit-hadoop2 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/itests/hive-unit-hadoop2/src/main/resources
[INFO] Copying 3 resources
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-it-unit-hadoop2 ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-it-unit-hadoop2 ---
[INFO] No sources to compile
[INFO] 
[INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ hive-it-unit-hadoop2 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/itests/hive-unit-hadoop2/src/test/resources
[INFO] Copying 3 resources
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-it-unit-hadoop2 ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/itests/hive-unit-hadoop2/target/tmp
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/itests/hive-unit-hadoop2/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/itests/hive-unit-hadoop2/target/tmp/conf
     [copy] Copying 8 files to /data/hive-ptest/working/apache-svn-trunk-source/itests/hive-unit-hadoop2/target/tmp/conf
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (setup-metastore-scripts) @ hive-it-unit-hadoop2 ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/itests/hive-unit-hadoop2/target/tmp/scripts/metastore
     [copy] Copying 192 files to /data/hive-ptest/working/apache-svn-trunk-source/itests/hive-unit-hadoop2/target/tmp/scripts/metastore
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-it-unit-hadoop2 ---
[INFO] Compiling 5 source files to /data/hive-ptest/working/apache-svn-trunk-source/itests/hive-unit-hadoop2/target/test-classes
[INFO] -------------------------------------------------------------
[WARNING] COMPILATION WARNING : 
[INFO] -------------------------------------------------------------
[WARNING] /data/hive-ptest/working/apache-svn-trunk-source/itests/hive-unit-hadoop2/src/test/java/org/apache/hadoop/hive/thrift/TestHadoop20SAuthBridge.java: /data/hive-ptest/working/apache-svn-trunk-source/itests/hive-unit-hadoop2/src/test/java/org/apache/hadoop/hive/thrift/TestHadoop20SAuthBridge.java uses or overrides a deprecated API.
[WARNING] /data/hive-ptest/working/apache-svn-trunk-source/itests/hive-unit-hadoop2/src/test/java/org/apache/hadoop/hive/thrift/TestHadoop20SAuthBridge.java: Recompile with -Xlint:deprecation for details.
[WARNING] /data/hive-ptest/working/apache-svn-trunk-source/itests/hive-unit-hadoop2/src/test/java/org/apache/hadoop/hive/ql/security/TestPasswordWithCredentialProvider.java: /data/hive-ptest/working/apache-svn-trunk-source/itests/hive-unit-hadoop2/src/test/java/org/apache/hadoop/hive/ql/security/TestPasswordWithCredentialProvider.java uses unchecked or unsafe operations.
[WARNING] /data/hive-ptest/working/apache-svn-trunk-source/itests/hive-unit-hadoop2/src/test/java/org/apache/hadoop/hive/ql/security/TestPasswordWithCredentialProvider.java: Recompile with -Xlint:unchecked for details.
[INFO] 4 warnings 
[INFO] -------------------------------------------------------------
[INFO] -------------------------------------------------------------
[ERROR] COMPILATION ERROR : 
[INFO] -------------------------------------------------------------
[ERROR] /data/hive-ptest/working/apache-svn-trunk-source/itests/hive-unit-hadoop2/src/test/java/org/apache/hadoop/hive/thrift/TestHadoop20SAuthBridge.java:[134,49] non-static method getProxySuperuserIpConfKey(java.lang.String) cannot be referenced from a static context
[ERROR] /data/hive-ptest/working/apache-svn-trunk-source/itests/hive-unit-hadoop2/src/test/java/org/apache/hadoop/hive/thrift/TestHadoop20SAuthBridge.java:[297,35] non-static method getProxySuperuserGroupConfKey(java.lang.String) cannot be referenced from a static context
[INFO] 2 errors 
[INFO] -------------------------------------------------------------
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Hive Integration - Parent ......................... SUCCESS [8.686s]
[INFO] Hive Integration - Custom Serde ................... SUCCESS [14.259s]
[INFO] Hive Integration - HCatalog Unit Tests ............ SUCCESS [17.052s]
[INFO] Hive Integration - Testing Utilities .............. SUCCESS [15.452s]
[INFO] Hive Integration - Unit Tests ..................... SUCCESS [14.274s]
[INFO] Hive Integration - Test Serde ..................... SUCCESS [1.458s]
[INFO] Hive Integration - QFile Tests .................... SUCCESS [9.111s]
[INFO] Hive Integration - Unit Tests - Hadoop 2 .......... FAILURE [3.880s]
[INFO] Hive Integration - Unit Tests with miniKdc ........ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:28.140s
[INFO] Finished at: Fri Jan 02 18:32:57 EST 2015
[INFO] Final Memory: 77M/217M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.1:testCompile (default-testCompile) on project hive-it-unit-hadoop2: Compilation failure: Compilation failure:
[ERROR] /data/hive-ptest/working/apache-svn-trunk-source/itests/hive-unit-hadoop2/src/test/java/org/apache/hadoop/hive/thrift/TestHadoop20SAuthBridge.java:[134,49] non-static method getProxySuperuserIpConfKey(java.lang.String) cannot be referenced from a static context
[ERROR] /data/hive-ptest/working/apache-svn-trunk-source/itests/hive-unit-hadoop2/src/test/java/org/apache/hadoop/hive/thrift/TestHadoop20SAuthBridge.java:[297,35] non-static method getProxySuperuserGroupConfKey(java.lang.String) cannot be referenced from a static context
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hive-it-unit-hadoop2
+ exit 1
'
{noformat}

This message is automatically generated.
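The -1 above is a build break rather than a test failure: in hadoop-2.6.0, getProxySuperuserIpConfKey(String) and getProxySuperuserGroupConfKey(String) are instance methods, while TestHadoop20SAuthBridge still invokes them as if they were static. A minimal sketch of that failure mode, using hypothetical stand-in classes rather than the actual Hadoop or Hive sources:

{code:java}
// Stand-in for a provider whose key-builder methods became instance
// methods in a library upgrade (names and bodies are illustrative).
class ImpersonationProvider {
    String getProxySuperuserIpConfKey(String superUser) {
        return "hadoop.proxyuser." + superUser + ".hosts";
    }
    String getProxySuperuserGroupConfKey(String superUser) {
        return "hadoop.proxyuser." + superUser + ".groups";
    }
}

class AuthBridgeCallSite {   // stand-in for the failing test class
    static String ipKey(String superUser) {
        // Old call site, valid only while the method was static:
        //   return ImpersonationProvider.getProxySuperuserIpConfKey(superUser);
        // javac now rejects it: "non-static method getProxySuperuserIpConfKey
        // (java.lang.String) cannot be referenced from a static context".
        // Fix: call through an instance instead.
        return new ImpersonationProvider().getProxySuperuserIpConfKey(superUser);
    }

    public static void main(String[] args) {
        System.out.println(ipKey("hive"));   // prints hadoop.proxyuser.hive.hosts
    }
}
{code}

In Hadoop 2.6 itself the instance is usually obtained from the provider API rather than constructed ad hoc (test code commonly goes through DefaultImpersonationProvider.getTestProvider()), but that detail comes from the Hadoop code base, not from this report, and the right call site for the patch is a judgment the fix has to make.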
ATTACHMENT ID: 12689887 - PreCommit-HIVE-TRUNK-Build

> Upgrade 0.23 hadoop-shims to latest stable hadoop-2.6.0
> --------------------------------------------------------
>
>                 Key: HIVE-9244
>                 URL: https://issues.apache.org/jira/browse/HIVE-9244
>             Project: Hive
>          Issue Type: Improvement
>          Components: Shims
>    Affects Versions: 0.15.0
>            Reporter: Gopal V
>            Assignee: Gopal V
>             Fix For: 0.15.0
>
>         Attachments: HIVE-9244.1.patch
>
>
> Upgrade hadoop version in 0.23 shims to latest stable version.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
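For context, the upgrade the issue asks for is typically a one-line change to the Hadoop version that the 0.23 shims build against. A minimal sketch, assuming the version is driven by a hadoop-23.version property in Hive's root pom.xml (the property name follows the 0.23 shim naming convention, and the previous value shown is illustrative, not taken from the patch):

{code:xml}
<!-- Hypothetical excerpt from Hive's root pom.xml. -->
<properties>
  <!-- was, e.g.: <hadoop-23.version>2.5.0</hadoop-23.version> -->
  <hadoop-23.version>2.6.0</hadoop-23.version>
</properties>
{code}

As the build output above shows, the version bump alone does not compile cleanly: test code such as TestHadoop20SAuthBridge still calls methods whose signatures changed between Hadoop releases, so those call sites have to be updated in the same patch.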