From: Fabian Hueske
Date: Tue, 8 Nov 2016 11:11:19 +0100
Subject: Re:
Kinesis Connector Dependency Problems
To: user@flink.apache.org

Hi,

I encountered this issue before as well.
Which Maven version are you using? Maven 3.3.x does not properly shade dependencies. You have to use Maven 3.0.3 (see [1]).

Best, Fabian

[1] https://ci.apache.org/projects/flink/flink-docs-release-1.1/setup/building.html

2016-11-08 11:05 GMT+01:00 Till Rohrmann:

> Yes this definitely looks like a similar issue. Once we shade the aws
> dependencies in the Kinesis connector, the problem should be (hopefully)
> resolved. I've added your problem description to the JIRA. Thanks for
> reporting it.
>
> Cheers,
> Till
>
> On Mon, Nov 7, 2016 at 8:01 PM, Foster, Craig wrote:
>
>> I think this is a similar issue, but it was brought to my attention that
>> we're also seeing this on EMR 5.1.0 with the FlinkKinesisConsumer. What I
>> did to duplicate this issue:
>>
>> 1) I used the WikiEdit quickstart but used Kinesis instead of
>> Kafka to publish results with a FlinkKinesisProducer. This works fine. I
>> can use a separate script to read what was published to my stream.
>> 2) When using a FlinkKinesisConsumer, however, I get an error:
>>
>> java.lang.NoSuchMethodError: org.apache.http.params.HttpConnectionParams.setSoKeepalive(Lorg/apache/http/params/HttpParams;Z)V
>>     at com.amazonaws.http.HttpClientFactory.createHttpClient(HttpClientFactory.java:96)
>>     at com.amazonaws.http.AmazonHttpClient.<init>(AmazonHttpClient.java:187)
>>     at com.amazonaws.AmazonWebServiceClient.<init>(AmazonWebServiceClient.java:136)
>>     at com.amazonaws.services.kinesis.AmazonKinesisClient.<init>(AmazonKinesisClient.java:221)
>>     at com.amazonaws.services.kinesis.AmazonKinesisClient.<init>(AmazonKinesisClient.java:197)
>>     at org.apache.flink.streaming.connectors.kinesis.util.AWSUtil.createKinesisClient(AWSUtil.java:56)
>>     at org.apache.flink.streaming.connectors.kinesis.proxy.KinesisProxy.<init>(KinesisProxy.java:118)
>>     at org.apache.flink.streaming.connectors.kinesis.proxy.KinesisProxy.create(KinesisProxy.java:176)
>>     at org.apache.flink.streaming.connectors.kinesis.internals.KinesisDataFetcher.<init>(KinesisDataFetcher.java:188)
>>     at org.apache.flink.streaming.connectors.kinesis.FlinkKinesisConsumer.run(FlinkKinesisConsumer.java:198)
>>     at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:80)
>>     at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:53)
>>     at org.apache.flink.streaming.runtime.tasks.SourceStreamTask.run(SourceStreamTask.java:56)
>>     at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:266)
>>     at org.apache.flink.runtime.taskmanager.Task.run(Task.java:585)
>>     at java.lang.Thread.run(Thread.java:745)
>>
>> *From: *Robert Metzger
>> *Reply-To: *"user@flink.apache.org"
>> *Date: *Friday, November 4, 2016 at 2:57 AM
>> *To: *"user@flink.apache.org"
>> *Subject: *Re: Kinesis Connector Dependency Problems
>>
>> Thank you for helping to investigate the issue. I've filed an issue in
>> our bugtracker: https://issues.apache.org/jira/browse/FLINK-5013
>>
>> On Wed, Nov 2, 2016 at 10:09 PM, Justin Yan wrote:
>>
>> Sorry it took me a little while, but I'm happy to report back that it
>> seems to be working properly with EMR 4.8. It seems so obvious in
>> retrospect... thanks again for the assistance!
>>
>> Cheers,
>>
>> Justin
>>
>> On Tue, Nov 1, 2016 at 11:44 AM, Robert Metzger wrote:
>>
>> Hi Justin,
>>
>> thank you for sharing the classpath of the Flink container with us. It
>> contains what Till was already expecting: an older version of the AWS SDK.
>>
>> If you have some spare time, could you quickly try to run your program
>> with a newer EMR version, just to validate our suspicion?
>> If the error doesn't occur on a more recent EMR version, then we know why
>> it's happening.
>>
>> We'll then probably need to shade (relocate) the Kinesis code to make it
>> work with older EMR libraries.
>>
>> Regards,
>> Robert
>>
>> On Tue, Nov 1, 2016 at 6:27 PM, Justin Yan wrote:
>>
>> Hi there,
>>
>> We're using EMR 4.4.0 -> I suppose this is a bit old, and I can migrate
>> forward if you think that would be best.
>>
>> I've appended the classpath that the Flink cluster was started with at
>> the end of this email (with a slight improvement to the formatting to make
>> it readable).
>>
>> Willing to poke around or fiddle with this as necessary - thanks very
>> much for the help!
>> >> >> >> Justin >> >> >> >> Task Manager's classpath from logs: >> >> >> >> lib/flink-dist_2.11-1.1.3.jar >> >> lib/flink-python_2.11-1.1.3.jar >> >> lib/log4j-1.2.17.jar >> >> lib/slf4j-log4j12-1.7.7.jar >> >> logback.xml >> >> log4j.properties >> >> flink.jar >> >> flink-conf.yaml >> >> /etc/hadoop/conf >> >> /usr/lib/hadoop/hadoop-annotations-2.7.1-amzn-1.jar >> >> /usr/lib/hadoop/hadoop-extras.jar >> >> /usr/lib/hadoop/hadoop-archives-2.7.1-amzn-1.jar >> >> /usr/lib/hadoop/hadoop-aws-2.7.1-amzn-1.jar >> >> /usr/lib/hadoop/hadoop-sls-2.7.1-amzn-1.jar >> >> /usr/lib/hadoop/hadoop-auth-2.7.1-amzn-1.jar >> >> /usr/lib/hadoop/hadoop-sls.jar >> >> /usr/lib/hadoop/hadoop-gridmix.jar >> >> /usr/lib/hadoop/hadoop-auth.jar >> >> /usr/lib/hadoop/hadoop-gridmix-2.7.1-amzn-1.jar >> >> /usr/lib/hadoop/hadoop-rumen.jar >> >> /usr/lib/hadoop/hadoop-azure-2.7.1-amzn-1.jar >> >> /usr/lib/hadoop/hadoop-common-2.7.1-amzn-1.jar >> >> /usr/lib/hadoop/hadoop-azure.jar >> >> /usr/lib/hadoop/hadoop-datajoin-2.7.1-amzn-1.jar >> >> /usr/lib/hadoop/hadoop-nfs.jar >> >> /usr/lib/hadoop/hadoop-aws.jar >> >> /usr/lib/hadoop/hadoop-streaming-2.7.1-amzn-1.jar >> >> /usr/lib/hadoop/hadoop-archives.jar >> >> /usr/lib/hadoop/hadoop-openstack.jar >> >> /usr/lib/hadoop/hadoop-distcp.jar >> >> /usr/lib/hadoop/hadoop-annotations.jar >> >> /usr/lib/hadoop/hadoop-distcp-2.7.1-amzn-1.jar >> >> /usr/lib/hadoop/hadoop-streaming.jar >> >> /usr/lib/hadoop/hadoop-rumen-2.7.1-amzn-1.jar >> >> /usr/lib/hadoop/hadoop-common.jar >> >> /usr/lib/hadoop/hadoop-nfs-2.7.1-amzn-1.jar >> >> /usr/lib/hadoop/hadoop-common-2.7.1-amzn-1-tests.jar >> >> /usr/lib/hadoop/hadoop-ant-2.7.1-amzn-1.jar >> >> /usr/lib/hadoop/hadoop-datajoin.jar >> >> /usr/lib/hadoop/hadoop-ant.jar >> >> /usr/lib/hadoop/hadoop-extras-2.7.1-amzn-1.jar >> >> /usr/lib/hadoop/hadoop-openstack-2.7.1-amzn-1.jar >> >> /usr/lib/hadoop/lib/jackson-xc-1.9.13.jar >> >> /usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar >> >> 
/usr/lib/hadoop/lib/curator-client-2.7.1.jar >> >> /usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar >> >> /usr/lib/hadoop/lib/commons-io-2.4.jar >> >> /usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar >> >> /usr/lib/hadoop/lib/log4j-1.2.17.jar >> >> /usr/lib/hadoop/lib/junit-4.11.jar >> >> /usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar >> >> /usr/lib/hadoop/lib/commons-cli-1.2.jar >> >> /usr/lib/hadoop/lib/curator-recipes-2.7.1.jar >> >> /usr/lib/hadoop/lib/xmlenc-0.52.jar >> >> /usr/lib/hadoop/lib/zookeeper-3.4.6.jar >> >> /usr/lib/hadoop/lib/jsr305-3.0.0.jar >> >> /usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar >> >> /usr/lib/hadoop/lib/httpclient-4.3.4.jar >> >> /usr/lib/hadoop/lib/jettison-1.1.jar >> >> /usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar >> >> /usr/lib/hadoop/lib/commons-math3-3.1.1.jar >> >> /usr/lib/hadoop/lib/jersey-core-1.9.jar >> >> /usr/lib/hadoop/lib/httpcore-4.3.2.jar >> >> /usr/lib/hadoop/lib/commons-compress-1.4.1.jar >> >> /usr/lib/hadoop/lib/asm-3.2.jar >> >> /usr/lib/hadoop/lib/slf4j-api-1.7.10.jar >> >> /usr/lib/hadoop/lib/xz-1.0.jar >> >> /usr/lib/hadoop/lib/commons-collections-3.2.1.jar >> >> /usr/lib/hadoop/lib/commons-net-3.1.jar >> >> /usr/lib/hadoop/lib/commons-configuration-1.6.jar >> >> /usr/lib/hadoop/lib/jetty-util-6.1.26-emr.jar >> >> /usr/lib/hadoop/lib/commons-codec-1.4.jar >> >> /usr/lib/hadoop/lib/protobuf-java-2.5.0.jar >> >> /usr/lib/hadoop/lib/jetty-6.1.26-emr.jar >> >> /usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar >> >> /usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar >> >> /usr/lib/hadoop/lib/commons-logging-1.1.3.jar >> >> /usr/lib/hadoop/lib/jersey-json-1.9.jar >> >> /usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar >> >> /usr/lib/hadoop/lib/gson-2.2.4.jar >> >> /usr/lib/hadoop/lib/stax-api-1.0-2.jar >> >> /usr/lib/hadoop/lib/commons-digester-1.8.jar >> >> /usr/lib/hadoop/lib/servlet-api-2.5.jar >> >> /usr/lib/hadoop/lib/curator-framework-2.7.1.jar >> >> /usr/lib/hadoop/lib/commons-httpclient-3.1.jar >> >> 
/usr/lib/hadoop/lib/jets3t-0.9.0.jar >> >> /usr/lib/hadoop/lib/jaxb-api-2.2.2.jar >> >> /usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar >> >> /usr/lib/hadoop/lib/mockito-all-1.8.5.jar >> >> /usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar >> >> /usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar >> >> /usr/lib/hadoop/lib/paranamer-2.3.jar >> >> /usr/lib/hadoop/lib/avro-1.7.4.jar >> >> /usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar >> >> /usr/lib/hadoop/lib/jsp-api-2.1.jar >> >> /usr/lib/hadoop/lib/api-util-1.0.0-M20.jar >> >> /usr/lib/hadoop/lib/activation-1.1.jar >> >> /usr/lib/hadoop/lib/emr-metrics-client-2.1.0.jar >> >> /usr/lib/hadoop/lib/commons-lang-2.6.jar >> >> /usr/lib/hadoop/lib/jersey-server-1.9.jar >> >> /usr/lib/hadoop/lib/guava-11.0.2.jar >> >> /usr/lib/hadoop/lib/jsch-0.1.42.jar >> >> /usr/lib/hadoop/lib/netty-3.6.2.Final.jar >> >> /usr/lib/hadoop/lib/hamcrest-core-1.3.jar >> >> /usr/lib/hadoop-hdfs/hadoop-hdfs.jar >> >> /usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.1-amzn-1.jar >> >> /usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.1-amzn-1.jar >> >> /usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.1-amzn-1-tests.jar >> >> /usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar >> >> /usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar >> >> /usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar >> >> /usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar >> >> /usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar >> >> /usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar >> >> /usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar >> >> /usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar >> >> /usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar >> >> /usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar >> >> /usr/lib/hadoop-hdfs/lib/httpcore-4.3.2.jar >> >> /usr/lib/hadoop-hdfs/lib/asm-3.2.jar >> >> /usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar >> >> /usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar >> >> /usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar >> >> /usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26-emr.jar >> >> 
/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar >> >> /usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar >> >> /usr/lib/hadoop-hdfs/lib/jetty-6.1.26-emr.jar >> >> /usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar >> >> /usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar >> >> /usr/lib/hadoop-hdfs/lib/gson-2.2.4.jar >> >> /usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar >> >> /usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar >> >> /usr/lib/hadoop-hdfs/lib/emr-metrics-client-2.1.0.jar >> >> /usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar >> >> /usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar >> >> /usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar >> >> /usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar >> >> /usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.1-amzn-1.jar >> >> /usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar >> >> /usr/lib/hadoop-mapreduce/aws-java-sdk-rds-1.10.48.jar >> >> /usr/lib/hadoop-mapreduce/hadoop-extras.jar >> >> /usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar >> >> /usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar >> >> /usr/lib/hadoop-mapreduce/aws-java-sdk-elasticbeanstalk-1.10.48.jar >> >> /usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar >> >> /usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar >> >> /usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar >> >> /usr/lib/hadoop-mapreduce/aws-java-sdk-marketplacecommerceanalytics-1.10.48.jar >> >> /usr/lib/hadoop-mapreduce/aws-java-sdk-datapipeline-1.10.48.jar >> >> /usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.1-amzn-1.jar >> >> /usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.1-amzn-1.jar >> >> 
/usr/lib/hadoop-mapreduce/commons-io-2.4.jar >> >> /usr/lib/hadoop-mapreduce/hadoop-archives-2.7.1-amzn-1.jar >> >> /usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar >> >> /usr/lib/hadoop-mapreduce/aws-java-sdk-cloudtrail-1.10.48.jar >> >> /usr/lib/hadoop-mapreduce/log4j-1.2.17.jar >> >> /usr/lib/hadoop-mapreduce/junit-4.11.jar >> >> /usr/lib/hadoop-mapreduce/hadoop-aws-2.7.1-amzn-1.jar >> >> /usr/lib/hadoop-mapreduce/aws-java-sdk-cloudfront-1.10.48.jar >> >> /usr/lib/hadoop-mapreduce/aws-java-sdk-machinelearning-1.10.48.jar >> >> /usr/lib/hadoop-mapreduce/aws-java-sdk-iam-1.10.48.jar >> >> /usr/lib/hadoop-mapreduce/jackson-databind-2.4.4.jar >> >> /usr/lib/hadoop-mapreduce/hadoop-sls-2.7.1-amzn-1.jar >> >> /usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-M15.jar >> >> /usr/lib/hadoop-mapreduce/commons-cli-1.2.jar >> >> /usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar >> >> /usr/lib/hadoop-mapreduce/xmlenc-0.52.jar >> >> /usr/lib/hadoop-mapreduce/aws-java-sdk-efs-1.10.48.jar >> >> /usr/lib/hadoop-mapreduce/aws-java-sdk-devicefarm-1.10.48.jar >> >> /usr/lib/hadoop-mapreduce/hadoop-auth-2.7.1-amzn-1.jar >> >> /usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar >> >> /usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar >> >> /usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar >> >> /usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar >> >> /usr/lib/hadoop-mapreduce/aws-java-sdk-core-1.10.48.jar >> >> /usr/lib/hadoop-mapreduce/aws-java-sdk-cognitoidentity-1.10.48.jar >> >> /usr/lib/hadoop-mapreduce/httpclient-4.3.4.jar >> >> /usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.1-amzn-1.jar >> >> /usr/lib/hadoop-mapreduce/jettison-1.1.jar >> >> /usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar >> >> /usr/lib/hadoop-mapreduce/aws-java-sdk-autoscaling-1.10.48.jar >> >> /usr/lib/hadoop-mapreduce/aws-java-sdk-simpledb-1.10.48.jar >> >> /usr/lib/hadoop-mapreduce/aws-java-sdk-kms-1.10.48.jar >> >> 
/usr/lib/hadoop-mapreduce/aws-java-sdk-api-gateway-1.10.48.jar >> >> /usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar >> >> /usr/lib/hadoop-mapreduce/aws-java-sdk-dynamodb-1.10.48.jar >> >> /usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar >> >> /usr/lib/hadoop-mapreduce/jersey-core-1.9.jar >> >> /usr/lib/hadoop-mapreduce/aws-java-sdk-config-1.10.48.jar >> >> /usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.1-amzn-1.jar >> >> /usr/lib/hadoop-mapreduce/aws-java-sdk-ssm-1.10.48.jar >> >> /usr/lib/hadoop-mapreduce/hadoop-sls.jar >> >> /usr/lib/hadoop-mapreduce/aws-java-sdk-cloudwatchmetrics-1.10.48.jar >> >> /usr/lib/hadoop-mapreduce/hadoop-gridmix.jar >> >> /usr/lib/hadoop-mapreduce/httpcore-4.3.2.jar >> >> /usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.1-amzn-1.jar >> >> /usr/lib/hadoop-mapreduce/aws-java-sdk-ses-1.10.48.jar >> >> /usr/lib/hadoop-mapreduce/hadoop-auth.jar >> >> /usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar >> >> /usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.1-amzn-1.jar >> >> /usr/lib/hadoop-mapreduce/asm-3.2.jar >> >> /usr/lib/hadoop-mapreduce/xz-1.0.jar >> >> /usr/lib/hadoop-mapreduce/commons-collections-3.2.1.jar >> >> /usr/lib/hadoop-mapreduce/commons-net-3.1.jar >> >> /usr/lib/hadoop-mapreduce/aws-java-sdk-cloudformation-1.10.48.jar >> >> /usr/lib/hadoop-mapreduce/hadoop-rumen.jar >> >> /usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar >> >> /usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar >> >> /usr/lib/hadoop-mapreduce/hadoop-azure-2.7.1-amzn-1.jar >> >> /usr/lib/hadoop-mapreduce/aws-java-sdk-emr-1.10.48.jar >> >> /usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.1-amzn-1-tests.jar >> >> /usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar >> >> /usr/lib/hadoop-mapreduce/aws-java-sdk-ecr-1.10.48.jar >> >> /usr/lib/hadoop-mapreduce/aws-java-sdk-ec2-1.10.48.jar >> >> /usr/lib/hadoop-mapreduce/jetty-util-6.1.26-emr.jar >> >> /usr/lib/hadoop-mapreduce/hadoop-azure.jar >> >> 
/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar >> >> /usr/lib/hadoop-mapreduce/aws-java-sdk-importexport-1.10.48.jar >> >> /usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar >> >> /usr/lib/hadoop-mapreduce/aws-java-sdk-iot-1.10.48.jar >> >> /usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.1-amzn-1.jar >> >> /usr/lib/hadoop-mapreduce/jetty-6.1.26-emr.jar >> >> /usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar >> >> /usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar >> >> /usr/lib/hadoop-mapreduce/aws-java-sdk-glacier-1.10.48.jar >> >> /usr/lib/hadoop-mapreduce/aws-java-sdk-waf-1.10.48.jar >> >> /usr/lib/hadoop-mapreduce/jackson-core-2.4.4.jar >> >> /usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar >> >> /usr/lib/hadoop-mapreduce/aws-java-sdk-elastictranscoder-1.10.48.jar >> >> /usr/lib/hadoop-mapreduce/aws-java-sdk-events-1.10.48.jar >> >> /usr/lib/hadoop-mapreduce/aws-java-sdk-codepipeline-1.10.48.jar >> >> /usr/lib/hadoop-mapreduce/aws-java-sdk-elasticache-1.10.48.jar >> >> /usr/lib/hadoop-mapreduce/jersey-json-1.9.jar >> >> /usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar >> >> /usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar >> >> /usr/lib/hadoop-mapreduce/hadoop-aws.jar >> >> /usr/lib/hadoop-mapreduce/gson-2.2.4.jar >> >> /usr/lib/hadoop-mapreduce/aws-java-sdk-redshift-1.10.48.jar >> >> /usr/lib/hadoop-mapreduce/aws-java-sdk-cognitosync-1.10.48.jar >> >> /usr/lib/hadoop-mapreduce/aws-java-sdk-route53-1.10.48.jar >> >> /usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar >> >> /usr/lib/hadoop-mapreduce/commons-digester-1.8.jar >> >> /usr/lib/hadoop-mapreduce/aws-java-sdk-cloudhsm-1.10.48.jar >> >> /usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.1-amzn-1.jar >> >> /usr/lib/hadoop-mapreduce/servlet-api-2.5.jar >> >> /usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar >> >> /usr/lib/hadoop-mapreduce/aws-java-sdk-storagegateway-1.10.48.jar >> >> /usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar >> >> 
/usr/lib/hadoop-mapreduce/hadoop-archives.jar >> >> /usr/lib/hadoop-mapreduce/aws-java-sdk-1.10.48.jar >> >> /usr/lib/hadoop-mapreduce/hadoop-openstack.jar >> >> /usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar >> >> /usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar >> >> /usr/lib/hadoop-mapreduce/aws-java-sdk-opsworks-1.10.48.jar >> >> /usr/lib/hadoop-mapreduce/hadoop-distcp.jar >> >> /usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar >> >> /usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar >> >> /usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar >> >> /usr/lib/hadoop-mapreduce/aws-java-sdk-ecs-1.10.48.jar >> >> /usr/lib/hadoop-mapreduce/aws-java-sdk-sts-1.10.48.jar >> >> /usr/lib/hadoop-mapreduce/aws-java-sdk-codedeploy-1.10.48.jar >> >> /usr/lib/hadoop-mapreduce/jackson-annotations-2.4.4.jar >> >> /usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.1-amzn-1.jar >> >> /usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar >> >> /usr/lib/hadoop-mapreduce/aws-java-sdk-directory-1.10.48.jar >> >> /usr/lib/hadoop-mapreduce/aws-java-sdk-cloudsearch-1.10.48.jar >> >> /usr/lib/hadoop-mapreduce/paranamer-2.3.jar >> >> /usr/lib/hadoop-mapreduce/aws-java-sdk-swf-libraries-1.10.48.jar >> >> /usr/lib/hadoop-mapreduce/avro-1.7.4.jar >> >> /usr/lib/hadoop-mapreduce/aws-java-sdk-support-1.10.48.jar >> >> /usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar >> >> /usr/lib/hadoop-mapreduce/aws-java-sdk-elasticloadbalancing-1.10.48.jar >> >> /usr/lib/hadoop-mapreduce/jsp-api-2.1.jar >> >> /usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar >> >> /usr/lib/hadoop-mapreduce/aws-java-sdk-logs-1.10.48.jar >> >> /usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar >> >> /usr/lib/hadoop-mapreduce/hadoop-streaming.jar >> >> /usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.1-amzn-1.jar >> >> /usr/lib/hadoop-mapreduce/aws-java-sdk-sqs-1.10.48.jar >> >> /usr/lib/hadoop-mapreduce/aws-java-sdk-kinesis-1.10.48.jar >> >> /usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.1-amzn-1.jar >> >> 
/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar >> >> /usr/lib/hadoop-mapreduce/activation-1.1.jar >> >> /usr/lib/hadoop-mapreduce/emr-metrics-client-2.1.0.jar >> >> /usr/lib/hadoop-mapreduce/commons-lang-2.6.jar >> >> /usr/lib/hadoop-mapreduce/aws-java-sdk-directconnect-1.10.48.jar >> >> /usr/lib/hadoop-mapreduce/aws-java-sdk-sns-1.10.48.jar >> >> /usr/lib/hadoop-mapreduce/aws-java-sdk-workspaces-1.10.48.jar >> >> /usr/lib/hadoop-mapreduce/jersey-server-1.9.jar >> >> /usr/lib/hadoop-mapreduce/aws-java-sdk-s3-1.10.48.jar >> >> /usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar >> >> /usr/lib/hadoop-mapreduce/aws-java-sdk-cloudwatch-1.10.48.jar >> >> /usr/lib/hadoop-mapreduce/guava-11.0.2.jar >> >> /usr/lib/hadoop-mapreduce/hadoop-ant-2.7.1-amzn-1.jar >> >> /usr/lib/hadoop-mapreduce/hadoop-datajoin.jar >> >> /usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.1-amzn-1.jar >> >> /usr/lib/hadoop-mapreduce/aws-java-sdk-elasticsearch-1.10.48.jar >> >> /usr/lib/hadoop-mapreduce/hadoop-ant.jar >> >> /usr/lib/hadoop-mapreduce/aws-java-sdk-codecommit-1.10.48.jar >> >> /usr/lib/hadoop-mapreduce/jsch-0.1.42.jar >> >> /usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar >> >> /usr/lib/hadoop-mapreduce/aws-java-sdk-lambda-1.10.48.jar >> >> /usr/lib/hadoop-mapreduce/joda-time-2.8.1.jar >> >> /usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar >> >> /usr/lib/hadoop-mapreduce/hadoop-extras-2.7.1-amzn-1.jar >> >> /usr/lib/hadoop-mapreduce/aws-java-sdk-simpleworkflow-1.10.48.jar >> >> /usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.1-amzn-1.jar >> >> /usr/lib/hadoop-mapreduce/aws-java-sdk-inspector-1.10.48.jar >> >> /usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar >> >> /usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar >> >> /usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar >> >> /usr/lib/hadoop-mapreduce/lib/junit-4.11.jar >> >> /usr/lib/hadoop-mapreduce/lib/javax.inject-1.jar >> >> /usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar >> >> 
/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar >> >> /usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar >> >> /usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar >> >> /usr/lib/hadoop-mapreduce/lib/asm-3.2.jar >> >> /usr/lib/hadoop-mapreduce/lib/xz-1.0.jar >> >> /usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar >> >> /usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar >> >> /usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar >> >> /usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar >> >> /usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar >> >> /usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar >> >> /usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar >> >> /usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar >> >> /usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar >> >> /usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final.jar >> >> /usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar >> >> /usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.1-amzn-1.jar >> >> /usr/lib/hadoop-yarn/hadoop-yarn-client.jar >> >> /usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar >> >> /usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.1-amzn-1.jar >> >> /usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.1-amzn-1.jar >> >> /usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.7.1-amzn-1.jar >> >> /usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar >> >> /usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager.jar >> >> /usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar >> >> /usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.7.1-amzn-1.jar >> >> /usr/lib/hadoop-yarn/hadoop-yarn-server-sharedcachemanager-2.7.1-amzn-1.jar >> >> /usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.7.1-amzn-1.jar >> >> /usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.1-amzn-1.jar >> >> /usr/lib/hadoop-yarn/hadoop-yarn-registry.jar >> >> /usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar >> >> /usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar >> >> 
/usr/lib/hadoop-yarn/hadoop-yarn-api.jar >> >> /usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar >> >> /usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.7.1-amzn-1.jar >> >> /usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.7.1-amzn-1.jar >> >> /usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager.jar >> >> /usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.1-amzn-1.jar >> >> /usr/lib/hadoop-yarn/hadoop-yarn-common.jar >> >> /usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar >> >> /usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.7.1-amzn-1.jar >> >> /usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.1-amzn-1.jar >> >> /usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar >> >> /usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar >> >> /usr/lib/hadoop-yarn/lib/commons-io-2.4.jar >> >> /usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar >> >> /usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar >> >> /usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar >> >> /usr/lib/hadoop-yarn/lib/javax.inject-1.jar >> >> /usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar >> >> /usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar >> >> /usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar >> >> /usr/lib/hadoop-yarn/lib/jettison-1.1.jar >> >> /usr/lib/hadoop-yarn/lib/guice-3.0.jar >> >> /usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar >> >> /usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar >> >> /usr/lib/hadoop-yarn/lib/asm-3.2.jar >> >> /usr/lib/hadoop-yarn/lib/xz-1.0.jar >> >> /usr/lib/hadoop-yarn/lib/commons-collections-3.2.1.jar >> >> /usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar >> >> /usr/lib/hadoop-yarn/lib/jetty-util-6.1.26-emr.jar >> >> /usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar >> >> /usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar >> >> /usr/lib/hadoop-yarn/lib/jetty-6.1.26-emr.jar >> >> /usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar >> >> /usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar >> >> /usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar >> >> 
/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar >> >> /usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar >> >> /usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar >> >> /usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.jar >> >> /usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar >> >> /usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar >> >> /usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar >> >> /usr/lib/hadoop-yarn/lib/guice-servlet-3.0.jar >> >> /usr/lib/hadoop-yarn/lib/activation-1.1.jar >> >> /usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar >> >> /usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar >> >> /usr/lib/hadoop-yarn/lib/guava-11.0.2.jar >> >> /usr/lib/hadoop-lzo/lib/hadoop-lzo-0.4.19.jar >> >> /usr/lib/hadoop-lzo/lib/hadoop-lzo.jar >> >> /usr/share/aws/emr/emrfs/conf >> >> /usr/share/aws/emr/emrfs/lib/jsr-275-0.9.1.jar >> >> /usr/share/aws/emr/emrfs/lib/junit-4.11.jar >> >> /usr/share/aws/emr/emrfs/lib/commons-cli-1.2.jar >> >> /usr/share/aws/emr/emrfs/lib/javax.inject-1.jar >> >> /usr/share/aws/emr/emrfs/lib/commons-codec-1.9.jar >> >> /usr/share/aws/emr/emrfs/lib/httpclient-4.3.4.jar >> >> /usr/share/aws/emr/emrfs/lib/commons-httpclient-3.0.jar >> >> /usr/share/aws/emr/emrfs/lib/guice-3.0.jar >> >> /usr/share/aws/emr/emrfs/lib/httpcore-4.3.2.jar >> >> /usr/share/aws/emr/emrfs/lib/joda-time-2.3.jar >> >> /usr/share/aws/emr/emrfs/lib/bcprov-jdk15on-1.51.jar >> >> /usr/share/aws/emr/emrfs/lib/emrfs-hadoop-2.4.0.jar >> >> /usr/share/aws/emr/emrfs/lib/protobuf-java-2.5.0.jar >> >> /usr/share/aws/emr/emrfs/lib/slf4j-api-1.7.16.jar >> >> /usr/share/aws/emr/emrfs/lib/aopalliance-1.0.jar >> >> /usr/share/aws/emr/emrfs/lib/commons-logging-1.1.3.jar >> >> /usr/share/aws/emr/emrfs/lib/commons-lang3-3.3.jar >> >> /usr/share/aws/emr/emrfs/lib/commons-math-2.1.jar >> >> /usr/share/aws/emr/emrfs/lib/gson-2.2.4.jar >> >> /usr/share/aws/emr/emrfs/lib/jsr305-2.0.1.jar >> >> /usr/share/aws/emr/emrfs/lib/emr-core-2.5.0.jar >> >> /usr/share/aws/emr/emrfs/lib/emr-metrics-client-2.1.0.jar >> >> 
/usr/share/aws/emr/emrfs/lib/commons-exec-1.2.jar >> >> /usr/share/aws/emr/emrfs/lib/guava-15.0.jar >> >> /usr/share/aws/emr/emrfs/lib/bcpkix-jdk15on-1.51.jar >> >> /usr/share/aws/emr/emrfs/lib/hamcrest-core-1.3.jar >> >> /usr/share/aws/emr/emrfs/auxlib/* >> >> /usr/share/aws/emr/lib/jsr-275-0.9.1.jar >> >> /usr/share/aws/emr/lib/commons-httpclient-3.0.jar >> >> /usr/share/aws/emr/lib/joda-time-2.3.jar >> >> /usr/share/aws/emr/lib/slf4j-api-1.7.16.jar >> >> /usr/share/aws/emr/lib/commons-codec-1.2.jar >> >> /usr/share/aws/emr/lib/gson-2.2.4.jar >> >> /usr/share/aws/emr/lib/commons-logging-1.0.3.jar >> >> /usr/share/aws/emr/lib/jsr305-2.0.1.jar >> >> /usr/share/aws/emr/lib/emr-core-2.5.0.jar >> >> /usr/share/aws/emr/ddb/lib/emr-ddb-hadoop.jar >> >> /usr/share/aws/emr/goodies/lib/emr-hadoop-goodies.jar >> >> /usr/share/aws/emr/kinesis/lib/emr-kinesis-hadoop.jar >> >> /usr/share/aws/emr/cloudwatch-sink/lib/cloudwatch-sink-1.0.0.jar >> >> /usr/share/aws/emr/cloudwatch-sink/lib/cloudwatch-sink.jar >> >> >> >> On Tue, Nov 1, 2016 at 3:57 AM, Till Rohrmann >> wrote: >> >> Hi Justin, >> >> >> >> I think this might be a problem in Flink's Kinesis consumer. The Flink >> Kinesis consumer uses the aws-java-sdk version 1.10.71 which indeed >> contains the afore mentioned methods. However, already version 1.10.46 n= o >> longer contains this method. Thus, I suspect, that Yarn puts some older >> version of this jar into the classpath. For these cases, I think we have= to >> shade our aws-java-sdk dependency so that it also works with older versi= ons >> of EMR. >> >> >> >> In order to verify this, could you tell us which EMR version you're >> running? Additionally, it would be helpful if you sent us the classpath >> with which the Flink cluster was started on Yarn. You can find this >> information at the beginning of your TaskManager log file. Thanks a lot. 
Cheers,
Till

On Mon, Oct 31, 2016 at 8:22 PM, Justin Yan wrote:

Hi all - first time on the mailing list, so my apologies if I break protocol on anything. Really excited to be using Flink, and hoping to be active here in the future! Also, apologies for the length of this email - I tried to include details but may have gone overboard.

The gist of my problem is an issue with packaging the Flink Kinesis Connector into my user code for execution on a YARN cluster in EMR - there's some dependency trouble happening, but after about 48 hours of attempts, I'm not sure how to make progress, and I'd really appreciate any ideas or assistance. Thank you in advance!

### First, Some Context.

We're hoping to write our Flink jobs in Scala 2.11. The Flink JM/TMs currently run on an EMR cluster with Hadoop 2.7 as YARN containers. We run our jobs via an Azkaban server, which has the Hadoop and Flink clients installed, and the configurations are set to point at the YARN master on our EMR cluster (with $HADOOP_HOME set so Flink can discover the Hadoop configs). We're using Java OpenJDK7 everywhere, and Maven 3.3.9 when building Flink from source.

We use SBT and the assembly plugin to create an uberjar of our code and its dependencies. This gets uploaded to Azkaban, whereupon the following command is run on the Azkaban server to execute a Flink job:

flink run -c usercodeuberjar-assembly-1.0.jar

I've successfully run a few Flink jobs that execute on our EMR cluster in this fashion (the WordCount example, etc.).

### The Problem

We use AWS Kinesis, and are hoping to integrate Flink with it. Naturally, we were hoping to use the Kinesis connector: <https://ci.apache.org/projects/flink/flink-docs-release-1.1/apis/streaming/connectors/kinesis.html>.

After following the instructions with some experimentation, I was able to run a Flink Kinesis application on my laptop in Local Cluster mode. (Ubuntu 16.04, local cluster initiated with the `./start-local.sh` command, job submitted via `flink run -c usercodeuberjar-assembly-1.0.jar`)

I uploaded the same JAR to Azkaban and tried to run the same command to submit to our EMR cluster, and got a `java.lang.NoSuchMethodError: com.amazonaws.SDKGlobalConfiguration.isInRegionOptimizedModeEnabled()` (I've included the full stack trace at the bottom of this email). I went to inspect the uploaded JAR with `unzip usercodeuberjar-assembly-1.0.jar`, looked in `com/amazonaws` and found the SDKGlobalConfiguration.class file. I decompiled and inspected it, and the isInRegionOptimizedModeEnabled method that was purportedly missing was indeed present.

I've included the steps I took to manifest this problem below, along with a variety of things that I tried to do to resolve the problem - any help or insight is greatly appreciated!

### Repro

I'm not sure how to provide a clear repro, but I'll try to include as much detail as I can about the sequence of actions and commands I ran since there may be some obvious mistakes:

Downloading the Flink release to my laptop:

wget http://www-us.apache.org/dist/flink/flink-1.1.3/flink-1.1.3-bin-hadoop27-scala_2.11.tgz
tar xfzv flink-1.1.3-bin-hadoop27-scala_2.11.tgz

I then SSH'd into Azkaban, and ran the same two commands, while adding the bin/ directory to my PATH and tweaking the config for fs.hdfs.hadoopconf. Next, after getting the Flink binaries, I went to fetch the source code in order to follow the instructions here: <https://ci.apache.org/projects/flink/flink-docs-release-1.1/apis/streaming/connectors/kinesis.html>

wget https://github.com/apache/flink/archive/release-1.1.3.tar.gz
tar xfzv release-1.1.3.tar.gz

Here, I wanted to leverage our EMR instance profile Role instead of passing in credentials, hence I wanted the AUTO value for the "aws.credentials.provider" config, which seems to have been added after 1.1.3 - I made a couple of small tweaks to AWSConfigConstants.java and AWSUtil.java to allow for that AUTO value.

Next, we're using Scala 2.11, so per the instructions here, I changed the Scala version: <https://ci.apache.org/projects/flink/flink-docs-release-1.1/setup/building.html#scala-versions>

tools/change-scala-version.sh 2.11

Back to the Kinesis Connector documentation...

mvn clean install -Pinclude-kinesis -DskipTests
cd flink-dist
mvn clean install -Pinclude-kinesis -DskipTests

When running that second mvn clean install, I get some warnings about the maven-shade-plugin having conflicting versions. I also get a "[WARNING] The requested profile "include-kinesis" could not be activated because it does not exist."

At this point, the instructions are not too clear on what to do. I proceed to this section to try and figure it out: <https://ci.apache.org/projects/flink/flink-docs-release-1.1/apis/cluster_execution.html#linking-with-modules-not-contained-in-the-binary-distribution>

My goal is to package everything in my usercode JAR, and I'll try to do that with SBT. My first try is to install the Flink Kinesis Connector JAR generated by mvn clean install to my local Maven repo:

mvn install:install-file -Dfile=flink-connector-kinesis_2.11-1.1.3.jar

I then build the jar with a build.sbt that looks like this (extraneous details removed):

scalaVersion in ThisBuild := "2.11.8"

val flinkVersion = "1.1.3"

val flinkDependencies = Seq(
  "org.apache.flink" %% "flink-scala" % flinkVersion,
  "org.apache.flink" %% "flink-streaming-scala" % flinkVersion,
  "org.apache.flink" %% "flink-connector-kinesis" % flinkVersion
)

lazy val proj = (project in file(".")).
  settings(
    libraryDependencies ++= flinkDependencies
  )

After this builds, I unzip the jar and use JD to decompile the com.amazonaws.SDKGlobalConfiguration class file to see if the method in question is present or not (it is). I then run the jar locally with `flink run -c usercodeuberjar-assembly-1.0.jar`, and I see it running just fine when navigating to localhost:8081. I then upload this same JAR to our Azkaban server, and run the same `flink run -c usercodeuberjar-assembly-1.0.jar` command to submit as a YARN application - this time, I get the `NoSuchMethodError`.

I've tried a variety of permutations of this, so I'll attempt to list them out along with their results:

1. A non-Kinesis Flink job: I was able to successfully run the example WordCount Flink job as a YARN application.
2. I mvn installed the newly built flink-scala and flink-streaming-scala JARs to my local Maven repository in case these were different - after building and running on Azkaban... same error.
3. Using the newly-built flink-dist JAR (with the -Pinclude-kinesis flag): After replacing the flink-dist JAR in the /lib dir on Azkaban (that the `flink` command runs), I still had the same error.
4. Packaging the JAR in different ways:
   - I tried adding the flink-connector-kinesis JAR by adding it to a /lib directory in my SBT project for direct inclusion. This actually caused the NoSuchMethodError to occur during *local* execution as well.
   - I tried using mvn-assembly to package all of the flink-connector-kinesis dependencies into that JAR, and then added it to the /lib directory in my SBT project. Local execution no longer has the error, but submission from Azkaban still has the same error.
5. I thought it might be a classpath issue (since my laptop doesn't have a Hadoop installation, so I figured there may be some kind of collision with the AWS SDK included by EMR), so I set, on Azkaban, the environment variable FLINK_CLASSPATH=usercodeuberjar-assembly-1.0.jar in order to get it prepended - same error.
6. I realized this wasn't necessarily doing anything to the resolution of classnames of the Flink job executing in YARN. So I dug into the client source, which eventually led me to flink-clients/.../program/PackagedProgram.java, which has the following line of code setting the ClassLoader:

this.userCodeClassLoader = JobWithJars.buildUserCodeClassLoader(getAllLibraries(), classpaths, getClass().getClassLoader());

getAllLibraries() does seem to set the jar that you pass into the `flink` command at the top of the class resolution hierarchy, which, as my previous foray into decompilation shows, does seem to include the method that is supposedly missing.

At this point, I ran out of ideas to investigate, and so I'm hoping someone here is able to help me. Thanks in advance for reading this!
Full Stack Trace:

java.lang.NoSuchMethodError: com.amazonaws.SDKGlobalConfiguration.isInRegionOptimizedModeEnabled()Z
    at com.amazonaws.ClientConfigurationFactory.getConfig(ClientConfigurationFactory.java:35)
    at org.apache.flink.streaming.connectors.kinesis.util.AWSUtil.createKinesisClient(AWSUtil.java:50)
    at org.apache.flink.streaming.connectors.kinesis.proxy.KinesisProxy.<init>(KinesisProxy.java:118)
    at org.apache.flink.streaming.connectors.kinesis.proxy.KinesisProxy.create(KinesisProxy.java:176)
    at org.apache.flink.streaming.connectors.kinesis.internals.KinesisDataFetcher.<init>(KinesisDataFetcher.java:188)
    at org.apache.flink.streaming.connectors.kinesis.FlinkKinesisConsumer.run(FlinkKinesisConsumer.java:198)
    at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:80)
    at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:53)
    at org.apache.flink.streaming.runtime.tasks.SourceStreamTask.run(SourceStreamTask.java:56)
    at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:266)
    at org.apache.flink.runtime.taskmanager.Task.run(Task.java:585)
    at java.lang.Thread.run(Thread.java:745)
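When a method is visibly present in the decompiled uberjar yet "missing" at runtime, the quickest check is to ask the JVM which jar it actually loaded the class from. The probe below is a minimal, stdlib-only sketch (not code from this thread); on the cluster you would pass com.amazonaws.SDKGlobalConfiguration instead of the stand-in default:

```java
// ClasspathProbe.java -- print where a class was loaded from, and check
// whether a given method is visible on it. A NoSuchMethodError usually
// means the reported location is an older copy of the dependency that
// got ahead of your uberjar on the classpath.
import java.lang.reflect.Method;
import java.security.CodeSource;

public class ClasspathProbe {

    // Location (jar or directory) the class was loaded from, or
    // "(bootstrap)" for classes from the JDK itself.
    public static String sourceOf(Class<?> c) {
        CodeSource src = c.getProtectionDomain().getCodeSource();
        return src == null ? "(bootstrap)" : src.getLocation().toString();
    }

    // True if the class exposes a public method with the given name.
    public static boolean hasMethod(Class<?> c, String name) {
        for (Method m : c.getMethods()) {
            if (m.getName().equals(name)) {
                return true;
            }
        }
        return false;
    }

    public static void main(String[] args) throws Exception {
        // On the cluster:
        //   java ... ClasspathProbe com.amazonaws.SDKGlobalConfiguration
        // then check hasMethod(c, "isInRegionOptimizedModeEnabled").
        Class<?> c = Class.forName(args.length > 0 ? args[0] : "java.lang.String");
        System.out.println(sourceOf(c));
    }
}
```

Note that in Flink 1.1 the user-code classloader delegates to its parent first, so a jar that Hadoop/EMR prepends to the container classpath can shadow the copy bundled in the uberjar even though both are "on the classpath" - which is consistent with what Justin observed.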
Hi,

I encountered this issue before as well.

Which Maven version are you using?
Maven 3.3.x does not properly shade dependencies.
You have to use Maven 3.0.3 (see [1]).

Best, Fabian

2016-11-08 11:05 GMT+01:00 Till Rohrmann <trohrmann@apache.org>:
Yes, this definitely looks like a similar issue. Once we shade the aws dependencies in the Kinesis connector, the problem should be (hopefully) resolved. I've added your problem description to the JIRA. Thanks for reporting it.

Cheers,
Till
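For context, "shading" here means relocating the bundled com.amazonaws classes into a private package so that the container's older SDK copy can never shadow them. A rough sketch of what that looks like with the standard maven-shade-plugin - the target package name below is illustrative, not necessarily the one the connector ended up using:

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <relocations>
          <relocation>
            <pattern>com.amazonaws</pattern>
            <!-- illustrative target package; any name that cannot collide
                 with the jars YARN puts on the container classpath -->
            <shadedPattern>shaded.com.amazonaws</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>
```

The shade plugin rewrites both the class files and the bytecode references, so the relocated SDK resolves to the bundled copy regardless of what EMR ships.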

On Mon, Nov 7, 2016 at 8:01 PM, Foster, Craig <foscraig@amazon.com> wrote:
I think this is a similar issue, but it was brought to my attention that we're also seeing this on EMR 5.1.0 with the FlinkKinesisConsumer. What I did to duplicate this issue:

1) I used the wiki-edits quickstart, but used Kinesis instead of Kafka to publish results with a FlinkKinesisProducer. This works fine. I can use a separate script to read what was published to my stream.

2) When using a FlinkKinesisConsumer, however, I get an error:
java.lang.NoSuchMethodError: org.apache.http.params.HttpConnectionParams.setSoKeepalive(Lorg/apache/http/params/HttpParams;Z)V
    at com.amazonaws.http.HttpClientFactory.createHttpClient(HttpClientFactory.java:96)
    at com.amazonaws.http.AmazonHttpClient.<init>(AmazonHttpClient.java:187)
    at com.amazonaws.AmazonWebServiceClient.<init>(AmazonWebServiceClient.java:136)
    at com.amazonaws.services.kinesis.AmazonKinesisClient.<init>(AmazonKinesisClient.java:221)
    at com.amazonaws.services.kinesis.AmazonKinesisClient.<init>(AmazonKinesisClient.java:197)
    at org.apache.flink.streaming.connectors.kinesis.util.AWSUtil.createKinesisClient(AWSUtil.java:56)
    at org.apache.flink.streaming.connectors.kinesis.proxy.KinesisProxy.<init>(KinesisProxy.java:118)
    at org.apache.flink.streaming.connectors.kinesis.proxy.KinesisProxy.create(KinesisProxy.java:176)
    at org.apache.flink.streaming.connectors.kinesis.internals.KinesisDataFetcher.<init>(KinesisDataFetcher.java:188)
    at org.apache.flink.streaming.connectors.kinesis.FlinkKinesisConsumer.run(FlinkKinesisConsumer.java:198)
    at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:80)
    at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:53)
    at org.apache.flink.streaming.runtime.tasks.SourceStreamTask.run(SourceStreamTask.java:56)
    at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:266)
    at org.apache.flink.runtime.taskmanager.Task.run(Task.java:585)
    at java.lang.Thread.run(Thread.java:745)

From: Robert Metzger <rmetzger@apache.org>
Reply-To: "user@flink.apache.org" <user@flink.apache.org>
Date: Friday, November 4, 2016 at 2:57 AM
To: "user@flink.apache.org" <user@flink.apache.org>
Subject: Re: Kinesis Connector Dependency Problems

Thank you for helping to investigate the issue. I've filed an issue in our bugtracker: https://issues.apache.org/jira/browse/FLINK-5013

On Wed, Nov 2, 2016 at 10:09 PM, Justin Yan <justin.yan@remitly.com> wrote:

Sorry it took me a little while, but I'm happy to report back that it seems to be working properly with EMR 4.8. It seems so obvious in retrospect... thanks again for the assistance!

Cheers,

Justin

On Tue, Nov 1, 2016 at 11:44 AM, Robert Metzger <rmetzger@apache.org> wrote:

Hi Justin,

thank you for sharing the classpath of the Flink container with us. It contains what Till was already expecting: an older version of the AWS SDK.

If you have some spare time, could you quickly try to run your program with a newer EMR version, just to validate our suspicion?

If the error doesn't occur on a more recent EMR version, then we know why it's happening.

We'll then probably need to shade (relocate) the Kinesis code to make it work with older EMR libraries.

Regards,

Robert
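The classpath Justin shares below can also be checked mechanically. A small, stdlib-only scanner (an illustrative sketch, not code from this thread) lists which jars in a directory bundle a given class; pointing it at the EMR lib directories with the entry com/amazonaws/SDKGlobalConfiguration.class makes every competing copy of the AWS SDK easy to spot:

```java
// JarScanner.java -- report which jars in a directory contain a given
// entry (e.g. a .class path), to find duplicate dependency copies that
// YARN may put ahead of the user jar.
import java.io.File;
import java.util.ArrayList;
import java.util.List;
import java.util.jar.JarFile;

public class JarScanner {

    // Names of the jars in `dir` that contain `entry`.
    public static List<String> jarsContaining(File dir, String entry) throws Exception {
        List<String> hits = new ArrayList<>();
        File[] jars = dir.listFiles((d, name) -> name.endsWith(".jar"));
        if (jars == null) {
            return hits;            // not a directory, or I/O error
        }
        for (File f : jars) {
            try (JarFile jar = new JarFile(f)) {
                if (jar.getEntry(entry) != null) {
                    hits.add(f.getName());
                }
            }
        }
        return hits;
    }

    public static void main(String[] args) throws Exception {
        if (args.length < 2) {
            System.out.println("usage: java JarScanner <dir> <entry>");
            return;
        }
        // e.g. java JarScanner /usr/lib/hadoop-mapreduce com/amazonaws/SDKGlobalConfiguration.class
        for (String name : jarsContaining(new File(args[0]), args[1])) {
            System.out.println(name);
        }
    }
}
```

On the classpath below this would flag aws-java-sdk-1.10.48.jar under /usr/lib/hadoop-mapreduce, the older SDK that Till suspected.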

On Tue, Nov 1, 2016 at 6:27 PM, Justin Yan <justin.yan@remitly.com> wrote:

Hi there,

We're using EMR 4.4.0 -> I suppose this is a bit old, and I can migrate forward if you think that would be best.

I've appended the classpath that the Flink cluster was started with at the end of this email (with a slight improvement to the formatting to make it readable).

Willing to poke around or fiddle with this as necessary - thanks very much for the help!

Justin

Task Manager's classpath from logs:

lib/flink-dist_2.11-1.1.3.jar
lib/flink-python_2.11-1.1.3.jar
lib/log4j-1.2.17.jar
lib/slf4j-log4j12-1.7.7.jar
logback.xml
log4j.properties
flink.jar
flink-conf.yaml
/etc/hadoop/conf

/usr/lib/hadoop/hadoop-annotations-2.7.1-amzn-1.jar
/usr/lib/hadoop/hadoop-extras.jar
/usr/lib/hadoop/hadoop-archives-2.7.1-amzn-1.jar
/usr/lib/hadoop/hadoop-aws-2.7.1-amzn-1.jar
/usr/lib/hadoop/hadoop-sls-2.7.1-amzn-1.jar
/usr/lib/hadoop/hadoop-auth-2.7.1-amzn-1.jar
/usr/lib/hadoop/hadoop-sls.jar
/usr/lib/hadoop/hadoop-gridmix.jar
/usr/lib/hadoop/hadoop-auth.jar
/usr/lib/hadoop/hadoop-gridmix-2.7.1-amzn-1.jar
/usr/lib/hadoop/hadoop-rumen.jar
/usr/lib/hadoop/hadoop-azure-2.7.1-amzn-1.jar
/usr/lib/hadoop/hadoop-common-2.7.1-amzn-1.jar
/usr/lib/hadoop/hadoop-azure.jar
/usr/lib/hadoop/hadoop-datajoin-2.7.1-amzn-1.jar
/usr/lib/hadoop/hadoop-nfs.jar
/usr/lib/hadoop/hadoop-aws.jar
/usr/lib/hadoop/hadoop-streaming-2.7.1-amzn-1.jar
/usr/lib/hadoop/hadoop-archives.jar
/usr/lib/hadoop/hadoop-openstack.jar
/usr/lib/hadoop/hadoop-distcp.jar
/usr/lib/hadoop/hadoop-annotations.jar
/usr/lib/hadoop/hadoop-distcp-2.7.1-amzn-1.jar
/usr/lib/hadoop/hadoop-streaming.jar
/usr/lib/hadoop/hadoop-rumen-2.7.1-amzn-1.jar
/usr/lib/hadoop/hadoop-common.jar
/usr/lib/hadoop/hadoop-nfs-2.7.1-amzn-1.jar
/usr/lib/hadoop/hadoop-common-2.7.1-amzn-1-tests.jar
/usr/lib/hadoop/hadoop-ant-2.7.1-amzn-1.jar
/usr/lib/hadoop/hadoop-datajoin.jar
/usr/lib/hadoop/hadoop-ant.jar
/usr/lib/hadoop/hadoop-extras-2.7.1-amzn-1.jar
/usr/lib/hadoop/hadoop-openstack-2.7.1-amzn-1.jar
/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar
/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar
/usr/lib/hadoop/lib/curator-client-2.7.1.jar
/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar
/usr/lib/hadoop/lib/commons-io-2.4.jar
/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar
/usr/lib/hadoop/lib/log4j-1.2.17.jar
/usr/lib/hadoop/lib/junit-4.11.jar
/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar
/usr/lib/hadoop/lib/commons-cli-1.2.jar
/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar
/usr/lib/hadoop/lib/xmlenc-0.52.jar
/usr/lib/hadoop/lib/zookeeper-3.4.6.jar
/usr/lib/hadoop/lib/jsr305-3.0.0.jar
/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar
/usr/lib/hadoop/lib/httpclient-4.3.4.jar
/usr/lib/hadoop/lib/jettison-1.1.jar
/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar
/usr/lib/hadoop/lib/commons-math3-3.1.1.jar
/usr/lib/hadoop/lib/jersey-core-1.9.jar
/usr/lib/hadoop/lib/httpcore-4.3.2.jar
/usr/lib/hadoop/lib/commons-compress-1.4.1.jar
/usr/lib/hadoop/lib/asm-3.2.jar
/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar
/usr/lib/hadoop/lib/xz-1.0.jar
/usr/lib/hadoop/lib/commons-collections-3.2.1.jar
/usr/lib/hadoop/lib/commons-net-3.1.jar
/usr/lib/hadoop/lib/commons-configuration-1.6.jar
/usr/lib/hadoop/lib/jetty-util-6.1.26-emr.jar
/usr/lib/hadoop/lib/commons-codec-1.4.jar
/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar
/usr/lib/hadoop/lib/jetty-6.1.26-emr.jar
/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar
/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar
/usr/lib/hadoop/lib/commons-logging-1.1.3.jar
/usr/lib/hadoop/lib/jersey-json-1.9.jar
/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar
/usr/lib/hadoop/lib/gson-2.2.4.jar
/usr/lib/hadoop/lib/stax-api-1.0-2.jar
/usr/lib/hadoop/lib/commons-digester-1.8.jar
/usr/lib/hadoop/lib/servlet-api-2.5.jar
/usr/lib/hadoop/lib/curator-framework-2.7.1.jar
/usr/lib/hadoop/lib/commons-httpclient-3.1.jar
/usr/lib/hadoop/lib/jets3t-0.9.0.jar
/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar
/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar
/usr/lib/hadoop/lib/mockito-all-1.8.5.jar
/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar
/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar
/usr/lib/hadoop/lib/paranamer-2.3.jar
/usr/lib/hadoop/lib/avro-1.7.4.jar
/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar
/usr/lib/hadoop/lib/jsp-api-2.1.jar
/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar
/usr/lib/hadoop/lib/activation-1.1.jar
/usr/lib/hadoop/lib/emr-metrics-client-2.1.0.jar
/usr/lib/hadoop/lib/commons-lang-2.6.jar
/usr/lib/hadoop/lib/jersey-server-1.9.jar
/usr/lib/hadoop/lib/guava-11.0.2.jar
/usr/lib/hadoop/lib/jsch-0.1.42.jar
/usr/lib/hadoop/lib/netty-3.6.2.Final.jar
/usr/lib/hadoop/lib/hamcrest-core-1.3.jar

/usr/lib/hadoop-hdfs/hadoop-hdfs.jar
/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.1-amzn-1.jar
/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.1-amzn-1.jar
/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.1-amzn-1-tests.jar
/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar
/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar
/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar
/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar
/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar
/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar
/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar
/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar
/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar
/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar
/usr/lib/hadoop-hdfs/lib/httpcore-4.3.2.jar
/usr/lib/hadoop-hdfs/lib/asm-3.2.jar
/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar
/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar
/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar
/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26-emr.jar
/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar
/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar
/usr/lib/hadoop-hdfs/lib/jetty-6.1.26-emr.jar
/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar
/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar
/usr/lib/hadoop-hdfs/lib/gson-2.2.4.jar
/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar
/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar
/usr/lib/hadoop-hdfs/lib/emr-metrics-client-2.1.0.jar
/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar
/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar
/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar
/usr/lib/hadoop-hdfs/lib/netty-3.6.2.Final.jar

/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.1-amzn-1.jar
/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13.jar
/usr/lib/hadoop-mapreduce/aws-java-sdk-rds-1.10.48.jar
/usr/lib/hadoop-mapreduce/hadoop-extras.jar
/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar
/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar
/usr/lib/hadoop-mapreduce/aws-java-sdk-elasticbeanstalk-1.10.48.jar
/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar
/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar
/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar
/usr/lib/hadoop-mapreduce/aws-java-sdk-marketplacecommerceanalytics-1.10.48.jar
/usr/lib/hadoop-mapreduce/aws-java-sdk-datapipeline-1.10.48.jar
/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.1-amzn-1.jar
/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.7.1-amzn-1.jar
/usr/lib/hadoop-mapreduce/commons-io-2.4.jar
/usr/lib/hadoop-mapreduce/hadoop-archives-2.7.1-amzn-1.jar
/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar
/usr/lib/hadoop-mapreduce/aws-java-sdk-cloudtrail-1.10.48.jar
/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar
/usr/lib/hadoop-mapreduce/junit-4.11.jar
/usr/lib/hadoop-mapreduce/hadoop-aws-2.7.1-amzn-1.jar
/usr/lib/hadoop-mapreduce/aws-java-sdk-cloudfront-1.10.48.jar
/usr/lib/hadoop-mapreduce/aws-java-sdk-machinelearning-1.10.48.jar
/usr/lib/hadoop-mapreduce/aws-java-sdk-iam-1.10.48.jar
/usr/lib/hadoop-mapreduce/jackson-databind-2.4.4.jar
/usr/lib/hadoop-mapreduce/hadoop-sls-2.7.1-amzn-1.jar
/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-M15.jar
/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar
/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar
/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar
/usr/lib/hadoop-mapreduce/aws-java-sdk-efs-1.10.48.jar
/usr/lib/hadoop-mapreduce/aws-java-sdk-devicefarm-1.10.48.jar
/usr/lib/hadoop-mapreduce/hadoop-auth-2.7.1-amzn-1.jar
/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2.jar
/usr/lib/hadoop-mapreduce/zookeeper-3.4.6.jar
/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar
/usr/lib/hadoop-mapreduce/htrace-core-3.1.0-incubating.jar
/usr/lib/hadoop-mapreduce/aws-java-sdk-core-1.10.48.jar
/usr/lib/hadoop-mapreduce/aws-java-sdk-cognitoidentity-1.10.48.jar
/usr/lib/hadoop-mapreduce/httpclient-4.3.4.jar
/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.7.1-amzn-1.jar
/usr/lib/hadoop-mapreduce/jettison-1.1.jar
/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar
/usr/lib/hadoop-mapreduce/aws-java-sdk-autoscaling-1.10.48.jar
/usr/lib/hadoop-mapreduce/aws-java-sdk-simpledb-1.10.48.jar
/usr/lib/hadoop-mapreduce/aws-java-sdk-kms-1.10.48.jar
/usr/lib/hadoop-mapreduce/aws-java-sdk-api-gateway-1.10.48.jar
/usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar
/usr/lib/hadoop-mapreduce/aws-java-sdk-dynamodb-1.10.48.jar
/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar
/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar
/usr/lib/hadoop-mapreduce/aws-java-sdk-config-1.10.48.jar
/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.7.1-amzn-1.jar
/usr/lib/hadoop-mapreduce/aws-java-sdk-ssm-1.10.48.jar
/usr/lib/hadoop-mapreduce/hadoop-sls.jar
/usr/lib/hadoop-mapreduce/aws-java-sdk-cloudwatchmetrics-1.10.48.jar
/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar
/usr/lib/hadoop-mapreduce/httpcore-4.3.2.jar
/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.7.1-amzn-1.jar
/usr/lib/hadoop-mapreduce/aws-java-sdk-ses-1.10.48.jar
/usr/lib/hadoop-mapreduce/hadoop-auth.jar
/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar
/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.7.1-amzn-1.jar
/usr/lib/hadoop-mapreduce/asm-3.2.jar
/usr/lib/hadoop-mapreduce/xz-1.0.jar
/usr/lib/hadoop-mapreduce/commons-collections-3.2.1.jar
/usr/lib/hadoop-mapreduce/commons-net-3.1.jar
/usr/lib/hadoop-mapreduce/aws-java-sdk-cloudformation-1.10.48.jar
/usr/lib/hadoop-mapreduce/hadoop-rumen.jar
/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar
/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar
/usr/lib/hadoop-mapreduce/hadoop-azure-2.7.1-amzn-1.jar
/usr/lib/hadoop-mapreduce/aws-java-sdk-emr-1.10.48.jar
/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.7.1-amzn-1-tests.jar
/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar
/usr/lib/hadoop-mapreduce/aws-java-sdk-ecr-1.10.48.jar
/usr/lib/hadoop-mapreduce/aws-java-sdk-ec2-1.10.48.jar
/usr/lib/hadoop-mapreduce/jetty-util-6.1.26-emr.jar
/usr/lib/hadoop-mapreduce/hadoop-azure.jar
/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar
/usr/lib/hadoop-mapreduce/aws-java-sdk-importexport-1.10.48.jar
/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar
/usr/lib/hadoop-mapreduce/aws-java-sdk-iot-1.10.48.jar
/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.7.1-amzn-1.jar
/usr/lib/hadoop-mapreduce/jetty-6.1.26-emr.jar
/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar
/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar
/usr/lib/hadoop-mapreduce/aws-java-sdk-glacier-1.10.48.jar
/usr/lib/hadoop-mapreduce/aws-java-sdk-waf-1.10.48.jar
/usr/lib/hadoop-mapreduce/jackson-core-2.4.4.jar
/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar
/usr/lib/hadoop-mapreduce/aws-java-sdk-elastictranscoder-1.10.48.jar
/usr/lib/hadoop-mapreduce/aws-java-sdk-events-1.10.48.jar
/usr/lib/hadoop-mapreduce/aws-java-sdk-codepipeline-1.10.48.jar
/usr/lib/hadoop-mapreduce/aws-java-sdk-elasticache-1.10.48.jar
/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar
/usr/lib/hadoop-mapreduce/jackson-core-asl-1.9.13.jar
/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar
/usr/lib/hadoop-mapreduce/hadoop-aws.jar
/usr/lib/hadoop-mapreduce/gson-2.2.4.jar
/usr/lib/hadoop-mapreduce/aws-java-sdk-redshift-1.10.48.jar
/usr/lib/hadoop-mapreduce/aws-java-sdk-cognitosync-1.10.48.jar
/usr/lib/hadoop-mapreduce/aws-java-sdk-route53-1.10.48.jar
/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar
/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar
/usr/lib/hadoop-mapreduce/aws-java-sdk-cloudhsm-1.10.48.jar
/usr/lib/hadoop-mapreduce/hadoop-streaming-2.7.1-amzn-1.jar
/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar
/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar
/usr/lib/hadoop-mapreduce/aws-java-sdk-storagegateway-1.10.48.jar
/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar
/usr/lib/hadoop-mapreduce/hadoop-archives.jar
/usr/lib/hadoop-mapreduce/aws-java-sdk-1.10.48.jar
/usr/lib/hadoop-mapreduce/hadoop-openstack.jar
/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar
/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar
/usr/lib/hadoop-mapreduce/aws-java-sdk-opsworks-1.10.48.jar
/usr/lib/hadoop-mapreduce/hadoop-distcp.jar
/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar
/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar
/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar
/usr/lib/hadoop-mapreduce/aws-java-sdk-ecs-1.10.48.jar
/usr/lib/hadoop-mapreduce/aws-java-sdk-sts-1.10.48.jar
/usr/lib/hadoop-mapreduce/aws-java-sdk-codedeploy-1.10.48.jar
/usr/lib/hadoop-mapreduce/jackson-annotations-2.4.4.jar
/usr/lib/hadoop-mapreduce/hadoop-distcp-2.7.1-amzn-1.jar
/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar
/usr/lib/hadoop-mapreduce/aws-java-sdk-directory-1.10.48.jar
/usr/lib/hadoop-mapreduce/aws-java-sdk-cloudsearch-1.10.48.jar
/usr/lib/hadoop-mapreduce/paranamer-2.3.jar
/usr/lib/hadoop-mapreduce/aws-java-sdk-swf-libraries-1.10.48.jar
/usr/lib/hadoop-mapreduce/avro-1.7.4.jar
/usr/lib/hadoop-mapreduce/aws-java-sdk-support-1.10.48.jar
/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar
/usr/lib/hadoop-mapreduce/aws-java-sdk-elasticloadbalancing-1.10.48.jar
/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar
/usr/lib/hadoop-mapreduce/azure-storage-2.0.0.jar
/usr/lib/hadoop-mapreduce/aws-java-sdk-logs-1.10.48.jar
/usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar
/usr/lib/hadoop-mapreduce/hadoop-streaming.jar
/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.7.1-amzn-1.jar
/usr/lib/hadoop-mapreduce/aws-java-sdk-sqs-1.10.48.jar
/usr/lib/hadoop-mapreduce/aws-java-sdk-kinesis-1.10.48.jar
/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.1-amzn-1.jar
/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar
/usr/lib/hadoop-mapreduce/activation-1.1.jar
/usr/lib/hadoop-mapreduce/emr-metrics-client-2.1.0.jar
/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar
/usr/lib/hadoop-mapreduce/aws-java-sdk-directconnect-1.10.48.jar
/usr/lib/hadoop-mapreduce/aws-java-sdk-sns-1.10.48.jar
/usr/lib/hadoop-mapreduce/aws-java-sdk-workspaces-1.10.48.jar
/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar
/usr/lib/hadoop-mapreduce/aws-java-sdk-s3-1.10.48.jar
/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar
/usr/lib/hadoop-mapreduce/aws-java-sdk-cloudwatch-1.10.48.jar
/usr/lib/hadoop-mapreduce/guava-11.0.2.jar
/usr/lib/hadoop-mapreduce/hadoop-ant-2.7.1-amzn-1.jar
/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar
/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.1-amzn-1.jar
/usr/lib/hadoop-mapreduce/aws-java-sdk-elasticsearch-1.10.48.jar

/usr/lib/hadoop-mapreduce/hadoop-ant.jar=

/usr/lib/hadoop-mapreduce/aws-java-sdk-codecomm= it-1.10.48.jar

/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar

/usr/lib/hadoop-mapreduce/netty-3.6.2.Final.jar=

/usr/lib/hadoop-mapreduce/aws-java-sdk-lambda-1= .10.48.jar

/usr/lib/hadoop-mapreduce/joda-time-2.8.1.jar

/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar=

/usr/lib/hadoop-mapreduce/hadoop-extras-2.7.1-a= mzn-1.jar

/usr/lib/hadoop-mapreduce/aws-java-sdk-simplewo= rkflow-1.10.48.jar

/usr/lib/hadoop-mapreduce/hadoop-openstack-2.7.= 1-amzn-1.jar

/usr/lib/hadoop-mapreduce/aws-java-sdk-inspecto= r-1.10.48.jar

/usr/lib/hadoop-mapreduce/lib/jackson-mapper-as= l-1.9.13.jar

/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.ja= r

/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar<= u>

/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar=

/usr/lib/hadoop-mapreduce/lib/javax.inject-1.ja= r

/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.= jar

/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar<= /u>

/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.j= ar

/usr/lib/hadoop-mapreduce/lib/commons-compress-= 1.4.1.jar

/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar

/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar=

/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.= 8.jar

/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5= .0.jar

/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.j= ar

/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-= 1.9.13.jar

/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4= .1.jar

/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar=

/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar=

/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0= .jar

/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9= .jar

/usr/lib/hadoop-mapreduce/lib/netty-3.6.2.Final= .jar

/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3= .jar

/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.7.1= -amzn-1.jar

/usr/lib/hadoop-yarn/hadoop-yarn-client.jar<= /u>

/usr/lib/hadoop-yarn/hadoop-yarn-applications-d= istributedshell.jar

/usr/lib/hadoop-yarn/hadoop-yarn-api-2.7.1-amzn= -1.jar

/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.1-a= mzn-1.jar

/usr/lib/hadoop-yarn/hadoop-yarn-applications-d= istributedshell-2.7.1-amzn-1.jar

/usr/lib/hadoop-yarn/hadoop-yarn-server-resourc= emanager.jar

/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedc= achemanager.jar

/usr/lib/hadoop-yarn/hadoop-yarn-server-common.= jar

/usr/lib/hadoop-yarn/hadoop-yarn-server-web-pro= xy-2.7.1-amzn-1.jar

/usr/lib/hadoop-yarn/hadoop-yarn-server-sharedc= achemanager-2.7.1-amzn-1.jar

/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2= .7.1-amzn-1.jar

/usr/lib/hadoop-yarn/hadoop-yarn-server-nodeman= ager-2.7.1-amzn-1.jar

/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar

/usr/lib/hadoop-yarn/hadoop-yarn-applications-u= nmanaged-am-launcher.jar

/usr/lib/hadoop-yarn/hadoop-yarn-server-web-pro= xy.jar

/usr/lib/hadoop-yarn/hadoop-yarn-api.jar=

/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.j= ar

/usr/lib/hadoop-yarn/hadoop-yarn-server-resourc= emanager-2.7.1-amzn-1.jar

/usr/lib/hadoop-yarn/hadoop-yarn-server-applica= tionhistoryservice-2.7.1-amzn-1.jar

/usr/lib/hadoop-yarn/hadoop-yarn-server-nodeman= ager.jar

/usr/lib/hadoop-yarn/hadoop-yarn-common-2.7.1-a= mzn-1.jar

/usr/lib/hadoop-yarn/hadoop-yarn-common.jar<= /u>

/usr/lib/hadoop-yarn/hadoop-yarn-server-applica= tionhistoryservice.jar

/usr/lib/hadoop-yarn/hadoop-yarn-server-common-= 2.7.1-amzn-1.jar

/usr/lib/hadoop-yarn/hadoop-yarn-applications-u= nmanaged-am-launcher-2.7.1-amzn-1.jar

/usr/lib/hadoop-yarn/lib/jackson-xc-1.9.13.jar<= u>

/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.9= .13.jar

/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar<= /u>

/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.9.13.j= ar

/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar

/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar=

/usr/lib/hadoop-yarn/lib/javax.inject-1.jar<= /u>

/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar

/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6.jar=

/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar

/usr/lib/hadoop-yarn/lib/jettison-1.1.jar

/usr/lib/hadoop-yarn/lib/guice-3.0.jar

/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar=

/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1= .jar

/usr/lib/hadoop-yarn/lib/asm-3.2.jar<= /u>

/usr/lib/hadoop-yarn/lib/xz-1.0.jar

/usr/lib/hadoop-yarn/lib/commons-collections-3.= 2.1.jar

/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar=

/usr/lib/hadoop-yarn/lib/jetty-util-6.1.26-emr.= jar

/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar<= u>

/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.ja= r

/usr/lib/hadoop-yarn/lib/jetty-6.1.26-emr.jar

/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar=

/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.= jar

/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar=

/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.9.1= 3.jar

/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar<= /u>

/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar=

/usr/lib/hadoop-yarn/lib/zookeeper-3.4.6-tests.= jar

/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar<= /u>

/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar<= u>

/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar<= u>

/usr/lib/hadoop-yarn/lib/guice-servlet-3.0.jar<= u>

/usr/lib/hadoop-yarn/lib/activation-1.1.jar<= /u>

/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar

/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar<= u>

/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar

/usr/lib/hadoop-lzo/lib/hadoop-lzo-0.4.19.jar

/usr/lib/hadoop-lzo/lib/hadoop-lzo.jar

/usr/share/aws/emr/emrfs/conf

/usr/share/aws/emr/emrfs/lib/jsr-275-0.9.1.jar<= u>

/usr/share/aws/emr/emrfs/lib/junit-4.11.jar<= /u>

/usr/share/aws/emr/emrfs/lib/commons-cli-1.2.ja= r

/usr/share/aws/emr/emrfs/lib/javax.inject-1.jar=

/usr/share/aws/emr/emrfs/lib/commons-codec-1.9.= jar

/usr/share/aws/emr/emrfs/lib/httpclient-4.3.4.j= ar

/usr/share/aws/emr/emrfs/lib/commons-httpclient= -3.0.jar

/usr/share/aws/emr/emrfs/lib/guice-3.0.jar

/usr/share/aws/emr/emrfs/lib/httpcore-4.3.2.jar=

/usr/share/aws/emr/emrfs/lib/joda-time-2.3.jar<= u>

/usr/share/aws/emr/emrfs/lib/bcprov-jdk15on-1.5= 1.jar

/usr/share/aws/emr/emrfs/lib/emrfs-hadoop-2.4.0= .jar

/usr/share/aws/emr/emrfs/lib/protobuf-java-2.5.= 0.jar

/usr/share/aws/emr/emrfs/lib/slf4j-api-1.7.16.j= ar

/usr/share/aws/emr/emrfs/lib/aopalliance-1.0.ja= r

/usr/share/aws/emr/emrfs/lib/commons-logging-1.= 1.3.jar

/usr/share/aws/emr/emrfs/lib/commons-lang3-3.3.= jar

/usr/share/aws/emr/emrfs/lib/commons-math-2.1.j= ar

/usr/share/aws/emr/emrfs/lib/gson-2.2.4.jar<= /u>

/usr/share/aws/emr/emrfs/lib/jsr305-2.0.1.jar

/usr/share/aws/emr/emrfs/lib/emr-core-2.5.0.jar=

/usr/share/aws/emr/emrfs/lib/emr-metrics-client= -2.1.0.jar

/usr/share/aws/emr/emrfs/lib/commons-exec-1.2.j= ar

/usr/share/aws/emr/emrfs/lib/guava-15.0.jar<= /u>

/usr/share/aws/emr/emrfs/lib/bcpkix-jdk15on-1.5= 1.jar

/usr/share/aws/emr/emrfs/lib/hamcrest-core-1.3.= jar

/usr/share/aws/emr/emrfs/auxlib/*=

/usr/share/aws/emr/lib/jsr-275-0.9.1.jar=

/usr/share/aws/emr/lib/commons-httpclient-3.0.j= ar

/usr/share/aws/emr/lib/joda-time-2.3.jar=

/usr/share/aws/emr/lib/slf4j-api-1.7.16.jar<= /u>

/usr/share/aws/emr/lib/commons-codec-1.2.jar=

/usr/share/aws/emr/lib/gson-2.2.4.jar=

/usr/share/aws/emr/lib/commons-logging-1.0.3.ja= r

/usr/share/aws/emr/lib/jsr305-2.0.1.jar<= u>

/usr/share/aws/emr/lib/emr-core-2.5.0.jar

/usr/share/aws/emr/ddb/lib/emr-ddb-hadoop.jar

/usr/share/aws/emr/goodies/lib/emr-hadoop-goodi= es.jar

/usr/share/aws/emr/kinesis/lib/emr-kinesis-hado= op.jar

/usr/share/aws/emr/cloudwatch-sink/lib/cloudwat= ch-sink-1.0.0.jar

/usr/share/aws/emr/cloudwatch-sink/lib/cloudwat= ch-sink.jar

=C2=A0

On Tue, Nov 1, 2016 at 3:57 AM, Till Rohrmann <trohrmann@apache.org> wrote:

Hi Justin,

I think this might be a problem in Flink's Kinesis consumer. The Flink Kinesis consumer uses aws-java-sdk version 1.10.71, which indeed contains the aforementioned method. However, version 1.10.46, for instance, does not contain this method, so I suspect that YARN puts some older version of this jar on the classpath. For these cases, I think we have to shade our aws-java-sdk dependency so that it also works with older versions of EMR.

In order to verify this, could you tell us which EMR version you're running? Additionally, it would be helpful if you sent us the classpath with which the Flink cluster was started on YARN. You can find this information at the beginning of your TaskManager log file. Thanks a lot.

Cheers,

Till
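One direct way to verify the suspicion above is to ask the JVM inside the YARN container which jar actually supplied the class. This is a minimal standalone sketch (the class name `WhoLoadedIt` is my own; on the cluster you would pass `com.amazonaws.SDKGlobalConfiguration` as the argument, e.g. from a small job or a `java -cp` invocation with the container's classpath):

```java
// WhoLoadedIt.java - print the code source (jar or directory) of a class.
public class WhoLoadedIt {
    public static void main(String[] args) throws Exception {
        // Default to java.lang.String only so the sketch runs standalone;
        // on the cluster, pass the AWS class name as the first argument.
        String name = args.length > 0 ? args[0] : "java.lang.String";
        Class<?> c = Class.forName(name);
        java.security.CodeSource src = c.getProtectionDomain().getCodeSource();
        // JDK core classes have no code source; everything else reports
        // the jar/directory URL it was loaded from.
        System.out.println(name + " loaded from: "
                + (src == null ? "bootstrap/platform loader" : src.getLocation()));
    }
}
```

If this prints one of the `/usr/lib/hadoop-mapreduce/aws-java-sdk-*-1.10.48.jar` paths rather than the user jar, the older EMR-provided SDK is winning.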

On Mon, Oct 31, 2016 at 8:22 PM, Justin Yan <justin.yan@remitly.com> wrote:

Hi all - first time on the mailing list, so my apologies if I break protocol on anything. Really excited to be using Flink, and hoping to be active here in the future! Also, apologies for the length of this email - I tried to include details but may have gone overboard.

The gist of my problem is an issue with packaging the Flink Kinesis connector into my user code for execution on a YARN cluster in EMR - there's some dependency trouble happening, but after about 48 hours of attempts, I'm not sure how to make progress, and I'd really appreciate any ideas or assistance. Thank you in advance!

### First, Some Context.

We're hoping to write our Flink jobs in Scala 2.11. The Flink JM/TMs currently run on an EMR cluster with Hadoop 2.7 as YARN containers. We run our jobs via an Azkaban server, which has the Hadoop and Flink clients installed, and the configurations are set to point at the YARN master on our EMR cluster (with $HADOOP_HOME set so Flink can discover the Hadoop configs). We're using Java OpenJDK7 everywhere, and Maven 3.3.9 when building Flink from source.

We use SBT and the assembly plugin to create an uberjar of our code and its dependencies. This gets uploaded to Azkaban, whereupon the following command is run on the Azkaban server to execute a Flink job:

flink run -c <className> usercodeuberjar-assembly-1.0.jar

I've successfully run a few Flink jobs that execute on our EMR cluster in this fashion (the WordCount example, etc.).

### The Problem

We use AWS Kinesis, and are hoping to integrate Flink with it. Naturally, we were hoping to use the Kinesis connector: <https://ci.apache.org/projects/flink/flink-docs-release-1.1/apis/streaming/connectors/kinesis.html>.

After following the instructions with some experimentation, I was able to run a Flink Kinesis application on my laptop in local cluster mode (Ubuntu 16.04, local cluster initiated with the `./start-local.sh` command, job submitted via `flink run -c <className> usercodeuberjar-assembly-1.0.jar`).

I uploaded the same JAR to Azkaban and tried to run the same command to submit to our EMR cluster, and got a `java.lang.NoSuchMethodError: com.amazonaws.SDKGlobalConfiguration.isInRegionOptimizedModeEnabled()` (I've included the full stack trace at the bottom of this email). I went to inspect the uploaded JAR with `unzip usercodeuberjar-assembly-1.0.jar`, looked in `com/amazonaws`, and found the SDKGlobalConfiguration.class file. I decompiled and inspected it, and the isInRegionOptimizedModeEnabled method that was purportedly missing was indeed present.

I've included the steps I took to manifest this problem below, along with a variety of things that I tried in order to resolve it - any help or insight is greatly appreciated!
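The decompile check can also be done reflectively at runtime, which shows what the *cluster* classpath resolves to rather than what the uploaded jar contains. A minimal sketch (the class name `MethodCheck` is hypothetical; run standalone without the AWS SDK it simply reports the class as missing):

```java
// MethodCheck.java - check at runtime whether the method from the stack
// trace is visible on the current classpath.
public class MethodCheck {
    public static void main(String[] args) {
        try {
            Class<?> c = Class.forName("com.amazonaws.SDKGlobalConfiguration");
            c.getMethod("isInRegionOptimizedModeEnabled");
            System.out.println("method present");
        } catch (ClassNotFoundException e) {
            System.out.println("class not on classpath");
        } catch (NoSuchMethodException e) {
            // This is the situation the NoSuchMethodError points at: the
            // class resolves, but from a jar that predates the method.
            System.out.println("class found, method missing");
        }
    }
}
```

Running this from inside the YARN container (or at the start of the job) would distinguish "wrong jar won" from "class absent entirely".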

### Repro

I'm not sure how to provide a clear repro, but I'll try to include as much detail as I can about the sequence of actions and commands I ran, since there may be some obvious mistakes:

Downloading the Flink release to my laptop:

tar xfzv flink-1.1.3-bin-hadoop27-scala_2.11.tgz

I then SSH'd into Azkaban and ran the same two commands, while adding the bin/ directory to my PATH and tweaking the config for fs.hdfs.hadoopconf. Next, after getting the Flink binaries, I went to fetch the source code in order to follow the instructions here: <https://ci.apache.org/projects/flink/flink-docs-release-1.1/apis/streaming/connectors/kinesis.html>

tar xfzv release-1.1.3.tar.gz

Here, I wanted to leverage our EMR instance profile role instead of passing in credentials, hence I wanted the AUTO value for the "aws.credentials.provider" config, which seems to have been added after 1.1.3 - I made a couple of small tweaks to AWSConfigConstants.java and AWSUtil.java to allow for that AUTO value.

Next, we're using Scala 2.11, so per the instructions here, I changed the Scala version: <https://ci.apache.org/projects/flink/flink-docs-release-1.1/setup/building.html#scala-versions>

tools/change-scala-version.sh 2.11

Back to the Kinesis connector documentation...

mvn clean install -Pinclude-kinesis -DskipTests
cd flink-dist
mvn clean install -Pinclude-kinesis -DskipTests

When running that second mvn clean install, I get some warnings about the maven shade plugin having conflicting versions. I also get a "[WARNING] The requested profile "include-kinesis" could not be activated because it does not exist."

At this point, the instructions are not too clear on what to do. I proceed to this section to try and figure it out: <https://ci.apache.org/projects/flink/flink-docs-release-1.1/apis/cluster_execution.html#linking-with-modules-not-contained-in-the-binary-distribution>

My goal is to package everything in my usercode JAR, and I'll try to do that with SBT. My first try is to install the Flink Kinesis connector JAR generated by mvn clean install to my local Maven repo:

mvn install:install-file -Dfile=flink-connector-kinesis_2.11-1.1.3.jar

I then build the jar with a build.sbt that looks like this (extraneous details removed):

scalaVersion in ThisBuild := "2.11.8"

val flinkVersion = "1.1.3"

val flinkDependencies = Seq(
  "org.apache.flink" %% "flink-scala" % flinkVersion,
  "org.apache.flink" %% "flink-streaming-scala" % flinkVersion,
  "org.apache.flink" %% "flink-connector-kinesis" % flinkVersion
)

lazy val proj = (project in file(".")).
  settings(
    libraryDependencies ++= flinkDependencies
  )

After this builds, I unzip the jar and use JD to decompile the com.amazonaws.SDKGlobalConfiguration class file to see if the method in question is present or not (it is). I then run the jar locally with `flink run -c <className> usercodeuberjar-assembly-1.0.jar`, and I see it running just fine when navigating to localhost:8081. I then upload this same JAR to our Azkaban server, and run the same `flink run -c <className> usercodeuberjar-assembly-1.0.jar` command to submit as a YARN application - this time, I get the `NoSuchMethodError`.

I've tried a variety of permutations of this, so I'll attempt to list them out along with their results:

1. A non-Kinesis Flink job: I was able to successfully run the example WordCount Flink job as a YARN application.
2. I mvn installed the newly built flink-scala and flink-streaming-scala JARs to my local Maven repository in case these were different - after building and running on Azkaban... same error.
3. Using the newly built flink-dist JAR (with the -Pinclude-kinesis flag): after replacing the flink-dist JAR in the /lib dir on Azkaban (that the `flink` command runs), I still had the same error.
4. Packaging the JAR in different ways:
    - I tried adding the flink-connector-kinesis JAR to a /lib directory in my SBT project for direct inclusion. This actually caused the NoSuchMethodError to occur during *local* execution as well.
    - I tried using mvn-assembly to package all of the flink-connector-kinesis dependencies into that JAR, and then added it to the /lib directory in my SBT project. Local execution no longer has the error, but submission from Azkaban still has the same error.
5. I thought it might be a classpath issue (since my laptop doesn't have a Hadoop installation, so I figured there may be some kind of collision with the AWS SDK included by EMR), so I set, on Azkaban, the environment variable FLINK_CLASSPATH=usercodeuberjar-assembly-1.0.jar in order to get it prepended - same error.
6. I realized this wasn't necessarily doing anything to the resolution of classnames of the Flink job executing in YARN. So I dug into the client source, which eventually led me to flink-clients/.../program/PackagedProgram.java, which has the following line of code setting the ClassLoader:

this.userCodeClassLoader = JobWithJars.buildUserCodeClassLoader(getAllLibraries(), classpaths, getClass().getClassLoader());

getAllLibraries() does seem to set the jar that you pass into the `flink` command at the top of the class resolution hierarchy, which, as my previous foray into decompilation shows, does seem to include the method that is supposedly missing.

At this point, I ran out of ideas to investigate, and so I'm hoping someone here is able to help me. Thanks in advance for reading this!
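One background note that may explain why prepending the user jar doesn't help: a plain java.net.URLClassLoader delegates to its *parent* first, so any class already visible on the parent (here, the YARN/Hadoop) classpath shadows the copy in the child's jars, regardless of ordering within the child. A standalone sketch of that delegation (the class name `ParentFirstDemo` is my own, and Flink's actual user-code classloader setup may differ from this default):

```java
import java.net.URL;
import java.net.URLClassLoader;

public class ParentFirstDemo {
    public static void main(String[] args) throws Exception {
        // A child loader with no jars of its own, parented to the app loader.
        URLClassLoader child = new URLClassLoader(new URL[0],
                ParentFirstDemo.class.getClassLoader());
        // loadClass consults the parent chain first, so the class resolves
        // even though the child has no URLs - the parent's copy always wins.
        Class<?> c = child.loadClass("java.lang.String");
        System.out.println(c.getClassLoader() == null
                ? "resolved by parent (bootstrap) loader"
                : "resolved by child");
        child.close();
    }
}
```

This is why shading (renaming the packaged SDK classes so they no longer collide) sidesteps the problem where classpath reordering does not.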

Full Stack Trace:

java.lang.NoSuchMethodError: com.amazonaws.SDKGlobalConfiguration.isInRegionOptimizedModeEnabled()Z
    at com.amazonaws.ClientConfigurationFactory.getConfig(ClientConfigurationFactory.java:35)
    at org.apache.flink.streaming.connectors.kinesis.util.AWSUtil.createKinesisClient(AWSUtil.java:50)
    at org.apache.flink.streaming.connectors.kinesis.proxy.KinesisProxy.<init>(KinesisProxy.java:118)
    at org.apache.flink.streaming.connectors.kinesis.proxy.KinesisProxy.create(KinesisProxy.java:176)
    at org.apache.flink.streaming.connectors.kinesis.internals.KinesisDataFetcher.<init>(KinesisDataFetcher.java:188)
    at org.apache.flink.streaming.connectors.kinesis.FlinkKinesisConsumer.run(FlinkKinesisConsumer.java:198)
    at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:80)
    at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:53)
    at org.apache.flink.streaming.runtime.tasks.SourceStreamTask.run(SourceStreamTask.java:56)
    at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:266)
    at org.apache.flink.runtime.taskmanager.Task.run(Task.java:585)
    at java.lang.Thread.run(Thread.java:745)
