From: Aljoscha Krettek
Subject: Re: Unable to write snapshots to S3 on EMR
Date: Thu, 19 Oct 2017 11:09:45 +0200
To: dev@flink.apache.org

Hi,

I can't spot anything obviously wrong with your config. I've been quite busy lately with traveling and release preparations. I will get back to this and try it myself, though, once I have time again.
Best,
Aljoscha

> On 12. Oct 2017, at 17:07, Andy M. wrote:
> 
> Hi Aljoscha,
> 
> That didn't seem to do the trick either. Does the following look correct?
> 
> I see 5.9.0 is released with Flink 1.3.2, so I tried that and got the same
> problem. All I did was upload my Scala .jar to the master, update my
> flink-conf.yaml, set my env variables, and run it with the following
> command: HADOOP_CONF_DIR=/etc/hadoop/conf flink run -m yarn-cluster -yn 5 -ys 2 ~/flink-consumer.jar
> 
> I am beginning to think something else may be wrong with my configuration.
> Does the following look correct?
> 
> In flink-conf.yaml:
> state.backend: rocksdb
> state.backend.fs.checkpointdir: s3://org/flink-project/state
> state.checkpoints.dir: s3://org/flink-project/state
> 
> In the code:
> env.enableCheckpointing(getCheckpointRate, CheckpointingMode.EXACTLY_ONCE)
> env.getCheckpointConfig.enableExternalizedCheckpoints(ExternalizedCheckpointCleanup.RETAIN_ON_CANCELLATION)
> env.getCheckpointConfig.setMinPauseBetweenCheckpoints(1000)
> env.getCheckpointConfig.setCheckpointTimeout(5000)
> env.getCheckpointConfig.setMaxConcurrentCheckpoints(1)
> env.setStateBackend(new RocksDBStateBackend("s3://org/flink-project/state", true))
> 
> Thank you
> 
> On Wed, Oct 11, 2017 at 5:19 AM, Aljoscha Krettek wrote:
> 
>> Hi Andy,
>> 
>> I remember that I was testing a job with almost exactly the same setup as
>> part of the Flink 1.3.2 release testing. The command I used to start my job
>> is roughly this:
>> 
>> HADOOP_CONF_DIR=/etc/hadoop/conf bin/flink run -c my.main.Class -m yarn-cluster -yn 5 -ys 2 ...
>> 
>> i.e. I export the proper Hadoop config dir and I run a per-job YARN
>> cluster. I think I also exported the result of "hadoop classpath" as
>> HADOOP_CLASSPATH.
>> 
>> Best,
>> Aljoscha
>> 
>>> On 10. Oct 2017, at 16:43, Andy M.
wrote:
>>> 
>>> Hello,
>>> 
>>> Bowen: Unless I am missing something, it says there needs to be no setup
>>> on EMR. Each topic says: "You don't have to configure this manually if you
>>> are running Flink on EMR." S3 access from the CLI works fine on my clusters.
>>> 
>>> Chen: Thank you for this, I will look into it if I am unable to get this
>>> running on YARN successfully.
>>> 
>>> Stephan: Removing the said library causes the flink
>>> (flink-1.3.2/bin/flink) bash script to fail. The underlying Java needs
>>> it to work. I also tried explicitly setting the classpath for the java call
>>> to point to the Hadoop library jars. This is the original java
>>> command that I was trying to run:
>>> 
>>> java
>>> -Dlog.file=/home/hadoop/flink-1.3.2/log/flink-hadoop-client-ip-172-31-19-27.log
>>> -Dlog4j.configuration=file:/home/hadoop/flink-1.3.2/conf/log4j-cli.properties
>>> -Dlogback.configurationFile=file:/home/hadoop/flink-1.3.2/conf/logback.xml
>>> -classpath
>>> /home/hadoop/flink-1.3.2/lib/flink-python_2.11-1.3.2.jar:/home/hadoop/flink-1.3.2/lib/flink-shaded-hadoop2-uber-1.3.
>>> 2.jar:/home/hadoop/flink-1.3.2/lib/log4j-1.2.17.jar:/home/hadoop/flink-1.3.2/lib/slf4j-log4j12-1.7.7.jar:/home/hadoop/flink-1.3.2/lib/flink-dist_2.11-1.3.2.jar::/etc/hadoop/conf:
>>> org.apache.flink.client.CliFrontend run -m yarn-cluster -yn 1
>>> /home/hadoop/flink-consumer.jar
>>> 
>>> This is what I changed it to (removing the shaded-hadoop2-uber jar and
>>> adding in the hadoop folder):
>>> 
>>> java
>>> -Dlog.file=/home/hadoop/flink-1.3.2/log/flink-hadoop-client-ip-172-31-19-27.log
>>> -Dlog4j.configuration=file:/home/hadoop/flink-1.3.2/conf/log4j-cli.properties
>>> -Dlogback.configurationFile=file:/home/hadoop/flink-1.3.2/conf/logback.xml
>>> -classpath
>>> /home/hadoop/flink-1.3.2/lib/flink-python_2.11-1.3.2.jar:/home/hadoop/flink-1.3.2/lib/log4j-1.2.17.jar:/home/hadoop/flink-1.3.2/lib/slf4j-log4j12-1.7.7.jar:/home/hadoop/flink-1.3.2/lib/flink-dist_2.11-1.3.2.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/lib/hadoop/lib/mockito-all-1.8.5.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.10.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/jetty-6.1.26-emr.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26-emr.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/httpclient-4.5.3.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/zookeeper-3.4.10.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/httpcore-4.4.4.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/jsr305-3.0.0.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/etc/hadoop/conf
>>> org.apache.flink.client.CliFrontend run -m yarn-cluster -yn 1
>>> /home/hadoop/flink-consumer.jar
>>> 
>>> The latter throws the following error:
>>> 
>>> SLF4J: Class path contains multiple SLF4J bindings.
>>> SLF4J: Found binding in
>>> [jar:file:/home/hadoop/flink-1.3.2/lib/slf4j-log4j12-1.7.7.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>> SLF4J: Found binding in
>>> [jar:file:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
>>> explanation.
>>> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
>>> Exception in thread "main" java.lang.NoClassDefFoundError:
>>> org/apache/hadoop/util/VersionInfo
>>>     at org.apache.flink.runtime.util.EnvironmentInformation.logEnvironmentInfo(EnvironmentInformation.java:283)
>>>     at org.apache.flink.client.CliFrontend.main(CliFrontend.java:1124)
>>> Caused by: java.lang.ClassNotFoundException:
>>> org.apache.hadoop.util.VersionInfo
>>>     at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>>>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>>>     ... 2 more
>>> 
>>> Simply removing the .jar from the folder causes the same error.
>>> 
>>> Thank you
>>> 
>>> On Mon, Oct 9, 2017 at 5:46 AM, Stephan Ewen wrote:
>>> 
>>>> Hi!
>>>> 
>>>> It looks like multiple Hadoop versions are in the classpath: Flink's
>>>> Hadoop jar and the EMR Hadoop jars.
>>>> I would simply drop Flink's own Hadoop dependency and only use the EMR
>>>> Hadoop jars.
>>>> 
>>>> Delete the 'flink-shaded-hadoop2-uber' jar from Flink, and make sure the
>>>> setup is such that the Hadoop lib environment variable is set. Then it
>>>> should not have conflicts any more.
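[Archive note: Stephan's suggestion boils down to two steps. The sketch below rehearses them against a scratch directory — the /tmp path is a stand-in for the real /home/hadoop/flink-1.3.2 layout on an EMR master, not a verified procedure from the thread.]

```shell
# Rehearsal of the two-step fix against a scratch copy of the lib layout.
# /tmp/flink-demo stands in for /home/hadoop/flink-1.3.2 on the EMR master.
FLINK_LIB=/tmp/flink-demo/lib
mkdir -p "$FLINK_LIB"
touch "$FLINK_LIB/flink-shaded-hadoop2-uber-1.3.2.jar" \
      "$FLINK_LIB/flink-dist_2.11-1.3.2.jar"

# Step 1: drop Flink's bundled Hadoop so only the EMR Hadoop jars remain.
rm "$FLINK_LIB/flink-shaded-hadoop2-uber-1.3.2.jar"

# Step 2: hand the cluster's Hadoop config and classpath to Flink via the
# environment (on a real EMR master, `hadoop classpath` prints the jar list;
# the echo fallback just keeps this sketch runnable elsewhere).
export HADOOP_CONF_DIR=/etc/hadoop/conf
export HADOOP_CLASSPATH="$(hadoop classpath 2>/dev/null || echo '/usr/lib/hadoop/*')"

ls "$FLINK_LIB"   # the uber jar is gone; flink-dist remains
```

On the actual cluster the rm targets /home/hadoop/flink-1.3.2/lib, after which the HADOOP_CONF_DIR=/etc/hadoop/conf flink run invocation from earlier in the thread should resolve Hadoop classes only from the EMR-provided jars.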
>>>> 
>>>> On Sun, Oct 8, 2017 at 12:08 AM, Chen Qin wrote:
>>>> 
>>>>> Attached is my side project, verified working, which deploys the jobmanager and
>>>>> taskmanager as stateless services (non-YARN/Mesos); configuration here:
>>>>> 
>>>>> https://github.com/chenqin/flink-jar/tree/master/config/hadoop
>>>>> 
>>>>> more detail here:
>>>>> https://github.com/chenqin/flink-jar/blob/master/src/main/java/FlinkBootstrap.java#L49
>>>>> 
>>>>> On Fri, Oct 6, 2017 at 10:26 PM, Bowen Li wrote:
>>>>> 
>>>>>> Hi Andy,
>>>>>> 
>>>>>> I believe it's because you didn't set your S3 impl correctly. Try to set
>>>>>> your core-site.xml by following
>>>>>> https://ci.apache.org/projects/flink/flink-docs-release-1.4/ops/deployment/aws.html#s3afilesystem-recommended
>>>>>> 
>>>>>> Bowen
>>>>>> 
>>>>>> On Fri, Oct 6, 2017 at 7:59 AM, Andy M. wrote:
>>>>>> 
>>>>>>> Hi Till,
>>>>>>> 
>>>>>>> Seems like everything is in line there: hadoop-common.jar ->
>>>>>>> hadoop-common-2.7.3-amzn-3.jar
>>>>>>> 
>>>>>>> And when I decompiled that jar I see public void addResource(Configuration conf)
>>>>>>> in org/apache/hadoop/conf/Configuration.java
>>>>>>> 
>>>>>>> I agree that an incorrect version of the jar is probably being run. Is
>>>>>>> there a way to limit the classpath for the TaskManager when starting the
>>>>>>> job?
>>>>>>> 
>>>>>>> Thank you
>>>>>>> 
>>>>>>> On Fri, Oct 6, 2017 at 6:49 AM, Till Rohrmann wrote:
>>>>>>> 
>>>>>>>> Hi Andy,
>>>>>>>> 
>>>>>>>> could you check which Hadoop version this jar
>>>>>>>> /usr/lib/hadoop/hadoop-common.jar is? Maybe also check whether the
>>>>>>>> contained Hadoop Configuration class has the method
>>>>>>>> Configuration.addResource(Lorg/apache/hadoop/conf/Configuration;)V. Maybe
>>>>>>>> this jar is the culprit because it comes from a different Hadoop version.
>>>>>>>>=20 >>>>>>>> Cheers, >>>>>>>> Till >>>>>>>> =E2=80=8B >>>>>>>>=20 >>>>>>>> On Thu, Oct 5, 2017 at 4:22 PM, Andy M. = wrote: >>>>>>>>=20 >>>>>>>>> Hi Till, >>>>>>>>>=20 >>>>>>>>> I believe this is what you are looking for, classpath is much >>>>> bigger >>>>>>> for >>>>>>>>> the task manager. I can also post the whole log file if = needed: >>>>>>>>>=20 >>>>>>>>> 2017-10-05 14:17:53,038 INFO org.apache.flink.yarn. >>>>>>>> YarnTaskManagerRunner >>>>>>>>> - Classpath: >>>>>>>>> flink-consumer.jar:lib/flink-dist_2.11-1.3.2.jar:lib/flink- >>>>>>>>> python_2.11-1.3.2.jar:lib/flink-shaded-hadoop2-uber-1.3. >>>>>>>>> 2.jar:lib/log4j-1.2.17.jar:lib/slf4j-log4j12-1.7.7.jar: >>>>>>>>> log4j.properties:logback.xml:flink.jar:flink-conf.yaml::/ >>>>>>>>> etc/hadoop/conf:/usr/lib/hadoop/hadoop-common-2.7.3- >>>>>>>>> amzn-3-tests.jar:/usr/lib/hadoop/hadoop-annotations-2.7. >>>>>>>>> 3-amzn-3.jar:/usr/lib/hadoop/hadoop-distcp.jar:/usr/lib/ >>>>>>>>> hadoop/hadoop-auth-2.7.3-amzn-3.jar:/usr/lib/hadoop/hadoop- >>>>>>>>> nfs-2.7.3-amzn-3.jar:/usr/lib/hadoop/hadoop-streaming-2.7.3- >>>>>>>>> amzn-3.jar:/usr/lib/hadoop/hadoop-ant-2.7.3-amzn-3.jar:/ >>>>>>>>> usr/lib/hadoop/hadoop-distcp-2.7.3-amzn-3.jar:/usr/lib/ >>>>>>>>> hadoop/hadoop-datajoin.jar:/usr/lib/hadoop/hadoop- >>>>>>>>> streaming.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/ >>>>>>>>> hadoop/hadoop-ant.jar:/usr/lib/hadoop/hadoop-sls.jar:/ >>>>>>>>> usr/lib/hadoop/hadoop-azure-2.7.3-amzn-3.jar:/usr/lib/ >>>>>>>>> hadoop/hadoop-nfs.jar:/usr/lib/hadoop/hadoop-extras-2.7. >>>>>>>>> 3-amzn-3.jar:/usr/lib/hadoop/hadoop-gridmix.jar:/usr/lib/ >>>>>>>>> hadoop/hadoop-common-2.7.3-amzn-3.jar:/usr/lib/hadoop/ >>>>>>>>> hadoop-annotations.jar:/usr/lib/hadoop/hadoop-openstack-2. >>>>>>>>> 7.3-amzn-3.jar:/usr/lib/hadoop/hadoop-archives-2.7.3- >>>>>>>>> amzn-3.jar:/usr/lib/hadoop/hadoop-azure.jar:/usr/lib/ >>>>>>>>> hadoop/hadoop-extras.jar:/usr/lib/hadoop/hadoop-openstack. 
>>>>>>>>> jar:/usr/lib/hadoop/hadoop-rumen.jar:/usr/lib/hadoop/ >>>>>>>>> hadoop-aws-2.7.3-amzn-3.jar:/usr/lib/hadoop/hadoop- >>>>>>>>> datajoin-2.7.3-amzn-3.jar:/usr/lib/hadoop/hadoop- >>>>>>>>> archives.jar:/usr/lib/hadoop/hadoop-aws.jar:/usr/lib/ >>>>>>>>> hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-rumen-2.7.3- >>>>>>>>> amzn-3.jar:/usr/lib/hadoop/hadoop-sls-2.7.3-amzn-3.jar:/ >>>>>>>>> usr/lib/hadoop/hadoop-gridmix-2.7.3-amzn-3.jar:/usr/lib/ >>>>>>>>> hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/jaxb-api-2. >>>>>>>>> 2.2.jar:/usr/lib/hadoop/lib/htrace-core-3.1.0-incubating. >>>>>>>>> jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/ >>>>>>>>> lib/hadoop/lib/httpclient-4.5.3.jar:/usr/lib/hadoop/lib/ >>>>>>>>> httpcore-4.4.4.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4. >>>>>>>>> 1.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0. >>>>>>>>> jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/ >>>>>>>>> lib/activation-1.1.jar:/usr/lib/hadoop/lib/jersey-server- >>>>>>>>> 1.9.jar:/usr/lib/hadoop/lib/commons-beanutils-1.7.0.jar:/ >>>>>>>>> usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/ >>>>>>>>> gson-2.2.4.jar:/usr/lib/hadoop/lib/commons-digester-1. >>>>>>>>> 8.jar:/usr/lib/hadoop/lib/jackson-xc-1.9.13.jar:/usr/ >>>>>>>>> lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/ >>>>>>>>> apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/commons- >>>>>>>>> httpclient-3.1.jar:/usr/lib/hadoop/lib/curator-client-2.7. >>>>>>>>> 1.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/ >>>>>>>>> lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/ >>>>>>>>> commons-net-3.1.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/ >>>>>>>>> usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/ >>>>>>>>> xmlenc-0.52.jar:/usr/lib/hadoop/lib/jersey-json-1.9. >>>>>>>>> jar:/usr/lib/hadoop/lib/avro-1.7.4.jar:/usr/lib/hadoop/lib/ >>>>>>>>> commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/log4j-1.2.17. 
>>>>>>>>> jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/ >>>>>>>>> usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/ >>>>>>>>> jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/netty-3.6.2. >>>>>>>>> Final.jar:/usr/lib/hadoop/lib/mockito-all-1.8.5.jar:/usr/ >>>>>>>>> lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/api-asn1- >>>>>>>>> api-1.0.0-M20.jar:/usr/lib/hadoop/lib/jackson-mapper-asl- >>>>>>>>> 1.9.13.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26-emr.jar: >>>>>>>>> /usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/ >>>>>>>>> jackson-core-asl-1.9.13.jar:/usr/lib/hadoop/lib/commons- >>>>>>>>> cli-1.2.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/ >>>>>>>>> lib/hadoop/lib/jetty-6.1.26-emr.jar:/usr/lib/hadoop/lib/ >>>>>>>>> apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/ >>>>>>>>> commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/junit-4. >>>>>>>>> 11.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar: >>>>>>>>> /usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/ >>>>>>>>> lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/slf4j-api-1.7. >>>>>>>>> 10.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/ >>>>>>>>> usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/ >>>>>>>>> lib/jsr305-3.0.0.jar:/usr/lib/hadoop/lib/zookeeper-3.4.10. >>>>>>>>> jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/ >>>>>>>>> hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/ >>>>>>>>> curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/jackson- >>>>>>>>> jaxrs-1.9.13.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0. 
>>>>>>>>> 4.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/ >>>>>>>>> usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.7.3-amzn-3.jar:/ >>>>>>>>> usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3-amzn-3-tests.jar:/ >>>>>>>>> usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/ >>>>>>>>> hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.7.3- >>>>>>>>> amzn-3.jar:/usr/lib/hadoop-hdfs/lib/htrace-core-3.1.0- >>>>>>>>> incubating.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java- >>>>>>>>> 2.5.0.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8. >>>>>>>>> jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/ >>>>>>>>> lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/ >>>>>>>>> lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/ >>>>>>>>> commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52. >>>>>>>>> jar:/usr/lib/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/ >>>>>>>>> usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/ >>>>>>>>> hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/lib/ >>>>>>>>> netty-3.6.2.Final.jar:/usr/lib/hadoop-hdfs/lib/xml-apis- >>>>>>>>> 1.3.04.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/ >>>>>>>>> hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/ >>>>>>>>> lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/ >>>>>>>>> hadoop-hdfs/lib/jetty-util-6.1.26-emr.jar:/usr/lib/hadoop- >>>>>>>>> hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/lib/hadoop-hdfs/ >>>>>>>>> lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/servlet- >>>>>>>>> api-2.5.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26-emr.jar: >>>>>>>>> /usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/ >>>>>>>>> hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/ >>>>>>>>> lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/commons- >>>>>>>>> logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/jettison-1.1. 
>>>>>>>>> jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/ >>>>>>>>> hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop- >>>>>>>>> mapreduce/htrace-core-3.1.0-incubating.jar:/usr/lib/ >>>>>>>>> hadoop-mapreduce/hadoop-auth-2.7.3-amzn-3.jar:/usr/lib/ >>>>>>>>> hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/ >>>>>>>> hadoop-mapreduce/hadoop- >>>>>>>>> mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/ >>>>>>>>> httpclient-4.5.3.jar:/usr/lib/hadoop-mapreduce/hadoop- >>>>>>>>> mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/ >>>>>>>>> hadoop-mapreduce-client-common-2.7.3-amzn-3.jar:/usr/ >>>>>>>>> lib/hadoop-mapreduce/httpcore-4.4.4.jar:/usr/lib/hadoop- >>>>>>>>> mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/ >>>>>>>>> commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/ >>>>>>>>> hadoop-mapreduce-client-core-2.7.3-amzn-3.jar:/usr/lib/ >>>>>>>>> hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/ >>>>>>>>> jmespath-java-1.11.160.jar:/usr/lib/hadoop-mapreduce/ >>>>>>>>> hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop- >>>>>>>>> mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/ >>>>>>>>> hadoop-mapreduce-client-jobclient-2.7.3-amzn-3-tests. >>>>>>>>> jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/ >>>>>>>>> lib/hadoop-mapreduce/hadoop-streaming-2.7.3-amzn-3.jar:/ >>>>>>>>> usr/lib/hadoop-mapreduce/commons-beanutils-1.7.0.jar:/ >>>>>>>>> usr/lib/hadoop-mapreduce/hadoop-ant-2.7.3-amzn-3.jar:/ >>>>>>>>> usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/ >>>>>>>>> hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/ >>>>>>>>> hadoop-distcp-2.7.3-amzn-3.jar:/usr/lib/hadoop-mapreduce/ >>>>>>>>> hadoop-mapreduce-client-app-2.7.3-amzn-3.jar:/usr/lib/ >>>>>>>>> hadoop-mapreduce/hadoop-mapreduce-examples-2.7.3-amzn- >>>>>>>>> 3.jar:/usr/lib/hadoop-mapreduce/commons-digester-1. 
>>>>>>>> 8.jar:/usr/lib/hadoop- >>>>>>>>> mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3-amzn- >>>>>>>>> 3.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.9.13. >>>>>>>>> jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/ >>>>>>>>> hadoop-mapreduce/apacheds-i18n-2.0.0-M15.jar:/usr/lib/ >>>>>>>>> hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/ >>>>>>>>> hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/ >>>>>>>>> hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop- >>>>>>>>> mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/ >>>>>>>>> commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/commons- >>>>>>>>> net-3.1.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/ >>>>>>>>> usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/ >>>>>>>>> hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/ >>>>>>>>> xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jersey-json- >>>>>>>>> 1.9.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/ >>>>>>>>> lib/hadoop-mapreduce/avro-1.7.4.jar:/usr/lib/hadoop- >>>>>>>>> mapreduce/aws-java-sdk-core-1.11.160.jar:/usr/lib/hadoop- >>>>>>>>> mapreduce/aws-java-sdk-s3-1.11.160.jar:/usr/lib/hadoop- >>>>>>>>> mapreduce/jackson-dataformat-cbor-2.6.6.jar:/usr/lib/ >>>>>>>>> hadoop-mapreduce/hadoop-azure-2.7.3-amzn-3.jar:/usr/lib/ >>>>>>>>> hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/ >>>>>>>>> hadoop-mapreduce/ion-java-1.0.2.jar:/usr/lib/hadoop- >>>>>>>>> mapreduce/azure-storage-2.0.0.jar:/usr/lib/hadoop-mapreduce/ >>>>>>>>> hadoop-mapreduce-client-shuffle-2.7.3-amzn-3.jar:/usr/ >>>>>>>>> lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop- >>>>>>>>> mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop- >>>>>>>>> mapreduce/hadoop-extras-2.7.3-amzn-3.jar:/usr/lib/hadoop- >>>>>>>>> mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/jsch- >>>>>>>>> 0.1.42.jar:/usr/lib/hadoop-mapreduce/joda-time-2.8.1.jar: >>>>>>>>> /usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2. 
>>>>>>>>> 7.3-amzn-3.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2. >>>>>>>>> 2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce- >>>>>>>>> client-hs.jar:/usr/lib/hadoop-mapreduce/netty-3.6.2.Final. >>>>>>>>> jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/ >>>>>>>>> lib/hadoop-mapreduce/hadoop-openstack-2.7.3-amzn-3.jar:/ >>>>>>>>> usr/lib/hadoop-mapreduce/hadoop-mapreduce-client- >>>>>>>>> common.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.7. >>>>>>>>> 3-amzn-3.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/ >>>>>>>>> usr/lib/hadoop-mapreduce/hadoop-mapreduce-client- >>>>>>>>> jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/ >>>>>>>>> usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop- >>>>>>>>> mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop- >>>>>>>>> mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/ >>>>>>>>> hadoop-mapreduce-client-jobclient-2.7.3-amzn-3.jar:/ >>>>>>>>> usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/ >>>>>>>>> hadoop-mapreduce/aws-java-sdk-kms-1.11.160.jar:/usr/lib/ >>>>>>>>> hadoop-mapreduce/jackson-mapper-asl-1.9.13.jar:/usr/ >>>>>>>>> lib/hadoop-mapreduce/hadoop-aws-2.7.3-amzn-3.jar:/usr/lib/ >>>>>>>>> hadoop-mapreduce/jetty-util-6.1.26-emr.jar:/usr/lib/hadoop- >>>>>>>>> mapreduce/hadoop-datajoin-2.7.3-amzn-3.jar:/usr/lib/hadoop- >>>>>>>>> mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop- >>>>>>>>> archives.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1. >>>>>>>>> 9.13.jar:/usr/lib/hadoop-mapreduce/commons-lang3-3.3.2. >>>>>>>>> jar:/usr/lib/hadoop-mapreduce/jackson-core-2.6.6.jar:/usr/ >>>>>>>>> lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop- >>>>>>>>> mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/ >>>>>>>>> jetty-6.1.26-emr.jar:/usr/lib/hadoop-mapreduce/apacheds- >>>>>>>>> kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/ >>>>>>>>> hadoop-aws.jar:/usr/lib/hadoop-mapreduce/hadoop-auth. 
>>>>>>>>> jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.7.3-amzn-3.jar: >>>>>>>>> /usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar: >>>>>>>>> /usr/lib/hadoop-mapreduce/hadoop-sls-2.7.3-amzn-3.jar:/ >>>>>>>>> usr/lib/hadoop-mapreduce/metrics-core-3.0.1.jar:/usr/ >>>>>>>>> lib/hadoop-mapreduce/jackson-annotations-2.6.6.jar:/usr/ >>>>>>>>> lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop- >>>>>> mapreduce/commons- >>>>>>>>> configuration-1.6.jar:/usr/lib/hadoop-mapreduce/api-util- >>>>>>>>> 1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/ >>>>>>>>> usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/ >>>>>>>>> usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/ >>>>>>>>> lib/hadoop-mapreduce/jackson-databind-2.6.6.jar:/usr/lib/ >>>>>>>>> hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/ >>>>>>>>> hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/ >>>>>>>>> zookeeper-3.4.10.jar:/usr/lib/hadoop-mapreduce/commons-lang- >>>>>>>>> 2.6.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar: >>>>>>>>> /usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/ >>>>>>>>> usr/lib/hadoop-mapreduce/jackson-jaxrs-1.9.13.jar:/usr/ >>>>>>>>> lib/hadoop-mapreduce/hadoop-gridmix-2.7.3-amzn-3.jar:/usr/ >>>>>>>>> lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/ >>>>>>>>> lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/ >>>>>>>>> hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/ >>>>>>>>> hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/ >>>>>>>>> hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/ >>>>>>>> lib/hadoop-mapreduce/lib/ >>>>>>>>> snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/ >>>>>>>>> jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/ >>>>>>>>> paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/lib/jersey- >>>>>>>>> guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4. 
>>>>>>>>> jar:/usr/lib/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/lib/ >>>>>>>>> hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop- >>>>>>>>> mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/ >>>>>>>>> hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop- >>>>>>>>> mapreduce/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop- >>>>>>>>> mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/ >>>>>>>>> lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/jackson- >>>>>>>>> mapper-asl-1.9.13.jar:/usr/lib/hadoop-mapreduce/lib/ >>>>>>>>> guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/xz-1. >>>>>>>>> 0.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/ >>>>>>>>> usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/ >>>>>>>>> usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/ >>>>>>>>> hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop- >>>>>>>>> mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/ >>>>>>>>> hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop- >>>>>> yarn-server- >>>>>>>>> sharedcachemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn- >>>>>>>>> server-web-proxy-2.7.3-amzn-3.jar:/usr/lib/hadoop-yarn/ >>>>>>>>> hadoop-yarn-common-2.7.3-amzn-3.jar:/usr/lib/hadoop-yarn/ >>>>>>>>> hadoop-yarn-server-tests-2.7.3-amzn-3.jar:/usr/lib/hadoop- >>>>>>>>> yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/ >>>>>>>>> hadoop-yarn/hadoop-yarn-applications-distributedshell- >>>>>>>>> 2.7.3-amzn-3.jar:/usr/lib/hadoop-yarn/hadoop-yarn- >>>>>>>>> server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop- >>>>> yarn-server- >>>>>>>>> sharedcachemanager-2.7.3-amzn-3.jar:/usr/lib/hadoop-yarn/ >>>>>>>>> hadoop-yarn-server-applicationhistoryservice.jar: >>>>>>>>> /usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/ >>>>>>>>> lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.7.3-amzn- >>>>>>>>> 3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.7.3-amzn- >>>>>>>>> 3.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server- >>>>>>>>> 
[The remainder of the quoted Yarn classpath is abridged here for readability. It continues with the Hadoop Yarn jars and their dependencies under /usr/lib/hadoop-yarn (all versioned 2.7.3-amzn-3, plus lib/ jars such as protobuf-java-2.5.0, guava-11.0.2, jersey-*-1.9, jetty-6.1.26-emr, and zookeeper-3.4.10), /usr/lib/hadoop-lzo/lib/hadoop-lzo-0.4.19.jar, the EMRFS jars under /usr/share/aws/emr/emrfs (notably emrfs-hadoop-assembly-2.18.0.jar), the EMR helper jars (emr-ddb-hadoop.jar, emr-hadoop-goodies.jar, emr-kinesis-hadoop.jar, cloudwatch-sink-1.0.0.jar), and the complete AWS Java SDK under /usr/share/aws/aws-java-sdk, i.e. every aws-java-sdk-*-1.11.160.jar from aws-java-sdk-inspector through aws-java-sdk-iot, including aws-java-sdk-core-1.11.160.jar and aws-java-sdk-s3-1.11.160.jar.]

Thank you

On Thu, Oct 5, 2017 at 5:25 AM, Till Rohrmann <trohrmann@apache.org> wrote:

Hi Andy,

the CliFrontend is not executed via Yarn, thus it is not affected by dependencies which are added due to the underlying Yarn cluster. Therefore, it would be helpful to look at the TaskManager logs.
Either you have enabled log aggregation on your Yarn cluster, in which case you can obtain the logs via `yarn logs -applicationId <appId>`, or you have to retrieve them from the machines where they were running (either by going there directly or via the Yarn web interface).

Cheers,
Till

On Wed, Oct 4, 2017 at 4:27 PM, Andy M. <ajm2444@gmail.com> wrote:

Hi Till,

That is actually the classpath used by the flink bash script (the one that launches the jar using the java command). I changed the execute to an echo and grabbed that for the CLI arguments.

I believe this is the classpath from the log file (although it might not be the taskmanager log; is that any different from what would be in my flink-1.3.2/log folder?):

2017-10-02 20:03:26,450 INFO org.apache.flink.client.CliFrontend - Classpath: /home/hadoop/flink-1.3.2/lib/flink-python_2.11-1.3.2.jar:/home/hadoop/flink-1.3.2/lib/flink-shaded-hadoop2-uber-1.3.2.jar:/home/hadoop/flink-1.3.2/lib/log4j-1.2.17.jar:/home/hadoop/flink-1.3.2/lib/slf4j-log4j12-1.7.7.jar:/home/hadoop/flink-1.3.2/lib/flink-dist_2.11-1.3.2.jar::/etc/hadoop/conf:

If that doesn't seem right and you can point me in the right direction as to where the TaskManager logs would be, I would be happy to grab the information you're looking for.
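[Editor's note: for readers hitting the same issue, the log-aggregation route Till describes can be sketched as follows. This is a minimal sketch, not from the thread; the application id shown is a made-up placeholder, so substitute the id printed by yarn-session.sh or listed by `yarn application -list`.]

```shell
# List Yarn applications to find the Flink session's application id.
yarn application -list -appStates ALL

# Fetch the aggregated container logs (JobManager and all TaskManagers).
# "application_1507000000000_0001" is a hypothetical placeholder id.
yarn logs -applicationId application_1507000000000_0001 > containers.log

# The TaskManager stack traces, if any, appear in the combined output.
grep -n "NoSuchMethodError" containers.log
```

This only works when yarn.log-aggregation-enable is set on the cluster; otherwise the per-container logs stay on the NodeManager hosts, as Till notes.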
Thank you

On Wed, Oct 4, 2017 at 3:27 AM, Till Rohrmann <trohrmann@apache.org> wrote:

Hi Andy,

this looks to me indeed like a dependency problem. I assume that EMR or something else is pulling in an incompatible version of Hadoop.

The classpath you've posted, is this the one logged in the log files (TaskManager log) or did you compile it yourself? In the latter case, it would also be helpful to get access to the TaskManager logs.

Cheers,
Till

On Mon, Oct 2, 2017 at 10:20 PM, Andy M. <ajm2444@gmail.com> wrote:

Hi Fabian,

1) I have looked at the linked docs, and from what I can tell no setup should really be needed to get Flink working (other than downloading the correct binaries, which I believe I did).
2) I have downloaded the Flink 1.3.2 binaries (flink-1.3.2-bin-hadoop27-scala_2.11.tgz). This is for Hadoop 2.7.x, which matches EMR 5.8.0.

I appreciate any help or guidance you can provide in fixing my problems; please let me know if there is anything else I can provide.
Thank you

On Mon, Oct 2, 2017 at 4:12 PM, Fabian Hueske <fhueske@gmail.com> wrote:

Hi Andy,

I'm not an AWS expert, so I'll just check on some common issues.

I guess you already had a look at the Flink docs for AWS/EMR, but I'll post the link just to be sure [1].

Since you are using Flink 1.3.2 (EMR 5.8.0 comes with Flink 1.3.1), did you build Flink yourself or did you download the binaries? Does the Hadoop version of the Flink build match the Hadoop version of EMR 5.8.0, i.e., Hadoop 2.7.x?

Best, Fabian

[1] https://ci.apache.org/projects/flink/flink-docs-release-1.3/setup/aws.html

2017-10-02 21:51 GMT+02:00 Andy M. <ajm2444@gmail.com>:

Hi Fabian,

Sorry, I just realized I forgot to include that part. The error returned is:

java.lang.NoSuchMethodError: org.apache.hadoop.conf.Configuration.addResource(Lorg/apache/hadoop/conf/Configuration;)V
        at com.amazon.ws.emr.hadoop.fs.EmrFileSystem.initialize(EmrFileSystem.java:93)
        at org.apache.flink.runtime.fs.hdfs.HadoopFileSystem.initialize(HadoopFileSystem.java:328)
        at org.apache.flink.core.fs.FileSystem.getUnguardedFileSystem(FileSystem.java:350)
        at org.apache.flink.core.fs.FileSystem.get(FileSystem.java:389)
        at org.apache.flink.core.fs.Path.getFileSystem(Path.java:293)
        at org.apache.flink.runtime.state.filesystem.FsCheckpointStreamFactory.<init>(FsCheckpointStreamFactory.java:99)
        at org.apache.flink.runtime.state.filesystem.FsStateBackend.createStreamFactory(FsStateBackend.java:282)
        at org.apache.flink.contrib.streaming.state.RocksDBStateBackend.createStreamFactory(RocksDBStateBackend.java:273)

I believe it has something to do with the classpath, but I am unsure why or how to fix it. The classpath being used during the execution is:

/home/hadoop/flink-1.3.2/lib/flink-python_2.11-1.3.2.jar:/home/hadoop/flink-1.3.2/lib/flink-shaded-hadoop2-uber-1.3.2.jar:/home/hadoop/flink-1.3.2/lib/log4j-1.2.17.jar:/home/hadoop/flink-1.3.2/lib/slf4j-log4j12-1.7.7.jar:/home/hadoop/flink-1.3.2/lib/flink-dist_2.11-1.3.2.jar::/etc/hadoop/conf:

I decompiled flink-shaded-hadoop2-uber-1.3.2.jar and the addResource function does seem to be there.
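[Editor's note: a NoSuchMethodError like the one above usually means a different copy of the class was loaded than the one the caller was compiled against. A sketch of how one might inspect the competing jars directly, assuming the jar paths quoted in this thread (adjust them to your own layout):]

```shell
# Does the Flink uber jar ship a Hadoop Configuration class with the
# addResource(Configuration) overload that EmrFileSystem expects?
javap -classpath /home/hadoop/flink-1.3.2/lib/flink-shaded-hadoop2-uber-1.3.2.jar \
      org.apache.hadoop.conf.Configuration | grep addResource

# Which other jars on the machine also contain that class? Whichever copy
# is first on the classpath wins, which is how version conflicts surface.
for jar in /usr/lib/hadoop/*.jar /usr/share/aws/emr/emrfs/lib/*.jar; do
  if unzip -l "$jar" 2>/dev/null | grep -q 'org/apache/hadoop/conf/Configuration.class'; then
    echo "$jar"
  fi
done
```

If a second, older Configuration class turns up ahead of Flink's shaded one, that would explain the missing-method error even though decompiling the uber jar shows the method present.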
Thank you

On Mon, Oct 2, 2017 at 3:43 PM, Fabian Hueske <fhueske@gmail.com> wrote:

Hi Andy,

can you describe in more detail what exactly isn't working? Do you see error messages in the log files or on the console?

Thanks, Fabian

2017-10-02 15:52 GMT+02:00 Andy M. <ajm2444@gmail.com>:

Hello,

I am about to deploy my first Flink projects to production, but I am running into a very big hurdle: I am unable to launch my project so that it can write to an S3 bucket. My project is running on an EMR cluster where I have installed Flink 1.3.2. I am using Yarn to launch the application, and it seems to run fine unless I try to enable checkpointing (with an S3 target). I am looking to use RocksDB as my checkpointing backend. I have asked in a few places and am still unable to find a solution to this problem. Here are my steps for creating a cluster and launching my application; perhaps I am missing a step. I'd be happy to provide any additional information if needed.
AWS Portal:

1) EMR -> Create Cluster
2) Advanced Options
3) Release = emr-5.8.0
4) Only select Hadoop 2.7.3
5) Next -> Next -> Next -> Create Cluster (I do fill out names/keys/etc.)

Once the cluster is up, I ssh into the Master and do the following:

1  wget http://apache.claz.org/flink/flink-1.3.2/flink-1.3.2-bin-hadoop27-scala_2.11.tgz
2  tar -xzf flink-1.3.2-bin-hadoop27-scala_2.11.tgz
3  cd flink-1.3.2
4  ./bin/yarn-session.sh -n 2 -tm 5120 -s 4 -d
5  Change conf/flink-conf.yaml
6  ./bin/flink run -m yarn-cluster -yn 1 ~/flink-consumer.jar

In my conf/flink-conf.yaml I add the following fields:

state.backend: rocksdb
state.backend.fs.checkpointdir: s3:/bucket/location
state.checkpoints.dir: s3:/bucket/location

My program's checkpointing setup:

env.enableCheckpointing(getCheckpointRate, CheckpointingMode.EXACTLY_ONCE)
env.getCheckpointConfig.enableExternalizedCheckpoints(ExternalizedCheckpointCleanup.RETAIN_ON_CANCELLATION)
env.getCheckpointConfig.setMinPauseBetweenCheckpoints(getCheckpointMinPause)
env.getCheckpointConfig.setCheckpointTimeout(getCheckpointTimeout)
env.getCheckpointConfig.setMaxConcurrentCheckpoints(1)
env.setStateBackend(new RocksDBStateBackend("s3://bucket/location", true))
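[Editor's note: one detail in the quoted config worth flagging: the yaml uses `s3:/bucket/location` with a single slash, while the RocksDBStateBackend constructor uses `s3://bucket/location`. Flink's AWS docs write S3 URIs with the two-slash scheme. A variant of the quoted fragment under that assumption; `bucket/location` is the thread's placeholder, not a real path:]

```
state.backend: rocksdb
# s3:// with two slashes, consistent with the RocksDBStateBackend URI above
state.backend.fs.checkpointdir: s3://bucket/location
state.checkpoints.dir: s3://bucket/location
```

Whether this matters here is unclear; the stack trace in the thread points at a classpath conflict in EmrFileSystem.initialize rather than at URI parsing, so this is a secondary thing to check, not the diagnosed cause.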