Date: Fri, 1 Apr 2016 18:54:31 +0200
Subject: Re: Kafka Test Error
From: Stephan Ewen <sewen@apache.org>
To: user@flink.apache.org
Cc: balaji.rajagopalan@olacabs.com

The issue may be that you include Kafka twice:

1) You explicitly add "org.apache.kafka:kafka-clients:0.9.0.0"
2) You add "org.apache.flink:flink-connector-kafka-0.9_2.10:1.0.0", which internally adds "org.apache.kafka:kafka-clients:0.9.0.1"

These two Kafka versions may conflict. I would drop dependency (1) and simply let the FlinkKafkaConsumer pull whatever dependency it needs by itself. The 0.9.0.1 client that Flink uses internally should read fine from Kafka 0.9.0.0 brokers.

Greetings,
Stephan

On Fri, Apr 1, 2016 at 5:19 PM, Zhun Shen <shenzhunallen@gmail.com> wrote:

> Yeah, I mean I read the demo with FlinkKafkaConsumer08
> (http://data-artisans.com/kafka-flink-a-practical-how-to/), then I wrote
> the program based on Kafka 0.9.0.0 and Flink 1.0.0.
>
> On Apr 1, 2016, at 7:27 PM, Balaji Rajagopalan <balaji.rajagopalan@olacabs.com> wrote:
>
> Did you make sure the Flink connector version and the Flink version are
> the same? Also, for 0.8.0.0 you will have to use FlinkKafkaConsumer08.
>
> On Fri, Apr 1, 2016 at 3:21 PM, Zhun Shen <shenzhunallen@gmail.com> wrote:
>
>> I follow the example for Kafka 0.8.0.0 in the Flink docs.
>>
>>     public static void main(String[] args) throws Exception {
>>         StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
>>         Properties properties = new Properties();
>>         properties.setProperty("bootstrap.servers", "localhost:9092");
>>         properties.setProperty("zookeeper.connect", "localhost:2181");
>>         properties.setProperty("group.id", "test");
>>         properties.setProperty("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
>>         properties.setProperty("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
>>         properties.setProperty("partition.assignment.strategy", "range");
>>
>>         DataStream<String> messageStream = env
>>                 .addSource(new FlinkKafkaConsumer09<String>("nginx-logs", new SimpleStringSchema(), properties));
>>
>>         messageStream
>>                 .rebalance()
>>                 .map(new MapFunction<String, String>() {
>>
>>                     @Override
>>                     public String map(String value) throws Exception {
>>                         return "Kafka and Flink says: " + value;
>>                     }
>>                 }).print();
>>
>>         env.execute();
>>     }
>>
>> I always get the error below:
>>
>> java.lang.NoSuchMethodError: org.apache.kafka.clients.consumer.KafkaConsumer.partitionsFor(Ljava/lang/String;)Ljava/util/List;
>>     at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer09.<init>(FlinkKafkaConsumer09.java:194)
>>     at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer09.<init>(FlinkKafkaConsumer09.java:164)
>>     at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer09.<init>(FlinkKafkaConsumer09.java:131)
>>
>> On Apr 1, 2016, at 1:40 PM, Ashutosh Kumar <kmr.ashutosh16@gmail.com> wrote:
>>
>> I am using Flink 1.0.0 with Kafka 0.9. It works fine for me. I use the
>> following dependency.
>>
>>     <dependency>
>>         <groupId>org.apache.flink</groupId>
>>         <artifactId>flink-connector-kafka-0.9_2.10</artifactId>
>>         <version>1.0.0</version>
>>         <scope>provided</scope>
>>     </dependency>
>>
>> Thanks
>> Ashutosh
>>
>> On Fri, Apr 1, 2016 at 10:46 AM, Zhun Shen <shenzhunallen@gmail.com> wrote:
>>
>>> Hi there,
>>>
>>> I checked my build.gradle file; I use
>>> 'org.apache.flink:flink-connector-kafka-0.9_2.10:1.0.0', but I found that
>>> this lib is based on kafka-clients 0.9.0.1.
>>>
>>> I want to use Flink streaming to consume Kafka's events in real time, but
>>> I'm confused by Flink's libs with different versions. Which
>>> flink-connector-kafka is compatible with Kafka 0.9.0.0?
>>> My environment is Kafka: 0.9.0.0, Flink: 1.0.0, Language: Java
>>>
>>> Part of my build.gradle:
>>> 'org.apache.kafka:kafka_2.10:0.9.0.0',
>>> 'org.apache.kafka:kafka-clients:0.9.0.0',
>>> 'org.apache.flink:flink-java:1.0.0',
>>> 'org.apache.flink:flink-streaming-java_2.10:1.0.0',
>>> 'org.apache.flink:flink-connector-kafka-0.9_2.10:1.0.0',
>>> 'org.apache.flink:flink-connector-kafka-base_2.10:1.0.0'
>>>
>>> Any advice?
>>>
>>> Thanks.
>>>
>>> On Mar 30, 2016, at 10:35 PM, Stephan Ewen <sewen@apache.org> wrote:
>>>
>>> Hi!
>>>
>>> A "NoSuchMethodError" usually means that you compile and run against
>>> different versions.
>>>
>>> Make sure the version you reference in the IDE and the version on the
>>> cluster are the same.
>>>
>>> Greetings,
>>> Stephan
>>>
>>> On Wed, Mar 30, 2016 at 9:42 AM, Balaji Rajagopalan <balaji.rajagopalan@olacabs.com> wrote:
>>>
>>>> I have tested Kafka 0.8.0.2 with Flink 1.0.0 and it works for me. Can't
>>>> talk about Kafka 0.9.0.1.
>>>>
>>>> On Wed, Mar 30, 2016 at 12:51 PM, Zhun Shen <shenzhunallen@gmail.com> wrote:
>>>>
>>>>> Hi there,
>>>>>
>>>>> Flink version: 1.0.0
>>>>> Kafka version: 0.9.0.0
>>>>> env: local
>>>>>
>>>>> I run the script below:
>>>>> ./bin/flink run -c com.test.flink.FlinkTest test.jar --topic
>>>>> nginx-logs --bootstrap.servers localhost:9092 --zookeeper.connect
>>>>> localhost:2181 --group.id myGroup --partition.assignment.strategy
>>>>> round robin
>>>>>
>>>>> But I got the error:
>>>>> java.lang.NoSuchMethodError: org.apache.kafka.clients.consumer.KafkaConsumer.partitionsFor(Ljava/lang/String;)Ljava/util/List;
>>>>>     at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer09.<init>(FlinkKafkaConsumer09.java:194)
>>>>>     at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer09.<init>(FlinkKafkaConsumer09.java:164)
>>>>>     at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer09.<init>(FlinkKafkaConsumer09.java:131)
>>>>>
>>>>> The code is as below:
>>>>>     DataStream<String> messageStream = env.addSource(new
>>>>>             FlinkKafkaConsumer09<>("nginx-logs", new
>>>>>             SimpleStringSchema(), parameterTool.getProperties()));
>>>>>     messageStream.rebalance().map(new MapFunction<String, String>() {
>>>>>
>>>>>         @Override
>>>>>         public String map(String value) throws Exception {
>>>>>             return "Kafka and Flink says: " + value;
>>>>>         }
>>>>>     }).print();
>>>>>
>>>>> I checked the error with Google, and it shows that it is a method of
>>>>> kafka-clients 0.9.0.1. Any idea? Thanks.
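[Editor's note] Stephan's suggested fix — drop the explicit kafka-clients entry and let the connector supply its own client — would look roughly like this in the build.gradle quoted above. This is a sketch under the thread's stated versions, not a verified build file:

```
dependencies {
    // Do NOT declare org.apache.kafka:kafka-clients explicitly;
    // flink-connector-kafka-0.9 pulls in the kafka-clients version
    // it was built against (0.9.0.1, per the thread).
    compile 'org.apache.flink:flink-java:1.0.0'
    compile 'org.apache.flink:flink-streaming-java_2.10:1.0.0'
    compile 'org.apache.flink:flink-connector-kafka-0.9_2.10:1.0.0'
}
```

After this change, `gradle dependencies` should show only a single kafka-clients version in the runtime configuration.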
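[Editor's note] A `NoSuchMethodError` like the one in this thread usually means the class was loaded from a different jar than the one compiled against. A generic, minimal sketch for checking which jar a class actually comes from at runtime (the Kafka class name here is only the example from the thread; pass whichever class is failing):

```java
import java.security.CodeSource;

public class WhichJar {

    // Report where a class was loaded from, or note that it is missing.
    static String describe(String className) {
        try {
            Class<?> c = Class.forName(className);
            CodeSource src = c.getProtectionDomain().getCodeSource();
            // Classes loaded by the bootstrap loader (JDK core) have no code source.
            return className + " loaded from "
                    + (src != null ? src.getLocation() : "the bootstrap classpath");
        } catch (ClassNotFoundException e) {
            return className + " is not on the classpath";
        }
    }

    public static void main(String[] args) {
        String name = args.length > 0
                ? args[0]
                : "org.apache.kafka.clients.consumer.KafkaConsumer";
        System.out.println(describe(name));
    }
}
```

Running it inside the submitted job (or with the same classpath as `./bin/flink run`) shows whether the 0.9.0.0 or 0.9.0.1 kafka-clients jar wins the conflict.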