From commits-return-13987-archive-asf-public=cust-asf.ponee.io@hudi.apache.org Sun Mar 22 17:24:04 2020
Date: Sun, 22 Mar 2020 17:24:00 +0000 (UTC)
From: "Vinoth Chandar (Jira)"
To: commits@hudi.apache.org
Reply-To: dev@hudi.apache.org
Subject: [jira] [Updated] (HUDI-719) Exception during clean phase: Found org.apache.hudi.avro.model.HoodieCleanMetadata, expecting org.apache.hudi.avro.model.HoodieCleanerPlan

     [ https://issues.apache.org/jira/browse/HUDI-719?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Vinoth Chandar updated HUDI-719:
--------------------------------
    Description:
Dataset is written using 0.5; moving to the latest master:

{code:java}
Exception in thread "main" org.apache.avro.AvroTypeException: Found org.apache.hudi.avro.model.HoodieCleanMetadata, expecting org.apache.hudi.avro.model.HoodieCleanerPlan, missing required field policy
    at org.apache.avro.io.ResolvingDecoder.doAction(ResolvingDecoder.java:292)
    at org.apache.avro.io.parsing.Parser.advance(Parser.java:88)
    at org.apache.avro.io.ResolvingDecoder.readFieldOrder(ResolvingDecoder.java:130)
    at org.apache.avro.generic.GenericDatumReader.readRecord(GenericDatumReader.java:215)
    at org.apache.avro.generic.GenericDatumReader.readWithoutConversion(GenericDatumReader.java:175)
    at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:153)
    at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:145)
    at org.apache.avro.file.DataFileStream.next(DataFileStream.java:233)
    at org.apache.avro.file.DataFileStream.next(DataFileStream.java:220)
    at org.apache.hudi.common.util.AvroUtils.deserializeAvroMetadata(AvroUtils.java:149)
    at org.apache.hudi.common.util.CleanerUtils.getCleanerPlan(CleanerUtils.java:87)
    at org.apache.hudi.client.HoodieCleanClient.runClean(HoodieCleanClient.java:141)
    at org.apache.hudi.client.HoodieCleanClient.lambda$clean$0(HoodieCleanClient.java:88)
    at java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1382)
    at java.util.stream.ReferencePipeline$Head.forEach(ReferencePipeline.java:580)
    at org.apache.hudi.client.HoodieCleanClient.clean(HoodieCleanClient.java:86)
    at org.apache.hudi.client.HoodieWriteClient.clean(HoodieWriteClient.java:843)
    at org.apache.hudi.client.HoodieWriteClient.postCommit(HoodieWriteClient.java:520)
    at org.apache.hudi.client.AbstractHoodieWriteClient.commit(AbstractHoodieWriteClient.java:168)
    at org.apache.hudi.client.AbstractHoodieWriteClient.commit(AbstractHoodieWriteClient.java:111)
    at org.apache.hudi.utilities.deltastreamer.DeltaSync.writeToSink(DeltaSync.java:397)
    at org.apache.hudi.utilities.deltastreamer.DeltaSync.syncOnce(DeltaSync.java:237)
    at org.apache.hudi.utilities.deltastreamer.HoodieDeltaStreamer.sync(HoodieDeltaStreamer.java:121)
    at org.apache.hudi.utilities.deltastreamer.HoodieDeltaStreamer.main(HoodieDeltaStreamer.java:294)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
    at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:845)
    at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:161)
    at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:184)
    at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
    at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:920)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:929)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
{code}

  was:
Dataset is written using 0.5; moving to the latest master:

Exception in thread "main" org.apache.avro.AvroTypeException: Found org.apache.hudi.avro.model.HoodieCleanMetadata, expecting org.apache.hudi.avro.model.HoodieCleanerPlan, missing required field policy
    at org.apache.avro.io.ResolvingDecoder.doAction(ResolvingDecoder.java:292)
    at org.apache.avro.io.parsing.Parser.advance(Parser.java:88)
    at org.apache.avro.io.ResolvingDecoder.readFieldOrder(ResolvingDecoder.java:130)
    at org.apache.avro.generic.GenericDatumReader.readRecord(GenericDatumReader.java:215)
    at org.apache.avro.generic.GenericDatumReader.readWithoutConversion(GenericDatumReader.java:175)
    at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:153)
    at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:145)
    at org.apache.avro.file.DataFileStream.next(DataFileStream.java:233)
    at org.apache.avro.file.DataFileStream.next(DataFileStream.java:220)
    at org.apache.hudi.common.util.AvroUtils.deserializeAvroMetadata(AvroUtils.java:149)
    at org.apache.hudi.common.util.CleanerUtils.getCleanerPlan(CleanerUtils.java:87)
    at org.apache.hudi.client.HoodieCleanClient.runClean(HoodieCleanClient.java:141)
    at org.apache.hudi.client.HoodieCleanClient.lambda$clean$0(HoodieCleanClient.java:88)
    at java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1382)
    at java.util.stream.ReferencePipeline$Head.forEach(ReferencePipeline.java:580)
    at org.apache.hudi.client.HoodieCleanClient.clean(HoodieCleanClient.java:86)
    at org.apache.hudi.client.HoodieWriteClient.clean(HoodieWriteClient.java:843)
    at org.apache.hudi.client.HoodieWriteClient.postCommit(HoodieWriteClient.java:520)
    at org.apache.hudi.client.AbstractHoodieWriteClient.commit(AbstractHoodieWriteClient.java:168)
    at org.apache.hudi.client.AbstractHoodieWriteClient.commit(AbstractHoodieWriteClient.java:111)
    at org.apache.hudi.utilities.deltastreamer.DeltaSync.writeToSink(DeltaSync.java:397)
    at org.apache.hudi.utilities.deltastreamer.DeltaSync.syncOnce(DeltaSync.java:237)
    at org.apache.hudi.utilities.deltastreamer.HoodieDeltaStreamer.sync(HoodieDeltaStreamer.java:121)
    at org.apache.hudi.utilities.deltastreamer.HoodieDeltaStreamer.main(HoodieDeltaStreamer.java:294)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
    at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:845)
    at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:161)
    at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:184)
    at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
    at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:920)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:929)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)


> Exception during clean phase: Found org.apache.hudi.avro.model.HoodieCleanMetadata, expecting org.apache.hudi.avro.model.HoodieCleanerPlan
> ------------------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: HUDI-719
>                 URL: https://issues.apache.org/jira/browse/HUDI-719
>             Project: Apache Hudi (incubating)
>          Issue Type: Bug
>          Components: DeltaStreamer
>            Reporter: Alexander Filipchik
>            Priority: Major
>             Fix For: 0.6.0
>
>
> Dataset is written using 0.5; moving to the latest master:
> {code:java}
> Exception in thread "main" org.apache.avro.AvroTypeException: Found org.apache.hudi.avro.model.HoodieCleanMetadata, expecting org.apache.hudi.avro.model.HoodieCleanerPlan, missing required field policy
> at org.apache.avro.io.ResolvingDecoder.doAction(ResolvingDecoder.java:292)
> at org.apache.avro.io.parsing.Parser.advance(Parser.java:88)
> at org.apache.avro.io.ResolvingDecoder.readFieldOrder(ResolvingDecoder.java:130)
> at org.apache.avro.generic.GenericDatumReader.readRecord(GenericDatumReader.java:215)
> at org.apache.avro.generic.GenericDatumReader.readWithoutConversion(GenericDatumReader.java:175)
> at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:153)
> at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:145)
> at org.apache.avro.file.DataFileStream.next(DataFileStream.java:233)
> at org.apache.avro.file.DataFileStream.next(DataFileStream.java:220)
> at org.apache.hudi.common.util.AvroUtils.deserializeAvroMetadata(AvroUtils.java:149)
> at org.apache.hudi.common.util.CleanerUtils.getCleanerPlan(CleanerUtils.java:87)
> at org.apache.hudi.client.HoodieCleanClient.runClean(HoodieCleanClient.java:141)
> at org.apache.hudi.client.HoodieCleanClient.lambda$clean$0(HoodieCleanClient.java:88)
> at java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1382)
> at java.util.stream.ReferencePipeline$Head.forEach(ReferencePipeline.java:580)
> at org.apache.hudi.client.HoodieCleanClient.clean(HoodieCleanClient.java:86)
> at org.apache.hudi.client.HoodieWriteClient.clean(HoodieWriteClient.java:843)
> at org.apache.hudi.client.HoodieWriteClient.postCommit(HoodieWriteClient.java:520)
> at org.apache.hudi.client.AbstractHoodieWriteClient.commit(AbstractHoodieWriteClient.java:168)
> at org.apache.hudi.client.AbstractHoodieWriteClient.commit(AbstractHoodieWriteClient.java:111)
> at org.apache.hudi.utilities.deltastreamer.DeltaSync.writeToSink(DeltaSync.java:397)
> at org.apache.hudi.utilities.deltastreamer.DeltaSync.syncOnce(DeltaSync.java:237)
> at org.apache.hudi.utilities.deltastreamer.HoodieDeltaStreamer.sync(HoodieDeltaStreamer.java:121)
> at org.apache.hudi.utilities.deltastreamer.HoodieDeltaStreamer.main(HoodieDeltaStreamer.java:294)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
> at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:845)
> at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:161)
> at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:184)
> at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
> at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:920)
> at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:929)
> at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala){code}

--
This message was sent by Atlassian Jira
(v8.3.4#803005)
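The trace is a plain Avro schema-resolution failure: the pending-clean file on disk was serialized by 0.5.x with the HoodieCleanMetadata schema, while the upgraded reader in CleanerUtils.getCleanerPlan deserializes with the HoodieCleanerPlan schema, whose required "policy" field the old record lacks. The general shape of a backward-compatible reader is try-the-new-schema-first, then fall back and upgrade the legacy record. The sketch below illustrates only that pattern; the class name, method names, and the KEEP_LATEST_COMMITS default are hypothetical, not the actual Hudi fix or API.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of a version-tolerant metadata reader. Records are
// modeled as plain maps instead of Avro specific records to keep the
// example self-contained.
class CleanerPlanFallback {

    // "New schema" decode: a cleaner plan must carry a clean policy.
    // Mirrors Avro's behavior of rejecting a record that is missing a
    // required field.
    static Map<String, String> decodeAsCleanerPlan(Map<String, String> record) {
        if (!record.containsKey("policy")) {
            throw new IllegalArgumentException("missing required field policy");
        }
        return record;
    }

    // Try the new format first; on failure, treat the bytes as a legacy
    // (0.5.x-era) record and upgrade it in memory with an assumed default.
    static Map<String, String> decodeWithFallback(Map<String, String> record) {
        try {
            return decodeAsCleanerPlan(record);
        } catch (IllegalArgumentException e) {
            Map<String, String> upgraded = new HashMap<>(record);
            upgraded.put("policy", "KEEP_LATEST_COMMITS"); // assumed default
            return upgraded;
        }
    }

    public static void main(String[] args) {
        // A legacy record written before "policy" existed.
        Map<String, String> legacy = new HashMap<>();
        legacy.put("startCleanTime", "20200322");

        Map<String, String> plan = decodeWithFallback(legacy);
        System.out.println(plan.get("policy")); // prints "KEEP_LATEST_COMMITS"
    }
}
```

With real Avro files, the equivalent move is passing the writer schema (read from the file header by DataFileStream) together with the new reader schema to GenericDatumReader, or versioning the metadata so old records are detected and upgraded before the strict decode runs.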