reef-dev mailing list archives

From Saikat Kanjilal <sxk1...@gmail.com>
Subject Re: reef-runtime-spark--moving to Java1.8 [Discuss]
Date Tue, 17 Oct 2017 00:12:21 GMT
Ok, time to tackle the next issue I see when I launch reef-runtime-spark.
It seems we need to incorporate jackson-core and jackson-annotations
https://mvnrepository.com/artifact/com.fasterxml.jackson.core/jackson-core
into either the reef-examples module or the reef-runtime-spark module.
However, there is a complication: when I look at the main REEF pom file I
see a dependency on a really old Jackson, shown below:

<dependency>
    <groupId>org.codehaus.jackson</groupId>
    <artifactId>jackson-mapper-asl</artifactId>
    <version>${jackson.version}</version>
</dependency>
<dependency>
    <groupId>org.codehaus.jackson</groupId>
    <artifactId>jackson-core-asl</artifactId>
    <version>${jackson.version}</version>
</dependency>


For the dependencies above, ${jackson.version} in the main pom resolves to 1.9.13.
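
One option (just a sketch on my part -- it assumes we align on whatever
Jackson 2.x line our target Spark version ships with, which I believe is
2.6.x for Spark 2.1) would be to add the FasterXML artifacts alongside the
old org.codehaus ones:

<dependency>
    <groupId>com.fasterxml.jackson.core</groupId>
    <artifactId>jackson-core</artifactId>
    <version>2.6.5</version>
</dependency>
<dependency>
    <groupId>com.fasterxml.jackson.core</groupId>
    <artifactId>jackson-annotations</artifactId>
    <version>2.6.5</version>
</dependency>

The two group IDs live in different Java packages (org.codehaus.jackson vs
com.fasterxml.jackson), so they can coexist on the classpath without
clashing.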



How should we proceed here? I have a feeling that moving to the newer
version of Jackson might introduce other problems in the codebase, but I
could be wrong.


Some guidance/suggestions would be helpful.


Here is the Spark exception I am seeing:

Exception in thread "main" java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
        at org.apache.spark.metrics.MetricsSystem$$anonfun$registerSinks$1.apply(MetricsSystem.scala:200)
        at org.apache.spark.metrics.MetricsSystem$$anonfun$registerSinks$1.apply(MetricsSystem.scala:194)
        at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
        at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
        at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:226)
        at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:39)
        at scala.collection.mutable.HashMap.foreach(HashMap.scala:98)
        at org.apache.spark.metrics.MetricsSystem.registerSinks(MetricsSystem.scala:194)
        at org.apache.spark.metrics.MetricsSystem.start(MetricsSystem.scala:102)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:522)
        at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2313)
        at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:868)
        at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:860)
        at scala.Option.getOrElse(Option.scala:120)
        at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:860)
        at org.apache.reef.runtime.spark.job.SparkRunner.run(SparkRunner.java:55)
        at org.apache.reef.examples.data.loading.DataLoadingREEFOnSpark.main(DataLoadingREEFOnSpark.java:124)
Caused by: java.lang.NoSuchMethodError: com.fasterxml.jackson.core.JsonFactory.requiresPropertyOrdering()Z
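
As far as I can tell, that NoSuchMethodError usually means an older
com.fasterxml jackson-core ended up on the runtime classpath:
JsonFactory.requiresPropertyOrdering() was added in jackson-core 2.3 (if I
remember correctly), and Spark's metrics code expects at least that. To
find out which module drags in the stale jar, something like this should
work, run from the module that assembles the job's classpath:

mvn dependency:tree -Dincludes=com.fasterxml.jackson.core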




Thanks in advance


On Mon, Oct 16, 2017 at 2:21 PM, Sergiy Matusevych <sergiy.matusevych@gmail.com> wrote:

> I agree that we should focus on Spark 2+. I don't know about Spark 2.1,
> though -- are there any particular features in 2.1 that we need for
> integration?
>
> -- Sergiy.
>
> On Thu, Oct 12, 2017 at 8:34 AM, Saikat Kanjilal <sxk1969@gmail.com> wrote:
>
> > One other point comes to mind in this discussion: I would propose we
> > start the reef-runtime-spark component off by supporting Spark 2.1 and
> > beyond. Any objections to this? Additionally, my vision is to have this
> > component work on the local runtime, as well as the EMR and HDInsight
> > runtimes using YARN.
> >
> > Sent from my iPhone
> >
> > > On Oct 10, 2017, at 5:32 PM, Saikat Kanjilal <sxk1969@gmail.com> wrote:
> > >
> > > Hey Sergiy,
> > > The features in the Spark API become easier to use if reef-runtime-spark
> > > moves to Java 1.8, but this is not strictly required at the moment. Case
> > > in point: when using a flatMap operation with Java 1.8, lambdas help from
> > > a code-maintenance and readability perspective. To work around this, I
> > > implemented anonymous inner classes instead, using FlatMapFunction, which
> > > compiles and works fine as a substitute (a sketch of both styles is at
> > > the end of this message).
> > >
> > > It's good to know that REEF works on 1.8; however, the Maven compilation
> > > flags point to compiling for 1.7, which prevents the use of lambdas (see
> > > the pom sketch at the end of this message as well).
> > >
> > > Regards
> > >
> > > Sent from my iPhone
> > >
> > >> On Oct 10, 2017, at 5:22 PM, Sergiy Matusevych <sergiy.matusevych@gmail.com> wrote:
> > >>
> > >> Hi Saikat,
> > >>
> > >> REEF can run on JVM 1.8 just fine - we just don't use any Java 1.8
> > >> language features at the moment. (Note that we don't prevent REEF
> > >> applications from using Java 8!)
> > >>
> > >> Are there any features in the Spark API that might require
> > >> reef-runtime-spark to use Java 8? If not, I would suggest sticking to
> > >> Java 7 for the time being and properly redesigning the REEF API to take
> > >> full advantage of Java 8 (say, using lambdas instead of EventHandler
> > >> classes) later. I really don't want to mix our Spark integration effort
> > >> with any other sub-projects.
> > >>
> > >> What do you guys think?
> > >>
> > >> Cheers,
> > >> Sergiy.
> > >>
> > >>
> > >>> On Tue, Oct 10, 2017 at 3:25 PM, Saikat Kanjilal <sxk1969@gmail.com> wrote:
> > >>>
> > >>> Hello REEF community,
> > >>> I'm doing some research on how to test reef-runtime-spark end to end,
> > >>> and I wanted to bring up a few things:
> > >>>
> > >>> 1) It seems that the Spark community is going to leave Java 1.7 behind
> > >>> and has already done so in most versions of Spark (post 2.0.0, and in
> > >>> some cases even earlier versions are compiling with Java 8) --> see
> > >>> here: https://spark.apache.org/docs/2.1.0/
> > >>> 2) As I mentioned before, if we really want to take advantage of some
> > >>> of the paradigms in Spark, moving to Java 1.8 makes that path much
> > >>> easier.
> > >>> 3) Do we currently have clients who are using Java 1.8?
> > >>>
> > >>> I would love to spearhead an effort to do this port sooner rather than
> > >>> later, maybe using the development of the reef-runtime-spark component
> > >>> as a springboard.
> > >>>
> > >>> What are your thoughts one way or another? I would love a discussion
> > >>> on this.
> > >>>
> > >>> Thanks in advance.
> > >>>
> >
>
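
P.S. To make the flatMap point above concrete, here is a minimal sketch of
the two styles (class and method names are made up for illustration; it
targets the Spark 2.x Java API, where FlatMapFunction.call returns an
Iterator):

import java.util.Arrays;
import java.util.Iterator;

import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.function.FlatMapFunction;

public final class FlatMapStyles {

    // Java 7 style: anonymous inner class implementing FlatMapFunction.
    // Compiles with -source/-target 1.7.
    static JavaRDD<String> wordsJava7(final JavaRDD<String> lines) {
        return lines.flatMap(new FlatMapFunction<String, String>() {
            @Override
            public Iterator<String> call(final String line) {
                return Arrays.asList(line.split(" ")).iterator();
            }
        });
    }

    // Java 8 style: the same operation as a lambda.
    // Requires -source/-target 1.8.
    static JavaRDD<String> wordsJava8(final JavaRDD<String> lines) {
        return lines.flatMap(line -> Arrays.asList(line.split(" ")).iterator());
    }
}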
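
And the 1.7 -> 1.8 compiler-flag change mentioned above would look roughly
like this in a pom (a sketch only -- the actual REEF build may set these
values through shared properties rather than inline):

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-compiler-plugin</artifactId>
    <configuration>
        <source>1.8</source>
        <target>1.8</target>
    </configuration>
</plugin>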
