spark-dev mailing list archives

From Jiří Syrový <syrovy.j...@gmail.com>
Subject Re: Dependency Injection and Microservice development with Spark
Date Wed, 04 Jan 2017 12:06:17 GMT
Hi,

another nice approach is to use the Reader monad instead, together with a
framework that supports this style (e.g. Grafter -
https://github.com/zalando/grafter). It's lightweight and helps a bit with
dependency issues.
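A minimal sketch of the Reader monad idea, independent of Grafter: the names below (UserRepository, nameOf, greeting) are made up for illustration, and a hand-rolled Reader is shown rather than any particular library's.

```scala
// Minimal Reader monad: a computation that needs an environment R to produce an A.
case class Reader[R, A](run: R => A) {
  def map[B](f: A => B): Reader[R, B] = Reader(r => f(run(r)))
  def flatMap[B](f: A => Reader[R, B]): Reader[R, B] =
    Reader(r => f(run(r)).run(r))
}

// The "environment" that a DI container would otherwise inject.
trait UserRepository { def nameOf(id: Int): String }

object Example {
  // Business logic declares its dependency in the type, not via a framework.
  def userName(id: Int): Reader[UserRepository, String] =
    Reader(repo => repo.nameOf(id))

  def greeting(id: Int): Reader[UserRepository, String] =
    userName(id).map(n => s"Hello, $n")

  def main(args: Array[String]): Unit = {
    // Wiring happens once, at the edge, by supplying the environment.
    val inMemory = new UserRepository { def nameOf(id: Int) = s"user-$id" }
    println(greeting(42).run(inMemory))  // Hello, user-42
  }
}
```

The key point is that dependencies are visible in the return type and supplied at the call site, so swapping a test double in is just passing a different environment value.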

2016-12-28 22:55 GMT+01:00 Lars Albertsson <lalle@mapflat.com>:

> Do you really need dependency injection?
>
> DI is often used for testing purposes. Data processing jobs are easy
> to test without DI, however, due to their functional and synchronous
> nature. Hence, DI is often unnecessary for testing data processing
> jobs, whether they are batch or streaming jobs.
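[The testability point above can be sketched like this: keep the job's core logic a pure function over its input rows. The example below uses plain Scala collections and invented names (WordCountLogic, countWords); the same function could be applied inside a Spark job, e.g. via Dataset or RDD transformations.]

```scala
// Sketch: the job's core logic as a pure, synchronous function.
// No injection framework is needed to test it.
object WordCountLogic {
  def countWords(lines: Seq[String]): Map[String, Int] =
    lines
      .flatMap(_.toLowerCase.split("\\s+"))
      .filter(_.nonEmpty)
      .groupBy(identity)
      .map { case (w, ws) => (w, ws.size) }
}

object WordCountTest {
  def main(args: Array[String]): Unit = {
    // A test is just a function call on in-memory data.
    val result = WordCountLogic.countWords(Seq("spark spark", "streaming"))
    assert(result == Map("spark" -> 2, "streaming" -> 1))
    println(result)
  }
}
```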
>
> Or do you want to use DI for other reasons?
>
>
> Lars Albertsson
> Data engineering consultant
> www.mapflat.com
> https://twitter.com/lalleal
> +46 70 7687109
> Calendar: https://goo.gl/6FBtlS, https://freebusy.io/lalle@mapflat.com
>
>
> On Fri, Dec 23, 2016 at 11:56 AM, Chetan Khatri
> <chetan.opensource@gmail.com> wrote:
> > Hello Community,
> >
> > The current approach I am using for Spark job development is Scala + SBT
> > with an uber jar and a yml properties file to pass configuration
> > parameters. But if I would like to use dependency injection and
> > microservice development, like the Spring Boot feature set, in Scala,
> > what would be the standard approach?
> >
> > Thanks
> >
> > Chetan
>
> ---------------------------------------------------------------------
> To unsubscribe e-mail: user-unsubscribe@spark.apache.org
>
>
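[For the setup Chetan describes, plain constructor injection with a small config case class is often enough, with no container at all. The sketch below is illustrative: the config keys (input.path, shuffle.partitions) and class names are made up, and java.util.Properties is used instead of YAML only to keep the example dependency-free; Typesafe Config or a YAML parser would be the usual choice in a real job.]

```scala
import java.io.StringReader
import java.util.Properties

// Typed configuration, parsed once at startup.
final case class JobConfig(inputPath: String, shufflePartitions: Int)

object JobConfig {
  def fromProperties(p: Properties): JobConfig =
    JobConfig(
      inputPath = p.getProperty("input.path"),
      shufflePartitions = p.getProperty("shuffle.partitions", "200").toInt
    )
}

// The job receives its configuration through the constructor: no framework.
class IngestJob(config: JobConfig) {
  def describe: String =
    s"reading ${config.inputPath} with ${config.shufflePartitions} partitions"
}

object ConfigExample {
  def main(args: Array[String]): Unit = {
    val props = new Properties()
    props.load(new StringReader("input.path=/data/in\nshuffle.partitions=8"))
    val job = new IngestJob(JobConfig.fromProperties(props))
    println(job.describe)  // reading /data/in with 8 partitions
  }
}
```

Tests then construct JobConfig directly with whatever values they need, which is usually all the "injection" a batch or streaming job requires.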
