spark-user mailing list archives

From Lan Jiang <ljia...@gmail.com>
Subject Re: Configuring logging properties for executor
Date Mon, 20 Apr 2015 15:31:27 GMT
Each application gets its own executor processes, so there should be no problem running several applications in parallel.

Lan


> On Apr 20, 2015, at 10:25 AM, Michael Ryabtsev <michael.ry@gmail.com> wrote:
> 
> Hi Lan, 
> 
> Thanks for the fast response. It could be a solution if it works. I have more than one log4j properties file, for different run modes like debug/production, and for the executor versus the application core; I think I would like to keep them separate. Then, I suppose I should give all the other properties files special names and keep the executor configuration under the default name? Can I conclude that, going this way, I will not be able to run several applications on the same cluster in parallel?
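> 
> For context, here is roughly what I do today from the application code; this is just a sketch, and the "debugMode" flag and file names are my own examples:
> 
>     // Pick the executor log4j file per run mode (names are illustrative).
>     String log4jFile = debugMode ? "log4j_debug.properties"
>                                  : "log4j_production.properties";
>     sparkConf.set("spark.executor.extraJavaOptions",
>         "-Dlog4j.configuration=" + log4jFile);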
> 
> Regarding spark-submit, I am not using it now; I submit from the application code, but I think I should consider this option.
> 
> Thanks.
> 
> On Mon, Apr 20, 2015 at 5:59 PM, Lan Jiang <ljiang2@gmail.com> wrote:
> Rename your log4j_special.properties file to log4j.properties and place it at the root of your jar file, and you should be fine.
> 
> If you are using Maven to build your jar, place the log4j.properties file in the src/main/resources folder.
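> 
> A minimal layout sketch for a standard Maven project (the module name is just an example):
> 
>     my-spark-app/
>       pom.xml
>       src/main/resources/log4j.properties   <- packaged at the root of the jar
>       src/main/java/...                     <- application code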
> 
> However, please note that if another dependency jar on the classpath contains its own log4j.properties file, this approach might not work, since the first log4j.properties file found on the classpath is the one that gets loaded.
> 
> You can also do spark-submit --files log4j_special.properties …, which should transfer your log4j properties file to the worker nodes automatically, without you copying it manually.
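> 
> A sketch of the full command (the class and jar names are placeholders):
> 
>     spark-submit \
>       --class com.example.MyApp \
>       --files log4j_special.properties \
>       --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=log4j_special.properties" \
>       my-spark-app.jar
> 
> One caveat: log4j resolves a bare file name against the classpath, so depending on the cluster manager you may need to point at the distributed copy in the executor working directory explicitly, e.g. -Dlog4j.configuration=file:log4j_special.properties.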
> 
> Lan
> 
> 
> > On Apr 20, 2015, at 9:26 AM, Michael Ryabtsev <michael.ry@gmail.com> wrote:
> >
> > Hi all,
> >
> > I need to configure the spark executor log4j.properties on a standalone cluster.
> > It looks like placing the relevant properties file in the spark configuration
> > folder and setting spark.executor.extraJavaOptions from my application code:
> > sparkConf.set("spark.executor.extraJavaOptions",
> >     "-Dlog4j.configuration=log4j_special.properties");
> > does the job, and the executor logs are written in the required place and at
> > the required level. As far as I understand, it works because the spark
> > configuration folder is on the class path, so passing the parameter without a
> > path works here.
> > However, I would like to avoid deploying these properties files to each
> > worker's spark configuration folder.
> > I wonder, if I put the properties in my application jar, is there any way of
> > telling the executor to load them?
> >
> > Thanks,
> >
> >
> >
> > --
> > View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Configuring-logging-properties-for-executor-tp22572.html
> > Sent from the Apache Spark User List mailing list archive at Nabble.com.
> >
> > ---------------------------------------------------------------------
> > To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
> > For additional commands, e-mail: user-help@spark.apache.org
> >
> 
> 

