flume-user mailing list archives

From Gonzalo Herreros <gherre...@gmail.com>
Subject Re: Spark suppress INFO messages per Streaming Job
Date Wed, 24 Feb 2016 08:14:25 GMT
The way I have done that is by making a copy of the Spark config folder with
the updated log4j settings, and running the job with the flag that points to
that configuration folder.
The drawback is that if you later change other Spark settings for the
cluster, that job won't pick them up.
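A minimal sketch of that approach (the folder name here is a placeholder, not something from the thread; adjust the paths to your install):

```shell
# Hypothetical location for the per-job copy of the config folder.
CONF_COPY="$HOME/spark-conf-quiet"
mkdir -p "$CONF_COPY"
# In a real setup you would first copy the cluster defaults, e.g.:
#   cp -r "$SPARK_HOME/conf/." "$CONF_COPY/"
# Then quieten Log4j for this job only: WARN and above, everything else dropped.
cat > "$CONF_COPY/log4j.properties" <<'EOF'
log4j.rootCategory=WARN, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
EOF
# Point only this submission at the alternative folder:
#   SPARK_CONF_DIR="$CONF_COPY" spark-submit --class ... your-streaming-job.jar
```

The rest of the cluster keeps its own conf folder, which is exactly why the copy goes stale if the cluster-wide settings change.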

I guess other options are symlinking the unchanged config files into that
alternative config folder, or putting a log4j configuration at the front of
the driver/executor classpath with the extraClassPath options.
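The classpath variant could look like the sketch below, using the spark.driver.extraClassPath and spark.executor.extraClassPath options; the directory name is just an example:

```shell
# Hypothetical folder containing only a log4j.properties override.
LOG_DIR="$HOME/quiet-log4j"
mkdir -p "$LOG_DIR"
cat > "$LOG_DIR/log4j.properties" <<'EOF'
log4j.rootCategory=WARN, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.layout=org.apache.log4j.SimpleLayout
EOF
# Log4j 1.x picks up the first log4j.properties it finds on the classpath,
# so prepending this folder affects only this submission:
#   spark-submit \
#     --conf spark.driver.extraClassPath="$LOG_DIR" \
#     --conf spark.executor.extraClassPath="$LOG_DIR" \
#     --class ... your-streaming-job.jar
```

This avoids duplicating the whole config folder, at the cost of having to ship the extra directory to every executor host.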

Maybe in the Spark user list people know of better ways.


On 23 February 2016 at 23:54, Sutanu Das <sd2302@att.com> wrote:

> Community,
> How can I suppress INFO messages from a Spark Streaming job on a per-job
> basis? Meaning, I don’t want to change the log4j properties for the entire
> Spark cluster, but want to suppress just the INFO messages for a specific
> Streaming job, perhaps in the job properties file. Is that possible?
> Or do I need to call the sc._jvm.Logging function inside our Scala code
> to suppress INFO messages from RDDs?
> Please help us; otherwise, the Streaming job’s redirected output log gets so
> big with those INFO messages that our file system is filling up. Thanks again.
