An important part of any application is the underlying logging system we incorporate into it. Logs are not only for debugging and traceability, but also for business intelligence. A robust logging system within our apps can become a great source of insight into the business problems we are solving.
Spark uses log4j as its standard logging library. Everything that happens inside Spark gets logged to the shell console and to the configured underlying storage. Spark also provides a template for application authors, so we can use the same log4j libraries to add whatever messages we want to Spark's existing, in-place logging implementation.
Under the SPARK_HOME/conf folder, there is a log4j.properties.template file which serves as a starting point for our own logging system.
Based on this file, we created the log4j.properties file and put it under the same directory.
log4j.properties looks as follows:
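A minimal sketch of such a configuration is shown below. The appender names (`FILE`, `myConsoleAppender`, `RollingAppender`) and the Spark log path are our own choices and may differ in your setup; only /var/log/sparkU.log is fixed by the discussion that follows.

```properties
# Spark's own logs: file only, keep them out of the shell
log4j.rootCategory=INFO, FILE
log4j.appender.FILE=org.apache.log4j.RollingFileAppender
log4j.appender.FILE.File=/var/log/spark.log
log4j.appender.FILE.MaxFileSize=10MB
log4j.appender.FILE.MaxBackupIndex=10
log4j.appender.FILE.layout=org.apache.log4j.PatternLayout
log4j.appender.FILE.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

# Our application's logger: console plus its own rolling file
log4j.logger.myLogger=INFO, myConsoleAppender, RollingAppender
log4j.appender.myConsoleAppender=org.apache.log4j.ConsoleAppender
log4j.appender.myConsoleAppender.layout=org.apache.log4j.PatternLayout
log4j.appender.myConsoleAppender.layout.ConversionPattern=%d [%t] %-5p %c - %m%n
log4j.appender.RollingAppender=org.apache.log4j.DailyRollingFileAppender
log4j.appender.RollingAppender.File=/var/log/sparkU.log
log4j.appender.RollingAppender.DatePattern='.'yyyy-MM-dd
log4j.appender.RollingAppender.layout=org.apache.log4j.PatternLayout
log4j.appender.RollingAppender.layout.ConversionPattern=%d [%t] %-5p %c - %m%n
```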
Basically, we want to hide all the logs Spark generates so we don't have to deal with them in the shell; we redirect them to the file system instead. On the other hand, we want our own logs written both to the shell and to a separate file so they don't get mixed up with Spark's. From here, we can point Splunk to the files containing our own logs, which in this particular case is /var/log/sparkU.log.
This file (log4j.properties) is picked up by Spark when the application starts, so we don't have to do anything other than place it in the location mentioned above.
Now that we have configured the components that Spark requires in order to manage our logs, we just need to start writing logs within our apps.
To show how this is done, let's write a small app for the demonstration.
Our App:
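A minimal sketch of such an app might look like the following. The logger name `myLogger` and the object name `app` are our assumptions; the only requirement is that the logger name matches the one configured in log4j.properties.

```scala
import org.apache.log4j.Logger
import org.apache.spark.{SparkConf, SparkContext}

object app {
  def main(args: Array[String]): Unit = {
    // must match the logger name declared in log4j.properties
    val log = Logger.getLogger("myLogger")

    log.info("Hello demo")

    val conf = new SparkConf().setAppName("demo-app")
    val sc = new SparkContext(conf)

    // ... Spark work goes here ...

    log.info("I am done")
    sc.stop()
  }
}
```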
Running this Spark app demonstrates that our log system works: the Hello demo and I am done messages are logged to the shell and to the file system, while the Spark logs only go to the file system.
So far, everything seems easy, yet there is a problem we haven’t mentioned.
The class org.apache.log4j.Logger is not serializable, which implies we cannot use it inside a closure when operating on some parts of the Spark API.
For example, if we do this in our app:
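A sketch of the problematic pattern, assuming a `log` field defined on the driver and an RDD named `data`:

```scala
val log = Logger.getLogger("myLogger")
val data = sc.parallelize(1 to 100)

// log is captured by the closure, so Spark must serialize it
// to ship the task to the workers -- and Logger is not serializable
data.map { value =>
  log.info(value.toString)
  value.toString
}
```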
this will fail when running on Spark. Spark complains that the log object is not serializable, so it cannot be sent over the network to the Spark workers.
This problem is actually easy to solve. Let’s create a class that does something to our data set while doing a lot of logging.
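A sketch of such a class follows. The name `Mapper`, the method name, and the logged message are illustrative choices; the key part is the `@transient lazy val` on the logger.

```scala
import org.apache.log4j.Logger
import org.apache.spark.rdd.RDD

class Mapper(n: Int) extends Serializable {
  // @transient: excluded from serialization; lazy: re-created on
  // each worker the first time it is actually used there
  @transient lazy val log = Logger.getLogger("myLogger")

  def doSomeMappingOnDataSetAndLogIt(rdd: RDD[Int]): RDD[String] =
    rdd.map { i =>
      log.warn("mapping: " + i)
      (i + n).toString
    }
}
```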
Mapper receives an RDD[Int] and returns an RDD[String], and it also logs each value being mapped. In this case, note how the log object has been marked @transient, which tells the serialization system to ignore it. Mapper is still serialized and sent to each worker, but the log object is resolved only when it is needed on the worker, solving our problem.
Another solution is to wrap the log object in an object construct and use it everywhere. We would rather have log within the class where it is used, but the alternative is also valid.
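A sketch of this alternative, with a hypothetical holder object (the name `Holder` is our choice):

```scala
import org.apache.log4j.Logger

// Scala objects are serialized by reference, and the @transient
// lazy val is re-initialized on each worker when first accessed
object Holder extends Serializable {
  @transient lazy val log = Logger.getLogger("myLogger")
}

// usable inside any closure:
// rdd.map { i => Holder.log.info("mapping: " + i); i.toString }
```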
At this point, our entire app looks like follows:
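Putting the pieces together, a sketch of the complete app under the same assumptions (logger name `myLogger`, class name `Mapper`) could look like this:

```scala
import org.apache.log4j.Logger
import org.apache.spark.rdd.RDD
import org.apache.spark.{SparkConf, SparkContext}

class Mapper(n: Int) extends Serializable {
  @transient lazy val log = Logger.getLogger("myLogger")

  def doSomeMappingOnDataSetAndLogIt(rdd: RDD[Int]): RDD[String] =
    rdd.map { i =>
      log.warn("mapping: " + i)
      (i + n).toString
    }
}

object app {
  def main(args: Array[String]): Unit = {
    val log = Logger.getLogger("myLogger")
    log.info("Hello demo")

    val conf = new SparkConf().setAppName("demo-app")
    val sc = new SparkContext(conf)

    val data   = sc.parallelize(1 to 100)
    val mapper = new Mapper(1)
    val mapped = mapper.doSomeMappingOnDataSetAndLogIt(data)
    mapped.collect().foreach(log.info(_))

    log.info("I am done")
    sc.stop()
  }
}
```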
Our logs are now shown in the shell and also stored in their own file. Spark's logs are hidden from the shell and written to their own file. We have also solved the serialization problem that appears when logging from different workers.
We can now build more robust BI systems on top of our own Spark logs, as we do with the other, non-distributed systems and applications we have today. Business intelligence is a very big deal for us, and the right insights are always worth having.