Do it yourself

In this section, we present a problem statement for the reader, so that you can build your own application based on the content covered so far.

Here, we will extend the example given previously regarding the setup and configuration of NiFi. The problem statement is to read a real-time log file and put the events into Cassandra. The pseudo code is as follows (a Java sketch of the Kafka-to-Cassandra steps is given after the list):

  • Tail log file
  • Put events into Kafka topic
  • Read events from Kafka topic
  • Filter events
  • Push events into Cassandra
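
In NiFi itself, these steps typically map onto a chain of processors such as TailFile, PublishKafka, ConsumeKafka, a routing processor for the filter, and PutCassandraQL. To make the downstream half concrete, the following is a minimal Java sketch of steps three to five (read from Kafka, filter, push into Cassandra), assuming the kafka-clients 2.x consumer API and the DataStax Java driver 3.x. It is only an illustration of what the flow has to achieve, not part of the NiFi flow; the topic name logtopic, keyspace logdata, table logevents, and the ERROR-only filter rule are assumptions you should adapt to your own setup:

  import com.datastax.driver.core.Cluster;
  import com.datastax.driver.core.PreparedStatement;
  import com.datastax.driver.core.Session;
  import org.apache.kafka.clients.consumer.ConsumerRecord;
  import org.apache.kafka.clients.consumer.ConsumerRecords;
  import org.apache.kafka.clients.consumer.KafkaConsumer;

  import java.time.Duration;
  import java.util.Collections;
  import java.util.Properties;
  import java.util.UUID;

  public class LogEventToCassandra {
      public static void main(String[] args) {
          // Kafka consumer configuration -- broker address and topic name are assumptions
          Properties props = new Properties();
          props.put("bootstrap.servers", "localhost:9092");
          props.put("group.id", "log-to-cassandra");
          props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
          props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
          KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
          consumer.subscribe(Collections.singletonList("logtopic"));

          // Cassandra connection -- keyspace and table must already exist (see the schema snippet below)
          Cluster cluster = Cluster.builder().addContactPoint("127.0.0.1").build();
          Session session = cluster.connect("logdata");
          PreparedStatement insert =
                  session.prepare("INSERT INTO logevents (id, line) VALUES (?, ?)");

          try {
              while (true) {
                  ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                  for (ConsumerRecord<String, String> record : records) {
                      String line = record.value();
                      // Filter step: keep only ERROR lines (an assumed, illustrative rule)
                      if (line != null && line.contains("ERROR")) {
                          session.execute(insert.bind(UUID.randomUUID(), line));
                      }
                  }
              }
          } finally {
              consumer.close();
              session.close();
              cluster.close();
          }
      }
  }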

You have to install Cassandra and configure it so that NiFi is able to connect to it.
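
Before anything can be written, the target keyspace and table must exist. The following is a minimal sketch that creates them with the DataStax Java driver 3.x; the names (logdata, logevents), the single-node SimpleStrategy replication, and the two-column layout are assumptions chosen to match the sketch above, and you could equally run the same CQL statements from cqlsh:

  import com.datastax.driver.core.Cluster;
  import com.datastax.driver.core.Session;

  public class CreateLogSchema {
      public static void main(String[] args) {
          // Connect to a local, default-configured Cassandra node (an assumption)
          try (Cluster cluster = Cluster.builder().addContactPoint("127.0.0.1").build();
               Session session = cluster.connect()) {
              // Simple keyspace for a single-node development setup
              session.execute("CREATE KEYSPACE IF NOT EXISTS logdata "
                      + "WITH replication = {'class': 'SimpleStrategy', 'replication_factor': 1}");
              // One row per filtered log event
              session.execute("CREATE TABLE IF NOT EXISTS logdata.logevents ("
                      + "id uuid PRIMARY KEY, line text)");
          }
      }
  }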

Logstash is designed to process logs and forward them to other tools for storage or visualization. The best fit here is Elasticsearch, Logstash, and Kibana (ELK). Within the scope of this chapter, we will build the integration between Elasticsearch and Logstash; in the following chapters, we will integrate Elasticsearch with Kibana to complete the workflow. So all you need to do to build ELK is:

  • Create a program that reads real-time sensor data from PubNub and publishes the events to a Kafka topic (a sketch is given after this list)
  • Install Elasticsearch on the local machine and start it
  • Write a Logstash configuration that reads events from the Kafka topic, parses and formats them, and pushes them into Elasticsearch (a sample configuration follows the sketch)
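
For the first step, the following is a minimal sketch assuming the PubNub Java SDK 4.x and the Kafka producer API. The subscribe key, channel name sensor-channel, and topic name sensortopic are placeholders; note also that newer PubNub SDK releases add extra callback methods to SubscribeCallback, so the listener may need minor adjustments for your SDK version:

  import com.pubnub.api.PNConfiguration;
  import com.pubnub.api.PubNub;
  import com.pubnub.api.callbacks.SubscribeCallback;
  import com.pubnub.api.models.consumer.PNStatus;
  import com.pubnub.api.models.consumer.pubsub.PNMessageResult;
  import com.pubnub.api.models.consumer.pubsub.PNPresenceEventResult;
  import org.apache.kafka.clients.producer.KafkaProducer;
  import org.apache.kafka.clients.producer.ProducerRecord;

  import java.util.Collections;
  import java.util.Properties;

  public class PubNubToKafka {
      public static void main(String[] args) {
          // Kafka producer configuration -- broker address and topic name are assumptions
          Properties props = new Properties();
          props.put("bootstrap.servers", "localhost:9092");
          props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
          props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
          final KafkaProducer<String, String> producer = new KafkaProducer<>(props);

          // PubNub subscription -- replace the demo key and channel with your own
          PNConfiguration pnConfiguration = new PNConfiguration();
          pnConfiguration.setSubscribeKey("demo");
          pnConfiguration.setUuid("diy-sensor-reader");
          PubNub pubnub = new PubNub(pnConfiguration);

          pubnub.addListener(new SubscribeCallback() {
              @Override
              public void status(PubNub pubnub, PNStatus status) {
                  // Connection and state changes could be logged here
              }

              @Override
              public void message(PubNub pubnub, PNMessageResult message) {
                  // Forward every received sensor event to the Kafka topic as JSON text
                  String payload = message.getMessage().toString();
                  producer.send(new ProducerRecord<String, String>("sensortopic", payload));
              }

              @Override
              public void presence(PubNub pubnub, PNPresenceEventResult presence) {
                  // Presence events are not needed for this exercise
              }
          });

          pubnub.subscribe().channels(Collections.singletonList("sensor-channel")).execute();
      }
  }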
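
For the last step, a Logstash pipeline along the following lines is a reasonable starting point, assuming Logstash 5.x or later (where the Kafka input plugin takes bootstrap_servers and topics) and Elasticsearch running on its default port. The topic name, index name, and the assumption that events arrive as JSON strings must match your producer:

  input {
    kafka {
      bootstrap_servers => "localhost:9092"
      topics => ["sensortopic"]
    }
  }

  filter {
    # Parse the JSON payload published from PubNub into structured fields
    json {
      source => "message"
    }
  }

  output {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "sensor-data-%{+YYYY.MM.dd}"
    }
  }

Start Logstash with bin/logstash -f pointing at this file, and verify that documents are being indexed, for example by listing indices with curl localhost:9200/_cat/indices.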