Flume to Spark Streaming - Pull model

In this post I will demonstrate how to stream data from Flume into Spark using Spark Streaming. When it comes to streaming data from Flume to Spark, you have two options.
  1. Push model: Spark Streaming listens on a particular port for Avro events, and Flume connects to that port and publishes events to it
  2. Pull model: You configure a special Spark sink in Flume that buffers published events, and Spark pulls that data at a certain frequency
I built a simple configuration in which I send events to Flume through netcat; Flume takes those events, sends them to Spark, and also prints them to the console.
  • First, download spark-streaming-flume-sink_2.10-1.6.0.jar and copy it to Flume's lib directory
  • Next, create a Flume configuration like the one sketched after this list. Flume listens for netcat events on port 44444 and replicates every event to both a logger sink and the Spark sink. The Spark sink listens on port 9999 for the Spark program to connect
  • This is how your Spark driver will look (see the driver sketch below). The Flume receiver delivers events in Avro format, so you have to call event.getBody().array() to get the event payload
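
Here is a minimal sketch of such a Flume configuration. Only the ports (44444 and 9999) and the replicate-to-logger-and-Spark-sink layout come from this post; the agent name, sink/channel names, host, and channel capacities are illustrative assumptions.

```
# Hypothetical agent name "agent1"; names, host and capacities are assumptions
agent1.sources = netcatSource
agent1.channels = memChannel1 memChannel2
agent1.sinks = loggerSink sparkSink

# Netcat source listening on port 44444
agent1.sources.netcatSource.type = netcat
agent1.sources.netcatSource.bind = localhost
agent1.sources.netcatSource.port = 44444
# Replicate every event to both channels (logger + Spark)
agent1.sources.netcatSource.selector.type = replicating
agent1.sources.netcatSource.channels = memChannel1 memChannel2

# Logger sink prints events to the Flume console/log
agent1.sinks.loggerSink.type = logger
agent1.sinks.loggerSink.channel = memChannel1

# Spark sink buffers events until the Spark driver pulls them on port 9999
agent1.sinks.sparkSink.type = org.apache.spark.streaming.flume.sink.SparkSink
agent1.sinks.sparkSink.hostname = localhost
agent1.sinks.sparkSink.port = 9999
agent1.sinks.sparkSink.channel = memChannel2

# In-memory channels
agent1.channels.memChannel1.type = memory
agent1.channels.memChannel1.capacity = 1000
agent1.channels.memChannel2.type = memory
agent1.channels.memChannel2.capacity = 1000
```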
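And here is a minimal sketch of a Scala driver for the pull model, assuming spark-streaming-flume is on the classpath. The app name, batch interval, master setting, and the simple print() output are assumptions; only the polling connection on port 9999 and the getBody().array() decoding come from the post.

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.flume.FlumeUtils

object FlumePullDriver {
  def main(args: Array[String]): Unit = {
    // local[2] so one core runs the receiver and one processes batches
    val conf = new SparkConf().setAppName("FlumePullDriver").setMaster("local[2]")
    val ssc = new StreamingContext(conf, Seconds(5))

    // Pull events from the SparkSink running inside the Flume agent on port 9999
    val flumeStream = FlumeUtils.createPollingStream(ssc, "localhost", 9999)

    // Each record is a SparkFlumeEvent wrapping an Avro event;
    // getBody() returns a ByteBuffer, so convert it to a String
    val lines = flumeStream.map(record => new String(record.event.getBody.array()))
    lines.print()

    ssc.start()
    ssc.awaitTermination()
  }
}
```

Because createPollingStream uses a reliable receiver, events are removed from the Flume channel only after Spark has received and stored them, which is the main advantage of the pull model over the push model.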
Once your Spark driver and Flume agent are started, open netcat on port 44444 and send messages; those messages should appear in your Spark console.
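
For reference, a typical way to start the agent and feed it messages; the config file path and agent name below match the assumptions in the sketch above.

```
# Start the Flume agent (file name and agent name are assumptions)
bin/flume-ng agent --conf conf --conf-file conf/spark-pull.conf --name agent1 -Dflume.root.logger=INFO,console

# In another terminal, connect to the netcat source and type messages
nc localhost 44444
```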

7 comments:

  1. Hi, I am trying to integrate Spark with Flume. Can you please make a video or elaborate the post step by step? Thank you so much.

  2. Can you please provide the same concept using Java? And also make a video.

  3. Hi Sunil

    Can you let me know which dependencies are required in the pom.xml file, and also how to integrate it with the sbt build tool in Eclipse?
    Thanks

  4. Hi Sunil,
    Can you please specify the command you used for deploying Spark? Which host and port numbers are used for the same?

  5. Hi,

    I'm trying to implement it, but my problem is that my Flume memory channel fills up while events are not consumed by the sink. My configuration seems good. Did you ever run into this kind of problem?

    Thanks in advance

    Hélène
