WordCount MapReduce program using Hadoop Streaming and Python

I wanted to learn how to use Hadoop Streaming, which lets you develop MapReduce programs in scripting languages such as Python or Ruby. The idea is that instead of writing Java classes for the Mapper and Reducer, you develop two script files (anything that can be executed from the command line), one for the mapper and one for the reducer, and pass them to Hadoop. Hadoop communicates with the scripts through standard input/output: for both the mapper and the reducer, Hadoop passes input on standard input and your script reads it from there. Once your script is done processing the data, it writes its output to standard output, which gets sent back to Hadoop. I decided to create a WordCount program that takes a file as input, counts the occurrences of every word in the file, and writes them to the output. I followed these steps:
  1. I started by creating a mapper.py file. The mapper reads one line from input at a time, splits it into words, and writes each word to output in (word, 1) format. Whatever the mapper writes to standard output gets passed back to Hadoop, so I could not use standard output for debug statements; instead I configured a file logger that writes a debug.log file in the current directory.
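A minimal sketch of the mapper described in step 1 (not my exact code, but the same idea: stdout is reserved for Hadoop, so debug output goes to a file logger):

```python
#!/usr/bin/env python
# mapper.py -- emits "word<TAB>1" for every word on standard input.
import sys
import logging

def map_line(line):
    """Split one input line into word<TAB>1 pairs."""
    return ["%s\t1" % word for word in line.strip().split()]

def main(stream=sys.stdin, out=sys.stdout):
    # Debug messages go to debug.log in the current directory,
    # because anything on stdout would be treated as mapper output.
    logging.basicConfig(filename="debug.log", level=logging.DEBUG)
    for line in stream:
        logging.debug("mapper input: %r", line)
        for pair in map_line(line):
            out.write(pair + "\n")

if __name__ == "__main__":
    main()
```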
  2. Next I created a reducer.py program that reads one line at a time and splits it on the tab character; the first part of the split is the word and the second is the count. One difference between a Java reducer and a streaming reducer is that in Java your reduce method gets its input grouped, like (key, [value1, value2, value3]), (key1, [value1, value2, value3]). In streaming it gets called with one key/value pair at a time, like (key, value1), (key, value2), (key, value3), (key1, value1), (key1, value2), (key1, value3), so you have to remember which key you are processing and handle the change in key yourself. In my reducer I keep track of the current key and accumulate its values; when the key changes, I use that opportunity to emit the old key and its count.
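The reducer in step 2 can be sketched like this (a simplified version, assuming the sorted "word&lt;TAB&gt;count" input that Hadoop Streaming delivers):

```python
#!/usr/bin/env python
# reducer.py -- sums counts for consecutive lines with the same word.
# Streaming delivers one key/value pair per line, already sorted by
# key, so the script tracks the current key and emits a total
# whenever the key changes.
import sys

def reduce_stream(stream, out=sys.stdout):
    current_word = None
    current_count = 0
    for line in stream:
        if not line.strip():
            continue  # skip blank lines defensively
        word, _, count = line.strip().partition("\t")
        if word == current_word:
            current_count += int(count)
        else:
            # Key changed: flush the previous word's total.
            if current_word is not None:
                out.write("%s\t%d\n" % (current_word, current_count))
            current_word = word
            current_count = int(count)
    # Flush the last word after the input is exhausted.
    if current_word is not None:
        out.write("%s\t%d\n" % (current_word, current_count))

if __name__ == "__main__":
    reduce_stream(sys.stdin)
```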
  3. One nice part of developing with a scripting language is that you can test your code without Hadoop. Once the mapper and reducer are ready, you can test them on the command line in the form data | mapper | sort | reducer. In my case the mapper and reducer files are in the /home/user/workspace/HadoopPython/streaming/ directory and I have a sample file in my home directory, so I could test my program by executing it like this: cat ~/sample.txt | /home/user/workspace/HadoopPython/streaming/mapper.py | sort | /home/user/workspace/HadoopPython/streaming/reducer.py
  4. After working through the bugs, I copied aesop.txt into the root of my HDFS, and then I could use the following command to execute my MapReduce program: hadoop jar /usr/local/hadoop/share/hadoop/tools/lib/hadoop-streaming-2.4.0.jar -input aesop.txt -output output/wordcount -mapper /home/user/workspace/HadoopPython/streaming/mapper.py -reducer /home/user/workspace/HadoopPython/streaming/reducer.py
  5. Once the program finished executing, I could see the output it generated using the following command: hdfs dfs -cat output/wordcount/part-00000
Note: My mapper and reducer code is not as compact as it could be, because I am new to Python.
