Hadoop provides a Java API that you can use to perform common file operations, such as reading a file, creating a new file, appending to the end of an existing file, or searching for files. I wanted to try these common operations out, so I built this HelloHDFS project, which you can download from
here
This is the main class: it takes command line arguments for the operation name and the file path, then performs the requested operation.
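A minimal sketch of what such a main class might look like; the class name, argument handling, and helper method below are assumptions for illustration, not the project's actual source.

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HelloHDFS {
    public static void main(String[] args) throws Exception {
        String operation = args[0]; // e.g. "read"
        String filePath  = args[1]; // relative or fully qualified HDFS path

        // Resolve the FileSystem from the path's URI, so both
        // "aesop.txt" and "hdfs://localhost/user/user/aesop.txt" work
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(new Path(filePath).toUri(), conf);

        if ("read".equals(operation)) {
            read(fs, filePath);
        }
    }

    // Open the file on HDFS and print its contents line by line
    private static void read(FileSystem fs, String filePath) throws Exception {
        try (FSDataInputStream in = fs.open(new Path(filePath));
             BufferedReader reader = new BufferedReader(new InputStreamReader(in))) {
            String line;
            while ((line = reader.readLine()) != null) {
                System.out.println(line);
            }
        }
    }
}
```

Running this sketch requires the Hadoop client libraries on the classpath and a reachable HDFS NameNode, so it is not runnable standalone.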
After downloading the source code for the project, you can use the following Maven command to build a single jar that also contains all the dependencies:
mvn clean compile assembly:single
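For the `assembly:single` goal to produce a runnable jar-with-dependencies, the project's `pom.xml` needs the `maven-assembly-plugin` configured along these lines; this is a typical configuration, and the `mainClass` value here is an assumption, not taken from the project.

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-assembly-plugin</artifactId>
  <configuration>
    <archive>
      <manifest>
        <!-- Entry point baked into the jar manifest (assumed class name) -->
        <mainClass>HelloHDFS</mainClass>
      </manifest>
    </archive>
    <descriptorRefs>
      <!-- Bundles all dependencies into one executable jar -->
      <descriptorRef>jar-with-dependencies</descriptorRef>
    </descriptorRefs>
  </configuration>
</plugin>
```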
Once your jar is ready, you can use it like this to read a file from HDFS with a relative path:
java -jar target/HelloHDFS-jar-with-dependencies.jar read aesop.txt
or with a fully qualified path:
java -jar target/HelloHDFS-jar-with-dependencies.jar read hdfs://localhost/user/user/aesop.txt
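Besides reading, the same `FileSystem` API covers the create and append operations mentioned above. Here is a hedged sketch of both; the method names and the written text are illustrative, not the project's code.

```java
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class WriteExamples {
    // Create a new file on HDFS and write a line to it
    static void create(FileSystem fs, String filePath) throws Exception {
        try (FSDataOutputStream out = fs.create(new Path(filePath))) {
            out.writeBytes("hello hdfs\n");
        }
    }

    // Append to the end of an existing file; note that append
    // must be supported and enabled on the cluster
    static void append(FileSystem fs, String filePath) throws Exception {
        try (FSDataOutputStream out = fs.append(new Path(filePath))) {
            out.writeBytes("appended line\n");
        }
    }
}
```

Like the read example, these calls need Hadoop client libraries and a running HDFS cluster to execute.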