Importing data from RDBMS into Hive using Sqoop's create-hive-table

In Importing data from RDBMS into Hive I blogged about how to import data from an RDBMS into Hive using Sqoop. In that case the import command took care of both creating the Hive table based on the RDBMS table and importing the data into it. But Sqoop can also be used to load data that is already stored as a text file in HDFS into Hive. I wanted to try that out, so I first exported the contact table into HDFS as a text file, then created the contact table in Hive as a separate step and used that text file as input
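The original post does not show the schema of the contact table, so for the steps below assume something simple like the following MySQL definition (the column names here are hypothetical):

```sql
-- Hypothetical schema for the contact table used in the examples;
-- the actual columns are not shown in the original post.
CREATE TABLE contact (
  id    INT PRIMARY KEY,
  name  VARCHAR(100),
  email VARCHAR(100)
);
```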
  1. First I used the sqoop import command to import the content of the contact table into HDFS as a text file. By default Sqoop uses , for separating columns and a newline for separating records
    
    sqoop import --connect jdbc:mysql://macos/test --table contact -m 1
    
    After the import is done I can see the content of the text file by executing hdfs dfs -cat contact/part-m-00000
  2. After that you can use Sqoop to create a table in Hive based on the schema of the contact table in the RDBMS, by executing the following command
    
    sqoop create-hive-table --connect jdbc:mysql://macos/test --table contact --fields-terminated-by ','
    
  3. The last step is to use Hive to load the content of the contact text file into the contact table, by executing the following command.
    
    LOAD DATA INPATH 'contact' INTO TABLE contact;
    
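As a quick illustration of the default delimiters mentioned in step 1, here is a small Python sketch (not part of Sqoop itself) that formats rows the way Sqoop's default text import lays them out: fields joined by , and records separated by newlines, with NULL columns written as the string null. The row values are made up for the example:

```python
def to_sqoop_text(rows, field_sep=",", record_sep="\n", null_string="null"):
    """Format rows the way Sqoop's default text import writes them to HDFS."""
    lines = []
    for row in rows:
        # NULL database columns appear as the literal string "null" by default
        fields = [null_string if v is None else str(v) for v in row]
        lines.append(field_sep.join(fields))
    return record_sep.join(lines)

rows = [
    (1, "John Doe", "john@example.com"),
    (2, "Jane Roe", None),
]
print(to_sqoop_text(rows))
# 1,John Doe,john@example.com
# 2,Jane Roe,null
```

This is also why step 2 passes --fields-terminated-by ',' when creating the Hive table: the table's field delimiter has to match the delimiter of the file being loaded.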

Comments:

  1. Why do we need to import data into Hive/HBase instead of just HDFS? Can you please explain?