Apache Sqoop
Sqoop is an Apache project that is part of the broader
Hadoop ecosystem. It is built on top of MapReduce and takes advantage of its
parallelism and fault tolerance. Sqoop was designed to move data into and out of
relational databases, using a JDBC driver to connect, and its functionality is
extensive. This hands-on example will show how to use Sqoop to import data from MySQL into Hive. Hope you will enjoy it!
Installing Apache Sqoop
Download Apache Sqoop 1.4.5 from the Apache Sqoop 1.4.5 link.
Go to the location where you want Sqoop installed (here it is /usr/lib/).
[hadoopadmin@kingsolomon ~]$ tar -xzf sqoop-1.4.5.bin__hadoop-2.0.4-alpha.tar.gz
[hadoopadmin@kingsolomon ~]$ mv sqoop-1.4.5.bin__hadoop-2.0.4-alpha /usr/lib/sqoop
Sqoop Configuration
Setup Environment Variable
Set up the environment variables by editing the ~/.bashrc file (/home/hadoopadmin/.bashrc) and appending the following values at the end of the file.
export SQOOP_HOME=/usr/lib/sqoop
export PATH=$PATH:$SQOOP_HOME/bin
[hadoopadmin@kingsolomon ~]$ source ~/.bashrc
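With the variables exported and the file sourced, a quick sanity check can confirm the launcher is visible (a sketch; it only reports status and assumes the exports above):

```shell
# Report whether the sqoop launcher is reachable after sourcing ~/.bashrc.
if command -v sqoop >/dev/null 2>&1; then
    status="sqoop found on PATH"
else
    status="sqoop not on PATH - re-check SQOOP_HOME and PATH in ~/.bashrc"
fi
echo "$status"
```

If the launcher is found, running `sqoop version` should print the installed release (1.4.5 here).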
Edit Sqoop Configuration Files
[hadoopadmin@kingsolomon ~]$ cd /usr/lib/sqoop/conf
Rename the template file as follows and make changes accordingly:
1. Rename sqoop-env-template.sh to sqoop-env.sh and set:
export HADOOP_COMMON_HOME=/usr/HadoopEnv/hadoop-2.6.0
export HADOOP_MAPRED_HOME=/usr/HadoopEnv/hadoop-2.6.0
export JAVA_HOME=/usr/lib/jvm/java-1.7.0-openjdk
2. Download the MySQL connector JAR and copy it into the Sqoop library folder /usr/lib/sqoop/lib:
copy mysql-connector-java-5.1.29.jar into the Sqoop lib folder,
then proceed with the steps below.
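The copy step can be sketched as follows (the current-directory download location is an assumption; adjust the path and connector version to your setup):

```shell
# Copy the MySQL JDBC connector into Sqoop's lib directory so Sqoop can
# load the driver at runtime. Paths and version are this tutorial's values.
SQOOP_LIB="${SQOOP_HOME:-/usr/lib/sqoop}/lib"
JAR="mysql-connector-java-5.1.29.jar"
if [ -f "$JAR" ]; then
    cp "$JAR" "$SQOOP_LIB/"
else
    echo "place $JAR in the current directory first"
fi
```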
Installing MySQL
[hadoopadmin@localhost Desktop]$ su
Password: (enter the root user's password)
[root@localhost Desktop]# yum -y install mysql-server
[root@localhost Desktop]# service mysqld start
[root@localhost Desktop]# /usr/bin/mysql_secure_installation
By default, a MySQL installation has an anonymous user, allowing anyone to log into MySQL without having to have a user account created for them. This is intended only for testing, and to make the installation go a bit smoother. You should remove it before moving into a production environment. Then set the root password.
Remove anonymous users? [Y/n] y
Disallow root login remotely? [Y/n] n
Remove test database and access to it? [Y/n] n
Reload privilege tables now? [Y/n] y
Thanks for using MySQL!
Start MySQL Server
[root@localhost Desktop]# service mysqld start
[root@localhost Desktop]# mysql -u root -p
Enter password:
Welcome to the MySQL monitor.
Create the database and table using the commands below:
mysql> create database hadoopdb;
mysql> use hadoopdb;
mysql> CREATE TABLE iqubalcontact (
         contact_id int(11) NOT NULL AUTO_INCREMENT,
         name varchar(45) NOT NULL,
         email varchar(45) NOT NULL,
         address varchar(45) NOT NULL,
         telephone varchar(45) NOT NULL,
         PRIMARY KEY (contact_id)
       ) ENGINE=InnoDB AUTO_INCREMENT=25 DEFAULT CHARSET=utf8;
INSERT INTO iqubalcontact (name, email, address, telephone)
VALUES ("Steve Job", "Steve.Jobs@apple.com", "CA", "8390900000");
INSERT INTO iqubalcontact (name, email, address, telephone)
VALUES ("Bill Gates", "Bill.Gates@microsoft.com", "CA", "8390900000");
INSERT INTO iqubalcontact (name, email, address, telephone)
VALUES ("Iqubal M Kaki", "Iqubal.MKaki@szias.com", "INDIA", "8390900000");
Now create the same database inside Hive using the Hive shell (for Hive installation and configuration, please go through my previous blog on Apache Hive).
Note: Ensure that Hadoop is up and running (use the jps command) and don't close the MySQL shell terminal. Open another Unix terminal to perform the Hive task.
[hadoopadmin@kingsolomon ~]$ hive
hive> create database hadoopdb;
Importing data from MySQL into Hive using Sqoop
Run the Sqoop command below to import the MySQL table data into Hive:
[hadoopadmin@kingsolomon ~]$ sqoop import --connect jdbc:mysql://localhost/hadoopdb \
    --username root -P \
    --table iqubalcontact \
    --hive-import \
    --hive-table hadoopdb.iqubalcontact -m 1
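The --connect value follows the usual JDBC pattern jdbc:mysql://&lt;host&gt;[:&lt;port&gt;]/&lt;database&gt;. A small sketch assembling it from this tutorial's values:

```shell
# Build the JDBC connect string passed to --connect above.
host="localhost"
db="hadoopdb"
connect="jdbc:mysql://${host}/${db}"
echo "$connect"   # jdbc:mysql://localhost/hadoopdb
```

If your MySQL server runs on another machine or a non-default port, substitute the host and add the port accordingly.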
After some time, it will have successfully imported the MySQL table data into Hive.
Test Hive Import from Hive Shell
[hadoopadmin@kingsolomon ~]$ hive
hive> use hadoopdb;
hive> select * from iqubalcontact;
Hope you have enjoyed this article, which walked through a hands-on example.
Author: Iqubal Mustafa Kaki, Technical Specialist.
Want to connect with me
If you want to connect with me, please connect through my email - iqubal.kaki@gmail.com