- Import all MySQL tables from the "training" database into the HDFS directory "hdfsdir"
$ sqoop import-all-tables --connect jdbc:mysql://localhost/training --username training --password training --warehouse-dir hdfsdir
- Log into the Hive shell
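Assuming the hive binary is on the PATH:
$ hive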
- Create an external table (without a LOCATION clause) with the same schema as the MySQL table "movies"
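A minimal sketch of the DDL, assuming the columns eid, ename, code (as in movies1 below) and Sqoop's default comma field delimiter:
hive> CREATE EXTERNAL TABLE movies (eid INT, ename STRING, code STRING) ROW FORMAT DELIMITED FIELDS TERMINATED BY ',' STORED AS TEXTFILE;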
- Load data from the HDFS location into the table "movies"
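For example, assuming Sqoop placed the table's files under hdfsdir/movies (LOAD DATA INPATH moves the files into the table's location):
hive> LOAD DATA INPATH 'hdfsdir/movies' INTO TABLE movies;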
- Run Hive queries
hive> select eid, ename, code from movies where eid > 1000;
- Overwrite the table "movies" with a filter. This deletes the "movies" directory under "/user/hive/warehouse" and recreates "/user/hive/warehouse/movies" with a file 0000000_0 whose contents are the records that match the filter
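A sketch using the eid > 1000 filter from the query above (Hive allows a table to be overwritten from a select on itself):
hive> INSERT OVERWRITE TABLE movies SELECT * FROM movies WHERE eid > 1000;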
- Create a table in MySQL with the same schema
mysql> CREATE TABLE movies1 ( eid int(11) NOT NULL, ename char(255), code char(255));
- Export data from Hive/HDFS into MySQL
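A sketch of the export, assuming the comma field delimiter used in the Hive table definition above:
$ sqoop export --connect jdbc:mysql://localhost/training --username training --password training --table movies1 --export-dir /user/hive/warehouse/movies --input-fields-terminated-by ','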