I am trying to update a Hive table based on the records from a MySQL table.
MySQL table (name: delimiter_test):
+---------------+-----------------+
| department_id | department_name |
+---------------+-----------------+
|             2 | Fitness         |
|             3 | Footwear        |
|             4 | Apparel         |
|             5 | Golf            |
|             6 | Outdoors        |
|             7 | Fan Shop        |
|             8 | Test            |
+---------------+-----------------+
Hive table (name: my_test):
2 Fitness
3 Footwear
4 Apparel
5 Golf
6 Outdoors
7 Fan Shop
I am trying to use Sqoop to import the last record from the MySQL table (the one with department_id 8) into the Hive table, using an incremental import.
My Sqoop command:
sqoop import --connect "jdbc:mysql://quickstart.cloudera:3306/retail_db" --username xxx --password xxx --table delimiter_test --hive-import --hive-table my_test --split-by department_id --check-column department_id --incremental append --last-value 7
I am not getting any errors, but the extra record from the MySQL table with department_id 8 does not show up in the Hive table.
Please suggest where I am going wrong.
I am not sure whether you are working on the itversity labs, but I have done this with the commands below; they may work for you too.
First, load the data into Hive:
sqoop import --connect jdbc:mysql://xxxxx/retail_db --username xxxx --password xxxx \
--table departments --where department_id=2 --hive-import --hive-database poc --hive-table departments_sqoop \
--target-dir /user/ingenieroandresangel/sqoop/dep_hive --split-by department_id -m 1
Then I perform the incremental update with the command below:
sqoop import --connect jdbc:mysql://xxxxxx/retail_db --username xxxxx --password xxxx \
--table departments --where 'department_id>=2' --hive-import --hive-database poc --hive-table departments_sqoop --incremental append \
--check-column department_id --last-value 2 --target-dir /user/ingenieroandresangel/sqoop/dep_hive --split-by department_id -m 1
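After the second run, every row whose department_id is greater than the --last-value (2 here) should be appended to the Hive table. As a quick check (a minimal sketch, assuming the same poc database and departments_sqoop table names used above), you can query the table from the shell:
hive -e "SELECT * FROM poc.departments_sqoop ORDER BY department_id;"
If the new rows appear there, the incremental append worked; on the next run you would raise --last-value to the highest department_id already imported.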