
Sqoop: Hive to MySQL and MySQL to Hive

-------------
Note: the target Hive table cannot be "stored as RCFile"; textfile works. With an RCFile table the import does not land in the table at all and the data goes straight to the default HDFS path instead.

1. MySQL --> HDFS
(The import fails if the target HDFS directory already exists; see the --delete-target-dir variant below.)
sqoop import --connect jdbc:mysql://192.168.1.10/testdb --username chroot --password chcloud --table user_test --target-dir /warehouse/test_tn.db/test -m 1
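If the job needs to be re-runnable, Sqoop's --delete-target-dir option drops the target directory before importing. A minimal sketch reusing the same example connection details as above:

sqoop import --connect jdbc:mysql://192.168.1.10/testdb --username chroot --password chcloud --table user_test --delete-target-dir --target-dir /warehouse/test_tn.db/test -m 1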


2. MySQL --> Hive

--- Defaults: data lands under /user/<current user>/<table name> on HDFS, and the Hive table is created in the default database (not in any named database):
sqoop import --connect jdbc:mysql://172.168.234.31/bd_chiq --username etl --password etl --table whole_active_data_hive --hive-import -m 1
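A plaintext --password on the command line leaks into shell history and process listings. Sqoop also accepts -P (prompt on the console) or --password-file pointing at a protected file; a variant of the command above using the prompt:

sqoop import --connect jdbc:mysql://172.168.234.31/bd_chiq --username etl -P --table whole_active_data_hive --hive-import -m 1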

--- Specify the target Hive database and table explicitly, and overwrite it (the MySQL and Hive tables have the same schema):
sqoop import --connect jdbc:mysql://172.168.234.31/bd_chiq --username etl --password etl --table whole_active_data_hive --fields-terminated-by '\t' --hive-import --hive-table black_goods_data.whole_active_data --hive-overwrite -m 1
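To load only a subset of rows, the --where option pushes a filter into the generated SELECT; the filter column below is an assumption for illustration, not taken from the original tables:

sqoop import --connect jdbc:mysql://172.168.234.31/bd_chiq --username etl --password etl --table whole_active_data_hive --where "p_log_date >= '2020-01-01'" --fields-terminated-by '\t' --hive-import --hive-table black_goods_data.whole_active_data --hive-overwrite -m 1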

3. Create a Hive table with the same schema as a given MySQL table
sqoop create-hive-table --connect jdbc:mysql://192.168.1.10/testdb --username chroot --password chcloud --table channel --fields-terminated-by '\t'
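By default the table is created in Hive's default database; --hive-table places it elsewhere (the black_goods_data.channel target below is assumed for illustration):

sqoop create-hive-table --connect jdbc:mysql://192.168.1.10/testdb --username chroot --password chcloud --table channel --hive-table black_goods_data.channel --fields-terminated-by '\t'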


4. Import MySQL data into HDFS using a free-form SQL query

The SQL statement must contain "where $CONDITIONS"; Sqoop replaces the placeholder with the split conditions for each mapper.
sqoop import --connect jdbc:mysql://192.168.1.10/testdb --username chroot --password chcloud --query 'select distinct username, channel from user_test where $CONDITIONS' --split-by username --target-dir /user/query/ -m 1
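With more than one mapper, --split-by tells Sqoop which column to partition the query on. The same import run in parallel (if the query were wrapped in double quotes, $CONDITIONS would have to be escaped as \$CONDITIONS; recent Sqoop versions also require -Dorg.apache.sqoop.splitter.allow_text_splitter=true to split on a text column):

sqoop import --connect jdbc:mysql://192.168.1.10/testdb --username chroot --password chcloud --query 'select distinct username, channel from user_test where $CONDITIONS' --split-by username --target-dir /user/query/ -m 4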


5. Import MySQL data into Hive using a free-form SQL query

A --target-dir must still be given; -e is shorthand for --query:
sqoop import --connect jdbc:mysql://118.123.227.19:3306/wggetl --username wggetl --password wggetl -e 'select * from x where $CONDITIONS' --target-dir /warehouse/tag_system.db/mid_fridge_food_management --hive-import --hive-table tag_system.mid_fridge_food_management --fields-terminated-by '\t' --hive-overwrite -m 1
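String columns containing tab or newline characters would corrupt the '\t'-delimited Hive layout; --hive-drop-import-delims strips \n, \r, and \01 from string fields during the import:

sqoop import --connect jdbc:mysql://118.123.227.19:3306/wggetl --username wggetl --password wggetl -e 'select * from x where $CONDITIONS' --target-dir /warehouse/tag_system.db/mid_fridge_food_management --hive-import --hive-table tag_system.mid_fridge_food_management --fields-terminated-by '\t' --hive-drop-import-delims --hive-overwrite -m 1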


--------------
--------------

For reference, the MySQL query that assembles the food-management data: one row per device, concatenating the normalized food types seen in the last 30 days:

select 'global' as cid,
       'barcode' as idtype,
       devsn as id,
       GROUP_CONCAT(type) as food_management,
       replace(date_sub(CURRENT_DATE(), interval 1 day), '-', '') as updatetime,
       date_sub(CURRENT_DATE(), interval 1 day) as p_log_date
from (
    select distinct devsn,
           case when type = '剩菜' then '剩菜剩饭'
                when type = 'Vegetables' then '蔬菜'
                when type = '1' then '蔬菜'
                when type = 'Others' then '其他'
                else type
           end as type
    from ingredients_history
    where CAST(savetime AS date) >= date_sub(CURRENT_DATE(), interval 30 day)
) w
group by devsn;



6. Hive --> MySQL (the target MySQL table must already exist):
sqoop export --connect jdbc:mysql://172.168.234.31/bd_chiq --username etl --password etl --table whole_active_data_hive --input-fields-terminated-by '\t' --export-dir /warehouse/black_goods_data.db/whole_active_data
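Hive writes NULLs as the literal string \N, and a plain export only inserts rows. A sketch that maps \N back to SQL NULL and upserts instead of appending (id is an assumed key column):

sqoop export --connect jdbc:mysql://172.168.234.31/bd_chiq --username etl --password etl --table whole_active_data_hive --input-fields-terminated-by '\t' --input-null-string '\\N' --input-null-non-string '\\N' --update-key id --update-mode allowinsert --export-dir /warehouse/black_goods_data.db/whole_active_data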




hadoop fs -rm /warehouse/test_tn.db/tnn/test.txt
hadoop fs -put record1.txt /warehouse/test_tn.db/tnn/record1

hadoop fs -copyFromLocal usercount.txt hdfs://localhost/root/usercount.txt
hadoop fs -put record1.txt /warehouse/test_tn.db/tnn/record

create table timeslot_hour(timeslot string, hour int)
row format delimited fields terminated by ','
stored as textfile
location '/user/root/timeslot_hour';
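Because the table points at an explicit HDFS location, any comma-delimited file dropped into that directory is immediately visible to queries; for example, with a hypothetical local file:

hadoop fs -put timeslot_hour.csv /user/root/timeslot_hour/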


Dump a MySQL query result to a tab-separated file (saved with an .xls extension so Excel opens it directly):
mysql -h 192.168.1.10 -uchiq_voice -pchiq_voice --default-character-set=gb2312 -B -e 'select * from bd_chiq.bd_chiq_appstore_activation;' > /opt/tnn/data/bd_chiq_appstore_activation.xls
