Hive Study Notes, Part 8: Sqoop


Welcome to my GitHub

https://github.com/zq2599/blog_demos

Contents: a categorized index of all my original articles with companion source code, covering Java, Docker, Kubernetes, DevOps, and more.

About Sqoop

Sqoop is an open-source Apache project for efficiently transferring bulk data between Hadoop and relational databases. In this article we will walk through the following:

  1. Deploy Sqoop
  2. Use Sqoop to export Hive table data to MySQL
  3. Use Sqoop to import MySQL data into a Hive table

Deployment

  1. In the hadoop account's home directory, download Sqoop 1.4.7:

wget https://mirror.bit.edu.cn/apache/sqoop/1.4.7/sqoop-1.4.7.bin__hadoop-2.6.0.tar.gz

  2. Extract the archive:

tar -zxvf sqoop-1.4.7.bin__hadoop-2.6.0.tar.gz

  3. Extraction produces the directory sqoop-1.4.7.bin__hadoop-2.6.0; copy mysql-connector-java-5.1.47.jar into sqoop-1.4.7.bin__hadoop-2.6.0/lib
  4. Go into the directory sqoop-1.4.7.bin__hadoop-2.6.0/conf and rename sqoop-env-template.sh to sqoop-env.sh:

mv sqoop-env-template.sh sqoop-env.sh

  5. Open sqoop-env.sh in an editor and add the following three settings; HADOOP_COMMON_HOME and HADOOP_MAPRED_HOME are the full Hadoop path, and HIVE_HOME is the full Hive path:

export HADOOP_COMMON_HOME=/home/hadoop/hadoop-2.7.7

export HADOOP_MAPRED_HOME=/home/hadoop/hadoop-2.7.7

export HIVE_HOME=/home/hadoop/apache-hive-1.2.2-bin

  6. Installation and configuration are now done. Go into sqoop-1.4.7.bin__hadoop-2.6.0/bin and run ./sqoop version to check the Sqoop version. As shown below, it is 1.4.7 (warnings are printed for environment variables that are not set; they can be ignored for now):

[hadoop@node0 bin]$ ./sqoop version

Warning: /home/hadoop/sqoop-1.4.7.bin__hadoop-2.6.0/bin/../../hbase does not exist! HBase imports will fail.

Please set $HBASE_HOME to the root of your HBase installation.

Warning: /home/hadoop/sqoop-1.4.7.bin__hadoop-2.6.0/bin/../../hcatalog does not exist! HCatalog jobs will fail.

Please set $HCAT_HOME to the root of your HCatalog installation.

Warning: /home/hadoop/sqoop-1.4.7.bin__hadoop-2.6.0/bin/../../accumulo does not exist! Accumulo imports will fail.

Please set $ACCUMULO_HOME to the root of your Accumulo installation.

Warning: /home/hadoop/sqoop-1.4.7.bin__hadoop-2.6.0/bin/../../zookeeper does not exist! Accumulo imports will fail.

Please set $ZOOKEEPER_HOME to the root of your Zookeeper installation.

20/11/02 12:02:58 INFO sqoop.Sqoop: Running Sqoop version: 1.4.7

Sqoop 1.4.7

git commit id 2328971411f57f0cb683dfb79d19d4d19d185dd8

Compiled by maugli on Thu Dec 21 15:59:58 STD 2017

  • With Sqoop installed, let's try out its features.

MySQL preparation

For the hands-on steps that follow, MySQL needs to be ready. The setup used here is listed for reference:

  1. MySQL version: 5.7.29
  2. MySQL server IP: 192.168.50.43
  3. MySQL port: 3306
  4. Account: root
  5. Password: 123456
  6. Database name: sqoop

As for deploying MySQL, to save effort I run it in Docker; see "Deploying MySQL on a Synology DS218+".
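One detail the export step depends on: sqoop export writes into an existing MySQL table, so the address table must already exist in the sqoop database before the export runs. The original post does not show its creation; below is a minimal sketch of what it might look like, assuming the columns mirror the Hive address table used later (addressid, province, city); the column types and lengths are assumptions:

# Hypothetical setup: create the sqoop database and an address table matching the Hive schema
mysql -h192.168.50.43 -P3306 -uroot -p123456 -e "
CREATE DATABASE IF NOT EXISTS sqoop;
CREATE TABLE IF NOT EXISTS sqoop.address (
  addressid INT,
  province  VARCHAR(32),
  city      VARCHAR(32)
);"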

From Hive to MySQL (export)

  • Run the following command to export the Hive data to MySQL:

./sqoop export \
--connect jdbc:mysql://192.168.50.43:3306/sqoop \
--table address \
--username root \
--password 123456 \
--export-dir "/user/hive/warehouse/address" \
--fields-terminated-by ","
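The command assumes the files under /user/hive/warehouse/address are comma-delimited, which is what --fields-terminated-by "," tells Sqoop. If in doubt, a quick look at the underlying HDFS files confirms the delimiter matches (path as above):

# Peek at the Hive table's data files to confirm the field delimiter
hdfs dfs -cat /user/hive/warehouse/address/* | head -n 5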

  • Check the address table in MySQL: the data has been imported.
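If you prefer the command line to a GUI client, a simple query with the connection details listed earlier does the same check:

# Verify the exported rows from the MySQL side
mysql -h192.168.50.43 -P3306 -uroot -p123456 -e "SELECT * FROM sqoop.address;"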

From MySQL to Hive (import)

  1. In the Hive CLI, run the following statement to create a table named address2 with exactly the same structure as address:

create table address2 (addressid int, province string, city string)
row format delimited
fields terminated by ",";

  2. Run the following command to import data from the MySQL address table into the Hive address2 table; -m 2 means two map tasks are launched:

./sqoop import \
--connect jdbc:mysql://192.168.50.43:3306/sqoop \
--table address \
--username root \
--password 123456 \
--target-dir "/user/hive/warehouse/address2" \
-m 2
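For reference, Sqoop can also register the imported data as a Hive table directly with --hive-import, instead of writing files into a warehouse path that a pre-created table then picks up. The sketch below is not the approach used in this walkthrough, and it assumes HIVE_HOME is configured as in the deployment section:

# Alternative (not used here): let Sqoop create/load the Hive table itself
./sqoop import \
--connect jdbc:mysql://192.168.50.43:3306/sqoop \
--table address \
--username root \
--password 123456 \
--hive-import \
--hive-table address2 \
-m 2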

  3. When it finishes, the console prints output similar to the following:

Virtual memory (bytes) snapshot=4169867264

Total committed heap usage (bytes)=121765888

File Input Format Counters

Bytes Read=0

File Output Format Counters

Bytes Written=94

20/11/02 16:09:22 INFO mapreduce.ImportJobBase: Transferred 94 bytes in 16.8683 seconds (5.5726 bytes/sec)

20/11/02 16:09:22 INFO mapreduce.ImportJobBase: Retrieved 5 records.
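Because -m 2 was specified, the import ran two map tasks, so the target directory should contain two part files. A quick check (path as above; exact file names may vary):

# List the files Sqoop wrote into the target directory
hdfs dfs -ls /user/hive/warehouse/address2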

  4. Check the Hive address2 table; the data has been imported successfully:

hive> select * from address2;

OK

1 guangdong guangzhou

2 guangdong shenzhen

3 shanxi xian

4 shanxi hanzhong

6 jiangshu nanjing

Time taken: 0.049 seconds, Fetched: 5 row(s)

  • That completes the deployment and basic use of Sqoop. I hope this article serves as a useful reference when you run your own data import and export jobs.

You are not alone: Xinchen's original articles keep you company all the way

  1. Java series
  2. Spring series
  3. Docker series
  4. Kubernetes series
  5. Database + middleware series
  6. DevOps series

Welcome to follow my WeChat official account: 程序员欣宸

Search for 「程序员欣宸」 on WeChat. I am Xinchen, looking forward to exploring the Java world with you... https://github.com/zq2599/blog_demos
