Importing MySQL data into Hive with Sqoop for computation: a worked example

Updated: 2022-09-07 20:23:45


Hive computation example

First import the data from MySQL into Hive with Sqoop, then run the same query in Hive; the result should match what MySQL returns.

Steps:

  1. MySQL data preparation
  • account: the accounts table
  • detail: the income/expense records table
CREATE TABLE `account` (
  `id` int(11) PRIMARY KEY AUTO_INCREMENT,
  `account` varchar(20),
  `name` varchar(5),
  `age` int(3)
);

insert into account(account, name, age) values("tom@qq.com", "Tom", 23);
insert into account(account, name, age) values("jack@qq.com", "Jack", 20);
insert into account(account, name, age) values("jone@qq.com", "Jone", 22);
insert into account(account, name, age) values("jimi@qq.com", "Jimi", 25);
insert into account(account, name, age) values("black@qq.com", "Black", 24);
select * from account;
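
Assuming the table was empty before these inserts, the auto-increment ids run from 1 to 5 and the select should print:

+----+--------------+-------+------+
| id | account      | name  | age  |
+----+--------------+-------+------+
|  1 | tom@qq.com   | Tom   |   23 |
|  2 | jack@qq.com  | Jack  |   20 |
|  3 | jone@qq.com  | Jone  |   22 |
|  4 | jimi@qq.com  | Jimi  |   25 |
|  5 | black@qq.com | Black |   24 |
+----+--------------+-------+------+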



CREATE TABLE `detail` (
  `id` int(11) PRIMARY KEY AUTO_INCREMENT,
  `account` varchar(20),
  `income` double,
  `expenses` double,
  `time` varchar(10)
);

insert into detail(account, income, expenses, time) values("tom@qq.com", 10, 20, "2018-12-1");
insert into detail(account, income, expenses, time) values("jack@qq.com", 10, 30, "2018-12-4");
insert into detail(account, income, expenses, time) values("jone@qq.com", 13, 22, "2018-12-3");
insert into detail(account, income, expenses, time) values("jimi@qq.com", 45, 25, "2018-12-2");
insert into detail(account, income, expenses, time) values("black@qq.com", 34, 24, "2018-12-1");
insert into detail(account, income, expenses, time) values("tom@qq.com", 50, 20, "2018-12-1");
select * from detail;
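
Note that the time values must be quoted: an unquoted 2018-12-1 would be evaluated as the arithmetic expression 2018 - 12 - 1 and stored as the string "2005". With a fresh table, the select should print:

+----+--------------+--------+----------+-----------+
| id | account      | income | expenses | time      |
+----+--------------+--------+----------+-----------+
|  1 | tom@qq.com   |     10 |       20 | 2018-12-1 |
|  2 | jack@qq.com  |     10 |       30 | 2018-12-4 |
|  3 | jone@qq.com  |     13 |       22 | 2018-12-3 |
|  4 | jimi@qq.com  |     45 |       25 | 2018-12-2 |
|  5 | black@qq.com |     34 |       24 | 2018-12-1 |
|  6 | tom@qq.com   |     50 |       20 | 2018-12-1 |
+----+--------------+--------+----------+-----------+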
  2. Create the matching Hive tables (a quick verification sketch follows the DDL)
create table account (
    id int, 
    account string, 
    name string, 
    age int
) row format delimited fields terminated by '\t';

create table detail (
    id int, 
    account string, 
    income double, 
    expenses double, 
    time string
) row format delimited fields terminated by '\t';
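
Before importing, the new tables can be checked from the hive shell; describe should list the columns defined above:

show tables;
describe account;
describe detail;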
  3. Import the MySQL data directly into Hive with Sqoop
sqoop import \
--connect jdbc:mysql://localhost:3306/mydata \
--username root \
--password 123456 \
--table account \
--hive-import \
--hive-overwrite \
--hive-table account \
--fields-terminated-by '\t'

sqoop import --connect jdbc:mysql://localhost:3306/mydata --username root --password 123456 --table detail --hive-import --hive-overwrite --hive-table detail --fields-terminated-by '\t'
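
As a quick sanity check after both imports (a sketch, assuming the hive CLI is on the PATH), the row counts in Hive should match MySQL: 5 rows in account and 6 in detail.

hive -e 'select count(*) from account; select count(*) from detail;'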
  4. Compute the result; MySQL and Hive produce identical results
select a.account, a.name, d.total 
from account as a 
join(
    select account, sum(income - expenses) as total 
    from detail group by account
) as d 
on a.account=d.account;
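
The subquery groups detail by account and computes each account's net total as sum(income - expenses); the outer join then attaches the account name. For example, tom@qq.com has two detail rows, so its total is (10 - 20) + (50 - 20) = 20.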

MySQL result:

+--------------+-------+-------+
| account      | name  | total |
+--------------+-------+-------+
| black@qq.com | Black |    10 |
| jack@qq.com  | Jack  |   -20 |
| jimi@qq.com  | Jimi  |    20 |
| jone@qq.com  | Jone  |    -9 |
| tom@qq.com   | Tom   |    20 |
+--------------+-------+-------+

Hive result (income and expenses are doubles, so the totals print with a decimal point):

black@qq.com    Black   10.0
jack@qq.com     Jack    -20.0
jimi@qq.com     Jimi    20.0
jone@qq.com     Jone    -9.0
tom@qq.com      Tom     20.0

Errors and fixes

Error:

/tmp/hive on HDFS should be writable.

Fix:

hadoop fs -chmod -R 777 /tmp
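
To verify the change took effect, the directory's own permissions can be listed (the -d flag of hadoop fs -ls shows the directory entry itself rather than its contents):

hadoop fs -ls -d /tmp/hive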

Reference:

Hive startup permission error: /tmp/hive on HDFS should be writable.

Error:

Could not load org.apache.hadoop.hive.conf.HiveConf. Make sure HIVE_CONF_DIR

Fix:

Append the following line to the end of /etc/profile:

export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:$HIVE_HOME/lib/*

Then reload the configuration: source /etc/profile
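
For illustration, assuming Hive lives under /usr/local/hive (a hypothetical path; use the actual install location), the relevant lines in /etc/profile would be:

# hypothetical install path; point HIVE_HOME at the real Hive directory
export HIVE_HOME=/usr/local/hive
export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:$HIVE_HOME/lib/*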

Reference:

ERROR hive.HiveConfig: Could not load org.apache.hadoop.hive.conf.HiveConf. Make sure HIVE_CONF_DIR is set correctly.