Data Lakehouse (Part 4): Installing Hive


Upload the installation package to the /opt/software directory and extract it

[bigdata@node106 software]$ tar -zxvf hive-3.1.3-with-spark-3.3.1.tar.gz -C /opt/services
[bigdata@node106 services]$ mv apache-hive-3.1.3-bin apache-hive-3.1.3   

Configure environment variables

export HIVE_HOME=/opt/services/apache-hive-3.1.3
export PATH=$PATH:$JAVA_HOME/bin:$HADOOP_HOME/bin:$HADOOP_HOME/sbin:$ZK_HOME/bin:$KAFKA_HOME/bin:$SEA_HOME/bin:$HIVE_HOME/bin

Distribute the environment variable file

[bigdata@node106 bin]$ sudo ./bin/xsync /etc/profile.d/bigdata_env.sh 

Refresh the environment variables; run this on all 5 machines

[bigdata@node106 ~]$ source /etc/profile
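
To confirm that the variables took effect, a quick check on node106 is to print HIVE_HOME and run the Hive client (this assumes Hadoop from the earlier parts of this series is already installed and on the PATH, since the hive command needs it):

[bigdata@node106 ~]$ echo $HIVE_HOME
[bigdata@node106 ~]$ hive --version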

Upload the MySQL driver jar to Hive's lib directory

[bigdata@node106 software]$ cp mysql-connector-java-8.0.18.jar /opt/services/apache-hive-3.1.3/lib/ 

Resolve the jar conflict

[bigdata@node106 ~]$ mv $HIVE_HOME/lib/log4j-slf4j-impl-2.17.1.jar $HIVE_HOME/lib/log4j-slf4j-impl-2.17.1.jar.bak 
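
The conflict being resolved here is the SLF4J binding bundled with Hive clashing with the one already provided by Hadoop. A quick way to see both copies before and after the rename (the Hadoop path below is the default tarball layout and may differ in your installation):

[bigdata@node106 ~]$ ls $HIVE_HOME/lib | grep slf4j
[bigdata@node106 ~]$ ls $HADOOP_HOME/share/hadoop/common/lib | grep slf4j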

Configure hive-site.xml

<?xml version="1.0" encoding="UTF-8"?>
<configuration>
    <property>
        <name>javax.jdo.option.ConnectionURL</name>
        <value>jdbc:mysql://node106:3306/metastore?useSSL=false&amp;useUnicode=true&amp;characterEncoding=UTF-8&amp;allowPublicKeyRetrieval=true</value>
    </property>
    <property>
        <name>javax.jdo.option.ConnectionDriverName</name>
        <value>com.mysql.jdbc.Driver</value>
    </property>
    <property>
        <name>javax.jdo.option.ConnectionUserName</name>
        <value>root</value>
    </property>
    <property>
        <name>javax.jdo.option.ConnectionPassword</name>
        <value>123456</value>
    </property>
    <property>
        <name>hive.metastore.schema.verification</name>
        <value>false</value>
    </property>
    <property>
        <name>hive.metastore.event.db.notification.api.auth</name>
        <value>false</value>
    </property>
    <property>
        <name>hive.metastore.warehouse.dir</name>
        <value>/user/hive/warehouse</value>
    </property>
    <property>
        <name>hive.cli.print.header</name>
        <value>true</value>
    </property>
    <property>
        <name>hive.cli.print.current.db</name>
        <value>true</value>
    </property>
    <property>
        <name>hive.metastore.uris</name>
        <value>thrift://node106:9083</value>
    </property>
    <property>
        <name>hive.server2.thrift.port</name>
        <value>10000</value>
    </property>
    <property>
        <name>hive.server2.thrift.bind.host</name>
        <value>node106</value>
    </property>
    <property>
        <name>hive.users.in.admin.role</name>
        <value>bigdata</value>
    </property>
    <property>
        <name>hive.security.authorization.enabled</name>
        <value>false</value>
    </property>
    <property>
        <name>hive.execution.engine</name>
        <value>mr</value>
    </property>
</configuration>

Configure the log files

[bigdata@node106 conf]$ cp hive-exec-log4j2.properties.template  hive-exec-log4j2.properties
[bigdata@node106 conf]$ cp hive-log4j2.properties.template hive-log4j2.properties

Edit hive-log4j2.properties and set the log directory

property.hive.log.dir = /opt/services/apache-hive-3.1.3/logs
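
The tarball does not ship this logs directory, and the hive.sh startup script later in this article also redirects nohup output into it, so create it now:

[bigdata@node106 conf]$ mkdir -p /opt/services/apache-hive-3.1.3/logs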

Edit hive-env.sh

[bigdata@node106 conf]$ cp hive-env.sh.template hive-env.sh
[bigdata@node106 conf]$ vim hive-env.sh
export HADOOP_HEAPSIZE=1024

Create the metastore database

[bigdata@node106 conf]$ mysql -uroot -p'123456'
mysql>  create database if not exists metastore  DEFAULT CHARACTER SET utf8 DEFAULT COLLATE utf8_general_ci; 

Initialize the metastore schema

[bigdata@node106 bin]$ schematool -initSchema -dbType mysql -verbose 
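
On success schematool reports that schema initialization completed. You can also confirm from MySQL that the metastore tables exist (same credentials as above):

[bigdata@node106 bin]$ mysql -uroot -p'123456' -e "use metastore; show tables;"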

Change the character set to fix garbled comments

mysql>  alter table DBS convert to character set utf8;                 
mysql>  alter table COLUMNS_V2 character set utf8;                 
mysql>  alter table COLUMNS_V2 change COMMENT COMMENT  varchar(256) character set utf8;                                     
mysql>  alter table TABLE_PARAMS change PARAM_VALUE    PARAM_VALUE mediumtext character set utf8;                       
mysql>  alter table PARTITION_KEYS change PKEY_COMMENT  PKEY_COMMENT varchar(4000) character set utf8;                    
mysql>  alter table PARTITION_KEYS character set utf8;  
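
To confirm the changes took effect, the collation of the modified columns can be inspected, for example:

mysql> show full columns from COLUMNS_V2;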

Write the hive.sh script

[bigdata@node106 bin]$ vim  hive.sh 
#!/bin/bash
echo ==================== Starting Hive services ====================
echo ==================== Starting the metastore service ====================
ssh node106 "nohup $HIVE_HOME/bin/hive --service metastore > $HIVE_HOME/logs/metastore.log 2>&1 &"
echo ==================== Starting the hiveserver2 service ====================
ssh node106 "nohup $HIVE_HOME/bin/hive --service hiveserver2 > $HIVE_HOME/logs/hiveserver2.log 2>&1 &"

Make hive.sh executable

[bigdata@node106 bin]$ chmod +x hive.sh 

Distribute hive.sh

[bigdata@node106 bin]$ xsync  hive.sh

Copy the Hive installation to the other machines

[bigdata@node107 bin]$ scp -r bigdata@node106:/opt/services/apache-hive-3.1.3/ /opt/services/apache-hive-3.1.3/                          
[bigdata@node108 bin]$ scp -r bigdata@node106:/opt/services/apache-hive-3.1.3/ /opt/services/apache-hive-3.1.3/
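
Since the environment variables were already distributed and sourced on every machine, a quick sanity check on the other nodes is to run the Hive client there:

[bigdata@node107 bin]$ hive --version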

Start Hive

[bigdata@node106 bin]$ hive.sh start 
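
Both services take a moment to come up. Once they have, the ports configured in hive-site.xml should be listening, and a beeline connection can be tested (the -n user matches hive.users.in.admin.role above; adjust if your setup differs):

[bigdata@node106 bin]$ ss -ntlp | grep -E ':9083|:10000'
[bigdata@node106 bin]$ beeline -u jdbc:hive2://node106:10000 -n bigdata -e "show databases;"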

