
Connecting Spark SQL to Hive

Date: 2023-03-05 10:56:10

Copy hive-site.xml from Hive's conf directory into Spark's conf directory:

cp /usr/local/hive/apache-hive-1.2.2-bin/conf/hive-site.xml /usr/local/spark/spark-2.0.2-bin-hadoop2.6/conf/
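The copied hive-site.xml is what tells Spark where the Hive metastore lives. For reference, it should already contain the MySQL metastore connection settings, roughly like the sketch below (the JDBC URL, driver class, username, and password are placeholders, not values from this post):

```xml
<!-- Sketch of the relevant hive-site.xml entries; host, database,
     user, and password below are placeholders. -->
<configuration>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://master:3306/hive?createDatabaseIfNotExist=true</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>hive</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>hive</value>
  </property>
</configuration>
```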

Copy core-site.xml and hdfs-site.xml from Hadoop's configuration directory into Spark's conf directory:

cp /usr/local/hadoop/hadoop-2.6.1/etc/hadoop/core-site.xml /usr/local/spark/spark-2.0.2-bin-hadoop2.6/conf/

cp /usr/local/hadoop/hadoop-2.6.1/etc/hadoop/hdfs-site.xml  /usr/local/spark/spark-2.0.2-bin-hadoop2.6/conf/

Copy the MySQL JDBC driver into Spark's jars directory:

cp /usr/local/hive/apache-hive-1.2.2-bin/lib/mysql-connector-java-5.1.49-bin.jar /usr/local/spark/spark-2.0.2-bin-hadoop2.6/jars/

Distribute Spark to the slave nodes:

scp -r /usr/local/spark root@slave1:/usr/local/

scp -r /usr/local/spark root@slave2:/usr/local/

Start Spark:

sh /usr/local/spark/spark-2.0.2-bin-hadoop2.6/sbin/start-all.sh
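To verify the connection, one approach is to open spark-shell on the master and query Hive through a SparkSession with Hive support enabled. A minimal sketch (the pre-built spark-2.0.2-bin-hadoop2.6 distribution ships with Hive support; database contents depend on your metastore):

```scala
// Run inside spark-shell (Spark 2.0.2).
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("SparkSQLHiveCheck")
  .enableHiveSupport()
  .getOrCreate()

// If hive-site.xml was picked up correctly, this lists the databases
// stored in the MySQL-backed metastore rather than a fresh local one.
spark.sql("SHOW DATABASES").show()
```

Alternatively, the bundled spark-sql CLI provides the same check without writing any code.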


From: https://www.cnblogs.com/6wenhong6/p/17180013.html
