Under the hadoop user's home directory on Linux (/home/hadoop), create the directories app, data, lib, software, and source:
- source: framework source code
- data: test data
- lib: jar packages we develop
- software: software installation packages
- app: all installed software
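The five directories above can be created in one command as the hadoop user; a minimal sketch:

```shell
# Create the working directories under the hadoop user's home.
# -p makes this safe to re-run if some of them already exist.
mkdir -p ~/app ~/data ~/lib ~/software ~/source
ls -d ~/app ~/data ~/lib ~/software ~/source
```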
[hadoop@hadoop000 ~]$ ll
total 100
drwxrwxr-x 13 hadoop hadoop 4096 Oct 22 2017 app
drwxrwxr-x 3 hadoop hadoop 4096 Oct 22 2017 data
drwxr-xr-x 3 hadoop hadoop 4096 Oct 13 2017 Desktop
drwxr-xr-x 2 hadoop hadoop 4096 Sep 22 2013 Documents
drwxr-xr-x 2 hadoop hadoop 4096 Nov 23 2014 Downloads
-rw-r--r-- 1 hadoop hadoop 40762 Oct 23 2016 install.log
drwxrwxr-x 2 hadoop hadoop 4096 Oct 24 2017 lib
drwxrwxr-x 2 hadoop hadoop 4096 Sep 15 2017 maven_repos
drwxr-xr-x 2 hadoop hadoop 4096 Sep 22 2013 Pictures
drwxr-xr-x 2 hadoop hadoop 4096 Sep 22 2013 Public
drwxrwxr-x 2 hadoop hadoop 4096 Jul 15 2017 shell
drwxrwxr-x 2 hadoop hadoop 4096 Sep 15 2017 software
drwxrwxr-x 2 hadoop hadoop 4096 Sep 9 2017 source
drwxr-xr-x 2 hadoop hadoop 4096 Sep 22 2013 Templates
drwxrwxr-x 2 hadoop hadoop 4096 Feb 28 17:34 tmp
drwxr-xr-x 2 hadoop hadoop 4096 Sep 22 2013 Videos
[hadoop@hadoop000 ~]$ pwd
/home/hadoop
Note: when an operation needs root privileges (for example, setting system-wide environment variables), there is no need to switch to the root user; the regular hadoop user can simply prefix the command with sudo.
Version notes:
- OS: CentOS 6.4
- Hadoop ecosystem: CDH 5.7.0 (all CDH components can be downloaded from http://archive.cloudera.com/cdh5/cdh/5/)
- JDK: 1.8
- Spark: 2.2
- Scala: 2.11.8
On the Spark documentation site, go to More -> Building Spark to see the build environment requirements:
Apache Maven
The Maven-based build is the build of reference for Apache Spark. Building Spark using Maven requires Maven 3.3.9 or newer and Java 8+. Note that support for Java 7 was removed as of Spark 2.2.0.
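The spark-2.2.0-bin-2.6.0-cdh5.7.0 directory seen in the app listing below is the kind of name produced by building Spark from source against the CDH Hadoop version with the make-distribution.sh script. The exact profiles and flags below are a sketch of a typical invocation, not a confirmed record of the author's command:

```shell
# Sketch only: run inside the Spark 2.2.0 source directory.
# --name sets the suffix of the generated tarball
# (spark-2.2.0-bin-2.6.0-cdh5.7.0.tgz), matched to the CDH Hadoop version.
./dev/make-distribution.sh \
  --name 2.6.0-cdh5.7.0 --tgz \
  -Phadoop-2.6 -Dhadoop.version=2.6.0-cdh5.7.0 \
  -Phive -Phive-thriftserver -Pyarn
```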
[hadoop@hadoop000 ~]$ cd app/
[hadoop@hadoop000 app]$ ls
apache-flume-1.6.0-cdh5.7.0-bin kafka_2.11-0.9.0.0
apache-maven-3.3.9 scala-2.11.8
data spark-2.2.0-bin-2.6.0-cdh5.7.0
hadoop-2.6.0-cdh5.7.0 tmp
hbase-1.2.0-cdh5.7.0 zookeeper-3.4.5-cdh5.7.0
jdk1.8.0_144
Note: every package installed under the app directory must be added to the system environment variables.
View the environment variables:
[hadoop@hadoop000 app]$ cat ~/.bash_profile
# .bash_profile
# Get the aliases and functions
if [ -f ~/.bashrc ]; then
. ~/.bashrc
fi
# User specific environment and startup programs
export JAVA_HOME=/home/hadoop/app/jdk1.8.0_144
export PATH=$JAVA_HOME/bin:$PATH
export FLUME_HOME=/home/hadoop/app/apache-flume-1.6.0-cdh5.7.0-bin
export PATH=$FLUME_HOME/bin:$PATH
export ZK_HOME=/home/hadoop/app/zookeeper-3.4.5-cdh5.7.0
export PATH=$ZK_HOME/bin:$PATH
export KAFKA_HOME=/home/hadoop/app/kafka_2.11-0.9.0.0
export PATH=$KAFKA_HOME/bin:$PATH
export SCALA_HOME=/home/hadoop/app/scala-2.11.8
export PATH=$SCALA_HOME/bin:$PATH
export MAVEN_HOME=/home/hadoop/app/apache-maven-3.3.9
export PATH=$MAVEN_HOME/bin:$PATH
export HADOOP_HOME=/home/hadoop/app/hadoop-2.6.0-cdh5.7.0
export PATH=$HADOOP_HOME/bin:$PATH
export HBASE_HOME=/home/hadoop/app/hbase-1.2.0-cdh5.7.0
export PATH=$HBASE_HOME/bin:$PATH
export SPARK_HOME=/home/hadoop/app/spark-2.2.0-bin-2.6.0-cdh5.7.0
export PATH=$SPARK_HOME/bin:$PATH
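Each new component installed under app follows the same pattern: append a `*_HOME` export plus a PATH entry to ~/.bash_profile, then reload the file. HIVE_HOME below is a hypothetical example; substitute the package you actually installed:

```shell
# Hypothetical example: register a newly installed package (here Hive)
# in ~/.bash_profile, then reload so the current shell picks it up.
cat >> ~/.bash_profile <<'EOF'
export HIVE_HOME=/home/hadoop/app/hive-1.1.0-cdh5.7.0
export PATH=$HIVE_HOME/bin:$PATH
EOF
. ~/.bash_profile
echo "$HIVE_HOME"
```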
Spark version upgrade: to upgrade the JDK or Spark, simply point JAVA_HOME and SPARK_HOME at the new installation directories.
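An upgrade then reduces to editing those two lines. A self-contained sketch (the target version numbers are hypothetical examples):

```shell
PROFILE=~/.bash_profile
# Make the sketch self-contained: seed the two lines if they are missing.
grep -q '^export JAVA_HOME=' "$PROFILE" 2>/dev/null || cat >> "$PROFILE" <<'EOF'
export JAVA_HOME=/home/hadoop/app/jdk1.8.0_144
export SPARK_HOME=/home/hadoop/app/spark-2.2.0-bin-2.6.0-cdh5.7.0
EOF
# Point JAVA_HOME and SPARK_HOME at the upgraded directories
# (the new versions below are hypothetical).
sed -i 's|jdk1\.8\.0_144|jdk1.8.0_202|' "$PROFILE"
sed -i 's|spark-2\.2\.0-bin-2\.6\.0-cdh5\.7\.0|spark-2.3.0-bin-2.6.0-cdh5.7.0|' "$PROFILE"
grep -E '^export (JAVA_HOME|SPARK_HOME)' "$PROFILE"
```

After editing, run `source ~/.bash_profile` (or open a new shell) so the changes take effect.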
From: https://blog.51cto.com/u_12528551/5900173