Unpack the archive:
tar -zxvf apache-flume-1.7.0-bin.tar.gz
Rename the directory:
mv apache-flume-1.7.0-bin/ flume
Configure the environment variables:
vi /etc/profile
export FLUME_HOME=/usr/local/src/flume
export PATH=$PATH:$FLUME_HOME/bin
Reload the environment:
source /etc/profile
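To confirm the exports took effect, a quick check like the following can be run (it re-applies the same two exports from the steps above, so it also works in a fresh shell):

```shell
# Re-apply the profile exports and verify them (paths assumed from this guide).
export FLUME_HOME=/usr/local/src/flume
export PATH=$PATH:$FLUME_HOME/bin
echo "$FLUME_HOME"        # expect /usr/local/src/flume
case ":$PATH:" in
  *":$FLUME_HOME/bin:"*) echo "flume bin on PATH" ;;
esac
```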
Change to the conf directory:
cd /usr/local/src/flume/conf
Copy the template:
cp flume-env.sh.template flume-env.sh
Edit the file and set JAVA_HOME (path from the local JDK install):
vi /usr/local/src/flume/conf/flume-env.sh
export JAVA_HOME=/usr/local/src/jdk1.8.0_181/
Check the version:
flume-ng version
If `flume-ng version` reports an error (a common symptom when Hadoop or HBase is also installed and their warnings pollute the probe output), edit the launcher script:
vi /usr/local/src/flume/bin/flume-ng
On line 110, append 2>/dev/null inside the command that reads java.library.path, so the line ends like this:
110 java.library.path 2>/dev/null)
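The edit above can also be applied with sed. The sketch below demonstrates the substitution on a stand-in file (/tmp/flume-ng-line110 is hypothetical); the exact content of line 110 varies by Flume version, so inspect bin/flume-ng before running the same sed against it:

```shell
# Stand-in for the relevant fragment of bin/flume-ng line 110.
printf 'something java.library.path)\n' > /tmp/flume-ng-line110
# Append 2>/dev/null inside the java.library.path probe to silence warnings.
sed -i 's|java\.library\.path)|java.library.path 2>/dev/null)|' /tmp/flume-ng-line110
cat /tmp/flume-ng-line110
```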
Change directory:
cd /usr/local/src/flume
Create the agent configuration file:
vi /usr/local/src/flume/flume-hdfs.conf
a1.sources=r1
a1.sinks=k1
a1.channels=c1
a1.sources.r1.type=spooldir
a1.sources.r1.spoolDir=/usr/local/src/hadoop-2.7.6/logs/
a1.sources.r1.fileHeader=true
a1.sinks.k1.type=hdfs
a1.sinks.k1.hdfs.path=hdfs://master:9000/flume
a1.sinks.k1.hdfs.rollSize=1048760
a1.sinks.k1.hdfs.rollCount=0
a1.sinks.k1.hdfs.rollInterval=900
a1.sinks.k1.hdfs.useLocalTimeStamp=true
a1.channels.c1.type=file
a1.channels.c1.capacity=1000
a1.channels.c1.transactionCapacity=100
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1
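One caveat about the spooldir source used above: it expects files to be complete and immutable once they appear in the spooling directory, so new files should be written elsewhere and then moved into place. A minimal sketch, using /tmp/spool-demo as a stand-in for the real spoolDir:

```shell
# /tmp/spool-demo stands in for the configured spoolDir.
SPOOL=/tmp/spool-demo
mkdir -p "$SPOOL"
# Write the file outside the spooling directory first...
echo "test event" > /tmp/flume-test.log
# ...then mv it in; mv is atomic within the same filesystem, so the
# source never sees a partially written file.
mv /tmp/flume-test.log "$SPOOL/flume-test.log"
ls "$SPOOL"
```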
Load flume-hdfs.conf with the flume-ng agent command and start Flume shipping data; pass --conf as well so that flume-env.sh in the conf directory is picked up:
flume-ng agent --conf /usr/local/src/flume/conf --conf-file /usr/local/src/flume/flume-hdfs.conf --name a1
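The command above ties the agent to the terminal. A common variant for running it in the background (standard flume-ng flags, paths from this guide; LOGFILE is the file appender defined in Flume's default log4j.properties) is:

```shell
# Run the agent detached; output of the launcher goes to nohup.out,
# agent logs go to the LOGFILE appender.
nohup flume-ng agent \
  --conf /usr/local/src/flume/conf \
  --conf-file /usr/local/src/flume/flume-hdfs.conf \
  --name a1 \
  -Dflume.root.logger=INFO,LOGFILE &
```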
List the HDFS root directory:
hadoop fs -ls /
List the files Flume shipped to HDFS:
hadoop fs -ls /flume