1. Edit the Flume configuration file
a1.sources = r1
a1.channels = c1

# Describe/configure the source
a1.sources.r1.type = TAILDIR
a1.sources.r1.filegroups = f1
# Monitor every file under the data directory whose name starts with "log"
a1.sources.r1.filegroups.f1 = /workplace/data/log*.*
a1.sources.r1.positionFile = /workplace/data/taildir_position.json

# Use a Kafka channel; events are written straight to Kafka, so no sink is needed
a1.channels.c1.type = org.apache.flume.channel.kafka.KafkaChannel
a1.channels.c1.kafka.bootstrap.servers = master:9092
a1.channels.c1.kafka.topic = test1

# Bind the source to the channel
a1.sources.r1.channels = c1
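Before starting the agent, you may want to make sure the target topic exists. Below is a minimal sketch, assuming Kafka 2.2+ (where kafka-topics.sh accepts --bootstrap-server), the /app/kafka install path used in step 3, master:9092 pointing at the same broker as 192.168.80.128:9092, and single-broker defaults for partitions and replication:

# Create the test1 topic up front (skip this if broker-side auto topic creation is enabled)
/app/kafka/bin/kafka-topics.sh --create --bootstrap-server master:9092 \
  --topic test1 --partitions 1 --replication-factor 1

# Confirm the topic is there
/app/kafka/bin/kafka-topics.sh --list --bootstrap-server master:9092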
2. Start Flume and import the data into Kafka
bin/flume-ng agent --conf conf --conf-file ./conf/job/flume_to_kafka2.conf --name a1 -Dflume.root.logger=INFO,console
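With the agent running, appending lines to any file that matches /workplace/data/log*.* should make the taildir source pick them up. A quick sketch follows; the file name log_demo.log is only a hypothetical example, while the positionFile path comes from the configuration in step 1:

# Append a couple of sample records to a monitored file (log_demo.log is a made-up name)
echo "hello flume $(date +%s)" >> /workplace/data/log_demo.log
echo "hello kafka $(date +%s)" >> /workplace/data/log_demo.log

# The taildir source records its read offsets here; a growing "pos" value
# means the new lines have been consumed
cat /workplace/data/taildir_position.json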
3. Check the data in Kafka
/app/kafka/bin/kafka-console-consumer.sh --bootstrap-server 192.168.80.128:9092 --from-beginning --topic test1
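If the consumer prints the appended records, the pipeline works end to end. As a rough sanity check of how many messages landed in the topic, the GetOffsetShell tool can print the latest offset of each partition (a sketch, assuming the same broker address; --time -1 asks for the end offsets):

# Print the end offset of each partition of test1; the sum approximates the message count
/app/kafka/bin/kafka-run-class.sh kafka.tools.GetOffsetShell \
  --broker-list master:9092 --topic test1 --time -1

Note that with the Kafka channel's default parseAsFlumeEvent = true, messages are stored as Avro-wrapped Flume events, so the console consumer may show a few extra header bytes before each line; setting a1.channels.c1.parseAsFlumeEvent = false in the configuration above stores only the raw log body.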
From: https://www.cnblogs.com/cstark/p/16836008.html