First install Elasticsearch and Kibana; the tests below depend on both.
Installation references:
Getting started with Elasticsearch: installing Elasticsearch 5.0 on CentOS 5.6
Getting started with Elasticsearch: installing Kibana 5.0 for Elasticsearch 5.0
Logstash installation methods: https://www.elastic.co/guide/en/logstash/5.0/installing-logstash.html
Logstash downloads: https://www.elastic.co/downloads/logstash
Logstash 5.0.0 download: https://www.elastic.co/downloads/past-releases/logstash-5-0-0
Deployment uses the offline tarball install, since some servers have no network access:
shell> tar zxvf logstash-5.0.0.tar.gz
shell> mv logstash-5.0.0 /usr/local/elasticsearch/files/logstash
shell> cd /usr/local/elasticsearch/files/logstash
Run it:
shell> bin/logstash -e 'input { stdin {} } output { stdout {} }'
Sending Logstash logs to /usr/local/elasticsearch/files/logstash/logs which is now configured via log4j2.properties.
The stdin plugin is now waiting for input:
[2017-01-14T20:27:54,232][INFO ][logstash.pipeline ] Starting pipeline {"id"=>"main", "pipeline.workers"=>1, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>125}
[2017-01-14T20:27:54,429][INFO ][logstash.pipeline ] Pipeline main started
[2017-01-14T20:27:54,603][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
2017-01-14T12:27:54.427Z 0.0.0.0
Type the first line below into the current window; the second line is the resulting output:
hello logstash!
2017-01-14T12:31:14.798Z 0.0.0.0 hello logstash!
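The startup log above also mentions "Successfully started Logstash API endpoint {:port=>9600}". That is Logstash's monitoring API; a quick way to confirm the instance is alive is to fetch its root endpoint, which returns basic node info. A minimal sketch using only the Python standard library (host and port are assumptions taken from the log line):

```python
import json
from urllib.request import urlopen

def api_url(host="localhost", port=9600):
    # Port 9600 comes from the "Logstash API endpoint" startup log line;
    # Logstash falls back to 9601, 9602, ... when 9600 is already taken.
    return "http://%s:%d/" % (host, port)

def node_info(host="localhost", port=9600):
    # The root endpoint of the monitoring API returns basic node info
    # (version, http_address, and so on) as JSON.
    with urlopen(api_url(host, port)) as resp:
        return json.load(resp)

print(api_url())  # the address to query while Logstash is running
```

Calling `node_info()` while Logstash is up should return a small JSON document; a connection error means the instance is not running or is on a fallback port.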
Press Ctrl+C to exit. Next, start Logstash from a configuration file. Create one (this time with a different output format: codec => rubydebug):
shell> vi config/logstashtest.conf
input {
  stdin {}
}
output {
  stdout {
    codec => rubydebug
  }
}
Run it with the configuration file:
shell> bin/logstash -f config/logstashtest.conf
Sending Logstash logs to /usr/local/elasticsearch/files/logstash/logs which is now configured via log4j2.properties.
The stdin plugin is now waiting for input:
[2017-01-14T22:26:15,834][INFO ][logstash.pipeline ] Starting pipeline {"id"=>"main", "pipeline.workers"=>1, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>125}
[2017-01-14T22:26:15,926][INFO ][logstash.pipeline ] Pipeline main started
[2017-01-14T22:26:16,404][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9601}
{
    "@timestamp" => 2017-01-14T14:26:15.952Z,
      "@version" => "1",
          "host" => "0.0.0.0",
       "message" => ""
}
Type a line; the output appears below it:
hello logstash!
{
    "@timestamp" => 2017-01-14T14:26:47.163Z,
      "@version" => "1",
          "host" => "0.0.0.0",
       "message" => "hello logstash!"
}
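rubydebug is only one of the stock stdout codecs; for machine-readable output you could, for instance, swap in the json codec instead. A sketch (json_lines is another standard option):

```
output {
  stdout {
    codec => json
  }
}
```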
Now configure output to Elasticsearch while keeping the stdout output. (Note the option name: hosts, plural!)
shell> vi config/logstashtest.conf
input {
  stdin {}
}
output {
  elasticsearch {
    hosts => ["192.168.1.222:9200"]
    index => "test"
  }
  stdout {
    codec => rubydebug
  }
}
Run it with the configuration file:
shell> bin/logstash -f config/logstashtest.conf
Sending Logstash logs to /usr/local/elasticsearch/files/logstash/logs which is now configured via log4j2.properties.
The stdin plugin is now waiting for input:
[2017-01-14T23:10:11,512][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>["http://192.168.1.222:9200"]}}
[2017-01-14T23:10:11,521][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2017-01-14T23:10:11,785][INFO ][logstash.outputs.elasticsearch] Attempting to install template
{:manage_template=>{"template"=>"logstash-*", "version"=>50001, "settings"=>{"index.refresh_interval"=>"5s"}
, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "norms"=>false}
, "dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string"
, "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string"
, "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword"}}}}}]
, "properties"=>{"@timestamp"=>{"type"=>"date", "include_in_all"=>false}, "@version"=>{"type"=>"keyword", "include_in_all"=>false}
, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}
, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2017-01-14T23:10:17,071][INFO ][logstash.outputs.elasticsearch] Installing elasticsearch template to _template/logstash
[2017-01-14T23:10:33,701][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["192.168.1.222:9200"]}
[2017-01-14T23:10:33,714][INFO ][logstash.pipeline ] Starting pipeline {"id"=>"main", "pipeline.workers"=>1, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>125}
[2017-01-14T23:10:33,715][INFO ][logstash.pipeline ] Pipeline main started
[2017-01-14T23:10:35,171][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9601}
Startup installed an index template. Typing "hello logstash!" still prints to stdout as before, but has it actually been saved to Elasticsearch?
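One way to answer that without Kibana is to hit Elasticsearch's _count API for the index directly. A minimal sketch using Python's standard library (the host and index name are taken from the config above):

```python
import json
from urllib.request import urlopen

ES_HOST = "192.168.1.222:9200"  # the hosts value from the config above

def count_url(host, index):
    # _count is Elasticsearch's standard document-count API for an index
    return "http://%s/%s/_count" % (host, index)

def doc_count(host, index):
    with urlopen(count_url(host, index)) as resp:
        return json.load(resp)["count"]

# While Logstash is running, doc_count(ES_HOST, "test") should grow
# by one for every line typed into stdin.
```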
Open the Kibana address in a browser: http://192.168.1.222:5601/
Click "Dev Tools" and search for the phrase we entered:
GET _search
{
  "query": {
    "match_phrase": {
      "message": "hello logstash!"
    }
  }
}
The output:
{
  "took": 48,
  "timed_out": false,
  "_shards": {
    "total": 31,
    "successful": 31,
    "failed": 0
  },
  "hits": {
    "total": 1,
    "max_score": 0.51623213,
    "hits": [
      {
        "_index": "test",
        "_type": "logs",
        "_id": "AVmdjOMfxMlXNHIPQXkG",
        "_score": 0.51623213,
        "_source": {
          "@timestamp": "2017-01-14T15:16:05.916Z",
          "@version": "1",
          "host": "0.0.0.0",
          "message": "hello logstash!"
        }
      }
    ]
  }
}
The hit is in index "test" with message "hello logstash!", so the event was indeed saved!
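The same query can also be issued outside Kibana by POSTing the query body to the index's _search endpoint. A sketch with Python's standard library (host and index come from the config above), plus a small helper that pulls message out of each hit's _source, matching the response shape shown:

```python
import json
from urllib.request import Request, urlopen

ES_HOST = "192.168.1.222:9200"  # same hosts value as in the Logstash config

def match_phrase(field, phrase):
    # Builds the same query body used in Dev Tools above
    return {"query": {"match_phrase": {field: phrase}}}

def search(host, index, body):
    # POSTs the query body to the index's _search endpoint
    req = Request("http://%s/%s/_search" % (host, index),
                  data=json.dumps(body).encode("utf-8"),
                  headers={"Content-Type": "application/json"})
    with urlopen(req) as resp:
        return json.load(resp)

def hit_messages(response):
    # Extracts the message field from each hit's _source
    return [h["_source"]["message"] for h in response["hits"]["hits"]]

# hit_messages(search(ES_HOST, "test", match_phrase("message", "hello logstash!")))
# should yield ["hello logstash!"] against the index populated above.
```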
To make searching easier in the visual interface, first create an index pattern named "test" in Kibana.
Now you can query it from Kibana!
That completes the basic test!