Logstash
The agent
Logstash is a tool that parses logs into JSON so they can be indexed and searched. The application is packaged as a self-contained .jar file.
Currently logstash is deployed by running an instance of logstash on both the clients and a central server.
The clients read events from files and send them to the central server. This is done because most of the applications in the cluster do not write their logs to the system syslog, but rather use another logging tool such as log4j or logback.
The central point that logstash events are sent to is actually a redis instance, which acts as a broker and allows for a large amount of traffic.
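As a sketch of how the broker works: agents push JSON events onto a redis list, and the indexer pops them off the other end. Assuming the redis instance is reachable, the queue can be inspected with redis-cli (the sample event here is purely illustrative):

```shell
# Push a test event onto the broker list, then check the queue depth
redis-cli -h 148.187.66.65 rpush logstash '{"@message":"test event"}'
redis-cli -h 148.187.66.65 llen logstash
```

If the indexer is keeping up, the list length should stay close to zero; a steadily growing length means events are arriving faster than they are being indexed.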
The logstash installation is contained within /opt/logstash:

/opt/logstash/bin/logstash-monolithic.jar
/opt/logstash/etc/logstash.conf
/opt/logstash/log/logstash.log
All files are distributed by cfengine, which also ensures the correct conf file is copied to each server.
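With those files in place, the agent can be started by hand. The invocation below is a sketch; the exact flags depend on the logstash version in use:

```shell
# Run the logstash agent with the distributed config, writing to the log dir
java -jar /opt/logstash/bin/logstash-monolithic.jar agent \
  -f /opt/logstash/etc/logstash.conf \
  --log /opt/logstash/log/logstash.log
```

In practice this would be wrapped in an init script distributed by cfengine rather than run interactively.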
A basic agent config would be as follows:
input {
  file {
    type    => "yum"
    path    => ["/var/log/yum"]
    exclude => ["*.gz"]
    debug   => true
  }
}

filter {
  grok {
    type           => "yum"
    pattern        => ["%{SYSLOGTIMESTAMP} %{DATA:action}\: %{GREEDYDATA:package}"]
    break_on_match => false
  }
}

output {
  stdout { debug => true }
  redis {
    host      => "148.187.66.65"
    data_type => "list"
    key       => "logstash"
  }
}
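The grok pattern above can be sanity-checked by approximating it with a plain extended regex and running it against a sample line from /var/log/yum. Both the sample line and the regex translation are illustrative assumptions, not exact grok semantics:

```shell
# Hypothetical yum log line: SYSLOGTIMESTAMP, then action, then package
line='Jul 10 12:00:01 Installed: bash-4.1.2-15.el6.x86_64'

# Rough ERE equivalent of %{SYSLOGTIMESTAMP} %{DATA:action}\: %{GREEDYDATA:package}
echo "$line" | grep -E '^[A-Z][a-z]{2} +[0-9]{1,2} [0-9]{2}:[0-9]{2}:[0-9]{2} [^:]*: .+'
```

grep prints the line when it matches, so this gives a quick way to check new patterns against real log samples before shipping a config change.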
When events are received on the central server, they are indexed by Elasticsearch.
input {
  redis {
    host      => "148.187.66.65"
    type      => "redisinput"
    data_type => "list"
    key       => "logstash"
  }
}

output {
  elasticsearch { cluster => "logstash" }
}
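Once the indexer is running, Elasticsearch can be queried directly to confirm events are arriving. This assumes Elasticsearch is listening on its default port 9200 on the central server; logstash writes into date-stamped logstash-* indices:

```shell
# Check cluster health, then count events across the logstash indices
curl -s 'http://localhost:9200/_cluster/health?pretty'
curl -s 'http://localhost:9200/logstash-*/_count?pretty'
```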
Note: at the time of writing, logstash is only compatible with Elasticsearch 0.20.2.
Redis instance
Elasticsearch
Web Interface
Sending alerts to Nagios
The web interface is provided by a Ruby application called Kibana.
--
GeorgeBrown - 2013-05-07