Here is the problem, and it's quite simple: I want to parse some RabbitMQ logs and send them to my ELK stack.
Some things are really simple but have complicated starts. In this case I thought: Google it, someone has done this before. If you google “rabbitmq and ELK” you get articles on how to use queues to manage logs, and whilst that is excellent and recommended, it's not really what I needed. Next problem: using the latest versions of things... yep, the world moves on and the examples out there are slightly dated.
Anyhow, the answer is, as always, really simple.
I use Filebeat to send the logs to my all-in-one-Docker-ELK server (I will explain how that was set up elsewhere). The logs are parsed by Logstash on the Docker server, and Kibana will show the results as I need.
The filebeat config is as follows:
filebeat.prospectors:
- type: log
  enabled: true
  paths:
    #- /var/log/*.log
    # This is where the file is for now... Note the Windows path needs \\ to escape the \
    - c:\\TEMP\\rab*
  # Mark the log as rabbit (useful for the filter in Logstash later)
  fields: {log_type: rabbit}
  # Use multiline!
  multiline.pattern: '^='
  # Defines whether the pattern set under pattern should be negated. Default is false.
  multiline.negate: true
  # Match can be set to "after" or "before". It defines whether lines should be appended to a pattern
  # that was (not) matched before or after, or as long as a pattern is not matched, based on negate.
  # Note: "after" is the equivalent of "previous" and "before" is the equivalent of "next" in Logstash.
  multiline.match: after

filebeat.config.modules:
  # Glob pattern for configuration loading
  path: ${path.config}/modules.d/*.yml
  # Set to true to enable config reloading
  reload.enabled: false
  # Period on which files under path should be checked for changes
  #reload.period: 10s

setup.template.settings:
  index.number_of_shards: 3
output.logstash:
  # The Logstash hosts
  hosts: ["localhost:5044"]
  enabled: true
  # You will need to get this cert from the docker image
  ssl.certificate_authorities: ["c:\\program files\\filebeat\\logstash-beats.crt"]
  timeout: 15
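To see why the multiline settings matter, here is a rough Python sketch of what Filebeat's negate/after logic does (the sample log lines are illustrative of the classic RabbitMQ log format and may differ on your version; this is a sketch of the grouping rule, not Filebeat's actual implementation):

```python
import re

# Illustrative sample of the classic RabbitMQ log format: each entry
# starts with an "=TYPE REPORT====" banner line and may continue over
# several lines.
sample = """\
=INFO REPORT==== 24-Jan-2018::10:12:13 ===
accepting AMQP connection <0.683.0> (127.0.0.1:52671 -> 127.0.0.1:5672)
=ERROR REPORT==== 24-Jan-2018::10:15:02 ===
closing AMQP connection <0.683.0> (127.0.0.1:52671 -> 127.0.0.1:5672):
client unexpectedly closed TCP connection
"""

def group_events(lines, pattern=r"^=", negate=True):
    """Mimic multiline.negate: true / multiline.match: after --
    lines NOT starting with '=' are appended to the previous event."""
    events = []
    for line in lines:
        # With negate: true, a line that does not match the pattern
        # is a continuation of the previous event.
        is_continuation = bool(re.match(pattern, line)) != negate
        if is_continuation and events:
            events[-1] += "\n" + line
        else:
            events.append(line)
    return events

events = group_events(sample.splitlines())
# Two events: the INFO entry (two lines) and the ERROR entry (three lines)
```

So each banner line starting with `=` opens a new event, and everything up to the next banner travels with it as one message.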
Now you need to sort out the Logstash side. In this instance I added a new config to /etc/logstash/conf.d:
12-rabbit.conf
filter {
  if [fields][log_type] == "rabbit" {
    grok {
      match => { "message" => "=%{WORD:rabbit_message_type} REPORT=+ (?<time_text>%{MONTHDAY}-%{MONTH}-%{YEAR}::%{HOUR}:%{MINUTE}:%{SECOND}) ==.*$" }
    }
    date {
      match => [ "time_text", "dd-MMM-yyyy::HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}
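You can sanity-check the grok pattern outside Logstash. Here is a rough Python equivalent of the capture and the date parse (the regex is a hand-translated approximation of the grok pattern above, and the sample line is illustrative):

```python
import re
from datetime import datetime

# Hand-translated approximation of the grok pattern in the filter above.
pattern = re.compile(
    r"=(?P<rabbit_message_type>\w+) REPORT=+ "
    r"(?P<time_text>\d{1,2}-\w{3}-\d{4}::\d{2}:\d{2}:\d{2}) ==.*$"
)

line = "=ERROR REPORT==== 24-Jan-2018::10:15:02 ==="
fields = pattern.search(line).groupdict()

# Equivalent of the date filter's "dd-MMM-yyyy::HH:mm:ss" match.
timestamp = datetime.strptime(fields["time_text"], "%d-%b-%Y::%H:%M:%S")
```

This pulls out `rabbit_message_type` (here `ERROR`) and `time_text`, then decodes the latter into a proper timestamp, which is exactly what the grok and date filters do to set the event's time.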
Things to note here:
1) use of the log_type (set in filebeat)
2) scraping the time from the log file as time_text
3) decode the time and set the correct date
4) tagging the rabbit_message_type
Gotchas
1) The blerry thing should work, but where are the logs? We set the date correctly in the filter, so events are indexed under the dates from the log file itself. Make sure Kibana's time range covers the dates in the log you are importing, not today's date/time.
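In other words, the date filter rewrites the event's @timestamp to the time inside the log line, not the moment Filebeat shipped it. A quick sketch (the sample date is illustrative):

```python
from datetime import datetime

# The timestamp parsed from the log line becomes the event's time...
log_timestamp = datetime.strptime("24-Jan-2018::10:15:02", "%d-%b-%Y::%H:%M:%S")
# ...which is not the time the event was ingested.
ingest_time = datetime.now()

# An old log imported today sits outside Kibana's default
# "last 15 minutes" window, so widen the time picker to the log's dates.
print(log_timestamp < ingest_time)
```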
Have fun!