Log parsing with logstash
You need to ssh to our ELK machine: log10 runs Logstash, Elasticsearch, and Kibana.
ssh root@log10.iihe.ac.be
- There are two directories to work with:
- /etc/logstash/conf.d/ : contains the filters to parse your log lines
- /opt/logstash/patterns : contains the patterns that can be used in the filters
- To see all the available patterns for matching parts of your log line, have a look at the pattern file grok-patterns.
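For example, grok-patterns contains entries similar to these (a small excerpt; exact definitions vary between Logstash versions, so check the file on log10 for the authoritative list):
INT (?:[+-]?(?:[0-9]+))
NUMBER (?:%{BASE10NUM})
SPACE \s*
GREEDYDATA .*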
- To help you with the patterns and the grok filter, you can use an online grok debugger site, where you can input log lines, patterns, and the section of the grok filter related to the patterns, in order to construct your match.
- A pattern looks like:
PATTERN_1 %{NUMBER:my_number}%{SPACE}%{IP:ip}
PATTERN_2 %{INT:floor} %{GREEDYDATA:address}
PATTERN_TOT %{PATTERN_1}%{SPACE}%{PATTERN_2}
- PATTERN_TOT would match:
0412345678 127.0.0.1 666 road to higgs
my_number => 0412345678, ip => 127.0.0.1, floor => 666, address => road to higgs
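- A quick way to test such patterns outside the running service is to start a throwaway pipeline reading from stdin. A sketch, assuming the logstash binary sits at /opt/logstash/bin/logstash (consistent with the /opt/logstash layout above); patterns_dir is a standard grok option:
/opt/logstash/bin/logstash -e '
input { stdin { } }
filter {
  grok {
    # directory where PATTERN_TOT and its sub-patterns are defined
    patterns_dir => "/opt/logstash/patterns"
    match => [ "message", "%{PATTERN_TOT}" ]
  }
}
output { stdout { codec => rubydebug } }'
Then paste the example line 0412345678 127.0.0.1 666 road to higgs and the extracted fields are printed to the terminal.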
- For logs sent through (r)syslog, a bunch of information (timestamp, program used, facility, host, host IP) is always prefixed to your log line. This prefix is already extracted in /etc/logstash/conf.d/logstash-complex.conf by the following grok expression:
"<%{POSINT:syslog_pri}>%{SYSLOGTIMESTAMP:timestamp} %{IPORHOST:logsource} (?:%{PROG:program}(?:\[%{POSINT:pid}\])?: )?%{GREEDYDATA:msg}",:
Therefore your log line ends up in the message field (the msg field is renamed to message by a mutate filter), and that is what you need to match with your grok filter.
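For reference, such a rename is done with a mutate filter; a minimal sketch of what logstash-complex.conf does here (the exact filter on log10 may differ):
filter {
  mutate {
    # rename the field "msg" to "message" (old-style array syntax)
    rename => [ "msg", "message" ]
  }
}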
To match lines like the example above, assuming you have put the patterns in a file under /opt/logstash/patterns, you just need to create a myservice.conf file in /etc/logstash/conf.d/ with:
filter {
  grok {
    match => [ "message", "%{PATTERN_TOT}" ]
  }
}
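If grok does not pick up your custom patterns automatically, you can point it at the patterns directory explicitly (patterns_dir is a standard grok option; this sketch just extends the filter above):
filter {
  grok {
    # directory holding PATTERN_TOT and its sub-patterns
    patterns_dir => "/opt/logstash/patterns"
    match => [ "message", "%{PATTERN_TOT}" ]
  }
}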
- You then need to restart logstash:
service logstash restart
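If Logstash fails to come back up, it is worth checking the configuration syntax first (assuming the Logstash version installed here supports the --configtest flag):
/opt/logstash/bin/logstash --configtest -f /etc/logstash/conf.d/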
- Your log lines should now show up parsed in the Logstash Search dashboard.
- If you've constructed your grok match correctly, you should see the new fields appearing for each log line!