Logstash - Parsing the Logs

What is Log Parsing in Logstash?

Logstash receives the logs through input plugins and then uses the filter plugins to parse and transform the data. The parsing and transformation of the logs are done according to the systems present in the output destination. Logstash parses the logging data and forwards only the required fields. Later, these fields are converted into the form that the destination system can understand.

How to Parse the Logs?

Parsing of the logs is done by using GROK patterns, and you can find them on GitHub −

https://github.com/elastic/logstash/tree/v1.4.2/patterns.

Logstash matches the log data against a specified GROK pattern or a pattern sequence for parsing the logs, such as "%{COMBINEDAPACHELOG}", which is commonly used for Apache logs.

The parsed data is more organized and easier to search and run queries against. Logstash searches for the specified GROK patterns in the input logs and extracts the matching lines from the logs. You can use the GROK debugger to test your GROK patterns.
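As a minimal sketch (assuming Apache access logs are read from a hypothetical file path), a Logstash pipeline that applies this pattern could look roughly as follows −

input {
   file {
      path => "/var/log/apache2/access.log"   # hypothetical log path
   }
}
filter {
   grok {
      match => { "message" => "%{COMBINEDAPACHELOG}" }
   }
}
output {
   stdout { codec => rubydebug }
}

Here, the grok filter splits each Apache access-log line into fields such as the client IP, the request and the response code, which the stdout output plugin then prints in a readable form.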

The syntax for a GROK pattern is %{SYNTAX:SEMANTIC}. The Logstash GROK filter is written in the following form −

%{PATTERN:FieldName}

Here, PATTERN signifies the GROK pattern and FieldName is the name of the field, which represents the parsed data in the output.
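For example, the following filter pattern applies the built-in IP pattern and stores the matched value in a field named client (an illustrative field name) −

%{IP:client}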

For instance, using the online GROK debugger at https://grokdebug.herokuapp.com/ −

Input

A sample error line in a log –
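For illustration, consider the following hypothetical error line −

2023-01-31 10:15:30,123 ERROR 4512 TXN-9001 Unable to connect to database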

GROK Pattern Sequence
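A sequence of standard GROK patterns that would match such a line is shown below; the field names timestamp, log_level, process_id, transaction_id and error_message are illustrative choices −

%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:log_level} %{NUMBER:process_id} %{NOTSPACE:transaction_id} %{GREEDYDATA:error_message}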

This GROK pattern sequence matches the log event, which consists of a timestamp followed by the log level, process id, transaction id and an error message.

Output

The output is in JSON format.
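For the hypothetical line and pattern sequence above, the debugger would produce output roughly like −

{
   "timestamp": "2023-01-31 10:15:30,123",
   "log_level": "ERROR",
   "process_id": "4512",
   "transaction_id": "TXN-9001",
   "error_message": "Unable to connect to database"
}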
