Logstash + Elasticsearch + Kibana
Centralized Log server
(as Splunk replacement)

Marko Ojleski
DevOps Engineer
$plunk
Business as usual, until…
#Outage @03:00AM
Check logs….?!?
10 network devices
40 servers
100 logs
Massive RAGE
tail
cat
grep
sed
awk
sort
uniq
and looots of |
tail -10000 access_log | awk '{print $1}' | sort | uniq -c | sort -n
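The same pipeline can be tried on an inline sample instead of a real access_log (the log lines below are made up; field 1 is the client IP):

```shell
# Count requests per client IP, exactly as in the one-liner above,
# but fed from a hypothetical three-line access log
counts=$(printf '%s\n' \
  '10.0.0.1 - - "GET / HTTP/1.1" 200' \
  '10.0.0.2 - - "GET /a HTTP/1.1" 200' \
  '10.0.0.1 - - "GET /b HTTP/1.1" 200' \
  | awk '{print $1}' | sort | uniq -c | sort -n)
echo "$counts"
```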
it’s just too much
1. collect data
2. parse/filter
3. send data

Logstash

written in JRuby
Author: Jordan Sissel
input

parse/filter

output
1. collect data

30+ inputs
1. collect data
file

syslog

tcp

udp

zmq

redis

log4j
Logstash input
Log shippers

Logstash
Beaver (Python)
Lumberjack (Go)
Woodchuck (Ruby)
Nxlog (C)
Sample conf

input {
  tcp {
    type => "server1"
    host => "192.168.1.1"
    port => "5555"
  }
}
2. parse/filter

40+ filters
2. parse/filter
grok

csv

grep

geoip

json
mutate

Logstash
filters

xml
key/value
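A sketch of how a couple of these filters can be chained in one config (the field names are illustrative, not from the deck):

```
filter {
  geoip  { source => "client" }                    # enrich events with location data
  mutate { rename => { "time" => "timestamp" } }   # tidy up a field name
}
```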
Grok filter

REGEX pattern collection
Grok filter

(?<![0-9])(?:(?:25[0-5]|2[0-4][0-9]|[0-1]?[0-9]{1,2})[.](?:25[0-5]|2[0-4][0-9]|[0-1]?[0-9]{1,2})[.](?:25[0-5]|2[0-4][0-9]|[0-1]?[0-9]{1,2})[.](?:25[0-5]|2[0-4][0-9]|[0-1]?[0-9]{1,2}))(?![0-9])
IP
Logstash + Elasticsearch + Kibana Presentation on Startit Tech Meetup
`$=`;$_=%!;($_)=/(.)/;$==++$|;($.,$/,$,,$,$",$;,$^,$#,$~,$*,$:,@%)=(
$!=~/(.)(.).(.)(.)(.)(.)..(.)(.)(.)..(.)......(.)/,$"),$=++;$.++;$.++;
$_++;$_++;($_,$,$,)=($~.$"."$;$/$%[$?]$_$$,$:$%[$?]",$"&$~,$#,);$,++
;$,++;$^|=$";`$_$$,$/$:$;$~$*$%[$?]$.$~$*${#}$%[$?]$;$$"$^$~$*.>&$=`

Just another Perl hacker.
Grok filter

120+ regex patterns
USERNAME
IP
HOSTNAME
SYSLOGTIMESTAMP
LOGLEVEL
etc…
Grok filter

2.10.146.54 - 2013-12-01T13:37:57Z - some really boring message
Grok filter

2.10.146.54 - 2013-12-01T13:37:57Z - some really boring message
%{IP:client} - %{TIMESTAMP_ISO8601:time} - %{GREEDYDATA:message}
Grok filter

client => 2.10.146.54
time => 2013-12-01T13:37:57Z
message => some really boring message
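What the grok match extracts can be emulated with awk on the slide's sample line, since the fields are separated by " - ":

```shell
# Split the sample line into the same three fields grok captures
line='2.10.146.54 - 2013-12-01T13:37:57Z - some really boring message'
client=$(printf '%s' "$line" | awk -F' - ' '{print $1}')
ts=$(printf '%s' "$line" | awk -F' - ' '{print $2}')
msg=$(printf '%s' "$line" | awk -F' - ' '{print $3}')
echo "client => $client"
echo "time => $ts"
echo "message => $msg"
```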
Grok filter
input {
  tcp {
    type => "server1"
    host => "192.168.1.1"
    port => "5555"
  }
}

filter {
  if [type] == "server1" {
    grok {
      match => { "message" => "%{IP:client} - %{TIMESTAMP_ISO8601:time} - %{GREEDYDATA:message}" }
    }
  }
}
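A quick way to try a grok filter interactively (a common pattern, not from the deck) is to read events from stdin and print the parsed result to stdout:

```
input  { stdin {} }
filter {
  grok { match => { "message" => "%{IP:client} - %{TIMESTAMP_ISO8601:time} - %{GREEDYDATA:message}" } }
}
output { stdout { codec => rubydebug } }
```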
3. send data

50+ outputs
3. send data
Logstash
output
statsd

stdout
tcp

elastic

redis

mongo

zmq
1. RESTful api
2. JSON-oriented
3. Horizontal scale
4. HA
5. Full Text search
6. Based on Lucene

Elasticsearch
Distributed RESTful
search server
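Since the API is RESTful and JSON-oriented, a full-text search is just an HTTP request; a sketch, assuming Elasticsearch on the default localhost:9200 (the curl line is shown but not run here):

```shell
# JSON body for a full-text "match" query on the message field
body='{ "query": { "match": { "message": "boring" } } }'
echo "$body"
# curl -s 'http://localhost:9200/_search' -d "$body"
```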
Logstash => elasticsearch
input {
  tcp {
    type => "server1"
    host => "192.168.1.1"
    port => "5555"
  }
}

filter {
  if [type] == "server1" {
    grok {
      match => { "message" => "%{IP:client} - %{TIMESTAMP_ISO8601:time} - %{GREEDYDATA:message}" }
    }
  }
}

output {
  elasticsearch {}
}
1. Clean and simple UI
2. Fully customizable
3. Bootstrap based
4. Old version running on Ruby
5. Milestone 3 fully rewritten in
HTML/Angular.js

Kibana
Awesome Elasticsearch
Web Frontend to
search/graph
Real Life Scenarios
Scenario 1
L2 switch

Cisco ASA

L3 switch

UDP

UDP

Elasticsearch

Syslog broker

(lightweight shipper)

UDP

Logstash

(main log server)

Kibana
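On the main log server in Scenario 1, the UDP syslog traffic forwarded by the broker could be picked up with something like this (port number illustrative):

```
input {
  syslog {
    host => "0.0.0.0"
    port => 5514
  }
}
```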
Scenario 2
Apache

(lightweight shipper)

IIS

TCP

TCP

(lightweight shipper)

Jboss

(lightweight shipper)

Elasticsearch

Logstash

(main log server)

TCP

Kibana
