Centralized Logging with Logstash, Elasticsearch & Kibana


In this post, we will set up centralized collection and visualization of system logs using Logstash, Elasticsearch, and Kibana.

Sometimes we need to search the logs for specific lines across several servers; normally that means logging in to each server and repeating the same commands over and over.

Suppose we have three MX servers, and a customer complains about a missing message that he sent abroad to his wife at a particular time.


Using the ELK stack (Elasticsearch, Logstash, Kibana), we can find that message in the logs in a couple of clicks.


Values that you have to replace (such as example.com and logstash_server_fqdn) are shown as placeholders.

What we use for building the centralized logging system

– CentOS 7: the most recent version of the operating system

– Logstash: the server-side component that processes incoming logs

– Elasticsearch: stores the logs

– Kibana: a web interface for searching through and visualizing the logs

– Logstash Forwarder: installed on the client servers as an agent that sends logs to the Logstash server

We will install the first three components on our collection server, and Logstash Forwarder on the servers we want to collect logs from.

Install Java 8

Java is needed for both Logstash and Elasticsearch. We are going to install Oracle JRE 8.

cd /opt
sudo wget --no-cookies --no-check-certificate --header "Cookie: gpw_e24=http%3A%2F%2Fwww.oracle.com%2F; oraclelicense=accept-securebackup-cookie" \
  "https://download.oracle.com/otn-pub/java/jdk/8u40-b25/jre-8u40-linux-x64.tar.gz"

Unpack

sudo tar xvf jre-8*.tar.gz

Grant the necessary rights:

sudo chown -R root: jre1.8*

Create symlinks with the use of alternatives:

sudo alternatives --install /usr/bin/java java /opt/jre1.8*/bin/java 1
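
To check that the symlink is active, you can verify the Java version (the exact output will vary with the JRE build):

java -version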

Delete the downloaded archive

sudo rm /opt/jre-8*.tar.gz

Install Elasticsearch

Import Elasticsearch public GPG key:

sudo rpm --import http://packages.elasticsearch.org/GPG-KEY-elasticsearch

Create and edit the repository file for Elasticsearch:

sudo vi /etc/yum.repos.d/elasticsearch.repo
[elasticsearch-1.4]
name=Elasticsearch repository for 1.4.x packages
baseurl=http://packages.elasticsearch.org/elasticsearch/1.4/centos
gpgcheck=1
gpgkey=http://packages.elasticsearch.org/GPG-KEY-elasticsearch
enabled=1

Install Elasticsearch

sudo yum -y install elasticsearch-1.4.4

Modify the configuration file:

sudo vi /etc/elasticsearch/elasticsearch.yml

Close access to Elasticsearch from the outside:

network.host: localhost

Run Elasticsearch:

sudo systemctl start elasticsearch.service

And enable it to start at boot:

sudo systemctl enable elasticsearch.service
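
To verify that Elasticsearch is up and only listening locally, you can query it from the same machine:

curl http://localhost:9200

A short JSON response with the cluster name and version number means it is running.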

Install Kibana:

Download and unpack Kibana 4:

cd ~; wget https://download.elasticsearch.org/kibana/kibana/kibana-4.0.1-linux-x64.tar.gz; tar xvf kibana-*.tar.gz

Edit the configuration file:

vi ~/kibana-4*/config/kibana.yml

In the Kibana configuration file, find the line that sets the host and replace the default value (0.0.0.0) with localhost:

host: "localhost"

This setting makes Kibana accessible only locally. That is fine, because we will use Nginx as a reverse proxy to grant access from the outside.

Create a directory for Kibana:

sudo mkdir -p /opt/kibana

And relocate the unpacked files there:

sudo cp -R ~/kibana-4*/* /opt/kibana/

Kibana can be started as /opt/kibana/bin/kibana, but we will run it as a service. Create a systemd unit file for Kibana:

sudo vi /etc/systemd/system/kibana4.service
[Service]
ExecStart=/opt/kibana/bin/kibana
Restart=always
StandardOutput=syslog
StandardError=syslog
SyslogIdentifier=kibana4
User=root
Group=root
Environment=NODE_ENV=production
 
[Install]
WantedBy=multi-user.target

Now, start it and enable it at boot:

sudo systemctl start kibana4
sudo systemctl enable kibana4
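
To make sure Kibana 4 is running and bound to its default port 5601 on localhost, you can check the service status and request the page locally:

sudo systemctl status kibana4
curl -I http://localhost:5601

An HTTP response here confirms Kibana is up; it will not be reachable from other hosts until we put Nginx in front of it.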

Set up the EPEL repository

sudo yum -y install epel-release

Install Nginx

sudo yum -y install nginx httpd-tools

Using htpasswd, create a user and a password

sudo htpasswd -c /etc/nginx/htpasswd.users kibanaadmin

Now, edit the main configuration file nginx.conf:

sudo vi /etc/nginx/nginx.conf

Find and delete the whole server {} section. Only these two lines should remain at the end of the file:

    include /etc/nginx/conf.d/*.conf;
}

Now, create an Nginx configuration file for Kibana 4 in /etc/nginx/conf.d/ (any name ending in .conf will be picked up by the include above, e.g. kibana.conf):

server {
    listen 80;
 
    server_name example.com;
 
    auth_basic "Restricted Access";
    auth_basic_user_file /etc/nginx/htpasswd.users;
 
    location / {
        proxy_pass http://localhost:5601;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection 'upgrade';
        proxy_set_header Host $host;
        proxy_cache_bypass $http_upgrade;        
    }
}
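
Before starting Nginx, it is worth validating the configuration for syntax errors:

sudo nginx -t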

Run Nginx:

sudo systemctl start nginx
sudo systemctl enable nginx

Now, Kibana is accessible at http://FQDN/ (Nginx listens on port 80 in the configuration above).

Install Logstash:

Create the repository file for Logstash:

sudo vi /etc/yum.repos.d/logstash.repo
 
[logstash-1.5]
name=logstash repository for 1.5.x packages
baseurl=http://packages.elasticsearch.org/logstash/1.5/centos
gpgcheck=1
gpgkey=http://packages.elasticsearch.org/GPG-KEY-elasticsearch
enabled=1

Save and exit

Install Logstash:

sudo yum -y install logstash

Generate SSL certificates

Generate a self-signed certificate that will be used to verify the server's authenticity (replace logstash_server_fqdn in the command below with your Logstash server's FQDN):

cd /etc/pki/tls
sudo openssl req -subj '/CN=logstash_server_fqdn/' -x509 -days 3650 -batch -nodes -newkey rsa:2048 -keyout private/logstash-forwarder.key -out certs/logstash-forwarder.crt

The file logstash-forwarder.crt should be copied to all servers that will send logs to the Logstash server.
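
If you want to double-check what you are distributing, you can print the certificate's subject and validity period:

openssl x509 -in /etc/pki/tls/certs/logstash-forwarder.crt -noout -subject -dates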

Configure Logstash:

The configuration files for Logstash use a JSON-like format and are located in /etc/logstash/conf.d. A configuration consists of three sections: inputs, filters, and outputs.

Create the file 01-lumberjack-input.conf and set up the "lumberjack" input (the protocol that Logstash Forwarder uses to talk to Logstash):

sudo vi /etc/logstash/conf.d/01-lumberjack-input.conf
 
input {
  lumberjack {
    port => 5000
    type => "logs"
    ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
    ssl_key => "/etc/pki/tls/private/logstash-forwarder.key"
  }
}

Save and exit. This makes the lumberjack input listen on TCP port 5000, using the certificate and key we generated earlier.

Now, create a file named 10-syslog.conf and add the filter settings for syslog messages:

sudo vi /etc/logstash/conf.d/10-syslog.conf
 
filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
    syslog_pri { }
    date {
      match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}
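
For example, a syslog line like the following (hypothetical host and program names):

Feb 11 09:15:02 mx1 postfix/smtpd[2045]: connect from unknown[203.0.113.5]

would be split by grok into syslog_timestamp (Feb 11 09:15:02), syslog_hostname (mx1), syslog_program (postfix/smtpd), syslog_pid (2045), and syslog_message (the rest of the line), while the date filter uses syslog_timestamp to set the event's @timestamp.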

Save and exit

Create the last file 30-lumberjack-output.conf:

sudo vi /etc/logstash/conf.d/30-lumberjack-output.conf
 
output {
  elasticsearch { host => localhost }
  stdout { codec => rubydebug }
}
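
Before restarting, you can ask Logstash to validate the whole configuration directory (a quick sanity check; this assumes the package installed Logstash under the default /opt/logstash path):

sudo /opt/logstash/bin/logstash agent --configtest -f /etc/logstash/conf.d/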

Restart Logstash:

sudo service logstash restart

Now that Logstash is set up, let's move on to Logstash Forwarder.

Set up Logstash Forwarder

Copy the SSL certificate to the server where Logstash Forwarder will run:

scp /etc/pki/tls/certs/logstash-forwarder.crt user@server_private_IP:/tmp

Import the Elasticsearch GPG key:

sudo rpm --import http://packages.elasticsearch.org/GPG-KEY-elasticsearch

Create the repository configuration file:

sudo vi /etc/yum.repos.d/logstash-forwarder.repo

Add the following repository definition:

[logstash-forwarder]
name=logstash-forwarder repository
baseurl=http://packages.elasticsearch.org/logstashforwarder/centos
gpgcheck=1
gpgkey=http://packages.elasticsearch.org/GPG-KEY-elasticsearch
enabled=1

Install Logstash Forwarder

sudo yum -y install logstash-forwarder

Copy the certificate to the required location:

sudo cp /tmp/logstash-forwarder.crt /etc/pki/tls/certs/

Let’s get to setting it up:

sudo vi /etc/logstash-forwarder.conf

In the "network" section, set the Logstash server address, timeout, and certificate path:

"servers": [ "logstash_server_private_IP:5000" ],
"timeout": 15,
"ssl ca": "/etc/pki/tls/certs/logstash-forwarder.crt"

In the "files" section, insert the following between the square brackets:

  {
      "paths": [
        "/var/log/messages",
        "/var/log/secure"
       ],
      "fields": { "type": "syslog" }
    }
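
For reference, after these edits the relevant parts of /etc/logstash-forwarder.conf should look roughly like this (a sketch; replace logstash_server_private_IP with your Logstash server's address):

{
  "network": {
    "servers": [ "logstash_server_private_IP:5000" ],
    "timeout": 15,
    "ssl ca": "/etc/pki/tls/certs/logstash-forwarder.crt"
  },
  "files": [
    {
      "paths": [
        "/var/log/messages",
        "/var/log/secure"
      ],
      "fields": { "type": "syslog" }
    }
  ]
}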

Restart Logstash Forwarder so it picks up the new configuration:

sudo service logstash-forwarder restart

Now, Logstash Forwarder will send logs to your Logstash server.
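
To confirm that events are actually reaching Elasticsearch, you can query the logstash-* indices on the collection server (the elasticsearch output creates daily indices named logstash-YYYY.MM.DD by default):

curl 'http://localhost:9200/logstash-*/_search?pretty&size=1'

If the hits section contains a syslog event, the whole pipeline is working.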

Open Kibana in your browser, go to the Dashboard, and enjoy the view.
