Introduction
The ELK (Elasticsearch, Logstash, Kibana) stack, also known as the Elastic stack, runs on various setups and operating systems. A simple way to try out, install and test the ELK stack is to run it on Docker.
This tutorial outlines two ways to install the ELK stack on Docker.
Prerequisites
- Docker installed and configured.
- Docker Compose installed and configured (recommended).
- Git installed to clone the ELK Docker repository.
- Access to a web browser to view the Kibana dashboard.
- Access to the internet to pull plugins.
- Command-line interface/terminal access with sudo privileges.
Installing ELK on Docker
There are two ways to install ELK on Docker:
1. Pull an automatically built image from the Docker registry.
2. Build the image from source files.
Before starting either method, increase the kernel's mmap count limit so Elasticsearch does not run out of virtual memory areas during installation and use.
To change the value on Linux, run the following:
sudo sysctl -w vm.max_map_count=262144
Elasticsearch uses memory-mapped files (mmap) to store and access index data. If the value is not raised, the installation halts because the default OS limit is too low.
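The sysctl command applies only until the next reboot. To keep the setting permanent, one common approach is to append the value to /etc/sysctl.conf (the exact file may vary by distribution) and reload the kernel parameters:
echo "vm.max_map_count=262144" | sudo tee -a /etc/sysctl.conf
sudo sysctl -p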
Pulling the Image
To pull the ELK image from the Docker registry, open the terminal and run:
sudo docker pull sebp/elk
Use tags to pull a specific version of Elasticsearch, Logstash, and Kibana:
sudo docker pull sebp/elk:<tag>
Without a tag, the command pulls the latest version of the ELK stack using the default latest tag. The complete list of available tags is on Docker Hub.
Building the Image from Source Files
There are two ways to build the image from the source files: with vanilla Docker or with Docker Compose. In both cases, clone the Git repository and enter the directory:
git clone https://github.com/spujadas/elk-docker.git
cd elk-docker
Use either docker build or docker-compose to build the image.
Using Docker Build
To build the Docker image with the docker build command, run:
sudo docker build -t elk-docker .
This option does not require Docker Compose. Still, Docker Compose is the recommended option because it provides an isolated and functional environment.
Using Docker Compose
The Git repository includes a YAML configuration file for setting up the stack with Docker Compose.
1. Open the docker-compose.yml file in a text editor, such as Nano:
sudo nano docker-compose.yml
2. Delete everything from the compose file and add the following:
services:
  elk:
    build: .
    ports:
      - "5601:5601"
      - "9200:9200"
      - "5044:5044"
The port mappings expose Kibana, Elasticsearch, and Logstash to the host and are used when running the container in a later step.
Press CTRL+X, then Y, and Enter to save the file and exit the editor.
3. Build the image with:
sudo docker-compose build elk
If the compose file uses a different service or image name, adjust the command accordingly. Wait for the build to complete before continuing.
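To confirm the build completed, list the local images and look for the newly created entry (the name depends on the directory and service names, for example elk-docker_elk):
sudo docker images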
Installing Plugins
ELK provides various plugins that enrich the stack with additional features and libraries. When running ELK on Docker, add the plugin installation commands to the Dockerfile, then rebuild the image to perform the installation.
Open the Dockerfile located in the repository directory:
sudo nano Dockerfile
Below are examples for installing Elasticsearch, Logstash, and Kibana plugins through Docker. The plugin install process is similar for all three.
Elasticsearch
To install an Elasticsearch plugin, do the following:
1. Add the following at the end of the Dockerfile:
FROM sebp/elk
ENV ES_HOME /opt/elasticsearch
WORKDIR ${ES_HOME}
RUN yes | CONF_DIR=/etc/elasticsearch gosu elasticsearch bin/elasticsearch-plugin \
install -b <plugin name or link>
Exchange <plugin name or link> for the plugin of your choice.
The code sets the working directory and the configuration directory location, then fetches and installs the plugin. The install script located in bin/elasticsearch-plugin runs the installation. Elasticsearch plugins must be installed as the elasticsearch user, and gosu runs the command as that user instead of root. A concrete example appears after these steps.
2. Save the Dockerfile and close the editor.
3. Build the image using either docker build or docker-compose.
The image build runs the Dockerfile commands and executes the installation.
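For example, a minimal Dockerfile that installs the official analysis-icu plugin (used here only as an illustration; swap in the plugin you actually need) looks like this:
FROM sebp/elk
ENV ES_HOME /opt/elasticsearch
WORKDIR ${ES_HOME}
RUN yes | CONF_DIR=/etc/elasticsearch gosu elasticsearch bin/elasticsearch-plugin \
 install -b analysis-icu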
Logstash
Follow the steps below to install Logstash plugins.
1. Add the following code to the Dockerfile:
FROM sebp/elk
WORKDIR ${LOGSTASH_HOME}
RUN gosu logstash bin/logstash-plugin install <plugin name>
The Logstash plugins do not require a configuration directory. Install directly with the script located in bin/logstash-plugin, replacing <plugin name> with the actual plugin name (a concrete example follows these steps).
2. Save the contents and close the Dockerfile.
3. Run the build to install the plugin.
The installation information and resulting output show up at the end of the build log.
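For instance, a Dockerfile that installs the logstash-filter-translate plugin (chosen purely as an example) would look like:
FROM sebp/elk
WORKDIR ${LOGSTASH_HOME}
RUN gosu logstash bin/logstash-plugin install logstash-filter-translate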
Kibana
To install Kibana plugins, do the following:
1. Insert the following code at the end of the Dockerfile:
FROM sebp/elk
WORKDIR ${KIBANA_HOME}
RUN gosu kibana bin/kibana-plugin install <plugin name or link>
The process is the same as with Logstash plugins. Exchange <plugin name or link> for an actual plugin name or download link.
2. Save the file and close.
3. Build the Docker image and check the output for the installation results.
The command automatically searches for the plugin and installs it with the kibana-plugin script.
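Many third-party Kibana plugins are distributed as zip archives, so the argument is often a download URL rather than a bare name. Below is a sketch with a hypothetical URL; replace it with a real plugin release that matches your Kibana version:
FROM sebp/elk
WORKDIR ${KIBANA_HOME}
RUN gosu kibana bin/kibana-plugin install https://example.com/sample-kibana-plugin-1.0.0.zip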
Running the ELK Container
There are two ways to run the ELK container:
1. From the image through the docker run command.
2. Using Docker Compose.
Below are the commands and explanations for both cases.
From the Image via Command
To start the whole ELK stack container via docker run, use the following:
sudo docker run -p 5601:5601 -p 9200:9200 -p 5044:5044 -it --name elk sebp/elk
The command publishes the following ports:
- 5601 serves the Kibana web interface.
- 9200 serves the Elasticsearch JSON interface.
- 5044 serves the Logstash Beats interface.
The three ports are necessary for the stack to work correctly. Additionally, the following ports are exposed but not published:
- 9300 for the Elasticsearch transport interface (publish with -p 9300:9300).
- 9600 for the Logstash monitoring API (publish with -p 9600:9600).
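To publish these ports as well, add them to the run command, for example:
sudo docker run -p 5601:5601 -p 9200:9200 -p 5044:5044 -p 9300:9300 -p 9600:9600 -it --name elk sebp/elk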
Access the Kibana web interface with:
http://<host>:5601
Replace <host> with the hostname or IP address of the Docker host. If running a local instance, use localhost.
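The services take a minute or two to initialize, so Kibana may not respond immediately. One way to watch the startup progress or troubleshoot errors is to follow the container logs:
sudo docker logs -f elk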
Using Docker Compose
To run the ELK container with Docker Compose, use:
sudo docker-compose up elk
The port mapping is in the docker-compose.yml file. Access Kibana from the web browser with:
http://<host>:5601
Use localhost if running Docker locally.
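To keep the terminal free, the container can also run in the background by adding the detached flag, and stop later with the down command:
sudo docker-compose up -d elk
sudo docker-compose down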
Running Individual Services
The following environment variables allow running ELK services individually instead of the whole stack:
- ELASTICSEARCH_START
- LOGSTASH_START
- KIBANA_START
Setting a variable to anything other than 1 prevents the corresponding service from starting.
For example, to start Elasticsearch without Logstash and Kibana, use:
sudo docker run -p 5601:5601 -p 9200:9200 -p 5044:5044 -it -e LOGSTASH_START=0 -e KIBANA_START=0 --name elk sebp/elk
Check that Elasticsearch is running with a curl request:
curl localhost:9200
The Kibana dashboard page (localhost:5601) does not display because the service is not running.
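To double-check from the command line, a curl request to the Kibana port returns no usable response, since the service is not running:
curl localhost:5601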
Conclusion
After following the steps from this guide, you've installed and run the ELK stack on Docker!
Next, check out how to deploy Elasticsearch on Kubernetes.