What to do with system logs?
Logs are essential to every application: they let you understand how it behaves, analyze and diagnose problems, and intervene accordingly.
One simple way to manage a system proactively is to check the logs regularly and react to any information that indicates degradation of the system or the application.
But when you have to manage many applications, spread across multiple systems with different components, consulting the logs quickly becomes a headache.
This is where Elastic Stack (ELK) comes in.
What is ELK?
Centralizing logs with Elastic Stack
The Elastic Stack (ELK) is composed of:
- Elasticsearch: an indexing server that enables fast data search (like google.fr). It combines a distributed search engine, a NoSQL database, and a REST interface.
- Logstash: collects and processes data in parallel from a multitude of sources (programs, log files, etc.), then aggregates it and sends it to Elasticsearch.
- Kibana: provides an interactive, customizable dashboard to visualize the data stored in Elasticsearch.
All of these tools are available under a free license. The Elastic company ensures the homogeneity of the development within a community-driven framework.
Centralize logs with Elastic Stack: 3 tools
Elasticsearch is a server built on Lucene (an Apache Foundation project providing a Java library for indexing and searching text).
The Elasticsearch search engine allows you to index data and search through it.
Its functionality falls into two families (indexing and search):
- Indexing: create indexes, add documents to them, analyze them, and index them.
- Search: access the index, formulate queries, return results, and provide access statistics.
All functions are available via JSON and Java interfaces.
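As a sketch of that JSON interface, the snippet below builds the request bodies you would send to Elasticsearch's REST API for the two families above. The helper names are hypothetical, and the snippet only prepares the payloads; it assumes (but does not contact) a node at http://localhost:9200.

```python
import json

# Hypothetical helpers for illustration: build the payloads that
# Elasticsearch's REST API expects for indexing and for searching.

def build_index_request(index, doc):
    """Return the (method, path, body) triple for indexing one document."""
    return ("POST", f"/{index}/_doc", json.dumps(doc))

def build_match_query(field, text):
    """Return a search body with a simple full-text 'match' query."""
    return {"query": {"match": {field: text}}}

# Indexing: place a log document into a (hypothetical) "logs" index.
method, path, body = build_index_request(
    "logs", {"message": "disk usage at 92%", "host": "web-01"}
)
print(method, path, body)

# Search: find documents whose "message" field mentions "disk".
print(json.dumps(build_match_query("message", "disk")))
```

In a real deployment you would send these payloads with any HTTP client (curl, `urllib.request`, etc.) to the node's port 9200.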
Any information can be put in Elasticsearch. There are many different use cases.
It is often integrated into applications to help in the search for contextual information.
Logstash is a tool dedicated to log handling. It ships with a large number of data collectors, which makes it a potent tool for almost any log-harvesting need.
It centralizes, transforms, and stores data.
Use it to break each log line into its constituent fields, so that complex searches become possible. To do this, it can process and transform the data to give us a better understanding of the information.
Example: GeoIP processing of the client IP addresses in a web server's access log, which tells us the country of each visitor.
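A minimal Logstash pipeline along those lines might look like the sketch below. The file path and index name are illustrative; the grok `COMBINEDAPACHELOG` pattern and the geoip filter are standard Logstash features.

```
input {
  file {
    path => "/var/log/apache2/access.log"   # illustrative path
    start_position => "beginning"
  }
}

filter {
  # Break the raw line into named fields (clientip, verb, response, ...).
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  # Enrich the client IP with location data (country, city, coordinates).
  geoip {
    source => "clientip"
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "weblogs-%{+YYYY.MM.dd}"
  }
}
```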
Kibana is a data visualization plugin. It lets you access the data provided by Elasticsearch and render it in graphical form, and it makes it easy to produce attractive graphs.
It offers line charts, bar charts, histograms, pie charts, and scatter plots, can place data on a map, and supports aggregation and filtering.
Its most common use is building a dashboard that computes data in real time, giving a snapshot of cross-referenced information.
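Under the hood, such dashboard panels rest on Elasticsearch aggregations. As a sketch, here is the kind of body a panel like "error lines per hour, by host" would send; the index fields (`@timestamp`, `host`, `level`) are assumptions about how the logs were parsed.

```python
import json

# Hypothetical field names for illustration: count error-level log lines
# per hour, broken down by host.
agg_body = {
    "size": 0,  # return only aggregated counts, not the raw documents
    "query": {"term": {"level": "error"}},
    "aggs": {
        "per_hour": {
            "date_histogram": {"field": "@timestamp", "fixed_interval": "1h"},
            "aggs": {
                "by_host": {"terms": {"field": "host", "size": 5}}
            },
        }
    },
}

print(json.dumps(agg_body, indent=2))
```

Kibana generates and sends queries of this shape for you; writing one by hand is mainly useful for debugging or for building a custom view.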
Together, these three tools form an ideal solution for centralizing logs and reporting on the operating status of an information system through cross-analysis of its logs.
You can drill down to search for the keywords of a particular log or machine, or stay at a macroscopic level by displaying preset information in a Kibana graph.
In general, you create a few graphs to serve as a dashboard; then, when detail is needed, you search the data precisely to correlate events.