
Logging System

The Logging System is a robust solution designed to collect, store, and visualize log data. It leverages Elasticsearch as the data store for logs, Kibana for management and exploration, and Grafana for advanced visualization. This repository contains the codebase and configuration files required to set up and run the logging system.

Key Components

1. Elasticsearch

Elasticsearch is a powerful, open-source search and analytics engine. In this logging system, it is used as the primary datastore for storing log data. Elasticsearch allows for efficient indexing, querying, and analysis of large volumes of log data.
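For example, once logs have been indexed you can query them straight over Elasticsearch's REST API. The example-log index name below matches the ingestion examples later in this README:

curl -k -u elastic:<your_password> "https://localhost:9200/example-log/_search?q=level:info&pretty"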

2. Kibana

Kibana is an open-source data visualization dashboard for Elasticsearch. It allows you to explore, visualize, and manage the log data stored in Elasticsearch. Kibana provides powerful tools for creating detailed visualizations and dashboards.

3. Grafana

Grafana is an open-source platform for continuous monitoring and observability. It provides a rich set of tools to create, explore, and share dashboards, making it easy to visualize the data stored in Elasticsearch and understand how the system is behaving in real time. Grafana's flexible query options and interactive visualizations help in gaining deep insights from log data. In this project, Grafana is used for continuous monitoring of the application.

4. Node.js Backend

The backend is built with Node.js and Express.js, providing APIs to ingest and manage log data. The backend communicates with Elasticsearch to index and retrieve logs.
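As a rough sketch of that flow (illustrative only, not the repository's exact code), a minimal Express route could add a timestamp and index the request body with the official @elastic/elasticsearch client. The file name is hypothetical and the route shape follows the endpoints documented under Usage:

// ingest-sketch.js - illustrative sketch, not the project's actual implementation
require('dotenv').config();
const express = require('express');
const { Client } = require('@elastic/elasticsearch');

// Build the client from the environment variables described under Installation
const esClient = new Client({
  node: process.env.ELASTIC_URL,
  auth: {
    username: process.env.ELASTIC_USERNAME,
    password: process.env.ELASTIC_PASSWORD,
  },
});

const app = express();
app.use(express.json());

// Ingest one log entry into the index named by :logName
app.post('/api/log/local/:logName', async (req, res) => {
  const logEntry = {
    ...req.body,
    '@timestamp': new Date().toISOString(), // automatic timestamping for time-based queries
  };
  const result = await esClient.index({
    index: req.params.logName,
    document: logEntry, // v8 client API; the v7 client uses `body` instead
  });
  res.status(201).json(result);
});

app.listen(process.env.PORT || 4000);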

Features

  • Log Ingestion: API endpoints to ingest logs into Elasticsearch.
  • Index Management: Automatically create and manage indices in Elasticsearch.
  • Data Visualization: Pre-configured Grafana dashboards for log visualization and analysis.
  • Timestamp Management: Automatic addition of @timestamp field to log entries for precise time-based querying.

Installation

Prerequisites

  • Node.js and npm
  • Running Elasticsearch, Kibana, and Grafana instances (setup for each is covered below)

Steps

  1. Clone the Repository:

    git clone https://github.com/Ogedi001/logging_system.git
    cd logging_system
  2. Install Dependencies:

    npm install
  3. Environment Variables:

    Create a .env file in the root of the project and add the following environment variables:

    PORT=4000 # default port
    ELASTIC_URL=http://localhost:9200/  # Elasticsearch runs locally on port 9200 by default
    ELASTIC_USERNAME=elastic
    ELASTIC_PASSWORD=<your_password>

Elasticsearch Setup

  1. Download and Install Elasticsearch:

    Follow the instructions from the Elasticsearch installation guide to download and install Elasticsearch.

N.B. The password for the elastic user is auto-generated during installation. Copy it and save it somewhere safe.

To see all of the executable programs ("Elasticsearch CLI commands"):

cd /usr/share/elasticsearch/bin
ls
  2. To reset the elastic user's password:

    sudo ./elasticsearch-reset-password -u elastic
  3. To generate an enrollment token for Kibana:

    sudo ./elasticsearch-create-enrollment-token --scope kibana
  4. Start Elasticsearch:

    sudo /usr/share/elasticsearch/bin/elasticsearch
  5. Some important Linux commands (when Elasticsearch runs as a systemd service):

sudo systemctl start elasticsearch    # start elasticsearch
sudo systemctl status elasticsearch   # see elasticsearch status
sudo systemctl stop elasticsearch     # stop elasticsearch from running
  6. To adjust the Elasticsearch configuration

Open the "yml" file with any editor (nano, vim, etc.):

sudo nano /etc/elasticsearch/elasticsearch.yml

E.g. to enable CORS so browser-based clients can reach Elasticsearch, add the following and restart Elasticsearch:

# Enable CORS for all origins
http.cors.enabled: true
http.cors.allow-origin: "*"
http.cors.allow-methods: OPTIONS, POST, GET
http.cors.allow-headers: X-Requested-With, X-Auth-Token, Content-Type, Content-Length, Authorization, Accept
http.cors.allow-credentials: true

Note: CORS for the Node.js API itself lives in the application code instead, using Express's cors middleware:

const cors = require('cors');

const corsOptions = {
  origin: '*', // Allow all origins
  methods: ['OPTIONS', 'POST', 'GET'],
  allowedHeaders: ['X-Requested-With', 'X-Auth-Token', 'Content-Type', 'Content-Length', 'Authorization', 'Accept'],
  credentials: true
};

app.use(cors(corsOptions));
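To verify that the CORS settings took effect, send a preflight-style request with curl and look for Access-Control-Allow-Origin in the response headers (the origin value here is just an example):

curl -i -k -X OPTIONS "https://localhost:9200/" \
  -H "Origin: http://localhost:4000" \
  -H "Access-Control-Request-Method: GET"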
  7. Check whether Elasticsearch is running

curl -i -k -u elastic:<your_password> https://localhost:9200

(The -k flag skips TLS verification for the self-signed certificate Elasticsearch generates during installation.)

or open the URL in a browser and enter the username and password:

https://localhost:9200
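If Elasticsearch is up, it responds with cluster metadata similar to the following (names, IDs, and version numbers will differ):

{
  "name" : "your-node-name",
  "cluster_name" : "elasticsearch",
  "cluster_uuid" : "...",
  "version" : {
    "number" : "8.x.x",
    ...
  },
  "tagline" : "You Know, for Search"
}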

Kibana Setup

  1. Download and Install Kibana:

    Follow the instructions from the Kibana installation guide to download and install Kibana.

  2. Configure Kibana:

    To see all of the executable programs ("Kibana CLI commands"):

    cd /usr/share/kibana/bin
    ls

    Open the kibana.yml configuration file and set the elasticsearch.hosts property to point to your Elasticsearch instance:

    sudo nano /etc/kibana/kibana.yml

    elasticsearch.hosts: ["http://localhost:9200"]

    To allow connections from any host:

    server.host: "0.0.0.0"

    Set the port:

    server.port: 5601
  3. Start Kibana:

    /usr/share/kibana/bin/kibana
  4. Some important Linux commands (when Kibana runs as a systemd service):

sudo systemctl start kibana    # start kibana
sudo systemctl status kibana   # see kibana status
sudo systemctl stop kibana     # stop kibana from running
  5. Check whether Kibana is running and view logs

Open this URL in a browser:

http://localhost:5601

  • Paste the Elasticsearch enrollment token to enroll Kibana automatically

or

  • Log in with your Elasticsearch username and password

then

  • Go to the Kibana dashboard
  • Go to Stack Management
  • Open Index Management

You can select any index from the available indices to manage it, or explore its documents in Discover.
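The same list of indices is also available from the command line via Elasticsearch's _cat API:

curl -k -u elastic:<your_password> "https://localhost:9200/_cat/indices?v"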

Grafana Setup

  1. Download and Install Grafana:

    Follow the instructions from the Grafana installation guide to download and install Grafana.

  2. Start Grafana:

    sudo systemctl start grafana-server
  3. Add Elasticsearch Data Source in Grafana:

    • Open Grafana in your browser (http://localhost:3000).
    • Log in with the default credentials (admin/admin).
    • Go to Configuration > Data Sources > Add data source.
    • Select Elasticsearch and configure it with your Elasticsearch URL (http://localhost:9200).


  4. Create a new dashboard to visualize and monitor the data.
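As an alternative to clicking through the UI, Grafana can also provision the data source from a YAML file. The sketch below is an assumption-laden example (the path is hypothetical and the jsonData field names vary across Grafana versions), not a file shipped with this repository:

# /etc/grafana/provisioning/datasources/elasticsearch.yml (hypothetical path)
apiVersion: 1
datasources:
  - name: Elasticsearch-Logs
    type: elasticsearch
    access: proxy
    url: http://localhost:9200
    basicAuth: true
    basicAuthUser: elastic
    secureJsonData:
      basicAuthPassword: <your_password>
    jsonData:
      index: example-log       # newer Grafana; older versions use a top-level "database" field
      timeField: "@timestamp"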
Run the Application

    npm start
    # or
    npm run dev

The application will start on http://localhost:PORT.

Usage

API Endpoints

  • Endpoint: POST http://localhost:PORT/api/log/local/:logName
    Description: Ingest log data into a locally running Elasticsearch instance (local configuration).
  • Endpoint: POST http://localhost:PORT/api/log/cloud/:logName
    Description: Ingest log data into a cloud-hosted Elasticsearch instance.

Ingest Log Data (Cloud Configuration):

  • Endpoint: POST http://localhost:PORT/api/log/cloud/:logName
  • Description: Ingest log data into an Elasticsearch instance running in the cloud.
  • URL Parameters:
    • logName (string): The name of the log index.
  • Body Parameters:
    • logData (object): The log data to be ingested.

Example Request:

curl -X POST http://localhost:PORT/api/log/cloud/example-log -H 'Content-Type: application/json' -d '{
    "message": "This is a log message",
    "level": "info"
}'

Ingest Log Data (Local Configuration):

  • Endpoint: POST http://localhost:PORT/api/log/local/:logName
  • Description: Ingest log data into a locally running Elasticsearch instance.
  • URL Parameters:
    • logName (string): The name of the log index.
  • Body Parameters:
    • logData (object): The log data to be ingested.

Example Request:

curl -X POST http://localhost:PORT/api/log/local/example-log -H 'Content-Type: application/json' -d '{
    "message": "This is a log message",
    "level": "info"
}'
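From application code, the same request can be made with Node's built-in fetch (Node 18+). The port and payload below are examples only:

// send-log.mjs - usage sketch; assumes the logging system is running on port 4000
const response = await fetch('http://localhost:4000/api/log/local/example-log', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ message: 'This is a log message', level: 'info' }),
});
console.log(await response.json());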

Contributing

Contributions are welcome! Please open an issue or submit a pull request for any improvements or bug fixes.

Contact

For any questions or inquiries, please contact [email protected].


Enjoy using the Logging System!
