Problem Statement

In our organization, error detection and troubleshooting are currently manual, time-consuming processes because we lack a unified view of application logs and metrics. This leads to delayed incident response and increased downtime.

Objective: To implement a comprehensive observability solution that provides a unified view of application logs and metrics, thereby automating error detection and troubleshooting processes. This initiative aims to significantly reduce the time spent on manual monitoring, enhance incident response times, and minimize system downtime by leveraging real-time data visualization and advanced analytics.

We adopted the ELK stack (Elasticsearch, Logstash, Kibana) to achieve this objective.

Setting up ELK on Ubuntu:

  1. Install and configure the ELK stack on Ubuntu as a single-node cluster.
  2. Set up Filebeat on the application nodes.
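For step 1, a minimal single-node install sketch from the Elastic APT repository (the 8.x repository and keyring path are assumptions; adjust to the version actually deployed):

```shell
# Add the Elastic signing key and APT repository (8.x assumed)
wget -qO - https://artifacts.elastic.co/GPG-KEY-elasticsearch | sudo gpg --dearmor -o /usr/share/keyrings/elasticsearch-keyring.gpg
echo "deb [signed-by=/usr/share/keyrings/elasticsearch-keyring.gpg] https://artifacts.elastic.co/packages/8.x/apt stable main" | sudo tee /etc/apt/sources.list.d/elastic-8.x.list

# Install and start Elasticsearch and Kibana
sudo apt-get update && sudo apt-get install -y elasticsearch kibana
sudo systemctl enable --now elasticsearch kibana
```

With no other nodes joining, Elasticsearch runs as a single-node cluster by default.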

Check Elasticsearch:
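A quick reachability check (localhost and the default port are assumed; if security is enabled, add `-u elastic:<password> --cacert http_ca.crt`):

```shell
curl -X GET "http://localhost:9200/_cluster/health?pretty"
```

Look for `"status": "green"` or `"yellow"`; yellow is normal on a single-node cluster, since replica shards cannot be assigned.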

Check Kibana:
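Kibana exposes a status API on its default port (5601 assumed):

```shell
curl -X GET "http://localhost:5601/api/status"
```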

Traces:


APM agent  

Copy the apm_agent folder to the server (preferably into the WildFly installation location).
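If the folder is not already packaged, the agent jar can also be fetched from Maven Central (version 1.52.0 matches the `-javaagent` path used below; the URL follows the standard Maven layout):

```shell
curl -fLO https://repo1.maven.org/maven2/co/elastic/apm/elastic-apm-agent/1.52.0/elastic-apm-agent-1.52.0.jar
```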


File: elasticapm.properties

server_url=http://<private-ip>:8200

enable_log_correlation=true

environment=hyd-sandbox


Switch to the bin/ folder under the WildFly installation and edit standalone.conf.bat, adding the line below:

set "JAVA_OPTS=%JAVA_OPTS% -javaagent:E:\\AppServer\\wildfly-30.0.0.Final\\wildfly-30.0.0.Final\\elastic-apm-agent-1.52.0.jar"

Restart WildFly.

Check whether traces are visible in the APM section of Kibana.

  

Logging 

  1. WAR file changes 

    - create a Logs folder inside wildfly  

    - in application.properties of war file ensure that `logging.file.name` is set to `{wildfly-location}/Logs/{service}.log` 

        - Ex: if wildfly is deployed at `E:\wildfly` and service is Common API then set `logging.file.name=E:/wildfly/Logs/common-api.log` 

    - download [http_ca.crt] to the server (preferably in the filebeat installation location) 

    - edit filebeat.yml in filebeat installation folder and replace its contents with [filebeat.yml]

        - set the [paths](filebeat.yml) to Logs folder created in step 1 `{wildfly-location}/Logs` 

            - Ex: if wildfly is deployed at `E:\wildfly` then set paths to `E:/wildfly/Logs/*.json` 

        - set the [environment](filebeat.yml) to match the one set in APM agent 

            - Ex: if the server is in `Hyderabad` and it is a `sandbox` server set `environment` to `hyd-sandbox` 

        - set [API KEY](filebeat.yml) used for filebeat to communicate with elasticsearch 

        - set the [certificate](filebeat.yml) location of http_ca.crt 

 

Filebeat.yml file:

filebeat.inputs:
  - type: log
    enabled: true
    paths:
      - "E:/AppServer/wildfly-30.0.0.Final/wildfly-30.0.0.Final/Logs/*.json"
    json.keys_under_root: true
    json.overwrite_keys: true
  
processors:
  - add_fields:
      target: ''
      fields:
        environment: hyd-sandbox

        
output.elasticsearch:
  hosts: ["https://<private-ip>:9200"]
  api_key: "<api-key>"
  ssl:
    certificate_authorities: ["C:/Program Files/Filebeat/http_ca.crt"]
  index: "filebeat-%{[service.name]}-%{+yyyy.MM.dd}"

setup.template.name: "filebeat"
setup.template.pattern: "filebeat-*"
setup.template.settings:
  index.lifecycle.name: "90day-log-policy"
  index.lifecycle.rollover_alias: "filebeat"
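Before restarting, the configuration can be validated with Filebeat's built-in checks (run from the Filebeat installation folder; the config file path is assumed):

```shell
# Parse the config file, then verify the connection to Elasticsearch
filebeat test config -c filebeat.yml
filebeat test output -c filebeat.yml
```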


Restart Filebeat.

Check that logs appear under the filebeat-* indices in Kibana.
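Note that the `index` setting above creates one index per service per day, so logs land in daily indices. A sketch of the resulting name (service name `common-api` is an example; GNU `date` assumed):

```shell
service=common-api
day=$(date -d "2024-05-01" +%Y.%m.%d)
echo "filebeat-${service}-${day}"   # filebeat-common-api-2024.05.01
```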


Changes need to be made in GitHub in the ci-properties and application.properties files for each service line.

Example:

Go to common-ci.properties

Add a new line:

logging.file.name=@env.HWC_API_LOGGING_FILE_NAME@ 


Go to application.properties

Add the two lines below:

logging.path=logs/

logging.file.name=logs/hwc-api.log



4.1. Define Environment Variables in Jenkins Pipeline:

In the Jenkins pipeline, add an environment variable

env.HWC_API_LOGGING_FILE_NAME='E:/AppServer/wildfly-30.0.0.Final/wildfly-30.0.0.Final/Logs/hwc-api.log'                       

  5. Verifying the Setup

5.1. Check Elasticsearch:

              curl -X GET "localhost:9200/"

5.2. Check Logstash (only if Logstash is part of the pipeline; in this setup Filebeat ships directly to Elasticsearch). Port 5044 is the Beats input and serves no HTTP API; use the monitoring API on port 9600 instead:

              curl -XGET 'http://your_logstash_host:9600/?pretty'

5.3. Check Kibana:

Open http://localhost:5601 in a browser and confirm that the Kibana UI loads.

5.4. Check Application Logs:

Check if the logs are being generated at the specified location (e.g., logs/inventory-api.log) and whether they are being processed by Filebeat and appear in Kibana.

Troubleshooting
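Common first checks when logs or traces do not appear (service names assume the systemd units from the Ubuntu install; on Windows application nodes, check the Filebeat service and its logs folder instead):

```shell
# Service status and recent logs on the ELK server
sudo systemctl status elasticsearch kibana
sudo journalctl -u elasticsearch -n 50 --no-pager

# Confirm that daily filebeat indices are being created
curl -s "http://localhost:9200/_cat/indices/filebeat-*?v"
```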

 


Conclusion

This document outlines the process of configuring the ELK stack for log collection from our application. We set up Elasticsearch, Logstash, and Kibana on Ubuntu, configured Filebeat for log forwarding, and adjusted the application's properties files to use environment variables for log paths. The Jenkins pipeline passes the necessary environment variables to the application, ensuring that logs are written to the correct location.