Problem Statement
In our organization, error detection and troubleshooting are currently manual, time-consuming processes because we lack a unified view of application logs and metrics. This leads to delayed incident response and increased downtime.
Objective: To implement a comprehensive observability solution that provides a unified view of application logs and metrics, thereby automating error detection and troubleshooting processes. This initiative aims to significantly reduce the time spent on manual monitoring, enhance incident response times, and minimize system downtime by leveraging real-time data visualization and advanced analytics.
We adopted the ELK stack (Elasticsearch, Logstash, Kibana) to achieve this objective.
Setting up ELK on Ubuntu:
- Install and configure ELK on Ubuntu as a single-node cluster (see the install sketch below).
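A minimal install sketch, assuming the Elastic 8.x apt repository (adjust the version to your environment):

# Add Elastic's signing key and apt repository
wget -qO - https://artifacts.elastic.co/GPG-KEY-elasticsearch | sudo gpg --dearmor -o /usr/share/keyrings/elasticsearch-keyring.gpg
echo "deb [signed-by=/usr/share/keyrings/elasticsearch-keyring.gpg] https://artifacts.elastic.co/packages/8.x/apt stable main" | sudo tee /etc/apt/sources.list.d/elastic-8.x.list
# Install and start all components on the single node (apm-server receives agent traffic on port 8200)
sudo apt-get update && sudo apt-get install elasticsearch kibana logstash apm-server
sudo systemctl enable --now elasticsearch kibana logstash apm-server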
Copy the apm_agent folder to the server (preferably into the WildFly installation location).
In the file elasticapm.properties, set:
server_url=http://<private-ip>:8200
enable_log_correlation=true
environment=hyd-sandbox
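If the apm_agent folder does not already include the agent jar, it can be fetched from Maven Central (a sketch; version 1.52.0 matches the -javaagent line used later in this document):

# Download the Elastic APM Java agent jar
curl -L -O https://repo1.maven.org/maven2/co/elastic/apm/elastic-apm-agent/1.52.0/elastic-apm-agent-1.52.0.jar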
- Filebeat Setup on Application Nodes
- Filebeat is used to forward logs from our application nodes to Logstash or Elasticsearch.
- Download and install Filebeat on each application node.
- Download http_ca.crt to the server (preferably in the filebeat installation location)
- Create a Logs folder inside the WildFly installation directory.
- Edit the filebeat.yml file and make the required changes (see the sketch below).
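A minimal filebeat.yml sketch, assuming Filebeat 8.x shipping directly to Elasticsearch over HTTPS; the paths, credentials, and IP are placeholders to replace with your own:

filebeat.inputs:
  - type: filestream
    id: wildfly-logs
    enabled: true
    paths:
      # The Logs folder created inside the WildFly installation
      - E:\AppServer\wildfly-30.0.0.Final\wildfly-30.0.0.Final\Logs\*.log

output.elasticsearch:
  hosts: ["https://<private-ip>:9200"]
  username: "elastic"
  password: "<password>"
  ssl:
    # The http_ca.crt downloaded into the Filebeat installation location
    certificate_authorities: ["http_ca.crt"]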
Switch to the WildFly bin/ folder and edit standalone.conf.bat, adding the line below:
set "JAVA_OPTS=%JAVA_OPTS% -javaagent:E:\\AppServer\\wildfly-30.0.0.Final\\wildfly-30.0.0.Final\\elastic-apm-agent-1.52.0.jar"
Restart WildFly.
Check that traces are visible in the APM section of Kibana.
Restart Filebeat.
Check that logs appear under the Filebeat index in Kibana.
Changes need to be made in GitHub to the ci-properties and application properties files for each service line.
Example:
Go to common-ci.properties and add a new line:
logging.file.name=@env.HWC_API_LOGGING_FILE_NAME@
In the service's application properties file, add the two lines below:
logging.path=logs/
logging.file.name=logs/hwc-api.log
4.1. Define Environment Variables in Jenkins Pipeline:
In the Jenkins pipeline, add an environment variable:
env.HWC_API_LOGGING_FILE_NAME='E:/AppServer/wildfly-30.0.0.Final/wildfly-30.0.0.Final/Logs/hwc-api.log'
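A minimal scripted-pipeline sketch showing where this assignment lives (the stage name and build command are hypothetical):

node {
    // Substituted for @env.HWC_API_LOGGING_FILE_NAME@ in common-ci.properties at build time
    env.HWC_API_LOGGING_FILE_NAME = 'E:/AppServer/wildfly-30.0.0.Final/wildfly-30.0.0.Final/Logs/hwc-api.log'
    stage('Build') {
        // Replace with the service line's actual build step
        bat 'mvn clean package'
    }
}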
5. Verifying the Setup
5.1. Check Elasticsearch:
- Test Elasticsearch by querying its HTTP endpoint (in a browser or with curl):
curl -X GET "localhost:9200/"
- If Elasticsearch is running correctly, you should see a JSON response with Elasticsearch cluster details.
5.2. Check Logstash Input:
- Logstash's Beats port (5044) does not speak HTTP, so test Logstash via its monitoring API instead:
curl -XGET 'http://your_logstash_host:9600/?pretty'
- Verify that Filebeat is sending logs to Logstash.
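If Filebeat ships via Logstash rather than directly to Elasticsearch, a minimal pipeline sketch (the file name, host, and credentials are placeholders), e.g. /etc/logstash/conf.d/beats.conf:

input {
  # Listen for Filebeat on the Beats protocol port
  beats {
    port => 5044
  }
}
output {
  elasticsearch {
    hosts => ["https://localhost:9200"]
    user => "elastic"
    password => "<password>"
    cacert => "/etc/logstash/http_ca.crt"
  }
}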
5.3. Check Kibana:
- Go to Kibana’s dashboard by accessing its URL in a browser: http://localhost:5601
- Navigate to the "Discover" section and check if logs are being indexed in Elasticsearch.
5.4. Check Application Logs:
Check if the logs are being generated at the specified location (e.g., logs/inventory-api.log) and whether they are being processed by Filebeat and appear in Kibana.
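A quick end-to-end check (a sketch; the filebeat-* pattern assumes Filebeat's default index naming):

# Count the events indexed from Filebeat
curl -X GET "localhost:9200/filebeat-*/_count?pretty"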
Troubleshooting
- Filebeat not sending logs: Check the Filebeat logs at /var/log/filebeat/ for errors.
- Elasticsearch or Kibana not working: Review the logs in /var/log/elasticsearch/ and /var/log/kibana/.
- Application not writing logs: Ensure that the application has access to the specified log path and that the environment variable is being correctly passed from Jenkins.
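A few commands that help when diagnosing these issues (a sketch for the systemd-managed services on the Ubuntu node; the Filebeat test subcommands run on the node where Filebeat is installed):

# Service health and recent logs on the ELK node
sudo systemctl status elasticsearch kibana logstash
sudo journalctl -u elasticsearch --since "1 hour ago"
# Validate filebeat.yml and test the connection to the configured output
filebeat test config
filebeat test output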
Conclusion
This document outlines the process of configuring ELK for log collection from our application. We set up Elasticsearch, Logstash, and Kibana on Ubuntu, configured Filebeat for log forwarding, and adjusted the application's properties files to use environment variables for log paths. The Jenkins pipeline passes the necessary environment variables to the application, ensuring that logs are written to the correct location.