Starting with Kinetic Core v2.0.2, a new feature was introduced called structured logging. Structured logging ensures that each log entry is written in a preset, consistent, machine-readable format. Structured logs work well with enterprise log aggregation tools such as Splunk, Graylog, or Elastic.
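As an illustration, a structured log is typically written as one machine-readable record per line, for example as a JSON object. The field names below are purely illustrative, not the exact Kinetic Core schema:

```json
{"timestamp": "2021-04-02T14:07:31.204Z", "level": "INFO", "user": "jane.doe", "method": "GET", "path": "/app/api/v1/kapps", "durationMs": 42}
```

Because every entry carries the same named fields, an aggregation tool can index, filter, and chart the entries without custom parsing rules.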
We at Kinetic Data leverage this feature for our kinops.io SaaS offering. Take a look at this brief blog article on the benefits we found by using structured logging with Elasticsearch. If you're also interested in integrating with Elasticsearch for log aggregation, see our community article Elasticsearch and Filebeat integration with Kinetic Structured Logs, which walks through using Filebeat to read the log files and ship them to Elasticsearch.
The structured logs consist of the following files in the %DATA_DIR%/logs directory:
- Logs an entry every time a resource is accessed through the Kinetic application: who, when, how long, what, and so on. Useful for troubleshooting, auditing, or analytics.
- Contains application warning, error, debug, and trace level messages. Used for troubleshooting.
- Logs authentication attempts: who, when, and the authentication type. Used for troubleshooting and auditing.
- Records infrequent events such as heartbeat checks and application startups and shutdowns.
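Because each file is machine readable, ad hoc analysis is straightforward even without an aggregation tool. The sketch below assumes the entries are JSON objects, one per line, with a `level` field; both the format and the field name are assumptions for illustration, not the exact Kinetic Core schema:

```python
import json

def count_levels(lines):
    """Tally how many entries appear at each log level."""
    counts = {}
    for line in lines:
        line = line.strip()
        if not line:
            continue  # skip blank lines
        entry = json.loads(line)  # each line is one JSON record
        level = entry.get("level", "UNKNOWN")
        counts[level] = counts.get(level, 0) + 1
    return counts

# Hypothetical sample entries (field names are illustrative only)
sample = [
    '{"timestamp": "2021-04-02T14:07:31Z", "level": "WARN", "message": "slow query"}',
    '{"timestamp": "2021-04-02T14:07:32Z", "level": "ERROR", "message": "bridge timeout"}',
    '{"timestamp": "2021-04-02T14:07:33Z", "level": "WARN", "message": "retrying"}',
]
print(count_levels(sample))  # {'WARN': 2, 'ERROR': 1}
```

The same pattern applies to any of the log files above: read a line, parse it, and work with named fields instead of regex-matching free-form text.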