Introduction to the ElasticSearchLoggerAppender Module
The ElasticSearchLoggerAppender module provides a logger appender that sends log events to ElasticSearch, using the Bulk API to batch events for efficient logging.
Features
- Batched logging using ElasticSearch Bulk API for high throughput
- Configurable batch size and flush interval
- Support for date-based index patterns (e.g., "logs-%Y.%m.%d")
- Structured JSON documents with standard log fields
- Thread-safe buffer management
- Automatic flush on close
Usage Example
%requires Logger
%requires ElasticSearchLoggerAppender

# create a logger that accepts INFO-level and higher events
Logger logger("MyApp", LoggerLevel::getLevelInfo());

# create the appender with a layout and connection options
ElasticSearchLoggerAppender appender("es-appender",
    new LoggerLayoutPattern("%d [%p] %c - %m%n"),
    {
        "url": "http://localhost:9200",
        "index": "app-logs-%Y.%m.%d",
        "batch_size": 100,
        "flush_interval_ms": 5000,
    });

# attach the appender and open it before logging
logger.addAppender(appender);
appender.open();

logger.info("Application started");
logger.error("Something went wrong");

# close() flushes any buffered events before shutting down
appender.close();
Constructor Options
The ElasticSearchLoggerAppender class accepts the following options in the constructor:
| Option | Type | Required | Description |
|--------|------|----------|-------------|
| url | string | Yes | The ElasticSearch server URL (e.g., "http://localhost:9200") |
| index | string | Yes | Index name pattern; supports date format placeholders like %Y, %m, and %d (e.g., "app-logs-%Y.%m.%d" creates daily indices) |
| batch_size | int | No | Maximum number of log events per batch before flushing (default: 100) |
| flush_interval_ms | int | No | Maximum time in milliseconds between flushes (default: 5000) |
| pipeline | string | No | ElasticSearch ingest pipeline name for document processing |
| connect_timeout | int | No | Connection timeout in milliseconds |
| timeout | int | No | Request timeout in milliseconds |
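As a combined illustration, the optional settings above might be used together as follows; this is a sketch, and the pipeline name and server URL are placeholders, not values defined by the module:

ElasticSearchLoggerAppender appender("es-appender", layout, {
    "url": "http://elasticsearch:9200",
    "index": "app-logs-%Y.%m.%d",
    "batch_size": 200,
    "flush_interval_ms": 2000,
    "pipeline": "log-enrichment",  # hypothetical ingest pipeline name
    "connect_timeout": 5000,       # fail fast if the server is unreachable
    "timeout": 10000,              # allow slower bulk responses under load
});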
Additional Examples
- Using with Date-Based Indices
ElasticSearchLoggerAppender appender("es", layout, {
    "url": "http://elasticsearch:9200",
    "index": "myapp-%Y.%m.%d",
});
- High-Throughput Configuration
ElasticSearchLoggerAppender appender("es", layout, {
    "url": "http://elasticsearch:9200",
    "index": "high-volume-logs",
    "batch_size": 500,
    "flush_interval_ms": 1000,
});
- Using an Existing REST Client
RestClient rest({"url": "http://elasticsearch:9200", "headers": {"Authorization": "Bearer token"}});
ElasticSearchLoggerAppender appender("es", layout, rest, {
    "index": "secure-logs",
});
Document Structure
Each log event is converted to an ElasticSearch document with the following structure:
{
    "@timestamp": "2024-01-15T10:30:45.123456+00:00",
    "level": "ERROR",
    "level_value": 40000,
    "logger": "MyApp.Service",
    "message": "Connection failed",
    "thread_id": 12345,
    "location": {
        "file": "Service.q",
        "line": 142,
        "function": "connect"
    },
    "exception": {
        "type": "SOCKET-ERROR",
        "message": "Connection refused"
    },
    "host": {
        "name": "server01.example.com"
    }
}
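When a batch is flushed, documents like the one above are presumably sent with the standard ElasticSearch Bulk API framing: one newline-delimited action line followed by one document line per event. A sketch of a two-event batch, with the index name resolved from the "app-logs-%Y.%m.%d" pattern (timestamps and field values are illustrative only):

POST /_bulk
{"index": {"_index": "app-logs-2024.01.15"}}
{"@timestamp": "2024-01-15T10:30:45.123456+00:00", "level": "INFO", "logger": "MyApp", "message": "Application started"}
{"index": {"_index": "app-logs-2024.01.15"}}
{"@timestamp": "2024-01-15T10:30:46.001234+00:00", "level": "ERROR", "logger": "MyApp", "message": "Something went wrong"}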
Release Notes
v1.0
- Initial release with batching support