Mirror of https://github.com/zeek/zeek.git

Since the millisecond resolution cannot be harnessed universally and is not supported by older versions of libcurl, we will allow only timeout specifications at the granularity of seconds. This commit also fixes a typing issue that prevented the ElasticSearch timeout from working in the first place: curl_easy_setopt requires a long but was given a uint64_t.
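To see what second granularity means at the script level, here is a minimal Bro-script sketch (the event body and the conversion built-ins are illustrative only and are not part of the writer): only the whole-second portion of an interval survives, so anything below one second becomes 0 and disables the timeout.

event bro_init()
	{
	# Only the whole-second part of an interval can be handed to libcurl.
	print double_to_count(floor(interval_to_double(2secs)));     # prints 2
	print double_to_count(floor(interval_to_double(500msecs)));  # prints 0, i.e. "no timeout"
	}
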
##! Log writer for sending logs to an ElasticSearch server.
##!
##! Note: This module is in testing and is not yet considered stable!
##!
##! There is one known memory issue. If your ElasticSearch server is
##! running slowly and taking too long to return from bulk insert
##! requests, the message queue to the writer thread will continue
##! growing larger and larger, giving the appearance of a memory leak.

module LogElasticSearch;

export {
	## Name of the ES cluster
	const cluster_name = "elasticsearch" &redef;

	## ES Server
	const server_host = "127.0.0.1" &redef;

	## ES Port
	const server_port = 9200 &redef;

	## Name of the ES index
	const index_prefix = "bro" &redef;

	## The ES type prefix comes before the name of the related log.
	## e.g. prefix = "bro\_" would create types of bro_dns, bro_software, etc.
	const type_prefix = "" &redef;

	## The time before an ElasticSearch transfer will time out. Note that
	## the timeout only has second granularity: the fractional part of the
	## interval is ignored, and time specifications of less than one second
	## result in a timeout value of 0, which means "no timeout."
	const transfer_timeout = 2secs;

	## The batch size is the number of messages that will be queued up before
	## they are sent to be bulk indexed.
	const max_batch_size = 1000 &redef;

	## The maximum amount of wall-clock time that is allowed to pass without
	## finishing a bulk log send. This represents the maximum delay you
	## would like to have with your logs before they are sent to ElasticSearch.
	const max_batch_interval = 1min &redef;

	## The maximum byte size for a buffered JSON string to send to the bulk
	## insert API.
	const max_byte_size = 1024 * 1024 &redef;
}
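
For reference, a minimal usage sketch: a site would normally tune the redef-able options above from a local script and attach the writer to an existing log stream. The host name, batch values, and filter name below are illustrative, and the sketch assumes the ElasticSearch writer backend is compiled in and exposed as Log::WRITER_ELASTICSEARCH, which is not declared in this file.

# Illustrative local tuning, e.g. in local.bro; values are examples only.
redef LogElasticSearch::server_host = "es.example.com";
redef LogElasticSearch::server_port = 9200;
redef LogElasticSearch::index_prefix = "bro";
redef LogElasticSearch::max_batch_size = 5000;
redef LogElasticSearch::max_batch_interval = 30secs;

event bro_init()
	{
	# Send the connection log to ElasticSearch in addition to its default writer.
	Log::add_filter(Conn::LOG, [$name="es-conn", $writer=Log::WRITER_ELASTICSEARCH]);
	}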