parser job from logsearch/203.0.0

GitHub source: f85490fb or master branch

Properties

logstash

metadata_level

Whether to include additional metadata throughout the event lifecycle. NONE = disabled, DEBUG = fully enabled

Default
NONE

output

elasticsearch
data_hosts

The list of Elasticsearch data node IPs

Default
- 127.0.0.1
flush_size

Controls how many logs will be buffered and sent to Elasticsearch for bulk indexing

Default
500

logstash_parser

debug

Enable debug-level logging

Default
false

deployment_dictionary

A file containing a hash that maps job names to deployment names, used for the @source.deployment lookup

Default
/var/vcap/packages/logsearch-config/deployment_lookup.yml
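
The dictionary file is a flat YAML hash of job name to deployment name; a minimal sketch (the job and deployment names below are illustrative, not taken from the bundled file):

```yaml
# Hypothetical deployment_lookup.yml: each key is a job name,
# each value the deployment name written into @source.deployment
nats: cf
router: cf
parser: logsearch
```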

elasticsearch_document_id

Use a specific, dynamic ID rather than an auto-generated identifier.

elasticsearch_index

The specific, dynamic index name to write events to.

Default
logstash-%{+YYYY.MM.dd}

elasticsearch_index_type

The specific, dynamic index type name to write events to.

Default
'%{@type}'
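
Both the index and the index type accept logstash sprintf references, so they can be derived per event; a hedged manifest sketch using the defaults shown above:

```yaml
properties:
  logstash_parser:
    # Route events into daily indices (the default pattern)
    elasticsearch_index: "logstash-%{+YYYY.MM.dd}"
    # Derive the document type from each event's @type field
    elasticsearch_index_type: "%{@type}"
```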

elasticsearch_routing

The routing to be used when indexing a document.

enable_json_filter

Toggles the if_it_looks_like_json.conf filter rule

Default
false

filters

The configuration to embed into the logstash filters section. Either a set of parsing rules as a string, or a list of hashes of the form [{name: path_to_parsing_rules.conf}]

Default
""
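
A sketch of the two accepted forms (the rule content and file path below are illustrative):

```yaml
# Form 1: parsing rules embedded inline as a string
logstash_parser:
  filters: |
    if [@type] == "syslog" {
      grok { match => { "@message" => "%{SYSLOGLINE}" } }
    }

# Form 2: a list of hashes naming external rule files
logstash_parser:
  filters:
  - custom_rules: /var/vcap/packages/my-rules/filters.conf
```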

idle_flush_time

How frequently to flush events if the output queue is not full.

inputs

A list of input plugins, with a hash of options for each of them. Please refer to example below.

Default
  - options: {}
    plugin: redis
Example
inputs:
- options:
    host: 192.168.1.1
    password: c1oudbunny
    user: logsearch
  plugin: rabbitmq

message_max_size

Maximum log message length. Anything larger is truncated (TODO: move this to ingestor?)

Default
1048576

outputs

The configuration to embed into the logstash outputs section

timecop

reject_greater_than_hours

Logs with timestamps greater than this many hours in the future won’t be parsed and will get tagged with fail/timecop

Default
1
reject_less_than_hours

Logs with timestamps less than this many hours in the past won’t be parsed and will get tagged with fail/timecop

Default
24
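
Together the two bounds define the window of accepted timestamps; events outside it are left unparsed and tagged fail/timecop. A manifest sketch widening the past-facing bound (the 72 is illustrative):

```yaml
properties:
  logstash_parser:
    timecop:
      # Accept timestamps up to 1 hour in the future (the default)...
      reject_greater_than_hours: 1
      # ...and up to 72 hours in the past (widened from the 24h default)
      reject_less_than_hours: 72
```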

wait_for_templates

A list of index templates that need to be present in Elasticsearch before the process starts

Default
- index_template

workers

The number of worker threads that logstash should use (auto = one thread per CPU)

Default
auto

redis

host

Redis host of the queue

key

Name of the queue to pull messages from

Default
logstash

port

Redis port of the queue

Default
6379
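
The three redis settings point the parser at its upstream queue; a manifest sketch (the host address is illustrative, the other two values restate the defaults):

```yaml
properties:
  redis:
    host: 10.0.16.5   # illustrative queue address
    port: 6379        # default
    key: logstash     # default queue name messages are pulled from
```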

Templates

Templates are rendered and placed onto corresponding instances during the deployment process. This job's templates will be placed into the /var/vcap/jobs/parser/ directory.

  • bin/monit_debugger (from bin/monit_debugger)
  • bin/parser_ctl (from bin/parser_ctl)
  • config/filters_override.conf (from config/filters_override.conf.erb)
  • config/filters_post.conf (from config/filters_post.conf.erb)
  • config/filters_pre.conf (from config/filters_pre.conf.erb)
  • config/input_and_output.conf (from config/input_and_output.conf.erb)
  • data/properties.sh (from data/properties.sh.erb)
  • helpers/ctl_setup.sh (from helpers/ctl_setup.sh)
  • helpers/ctl_utils.sh (from helpers/ctl_utils.sh)
  • logsearch/logs.yml (from logsearch/logs.yml)

Packages

Packages are compiled and placed onto corresponding instances during the deployment process. Packages will be placed into the /var/vcap/packages/ directory.