archiver job from logsearch/18

Github source: adff8e1b or master branch
Properties

archiver
  cron_schedule
    Schedule for pausing the archiver to upload log files
    Default: '*/30 * * * *'
  data_dir
    Directory for dumping log files
    Default: /var/vcap/store/archiver
s3
  access_key_id
    S3 Access Key ID
  bucket
    S3 Bucket
  endpoint
    S3 Endpoint
    Default: s3.amazonaws.com
  prefix
    S3 Prefix
    Default: ""
  secret_access_key
    S3 Secret Access Key
workers
  The number of worker threads that logstash should use (auto = one per CPU)
  Default: auto
redis
  host
    Redis host of queue
  key
    Name of queue to pull messages from
    Default: logstash
  port
    Redis port of queue
    Default: 6379
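Taken together, these properties might be set in a deployment manifest like the following sketch. All concrete values (bucket name, credentials, Redis host) are placeholders, not values from the release:

```yaml
# Hypothetical manifest excerpt for the archiver job.
# Credentials, bucket name, and Redis host below are placeholders.
properties:
  archiver:
    cron_schedule: '*/30 * * * *'   # default: upload log files every 30 minutes
    data_dir: /var/vcap/store/archiver
  s3:
    access_key_id: AKIAEXAMPLEKEY        # placeholder credential
    secret_access_key: EXAMPLESECRETKEY  # placeholder credential
    bucket: my-log-archive               # placeholder bucket name
    endpoint: s3.amazonaws.com
    prefix: ""
  workers: auto
  redis:
    host: 10.0.0.5                       # placeholder Redis host
    key: logstash
    port: 6379
```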
Templates

Templates are rendered and placed onto corresponding instances during the deployment process. This job's templates will be placed into the /var/vcap/jobs/archiver/ directory.

  bin/logstash.process (from bin/logstash.process.erb)
  bin/s3upload.cron (from bin/s3upload.cron.erb)
  config/logstash.conf (from config/logstash.conf.erb)
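Since bin/s3upload.cron is rendered from an ERB template, the cron_schedule property presumably ends up as the schedule field of a crontab entry. A hypothetical rendered result, with the default schedule and an assumed script path, might look like:

```
# Hypothetical rendered crontab entry; the invoked script path is an assumption.
*/30 * * * * root /var/vcap/jobs/archiver/bin/s3upload.cron
```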
Packages

Packages are compiled and placed onto corresponding instances during the deployment process. Packages will be placed into the /var/vcap/packages/ directory.