archiver job from logsearch/20

Github source: da7806ca or master branch
Properties

archiver
cron_schedule
  Schedule for pausing the archiver to upload log files
  - Default: '*/30 * * * *'

data_dir
  Directory for dumping log files
  - Default: /var/vcap/store/archiver

method
  Select the method for archiving
  - Default: s3
s3

access_key_id
  S3 Access Key ID
  - Default: ""

bucket
  S3 Bucket
  - Default: ""

endpoint
  S3 Endpoint
  - Default: s3.amazonaws.com

prefix
  S3 Prefix
  - Default: ""

secret_access_key
  S3 Secret Access Key
  - Default: ""
scp¶
destination¶Destination directory for the tranferred log files
- Default
""
host¶Host to transfer the log files to
- Default
""
port¶Port of the remote host
- Default
22
ssh_key¶Private ssh key in PEM format (file contents, not a path)
- Default
""
username¶If your remote username differs from the default root one
- Default
root
workers
  The number of worker threads that logstash should use (default: auto = one per CPU)
  - Default: auto
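Similarly, the scp method and the workers setting might be configured together as in this sketch. The remote host, destination path, and the ((...)) key variable are illustrative assumptions only.

  # Hypothetical manifest excerpt: shipping archives over scp instead of S3.
  properties:
    archiver:
      method: scp                      # switch from the default s3 uploader
      workers: 4                       # override auto (one thread per CPU)
      scp:
        host: backup.example.com       # placeholder remote host
        port: 22                       # default ssh port
        username: root                 # default remote user
        destination: /srv/log-archive  # placeholder target directory
        ssh_key: ((archiver_ssh_key))  # PEM file contents, not a path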
redis

host
  Redis host of queue

key
  Name of queue to pull messages from
  - Default: logstash

port
  Redis port of queue
  - Default: 6379
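The redis properties tell the job's logstash process where to pull queued log events from. A minimal sketch, assuming a queue reachable at the address shown; the host value is a placeholder.

  # Hypothetical manifest excerpt: pointing the archiver at the Redis queue.
  properties:
    redis:
      host: 10.0.16.10   # placeholder address of the Redis queue VM (no default)
      key: logstash      # default list the messages are pulled from
      port: 6379         # default Redis port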
Templates

Templates are rendered and placed onto corresponding instances during the deployment process. This job's templates will be placed into the /var/vcap/jobs/archiver/ directory.
- bin/logstash.process (from bin/logstash.process.erb)
- bin/s3upload.cron (from bin/s3upload.cron.erb)
- bin/scpupload.cron (from bin/scpupload.cron.erb)
- config/logstash.conf (from config/logstash.conf.erb)
- config/scpupload.pem (from config/scpupload.pem.erb)
- logsearch/metric-collector/files/collector (from logsearch/metric-collector/files/collector)
Packages

Packages are compiled and placed onto corresponding instances during the deployment process. Packages will be placed into the /var/vcap/packages/ directory.