Amazon S3 plugin for Fluentd
The s3 output plugin buffers event logs in local files and uploads them to S3 periodically.
This plugin splits files exactly by the time of the event logs (not the time when the logs are received). For example, if a log '2011-01-02 message A' arrives and then another log '2011-01-03 message B' arrives, the former is stored in the "20110102.gz" file and the latter in the "20110103.gz" file.
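The time-based splitting above is controlled through Fluentd's buffer section. A minimal, hedged output configuration sketch (the bucket name, region, and match pattern are placeholders, not values from this document):

```
<match my.logs.**>
  @type s3
  s3_bucket your-bucket-name   # assumption: replace with your bucket
  s3_region us-east-1          # assumption: replace with your region
  path logs/
  <buffer time>
    timekey 1d        # chunk by day of the event time, e.g. 20110102.gz
    timekey_wait 10m  # wait for late-arriving events before flushing
  </buffer>
</match>
```

With `timekey 1d`, events are grouped into daily chunks keyed by event time, which produces the per-day object names described above.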
The s3 input plugin reads data from S3 periodically. This plugin uses an SQS queue in the same region as the S3 bucket. You must set up the SQS queue and the S3 event notification before using this plugin.
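The input side can be sketched as follows. This is a minimal, hedged example; the bucket, region, queue name, and tag are placeholders you must replace, and the SQS queue must already receive S3 event notifications for the bucket:

```
<source>
  @type s3
  s3_bucket your-bucket-name   # assumption: bucket with event notification enabled
  s3_region us-east-1          # must match the SQS queue's region
  <sqs>
    queue_name your-queue-name # assumption: queue subscribed to the bucket's events
  </sqs>
  tag s3.input                 # tag attached to ingested events
</source>
```

The plugin polls the SQS queue, and each notification message tells it which new S3 object to fetch and emit as events.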
:warning: Be sure to keep a close eye on S3 costs, as a few users have reported unexpectedly high costs.
| fluent-plugin-s3 | fluentd    | ruby   |
|------------------|------------|--------|
| >= 1.0.0         | >= v0.14.0 | >= 2.1 |
| < 1.0.0          | >= v0.12.0 | >= 1.9 |
Simply use RubyGems:
```
# install latest version
$ gem install fluent-plugin-s3 --no-document # for fluentd v1.0 or later

# If you need to install a specific version, use the -v option
$ gem install fluent-plugin-s3 -v 1.3.0 --no-document

# For v0.12. This is for old v0.12 users. Don't use v0.12 for new deployments
$ gem install fluent-plugin-s3 -v "~> 0.8" --no-document # for fluentd v0.12
```
Both the S3 input and output plugins provide several credential methods for authentication/authorization.

See Configuration: credentials for details.
See Configuration: Output for details.
See Configuration: Input for details.
Tips and How to
See the Migration guide from v0.12 for details.
Website, license, et al.
|           |                             |
|-----------|-----------------------------|
| Copyright | (c) 2011 FURUHASHI Sadayuki |
| License   | Apache License, Version 2.0 |