Class: LogStash::Outputs::GoogleCloudStorage
- Inherits: LogStash::Outputs::Base
  - Object
  - LogStash::Outputs::Base
  - LogStash::Outputs::GoogleCloudStorage
- Defined in: lib/logstash/outputs/google_cloud_storage.rb
Overview
Summary: plugin to upload log events to Google Cloud Storage (GCS), rolling files based on the date pattern provided as a configuration setting. Events are written to files locally and, once a file is closed, this plugin uploads it to the configured bucket.
For more info on Google Cloud Storage, please go to: cloud.google.com/products/cloud-storage
This plugin requires a Google service account for authentication. For more information, please refer to: developers.google.com/storage/docs/authentication#service_accounts
Recommendation: experiment with the settings depending on how much log data you generate, so the uploader can keep up with the generated logs. Using gzip output can be a good option to reduce network traffic when uploading the log files, and it also reduces storage costs.
USAGE: This is an example of logstash config:

output {
  google_cloud_storage {
    bucket => "my_bucket"                          (required)
    json_key_file => "/path/to/privatekey.json"    (optional)
    temp_directory => "/tmp/logstash-gcs"          (optional)
    log_file_prefix => "logstash_gcs"              (optional)
    max_file_size_kbytes => 1024                   (optional)
    date_pattern => "%Y-%m-%dT%H:00"               (optional)
    flush_interval_secs => 2                       (optional)
    gzip => false                                  (optional)
    gzip_content_encoding => false                 (optional)
    uploader_interval_secs => 60                   (optional)
    upload_synchronous => false                    (optional)
  }
}
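The date_pattern above is an strftime-style format, so this example rolls to a new local file every hour. A minimal plain-Ruby sketch of that behavior (illustrative only, not the plugin's internal code; it assumes the pattern is applied with Time#strftime):

# Timestamps inside the same hour format to the same value, so their events
# land in the same local file; the next hour formats differently, which is
# when the plugin rolls to a new file and queues the old one for upload.
date_pattern = '%Y-%m-%dT%H:00'

Time.utc(2024, 1, 15, 10, 5).strftime(date_pattern)   # => "2024-01-15T10:00"
Time.utc(2024, 1, 15, 10, 59).strftime(date_pattern)  # => "2024-01-15T10:00"  (same file)
Time.utc(2024, 1, 15, 11, 0).strftime(date_pattern)   # => "2024-01-15T11:00"  (new file)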
Improvements TODO list:

- Support logstash event variables to determine filename.
- Turn Google API code into a Plugin Mixin (like AwsConfig).
- There’s no recover method, so if logstash/plugin crashes, files may not be uploaded to GCS.
- Allow user to configure file name.
Instance Attribute Summary

- #active ⇒ Object
  Returns the value of attribute active.
- #disable_uploader ⇒ Object
  Returns the value of attribute disable_uploader.
Instance Method Summary

- #close ⇒ Object
- #multi_receive_encoded(event_encoded_pairs) ⇒ Object
  Method called for incoming log events.
- #register ⇒ Object
Instance Attribute Details
#active ⇒ Object
Returns the value of attribute active.
# File 'lib/logstash/outputs/google_cloud_storage.rb', line 157

def active
  @active
end
#disable_uploader ⇒ Object
Returns the value of attribute disable_uploader.
# File 'lib/logstash/outputs/google_cloud_storage.rb', line 157

def disable_uploader
  @disable_uploader
end
Instance Method Details
#close ⇒ Object
# File 'lib/logstash/outputs/google_cloud_storage.rb', line 190

def close
  @logger.debug('Stopping the plugin, uploading the remaining files.')
  Stud.stop!(@uploader_thread) unless @uploader_thread.nil?

  # Force rotate the log. If it contains data it will be submitted
  # to the work pool and will be uploaded before the plugin stops.
  @log_rotater.rotate_log!
  @workers.stop!
end
#multi_receive_encoded(event_encoded_pairs) ⇒ Object
Method called for incoming log events. It writes the events to the current output file, flushing depending on the flush interval configuration.
# File 'lib/logstash/outputs/google_cloud_storage.rb', line 183

def multi_receive_encoded(event_encoded_pairs)
  encoded = event_encoded_pairs.map { |event, encoded| encoded }

  @logger.debug? && @logger.debug('Received events', :events => encoded)
  @log_rotater.write(*encoded)
end
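For intuition, each element of event_encoded_pairs is an [event, encoded_string] pair produced by the configured codec, and only the encoded strings are kept and written. A hypothetical plain-Ruby illustration of that mapping (the sample data is made up):

# Hypothetical pairs: [event_object, codec-encoded string].
event_encoded_pairs = [
  [:event_1, "{\"message\":\"first\"}\n"],
  [:event_2, "{\"message\":\"second\"}\n"]
]

encoded = event_encoded_pairs.map { |_event, encoded| encoded }
# => ["{\"message\":\"first\"}\n", "{\"message\":\"second\"}\n"]
# These strings are appended to the current local file by the log rotater,
# which flushes based on flush_interval_secs.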
#register ⇒ Object
# File 'lib/logstash/outputs/google_cloud_storage.rb', line 159

def register
  @logger.debug('Registering Google Cloud Storage plugin')

  # NOTE: this is a hacky solution to get around the fact that we used to
  # do our own pseudo-codec processing. This should be removed in the
  # next major release.
  params['codec'] = LogStash::Plugin.lookup('codec', 'json_lines').new if @output_format == 'json'
  params['codec'] = LogStash::Plugin.lookup('codec', 'line').new if @output_format == 'plain'

  @workers = LogStash::Outputs::Gcs::WorkerPool.new(@max_concurrent_uploads, @upload_synchronous)
  initialize_temp_directory
  initialize_path_factory
  initialize_log_rotater
  initialize_google_client

  start_uploader

  @content_type = @gzip ? 'application/gzip' : 'text/plain'
  @content_encoding = @gzip_content_encoding ? 'gzip' : 'identity'
end
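The two codec lookups preserve the plugin's legacy output_format behavior: 'json' selects the json_lines codec and 'plain' selects the line codec. As a rough plain-Ruby sketch of the difference in what gets written per event (illustrative only; it does not use logstash-core, and the plain rendering shown is a simplified stand-in for the line codec's format):

require 'json'

event = { 'message' => 'GET /index.html 200', 'host' => 'web-1' }

# output_format => "json" (json_lines codec): one JSON document per line.
json_line = event.to_json + "\n"      # {"message":"GET /index.html 200","host":"web-1"}

# output_format => "plain" (line codec): a single formatted line per event;
# here we simply take the message field for illustration.
plain_line = event['message'] + "\n"  # GET /index.html 200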