Class: LogStash::Codecs::JSONLines
- Inherits:
  - LogStash::Codecs::Base (ancestors: Object › LogStash::Codecs::Base › LogStash::Codecs::JSONLines)
- Extended by:
- PluginMixins::ValidatorSupport::FieldReferenceValidationAdapter
- Includes:
- PluginMixins::ECSCompatibilitySupport, PluginMixins::ECSCompatibilitySupport::TargetCheck, PluginMixins::EventSupport::EventFactoryAdapter, PluginMixins::EventSupport::FromJsonHelper
- Defined in:
- lib/logstash/codecs/json_lines.rb
Overview
This codec decodes streamed JSON that is newline delimited. Encoding emits a single JSON string terminated by `@delimiter`.

NOTE: Do not use this codec if your source input is line-oriented JSON, for example from the redis or file inputs; use the json codec instead. This codec expects to receive a stream (string) of newline-terminated lines, whereas the file input produces line strings without a trailing newline, so this codec cannot work with line-oriented inputs.
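A hedged illustration of that distinction (the port and path below are placeholders, not values from this plugin's documentation):

```
# Stream-oriented input: the codec splits the incoming byte stream on @delimiter.
input {
  tcp {
    port => 5044          # placeholder port
    codec => json_lines
  }
}

# Line-oriented input: the file input already emits one line per event,
# so the plain json codec is the right choice here.
input {
  file {
    path => "/var/log/app/events.json"   # placeholder path
    codec => json
  }
}
```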
Constant Summary
- DEFAULT_DECODE_SIZE_LIMIT_BYTES =
512 * (1024 * 1024) # 536,870,912 bytes (512MiB)
Instance Method Summary
Instance Method Details
#decode(data, &block) ⇒ Object
    # File 'lib/logstash/codecs/json_lines.rb', line 68

    def decode(data, &block)
      @buffer.extract(data).each do |line|
        parse_json(@converter.convert(line), &block)
      end
    end
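A minimal standalone sketch of the decode/flush behavior, not the plugin itself: it uses Ruby's stdlib JSON in place of the codec's tokenizer, charset converter, and event factory. Chunks accumulate in a buffer, each complete delimited line is parsed as one event, and an incomplete trailing fragment stays buffered until more data arrives or the stream is flushed.

```ruby
require "json"

# Sketch (hypothetical class, assuming a "\n" delimiter): buffer stream
# chunks, split on the delimiter, parse each complete line as JSON.
class NewlineJsonDecoder
  def initialize(delimiter = "\n")
    @delimiter = delimiter
    @buffer = +""
  end

  # Yields one parsed event (Hash) per complete delimited line.
  def decode(data)
    @buffer << data
    while (i = @buffer.index(@delimiter))
      line = @buffer.slice!(0, i + @delimiter.size).chomp(@delimiter)
      yield JSON.parse(line)
    end
  end

  # Parse whatever remains in the buffer (mirrors the codec's #flush).
  def flush
    remainder = @buffer
    @buffer = +""
    yield JSON.parse(remainder) unless remainder.empty?
  end
end

decoder = NewlineJsonDecoder.new
events = []
decoder.decode(%({"a":1}\n{"b":)) { |e| events << e }  # second line incomplete
decoder.decode(%(2}\n)) { |e| events << e }            # completes it
# events => [{"a"=>1}, {"b"=>2}]
```

This mirrors why the codec suits stream inputs: an event split across TCP packets is reassembled transparently, because parsing only happens once a delimiter (or a flush) is seen.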
#encode(event) ⇒ Object
    # File 'lib/logstash/codecs/json_lines.rb', line 74

    def encode(event)
      # Tack on a @delimiter for now because previously most of logstash's JSON
      # outputs emitted one per line, and whitespace is OK in json.
      @on_event.call(event, "#{event.to_json}#{@delimiter}")
    end
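The encode side reduces to serializing the event and appending the delimiter, so each event occupies exactly one line on the wire. A sketch with a plain Hash standing in for a Logstash event (`encode_line` is a hypothetical helper, not part of the plugin):

```ruby
require "json"

# Serialize an event-like Hash and append the delimiter, one event per line.
def encode_line(event, delimiter = "\n")
  "#{JSON.generate(event)}#{delimiter}"
end

payload = encode_line({ "message" => "hello", "count" => 1 })
# payload => "{\"message\":\"hello\",\"count\":1}\n"
```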
#flush(&block) ⇒ Object
    # File 'lib/logstash/codecs/json_lines.rb', line 80

    def flush(&block)
      remainder = @buffer.flush
      if !remainder.empty?
        parse_json(@converter.convert(remainder), &block)
      end
    end
#register ⇒ Object
    # File 'lib/logstash/codecs/json_lines.rb', line 59

    def register
      if original_params['decode_size_limit_bytes'].nil?
        deprecation_logger.deprecated "The default value for `decode_size_limit_bytes`, currently at 512Mb, will be lowered in a future version to prevent Out of Memory errors from abnormally large messages or missing delimiters. Please set a value that reflects the largest expected message size (e.g. 20971520 for 20Mb)"
      end
      @buffer = FileWatch::BufferedTokenizer.new(@delimiter, @decode_size_limit_bytes)
      @converter = LogStash::Util::Charset.new(@charset)
      @converter.logger = @logger
    end
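Following the deprecation guidance above, a pipeline can set `decode_size_limit_bytes` explicitly rather than rely on the 512MB default. A hedged config sketch (the port is a placeholder):

```
input {
  tcp {
    port => 5044   # placeholder port
    codec => json_lines {
      # 20971520 bytes = 20MB, the example value from the deprecation message;
      # pick a limit that covers your largest expected message.
      decode_size_limit_bytes => 20971520
    }
  }
}
```

Lines longer than this limit are rejected by the tokenizer instead of growing the buffer unboundedly, which is the Out of Memory protection the deprecation message refers to.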