Class: LogStash::Outputs::Kafka
- Inherits: LogStash::Outputs::Base
  - Object
  - LogStash::Outputs::Base
  - LogStash::Outputs::Kafka
- Defined in: lib/logstash/outputs/kafka.rb
Overview
Write events to a Kafka topic. This uses the Kafka Producer API to write messages to a topic on the broker.
The only required configuration is the topic name. The default codec is json, so events will be persisted on the broker in json format. If you select the plain codec, Logstash will encode each message with a timestamp and hostname in addition to the message itself. If you want nothing but the message itself to pass through, configure the output like this:
  output {
    kafka { codec => plain { format => "%{message}" } }
  }
For more information see kafka.apache.org/documentation.html#theproducer
Kafka producer configuration: kafka.apache.org/documentation.html#producerconfigs
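For example, a more complete output block might look like the sketch below. The broker address and topic name are placeholders; topic_id and broker_list are settings of this plugin (they back the @topic_id and @broker_list variables used in #register below).

  output {
    kafka {
      broker_list => "kafka01:9092"   # placeholder broker address
      topic_id => "logstash"          # the topic to write to; the only required setting
    }
  }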
Instance Method Summary
- #receive(event) ⇒ Object
- #register ⇒ Object
- #teardown ⇒ Object
Instance Method Details
#receive(event) ⇒ Object
  # File 'lib/logstash/outputs/kafka.rb', line 158

  def receive(event)
    return unless output?(event)
    if event == LogStash::SHUTDOWN
      finished
      return
    end
    @partition_key = if @partition_key_format.nil? then nil else event.sprintf(@partition_key_format) end
    @codec.encode(event)
    @partition_key = nil
  end
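Note that the partition key is recomputed per event via event.sprintf, so field references can be used to route events to partitions. A minimal sketch, assuming partition_key_format is exposed as a config option (it backs the @partition_key_format variable above); the field name is illustrative:

  output {
    kafka {
      topic_id => "logs"
      partition_key_format => "%{host}"   # illustrative: key each message by the event's host field
    }
  }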
#register ⇒ Object
  # File 'lib/logstash/outputs/kafka.rb', line 118

  def register
    LogStash::Logger.setup_log4j(@logger)

    options = {
      :broker_list => @broker_list,
      :compression_codec => @compression_codec,
      :compressed_topics => @compressed_topics,
      :request_required_acks => @request_required_acks,
      :serializer_class => @serializer_class,
      :partitioner_class => @partitioner_class,
      :request_timeout_ms => @request_timeout_ms,
      :producer_type => @producer_type,
      :key_serializer_class => @key_serializer_class,
      :message_send_max_retries => @message_send_max_retries,
      :retry_backoff_ms => @retry_backoff_ms,
      :topic_metadata_refresh_interval_ms => @topic_metadata_refresh_interval_ms,
      :queue_buffering_max_ms => @queue_buffering_max_ms,
      :queue_buffering_max_messages => @queue_buffering_max_messages,
      :queue_enqueue_timeout_ms => @queue_enqueue_timeout_ms,
      :batch_num_messages => @batch_num_messages,
      :send_buffer_bytes => @send_buffer_bytes,
      :client_id => @client_id
    }
    @producer = Kafka::Producer.new(options)
    @producer.connect

    @logger.info('Registering kafka producer', :topic_id => @topic_id, :broker_list => @broker_list)

    @codec.on_event do |event, data|
      begin
        @producer.send_msg(event.sprintf(@topic_id), @partition_key, data)
      rescue LogStash::ShutdownSignal
        @logger.info('Kafka producer got shutdown signal')
      rescue => e
        @logger.warn('kafka producer threw exception, restarting', :exception => e)
      end
    end
  end
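Each key in the options hash above mirrors a plugin setting of the same name, so producer tuning is done directly in the output block. A hedged sketch showing a few of them; the values are illustrative, not the plugin's defaults:

  output {
    kafka {
      topic_id => "logs"
      compression_codec => "snappy"   # feeds :compression_codec
      request_required_acks => 1      # feeds :request_required_acks
      producer_type => "async"        # feeds :producer_type
      batch_num_messages => 200       # feeds :batch_num_messages
    }
  }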
#teardown ⇒ Object
  # File 'lib/logstash/outputs/kafka.rb', line 169

  def teardown
    @producer.close
  end