Module: ScoutApm::Agent::Reporting
- Included in:
- ScoutApm::Agent
- Defined in:
- lib/scout_apm/agent/reporting.rb
Instance Method Summary
- #add_metric_ids(metrics) ⇒ Object
  Before reporting, look up the metric_id for each MetricMeta.
- #process_metrics ⇒ Object
  Called in the worker thread.
- #reporter ⇒ Object
- #run_samplers ⇒ Object
  Called from #process_metrics, which is run via the background worker.
Instance Method Details
#add_metric_ids(metrics) ⇒ Object
Before reporting, look up the metric_id for each MetricMeta. This speeds up reporting on the server side.
# File 'lib/scout_apm/agent/reporting.rb', line 63

def add_metric_ids(metrics)
  metrics.each do |meta, stats|
    if metric_id = metric_lookup[meta]
      meta.metric_id = metric_id
    end
  end
end
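The lookup above keys `metric_lookup` by the MetricMeta object itself. A minimal sketch of the same idea, keying by the metric name string for simplicity (the `MetricMeta` struct and names below are illustrative stand-ins, not the gem's classes):

```ruby
# Simplified stand-in for the agent's MetricMeta class.
MetricMeta = Struct.new(:metric_name, :metric_id)

# Assign server-known ids to each metric before reporting, so the
# server can skip name-based lookups. Here metric_lookup maps a
# metric name string to an integer id.
def add_metric_ids(metrics, metric_lookup)
  metrics.each do |meta, _stats|
    if (metric_id = metric_lookup[meta.metric_name])
      meta.metric_id = metric_id
    end
  end
end

lookup  = { "Controller/users/index" => 42 }
meta    = MetricMeta.new("Controller/users/index", nil)
metrics = { meta => { call_count: 3 } }

add_metric_ids(metrics, lookup)
meta.metric_id # => 42
```

Metrics without a cached id are left untouched; the server assigns them an id in its response, which the agent merges into `metric_lookup` (see #process_metrics).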
#process_metrics ⇒ Object
Called in the worker thread. Merges in-memory metrics with those on disk and reports the combined set to the server.
# File 'lib/scout_apm/agent/reporting.rb', line 11

def process_metrics
  logger.debug "Processing metrics"
  run_samplers
  capacity.process
  payload = layaway.deposit_and_deliver
  metrics = payload[:metrics]
  slow_transactions = payload[:slow_transactions]
  if payload.any?
    add_metric_ids(metrics)

    logger.warn "Some data may be lost - metric size is at limit" if metrics.size == ScoutApm::Store::MAX_SIZE

    # for debugging, count the total number of requests
    controller_count = 0
    metrics.each do |meta, stats|
      if meta.metric_name =~ /\AController/
        controller_count += stats.call_count
      end
    end

    logger.debug("Metrics: #{metrics}")
    logger.debug("SlowTrans: #{slow_transactions}")

    payload = ScoutApm::Serializers::PayloadSerializer.serialize(metrics, slow_transactions)
    slow_transactions_kb = Marshal.dump(slow_transactions).size/1024 # just for performance debugging

    logger.debug "#{config.value('name')} Delivering total payload [#{payload.size/1024} KB] for #{controller_count} requests and slow transactions [#{slow_transactions_kb} KB] for #{slow_transactions.size} transactions of durations: #{slow_transactions.map(&:total_call_time).join(',')}."

    response = reporter.report(payload)

    if response and response.is_a?(Net::HTTPSuccess)
      directives = ScoutApm::Serializers::DirectiveSerializer.deserialize(response.body)

      self.metric_lookup.merge!(directives[:metric_lookup])

      if directives[:reset]
        logger.info "Resetting metric_lookup."
        self.metric_lookup = Hash.new
      end
      logger.debug "Metric Cache Size: #{metric_lookup.size}"
    elsif response
      logger.warn "Error on checkin to #{reporter.uri.to_s}: #{response.inspect}"
    end
  end
rescue
  logger.warn "Error on checkin to #{reporter.uri.to_s}"
  logger.info $!.message
  logger.debug $!.backtrace
end
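The debug-only controller request count above can be sketched in isolation. `Meta` and `Stats` here are simplified stand-ins for the agent's metric classes:

```ruby
# Stand-ins for the agent's metric meta/stats pair.
Meta  = Struct.new(:metric_name)
Stats = Struct.new(:call_count)

metrics = {
  Meta.new("Controller/users/index") => Stats.new(3),
  Meta.new("ActiveRecord/User/find") => Stats.new(10),
  Meta.new("Controller/orders/show") => Stats.new(2),
}

# Only metrics whose name starts with "Controller" count as requests.
controller_count = 0
metrics.each do |meta, stats|
  if meta.metric_name =~ /\AController/
    controller_count += stats.call_count
  end
end

controller_count # => 5
```

The `\A` anchor matches the start of the whole string (not of each line, as `^` would), so only true `Controller/...` metrics are counted.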
#reporter ⇒ Object
# File 'lib/scout_apm/agent/reporting.rb', line 5

def reporter
  @reporter ||= ScoutApm::Reporter.new(:checkin, config, logger)
end
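The `||=` above memoizes the reporter: it is built on first access and the same object is reused afterwards. A minimal sketch of the pattern (`FakeReporter` and `AgentLike` are illustrative stand-ins, not the gem's classes):

```ruby
# Stand-in for ScoutApm::Reporter.
class FakeReporter
  attr_reader :type
  def initialize(type)
    @type = type
  end
end

class AgentLike
  # Built lazily on first call, cached in @reporter thereafter.
  def reporter
    @reporter ||= FakeReporter.new(:checkin)
  end
end

agent  = AgentLike.new
first  = agent.reporter
second = agent.reporter
first.equal?(second) # => true: the same object each time
```

One caveat of `||=` memoization in general: if the memoized expression could legitimately return `nil` or `false`, it would be re-evaluated on every call; that is not a concern here since a new object is always truthy.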
#run_samplers ⇒ Object
Called from #process_metrics, which is run via the background worker.
# File 'lib/scout_apm/agent/reporting.rb', line 72

def run_samplers
  @samplers.each do |sampler|
    begin
      result = sampler.run
      store.track!(sampler.metric_name, result, {:scope => nil}) if result
    rescue => e
      logger.info "Error reading #{sampler.human_name}"
      logger.debug e.message
      logger.debug e.backtrace.join("\n")
    end
  end
end
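The begin/rescue sits inside the loop, so one failing sampler is logged and skipped while the rest still run. A sketch of that error isolation (the sampler classes and the `tracked`/`errors` arrays are illustrative stand-ins for the agent's samplers and store):

```ruby
# A sampler that succeeds and returns a value.
GoodSampler = Struct.new(:value) do
  def run;         value;             end
  def metric_name; "Memory/Physical"; end
  def human_name;  "memory";          end
end

# A sampler that raises when run.
class FailingSampler
  def run;         raise "boom";      end
  def metric_name; "CPU/Utilization"; end
  def human_name;  "cpu";             end
end

tracked  = []
errors   = []
samplers = [FailingSampler.new, GoodSampler.new(512)]

samplers.each do |sampler|
  begin
    result = sampler.run
    tracked << [sampler.metric_name, result] if result
  rescue => e
    # The failure is recorded (logged in the real agent) and the
    # loop moves on to the next sampler.
    errors << "Error reading #{sampler.human_name}"
  end
end

tracked # => [["Memory/Physical", 512]]
errors  # => ["Error reading cpu"]
```

Had the rescue wrapped the whole `each` loop instead, the first failing sampler would have prevented every later sampler from reporting.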