Class: Sidekiq::Metrics::Query
- Inherits: Object
- Defined in: lib/sidekiq/metrics/query.rb
Overview
Allows caller to query for Sidekiq execution metrics within Redis. Caller sets a set of attributes to act as filters. #fetch will call Redis and return a Hash of results.
NB: all metrics and times/dates are UTC only. We explicitly do not support timezones.
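For orientation, a minimal usage sketch (the class name HardJob and the window sizes are illustrative, not from the source):

q = Sidekiq::Metrics::Query.new
overview = q.top_jobs(minutes: 30)           # per-minute buckets for the last 30 minutes
single   = q.for_job("HardJob", hours: 24)   # 10-minute buckets for the last 24 hours
overview.job_results.keys                    # job class names seen in the window
single.job_results["HardJob"]                # JobResult keyed by the "ms"/"p"/"f" metric names used in #for_job below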
Defined Under Namespace
Classes: JobResult, MarkResult, Result
Constant Summary
ROLLUPS =

# File 'lib/sidekiq/metrics/query.rb'

ROLLUPS = {
  # minutely aggregates per minute
  minutely: [60, ->(time) { time.strftime("j|%y%m%d|%-H:%M") }],
  # hourly aggregates every 10 minutes so we'll have six data points per hour
  hourly: [600, ->(time) {
    m = time.min
    mins = (m < 10) ? "0" : m.to_s[0]
    time.strftime("j|%y%m%d|%-H:#{mins}")
  }]
}
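To illustrate the bucket keys these procs produce (the timestamp is illustrative; outputs follow from the strftime formats above):

t = Time.utc(2024, 6, 5, 8, 43)
ROLLUPS[:minutely].last.call(t)  #=> "j|240605|8:43"   one key per minute
ROLLUPS[:hourly].last.call(t)    #=> "j|240605|8:4"    one key per 10-minute bucket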
Class Method Summary
- .bkt_time_s(time, granularity) ⇒ Object
Instance Method Summary
- #for_job(klass, minutes: nil, hours: nil) ⇒ Object
- #initialize(pool: nil, now: Time.now) ⇒ Query (constructor)
  A new instance of Query.
- #top_jobs(class_filter: nil, minutes: nil, hours: nil) ⇒ Object
  Get metric data for all jobs from the last hour.
  class_filter: return only results for classes matching filter
  minutes: the number of fine-grained minute buckets to retrieve
  hours: the number of coarser-grained 10-minute buckets to retrieve, in hours
Constructor Details
#initialize(pool: nil, now: Time.now) ⇒ Query
Returns a new instance of Query.
# File 'lib/sidekiq/metrics/query.rb', line 16

def initialize(pool: nil, now: Time.now)
  @time = now.utc
  @pool = pool || Sidekiq.default_configuration.redis_pool
  @klass = nil
end
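For example, a query anchored to a fixed point in time (the timestamp is illustrative; omitting pool: uses the default Redis pool as shown above):

q = Sidekiq::Metrics::Query.new(now: Time.utc(2024, 6, 5, 9, 0))
q.top_jobs(minutes: 15)   # buckets counted back from 09:00 UTC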
Class Method Details
.bkt_time_s(time, granularity) ⇒ Object
# File 'lib/sidekiq/metrics/query.rb', line 162

def self.bkt_time_s(time, granularity)
  # truncate time to ten minutes ("8:40", not "8:43") or one minute
  truncation = (granularity == :hourly) ? 600 : 60
  Time.at(time.to_i - time.to_i % truncation).utc.iso8601
end
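Example of the truncation (illustrative timestamps; results follow from the modulo arithmetic above):

Sidekiq::Metrics::Query.bkt_time_s(Time.utc(2024, 6, 5, 8, 43, 17), :hourly)   #=> "2024-06-05T08:40:00Z"
Sidekiq::Metrics::Query.bkt_time_s(Time.utc(2024, 6, 5, 8, 43, 17), :minutely) #=> "2024-06-05T08:43:00Z"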
Instance Method Details
#for_job(klass, minutes: nil, hours: nil) ⇒ Object
# File 'lib/sidekiq/metrics/query.rb', line 76

def for_job(klass, minutes: nil, hours: nil)
  time = @time
  minutes = 60 unless minutes || hours

  # DoS protection, sanity check
  minutes = 60 if minutes && minutes > 480
  hours = 72 if hours && hours > 72

  granularity = hours ? :hourly : :minutely
  result = Result.new(granularity)
  result.ends_at = time
  count = hours ? hours * 6 : minutes
  stride, keyproc = ROLLUPS[granularity]

  redis_results = @pool.with do |conn|
    conn.pipelined do |pipe|
      count.times do |idx|
        key = keyproc.call(time)
        pipe.hmget key, "#{klass}|ms", "#{klass}|p", "#{klass}|f"
        time -= stride
      end
    end
  end

  result.starts_at = time
  time = @time
  @pool.with do |conn|
    redis_results.each do |(ms, p, f)|
      result.job_results[klass].add_metric "ms", time, ms.to_i if ms
      result.job_results[klass].add_metric "p", time, p.to_i if p
      result.job_results[klass].add_metric "f", time, f.to_i if f
      result.job_results[klass].add_hist time, Histogram.new(klass).fetch(conn, time).reverse if minutes
      time -= stride
    end
  end

  result.marks = fetch_marks(result.starts_at..result.ends_at, granularity)

  result
end
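A usage sketch for a single class (the class name and window are illustrative; the attributes shown are the ones populated by the method above):

result = Sidekiq::Metrics::Query.new.for_job("HardJob", hours: 24)
result.starts_at                 # UTC time of the oldest bucket
result.ends_at                   # UTC time the query was anchored to
result.job_results["HardJob"]    # JobResult holding the "ms", "p" and "f" metrics
result.marks                     # deploy marks within starts_at..ends_at

Note that per-bucket histograms are only fetched for minute-level queries (the trailing `if minutes` guard in the loop above).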
#top_jobs(class_filter: nil, minutes: nil, hours: nil) ⇒ Object
Get metric data for all jobs from the last hour.
class_filter: return only results for classes matching filter
minutes: the number of fine-grained minute buckets to retrieve
hours: the number of coarser-grained 10-minute buckets to retrieve, in hours
# File 'lib/sidekiq/metrics/query.rb', line 37

def top_jobs(class_filter: nil, minutes: nil, hours: nil)
  time = @time
  minutes = 60 unless minutes || hours

  # DoS protection, sanity check
  minutes = 60 if minutes && minutes > 480
  hours = 72 if hours && hours > 72

  granularity = hours ? :hourly : :minutely
  result = Result.new(granularity)
  result.ends_at = time
  count = hours ? hours * 6 : minutes
  stride, keyproc = ROLLUPS[granularity]

  redis_results = @pool.with do |conn|
    conn.pipelined do |pipe|
      count.times do |idx|
        key = keyproc.call(time)
        pipe.hgetall key
        time -= stride
      end
    end
  end

  result.starts_at = time
  time = @time
  redis_results.each do |hash|
    hash.each do |k, v|
      kls, metric = k.split("|")
      next if class_filter && !class_filter.match?(kls)
      result.job_results[kls].add_metric metric, time, v.to_i
    end
    time -= stride
  end

  result.marks = fetch_marks(result.starts_at..result.ends_at, granularity)

  result
end
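A usage sketch with a class filter (the regexp is illustrative; any object responding to match? works, per the class_filter check above):

result = Sidekiq::Metrics::Query.new.top_jobs(class_filter: /\ABilling::/)
result.job_results.keys    # only classes whose names match the filter
result.marks               # deploy marks within starts_at..ends_at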