Class: Roundhouse::ProcessSet
- Inherits: Object
- Includes: Enumerable
- Defined in: lib/roundhouse/api.rb
Overview
Enumerates the set of Roundhouse processes which are actively working right now. Each process sends a heartbeat to Redis every 5 seconds, so this set should be relatively accurate, barring network partitions.
Yields a Roundhouse::Process.
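Because the class includes Enumerable, defining #each is all it takes to get map, select, sum, and the rest of the Enumerable API on a process set. The sketch below illustrates that mechanism with a self-contained stand-in; FakeProcessSet and its hash entries are hypothetical, not part of Roundhouse.

```ruby
# FakeProcessSet is a hypothetical stand-in for Roundhouse::ProcessSet:
# it holds plain hashes instead of live Redis-backed Process records,
# but gains the same Enumerable surface by defining #each.
class FakeProcessSet
  include Enumerable

  def initialize(processes)
    @processes = processes
  end

  def each(&block)
    @processes.each(&block)
  end
end

set = FakeProcessSet.new([
  { 'hostname' => 'web1', 'busy' => 3 },
  { 'hostname' => 'web2', 'busy' => 0 }
])

# Enumerable methods come for free once #each exists.
total_busy = set.sum { |p| p['busy'] }                                     # 3
idle_hosts = set.select { |p| p['busy'].zero? }.map { |p| p['hostname'] }  # ["web2"]
```

The real ProcessSet works the same way: any Enumerable query you can write against an array of hashes works against the live set.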
Class Method Summary
- .cleanup ⇒ Object
  Cleans up dead processes recorded in Redis.
Instance Method Summary
- #each ⇒ Object
- #initialize(clean_plz = true) ⇒ ProcessSet constructor
  A new instance of ProcessSet.
- #size ⇒ Object
  This method is not guaranteed accurate since it does not prune the set based on current heartbeats.
Constructor Details
#initialize(clean_plz = true) ⇒ ProcessSet
Returns a new instance of ProcessSet.
# File 'lib/roundhouse/api.rb', line 680

def initialize(clean_plz = true)
  self.class.cleanup if clean_plz
end
Class Method Details
.cleanup ⇒ Object
Cleans up dead processes recorded in Redis. Returns the number of processes cleaned.
# File 'lib/roundhouse/api.rb', line 686

def self.cleanup
  count = 0
  Roundhouse.redis do |conn|
    procs = conn.smembers('processes').sort
    heartbeats = conn.pipelined do
      procs.each do |key|
        conn.hget(key, 'info')
      end
    end

    # the hash named key has an expiry of 60 seconds.
    # if it's not found, that means the process has not reported
    # in to Redis and probably died.
    to_prune = []
    heartbeats.each_with_index do |beat, i|
      to_prune << procs[i] if beat.nil?
    end
    count = conn.srem('processes', to_prune) unless to_prune.empty?
  end
  count
end
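The pruning step can be pictured without a live Redis: given the sorted member list and the pipelined hget results (nil wherever the 60-second heartbeat key has expired), collect the stale members and remove them. In this sketch, members and heartbeat are hypothetical plain-Ruby stand-ins for the Redis set and hashes.

```ruby
# Stand-ins for Redis state: the 'processes' set members and the surviving
# per-process heartbeat hashes (a missing entry models an expired 60s key).
members   = ['host1:100', 'host2:200', 'host3:300']
heartbeat = { 'host1:100' => '{"pid":100}', 'host3:300' => '{"pid":300}' }

procs = members.sort
# Mirrors the pipelined hget: one result per member, nil when the key expired.
beats = procs.map { |key| heartbeat[key] }

to_prune = []
beats.each_with_index do |beat, i|
  to_prune << procs[i] if beat.nil?
end

members -= to_prune  # stand-in for conn.srem('processes', to_prune)
# host2:200 never refreshed its heartbeat, so it is pruned from the set.
```

The index-based pairing works because a Redis pipeline returns replies in the same order the commands were queued, so beats[i] always corresponds to procs[i].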
Instance Method Details
#each ⇒ Object
# File 'lib/roundhouse/api.rb', line 708

def each
  procs = Roundhouse.redis { |conn| conn.smembers('processes') }.sort

  Roundhouse.redis do |conn|
    # We're making a tradeoff here between consuming more memory instead of
    # making more roundtrips to Redis, but if you have hundreds or thousands of
    # workers, you'll be happier this way
    result = conn.pipelined do
      procs.each do |key|
        conn.hmget(key, 'info', 'busy', 'beat')
      end
    end

    result.each do |info, busy, at_s|
      hash = Roundhouse.load_json(info)
      yield Process.new(hash.merge('busy' => busy.to_i, 'beat' => at_s.to_f))
    end
  end

  nil
end
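The merge at the end of #each can be sketched standalone: each pipelined hmget returns an [info, busy, beat] triple of strings, and the JSON info hash is augmented with the coerced live counters. This sketch uses the stdlib JSON parser in place of Roundhouse.load_json and a plain Hash in place of Process; both substitutions, and the sample values, are assumptions for illustration.

```ruby
require 'json'

# One [info, busy, beat] triple per process, as the pipelined hmget returns.
# All three fields arrive from Redis as strings.
result = [
  ['{"hostname":"web1","pid":100}', '3', '1462658920.123']
]

processes = result.map do |info, busy, at_s|
  hash = JSON.parse(info)  # stand-in for Roundhouse.load_json(info)
  # Coerce the string counters: busy to an Integer, beat to a Float timestamp.
  hash.merge('busy' => busy.to_i, 'beat' => at_s.to_f)
end

processes.first['busy']  # 3
processes.first['pid']   # 100, from the JSON info blob
```

The coercion matters because Redis hash fields are always strings; without to_i and to_f, arithmetic on 'busy' and 'beat' would concatenate or raise rather than compute.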
#size ⇒ Object
This method is not guaranteed to be accurate since it does not prune the set based on current heartbeats. #each does that and ensures the set only contains Roundhouse processes which have sent a heartbeat within the last 60 seconds.
# File 'lib/roundhouse/api.rb', line 734

def size
  Roundhouse.redis { |conn| conn.scard('processes') }
end
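The discrepancy is easy to picture: scard counts raw set members, while #each skips members whose heartbeat key has expired. In this illustration, registry and heartbeats are hypothetical plain-Ruby stand-ins for the Redis set and hash keys.

```ruby
# The raw 'processes' set still lists a dead process...
registry   = ['host1:100', 'host2:200', 'host3:300']
# ...but only two heartbeat hashes survive the 60-second expiry.
heartbeats = { 'host1:100' => '{}', 'host3:300' => '{}' }

size_result = registry.size                                   # what scard reports
each_result = registry.count { |key| !heartbeats[key].nil? }  # what #each would yield
# size_result overcounts by one until cleanup prunes the stale member.
```

If you need an exact live count, iterate (e.g. ProcessSet.new.count) instead of calling #size; the constructor's default cleanup also narrows the gap.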