Class: SolidQueue::RecurringExecution
Defined Under Namespace
Classes: AlreadyRecorded
Class Method Summary
- .clear_in_batches(batch_size: 500) ⇒ Object
- .create_or_insert!(**attributes) ⇒ Object
- .record(task_key, run_at, &block) ⇒ Object
Methods inherited from Execution
create_all_from_jobs, #discard, discard_all_from_jobs, discard_all_in_batches, execution_data_from_jobs, type, #type
Methods inherited from Record
non_blocking_lock, supports_insert_conflict_target?
Class Method Details
.clear_in_batches(batch_size: 500) ⇒ Object
```ruby
# File 'app/models/solid_queue/recurring_execution.rb', line 35

def clear_in_batches(batch_size: 500)
  loop do
    records_deleted = clearable.limit(batch_size).delete_all
    break if records_deleted == 0
  end
end
```
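The loop issues one `DELETE ... LIMIT batch_size` per iteration and stops when a batch removes no rows, so the table can be purged without loading records into memory or holding long-lived locks. A minimal usage sketch; the batch size shown is an arbitrary override of the default:

```ruby
# Purge all clearable recurring executions, up to 1_000 rows per DELETE.
# Smaller batches hold row locks for shorter periods on busy tables.
SolidQueue::RecurringExecution.clear_in_batches(batch_size: 1_000)
```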
.create_or_insert!(**attributes) ⇒ Object
```ruby
# File 'app/models/solid_queue/recurring_execution.rb', line 10

def create_or_insert!(**attributes)
  if supports_insert_conflict_target?
    # PostgreSQL fails and aborts the current transaction when it hits a duplicate key
    # conflict during two concurrent INSERTs for the same value of a unique index. We need
    # to explicitly indicate unique_by to ignore duplicate rows by this value when inserting.
    unless insert(attributes, unique_by: [ :task_key, :run_at ]).any?
      raise AlreadyRecorded
    end
  else
    create!(**attributes)
  end
rescue ActiveRecord::RecordNotUnique
  raise AlreadyRecorded
end
```
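Either way, a second insert for the same `task_key`/`run_at` pair surfaces as `AlreadyRecorded` rather than an adapter-specific `ActiveRecord::RecordNotUnique`, so callers can handle duplicates uniformly. A hedged sketch; the attribute values are hypothetical, and it assumes a persisted `SolidQueue::Job` row with the given `job_id` exists, since executions reference jobs:

```ruby
attrs = {
  job_id:   42,  # hypothetical: must point at an existing SolidQueue::Job
  task_key: "periodic_cleanup",
  run_at:   Time.current.beginning_of_minute
}

SolidQueue::RecurringExecution.create_or_insert!(**attrs)

begin
  # Same task_key/run_at pair: the unique index rejects the row.
  SolidQueue::RecurringExecution.create_or_insert!(**attrs)
rescue SolidQueue::RecurringExecution::AlreadyRecorded
  # Another process already recorded this run; safe to skip.
end
```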
.record(task_key, run_at, &block) ⇒ Object
```ruby
# File 'app/models/solid_queue/recurring_execution.rb', line 25

def record(task_key, run_at, &block)
  transaction do
    block.call.tap do |active_job|
      if active_job && active_job.successfully_enqueued?
        create_or_insert!(job_id: active_job.provider_job_id, task_key: task_key, run_at: run_at)
      end
    end
  end
end
```
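The block is expected to enqueue an Active Job and return it; `record` wraps both the enqueue and the bookkeeping row in one transaction and only writes the execution when `successfully_enqueued?` is true. A sketch of how a caller might use it, where `MyRecurringJob` and the task key are hypothetical names:

```ruby
run_at = Time.current.beginning_of_minute

SolidQueue::RecurringExecution.record("my_task", run_at) do
  # Return the enqueued Active Job instance; record reads
  # #successfully_enqueued? and #provider_job_id from it.
  MyRecurringJob.perform_later
end
```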