Class: DaimonSkycrawlers::Crawler::Default

Inherits:
Base
  • Object
Defined in:
lib/daimon_skycrawlers/crawler/default.rb

Overview

The default crawler

This crawler can GET or POST a given URL and store the response in storage.
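
For orientation, here is a minimal setup sketch. The base URL constructor argument and the DaimonSkycrawlers.register_crawler call are assumptions drawn from typical project setup, not from this page; check DaimonSkycrawlers::Crawler::Base#initialize and the project README for the exact signatures.

require "daimon_skycrawlers/crawler"
require "daimon_skycrawlers/crawler/default"

# The base URL argument and register_crawler call are assumptions (see note above).
crawler = DaimonSkycrawlers::Crawler::Default.new("http://example.com")
DaimonSkycrawlers.register_crawler(crawler)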

Instance Attribute Summary

Attributes inherited from Base

#n_processed_urls, #storage

Instance Method Summary

Methods inherited from Base

#connection, #get, #initialize, #post, #prepare, #process, #setup_connection, #skipped?

Methods included from DaimonSkycrawlers::Configurable

#configure

Methods included from DaimonSkycrawlers::Callbacks

#after_process, #before_process, #clear_after_process_callbacks, #clear_before_process_callbacks, #run_after_process_callbacks, #run_before_process_callbacks

Constructor Details

This class inherits a constructor from DaimonSkycrawlers::Crawler::Base

Instance Method Details

#fetch(url, message) ⇒ Faraday::Response

GET or POST the given URL

Parameters:

  • url (String)

    URI or path

  • message (Hash)

    request options; recognized keys are :method ("GET" or "POST", defaults to "GET") and :params (request parameters, defaults to an empty Hash)

Returns:

  • (Faraday::Response)

    HTTP response



# File 'lib/daimon_skycrawlers/crawler/default.rb', line 19

def fetch(url, message)
  params = message[:params] || {}
  method = message[:method] || "GET"
  if method == "POST"
    post(url, params)
  else
    get(url, params)
  end
end
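
As the source above shows, the message hash drives the request: :method selects "POST" explicitly (any other value falls back to GET) and :params is passed through as request parameters. A short usage sketch follows; the crawler instance and its base URL are assumptions, as noted in the Overview.

crawler = DaimonSkycrawlers::Crawler::Default.new("http://example.com")

# GET http://example.com/search with query parameters
crawler.fetch("/search", { params: { q: "ruby" } })

# POST http://example.com/login with form parameters
crawler.fetch("/login", { method: "POST", params: { user: "alice", password: "secret" } })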