Module: Operation::Great::Justice
Defined in:
- lib/operation/great/justice.rb
- lib/operation/great/justice/version.rb
Constant Summary
- VERSION = "0.0.1"
Class Method Summary
- .adjectives ⇒ Object
- .generate(token) ⇒ Object
  Given a token, try to deterministically generate a code name, which will always be the same for any given token.
- .nouns ⇒ Object
Class Method Details
.adjectives ⇒ Object
# File 'lib/operation/great/justice.rb', line 41

def adjectives
  @adjectives ||= File.read(File.expand_path('../../../../adjectives.txt', __FILE__)).split("\n")
end
.generate(token) ⇒ Object
Given a token, try to deterministically generate a code name, which will always be the same for any given token.
# File 'lib/operation/great/justice.rb', line 11

def generate(token)
  # Split the token into two equal length arrays of bytes
  first = []
  last = []
  token.bytes.each_slice(2) { |a| first << a.first; last << a.last }

  puts "first: #{first}, last: #{last}"

  # Do a mind-numbingly simple XOR hash of each array.
  first = first.inject(0) { |h,i| h ^ i }
  last = last.inject(0) { |h,i| h ^ i }

  # Scale the hash to the size of each word list
  first = (first.to_f / 255 * adjectives.size).to_i
  last = (last.to_f / 255 * nouns.size).to_i

  # Collect our words
  first = adjectives[first]
  last = nouns[last]

  # Capitalize our words
  first = first.slice(0,1).capitalize + first.slice(1..-1)
  last = last.slice(0,1).capitalize + last.slice(1..-1)

  "Operation #{first} #{last}"
end
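A minimal usage sketch, assuming the gem is installed and required as operation/great/justice; the token string below is made up, and the exact adjective/noun pair depends on the bundled word lists, but any given token always maps to the same name:

require 'operation/great/justice'

name = Operation::Great::Justice.generate("3f9a2c")
# => "Operation <Adjective> <Noun>"  (actual words depend on adjectives.txt / nouns.txt)
# Note: generate also prints the intermediate byte arrays via puts.

# Determinism: the same token always produces the same code name.
Operation::Great::Justice.generate("3f9a2c") == name  # => true

To make the scaling step concrete: each XOR hash falls in 0..255, so a hash of, say, 104 against a hypothetical 500-word adjective list selects index (104.0 / 255 * 500).to_i, i.e. 203.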
.nouns ⇒ Object
# File 'lib/operation/great/justice.rb', line 37

def nouns
  @nouns ||= File.read(File.expand_path('../../../../nouns.txt', __FILE__)).split("\n")
end
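Both word lists are plain newline-separated text files (adjectives.txt and nouns.txt) resolved relative to the library file at the gem root and memoized on first read. A small sketch of inspecting them; the results are placeholders, since the actual contents depend on the files shipped with the gem:

require 'operation/great/justice'

Operation::Great::Justice.adjectives  # => Array of words, one per line of adjectives.txt
Operation::Great::Justice.nouns.size  # => word count of nouns.txt (depends on the file)
# Subsequent calls reuse the memoized arrays, so each file is read at most once per process.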