Module: Repper::Tokenizer
- Defined in: lib/repper/tokenizer.rb
Class Method Summary
- .call(regexp, delimiters: ['/', '/'], flags: nil) ⇒ Object
- .flatten(exp, acc = [], delimiters: nil, flags: nil) ⇒ Object
  Turn a Regexp::Parser AST back into a flat Array of visual elements that match the Regexp notation.
- .make_token(exp, text) ⇒ Object
Class Method Details
.call(regexp, delimiters: ['/', '/'], flags: nil) ⇒ Object
```ruby
# File 'lib/repper/tokenizer.rb', line 7

def call(regexp, delimiters: ['/', '/'], flags: nil)
  tree = Regexp::Parser.parse(regexp, options: flags =~ /x/ && Regexp::EXTENDED)
  flatten(tree, delimiters: delimiters, flags: flags)
rescue ::Regexp::Parser::Error => e
  raise e.extend(Repper::Error)
end
```
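The `options:` argument above relies on a common Ruby idiom: `flags =~ /x/` is truthy (a match index) only when the flag string contains `x`, so the `&&` chain yields `Regexp::EXTENDED` or a falsey value. A minimal, self-contained sketch of that idiom (the `extended_option` helper is hypothetical, not part of Repper):

```ruby
# Mirrors the `flags =~ /x/ && Regexp::EXTENDED` expression in .call:
# returns Regexp::EXTENDED when the flag string includes "x", else a
# falsey value (nil), without raising on a nil flag string.
def extended_option(flags)
  flags =~ /x/ && Regexp::EXTENDED
end

p extended_option("xi") # => 2 (Regexp::EXTENDED)
p extended_option("i")  # => nil
p extended_option(nil)  # => nil  (NilClass#=~ returns nil)
```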
.flatten(exp, acc = [], delimiters: nil, flags: nil) ⇒ Object
Turn a Regexp::Parser AST back into a flat Array of visual elements that match the Regexp notation.
```ruby
# File 'lib/repper/tokenizer.rb', line 16

def flatten(exp, acc = [], delimiters: nil, flags: nil)
  # Add opening entry.
  exp.is?(:root) && acc << make_token(exp, delimiters[0])

  # Ignore nesting of invisible intermediate branches for better visuals.
  exp.is?(:sequence) && exp.nesting_level -= 1

  exp.parts.each do |part|
    if part.instance_of?(::String)
      acc << make_token(exp, part)
    else # part.is_a?(Regexp::Expression::Base)
      flatten(part, acc)
    end
  end

  exp.quantified? && flatten(exp.quantifier, acc)

  # Add closing entry.
  exp.is?(:root) && begin
    flags ||= exp.options.keys.join
    acc << make_token(exp, "#{delimiters[1]}#{flags.chars.uniq.sort.join}")
  end

  acc
end
```
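The core of `.flatten` is a depth-first walk that emits a token for each literal part and recurses into sub-expressions. A self-contained sketch of that pattern, using hypothetical `Node`/`Token` structs in place of `Regexp::Parser` expressions and Repper's real `Token`:

```ruby
# Hypothetical stand-ins: a Node holds a mix of String parts and child
# Nodes, much like a Regexp::Expression holds strings and sub-expressions.
Token = Struct.new(:type, :text, :level, keyword_init: true)
Node  = Struct.new(:type, :level, :parts, keyword_init: true)

# Depth-first flattening: Strings become Tokens tagged with the parent
# node's type and nesting level; child Nodes are recursed into, sharing
# the same accumulator so the result stays flat.
def flatten(node, acc = [])
  node.parts.each do |part|
    if part.is_a?(String)
      acc << Token.new(type: node.type, text: part, level: node.level)
    else
      flatten(part, acc)
    end
  end
  acc
end

group = Node.new(type: :group, level: 1, parts: ["(", "ab", ")"])
root  = Node.new(type: :root,  level: 0, parts: [group, "c"])
p flatten(root).map(&:text) # => ["(", "ab", ")", "c"]
```

The accumulator-passing style matches the real method: recursion appends to one shared `acc`, so the output order is exactly the left-to-right reading order of the regexp notation.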
.make_token(exp, text) ⇒ Object
```ruby
# File 'lib/repper/tokenizer.rb', line 42

def make_token(exp, text)
  Token.new(
    type: exp.type,
    subtype: exp.token,
    level: exp.nesting_level,
    text: text,
    id: exp.respond_to?(:identifier) && exp.identifier,
  )
end
```
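The `id:` value uses the `respond_to? && send` guard: only some expression classes (e.g. named groups) define `identifier`, so the token gets the identifier when present and `false` otherwise, without ever raising `NoMethodError`. A small self-contained illustration (the `Named` struct and `id_for` lambda are hypothetical):

```ruby
# An object that has an identifier, and one that does not.
Named  = Struct.new(:identifier)
id_for = ->(exp) { exp.respond_to?(:identifier) && exp.identifier }

p id_for.call(Named.new("foo")) # => "foo"
p id_for.call(Object.new)       # => false (guard short-circuits)
```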