Method: EBNF::LL1::Lexer.tokenize

Defined in:
lib/ebnf/ll1/lexer.rb

.tokenize(input, terminals, **options) {|lexer| ... } ⇒ Lexer

Tokenizes the given `input` string or stream.

Parameters:

  • input (String, #to_s)
  • terminals (Array<Array<Symbol, Regexp>>)

    Array of symbol, regexp pairs used to match terminals. If the symbol is nil, it defines a Regexp to match string terminals.

  • options (Hash{Symbol => Object})

Yields:

  • (lexer)

Yield Parameters:

  • lexer (Lexer)

Returns:

  • (Lexer)

Raises:

  • (Lexer::Error) — on invalid input


# File 'lib/ebnf/ll1/lexer.rb', line 101

def self.tokenize(input, terminals, **options, &block)
  lexer = self.new(input, terminals, **options)
  block_given? ? block.call(lexer) : lexer
end
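
To illustrate the terminal-pair convention described above — an array of `[symbol, regexp]` pairs, where a `nil` symbol marks string terminals — here is a minimal, self-contained sketch in plain Ruby using the standard-library `StringScanner`. It imitates the matching behavior only; it is not the `ebnf` gem's implementation, and the `simple_tokenize` name and `TERMINALS` table are hypothetical.

```ruby
require 'strscan'

# Hypothetical terminal table in the [symbol, regexp] pair format:
# a nil symbol means the matched string itself is the terminal.
TERMINALS = [
  [:INTEGER, /\d+/],
  [:IDENT,   /[A-Za-z_]\w*/],
  [nil,      /[+*()=]/]
]

# Sketch of a tokenizer loop: try each terminal pair in order at the
# current scan position, emitting [symbol, lexeme] tokens.
def simple_tokenize(input, terminals)
  scanner = StringScanner.new(input)
  tokens = []
  loop do
    scanner.skip(/\s+/)            # skip whitespace between tokens
    break if scanner.eos?
    sym = str = nil
    terminals.each do |s, re|
      if (m = scanner.scan(re))    # first pair whose regexp matches wins
        sym, str = s, m
        break
      end
    end
    raise "no terminal matches #{scanner.rest.inspect}" unless str
    tokens << [sym || str, str]    # nil symbol: use the string itself
  end
  tokens
end

p simple_tokenize("x = 42 + y1", TERMINALS)
# => [[:IDENT, "x"], ["=", "="], [:INTEGER, "42"], ["+", "+"], [:IDENT, "y1"]]
```

Terminal order matters: because the first matching pair wins, more specific regexps should precede more general ones, mirroring how an LL(1) lexer disambiguates overlapping terminals.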