Class: CTokenizer::LexerBase

Inherits:
Object
Includes:
CTokenizer
Defined in:
lib/dbc/ctokenizer.rb

Overview

Wraps a lexer and uses it to produce new tokens.
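Subclasses typically override #shift to transform or filter the tokens pulled from the wrapped source. A minimal sketch, assuming tokens are [type, text] pairs; the subclass name and the :IDENTIFIER token type are illustrative, not part of the documented API:

require 'dbc/ctokenizer'

# Hypothetical filter built on LexerBase: identifier text is upcased,
# everything else passes through unchanged.
class UpcasingLexer < CTokenizer::LexerBase
	def shift
		t = super                                   # pull the next token from the wrapped source
		return t unless t && t[0] == :IDENTIFIER   # assumed [type, text] token layout
		[t[0], t[1].upcase]
	end
end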

Instance Method Summary

Methods included from CTokenizer

check_string, check_token, #collect, create_newlines, #each, error, #error, join, line_count, #parse_error, split, split_token, #to_a, #token_error, #warning, whitespace?

Constructor Details

#initialize(str, file = nil, line = 1) ⇒ LexerBase

Returns a new instance of LexerBase.



# File 'lib/dbc/ctokenizer.rb', line 234

def initialize(str, file=nil, line=1)
	if (str.class <= String)
		@source = Lexer.new(str, file, line)
	else
		@source = str
	end
end
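
Because the constructor accepts either a String (which it wraps in a Lexer) or an already-constructed token source, wrapper lexers can be stacked. A small usage sketch; the sample C source is illustrative:

require 'dbc/ctokenizer'

# From raw source text: a Lexer is created internally.
base = CTokenizer::LexerBase.new("int x = 0;\n", "example.c", 1)

# From another tokenizer: the object is used as the source as-is,
# which is how one LexerBase can wrap another.
wrapped = CTokenizer::LexerBase.new(base)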

Instance Method Details

#empty? ⇒ Boolean

Returns:

  • (Boolean)


# File 'lib/dbc/ctokenizer.rb', line 259

def empty?
	@source.empty?
end

#file ⇒ Object



# File 'lib/dbc/ctokenizer.rb', line 242

def file
	@source.file
end

#line ⇒ Object



# File 'lib/dbc/ctokenizer.rb', line 245

def line
	@source.line
end

#peek_nonspace ⇒ Object



# File 'lib/dbc/ctokenizer.rb', line 249

def peek_nonspace
	@source.peek_nonspace
end

#shift ⇒ Object



# File 'lib/dbc/ctokenizer.rb', line 253

def shift
	t = @source.shift
	CTokenizer.check_token(t)
	t
end
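
Together with #empty? and #peek_nonspace, #shift supports the usual pull loop over a token stream. A minimal sketch, assuming the same String-based construction as above; how each token is structured depends on the underlying Lexer:

require 'dbc/ctokenizer'

tokens = CTokenizer::LexerBase.new("int x = 0;\n", "example.c")

until tokens.empty?
	ahead = tokens.peek_nonspace   # look past whitespace without consuming anything
	token = tokens.shift           # consume the next token (validated by check_token)
	# ... dispatch on token / ahead here ...
end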