Module: TokenStream
- Included in:
- RDoc::AnyMethod, RDoc::RubyParser
- Defined in:
- lib/rdoc/tokenstream.rb
Overview
A TokenStream is a list of tokens, gathered during the parse of some entity (for example, a method). Entities populate these streams by being registered with the lexer. Any class can collect tokens by including TokenStream. From the outside, you use such an object by calling the start_collecting_tokens method, followed by calls to add_token and pop_token.
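The collection pattern above can be sketched as follows. The module body mirrors the method definitions listed below; the `MethodEntity` collector class is hypothetical, used only for illustration.

```ruby
# Minimal sketch of the TokenStream mix-in, reproduced from the
# method listings in this page.
module TokenStream
  # Returns the collected token list.
  def token_stream
    @token_stream
  end

  # Resets the stream to an empty list.
  def start_collecting_tokens
    @token_stream = []
  end

  # Appends a single token.
  def add_token(tk)
    @token_stream << tk
  end

  # Appends each token in tks.
  def add_tokens(tks)
    tks.each { |tk| add_token(tk) }
  end

  # Removes and returns the most recently added token.
  def pop_token
    @token_stream.pop
  end
end

# Hypothetical entity class that collects tokens by including the module.
class MethodEntity
  include TokenStream
end

entity = MethodEntity.new
entity.start_collecting_tokens
entity.add_token(:def)
entity.add_tokens([:my_method, :end])
entity.pop_token                 # discard the trailing :end token
puts entity.token_stream.inspect # => [:def, :my_method]
```

Because the stream is a plain Array, pop_token lets a parser retract a token it added speculatively before committing the stream.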
Instance Method Summary
- #add_token(tk) ⇒ Object — Appends a single token to the collected stream.
- #add_tokens(tks) ⇒ Object — Appends each token in tks to the stream.
- #pop_token ⇒ Object — Removes and returns the most recently added token.
- #start_collecting_tokens ⇒ Object — Resets the stream to an empty list.
- #token_stream ⇒ Object — Returns the collected token list.
Instance Method Details
#add_token(tk) ⇒ Object
```ruby
# File 'lib/rdoc/tokenstream.rb', line 16
def add_token(tk)
  @token_stream << tk
end
```
#add_tokens(tks) ⇒ Object
```ruby
# File 'lib/rdoc/tokenstream.rb', line 19
def add_tokens(tks)
  tks.each { |tk| add_token(tk) }
end
```
#pop_token ⇒ Object
```ruby
# File 'lib/rdoc/tokenstream.rb', line 22
def pop_token
  @token_stream.pop
end
```
#start_collecting_tokens ⇒ Object
```ruby
# File 'lib/rdoc/tokenstream.rb', line 13
def start_collecting_tokens
  @token_stream = []
end
```
#token_stream ⇒ Object
```ruby
# File 'lib/rdoc/tokenstream.rb', line 9
def token_stream
  @token_stream
end
```