Class: BufferedTokenStream
- Inherits: TokenStream
  - Object
    - TokenStream
      - BufferedTokenStream
- Defined in: lib/antlr4/BufferedTokenStream.rb
Direct Known Subclasses
CommonTokenStream
Instance Attribute Summary collapse
-
#fetchedEOF ⇒ Object
Returns the value of attribute fetchedEOF.
-
#index ⇒ Object
Returns the value of attribute index.
-
#tokens ⇒ Object
Returns the value of attribute tokens.
-
#tokenSource ⇒ Object
Returns the value of attribute tokenSource.
Instance Method Summary collapse
-
#adjustSeekIndex(i) ⇒ Object
Allows derived classes to modify the behavior of operations that change the current stream position by adjusting the target token index of a seek operation.
- #consume ⇒ Object
-
#fetch(n) ⇒ Object
Add n elements to buffer.
-
#fill ⇒ Object
Get all tokens from the lexer until EOF.
- #filterForChannel(left, right, channel) ⇒ Object
- #get(index) ⇒ Object
-
#getHiddenTokensToLeft(tokenIndex, channel = -1) ⇒ Object
Collect all tokens on specified channel to the left of the current token up until we see a token on DEFAULT_TOKEN_CHANNEL.
-
#getHiddenTokensToRight(tokenIndex, channel = -1) ⇒ Object
Collect all tokens on specified channel to the right of the current token up until we see a token on DEFAULT_TOKEN_CHANNEL or EOF.
- #getSourceName ⇒ Object
-
#getText(interval = nil) ⇒ Object
Get the text of all tokens in this buffer.
-
#getTokens(start, stop, types = nil) ⇒ Object
Get all tokens from start..stop inclusive.
-
#initialize(_tokenSource) ⇒ BufferedTokenStream
constructor
A new instance of BufferedTokenStream.
- #LA(i) ⇒ Object
- #lazyInit ⇒ Object
- #LB(k) ⇒ Object
- #LT(k) ⇒ Object
- #mark ⇒ Object
-
#nextTokenOnChannel(i, channel) ⇒ Object
Given a starting index, return the index of the next token on channel.
-
#previousTokenOnChannel(i, channel) ⇒ Object
Given a starting index, return the index of the previous token on channel.
- #release(marker) ⇒ Object
- #reset ⇒ Object
- #seek(index) ⇒ Object
-
#setTokenSource(tokenSource) ⇒ Object
Reset this token stream by setting its token source.
- #setup ⇒ Object
-
#sync(i) ⇒ Boolean
Make sure index i in tokens has a token.
Constructor Details
#initialize(_tokenSource) ⇒ BufferedTokenStream
Returns a new instance of BufferedTokenStream.
# File 'lib/antlr4/BufferedTokenStream.rb', line 5

def initialize(_tokenSource)
  # The {@link TokenSource} from which tokens for this stream are fetched.
  @tokenSource = _tokenSource

  # A collection of all tokens fetched from the token source. The list is
  # considered a complete view of the input once {@link #fetchedEOF} is set
  # to {@code true}.
  self.tokens = Array.new

  # The index into {@link #tokens} of the current token (next token to
  # {@link #consume}). {@link #tokens}{@code [}{@link #p}{@code ]} should be
  # {@link #LT LT(1)}.
  #
  # <p>This field is set to -1 when the stream is first constructed or when
  # {@link #setTokenSource} is called, indicating that the first token has
  # not yet been fetched from the token source. For additional information,
  # see the documentation of {@link IntStream} for a description of
  # Initializing Methods.</p>
  self.index = -1

  # Indicates whether the {@link Token#EOF} token has been fetched from
  # {@link #tokenSource} and added to {@link #tokens}. This field improves
  # performance for the following cases:
  #
  # <ul>
  # <li>{@link #consume}: The lookahead check in {@link #consume} to prevent
  # consuming the EOF symbol is optimized by checking the values of
  # {@link #fetchedEOF} and {@link #p} instead of calling {@link #LA}.</li>
  # <li>{@link #fetch}: The check to prevent adding multiple EOF symbols into
  # {@link #tokens} is trivial with this field.</li>
  # </ul>
  self.fetchedEOF = false
end
Instance Attribute Details
#fetchedEOF ⇒ Object
Returns the value of attribute fetchedEOF.
# File 'lib/antlr4/BufferedTokenStream.rb', line 4

def fetchedEOF
  @fetchedEOF
end
#index ⇒ Object
Returns the value of attribute index.
# File 'lib/antlr4/BufferedTokenStream.rb', line 4

def index
  @index
end
#tokens ⇒ Object
Returns the value of attribute tokens.
# File 'lib/antlr4/BufferedTokenStream.rb', line 4

def tokens
  @tokens
end
#tokenSource ⇒ Object
Returns the value of attribute tokenSource.
# File 'lib/antlr4/BufferedTokenStream.rb', line 4

def tokenSource
  @tokenSource
end
Instance Method Details
#adjustSeekIndex(i) ⇒ Object
Allows derived classes to modify the behavior of operations that change the current stream position by adjusting the target token index of a seek operation. The default implementation simply returns i. If an exception is thrown in this method, the current stream index should not be changed.
<p>For example, CommonTokenStream overrides this method to ensure that the seek target is always an on-channel token.</p>
# File 'lib/antlr4/BufferedTokenStream.rb', line 164

def adjustSeekIndex(i)
  return i
end
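The hook can be illustrated with a self-contained sketch (MiniStream, OnChannelStream, and the token data below are hypothetical stand-ins, not library classes): a subclass nudges any seek target forward to the next on-channel token, mirroring what CommonTokenStream does.

```ruby
# Minimal sketch of the adjustSeekIndex hook (MiniStream and OnChannelStream
# are hypothetical stand-ins, not library classes).
Tok = Struct.new(:text, :channel)

class MiniStream
  DEFAULT_CHANNEL = 0
  attr_reader :index

  def initialize(tokens)
    @tokens = tokens
    @index = -1
  end

  # Default behavior: seek exactly where asked.
  def adjustSeekIndex(i)
    i
  end

  def seek(i)
    @index = adjustSeekIndex(i)
  end
end

class OnChannelStream < MiniStream
  # Override: move the target forward to the next on-channel token,
  # the way CommonTokenStream keeps the cursor on the default channel.
  def adjustSeekIndex(i)
    i += 1 while i < @tokens.length && @tokens[i].channel != DEFAULT_CHANNEL
    i
  end
end

toks = [Tok.new("a", 0), Tok.new(" ", 1), Tok.new("b", 0)]
plain = MiniStream.new(toks)
plain.seek(1)
puts plain.index        # => 1 (lands on the hidden token)

skipping = OnChannelStream.new(toks)
skipping.seek(1)
puts skipping.index     # => 2 (adjusted to the next on-channel token)
```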
#consume ⇒ Object
# File 'lib/antlr4/BufferedTokenStream.rb', line 57

def consume()
  skipEofCheck = false
  if self.index >= 0 then
    if self.fetchedEOF then
      # the last token in tokens is EOF. skip check if p indexes any
      # fetched token except the last.
      skipEofCheck = self.index < self.tokens.length - 1
    else
      # no EOF token in tokens. skip check if p indexes a fetched token.
      skipEofCheck = self.index < self.tokens.length
    end
  else
    # not yet initialized
    skipEofCheck = false
  end
  if not skipEofCheck and self.LA(1) == Token::EOF then
    raise IllegalStateException.new("cannot consume EOF")
  end
  if self.sync(self.index + 1) then
    self.index = self.adjustSeekIndex(self.index + 1)
  end
end
#fetch(n) ⇒ Object
Add n elements to buffer.
# File 'lib/antlr4/BufferedTokenStream.rb', line 98

def fetch(n)
  return 0 if self.fetchedEOF
  1.upto(n) do |i|
    t = self.tokenSource.nextToken()
    t.tokenIndex = self.tokens.length
    self.tokens.push(t)
    if t.type == Token::EOF then
      self.fetchedEOF = true
      return i
    end
  end
  return n
end
#fill ⇒ Object
Get all tokens from the lexer until EOF.
# File 'lib/antlr4/BufferedTokenStream.rb', line 300

def fill
  self.lazyInit()
  while fetch(1000) == 1000 do
    nil
  end
end
#filterForChannel(left, right, channel) ⇒ Object
# File 'lib/antlr4/BufferedTokenStream.rb', line 249

def filterForChannel(left, right, channel)
  hidden = []
  for i in left..right do
    t = self.tokens[i]
    if channel == -1 then
      if t.channel != Lexer::DEFAULT_TOKEN_CHANNEL
        hidden.push(t)
      end
    elsif t.channel == channel then
      hidden.push(t)
    end
  end
  return nil if hidden.length == 0
  return hidden
end
#get(index) ⇒ Object
# File 'lib/antlr4/BufferedTokenStream.rb', line 53

def get(index)
  self.lazyInit()
  return self.tokens[index]
end
#getHiddenTokensToLeft(tokenIndex, channel = -1) ⇒ Object
Collect all tokens on specified channel to the left of
the current token up until we see a token on DEFAULT_TOKEN_CHANNEL.
If channel is -1, find any non default channel token.
# File 'lib/antlr4/BufferedTokenStream.rb', line 235

def getHiddenTokensToLeft(tokenIndex, channel=-1)
  self.lazyInit()
  if tokenIndex < 0 or tokenIndex >= self.tokens.length
    raise Exception.new("#{tokenIndex} not in 0..#{self.tokens.length-1}")
  end
  prevOnChannel = self.previousTokenOnChannel(tokenIndex - 1, Lexer::DEFAULT_TOKEN_CHANNEL)
  return nil if prevOnChannel == tokenIndex - 1
  # if none on channel to left, prevOnChannel = -1, then from = 0
  from_ = prevOnChannel + 1
  to = tokenIndex - 1
  return self.filterForChannel(from_, to, channel)
end
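The channel filter that both hidden-token lookups rely on can be sketched standalone (the Tok struct and sample tokens are hypothetical test data; channel 0 stands in for DEFAULT_TOKEN_CHANNEL):

```ruby
# Standalone sketch of the filterForChannel logic used by
# getHiddenTokensToLeft/-Right (Tok and the sample data are hypothetical).
Tok = Struct.new(:text, :channel)

DEFAULT_CHANNEL = 0

def filter_for_channel(tokens, left, right, channel)
  hidden = []
  (left..right).each do |i|
    t = tokens[i]
    if channel == -1
      hidden << t if t.channel != DEFAULT_CHANNEL  # any off-channel token
    elsif t.channel == channel
      hidden << t                                  # exact channel match
    end
  end
  hidden.empty? ? nil : hidden
end

toks = [Tok.new("a", 0), Tok.new("# note", 2), Tok.new(" ", 1), Tok.new("b", 0)]
p filter_for_channel(toks, 1, 2, -1).map(&:text)  # => ["# note", " "]
p filter_for_channel(toks, 1, 2, 2).map(&:text)   # => ["# note"]
p filter_for_channel(toks, 0, 0, -1)              # => nil
```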
#getHiddenTokensToRight(tokenIndex, channel = -1) ⇒ Object
Collect all tokens on specified channel to the right of
the current token up until we see a token on DEFAULT_TOKEN_CHANNEL or
EOF. If channel is -1, find any non default channel token.
# File 'lib/antlr4/BufferedTokenStream.rb', line 216

def getHiddenTokensToRight(tokenIndex, channel=-1)
  self.lazyInit()
  if tokenIndex < 0 or tokenIndex >= self.tokens.length then
    raise Exception.new("#{tokenIndex} not in 0..#{self.tokens.length-1}")
  end
  nextOnChannel = self.nextTokenOnChannel(tokenIndex + 1, Lexer::DEFAULT_TOKEN_CHANNEL)
  from_ = tokenIndex + 1
  # if none on channel to right, nextOnChannel = -1, so set to = last token
  if nextOnChannel == -1
    to = self.tokens.length - 1
  else
    to = nextOnChannel
  end
  return self.filterForChannel(from_, to, channel)
end
#getSourceName ⇒ Object
# File 'lib/antlr4/BufferedTokenStream.rb', line 265

def getSourceName
  return self.tokenSource.getSourceName()
end
#getText(interval = nil) ⇒ Object
Get the text of all tokens in this buffer.
# File 'lib/antlr4/BufferedTokenStream.rb', line 270

def getText(interval=nil)
  self.lazyInit()
  self.fill()
  if interval.nil?
    interval = [0, self.tokens.length-1]
  end
  start = interval[0]
  if start.kind_of? Token
    start = start.tokenIndex
  end
  stop = interval[1]
  if stop.kind_of? Token
    stop = stop.tokenIndex
  end
  if start.nil? or stop.nil? or start < 0 or stop < 0
    return ""
  end
  if stop >= self.tokens.length
    stop = self.tokens.length - 1
  end
  StringIO.open do |buf|
    for i in start..stop do
      t = self.tokens[i]
      break if t.type == Token::EOF
      buf.write(t.text)
    end
    return buf.string()
  end
end
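The core of the interval handling, clamping the stop index and halting at EOF, can be sketched in isolation (Tok, EOF_TYPE, and the sample tokens are hypothetical stand-ins):

```ruby
require 'stringio'

# Sketch of getText's core loop: concatenate token text over an index
# interval, clamping stop and never including the EOF sentinel
# (Tok and EOF_TYPE are hypothetical stand-ins).
Tok = Struct.new(:text, :type)
EOF_TYPE = -1

def text_of(tokens, start, stop)
  stop = tokens.length - 1 if stop >= tokens.length
  buf = StringIO.new
  (start..stop).each do |i|
    t = tokens[i]
    break if t.type == EOF_TYPE  # stop before the EOF sentinel
    buf << t.text
  end
  buf.string
end

toks = [Tok.new("x", 1), Tok.new("=", 2), Tok.new("1", 3), Tok.new("", EOF_TYPE)]
p text_of(toks, 0, 99)  # => "x=1"
p text_of(toks, 1, 2)   # => "=1"
```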
#getTokens(start, stop, types = nil) ⇒ Object
Get all tokens from start..stop inclusive.
# File 'lib/antlr4/BufferedTokenStream.rb', line 113

def getTokens(start, stop, types=nil)
  if start < 0 or stop < 0 then
    return nil
  end
  self.lazyInit()
  subset = Array.new
  if stop >= self.tokens.length
    stop = self.tokens.length - 1
  end
  for i in start..stop-1 do
    t = self.tokens[i]
    if t.type == Token::EOF
      break
    end
    if (types.nil? or types.member?(t.type)) then
      subset.push(t)
    end
  end
  return subset
end
#LA(i) ⇒ Object
# File 'lib/antlr4/BufferedTokenStream.rb', line 133

def LA(i)
  return self.LT(i).type
end
#lazyInit ⇒ Object
# File 'lib/antlr4/BufferedTokenStream.rb', line 168

def lazyInit
  if self.index == -1 then
    self.setup()
  end
end
#LB(k) ⇒ Object
# File 'lib/antlr4/BufferedTokenStream.rb', line 136

def LB(k)
  return nil if (self.index - k) < 0
  return self.tokens[self.index - k]
end
#LT(k) ⇒ Object
# File 'lib/antlr4/BufferedTokenStream.rb', line 140

def LT(k)
  self.lazyInit()
  return nil if k == 0
  return self.LB(-k) if k < 0
  i = self.index + k - 1
  self.sync(i)
  if i >= self.tokens.length then
    # return EOF token; EOF must be last token
    return self.tokens[self.tokens.length - 1]
  end
  return self.tokens[i]
end
#mark ⇒ Object
# File 'lib/antlr4/BufferedTokenStream.rb', line 38

def mark
  return 0
end
#nextTokenOnChannel(i, channel) ⇒ Object
Given a starting index, return the index of the next token on channel.
Return i if tokens[i] is on channel. Return -1 if there are no tokens
on channel between i and EOF.
# File 'lib/antlr4/BufferedTokenStream.rb', line 192

def nextTokenOnChannel(i, channel)
  self.sync(i)
  return -1 if i >= self.tokens.length
  token = self.tokens[i]
  while token.channel != channel do
    return -1 if token.type == Token::EOF
    i = i + 1
    self.sync(i)
    token = self.tokens[i]
  end
  return i
end
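The scans in both directions can be sketched over a plain array (Tok and the sample tokens are hypothetical; channel 0 plays the role of the default channel):

```ruby
# Standalone sketch of next/previousTokenOnChannel over a plain array
# (Tok and the sample tokens are hypothetical test data).
Tok = Struct.new(:text, :channel)

# Walk forward from i; return the first on-channel index, or -1 if none.
def next_on_channel(tokens, i, channel)
  while i < tokens.length
    return i if tokens[i].channel == channel
    i += 1
  end
  -1
end

# Walk backward from i; return the first on-channel index, or -1 if none.
def previous_on_channel(tokens, i, channel)
  i -= 1 while i >= 0 && tokens[i].channel != channel
  i
end

toks = [Tok.new("a", 0), Tok.new(" ", 1), Tok.new("# c", 2), Tok.new("b", 0)]
p next_on_channel(toks, 1, 0)      # => 3
p previous_on_channel(toks, 2, 0)  # => 0
p next_on_channel(toks, 1, 3)      # => -1
```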
#previousTokenOnChannel(i, channel) ⇒ Object
Given a starting index, return the index of the previous token on channel.
Return i if tokens[i] is on channel. Return -1 if there are no tokens
on channel between i and 0.
# File 'lib/antlr4/BufferedTokenStream.rb', line 207

def previousTokenOnChannel(i, channel)
  while i >= 0 and self.tokens[i].channel != channel do
    i = i - 1
  end
  return i
end
#release(marker) ⇒ Object
# File 'lib/antlr4/BufferedTokenStream.rb', line 42

def release(marker)
  # no resources to release
end
#reset ⇒ Object
# File 'lib/antlr4/BufferedTokenStream.rb', line 46

def reset()
  self.seek(0)
end
#seek(index) ⇒ Object
# File 'lib/antlr4/BufferedTokenStream.rb', line 49

def seek(index)
  self.lazyInit()
  self.index = self.adjustSeekIndex(index)
end
#setTokenSource(tokenSource) ⇒ Object
Reset this token stream by setting its token source.#/
# File 'lib/antlr4/BufferedTokenStream.rb', line 180

def setTokenSource(tokenSource)
  self.tokenSource = tokenSource
  self.tokens = []
  self.index = -1
end
#setup ⇒ Object
# File 'lib/antlr4/BufferedTokenStream.rb', line 174

def setup()
  self.sync(0)
  self.index = self.adjustSeekIndex(0)
end
#sync(i) ⇒ Boolean
Make sure index i in tokens has a token. Returns true if a token is located at index i, otherwise false.
# File 'lib/antlr4/BufferedTokenStream.rb', line 85

def sync(i)
  # assert i >= 0
  n = i - self.tokens.length + 1  # how many more elements do we need?
  if n > 0 then
    fetched = self.fetch(n)
    return fetched >= n
  end
  return true
end
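Together, sync and fetch implement demand-driven buffering: tokens are pulled from the source only when lookahead requires them, and fetching stops permanently once EOF is seen. A self-contained sketch of that contract (FakeSource, MiniBuffer, and the token values are hypothetical, not library classes):

```ruby
# Sketch of the sync/fetch buffering contract: pull tokens lazily from a
# source until index i is covered (FakeSource, MiniBuffer, and EOF_TYPE
# are hypothetical stand-ins).
EOF_TYPE = -1

class FakeSource
  def initialize(texts)
    @texts = texts
    @pos = 0
  end

  # Emit each text once as [text, type], then EOF forever.
  def next_token
    t = @pos < @texts.length ? [@texts[@pos], 1] : ["<EOF>", EOF_TYPE]
    @pos += 1
    t
  end
end

class MiniBuffer
  attr_reader :tokens

  def initialize(source)
    @source = source
    @tokens = []
    @fetched_eof = false
  end

  # Make sure index i has a token; true if it does after fetching.
  def sync(i)
    n = i - @tokens.length + 1
    n > 0 ? fetch(n) >= n : true
  end

  # Add up to n tokens; stop early (and permanently) at EOF.
  def fetch(n)
    return 0 if @fetched_eof
    (1..n).each do |k|
      t = @source.next_token
      @tokens << t
      if t[1] == EOF_TYPE
        @fetched_eof = true
        return k
      end
    end
    n
  end
end

buf = MiniBuffer.new(FakeSource.new(%w[a b c]))
p buf.sync(1)        # => true  (fetches "a" and "b")
p buf.tokens.length  # => 2
p buf.sync(10)       # => false (hits EOF before index 10)
p buf.tokens.length  # => 4    ("a", "b", "c", plus the EOF token)
```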