A TokenStream is a list of tokens gathered during the parse of some entity (say, a method). Entities populate these streams by being registered with the lexer. Any class can collect tokens by including TokenStream. From the outside, such an object is used by calling #start_collecting_tokens, followed by calls to #add_token and #pop_token.
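The flow above can be sketched end to end. This is a simplified, self-contained stand-in that mirrors the methods documented below (not the real RDoc implementation); the `Token` struct and `ParsedMethod` class are hypothetical names introduced for illustration:

```ruby
# Minimal stand-in for the TokenStream mix-in, mirroring the
# methods documented below (simplified; not the real RDoc code).
module TokenStream
  def collect_tokens
    @token_stream = []
  end
  alias start_collecting_tokens collect_tokens

  def add_tokens(*tokens)
    tokens.flatten.each { |token| @token_stream << token }
  end
  alias add_token add_tokens

  def pop_token
    @token_stream.pop
  end

  def token_stream
    @token_stream
  end

  def tokens_to_s
    token_stream.map { |token| token.text }.join ''
  end
end

# A hypothetical token type exposing the #text reader the stream expects.
Token = Struct.new(:text)

# A hypothetical entity class that collects its own tokens.
class ParsedMethod
  include TokenStream
end

m = ParsedMethod.new
m.start_collecting_tokens
m.add_token Token.new('def'), Token.new(' '), Token.new('foo')
m.add_token Token.new('!')   # added by mistake
m.pop_token                  # remove the stray token
puts m.tokens_to_s           # => "def foo"
```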

Methods
Instance Public methods
add_token(*tokens)
Alias for: add_tokens
add_tokens(*tokens)

Adds tokens to the collected tokens

Also aliased as: add_token
# File ../ruby/lib/rdoc/token_stream.rb, line 13
def add_tokens(*tokens)
  tokens.flatten.each { |token| @token_stream << token }
end
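Because the arguments are flattened, callers can mix individual tokens and nested arrays of tokens in a single call. A quick sketch of the same pattern, using plain strings in place of token objects:

```ruby
# The flatten call lets callers pass tokens singly or in arrays.
stream = []
tokens = ['def', [' ', 'foo']]
tokens.flatten.each { |token| stream << token }
p stream  # => ["def", " ", "foo"]
```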
collect_tokens()

Starts collecting tokens

Also aliased as: start_collecting_tokens
# File ../ruby/lib/rdoc/token_stream.rb, line 22
def collect_tokens
  @token_stream = []
end
pop_token()

Removes the last token from the collected tokens

# File ../ruby/lib/rdoc/token_stream.rb, line 31
def pop_token
  @token_stream.pop
end
start_collecting_tokens()
Alias for: collect_tokens
token_stream()

Current token stream

# File ../ruby/lib/rdoc/token_stream.rb, line 38
def token_stream
  @token_stream
end
tokens_to_s()

Returns a string representation of the token stream

# File ../ruby/lib/rdoc/token_stream.rb, line 45
def tokens_to_s
  token_stream.map { |token| token.text }.join ''
end