Instantiate with a buffer.
Invoked when a scanner matches a pattern. The provided value should be the index of the last element of the matching pattern, which is converted back to a void[] index.
Return the current token as a slice of the content.
See if a set of characters contains a particular instance.
Locate the next token. Returns the token if found, or null otherwise; null indicates an end-of-stream condition. To sweep a conduit for lines using method next(), see the sketch below.
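A minimal sketch of sweeping a file for lines, assuming Ocean's Lines iterator, File device, and Stdout (module paths follow Ocean's Tango lineage and may differ across versions):

    import ocean.io.Stdout;
    import ocean.io.device.File;
    import ocean.io.stream.Lines;

    void sweep ()
    {
        // construct a line iterator over the file's input stream
        auto lines = new Lines!(char) (new File ("myfile"));

        // next() yields each line in turn, and null at end-of-stream
        while (lines.next !is null)
            Stdout (lines.get).newline;
    }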
Called when a scanner fails to find a matching pattern. This may cause more content to be loaded and a rescan to be initiated.
Iterate over the set of tokens. This should really provide read-only access to the tokens, but D does not support that at this time.
Iterate over a set of tokens, exposing a token count starting at zero.
Iterate over a set of tokens and delimiters, exposing a token count starting at zero.
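A sketch of the three iteration styles, given a Lines instance constructed as in the earlier sketch; the parameter order for the delimiter overload (count, token, delimiter) is an assumption:

    // tokens only
    foreach (token; lines)
        Stdout (token).newline;

    // tokens with a zero-based count
    foreach (index, token; lines)
        Stdout.formatln ("{}: {}", index, token);

    // tokens, count, and the delimiter segment trailing each token
    foreach (index, token, delim; lines)
        Stdout.formatln ("{}: {} (delimited by '{}')", index, token, delim);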
The pattern scanner, implemented via subclasses.
Set the provided stream as the scanning source.
Set the content of the current slice to the provided start and end points.
Set the content of the current slice to the provided start and end points, and set the delimiter to the segment between end and next (inclusive).
Return the hosting conduit.
Read from the conduit into a target array. The provided dst will be populated with content from the conduit.
Load the bits from a stream and return them all in an array. A dst array can optionally be provided; it will be expanded as necessary to consume the input.
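A hedged usage sketch for read and load, assuming read(void[]) returns the count of elements populated and load() drains whatever remains (return types and defaults are assumptions, not confirmed signatures):

    // fill a fixed target from the stream
    char[64] tmp;
    auto consumed = lines.read (tmp);

    // slurp everything that remains, letting load expand its array as needed
    auto everything = cast(char[]) lines.load ();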
Clear any buffered content.
Seek on this stream. Target conduits that don't support seeking will throw an IOException.
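A hedged sketch of seeking, assuming a seek(offset) overload and that IOException lives in Ocean's core exception module (the import path is an assumption):

    import ocean.core.ExceptionDefinitions : IOException;  // assumed path

    // rewind; conduits lacking seek support throw IOException
    try
        lines.seek (0);
    catch (IOException)
        Stdout ("the underlying conduit does not support seeking").newline;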
Return the upstream host of this filter.
Close the input.
The base class for a set of stream iterators. These operate upon a buffered input stream, and are designed to deal with partial content. That is, stream iterators go to work the moment any data becomes available in the buffer. Contrast this behaviour with the ocean.text.Util iterators, which operate upon the extent of an array.
Two types of iterator are supported: exclusive and inclusive. The former is the more common kind, where a token is delimited by elements considered foreign; examples include space, comma, and end-of-line delineation. Inclusive tokens are the opposite: they look for patterns in the text that should be part of the token itself, and everything else is considered foreign. Currently ocean.io.stream includes the exclusive variety only.
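For instance, an exclusive iterator such as Delimiters treats the separator as foreign. A sketch, assuming Delimiters takes the delimiter pattern plus an input stream, and that the Array device wraps an in-memory buffer as a conduit:

    import ocean.io.Stdout;
    import ocean.io.device.Array;
    import ocean.io.stream.Delimiters;

    void split ()
    {
        // tokenize comma-separated fields from an in-memory buffer
        auto fields = new Delimiters!(char) (",", new Array ("red,green,blue".dup));
        foreach (field; fields)
            Stdout (field).newline;   // red, then green, then blue
    }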
Each pattern is exposed to the client as a transient slice of the original content. If you need to retain the exposed content, you should .dup it appropriately, as sketched below.
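A brief sketch of retaining tokens beyond the lifetime of each slice:

    char[][] kept;
    foreach (token; fields)
        kept ~= token.dup;   // the exposed slice is transient; copy to retain it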
The content provided to these iterators is intended to be fully read-only. All current tokenizers abide by this rule, but it is possible a user could mutate the content through a token slice. To enforce the desired read-only aspect, the code would have to introduce redundant copying or the compiler would have to support read-only arrays (now in D2).
See Delimiters, Lines, Patterns, Quotes.