Class TokenStream
A TokenStream enumerates the sequence of tokens, either from Fields of a Document or from query text.
This is an abstract class; concrete subclasses are:
- Tokenizer, a TokenStream whose input is a System.IO.TextReader; and
- TokenFilter, a TokenStream whose input is another TokenStream.
TokenStream now extends AttributeSource, which provides access to all of the token IAttributes for the TokenStream. Note that only one instance per attribute type is created and reused for every token. This approach reduces object creation and allows local caching of attribute references. See IncrementToken() for further details.
The workflow of the new TokenStream API is as follows:
- Instantiation of TokenStream/TokenFilters which add/get attributes to/from the AttributeSource.
- The consumer calls Reset().
- The consumer retrieves attributes from the stream and stores local references to all attributes it wants to access.
- The consumer calls IncrementToken() until it returns false, consuming the attributes after each call.
- The consumer calls End() so that any end-of-stream operations can be performed.
- The consumer calls Dispose() to release any resources when finished using the TokenStream.
You can find some example code for the new API in the analysis documentation.
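As a rough illustration of the workflow above, here is a self-contained Java sketch (Lucene itself is Java; Lucene.NET mirrors its API in C#). The classes `SimpleTokenizer` and `TermAttr` are hypothetical stand-ins, not the real Lucene types — the point is the shape of the consume loop: cache an attribute reference, reset, call IncrementToken() until it returns false, then end and dispose.

```java
import java.util.ArrayList;
import java.util.List;

// Stand-in for ICharTermAttribute: a single instance is reused for every token.
class TermAttr {
    String term = "";
}

// Stand-in for a whitespace tokenizer exposing the TokenStream contract.
class SimpleTokenizer {
    final TermAttr termAtt = new TermAttr(); // created once, reused per token
    private String[] tokens = new String[0];
    private int pos = 0;

    void setInput(String text) { tokens = text.trim().split("\\s+"); }

    void reset() { pos = 0; }                 // like Reset()

    boolean incrementToken() {                // like IncrementToken()
        if (pos >= tokens.length) return false;
        termAtt.term = tokens[pos++];         // update the reused attribute
        return true;
    }

    void end() { /* end-of-stream work, e.g. final offset */ }
    void dispose() { /* release resources */ }
}

class Consumer {
    static List<String> consume(SimpleTokenizer ts) {
        // Retrieve the attribute once and keep a local reference.
        TermAttr termAtt = ts.termAtt;
        List<String> out = new ArrayList<>();
        ts.reset();
        while (ts.incrementToken()) {         // consume after each call
            out.add(termAtt.term);
        }
        ts.end();
        ts.dispose();
        return out;
    }
}
```

Note how the consumer reads the attribute reference once before the loop; in the real API this corresponds to storing the result of GetAttribute&lt;T&gt;() locally.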
Sometimes it is desirable to capture the current state of a TokenStream, e.g., for buffering purposes (see CachingTokenFilter, TeeSinkTokenFilter). For this use case, CaptureState() and RestoreState(AttributeSource.State) can be used.
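The capture/restore idea can be sketched as follows in self-contained Java. `StatefulStream` and `TermState` are hypothetical stand-ins; the essential point is that CaptureState() takes a snapshot (a copy) of the attribute values, and RestoreState() writes a saved snapshot back over the live attributes.

```java
// Stand-in attribute holding a term.
class TermState {
    String term = "";
}

class StatefulStream {
    final TermState attr = new TermState();

    // like CaptureState(): snapshot the current attribute values into a copy
    TermState captureState() {
        TermState copy = new TermState();
        copy.term = attr.term;
        return copy;
    }

    // like RestoreState(AttributeSource.State): write a snapshot back
    void restoreState(TermState saved) {
        attr.term = saved.term;
    }
}
```

A buffering filter such as CachingTokenFilter uses this pattern to record each token's state on the first pass and replay the recorded states on later passes.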
The TokenStream API in Lucene is based on the decorator pattern. Therefore, all non-abstract subclasses must be sealed or have at least a sealed implementation of IncrementToken()! This is checked when assertions are enabled.
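The decorator structure can be sketched in self-contained Java, where `final` plays the role of C#'s `sealed` on the overriding method. The classes `StreamBase`, `ListStream`, and `LowerCaseDecorator` are hypothetical stand-ins: a source produces tokens into a shared attribute, and a decorator wraps another stream and transforms the attribute in place.

```java
// Base stand-in for TokenStream; decorators share the attribute storage
// of the stream they wrap, mirroring TokenStream(AttributeSource input).
abstract class StreamBase {
    final StringBuilder termAtt;
    StreamBase() { termAtt = new StringBuilder(); }
    StreamBase(StreamBase input) { termAtt = input.termAtt; } // share attributes
    abstract boolean incrementToken();
}

// A token source (like a Tokenizer).
class ListStream extends StreamBase {
    private final String[] tokens;
    private int pos = 0;
    ListStream(String... tokens) { this.tokens = tokens; }
    @Override final boolean incrementToken() {    // 'final' ~ C# 'sealed'
        if (pos >= tokens.length) return false;
        termAtt.setLength(0);
        termAtt.append(tokens[pos++]);
        return true;
    }
}

// A decorator (like a TokenFilter) that lower-cases each token.
class LowerCaseDecorator extends StreamBase {
    private final StreamBase input;
    LowerCaseDecorator(StreamBase input) { super(input); this.input = input; }
    @Override final boolean incrementToken() {    // must be sealed/final
        if (!input.incrementToken()) return false;
        String lowered = termAtt.toString().toLowerCase();
        termAtt.setLength(0);
        termAtt.append(lowered);
        return true;
    }
}
```

Sealing IncrementToken() guarantees that a further subclass cannot silently bypass the decorator chain's per-token contract.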
Namespace: Lucene.Net.Analysis
Assembly: Lucene.Net.dll
Syntax
public abstract class TokenStream : AttributeSource, IDisposable
Constructors
TokenStream()
A TokenStream using the default attribute factory.
Declaration
protected TokenStream()
TokenStream(AttributeSource)
A TokenStream that uses the same attributes as the supplied one.
Declaration
protected TokenStream(AttributeSource input)
Parameters
Type | Name | Description |
---|---|---|
AttributeSource | input |
TokenStream(AttributeSource.AttributeFactory)
A TokenStream using the supplied AttributeSource.AttributeFactory for creating new IAttribute instances.
Declaration
protected TokenStream(AttributeSource.AttributeFactory factory)
Parameters
Type | Name | Description |
---|---|---|
AttributeSource.AttributeFactory | factory |
Methods
Dispose()
Declaration
public void Dispose()
Dispose(Boolean)
Releases resources associated with this stream.
If you override this method, always call base.Dispose(disposing); otherwise some internal state will not be correctly reset (e.g., Tokenizer will throw System.InvalidOperationException on reuse).
Declaration
protected virtual void Dispose(bool disposing)
Parameters
Type | Name | Description |
---|---|---|
System.Boolean | disposing |
End()
This method is called by the consumer after the last token has been
consumed, after IncrementToken() returned false
(using the new TokenStream API). Streams implementing the old API
should upgrade to use this feature.
This method can be used to perform any end-of-stream operations, such as setting the final offset of a stream. The final offset of a stream might differ from the offset of the last token, e.g., when one or more whitespace characters followed the last token and a WhitespaceTokenizer was used.
Additionally, any skipped positions (such as those removed by a StopFilter) can be applied to the position increment, as can any adjustment of other attributes where the end-of-stream value may be important.
If you override this method, always call base.End().
Declaration
public virtual void End()
Exceptions
Type | Condition |
---|---|
System.IO.IOException | If an I/O error occurs |
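The final-offset behavior described above can be sketched in self-contained Java. `OffsetStream` is a hypothetical stand-in for a whitespace tokenizer with an offset attribute: while tokens remain, the offsets track the last token; end() then advances the offset to the total length of the input, which can lie past the last token when trailing whitespace follows it.

```java
// Tokenizer stand-in that tracks offsets; end() records the final offset,
// which can exceed the end of the last token (e.g. trailing whitespace).
class OffsetStream {
    private final String text;
    private int pos = 0;
    int startOffset, endOffset;   // stand-in for IOffsetAttribute

    OffsetStream(String text) { this.text = text; }

    boolean incrementToken() {
        while (pos < text.length() && text.charAt(pos) == ' ') pos++;
        if (pos >= text.length()) return false;
        startOffset = pos;
        while (pos < text.length() && text.charAt(pos) != ' ') pos++;
        endOffset = pos;
        return true;
    }

    // like End(): set the final offset to the total number of chars read,
    // including any whitespace that followed the last token
    void end() {
        startOffset = endOffset = text.length();
    }
}
```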
IncrementToken()
Consumers (i.e., IndexWriter) use this method to advance the stream to the next token. Implementing classes must implement this method and update the appropriate IAttributes with the attributes of the next token.
The producer must make no assumptions about the attributes after the method has returned: the caller may arbitrarily change them. If the producer needs to preserve the state for subsequent calls, it can use CaptureState() to create a copy of the current attribute state.
This method is called for every token of a document, so an efficient implementation is crucial for good performance. To avoid calls to AddAttribute&lt;T&gt;() and GetAttribute&lt;T&gt;(), references to all IAttributes that this stream uses should be retrieved during instantiation.
To ensure that filters and consumers know which attributes are available, the attributes must be added during instantiation. Filters and consumers are not required to check for availability of attributes in IncrementToken().
Declaration
public abstract bool IncrementToken()
Returns
Type | Description |
---|---|
System.Boolean | false for end of stream; true otherwise |
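The "retrieve attribute references during instantiation" advice can be sketched in self-contained Java. `AttrSource`, `Term`, and `CountingStream` are hypothetical stand-ins: attributes live in a per-stream map keyed by type (so only one instance per attribute type exists), and the producer fetches its reference once in the constructor rather than doing a map lookup inside every incrementToken() call.

```java
import java.util.HashMap;
import java.util.Map;

// Stand-in AttributeSource: attributes live in a map keyed by type,
// so only one instance per attribute type is ever created.
class AttrSource {
    private final Map<Class<?>, Object> attrs = new HashMap<>();

    // like AddAttribute<T>(): create on first request, then reuse
    @SuppressWarnings("unchecked")
    <T> T addAttribute(Class<T> type, java.util.function.Supplier<T> factory) {
        return (T) attrs.computeIfAbsent(type, k -> factory.get());
    }
}

class Term { String value = ""; }

// The attribute reference is fetched ONCE, at construction time.
class CountingStream extends AttrSource {
    private final Term termAtt = addAttribute(Term.class, Term::new);
    private final String[] tokens;
    private int pos = 0;

    CountingStream(String... tokens) { this.tokens = tokens; }

    boolean incrementToken() {
        if (pos >= tokens.length) return false;
        termAtt.value = tokens[pos++];   // no per-token map lookup
        return true;
    }
}
```

A consumer asking the same source for the same attribute type gets the identical instance, which is why both sides can hold plain local references.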
Reset()
This method is called by a consumer before it begins consumption using IncrementToken().
Resets this stream to a clean state. Stateful implementations must implement this method so that they can be reused, just as if they had been created fresh.
If you override this method, always call base.Reset(); otherwise some internal state will not be correctly reset (e.g., Tokenizer will throw System.InvalidOperationException on further usage).
Declaration
public virtual void Reset()
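The override contract for Reset() (and, analogously, Dispose(bool)) can be sketched in self-contained Java. `ResettableStream` and `SkippingStream` are hypothetical stand-ins; the point is that a subclass resets its own state and then always chains to the base implementation, the analogue of calling base.Reset() in C#.

```java
// Base stream whose reset() restores internal bookkeeping to a clean state.
class ResettableStream {
    protected int position = 0;
    void reset() { position = 0; }   // base bookkeeping reset here
}

// Correct override: reset subclass state, then call super.reset()
// (the analogue of base.Reset() in C#).
class SkippingStream extends ResettableStream {
    int skipped = 0;
    @Override void reset() {
        skipped = 0;       // subclass-specific state
        super.reset();     // never skip the base call
    }
}
```

Skipping the base call is exactly the bug described above: the base class's bookkeeping survives into the next use, and reuse fails.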