Class TokenStream
A TokenStream enumerates the sequence of tokens, either from Fields of a Document or from query text.
This is an abstract class; concrete subclasses are:
- Tokenizer, a TokenStream whose input is a TextReader; and
- TokenFilter, a TokenStream whose input is another TokenStream.
The workflow of the new TokenStream API is as follows:
- Instantiation of TokenStream/TokenFilters which add/get attributes to/from the AttributeSource.
- The consumer calls Reset().
- The consumer retrieves attributes from the stream and stores local references to all attributes it wants to access.
- The consumer calls IncrementToken() until it returns false, consuming the attributes after each call.
- The consumer calls End() so that any end-of-stream operations can be performed.
- The consumer calls Dispose() to release any resources when finished using the TokenStream.
You can find some example code for the new API in the analysis documentation.
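As a minimal sketch of the workflow above, assuming a WhitespaceAnalyzer as the token source (any Analyzer or Tokenizer producing a TokenStream would do; the field name and sample text are placeholders):

```csharp
using System;
using System.IO;
using Lucene.Net.Analysis;
using Lucene.Net.Analysis.Core;
using Lucene.Net.Analysis.TokenAttributes;
using Lucene.Net.Util;

Analyzer analyzer = new WhitespaceAnalyzer(LuceneVersion.LUCENE_48);
using (TokenStream stream = analyzer.GetTokenStream("body", new StringReader("some sample text")))
{
    // Retrieve attribute references once and reuse them for every token.
    ICharTermAttribute termAtt = stream.AddAttribute<ICharTermAttribute>();

    stream.Reset();                     // reset before consuming
    while (stream.IncrementToken())     // advance until it returns false
    {
        Console.WriteLine(termAtt.ToString());
    }
    stream.End();                       // perform end-of-stream operations
}                                       // Dispose() is called by the using block
```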
Sometimes it is desirable to capture the current state of a TokenStream, e.g., for buffering purposes (see CachingTokenFilter, TeeSinkTokenFilter). For this use case CaptureState() and RestoreState(State) can be used.
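A sketch of that pattern inside a filter; the class name and the idea of snapshotting the previous token are illustrative only, and the code assumes the protected m_input field that TokenFilter exposes to subclasses in Lucene.NET 4.8:

```csharp
using Lucene.Net.Analysis;
using Lucene.Net.Util;

// Illustrative filter that snapshots each token's attribute state so it could
// later be restored with RestoreState(), e.g. for buffering or look-ahead.
public sealed class StateCapturingFilter : TokenFilter
{
    private AttributeSource.State lastState;

    public StateCapturingFilter(TokenStream input) : base(input) { }

    public override bool IncrementToken()
    {
        if (!m_input.IncrementToken())
        {
            return false;
        }
        lastState = CaptureState(); // copy of all attributes for the current token
        return true;
    }

    // Calling RestoreState(lastState) later would copy the captured values
    // back into this stream's attributes.
}
```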
The TokenStream API in Lucene is based on the decorator pattern. Therefore all non-abstract subclasses must be sealed or have at least a sealed implementation of IncrementToken()! This is checked when assertions are enabled.
Implements
IDisposable
Namespace: Lucene.Net.Analysis
Assembly: Lucene.Net.dll
Syntax
public abstract class TokenStream : AttributeSource, IDisposable
Constructors
TokenStream()
A TokenStream using the default attribute factory.
Declaration
protected TokenStream()
TokenStream(AttributeSource)
A TokenStream that uses the same attributes as the supplied one.
Declaration
protected TokenStream(AttributeSource input)
Parameters
Type | Name | Description |
---|---|---|
AttributeSource | input | |
TokenStream(AttributeSource.AttributeFactory)
A TokenStream using the supplied AttributeSource.AttributeFactory for creating new IAttribute instances.
Declaration
protected TokenStream(AttributeSource.AttributeFactory factory)
Parameters
Type | Name | Description |
---|---|---|
AttributeSource.AttributeFactory | factory | |
Methods
Dispose()
Declaration
public void Dispose()
Dispose(Boolean)
Releases resources associated with this stream.
If you override this method, always call base.Dispose(disposing), otherwise some internal state will not be correctly reset (e.g., Tokenizer will throw InvalidOperationException on reuse).
Declaration
protected virtual void Dispose(bool disposing)
Parameters
Type | Name | Description |
---|---|---|
System.Boolean | disposing | |
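A minimal sketch of an override, assuming a hypothetical stream that owns a disposable resource (the class name and the log writer are made up):

```csharp
using System.IO;
using Lucene.Net.Analysis;

// Hypothetical stream that owns a StreamWriter and emits no tokens.
public sealed class LoggingTokenStream : TokenStream
{
    private readonly StreamWriter log = new StreamWriter("tokens.log");

    public override bool IncrementToken() => false; // nothing to produce

    protected override void Dispose(bool disposing)
    {
        try
        {
            if (disposing)
            {
                log.Dispose(); // release resources this stream owns
            }
        }
        finally
        {
            base.Dispose(disposing); // always call base so internal state is reset
        }
    }
}
```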
End()
This method is called by the consumer after the last token has been consumed, after IncrementToken() returned false (using the new TokenStream API).
This method can be used to perform any end-of-stream operations, such as setting the final offset of a stream. The final offset of a stream might differ from the offset of the last token, e.g. in case one or more whitespace characters followed the last token and a WhitespaceTokenizer was used.
Additionally any skipped positions (such as those removed by a stop filter) can be applied to the position increment, or any adjustment of other attributes where the end-of-stream value may be important.
If you override this method, always call base.End().
Declaration
public virtual void End()
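For example, a Tokenizer that tracks the true end of its input might override End() along these lines; the offsetAtt reference and finalOffset field are assumptions about state the hypothetical subclass maintains:

```csharp
public override void End()
{
    base.End(); // always call base.End() first
    // Report the end of the input, which may lie past the last token's end
    // offset, e.g. when trailing whitespace followed the last token.
    offsetAtt.SetOffset(finalOffset, finalOffset);
}
```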
IncrementToken()
Consumers (i.e., IndexWriter) use this method to advance the stream to the next token. Implementing classes must implement this method and update the appropriate IAttributes with the attributes of the next token.
The producer must make no assumptions about the attributes after the method has returned: the caller may arbitrarily change it. If the producer needs to preserve the state for subsequent calls, it can use CaptureState() to create a copy of the current attribute state.
This method is called for every token of a document, so an efficient implementation is crucial for good performance. To avoid calls to AddAttribute<T>() and GetAttribute<T>(), references to all IAttributes that this stream uses should be retrieved during instantiation.
To ensure that filters and consumers know which attributes are available, the attributes must be added during instantiation. Filters and consumers are not required to check for availability of attributes in IncrementToken().
Declaration
public abstract bool IncrementToken()
Returns
Type | Description |
---|---|
System.Boolean | false for end of stream; true otherwise |
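A sketch of a filter that follows these rules, retrieving its attribute reference at construction so IncrementToken() does no per-token lookups. The class and its upper-casing behavior are illustrative, and the code assumes the protected m_input field that TokenFilter exposes to subclasses:

```csharp
using Lucene.Net.Analysis;
using Lucene.Net.Analysis.TokenAttributes;

// Illustrative filter that upper-cases each term.
public sealed class UpperCaseFilter : TokenFilter
{
    // Attribute reference is retrieved once, during instantiation.
    private readonly ICharTermAttribute termAtt;

    public UpperCaseFilter(TokenStream input) : base(input)
    {
        termAtt = AddAttribute<ICharTermAttribute>();
    }

    public override bool IncrementToken()
    {
        if (!m_input.IncrementToken())
        {
            return false; // end of stream
        }
        // Rewrite the shared term attribute in place (an optimized filter would
        // mutate the underlying char buffer instead of allocating a string).
        string upper = termAtt.ToString().ToUpperInvariant();
        termAtt.SetEmpty().Append(upper);
        return true;
    }
}
```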
Reset()
This method is called by a consumer before it begins consumption using IncrementToken().
Resets this stream to a clean state. Stateful implementations must implement this method so that they can be reused, just as if they had been created fresh.
If you override this method, always call base.Reset(), otherwise some internal state will not be correctly reset (e.g., Tokenizer will throw InvalidOperationException on further usage).
Declaration
public virtual void Reset()
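A sketch of a stateful filter that must be resettable to be reusable; the class and its token-limiting behavior are illustrative (Lucene's LimitTokenCountFilter implements a similar idea), and m_input is assumed to be the protected field TokenFilter exposes to subclasses:

```csharp
using Lucene.Net.Analysis;

// Illustrative filter that emits at most maxTokens tokens per consumption.
// The counter is per-use state, so Reset() must clear it for reuse.
public sealed class MaxTokensFilter : TokenFilter
{
    private readonly int maxTokens;
    private int emitted;

    public MaxTokensFilter(TokenStream input, int maxTokens) : base(input)
    {
        this.maxTokens = maxTokens;
    }

    public override bool IncrementToken()
    {
        if (emitted >= maxTokens || !m_input.IncrementToken())
        {
            return false;
        }
        emitted++;
        return true;
    }

    public override void Reset()
    {
        base.Reset(); // resets the wrapped stream's state as well
        emitted = 0;  // back to a clean state, as if freshly created
    }
}
```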