
    Class TokenStream

    A TokenStream enumerates the sequence of tokens, either from Fields of a Document or from query text.

    This is an abstract class; its concrete subclasses are:

    • Tokenizer, a TokenStream whose input is a TextReader; and
    • TokenFilter, a TokenStream whose input is another TokenStream.
    A new TokenStream API was introduced in Lucene 2.9. This API has moved from being Token-based to IAttribute-based. While Token still exists in 2.9 as a convenience class, the preferred way to store the information of a Token is to use IAttributes.

    TokenStream now extends AttributeSource, which provides access to all of the token IAttributes for the TokenStream. Note that only one instance per IAttribute is created and reused for every token. This approach reduces object creation and allows local caching of references to the IAttributes. See IncrementToken() for further details.

    The workflow of the new TokenStream API is as follows:

    1. Instantiation of TokenStream/TokenFilters which add/get attributes to/from the AttributeSource.
    2. The consumer calls Reset().
    3. The consumer retrieves attributes from the stream and stores local references to all attributes it wants to access.
    4. The consumer calls IncrementToken() until it returns false, consuming the attributes after each call.
    5. The consumer calls End() so that any end-of-stream operations can be performed.
    6. The consumer calls Dispose() to release any resource when finished using the TokenStream.
    To make sure that filters and consumers know which attributes are available, the attributes must be added during instantiation. Filters and consumers are not required to check for availability of attributes in IncrementToken().

    You can find some example code for the new API in the analysis documentation.
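    The consumer-side workflow above can be sketched as follows. This is an illustrative example, not taken from the official documentation: the field name ("body"), the sample text, and the choice of StandardAnalyzer are assumptions made for the sketch.

    ```csharp
    using System;
    using System.IO;
    using Lucene.Net.Analysis;
    using Lucene.Net.Analysis.Standard;
    using Lucene.Net.Analysis.TokenAttributes;
    using Lucene.Net.Util;

    // Sketch of the consumer workflow; "body" and the sample text
    // are placeholders chosen for illustration.
    Analyzer analyzer = new StandardAnalyzer(LuceneVersion.LUCENE_48);
    using (TokenStream stream = analyzer.GetTokenStream("body", new StringReader("some sample text")))
    {
        // Step 3: retrieve and locally cache attribute references.
        ICharTermAttribute termAtt = stream.AddAttribute<ICharTermAttribute>();

        stream.Reset();                     // step 2
        while (stream.IncrementToken())     // step 4
        {
            Console.WriteLine(termAtt.ToString());
        }
        stream.End();                       // step 5
    }                                       // step 6: Dispose() via using
    ```

    The `using` block ensures Dispose() is called even if consumption throws, which is why it maps naturally onto step 6.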

    Sometimes it is desirable to capture the current state of a TokenStream, e.g., for buffering purposes (see CachingTokenFilter, TeeSinkTokenFilter). For this use case, CaptureState() and RestoreState(AttributeSource.State) can be used.

    The TokenStream-API in Lucene is based on the decorator pattern. Therefore all non-abstract subclasses must be sealed or have at least a sealed implementation of IncrementToken()! This is checked when assertions are enabled.
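    A minimal sketch of a filter that satisfies this contract might look like the following. UpperCaseFilter is a hypothetical name invented for illustration; because the class itself is sealed, its IncrementToken() implementation is sealed as required.

    ```csharp
    using Lucene.Net.Analysis;
    using Lucene.Net.Analysis.TokenAttributes;

    // Hypothetical filter that upper-cases each term. It illustrates
    // the required pattern: attributes are added during construction,
    // and IncrementToken() is (effectively) sealed.
    public sealed class UpperCaseFilter : TokenFilter
    {
        private readonly ICharTermAttribute termAtt;

        public UpperCaseFilter(TokenStream input)
            : base(input)
        {
            // Add/get attributes during instantiation, not in IncrementToken().
            termAtt = AddAttribute<ICharTermAttribute>();
        }

        public override bool IncrementToken()
        {
            if (!m_input.IncrementToken())
            {
                return false; // end of stream
            }
            // Mutate the term in place via its character buffer.
            for (int i = 0; i < termAtt.Length; i++)
            {
                termAtt.Buffer[i] = char.ToUpperInvariant(termAtt.Buffer[i]);
            }
            return true;
        }
    }
    ```

    Note that the filter delegates to the wrapped stream (`m_input`) first and only then decorates the attribute state, which is the essence of the decorator pattern described above.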

    Inheritance
    System.Object
    AttributeSource
    TokenStream
    NumericTokenStream
    TokenFilter
    Tokenizer
    Implements
    IDisposable
    Inherited Members
    AttributeSource.GetAttributeFactory()
    AttributeSource.GetAttributeClassesEnumerator()
    AttributeSource.GetAttributeImplsEnumerator()
    AttributeSource.AddAttributeImpl(Attribute)
    AttributeSource.AddAttribute<T>()
    AttributeSource.HasAttributes
    AttributeSource.HasAttribute<T>()
    AttributeSource.GetAttribute<T>()
    AttributeSource.ClearAttributes()
    AttributeSource.CaptureState()
    AttributeSource.RestoreState(AttributeSource.State)
    AttributeSource.GetHashCode()
    AttributeSource.Equals(Object)
    AttributeSource.ReflectAsString(Boolean)
    AttributeSource.ReflectWith(IAttributeReflector)
    AttributeSource.CloneAttributes()
    AttributeSource.CopyTo(AttributeSource)
    AttributeSource.ToString()
    Namespace: Lucene.Net.Analysis
    Assembly: Lucene.Net.dll
    Syntax
    public abstract class TokenStream : AttributeSource, IDisposable

    Constructors


    TokenStream()

    A TokenStream using the default attribute factory.

    Declaration
    protected TokenStream()

    TokenStream(AttributeSource)

    A TokenStream that uses the same attributes as the supplied one.

    Declaration
    protected TokenStream(AttributeSource input)
    Parameters
    Type Name Description
    AttributeSource input

    TokenStream(AttributeSource.AttributeFactory)

    A TokenStream using the supplied AttributeSource.AttributeFactory for creating new IAttribute instances.

    Declaration
    protected TokenStream(AttributeSource.AttributeFactory factory)
    Parameters
    Type Name Description
    AttributeSource.AttributeFactory factory

    Methods


    Dispose()

    Declaration
    public void Dispose()

    Dispose(Boolean)

    Releases resources associated with this stream.

    If you override this method, always call base.Dispose(disposing), otherwise some internal state will not be correctly reset (e.g., Tokenizer will throw on reuse).

    Declaration
    protected virtual void Dispose(bool disposing)
    Parameters
    Type Name Description
    System.Boolean disposing
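    An override that follows the note above might be sketched like this; what counts as a managed resource here is hypothetical and depends on the concrete subclass.

    ```csharp
    protected override void Dispose(bool disposing)
    {
        try
        {
            if (disposing)
            {
                // Release any managed resources this stream holds
                // (hypothetical; depends on the concrete subclass).
            }
        }
        finally
        {
            // Always call base.Dispose(disposing) so internal state
            // is reset correctly and the stream can be reused.
            base.Dispose(disposing);
        }
    }
    ```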

    End()

    This method is called by the consumer after the last token has been consumed, after IncrementToken() returned false (using the new TokenStream API). Streams implementing the old API should upgrade to use this feature.

    This method can be used to perform any end-of-stream operations, such as setting the final offset of a stream. The final offset of a stream might differ from the offset of the last token, e.g., when one or more whitespace characters followed the last token but a WhitespaceTokenizer was used.

    Additionally, any skipped positions (such as those removed by a StopFilter) can be applied to the position increment, and any other attribute whose end-of-stream value is important can be adjusted here.

    If you override this method, always call base.End();.

    Declaration
    public virtual void End()
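    For example, a tokenizer might override End() to record the final offset. This is a sketch, not library code: it assumes a cached IOffsetAttribute reference (offsetAtt) and a finalOffset field that the tokenizer updates while reading its input.

    ```csharp
    public override void End()
    {
        base.End(); // always call base.End()
        // finalOffset is a hypothetical field tracking how many
        // characters were read from the input; setting both start
        // and end to it marks the true end of the stream.
        offsetAtt.SetOffset(finalOffset, finalOffset);
    }
    ```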

    IncrementToken()

    Consumers (e.g., IndexWriter) use this method to advance the stream to the next token. Implementing classes must implement this method and update the appropriate IAttributes with the attributes of the next token.

    The producer must make no assumptions about the attributes after the method has returned: the caller may arbitrarily change them. If the producer needs to preserve the state for subsequent calls, it can use CaptureState() to create a copy of the current attribute state.

    This method is called for every token of a document, so an efficient implementation is crucial for good performance. To avoid calls to AddAttribute<T>() and GetAttribute<T>(), references to all IAttributes that this stream uses should be retrieved during instantiation.

    To ensure that filters and consumers know which attributes are available, the attributes must be added during instantiation. Filters and consumers are not required to check for availability of attributes in IncrementToken().

    Declaration
    public abstract bool IncrementToken()
    Returns
    Type Description
    System.Boolean

    false for end of stream; true otherwise


    Reset()

    This method is called by a consumer before it begins consumption using IncrementToken().

    Resets this stream to a clean state. Stateful implementations must implement this method so that they can be reused, just as if they had been created fresh.

    If you override this method, always call base.Reset(), otherwise some internal state will not be correctly reset (e.g., Tokenizer will throw on further usage).

    Declaration
    public virtual void Reset()
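    A minimal sketch of a Reset() override in a stateful filter; the buffered field is hypothetical, standing in for whatever per-stream state the implementation accumulates.

    ```csharp
    public override void Reset()
    {
        base.Reset(); // always call base.Reset()
        // Clear any per-stream state so this instance behaves as if
        // freshly created; 'buffered' is a hypothetical field.
        buffered.Clear();
    }
    ```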

    Implements

    IDisposable
    Copyright © 2020 Licensed to the Apache Software Foundation (ASF)