Class Lucene43NGramTokenizer
Old broken version of NGramTokenizer.
Namespace: Lucene.Net.Analysis.NGram
Assembly: Lucene.Net.Analysis.Common.dll
Syntax
[Obsolete]
public sealed class Lucene43NGramTokenizer : Tokenizer, IDisposable
Constructors
Lucene43NGramTokenizer(AttributeFactory, TextReader, int, int)
Creates Lucene43NGramTokenizer with given min and max n-grams.
Declaration
public Lucene43NGramTokenizer(AttributeSource.AttributeFactory factory, TextReader input, int minGram, int maxGram)
Parameters
Type | Name | Description |
---|---|---|
AttributeSource.AttributeFactory | factory | Lucene.Net.Util.AttributeSource.AttributeFactory to use |
TextReader | input | TextReader holding the input to be tokenized |
int | minGram | the smallest n-gram to generate |
int | maxGram | the largest n-gram to generate |
Lucene43NGramTokenizer(TextReader)
Creates Lucene43NGramTokenizer with default min and max n-grams.
Declaration
public Lucene43NGramTokenizer(TextReader input)
Parameters
Type | Name | Description |
---|---|---|
TextReader | input | TextReader holding the input to be tokenized |
Lucene43NGramTokenizer(TextReader, int, int)
Creates Lucene43NGramTokenizer with given min and max n-grams.
Declaration
public Lucene43NGramTokenizer(TextReader input, int minGram, int maxGram)
Parameters
Type | Name | Description |
---|---|---|
TextReader | input | TextReader holding the input to be tokenized |
int | minGram | the smallest n-gram to generate |
int | maxGram | the largest n-gram to generate |
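The overloads above differ only in whether an AttributeFactory and explicit gram sizes are supplied. A minimal construction sketch, assuming Lucene.Net 4.8 and a reference to the Lucene.Net.Analysis.Common package (the #pragma lines suppress the warning from the [Obsolete] attribute):

```csharp
using System.IO;
using Lucene.Net.Analysis.NGram;

#pragma warning disable CS0618 // Lucene43NGramTokenizer is [Obsolete]
// Default min/max n-grams (DEFAULT_MIN_NGRAM_SIZE = 1, DEFAULT_MAX_NGRAM_SIZE = 2):
var defaultTokenizer = new Lucene43NGramTokenizer(new StringReader("abcde"));

// Explicit min/max n-grams, here bigrams only:
var bigramTokenizer = new Lucene43NGramTokenizer(new StringReader("abcde"), minGram: 2, maxGram: 2);
#pragma warning restore CS0618
```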
Fields
DEFAULT_MAX_NGRAM_SIZE
The default maximum n-gram size (2).
Declaration
public const int DEFAULT_MAX_NGRAM_SIZE = 2
Field Value
Type | Description |
---|---|
int |
DEFAULT_MIN_NGRAM_SIZE
The default minimum n-gram size (1).
Declaration
public const int DEFAULT_MIN_NGRAM_SIZE = 1
Field Value
Type | Description |
---|---|
int |
Methods
End()
This method is called by the consumer after the last token has been
consumed, i.e. after Lucene.Net.Analysis.TokenStream.IncrementToken() has returned false
(using the new Lucene.Net.Analysis.TokenStream API). Streams implementing the old API
should upgrade to use this feature.
If you override this method, always call base.End().
Declaration
public override void End()
Exceptions
Type | Condition |
---|---|
IOException | If an I/O error occurs |
IncrementToken()
Advances the stream to the next token. Returns true if a token was produced, or false at end of stream (EOS).
Declaration
public override bool IncrementToken()
Returns
Type | Description |
---|---|
bool | true if a token was produced; false at end of stream |
Reset()
This method is called by a consumer before it begins consumption using Lucene.Net.Analysis.TokenStream.IncrementToken().
Resets this stream to a clean state. Stateful implementations must implement this method so that they can be reused, just as if they had been created fresh. If you override this method, always call base.Reset(); otherwise
some internal state will not be correctly reset (e.g., Lucene.Net.Analysis.Tokenizer will
throw InvalidOperationException on further usage).
Declaration
public override void Reset()
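Putting the methods above together, the standard consumption pattern is Reset(), a loop over IncrementToken(), then End() before disposal. A hedged sketch, assuming Lucene.Net 4.8, where ICharTermAttribute (from Lucene.Net.Analysis.TokenAttributes) exposes each token's text:

```csharp
using System;
using System.IO;
using Lucene.Net.Analysis.NGram;
using Lucene.Net.Analysis.TokenAttributes;

#pragma warning disable CS0618 // Lucene43NGramTokenizer is [Obsolete]
using (var tokenizer = new Lucene43NGramTokenizer(new StringReader("abcde"), minGram: 1, maxGram: 2))
{
    // The term attribute is updated in place on each IncrementToken() call.
    var termAtt = tokenizer.AddAttribute<ICharTermAttribute>();

    tokenizer.Reset();                  // must precede the first IncrementToken()
    while (tokenizer.IncrementToken())  // false at end of stream
    {
        Console.WriteLine(termAtt.ToString());
    }
    tokenizer.End();                    // end-of-stream bookkeeping
}
#pragma warning restore CS0618
```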