Class Analyzer
An Analyzer builds TokenStreams, which analyze text. It thus represents a policy for extracting index terms from text.
In order to define what analysis is done, subclasses must define their TokenStreamComponents in CreateComponents(string, TextReader). The components are then reused in each call to GetTokenStream(string, TextReader).
Simple example:
Analyzer analyzer = Analyzer.NewAnonymous(createComponents: (fieldName, reader) =>
{
Tokenizer source = new FooTokenizer(reader);
TokenStream filter = new FooFilter(source);
filter = new BarFilter(filter);
return new TokenStreamComponents(source, filter);
});
For more examples, see the Lucene.Net.Analysis namespace documentation.
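As a more concrete sketch, the same pattern can be written with stock components from the Lucene.Net.Analysis.Common package (assumed to be referenced here; the chain itself is only illustrative, not a recommended configuration):

```csharp
using System.IO;
using Lucene.Net.Analysis;
using Lucene.Net.Analysis.Core;      // LowerCaseFilter, StopFilter
using Lucene.Net.Analysis.Standard;  // StandardTokenizer, StandardAnalyzer
using Lucene.Net.Util;               // LuceneVersion

// Tokenize with StandardTokenizer, lowercase, then drop English stop words.
Analyzer analyzer = Analyzer.NewAnonymous(createComponents: (fieldName, reader) =>
{
    const LuceneVersion version = LuceneVersion.LUCENE_48;
    Tokenizer source = new StandardTokenizer(version, reader);
    TokenStream filter = new LowerCaseFilter(version, source);
    filter = new StopFilter(version, filter, StandardAnalyzer.STOP_WORDS_SET);
    return new TokenStreamComponents(source, filter);
});
```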
For some concrete implementations bundled with Lucene, look in the analysis modules:
- [Common](../analysis-common/overview.html): Analyzers for indexing content in different languages and domains.
- [ICU](../icu/Lucene.Net.Analysis.Icu.html): Exposes functionality from ICU to Apache Lucene.
- [Kuromoji](../analysis-kuromoji/Lucene.Net.Analysis.Ja.html): Morphological analyzer for Japanese text.
- [Morfologik](../analysis-morfologik/Lucene.Net.Analysis.Morfologik.html): Dictionary-driven lemmatization for the Polish language.
- [OpenNLP](../analysis-opennlp/Lucene.Net.Analysis.OpenNlp.html): Analysis integration with Apache OpenNLP.
- [Phonetic](../analysis-phonetic/Lucene.Net.Analysis.Phonetic.html): Analysis for indexing phonetic signatures (for sounds-alike search).
- [Smart Chinese](../analysis-smartcn/Lucene.Net.Analysis.Cn.Smart.html): Analyzer for Simplified Chinese, which indexes words.
- [Stempel](../analysis-stempel/Lucene.Net.Analysis.Stempel.html): Algorithmic Stemmer for the Polish Language.
Namespace: Lucene.Net.Analysis
Assembly: Lucene.Net.dll
Syntax
public abstract class Analyzer : IDisposable
Constructors
Analyzer()
Create a new Analyzer, reusing the same set of components per-thread across calls to GetTokenStream(string, TextReader).
Declaration
protected Analyzer()
Analyzer(ReuseStrategy)
Expert: create a new Analyzer with a custom ReuseStrategy.
NOTE: if you just want to reuse on a per-field basis, it's easier to use a subclass of AnalyzerWrapper such as Lucene.Net.Analysis.Miscellaneous.PerFieldAnalyzerWrapper instead.
Declaration
protected Analyzer(ReuseStrategy reuseStrategy)
Parameters
Type | Name | Description |
---|---|---|
ReuseStrategy | reuseStrategy |
Fields
GLOBAL_REUSE_STRATEGY
A predefined ReuseStrategy that reuses the same components for every field.
Declaration
public static readonly ReuseStrategy GLOBAL_REUSE_STRATEGY
Field Value
Type | Description |
---|---|
ReuseStrategy |
PER_FIELD_REUSE_STRATEGY
A predefined ReuseStrategy that reuses components per-field by maintaining a Map of TokenStreamComponents per field name.
Declaration
public static readonly ReuseStrategy PER_FIELD_REUSE_STRATEGY
Field Value
Type | Description |
---|---|
ReuseStrategy |
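For illustration, a minimal sketch that selects PER_FIELD_REUSE_STRATEGY when creating an anonymous analyzer (WhitespaceTokenizer is assumed to come from the Lucene.Net.Analysis.Common package; see also the NewAnonymous overloads below):

```csharp
using System.IO;
using Lucene.Net.Analysis;
using Lucene.Net.Analysis.Core; // WhitespaceTokenizer
using Lucene.Net.Util;          // LuceneVersion

// Keep a separate set of TokenStreamComponents per field name instead of
// GLOBAL_REUSE_STRATEGY, which reuses the same components for every field.
Analyzer analyzer = Analyzer.NewAnonymous(
    createComponents: (fieldName, reader) =>
        new TokenStreamComponents(new WhitespaceTokenizer(LuceneVersion.LUCENE_48, reader)),
    reuseStrategy: Analyzer.PER_FIELD_REUSE_STRATEGY);
```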
Properties
Strategy
Gets the ReuseStrategy used by this Analyzer.
Declaration
public ReuseStrategy Strategy { get; }
Property Value
Type | Description |
---|---|
ReuseStrategy |
Methods
CreateComponents(string, TextReader)
Creates a new TokenStreamComponents instance for this analyzer.
Declaration
protected abstract TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
Parameters
Type | Name | Description |
---|---|---|
string | fieldName | the name of the field whose content is passed to the TokenStreamComponents sink as a reader |
TextReader | reader | the reader passed to the Tokenizer constructor |
Returns
Type | Description |
---|---|
TokenStreamComponents | the TokenStreamComponents for this analyzer. |
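For illustration, a minimal subclass sketch implementing this method (the class name is hypothetical; LetterTokenizer and LowerCaseFilter are assumed from the Lucene.Net.Analysis.Common package):

```csharp
using System.IO;
using Lucene.Net.Analysis;
using Lucene.Net.Analysis.Core; // LetterTokenizer, LowerCaseFilter
using Lucene.Net.Util;          // LuceneVersion

// The chain is defined once here; GetTokenStream reuses it according to the analyzer's ReuseStrategy.
public sealed class MyLetterAnalyzer : Analyzer
{
    protected override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
    {
        Tokenizer source = new LetterTokenizer(LuceneVersion.LUCENE_48, reader);
        TokenStream filter = new LowerCaseFilter(LuceneVersion.LUCENE_48, source);
        return new TokenStreamComponents(source, filter);
    }
}
```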
Dispose()
Frees persistent resources used by this Analyzer.
Declaration
public void Dispose()
Dispose(bool)
Frees persistent resources used by this Analyzer.
Declaration
protected virtual void Dispose(bool disposing)
Parameters
Type | Name | Description |
---|---|---|
bool | disposing | true to release both managed and unmanaged resources; false to release only unmanaged resources |
GetOffsetGap(string)
Just like GetPositionIncrementGap(string), except for Token offsets instead. By default this returns 1. This method is only called if the field produced at least one token for indexing.
Declaration
public virtual int GetOffsetGap(string fieldName)
Parameters
Type | Name | Description |
---|---|---|
string | fieldName | the field just indexed |
Returns
Type | Description |
---|---|
int | offset gap, added to the next token emitted from GetTokenStream(string, TextReader). This value must be >= 0. |
GetPositionIncrementGap(string)
Invoked before indexing an IIndexableField instance if terms have already been added to that field. This allows custom analyzers to place an automatic position increment gap between IIndexableField instances using the same field name. The default position increment gap is 0. With a 0 position increment gap and the typical default token position increment of 1, all terms in a field, including across IIndexableField instances, are in successive positions, allowing exact PhraseQuery matches, for instance, across IIndexableField instance boundaries.
Declaration
public virtual int GetPositionIncrementGap(string fieldName)
Parameters
Type | Name | Description |
---|---|---|
string | fieldName | IIndexableField name being indexed. |
Returns
Type | Description |
---|---|
int | position increment gap, added to the next token emitted from GetTokenStream(string, TextReader). This value must be >= 0. |
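For illustration, a hedged sketch of a subclass that widens the gap between multiple values of the same field so that exact PhraseQuery matches cannot span value boundaries (the class name, tokenizer choice, and gap values are illustrative only; the same override pattern applies to GetOffsetGap above):

```csharp
using System.IO;
using Lucene.Net.Analysis;
using Lucene.Net.Analysis.Core; // WhitespaceTokenizer (Lucene.Net.Analysis.Common)
using Lucene.Net.Util;          // LuceneVersion

public sealed class GappedAnalyzer : Analyzer
{
    protected override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
    {
        return new TokenStreamComponents(new WhitespaceTokenizer(LuceneVersion.LUCENE_48, reader));
    }

    // Leave 100 unused positions between IIndexableField instances that share a field name.
    public override int GetPositionIncrementGap(string fieldName)
    {
        return 100;
    }

    // Keep the default offset behavior explicit: a gap of 1 between instances.
    public override int GetOffsetGap(string fieldName)
    {
        return 1;
    }
}
```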
GetTokenStream(string, TextReader)
Returns a TokenStream suitable for fieldName, tokenizing the contents of reader.
Declaration
public TokenStream GetTokenStream(string fieldName, TextReader reader)
Parameters
Type | Name | Description |
---|---|---|
string | fieldName | the name of the field the created TokenStream is used for |
TextReader | reader | the reader the stream's source reads from |
Returns
Type | Description |
---|---|
TokenStream | TokenStream for iterating the analyzed content of TextReader |
Exceptions
Type | Condition |
---|---|
ObjectDisposedException | if the Analyzer is disposed. |
IOException | if an I/O error occurs. |
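A typical way to consume the returned stream, shown as a sketch (the workflow follows the standard TokenStream contract: Reset, IncrementToken in a loop, End, then Dispose; the field name and input text are placeholders, and WhitespaceTokenizer is assumed from the Lucene.Net.Analysis.Common package):

```csharp
using System;
using System.IO;
using Lucene.Net.Analysis;
using Lucene.Net.Analysis.Core;            // WhitespaceTokenizer
using Lucene.Net.Analysis.TokenAttributes; // ICharTermAttribute
using Lucene.Net.Util;                     // LuceneVersion

Analyzer analyzer = Analyzer.NewAnonymous(createComponents: (fieldName, reader) =>
    new TokenStreamComponents(new WhitespaceTokenizer(LuceneVersion.LUCENE_48, reader)));

TokenStream stream = analyzer.GetTokenStream("body", new StringReader("Some text to analyze"));
ICharTermAttribute termAtt = stream.AddAttribute<ICharTermAttribute>();
try
{
    stream.Reset();                            // required before the first IncrementToken call
    while (stream.IncrementToken())
    {
        Console.WriteLine(termAtt.ToString()); // prints each produced term
    }
    stream.End();                              // records end-of-stream state (e.g. final offset)
}
finally
{
    stream.Dispose();                          // releases the stream so its components can be reused
}
```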
GetTokenStream(string, string)
Returns a TokenStream suitable for fieldName, tokenizing the contents of text.
Declaration
public TokenStream GetTokenStream(string fieldName, string text)
Parameters
Type | Name | Description |
---|---|---|
string | fieldName | the name of the field the created TokenStream is used for |
string | text | the string the stream's source reads from |
Returns
Type | Description |
---|---|
TokenStream | TokenStream for iterating the analyzed content of text |
Exceptions
Type | Condition |
---|---|
ObjectDisposedException | if the Analyzer is disposed. |
IOException | if an I/O error occurs (may rarely happen for strings). |
InitReader(string, TextReader)
Override this if you want to add a CharFilter chain. The default implementation returns reader unchanged.
Declaration
protected virtual TextReader InitReader(string fieldName, TextReader reader)
Parameters
Type | Name | Description |
---|---|---|
string | fieldName | IIndexableField name being indexed |
TextReader | reader | original TextReader |
Returns
Type | Description |
---|---|
TextReader | reader, optionally decorated with CharFilter(s) |
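For illustration, a hedged sketch of a subclass that adds a CharFilter chain here (the class name is hypothetical; HTMLStripCharFilter and WhitespaceTokenizer are assumed from the Lucene.Net.Analysis.Common package):

```csharp
using System.IO;
using Lucene.Net.Analysis;
using Lucene.Net.Analysis.CharFilters; // HTMLStripCharFilter
using Lucene.Net.Analysis.Core;        // WhitespaceTokenizer
using Lucene.Net.Util;                 // LuceneVersion

public sealed class HtmlAwareAnalyzer : Analyzer
{
    // Strip HTML markup before the Tokenizer sees the text.
    protected override TextReader InitReader(string fieldName, TextReader reader)
    {
        return new HTMLStripCharFilter(reader);
    }

    protected override TokenStreamComponents CreateComponents(string fieldName, TextReader reader)
    {
        return new TokenStreamComponents(new WhitespaceTokenizer(LuceneVersion.LUCENE_48, reader));
    }
}
```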
NewAnonymous(Func<string, TextReader, TokenStreamComponents>)
Creates a new instance with the ability to specify the body of the CreateComponents(string, TextReader) method through the createComponents parameter.
Simple example:
var analyzer = Analyzer.NewAnonymous(createComponents: (fieldName, reader) =>
{
Tokenizer source = new FooTokenizer(reader);
TokenStream filter = new FooFilter(source);
filter = new BarFilter(filter);
return new TokenStreamComponents(source, filter);
});
LUCENENET specific
Declaration
public static Analyzer NewAnonymous(Func<string, TextReader, TokenStreamComponents> createComponents)
Parameters
Type | Name | Description |
---|---|---|
Func<string, TextReader, TokenStreamComponents> | createComponents | A delegate method that represents (is called by) the CreateComponents(string, TextReader) method. It accepts a string fieldName and a TextReader reader and returns the TokenStreamComponents for this analyzer. |
Returns
Type | Description |
---|---|
Analyzer | A new Analyzer.AnonymousAnalyzer instance. |
NewAnonymous(Func<string, TextReader, TokenStreamComponents>, ReuseStrategy)
Creates a new instance with the ability to specify the body of the CreateComponents(string, TextReader) method through the createComponents parameter and allows the use of a ReuseStrategy.
Simple example:
var analyzer = Analyzer.NewAnonymous(createComponents: (fieldName, reader) =>
{
Tokenizer source = new FooTokenizer(reader);
TokenStream filter = new FooFilter(source);
filter = new BarFilter(filter);
return new TokenStreamComponents(source, filter);
}, reuseStrategy);
LUCENENET specific
Declaration
public static Analyzer NewAnonymous(Func<string, TextReader, TokenStreamComponents> createComponents, ReuseStrategy reuseStrategy)
Parameters
Type | Name | Description |
---|---|---|
Func<string, TextReader, TokenStreamComponents> | createComponents | A delegate method that represents (is called by) the CreateComponents(string, TextReader) method. It accepts a string fieldName and a TextReader reader and returns the TokenStreamComponents for this analyzer. |
ReuseStrategy | reuseStrategy | A custom ReuseStrategy instance. |
Returns
Type | Description |
---|---|
Analyzer | A new Analyzer.AnonymousAnalyzer instance. |
NewAnonymous(Func<string, TextReader, TokenStreamComponents>, Func<string, TextReader, TextReader>)
Creates a new instance with the ability to specify the body of the CreateComponents(string, TextReader) method through the createComponents parameter and the body of the InitReader(string, TextReader) method through the initReader parameter.
Simple example:
var analyzer = Analyzer.NewAnonymous(createComponents: (fieldName, reader) =>
{
Tokenizer source = new FooTokenizer(reader);
TokenStream filter = new FooFilter(source);
filter = new BarFilter(filter);
return new TokenStreamComponents(source, filter);
}, initReader: (fieldName, reader) =>
{
return new HTMLStripCharFilter(reader);
});
LUCENENET specific
Declaration
public static Analyzer NewAnonymous(Func<string, TextReader, TokenStreamComponents> createComponents, Func<string, TextReader, TextReader> initReader)
Parameters
Type | Name | Description |
---|---|---|
Func<string, TextReader, TokenStreamComponents> | createComponents | A delegate method that represents (is called by) the CreateComponents(string, TextReader) method. It accepts a string fieldName and a TextReader reader and returns the TokenStreamComponents for this analyzer. |
Func<string, TextReader, TextReader> | initReader | A delegate method that represents (is called by) the InitReader(string, TextReader) method. It accepts a string fieldName and a TextReader reader and returns the TextReader that can be modified or wrapped before it is passed to the createComponents delegate. |
Returns
Type | Description |
---|---|
Analyzer | A new Analyzer.AnonymousAnalyzer instance. |
NewAnonymous(Func<string, TextReader, TokenStreamComponents>, Func<string, TextReader, TextReader>, ReuseStrategy)
Creates a new instance with the ability to specify the body of the CreateComponents(string, TextReader) method through the createComponents parameter, the body of the InitReader(string, TextReader) method through the initReader parameter, and allows the use of a ReuseStrategy.
Simple example:
var analyzer = Analyzer.NewAnonymous(createComponents: (fieldName, reader) =>
{
Tokenizer source = new FooTokenizer(reader);
TokenStream filter = new FooFilter(source);
filter = new BarFilter(filter);
return new TokenStreamComponents(source, filter);
}, initReader: (fieldName, reader) =>
{
return new HTMLStripCharFilter(reader);
}, reuseStrategy);
LUCENENET specific
Declaration
public static Analyzer NewAnonymous(Func<string, TextReader, TokenStreamComponents> createComponents, Func<string, TextReader, TextReader> initReader, ReuseStrategy reuseStrategy)
Parameters
Type | Name | Description |
---|---|---|
Func<string, TextReader, TokenStreamComponents> | createComponents | A delegate method that represents (is called by) the CreateComponents(string, TextReader) method. It accepts a string fieldName and a TextReader reader and returns the TokenStreamComponents for this analyzer. |
Func<string, TextReader, TextReader> | initReader | A delegate method that represents (is called by) the InitReader(string, TextReader) method. It accepts a string fieldName and a TextReader reader and returns the TextReader that can be modified or wrapped before it is passed to the createComponents delegate. |
ReuseStrategy | reuseStrategy | A custom ReuseStrategy instance. |
Returns
Type | Description |
---|---|
Analyzer | A new Analyzer.AnonymousAnalyzer instance. |