Class Token
A Token is an occurrence of a term from the text of a field. It consists of a term's text, the start and end offset of the term in the text of the field, and a type string.
The start and end offsets permit applications to re-associate a token with its source text, e.g., to display highlighted query terms in a document browser, or to show matching text fragments in a KWIC (KeyWord In Context) display. The type is a string, assigned by a lexical analyzer (a.k.a. tokenizer), naming the lexical or syntactic class that the token belongs to. For example, an end-of-sentence marker token might be implemented with type "eos". The default token type is "word".
A Token can optionally have metadata (a.k.a. payload) in the form of a variable-length byte array. Use GetPayload() to retrieve the payloads from the index.
NOTE: As of 2.9, Token implements all IAttribute interfaces that are part of core Lucene and can be found in the Lucene.Net.Analysis.TokenAttributes namespace. Even though Token is no longer necessary with the new TokenStream API, it can be used as a convenience class that implements all IAttributes, which is especially useful when switching from the old to the new TokenStream API.
Tokenizers and TokenFilters should try to re-use a Token instance when possible for best performance, by implementing the IncrementToken() API. Failing that, to create a new Token you should first use one of the constructors that start with null text. To load the token from a char[] use CopyBuffer(char[], int, int). To load from a string use SetEmpty() followed by Append(string) or Append(string, int, int). Alternatively you can get the Token's termBuffer by calling either Buffer, if you know that your text is shorter than the capacity of the termBuffer, or ResizeBuffer(int), if there is any possibility that you may need to grow the buffer. Fill in the characters of your term into this buffer, with ToCharArray(int, int) if loading from a string, or with Copy(Array, int, Array, int, int), and finally call SetLength(int) to set the length of the term text. See LUCENE-969 for details.
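For illustration, a minimal sketch (not part of the original API docs; the variable names and sample text are illustrative) of the three term-loading approaches described above:
using Lucene.Net.Analysis;

var reusableToken = new Token();                  // one of the constructors with null text

// Load from a string:
reusableToken.SetEmpty().Append("example");
reusableToken.SetOffset(0, 7);

// Load from a char[] range:
char[] source = "some example text".ToCharArray();
reusableToken.CopyBuffer(source, 5, 7);           // copies "example"
reusableToken.SetOffset(5, 12);

// Or write directly into the term buffer and record the length:
char[] buffer = reusableToken.ResizeBuffer(7);    // grows the buffer if necessary
"example".CopyTo(0, buffer, 0, 7);
reusableToken.SetLength(7);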
Typical Token reuse patterns:
- Copying text from a string (type is reset to DEFAULT_TYPE if not specified):
return reusableToken.Reinit(string, startOffset, endOffset[, type]);
- Copying some text from a string (type is reset to DEFAULT_TYPE if not specified):
return reusableToken.Reinit(string, 0, string.Length, startOffset, endOffset[, type]);
- Copying text from char[] buffer (type is reset to DEFAULT_TYPE if not specified):
return reusableToken.Reinit(buffer, 0, buffer.Length, startOffset, endOffset[, type]);
- Copying some text from a char[] buffer (type is reset to DEFAULT_TYPE if not specified):
return reusableToken.Reinit(buffer, start, end - start, startOffset, endOffset[, type]);
- Copying from one Token to another (type is reset to DEFAULT_TYPE if not specified):
return reusableToken.Reinit(source.Buffer, 0, source.Length, source.StartOffset, source.EndOffset[, source.Type]);
- Clear() initializes all of the fields to default values. This differs from Lucene 2.4, but should affect no one.
- Because TokenStreams can be chained, one cannot assume that the Token's current type is correct.
- The startOffset and endOffset represent the start and end offsets in the source text, so be careful in adjusting them.
- When caching a reusable token, clone it. When injecting a cached token into a stream that can be reset, clone it again.
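A minimal sketch of this caching rule (the helper class and method are hypothetical, not part of the API): clone when storing, clone again when replaying.
using System.Collections.Generic;
using Lucene.Net.Analysis;

public static class TokenCacheExample
{
    public static Token CacheAndReplay(Token reusableToken, IList<Token> cache)
    {
        // The producer will overwrite reusableToken on its next IncrementToken(),
        // so cache a private copy rather than the shared instance.
        cache.Add((Token)reusableToken.Clone());

        // Hand out yet another copy so downstream filters cannot mutate the
        // cached original after the consuming stream is reset.
        return (Token)cache[cache.Count - 1].Clone();
    }
}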
Please note: With Lucene 3.1, the ToString() method had to be changed to match the J2N.Text.ICharSequence interface introduced by ICharTermAttribute. This method now prints only the term text, with no additional information.
Namespace: Lucene.Net.Analysis
Assembly: Lucene.Net.dll
Syntax
public class Token : CharTermAttribute, ICharTermAttribute, ICharSequence, ITermToBytesRefAttribute, IAppendable, ITypeAttribute, IPositionIncrementAttribute, IFlagsAttribute, IOffsetAttribute, IPayloadAttribute, IPositionLengthAttribute, IAttribute
Constructors
Token()
Constructs a Token with null text.
Declaration
public Token()
Token(char[], int, int, int, int)
Constructs a Token with the given term buffer (offset & length), and start and end offsets.
Declaration
public Token(char[] startTermBuffer, int termBufferOffset, int termBufferLength, int start, int end)
Parameters
Type | Name | Description |
---|---|---|
char[] | startTermBuffer | buffer containing term text |
int | termBufferOffset | the index in the buffer of the first character |
int | termBufferLength | number of valid characters in the buffer |
int | start | start offset in the source text |
int | end | end offset in the source text |
Token(int, int)
Constructs a Token with null text and start & end offsets.
Declaration
public Token(int start, int end)
Parameters
Type | Name | Description |
---|---|---|
int | start | start offset in the source text |
int | end | end offset in the source text |
Token(int, int, int)
Constructs a Token with null text and start & end offsets plus flags. NOTE: flags is EXPERIMENTAL.
Declaration
public Token(int start, int end, int flags)
Parameters
Type | Name | Description |
---|---|---|
int | start | start offset in the source text |
int | end | end offset in the source text |
int | flags | The bits to set for this token |
Token(int, int, string)
Constructs a Token with null text, start & end offsets, and the Token type.
Declaration
public Token(int start, int end, string typ)
Parameters
Type | Name | Description |
---|---|---|
int | start | start offset in the source text |
int | end | end offset in the source text |
string | typ | the lexical type of this Token |
Token(string, int, int)
Constructs a Token with the given term text, and start & end offsets. The type defaults to "word." NOTE: for better indexing speed you should instead use the char[] termBuffer methods to set the term text.
Declaration
public Token(string text, int start, int end)
Parameters
Type | Name | Description |
---|---|---|
string | text | term text |
int | start | start offset in the source text |
int | end | end offset in the source text |
Token(string, int, int, int)
Constructs a Token with the given text, start and end offsets, and flags. NOTE: for better indexing speed you should instead use the char[] termBuffer methods to set the term text.
Declaration
public Token(string text, int start, int end, int flags)
Parameters
Type | Name | Description |
---|---|---|
string | text | term text |
int | start | start offset in the source text |
int | end | end offset in the source text |
int | flags | token type bits |
Token(string, int, int, string)
Constructs a Token with the given text, start and end offsets, & type. NOTE: for better indexing speed you should instead use the char[] termBuffer methods to set the term text.
Declaration
public Token(string text, int start, int end, string typ)
Parameters
Type | Name | Description |
---|---|---|
string | text | term text |
int | start | start offset in the source text |
int | end | end offset in the source text |
string | typ | token type |
Fields
TOKEN_ATTRIBUTE_FACTORY
Convenience factory that returns Token as the implementation for the basic attributes and returns the default implementation (with "Impl" appended) for all other attributes. Since 3.0.
Declaration
public static readonly AttributeSource.AttributeFactory TOKEN_ATTRIBUTE_FACTORY
Field Value
Type | Description |
---|---|
AttributeSource.AttributeFactory |
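As a hedged sketch of how this factory is typically used: a custom tokenizer can pass it to its base class so that AddAttribute<T>() hands back Token-backed attribute instances. The tokenizer class below is hypothetical, and the Tokenizer base constructor, m_input field, and CorrectOffset(int) are assumed from the Lucene.Net.Analysis.Tokenizer API.
using System.IO;
using Lucene.Net.Analysis;
using Lucene.Net.Analysis.TokenAttributes;

public sealed class SingleChunkTokenizer : Tokenizer
{
    private readonly ICharTermAttribute termAtt;
    private readonly IOffsetAttribute offsetAtt;
    private bool done;

    public SingleChunkTokenizer(TextReader input)
        : base(Token.TOKEN_ATTRIBUTE_FACTORY, input)   // attributes are backed by Token
    {
        termAtt = AddAttribute<ICharTermAttribute>();  // actually a Token instance
        offsetAtt = AddAttribute<IOffsetAttribute>();  // same underlying Token
    }

    public override bool IncrementToken()
    {
        if (done) return false;
        done = true;
        ClearAttributes();
        string text = m_input.ReadToEnd();             // emit the whole input as one token
        if (text.Length == 0) return false;
        termAtt.SetEmpty().Append(text);
        offsetAtt.SetOffset(CorrectOffset(0), CorrectOffset(text.Length));
        return true;
    }

    public override void Reset()
    {
        base.Reset();
        done = false;
    }
}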
Properties
EndOffset
Returns this Token's ending offset, one greater than the position of the last character corresponding to this token in the source text. The length of the token in the source text is (EndOffset - StartOffset).
Declaration
public int EndOffset { get; }
Property Value
Type | Description |
---|---|
int |
Flags
Gets or Sets the bitset for any bits that have been set.
This is completely distinct from Type, although they do share similar purposes. The flags can be used to encode information about the token for use by other TokenFilters.
Declaration
public virtual int Flags { get; set; }
Property Value
Type | Description |
---|---|
int |
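A short hedged sketch of flag usage (the flag constant is an arbitrary application-defined value, not part of Lucene.NET):
using Lucene.Net.Analysis;

const int KeywordFlag = 1 << 0;                    // hypothetical app-defined bit

var token = new Token("lucene", 0, 6);
token.Flags |= KeywordFlag;                        // mark the token for a downstream filter
bool isKeyword = (token.Flags & KeywordFlag) != 0; // a later filter checks the bit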
Payload
Gets or Sets this Token's payload.
Declaration
public virtual BytesRef Payload { get; set; }
Property Value
Type | Description |
---|---|
BytesRef |
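A minimal sketch of attaching a payload (the byte value is arbitrary illustration data):
using Lucene.Net.Analysis;
using Lucene.Net.Util;                             // BytesRef

var token = new Token("lucene", 0, 6);
token.Payload = new BytesRef(new byte[] { 42 });   // stored with the token's postings when indexed with payloads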
PositionIncrement
Gets or Sets the position increment (the distance from the prior term). The default value is one.
Declaration
public virtual int PositionIncrement { get; set; }
Property Value
Type | Description |
---|---|
int |
Exceptions
Type | Condition |
---|---|
ArgumentException | if value is set to a negative value. |
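A hedged sketch of the common synonym case, where a second token is placed at the same position as the original by setting the increment to zero (the terms and offsets are illustrative):
using Lucene.Net.Analysis;

var original = new Token("quick", 4, 9);           // "quick" at source offsets 4..9
var synonym = new Token("fast", 4, 9)
{
    PositionIncrement = 0                          // occupies the same position as "quick"
};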
PositionLength
Gets or Sets the position length of this Token (how many positions this token spans).
The default value is one.
Declaration
public virtual int PositionLength { get; set; }
Property Value
Type | Description |
---|---|
int |
Exceptions
Type | Condition |
---|---|
ArgumentException | if value is set to zero or negative. |
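A hedged sketch: a token standing in for a two-word phrase can declare that it spans two positions (the term and offsets are illustrative):
using Lucene.Net.Analysis;

var phrase = new Token("new_york", 0, 8)           // single token covering "new york"
{
    PositionLength = 2                             // spans the positions of both underlying words
};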
StartOffset
Returns this Token's starting offset, the position of the first character corresponding to this token in the source text.
Note that the difference between EndOffset and StartOffset may not be equal to termText.Length, as the term text may have been altered by a stemmer or some other filter.
Declaration
public int StartOffset { get; }
Property Value
Type | Description |
---|---|
int |
Type
Gets or Sets this Token's lexical type. Defaults to "word".
Declaration
public string Type { get; set; }
Property Value
Type | Description |
---|---|
string |
Methods
Clear()
Resets the term text, payload, flags, positionIncrement, startOffset, endOffset, and token type to their defaults.
Declaration
public override void Clear()
Clone()
Shallow clone. Subclasses must override this if they need to clone any members deeply.
Declaration
public override object Clone()
Returns
Type | Description |
---|---|
object |
Clone(char[], int, int, int, int)
Makes a clone, but replaces the term buffer & start/end offset in the process. This is more efficient than doing a full clone (and then calling CopyBuffer(char[], int, int)) because it saves a wasted copy of the old termBuffer.
Declaration
public virtual Token Clone(char[] newTermBuffer, int newTermOffset, int newTermLength, int newStartOffset, int newEndOffset)
Parameters
Type | Name | Description |
---|---|---|
char[] | newTermBuffer | |
int | newTermOffset | |
int | newTermLength | |
int | newStartOffset | |
int | newEndOffset |
Returns
Type | Description |
---|---|
Token |
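A minimal sketch of the intended use (the stemming scenario and names are illustrative):
using Lucene.Net.Analysis;

Token original = new Token("running", 0, 7);
char[] stem = "run".ToCharArray();

// Copies the other attributes but swaps in the new term buffer directly,
// avoiding a pointless copy of the old buffer that would be overwritten anyway.
Token stemmed = original.Clone(stem, 0, stem.Length,
                               original.StartOffset, original.EndOffset);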
CopyTo(IAttribute)
Copies the values from this Attribute into the passed-in target attribute. The target implementation must support all the IAttributes this implementation supports.
Declaration
public override void CopyTo(IAttribute target)
Parameters
Type | Name | Description |
---|---|---|
IAttribute | target |
Equals(object)
Determines whether the specified object is equal to the current object.
Declaration
public override bool Equals(object obj)
Parameters
Type | Name | Description |
---|---|---|
object | obj | The object to compare with the current object. |
Returns
Type | Description |
---|---|
bool | true if the specified object is equal to the current object; otherwise, false. |
GetHashCode()
Serves as the default hash function.
Declaration
public override int GetHashCode()
Returns
Type | Description |
---|---|
int | A hash code for the current object. |
ReflectWith(IAttributeReflector)
This method is for introspection of attributes; it should simply add the key/values this attribute holds to the given IAttributeReflector.
The default implementation calls Reflect(Type, string, object) for all non-static fields from the implementing class, using the field name as key and the field value as value. The IAttribute class is also determined by reflection. Please note that the default implementation can only handle single-attribute implementations. A custom implementation (e.g. for a combined attribute implementation) looks like this:
public void ReflectWith(IAttributeReflector reflector)
{
    reflector.Reflect(typeof(ICharTermAttribute), "term", GetTerm());
    reflector.Reflect(typeof(IPositionIncrementAttribute), "positionIncrement", GetPositionIncrement());
}
If you implement this method, make sure that for each invocation the same set of IAttribute interfaces and keys are passed to Reflect(Type, string, object) in the same order, but possibly with different values. So don't automatically exclude e.g. null properties!
Declaration
public override void ReflectWith(IAttributeReflector reflector)
Parameters
Type | Name | Description |
---|---|---|
IAttributeReflector | reflector |
Reinit(Token)
Copy the prototype token's fields into this one. Note: Payloads are shared.
Declaration
public virtual void Reinit(Token prototype)
Parameters
Type | Name | Description |
---|---|---|
Token | prototype | source Token to copy fields from |
Reinit(Token, char[], int, int)
Copy the prototype token's fields into this one, with a different term. Note: Payloads are shared.
Declaration
public virtual void Reinit(Token prototype, char[] newTermBuffer, int offset, int length)
Parameters
Type | Name | Description |
---|---|---|
Token | prototype | existing Token |
char[] | newTermBuffer | buffer containing new term text |
int | offset | the index in the buffer of the first character |
int | length | number of valid characters in the buffer |
Reinit(Token, string)
Copy the prototype token's fields into this one, with a different term. Note: Payloads are shared.
Declaration
public virtual void Reinit(Token prototype, string newTerm)
Parameters
Type | Name | Description |
---|---|---|
Token | prototype | existing Token |
string | newTerm | new term text |
Reinit(char[], int, int, int, int)
Shorthand for calling Clear(), CopyBuffer(char[], int, int), SetOffset(int, int), and setting Type to DEFAULT_TYPE.
Declaration
public virtual Token Reinit(char[] newTermBuffer, int newTermOffset, int newTermLength, int newStartOffset, int newEndOffset)
Parameters
Type | Name | Description |
---|---|---|
char[] | newTermBuffer | |
int | newTermOffset | |
int | newTermLength | |
int | newStartOffset | |
int | newEndOffset |
Returns
Type | Description |
---|---|
Token | this Token instance |
Reinit(char[], int, int, int, int, string)
Shorthand for calling Clear(), CopyBuffer(char[], int, int), SetOffset(int, int), and setting Type.
Declaration
public virtual Token Reinit(char[] newTermBuffer, int newTermOffset, int newTermLength, int newStartOffset, int newEndOffset, string newType)
Parameters
Type | Name | Description |
---|---|---|
char[] | newTermBuffer | |
int | newTermOffset | |
int | newTermLength | |
int | newStartOffset | |
int | newEndOffset | |
string | newType |
Returns
Type | Description |
---|---|
Token | this Token instance |
Reinit(string, int, int)
Shorthand for calling Clear(), Append(string), SetOffset(int, int), and setting Type to DEFAULT_TYPE.
Declaration
public virtual Token Reinit(string newTerm, int newStartOffset, int newEndOffset)
Parameters
Type | Name | Description |
---|---|---|
string | newTerm | |
int | newStartOffset | |
int | newEndOffset |
Returns
Type | Description |
---|---|
Token | this Token instance |
Reinit(string, int, int, int, int)
Shorthand for calling Clear(), Append(string, int, int), SetOffset(int, int), and setting Type to DEFAULT_TYPE.
Declaration
public virtual Token Reinit(string newTerm, int newTermOffset, int newTermLength, int newStartOffset, int newEndOffset)
Parameters
Type | Name | Description |
---|---|---|
string | newTerm | |
int | newTermOffset | |
int | newTermLength | |
int | newStartOffset | |
int | newEndOffset |
Returns
Type | Description |
---|---|
Token | this Token instance |
Reinit(string, int, int, int, int, string)
Shorthand for calling Clear(), Append(string, int, int), SetOffset(int, int), and setting Type.
Declaration
public virtual Token Reinit(string newTerm, int newTermOffset, int newTermLength, int newStartOffset, int newEndOffset, string newType)
Parameters
Type | Name | Description |
---|---|---|
string | newTerm | |
int | newTermOffset | |
int | newTermLength | |
int | newStartOffset | |
int | newEndOffset | |
string | newType |
Returns
Type | Description |
---|---|
Token | this Token instance |
Reinit(string, int, int, string)
Shorthand for calling Clear(), Append(string), SetOffset(int, int), and setting Type.
Declaration
public virtual Token Reinit(string newTerm, int newStartOffset, int newEndOffset, string newType)
Parameters
Type | Name | Description |
---|---|---|
string | newTerm | |
int | newStartOffset | |
int | newEndOffset | |
string | newType |
Returns
Type | Description |
---|---|
Token | this Token instance |
SetOffset(int, int)
Set the starting and ending offset.
Declaration
public virtual void SetOffset(int startOffset, int endOffset)
Parameters
Type | Name | Description |
---|---|---|
int | startOffset | |
int | endOffset |
Exceptions
Type | Condition |
---|---|
ArgumentException | If startOffset or endOffset are negative, or if startOffset is greater than endOffset |