This TokenFilter provides the ability to set aside attribute states that have already been analyzed. This is useful in situations where multiple fields share many common analysis steps and then go their separate ways.

It is also useful for doing things like entity extraction or proper noun analysis as part of the analysis workflow and saving off those tokens for use in another field.

            // reader1 and reader2 are Readers over the raw field text, and d is the Document being built
            TeeSinkTokenFilter source1 = new TeeSinkTokenFilter(new WhitespaceTokenizer(reader1));
            TeeSinkTokenFilter.SinkTokenStream sink1 = source1.newSinkTokenStream();
            TeeSinkTokenFilter.SinkTokenStream sink2 = source1.newSinkTokenStream();
            TeeSinkTokenFilter source2 = new TeeSinkTokenFilter(new WhitespaceTokenizer(reader2));
            // the same sinks also receive every token consumed from source2
            source2.addSinkTokenStream(sink1);
            source2.addSinkTokenStream(sink2);
            TokenStream final1 = new LowerCaseFilter(source1);
            TokenStream final2 = source2;
            TokenStream final3 = new EntityDetect(sink1);
            TokenStream final4 = new URLDetect(sink2);
            d.add(new Field("f1", final1));
            d.add(new Field("f2", final2));
            d.add(new Field("f3", final3));
            d.add(new Field("f4", final4));
            
In this example, sink1 and sink2 will both get tokens from both reader1 and reader2 after the whitespace tokenizer, any of these streams can be further wrapped in extra analysis, and more "sources" can be inserted if desired. It is important that the tees are consumed before the sinks; in the above example that means the tee field names ("f1", "f2") must sort before (compare less than) the sink field names ("f3", "f4"). If you are not sure which stream is consumed first, you can simply add another sink and then push all tokens to the sinks at once using consumeAllTokens(). The tee is exhausted after that. To do this, change the example above to:
            ...
            TokenStream final1 = new LowerCaseFilter(source1.newSinkTokenStream());
            TokenStream final2 = source2.newSinkTokenStream();
            // consumeAllTokens() is defined on the tee; it pushes all remaining tokens
            // from that tee into its sinks and leaves the tee exhausted
            source1.consumeAllTokens();
            source2.consumeAllTokens();
            ...
            
In this case, the fields can be added in any order, because the sources are not used anymore and all sinks are ready.

Note that the EntityDetect and URLDetect TokenStreams are illustrative only and do not currently exist in Lucene.
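
Since EntityDetect does not exist, the following is only a minimal sketch of what such a filter could look like, written against the Java TokenFilter API used in the examples above. The class itself, its ENTITY_PATTERN, and the "ENTITY" type name are illustrative assumptions (it simply treats any capitalized word as an entity), not part of Lucene.

            import java.io.IOException;
            import java.util.regex.Pattern;

            import org.apache.lucene.analysis.TokenFilter;
            import org.apache.lucene.analysis.TokenStream;
            import org.apache.lucene.analysis.tokenattributes.TermAttribute;
            import org.apache.lucene.analysis.tokenattributes.TypeAttribute;

            /** Hypothetical filter: keeps only tokens that look like named entities
             *  (here, simply capitalized words) and marks them with a custom type. */
            public final class EntityDetect extends TokenFilter {
              private static final Pattern ENTITY_PATTERN = Pattern.compile("\\p{Lu}\\w*");

              private final TermAttribute termAtt = addAttribute(TermAttribute.class);
              private final TypeAttribute typeAtt = addAttribute(TypeAttribute.class);

              public EntityDetect(TokenStream input) {
                super(input);
              }

              @Override
              public boolean incrementToken() throws IOException {
                // pull tokens from the sink (or any other input) until one matches
                while (input.incrementToken()) {
                  if (ENTITY_PATTERN.matcher(termAtt.term()).matches()) {
                    typeAtt.setType("ENTITY");
                    return true;
                  }
                }
                return false;
              }
            }

A URLDetect filter would follow the same pattern with a different match and type name.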

The TeeSinkTokenFilter.AnonymousClassSinkFilter type exposes the following members.

Methods

Accept (public method)
    Overrides TeeSinkTokenFilter.SinkFilter.Accept(AttributeSource). See the SinkFilter sketch after this list.

Equals (public method)
    Determines whether the specified Object is equal to the current Object. (Inherited from Object.)

Finalize (protected method)
    Allows an Object to attempt to free resources and perform other cleanup operations before the Object is reclaimed by garbage collection. (Inherited from Object.)

GetHashCode (public method)
    Serves as a hash function for a particular type. (Inherited from Object.)

GetType (public method)
    Gets the Type of the current instance. (Inherited from Object.)

MemberwiseClone (protected method)
    Creates a shallow copy of the current Object. (Inherited from Object.)

Reset (public method)
    Called by SinkTokenStream.Reset(). This method does nothing by default and can optionally be overridden. (Inherited from TeeSinkTokenFilter.SinkFilter.)

ToString (public method)
    Returns a String that represents the current Object. (Inherited from Object.)
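
The Accept method listed above is the single decision point of a SinkFilter: it is called for each token the tee consumes, and only tokens for which it returns true are cached in the sink. As a minimal sketch, assuming the Java TeeSinkTokenFilter.SinkFilter API, a custom filter that forwards only tokens tagged with the hypothetical "ENTITY" type (see the EntityDetect sketch above) could look like this; the class name EntityOnlySinkFilter is an illustrative assumption.

            import org.apache.lucene.analysis.TeeSinkTokenFilter;
            import org.apache.lucene.analysis.tokenattributes.TypeAttribute;
            import org.apache.lucene.util.AttributeSource;

            /** Hypothetical SinkFilter: lets only tokens typed "ENTITY" through to the sink. */
            final class EntityOnlySinkFilter extends TeeSinkTokenFilter.SinkFilter {
              @Override
              public boolean accept(AttributeSource source) {
                // accept() is called once per token consumed by the tee
                if (!source.hasAttribute(TypeAttribute.class)) {
                  return false;
                }
                return "ENTITY".equals(source.getAttribute(TypeAttribute.class).type());
              }
              // reset() is inherited; the default implementation does nothing, which is
              // fine here because this filter keeps no state between tokens.
            }

A sink created with source1.newSinkTokenStream(new EntityOnlySinkFilter()) would then cache only the tokens that pass this test.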

See Also