    run-trec-eval

    Name

    benchmark-run-trec-eval - Runs a TREC evaluation.

    Synopsis

    lucene benchmark run-trec-eval <INPUT_TOPICS_FILE> <INPUT_QUERY_RELEVANCE_FILE> <OUTPUT_SUBMISSION_FILE> <INDEX_DIRECTORY> [-t|--query-on-title] [-d|--query-on-description] [-n|--query-on-narrative] [?|-h|--help]
    

    Arguments

    INPUT_TOPICS_FILE

    Input file containing queries.
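    Topics are typically supplied in the standard TREC topics format, where each topic carries a number plus title, description, and narrative fields. A minimal illustrative sketch (the topic number and text here are hypothetical, not taken from an actual TREC collection):

        <top>
        <num> Number: 301
        <title> International Organized Crime
        <desc> Description:
        Identify documents about organizations involved in international criminal activity.
        <narr> Narrative:
        A relevant document names such an organization and describes its activity.
        </top>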

    INPUT_QUERY_RELEVANCE_FILE

    Input file containing relevance judgements.
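    Relevance judgements are typically in the standard TREC qrels format: one judgement per line giving the topic number, an iteration field (usually 0), the document ID, and a relevance grade. An illustrative line (hypothetical document ID):

        301 0 FBIS3-10082 1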

    OUTPUT_SUBMISSION_FILE

    Output submission file for TREC evaluation.
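    The submission file follows the standard TREC run format: topic number, the literal Q0, document ID, rank, score, and a run tag, one retrieved document per line. An illustrative line (hypothetical values):

        301 Q0 FBIS3-10082 1 13.74 lucene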

    INDEX_DIRECTORY

    The index directory.

    Options

    ?|-h|--help

    Prints out a short help for the command.

    -t|--query-on-title

    Use title field in query. This flag is automatically enabled if no other field is specified.

    -d|--query-on-description

    Use description field in query.

    -n|--query-on-narrative

    Use narrative field in query.

    Example

    Runs a TREC evaluation using the c:\topics.txt queries file and the c:\queries.txt relevance judgements file against the c:\lucene-index index, querying on the description and narrative fields, and writes the submission output to c:\output.txt.

    lucene benchmark run-trec-eval c:\topics.txt c:\queries.txt c:\output.txt c:\lucene-index -d -n
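    The resulting submission file is suitable as input to the standalone trec_eval utility; assuming trec_eval is installed, the run could then be scored against the same relevance judgements along these lines:

        trec_eval c:\queries.txt c:\output.txt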
