
Feature Request: Support for Chunk Ranking in File Search #121

@aledc7

Description

I'm requesting support for chunk ranking in the file search tool when using openai-php/laravel. Currently, the file search returns all results it deems relevant, but this can lead to lower-quality responses if the model uses content with low relevance. It would be useful to adjust this behavior by enabling chunk ranking configuration in the file_search tool to ensure only highly relevant chunks are used.

The expected functionality would allow:

Inspecting file search chunks: Using parameters like include to retrieve the specific file chunks used during a response generation run.

Configurable chunk ranking: Adjusting settings like:

ranker: Which ranker to use, e.g., auto or default_2024_08_21.
score_threshold: A value between 0.0 and 1.0 that filters out file chunks whose relevance score falls below it, improving the quality of responses.

For example, in the OpenAI API, you can inspect the file chunks used during a run as follows:

from openai import OpenAI

client = OpenAI()

# Retrieve a run step, asking for the file chunks the file_search
# tool call actually used (via the `include` parameter).
run_step = client.beta.threads.runs.steps.retrieve(
    thread_id="thread_abc123",
    run_id="run_abc123",
    step_id="step_abc123",
    include=["step_details.tool_calls[*].file_search.results[*].content"],
)
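The ranking settings described above correspond to the `ranking_options` object the OpenAI Assistants API accepts on the file_search tool definition. As a rough sketch of what the requested configuration would look like (the 0.7 threshold and the `gpt-4o` model name are illustrative, not part of this issue):

```python
# Sketch: file_search tool payload with chunk-ranking options, as accepted
# by the OpenAI Assistants API. This is the configuration this issue asks
# openai-php/laravel to expose.
file_search_tool = {
    "type": "file_search",
    "file_search": {
        "ranking_options": {
            # "auto" lets OpenAI choose; "default_2024_08_21" pins a ranker
            "ranker": "default_2024_08_21",
            # chunks scoring below this relevance value are discarded
            "score_threshold": 0.7,
        }
    },
}

# This payload would then be passed when creating an assistant, e.g.:
# client.beta.assistants.create(model="gpt-4o", tools=[file_search_tool])
```

With a threshold like 0.7, only chunks the ranker scores as highly relevant reach the model, which is the precision improvement the request is after.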

This feature would significantly enhance the precision of responses generated from file searches. It would be great if this could be incorporated into future releases.

Thank you!
