Definition
The answer relevancy test measures how relevant the answer (output) is to the question (input). This metric is based on the Ragas response relevancy metric.
Taxonomy
- Task types: LLM.
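The underlying Ragas response relevancy metric works by generating artificial questions from the answer and comparing their embeddings to the embedding of the original question. A minimal sketch of that scoring step, using toy hand-written vectors in place of a real embedding model (the question-generation LLM and embedding model are assumptions, not shown here):

```python
from math import sqrt


def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sqrt(sum(x * x for x in a))
    nb = sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0


def response_relevancy(question_vec, generated_question_vecs):
    # Mean cosine similarity between the original question's embedding
    # and the embeddings of questions generated back from the answer.
    # A relevant answer yields generated questions close to the original.
    sims = [cosine(question_vec, g) for g in generated_question_vecs]
    return sum(sims) / len(sims)


# Toy 3-dimensional "embeddings" for illustration only.
original_question = [1.0, 0.0, 0.0]
generated_questions = [
    [1.0, 0.0, 0.0],  # generated question very close to the original
    [0.8, 0.6, 0.0],  # generated question partially related
]
score = response_relevancy(original_question, generated_questions)
```

In practice the score lands in roughly the 0-1 range, with higher values meaning the answer stays closer to what was asked.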
Why it matters
- Answer relevancy ensures that your LLM generates responses that are directly related to the input question or prompt.
- This metric helps identify when your model is providing off-topic or tangential responses that don’t address the user’s actual query.
- It’s particularly important for chatbots, Q&A systems, and any application where staying on-topic is crucial for user experience.
Required columns
To compute this metric, your dataset must contain the following columns:
- Input: The question or prompt given to the LLM
- Outputs: The generated answer/response from your LLM
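A dataset satisfying these requirements could look like the sketch below; the lowercase column keys are an assumption for illustration, so check your project's actual column naming:

```json
[
  {
    "input": "What is the capital of France?",
    "output": "The capital of France is Paris."
  },
  {
    "input": "How do I reset my password?",
    "output": "Click 'Forgot password' on the login page and follow the emailed link."
  }
]
```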
Test configuration examples
If you are writing a tests.json, here are a few valid configurations for the answer relevancy test:
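The original configuration examples did not survive extraction; the fragment below is a hypothetical sketch of what one might look like, with every field name (`name`, `type`, `metric`, `threshold`) and value an assumption rather than the confirmed tests.json schema:

```json
{
  "name": "Answers stay relevant to the question",
  "type": "performance",
  "metric": "answerRelevancy",
  "threshold": 0.7
}
```

A threshold around 0.7 is only a placeholder; choose a value appropriate to your application's tolerance for off-topic answers.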
Related
- Ragas integration - Learn more about Ragas metrics.
- Answer correctness test - Measure factual accuracy of answers.
- Aggregate metrics - Overview of all available metrics.

