This document describes the data schema used for running RAG (retrieval-augmented generation) experiments to evaluate system performance.
A Datasource describes a collection of Documents that can be used in a Test Run.
Fields: id, name, description
A Document is a reference to the text content that will be used for generating Questions and performing the search.
Fields: id, datasource_id, name, location
A QASet is a reference to the location of a set of Questions and Correct Answers for a corresponding Document.
Fields: id, datasource_id, document_id, name, location
A QA is a Question and its corresponding Correct Answer, based on information in a corresponding Document.
Fields: id, qaset_id, document_id, question, answer
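The document-side entities above can be sketched as Python dataclasses. This is a minimal sketch, not the project's actual implementation: the field names come from the schema, but the types (string ids, locations as file paths or URIs) are assumptions.

```python
from dataclasses import dataclass

@dataclass
class Datasource:
    id: str
    name: str
    description: str

@dataclass
class Document:
    id: str
    datasource_id: str  # foreign key -> Datasource.id
    name: str
    location: str       # assumed to be a file path or URI to the text content

@dataclass
class QASet:
    id: str
    datasource_id: str  # foreign key -> Datasource.id
    document_id: str    # foreign key -> Document.id
    name: str
    location: str       # assumed location of the question/answer data

@dataclass
class QA:
    id: str
    qaset_id: str       # foreign key -> QASet.id
    document_id: str    # foreign key -> Document.id
    question: str
    answer: str         # the Correct Answer for this Question
```

The foreign-key comments mirror the `*_id` naming convention in the schema; nothing here enforces referential integrity, which a real store (e.g. a relational database) would handle.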
A Test Run is a single test that was run on a set of Documents in a Datasource.
Fields: id, datasource_id, description
A Response is your RAG system's generated answer to a corresponding Question.
Fields: id, test_run_id, question_id, response
A Context is a portion of a Document that was returned with the Response by your RAG system.
Fields: id, response_id, text, similarity_score
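The run-time entities (Test Run, Response, Context) can be sketched the same way. Again the types are assumptions; in particular, similarity_score is assumed to be a float produced by the RAG system's retriever.

```python
from dataclasses import dataclass

@dataclass
class TestRun:
    id: str
    datasource_id: str  # foreign key -> Datasource.id
    description: str

@dataclass
class Response:
    id: str
    test_run_id: str    # foreign key -> TestRun.id
    question_id: str    # foreign key -> QA.id
    response: str       # the generated answer text

@dataclass
class Context:
    id: str
    response_id: str         # foreign key -> Response.id
    text: str                # the retrieved portion of a Document
    similarity_score: float  # retrieval score reported by the RAG system
```

A single Response typically has several Context rows attached, one per retrieved passage, all sharing the same response_id.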
An Eval Function is a reference to a function that will be executed to evaluate the Response and Context to generate a score.
Fields: id, name, description, function_index
A Test Eval Config is a set of Eval Functions that will be run for a particular Test Run.
Fields: id, test_run_id, eval_function_id
A Response Eval is the score returned by an Eval Function for a given Response and Context.
Fields: id, test_run_id, question_id, response_id, test_eval_config_id, eval_score
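Putting the evaluation entities together, here is a sketch of how an Eval Function's function_index might resolve to an executable scorer. The registry, the scorer signature, and the substring-based example scorer are all hypothetical; the schema only says that function_index references a function to execute.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class EvalFunction:
    id: str
    name: str
    description: str
    function_index: int  # assumed: index into a registry of eval callables

@dataclass
class ResponseEval:
    id: str
    test_run_id: str          # foreign key -> TestRun.id
    question_id: str          # foreign key -> QA.id
    response_id: str          # foreign key -> Response.id
    test_eval_config_id: str  # foreign key -> TestEvalConfig.id
    eval_score: float

# Hypothetical registry: each entry takes (response_text, context_texts)
# and returns a score. Index 0 is a toy scorer that checks whether the
# response text appears verbatim in any retrieved context.
EVAL_REGISTRY: dict[int, Callable[[str, list[str]], float]] = {
    0: lambda response, contexts: 1.0 if any(response in c for c in contexts) else 0.0,
}

def score_response(fn: EvalFunction, response_text: str, context_texts: list[str]) -> float:
    """Resolve function_index to a scorer and evaluate one Response against its Contexts."""
    return EVAL_REGISTRY[fn.function_index](response_text, context_texts)
```

In a full pipeline, one score_response call per (Eval Function, Response) pair in the Test Eval Config would produce the Response Eval rows, with eval_score set to the returned value.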