We want to integrate our Arango graph with an LLM.
As input, the new model will take a graph/subgraph, process it, save it in a vector DB, and build a RAG pipeline on top of it.
CONFIG
For that, we define all the attributes in a config file (YAML or a Python dictionary) with the following variables (preliminary):
```yaml
model_name_path: "llama-v2-b-chat-hf"
generation_params:
  temperature: 0.1
  ...
graph_db:
  username:
  password:
  url: ...
vector_db:
  type: chroma_db
  username:
  password:
  url: ...
RAG_params:
  # <params related to integrating the vector DB with the LLM>
```
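As a minimal sketch of how such a config could be consumed (the file name `config.yaml`, the `load_config` helper, and the use of PyYAML are assumptions for illustration, not part of this proposal):

```python
# Sketch only: loads a hypothetical config.yaml with the sections listed above.
import yaml  # PyYAML, assumed as the parser; a plain Python dict works the same way


def load_config(path: str = "config.yaml") -> dict:
    """Read the pipeline configuration from a YAML file into a dict."""
    with open(path, "r") as fh:
        config = yaml.safe_load(fh)
    # Basic sanity check on the top-level sections sketched in this issue.
    for section in ("model_name_path", "generation_params", "graph_db", "vector_db", "RAG_params"):
        if section not in config:
            raise KeyError(f"missing config section: {section}")
    return config


if __name__ == "__main__":
    cfg = load_config()
    print(cfg["vector_db"]["type"])  # e.g. "chroma_db"
```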
CagRag Model
The CagRag model takes the config file and the subgraph (as raw AQL output, or in a postprocessed format) and outputs the RAG model.
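To make the intended interface concrete, here is a rough sketch under several assumptions: the class name `CagRag`, its method names, and the use of python-arango and chromadb as drivers are placeholders for illustration, not a decided design, and the generation side (`model_name_path`, `RAG_params`) is left out.

```python
# Rough sketch of a possible CagRag interface; all names are illustrative.
from arango import ArangoClient  # python-arango driver (assumed)
import chromadb                  # assumed backend for vector_db.type == "chroma_db"


class CagRag:
    """Builds a retrieval index from an ArangoDB subgraph, driven by the config dict."""

    def __init__(self, config: dict):
        self.config = config
        gdb = config["graph_db"]
        # Connects to the default "_system" database; a real implementation would
        # presumably take the target database name from the config as well.
        self.db = ArangoClient(hosts=gdb["url"]).db(
            username=gdb["username"], password=gdb["password"]
        )
        # In-process Chroma client; a remote vector DB would use chromadb.HttpClient.
        self.collection = chromadb.Client().get_or_create_collection("subgraph_docs")

    def ingest_subgraph(self, aql_query: str) -> None:
        """Run an AQL query, verbalise each result, and store it in the vector DB."""
        docs, ids = [], []
        for i, doc in enumerate(self.db.aql.execute(aql_query)):
            docs.append(str(doc))  # naive verbalisation of a graph document
            ids.append(doc["_id"] if isinstance(doc, dict) and "_id" in doc else str(i))
        self.collection.add(ids=ids, documents=docs)

    def retrieve(self, question: str, k: int = 4) -> list[str]:
        """Return the k most similar subgraph snippets for a question (RAG retrieval step)."""
        result = self.collection.query(query_texts=[question], n_results=k)
        return result["documents"][0]
```

Usage would then roughly be `rag = CagRag(load_config())`, followed by `rag.ingest_subgraph("<AQL query>")` and `rag.retrieve(question)`, with the retrieved context passed to the LLM named in `model_name_path`.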
Questions to answer