The KBase Research Assistant is powered by multiple Language Model (LLM)-based Agents designed to interact conversationally with users. The Assistant serves as a guide, helping users navigate the KBase platform.
Uses Poetry to handle requirements
- Set up a conda environment
- Run `poetry install`
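For example, a minimal setup sketch (the environment name and Python version below are placeholders, not project requirements):

```sh
# Sketch only: environment name and Python version are placeholders.
conda create -n kbase-chat-assistant python=3.11
conda activate kbase-chat-assistant

# Install Poetry into the environment, then install the project dependencies.
pip install poetry
poetry install
```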
TODO. For now, see the SpinUp docs and tutorials. You'll be making a pretty simple deployment: a single node, with a single service and a single ingress, to get this up and running.
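As a rough sketch of that topology only (not actual SpinUp instructions), the equivalent Kubernetes objects would look something like the commands below. The hostname and ports are placeholders, and on SPIN these objects are normally created through the Rancher UI following the SpinUp docs:

```sh
# Hypothetical sketch of the single-deployment / single-service / single-ingress
# layout described above; hostname and ports are placeholders.
kubectl -n knowledge-engine create deployment chat-assistant \
  --image=registry.nersc.gov/kbase/kbase-chat-assistant
kubectl -n knowledge-engine expose deployment chat-assistant \
  --port=80 --target-port=8000   # assumed container port
kubectl -n knowledge-engine create ingress chat-assistant \
  --rule="chat-assistant.example.nersc.gov/*=chat-assistant:80"
```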
- Build the new docker image:
  `docker build --platform="linux/amd64" -t registry.nersc.gov/kbase/kbase-chat-assistant .`
- Push to the registry. You might need to log in first with your NERSC id and password + 2FA (see the sketch after this list):
  `docker push registry.nersc.gov/kbase/kbase-chat-assistant`
- Log in to https://rancher2.spin.nersc.gov
- Click on the `Development` server
- On the left side, find "Workloads". Click that, then "Deployments" underneath.
- Find the chat assistant (namespace: `knowledge-engine`, container: `chat-assistant`)
- On the far right, hit the 3-dots menu, then `Redeploy`
- It can take a while for the container to get the memo that it should shut down, so just let it go.
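Putting the build-and-push steps together in one place (the `docker login` line is the standard way to authenticate to the registry with your NERSC id and password + 2FA, and is only needed if the push is rejected):

```sh
# Build for amd64 and push to the NERSC registry.
docker build --platform="linux/amd64" -t registry.nersc.gov/kbase/kbase-chat-assistant .

# Log in first if the push is rejected (NERSC id, password + 2FA token).
docker login registry.nersc.gov
docker push registry.nersc.gov/kbase/kbase-chat-assistant
```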
TODO:
- Add the Knowledge Graph
- User interface