LLM agent system for HCI research question co-creation, brainstorming and ideation

CoQuest: Agent LLM for Research Question Co-Creation

This is the code for the system introduced in the SIGCHI 2024 paper "CoQuest: Exploring Research Question Co-Creation with an LLM-based Agent".

About The Project

Product Screen Shot

We propose a novel system called CoQuest, in which an AI agent initiates research question (RQ) generation by leveraging the power of LLMs and incorporating human feedback into a co-creation process.

Major features of the CoQuest system:

  • RQ Flow Editor that facilitates a user’s major interactions, such as generating RQs, providing input and feedback to AI, and editing the RQ flow (e.g., drag and delete).
  • Paper Graph Visualizer that displays the literature space related to each RQ.
  • AI Thoughts panel that explains the AI's rationale for generating each RQ.

System Framework

(back to top)

Demo Video

coquest-demo-github.mp4

Built With

We gratefully acknowledge the following projects and libraries, which made this prototype possible.

(back to top)

Getting Started

Follow these steps to get a local copy of the project up and running.

Prerequisites

Recommended: Install Docker (https://docs.docker.com/get-docker/) and docker-compose.

Alternatively, to install from source, you will need Node.js and Python >= 3.10.

Installation

Adding API configs

This step is required whether you run from Docker or from source.

  1. First create a config file backend/.env based on your API choice and info. Refer to the backend/.env.example for an example.

  2. If you are using the Azure OpenAI API, create another Azure config file at backend/azure.yaml. Refer to the backend/azure.yaml.example for an example.
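For orientation, a backend/.env might look like the sketch below. The variable names here are illustrative assumptions, not the project's actual keys — the authoritative list is in backend/.env.example.

```shell
# Hypothetical backend/.env sketch — consult backend/.env.example for the real variable names.
OPENAI_API_KEY=sk-...        # your OpenAI (or Azure OpenAI) credential
OPENAI_API_TYPE=openai       # e.g. "azure" if using the Azure OpenAI API
```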

From docker (Recommended)

  1. To deploy locally, build and run the Docker containers with docker-compose (note that this runs a DEV server):

```shell
docker-compose up
```

From source

  1. Clone the repo:

```shell
git clone https://github.com/yiren-liu/coquest.git
```

  2. Install and run the frontend server:

```shell
cd frontend/rq-flow
npm install
npm start
```

  3. Install the backend Python server requirements:

```shell
cd backend/
pip install -r requirements.txt
python -m spacy download en_core_web_sm
```

  4. Run the backend server (dev mode):

```shell
python main.py
```

The backend DB used for logging can be changed to any self-hosted Postgres DB by modifying the .env configs.
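As a sketch, pointing the logger at a self-hosted Postgres instance might involve settings like the ones below; the variable names are hypothetical assumptions — check backend/.env.example for the ones the code actually reads.

```shell
# Hypothetical Postgres logging settings — real variable names are in backend/.env.example.
POSTGRES_HOST=localhost
POSTGRES_PORT=5432
POSTGRES_USER=coquest
POSTGRES_PASSWORD=change-me
POSTGRES_DB=coquest_logs
```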

(back to top)

Usage

After deploying the service locally, visit: http://localhost:3000/app

Replacing the paper pool with your own

The paper pool used in the search function is vectorized and stored with ChromaDB under backend/paper_graph/db. The embedding model we currently use is OpenAI Ada 2, but you can substitute any other model if needed. To swap in your own paper pool, modify and run backend/paper_graph/get_embeddings.py.
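Under the hood, this kind of paper search reduces to nearest-neighbor lookup over the stored embedding vectors (ChromaDB handles this in the real system). The minimal Python sketch below illustrates the ranking idea with cosine similarity; the function names and toy vectors are ours, not from the codebase.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def top_k(query_vec, paper_vecs, k=2):
    """Return indices of the k stored papers most similar to the query embedding."""
    ranked = sorted(range(len(paper_vecs)),
                    key=lambda i: cosine_similarity(query_vec, paper_vecs[i]),
                    reverse=True)
    return ranked[:k]

# Toy 2-D "embeddings": papers 0 and 2 point in roughly the query's direction.
papers = [[1.0, 0.0], [0.0, 1.0], [0.9, 0.1]]
print(top_k([1.0, 0.0], papers, k=2))  # → [0, 2]
```

Swapping the paper pool only changes which vectors are stored; the retrieval logic stays the same, which is why get_embeddings.py is the only script you need to rerun.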

(back to top)

Contact

Yiren Liu - @yirenl2 - [email protected]

SALT Lab @ UIUC: https://socialcomputing.web.illinois.edu/index.html

(back to top)
