Technical Interviewer Chatbot

You can use the chatbot here: Technical Interview Chatbot!

This project is a Technical Interview Chatbot designed to simulate a technical interview environment for individuals in the fields of Computer/Data Science and related areas. By using this chatbot, you can practice both behavioral and technical questions in a stress-free setting. The chatbot leverages the Groq API and the llama-3.1-70b-versatile LLM for language model processing, LangChain for managing prompts and conversation history, and Streamlit for the frontend interface.


A preview of the Streamlit user interface:

[Screenshot: the chatbot running in the Streamlit interface]

Features

  • Groq API: Utilizes Groq's advanced language models for generating interview questions, processing user responses, and providing feedback.
  • LangChain:
    • Manages the conversation flow and integrates memory using the langchain-community and langchain-core packages.
    • Ensures that the chatbot retains context across multiple turns of the conversation.
  • Streamlit Frontend: Provides an interactive web-based interface for users to engage with the chatbot (a sketch of how these pieces fit together follows this list).
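
A minimal sketch of how these pieces fit together, assuming the langchain-groq integration package and a secret named GROQ_API_KEY (variable names and prompt text are illustrative; the real implementation lives in streamlitChatWithMemory.py):

    import streamlit as st
    from langchain_groq import ChatGroq
    from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
    from langchain_core.messages import HumanMessage, AIMessage

    # Groq-hosted LLM; the API key is read from Streamlit secrets (see the setup below)
    llm = ChatGroq(model="llama-3.1-70b-versatile", api_key=st.secrets["GROQ_API_KEY"])

    # Prompt with a placeholder for the running conversation history
    prompt = ChatPromptTemplate.from_messages([
        ("system", "You are a technical interviewer for Computer/Data Science roles."),
        MessagesPlaceholder(variable_name="history"),
        ("human", "{input}"),
    ])
    chain = prompt | llm

    # st.session_state keeps the history across Streamlit reruns -- this is the "memory"
    if "history" not in st.session_state:
        st.session_state.history = []

    if user_input := st.chat_input("Your answer..."):
        reply = chain.invoke({"history": st.session_state.history, "input": user_input})
        st.session_state.history += [HumanMessage(user_input), AIMessage(reply.content)]

    # Render the conversation so far
    for msg in st.session_state.history:
        role = "user" if isinstance(msg, HumanMessage) else "assistant"
        st.chat_message(role).write(msg.content)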

Prerequisites

  • Basic knowledge of Python programming.
  • LangChain is used to manage conversation flow and memory. See the Build a Chatbot guide on the LangChain website to learn more.
  • A Groq API key (or a key for your preferred provider) to use the AI model (this is stored securely using Streamlit secrets).
  • Familiarity with a few basic Streamlit commands.

Recommended Setup

  1. Virtual Environment: Create a virtual environment to manage dependencies:

    python -m venv chatbot_env
    source chatbot_env/bin/activate  # On Windows: chatbot_env\Scripts\activate
    pip install -r requirements.txt
  2. See requirements.txt to set up the necessary packages.

  3. Groq API key: visit GroqCloud to create a free API key and use it in your project.

  4. Deployment: Deploy the app using Streamlit Cloud for easy access.

    • When creating a new project on Streamlit Cloud, you will see an option to simply connect a GitHub repository and choose the main file. Streamlit will handle the rest!

    • Before clicking "Deploy", make sure to visit the advanced settings:

      [Screenshot: Streamlit Cloud advanced settings]
    • Here you will need to enter your API key in the format shown below. "Secrets" is Streamlit's way of securely attaching your API key to your project (see the example after this list).

      [Screenshot: the Secrets box in advanced settings]
    • If Streamlit still can't access the API key after deployment, open the advanced settings again on the running site and double-check the "Secrets" to ensure your API key is present and correctly formatted.
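
The Secrets box expects TOML, one key per line. A minimal example, assuming the app reads st.secrets["GROQ_API_KEY"] (match the key name to whatever streamlitChatWithMemory.py actually uses; the value is a placeholder):

    # Contents of the "Secrets" box on Streamlit Cloud
    # (for local runs, the same line goes in .streamlit/secrets.toml)
    GROQ_API_KEY = "your-groq-api-key-here"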


Code Explanation

See the programming guide: programmingGuide.md for a deeper understanding of streamlitChatWithMemory.py.


Contributors

  • Collin Graff

    • Role: Lead Developer
    • Contributions: Concept design, coding, and deployment.
  • David Robert

    • Role: Assistant Developer
    • Contributions: Code improvements, testing, and troubleshooting.
