🐾 Tabby

📚 Docs • 💬 Slack • 🗺️ Roadmap


Tabby is a self-hosted AI coding assistant, offering an open-source and on-premises alternative to GitHub Copilot. It boasts several key features:

  • Self-contained, with no need for a DBMS or cloud service.
  • OpenAPI interface, easy to integrate with existing infrastructure (e.g., Cloud IDE).
  • Supports consumer-grade GPUs.
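As a sketch of that OpenAPI surface, a code completion request against a locally running server might look like the following. The `/v1/completions` path and the JSON payload shape are assumptions based on typical Tabby deployments; verify them against your server's own API docs (e.g., its Swagger UI) before relying on them:

```shell
# Hypothetical completion request against a local Tabby server on the
# default port 8080. Endpoint path and payload shape are assumptions;
# check them against the server's API documentation.
curl -X POST http://localhost:8080/v1/completions \
  -H 'Content-Type: application/json' \
  -d '{
    "language": "python",
    "segments": {
      "prefix": "def fib(n):\n    ",
      "suffix": "\n"
    }
  }'
```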

Open Live Demo


🔥 What's New

  • 12/06/2024 Llamafile deployment integration and an enhanced Answer Engine user experience are coming in Tabby v0.21.0! 🚀
  • 11/10/2024 Switching between different backend chat models is supported in Answer Engine with Tabby v0.20.0!
  • 10/30/2024 Tabby v0.19.0 features recent shared threads on the main page to improve their discoverability.
Archived
  • 07/09/2024 🎉 Announcing Codestral integration in Tabby!
  • 07/05/2024 Tabby v0.13.0 introduces Answer Engine, a central knowledge engine for internal engineering teams. It seamlessly integrates with a dev team's internal data, delivering reliable and precise answers to empower developers.
  • 06/13/2024 VSCode 1.7 marks a significant milestone with a versatile Chat experience throughout your coding workflow. Come and try the latest chat in the side panel and editing via chat commands!
  • 06/10/2024 Latest 📃 blog post on enhanced code context understanding in Tabby!
  • 06/06/2024 The Tabby v0.12.0 release brings 🔗 seamless integrations (GitLab SSO, self-hosted GitHub/GitLab, etc.), ⚙️ flexible configurations (HTTP API integration), and 🌐 expanded capabilities (repo context in Code Browser)!
  • 05/22/2024 Tabby VSCode 1.6 comes with multiple choices in inline completion and auto-generated commit messages 🐱💻!
  • 05/11/2024 v0.11.0 brings significant enterprise upgrades, including 📊 storage usage stats, 🔗 GitHub & GitLab integration, a 📋 Activities page, and the long-awaited 🤖 Ask Tabby feature!
  • 04/22/2024 v0.10.0 released, featuring the latest Reports tab with team-wise analytics for Tabby usage.
  • 04/19/2024 📣 Tabby now incorporates locally relevant snippets (declarations from the local LSP and recently modified code) for code completion!
  • 04/17/2024 The CodeGemma and CodeQwen model series have now been added to the official registry!
  • 03/20/2024 v0.9 released, highlighting a full-featured admin UI.
  • 12/23/2023 Seamlessly deploy Tabby on any cloud with SkyServe 🛫 from SkyPilot.
  • 12/15/2023 v0.7.0 released with team management and secured access!
  • 11/27/2023 v0.6.0 released!
  • 11/09/2023 v0.5.5 released, with a UI redesign and performance improvements!
  • 10/24/2023 ⛳️ Major updates for Tabby IDE plugins across VSCode/Vim/IntelliJ!
  • 10/15/2023 RAG-based code completion is enabled by default in v0.3.0 🎉! Check out the blog post explaining how Tabby utilizes repo-level context to get even smarter!
  • 10/04/2023 Check out the model directory for the latest models supported by Tabby.
  • 09/18/2023 Apple's M1/M2 Metal inference support has landed in v0.1.1!
  • 08/31/2023 Tabby's first stable release v0.0.1 🥳.
  • 08/28/2023 Experimental support for CodeLlama 7B.
  • 08/24/2023 Tabby is now on JetBrains Marketplace!

👋 Getting Started

You can find our documentation here.

Run Tabby in 1 Minute

The easiest way to start a Tabby server is by using the following Docker command:

docker run -it \
  --gpus all -p 8080:8080 -v $HOME/.tabby:/data \
  tabbyml/tabby \
  serve --model StarCoder-1B --device cuda --chat-model Qwen2-1.5B-Instruct

For additional options (e.g., inference type, parallelism), please refer to the documentation page.
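If no GPU is available, the same image can in principle be run on CPU by dropping the --gpus flag and switching the device. This is a sketch, assuming the serve command accepts `--device cpu` (check the documentation page for the exact flag set); expect completions to be much slower than on a GPU:

```shell
# CPU-only variant of the quick-start command.
# Assumes --device cpu is supported by the serve subcommand.
docker run -it \
  -p 8080:8080 -v $HOME/.tabby:/data \
  tabbyml/tabby \
  serve --model StarCoder-1B --device cpu --chat-model Qwen2-1.5B-Instruct
```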

🤝 Contributing

Full guide at CONTRIBUTING.md.

Get the Code

git clone --recurse-submodules https://github.com/TabbyML/tabby
cd tabby

If you have already cloned the repository, you can run git submodule update --recursive --init to fetch all submodules.

Build

  1. Set up the Rust environment by following this tutorial.

  2. Install the required dependencies:

# For macOS
brew install protobuf

# For Ubuntu / Debian
apt install protobuf-compiler libopenblas-dev

  3. Install useful tools:

# For Ubuntu
apt install make sqlite3 graphviz

  4. Now, you can build Tabby by running the command cargo build.
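Once cargo build succeeds, the resulting binary can be launched directly. A sketch, assuming the debug binary lands at Cargo's default target path and accepts the same serve flags as the Docker image:

```shell
# Run the freshly built debug binary (path follows Cargo's default
# target layout; flags mirror the Docker quick-start command).
./target/debug/tabby serve --model StarCoder-1B --device cuda
```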

Start Hacking!

... and don't forget to submit a Pull Request

🌍 Community

  • 🎤 Twitter / X - engage with TabbyML for all things possible
  • 📚 LinkedIn - follow for the latest from the community
  • 💌 Newsletter - subscribe to unlock Tabby insights and secrets

🔆 Activity

Git Repository Activity

🌟 Star History

Star History Chart