diff --git a/README.md b/README.md
index ad5d3870cb14..69fa16622850 100644
--- a/README.md
+++ b/README.md
@@ -28,48 +28,15 @@
 **LocalAI** is the free, Open Source OpenAI alternative. LocalAI acts as a drop-in replacement REST API that's compatible with OpenAI API specifications for local inferencing. It allows you to run LLMs, generate images, audio (and not only) locally or on-prem with consumer-grade hardware, supporting multiple model families. Does not require GPU.
 
-Follow LocalAI
-
-Follow LocalAI_API
-Join LocalAI Discord Community
-
-Connect with the Creator
-
-Follow mudler_it
-Follow on Github
-
-Share LocalAI Repository
-
-Follow _LocalAI
-Share on Telegram
-Share on Reddit
-Buy Me A Coffee
-
-## 💻 [Getting started](https://localai.io/basics/getting_started/index.html)
-
 ## 🔥🔥 Hot topics / Roadmap
 
 [Roadmap](https://github.com/mudler/LocalAI/issues?q=is%3Aissue+is%3Aopen+label%3Aroadmap)
 
-🆕 New! [LLM finetuning guide](https://localai.io/advanced/fine-tuning/)
+- 🐸 Coqui
+- Inline templates: https://github.com/mudler/LocalAI/pull/1452
+- Mixtral: https://github.com/mudler/LocalAI/pull/1449
+- Img2vid: https://github.com/mudler/LocalAI/pull/1442
+- Musicgen: https://github.com/mudler/LocalAI/pull/1387
 
 Hot topics (looking for contributors):
 - Backends v2: https://github.com/mudler/LocalAI/issues/1126
@@ -77,22 +44,15 @@ Hot topics (looking for contributors):
 
 If you want to help and contribute, issues up for grabs: https://github.com/mudler/LocalAI/issues?q=is%3Aissue+is%3Aopen+label%3A%22up+for+grabs%22
 
+
+Follow LocalAI_API
+Join LocalAI Discord Community
+
-
-In a nutshell:
-
-- Local, OpenAI drop-in alternative REST API. You own your data.
-- NO GPU required. NO Internet access is required either
-  - Optional, GPU Acceleration is available in `llama.cpp`-compatible LLMs. See also the [build section](https://localai.io/basics/build/index.html).
-- Supports multiple models
-- 🏃 Once loaded the first time, it keeps models loaded in memory for faster inference
-- ⚡ Doesn't shell-out, but uses C++ bindings for faster inference and better performance.
-
-LocalAI was created by [Ettore Di Giacinto](https://github.com/mudler/) and is a community-driven project, focused on making AI accessible to anyone. Any contribution, feedback and PR is welcome!
-
-Note that this started just as a [fun weekend project](https://localai.io/#backstory) in order to try to create the necessary pieces for a full AI assistant like `ChatGPT`: the community is growing fast and we are working hard to make it better and more stable. If you want to help, please consider contributing (see below)!
+## 💻 [Getting started](https://localai.io/basics/getting_started/index.html)
 
 ## 🚀 [Features](https://localai.io/features/)
 
diff --git a/docs/content/_index.en.md b/docs/content/_index.en.md
index de8d7496e358..a5d1c01db860 100644
--- a/docs/content/_index.en.md
+++ b/docs/content/_index.en.md
@@ -24,8 +24,6 @@ title = "LocalAI"
 **LocalAI** is the free, Open Source OpenAI alternative. LocalAI acts as a drop-in replacement REST API that's compatible with OpenAI API specifications for local inferencing. It allows you to run LLMs, generate images, audio (and not only) locally or on-prem with consumer-grade hardware, supporting multiple model families that are compatible with the ggml format. Does not require GPU. It is maintained by [mudler](https://github.com/mudler).
 
-Follow LocalAI
-
 Follow LocalAI_API
 
@@ -34,33 +32,6 @@ title = "LocalAI"
 Join LocalAI Discord Community
 
-Connect with the Creator
-
-Follow mudler_it
-Follow on Github
-
-Share LocalAI Repository
-
-Follow _LocalAI
-Share on Telegram
-Share on Reddit
-Buy Me A Coffee
-
 In a nutshell:
 
 - Local, OpenAI drop-in alternative REST API. You own your data.
@@ -70,9 +41,10 @@ In a nutshell:
 - 🏃 Once loaded the first time, it keeps models loaded in memory for faster inference
 - ⚡ Doesn't shell-out, but uses C++ bindings for faster inference and better performance.
 
-LocalAI was created by [Ettore Di Giacinto](https://github.com/mudler/) and is a community-driven project, focused on making AI accessible to anyone. Any contribution, feedback and PR is welcome!
+LocalAI is focused on making AI accessible to anyone. Any contribution, feedback and PR is welcome!
+
+Note that this started just as a fun weekend project by [mudler](https://github.com/mudler) in order to try to create the necessary pieces for a full AI assistant like `ChatGPT`: the community is growing fast and we are working hard to make it better and more stable. If you want to help, please consider contributing (see below)!
-Note that this started just as a [fun weekend project](https://localai.io/#backstory) in order to try to create the necessary pieces for a full AI assistant like `ChatGPT`: the community is growing fast and we are working hard to make it better and more stable. If you want to help, please consider contributing (see below)!
 
 ## 🚀 Features
 
@@ -86,19 +58,6 @@ Note that this started just as a fun weekend project by [mudler](https://github.com/mudler)
 - 🖼️ [Download Models directly from Huggingface](https://localai.io/models/)
 - 🆕 [Vision API](https://localai.io/features/gpt-vision/)
 
-## 🔥🔥 Hot topics / Roadmap
-
-[Roadmap](https://github.com/mudler/LocalAI/issues?q=is%3Aissue+is%3Aopen+label%3Aroadmap)
-
-🆕 New! [LLM finetuning guide](https://localai.io/advanced/fine-tuning/)
-
-Hot topics (looking for contributors):
-- Backends v2: https://github.com/mudler/LocalAI/issues/1126
-- Improving UX v2: https://github.com/mudler/LocalAI/issues/1373
-
-If you want to help and contribute, issues up for grabs: https://github.com/mudler/LocalAI/issues?q=is%3Aissue+is%3Aopen+label%3A%22up+for+grabs%22
-
 ## How does it work?
 
 LocalAI is an API written in Go that serves as an OpenAI shim, enabling software already developed with OpenAI SDKs to seamlessly integrate with LocalAI. It can be effortlessly dropped in as a substitute, even on consumer-grade hardware. This capability is achieved by employing various C++ backends, including [ggml](https://github.com/ggerganov/ggml), to perform inference on LLMs using both CPU and, if desired, GPU. Internally, LocalAI backends are just gRPC servers; indeed, you can specify and build your own gRPC server and extend LocalAI at runtime as well. It is possible to specify external gRPC servers and/or binaries that LocalAI will manage internally.
@@ -139,6 +98,8 @@ LocalAI couldn't have been built without the help of great software already available
 - https://github.com/rhasspy/piper
 - https://github.com/cmp-nct/ggllm.cpp
 
+
+
 ## Backstory
 
 As with most typical open source projects, I, [mudler](https://github.com/mudler/), was fiddling around with [llama.cpp](https://github.com/ggerganov/llama.cpp) over my long nights and wanted to have a way to call it from `go`, as I am a Golang developer and use it extensively. So I created `LocalAI` (or what was initially known as `llama-cli`) and added an API to it.
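Since the diff above repeatedly describes LocalAI as an OpenAI drop-in REST API, a small sketch of a client may help reviewers see what "drop-in" means in practice. This is a hedged, stdlib-only illustration, not code from the repository: the `/v1/chat/completions` path follows the OpenAI API specification, while the port (`8080`) and the model name are assumptions based on LocalAI's documented defaults and may differ in your setup.

```python
import json
from urllib import request

# Assumed default: LocalAI's getting-started docs expose the API on port 8080.
BASE_URL = "http://localhost:8080"


def build_payload(prompt: str, model: str = "ggml-gpt4all-j") -> dict:
    """Assemble an OpenAI-style chat completion request body.

    The model name is a placeholder; it must match a model file you have
    placed in LocalAI's models directory.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }


def chat(prompt: str, base_url: str = BASE_URL) -> dict:
    """POST the payload to the OpenAI-compatible chat completions endpoint."""
    req = request.Request(
        f"{base_url}/v1/chat/completions",
        data=json.dumps(build_payload(prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.load(resp)


if __name__ == "__main__":
    # Show the request body; calling chat() requires a running LocalAI server.
    print(json.dumps(build_payload("How are you?"), indent=2))
```

Because the request and response shapes match the OpenAI specification, existing OpenAI SDK clients can be pointed at a LocalAI server simply by overriding the base URL.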