![Hands-On Ollama](images/header.svg)

# 💻 handy-ollama 🦙
Learn to deploy Ollama with hands-on practice, making large language model deployment accessible to everyone!

## About
Hands-On Ollama: deploy large language models locally, and quickly manage and run them on your own machine, so that even a CPU-only setup can handle LLM deployment!

## Motivation
With the rapid progress of large language models, more and more open-source models have become available, but deploying many of them requires GPU resources. How can the benefits of the LLM era reach everyone, so that anyone can deploy a model of their own? Ollama is an open-source LLM serving tool that can deploy large models on a CPU alone. Through this open-source Hands-On Ollama tutorial, we hope to help learners get started with Ollama quickly, so that every LLM enthusiast, learner, and developer can deploy their own models locally, build applications on top of them, and bring LLMs to every industry!

## Target Audience
Anyone who wants to:
- run large models locally without being limited by GPU resources;
- run LLM inference efficiently on consumer-grade hardware;
- deploy large models locally and build LLM applications;
- manage large models locally and keep them safe and reliable.

## Highlights
This project focuses on deploying local large models on a CPU. Although many LLM tutorials already exist, most of them assume GPU resources, which is unfriendly to learners with limited hardware. Hands-On Ollama therefore helps learners get started quickly with local, CPU-only model deployment.
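As a minimal sketch of what this CPU-only workflow looks like once Ollama is installed and serving: the snippet below calls Ollama's `/api/generate` endpoint over plain HTTP. It is a hedged example, not part of this tutorial's code: it assumes the default server address `localhost:11434` and an already-pulled model name such as `llama3.2`.

```python
import json
import urllib.request

OLLAMA_HOST = "http://localhost:11434"  # Ollama's default listen address (assumption)

def build_generate_payload(model: str, prompt: str) -> dict:
    """Build the JSON body for Ollama's /api/generate endpoint."""
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model: str, prompt: str) -> str:
    """Send one non-streaming completion request to a local Ollama server."""
    data = json.dumps(build_generate_payload(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        f"{OLLAMA_HOST}/api/generate",
        data=data,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # The non-streaming response carries the full completion in "response"
        return json.loads(resp.read())["response"]

# Requires a running `ollama serve` and a pulled model, e.g.:
# print(generate("llama3.2", "Why is the sky blue?"))
```

Because `stream` is set to `False`, the server returns a single JSON object instead of a stream of partial chunks, which keeps the example to one request/response pair.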

## Roadmap
### Contents (continuously updated...)
- [x] 1 [Introduction to Ollama](docs/C1/1.%20Ollama介绍.md) @[友东](https://github.com/AXYZdong)
- [x] 2 Installing and Configuring Ollama
  - [x] [macOS](docs/C2/1.%20Ollama在macOS下的安装与配置.md) @[天奥](https://github.com/lta155)
  - [x] [Windows](docs/C2/2.%20Ollama在Windows下的安装与配置.md) @[Yuki](https://github.com/fuyueagain)
  - [x] [Linux](docs/C2/3.%20Ollama在Linux下的安装与配置.md) @[Yuki](https://github.com/fuyueagain)
  - [x] [Docker](docs/C2/4.%20Ollama在Docker下的安装与配置.md) @[Yuki](https://github.com/fuyueagain)
- [x] 3 [Importing Custom Models](docs/C3/1.%20自定义导入模型.md) @[杨卓](https://github.com/little1d)
- [x] 4 Ollama REST API
  - [x] [Ollama API User Guide](docs/C4/1.%20Ollama%20API%20使用指南.md) @[林通](https://github.com/kjlintong)
  - [x] [Using the Ollama API in Python](docs/C4/2.%20在%20Python%20中使用%20Ollama%20API.md) @[春阳](https://github.com/ChunyangChai)
  - [x] [Using the Ollama API in Java](docs/C4/4.%20在%20Java%20中使用%20Ollama%20API.md) @[林通](https://github.com/kjlintong)
- [x] 5 Using Ollama with LangChain
  - [x] [Python integration](docs/C5/1.%20Ollama在LangChain中的使用%20-%20Python集成.md) @[鑫民](https://github.com/fancyboi999)
  - [x] [JavaScript integration](docs/C5/2.%20Ollama在LangChain中的使用%20-%20JavaScript集成.md) @[鑫民](https://github.com/fancyboi999)
- [x] 6 Deploying a Visual Interface for Ollama
  - [x] [Deploying a chat interface for Ollama with FastAPI](docs/C6/1.%20使用%20FastAPI%20部署%20Ollama%20可视化对话界面.md) @[友东](https://github.com/AXYZdong)
  - [x] [Deploying a chat interface for Ollama with WebUI](docs/C6/2.%20使用%20WebUI%20部署%20Ollama%20可视化对话界面.md) @[友东](https://github.com/AXYZdong)
- [ ] 7 Example Applications
  - [ ] Building a local AI Copilot coding assistant
  - [ ] Connecting Dify to a locally deployed Ollama model
  - [x] Building a local RAG application with LangChain @[舒凡](https://github.com/Tsumugii24)
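Chapters 4 and 5 above walk through the HTTP API and client integrations; as a rough sketch of the multi-turn case (again assuming the default local server at `localhost:11434` and a model name such as `llama3.2`, neither of which this README fixes), a conversation against Ollama's `/api/chat` endpoint can be driven like this:

```python
import json
import urllib.request

def append_turn(history: list, role: str, content: str) -> list:
    """Return a new history list with one more message in Ollama's chat format."""
    return history + [{"role": role, "content": content}]

def chat(messages: list, model: str = "llama3.2",
         host: str = "http://localhost:11434") -> dict:
    """POST one non-streaming request to Ollama's /api/chat endpoint."""
    body = json.dumps({"model": model, "messages": messages, "stream": False})
    req = urllib.request.Request(
        f"{host}/api/chat",
        data=body.encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

# With a running server, the assistant's reply text is at:
# reply = chat(append_turn([], "user", "Hello!"))["message"]["content"]
```

Keeping the message history on the client and resending it on each call is what makes the exchange multi-turn; the server itself is stateless between requests.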

## Acknowledgements

Special thanks to everyone who contributed to this tutorial!

<a href="https://github.com/AXYZdong/handy-ollama/graphs/contributors">
  <img src="https://contrib.rocks/image?repo=AXYZdong/handy-ollama" />
</a>
* Contents
* [Chapter 1: Introduction to Ollama](C1/1.%20Ollama介绍.md)
* [Chapter 2: Installing and Configuring Ollama](C2/1.%20Ollama在macOS下的安装与配置.md)
  * [2.1 macOS](C2/1.%20Ollama在macOS下的安装与配置.md)
  * [2.2 Windows](C2/2.%20Ollama在Windows下的安装与配置.md)
  * [2.3 Linux](C2/3.%20Ollama在Linux下的安装与配置.md)
  * [2.4 Docker](C2/4.%20Ollama在Docker下的安装与配置.md)
* [Chapter 3: Importing Custom Models](C3/1.%20自定义导入模型.md)
* Chapter 4: Ollama REST API
  * [4.1 Ollama API User Guide](C4/1.%20Ollama%20API%20使用指南.md)
  * [4.2 Using the Ollama API in Python](C4/2.%20在%20Python%20中使用%20Ollama%20API.md)
  * [4.3 Using the Ollama API in Java](C4/3.%20在%20Java%20中使用%20Ollama%20API.md)
* Chapter 5: Using Ollama with LangChain
  * [5.1 Python integration](C5/1.%20Ollama在LangChain中的使用%20-%20Python集成.md)
  * [5.2 JavaScript integration](C5/2.%20Ollama在LangChain中的使用%20-%20JavaScript集成.md)
* Chapter 6: Deploying a Visual Interface for Ollama
  * [6.1 Deploying a chat interface for Ollama with FastAPI](C6/1.%20使用%20FastAPI%20部署%20Ollama%20可视化对话界面.md)
  * [6.2 Deploying a chat interface for Ollama with WebUI](C6/2.%20使用%20WebUI%20部署%20Ollama%20可视化对话界面.md)
* Chapter 7: Example Applications
  * Building a local AI Copilot coding assistant
  * Connecting Dify to a locally deployed Ollama model
  * Building a local RAG application with LangChain
<!DOCTYPE html>
<html lang="en">

<head>
  <meta charset="UTF-8">
  <title>💻 动手学 Ollama 🦙</title>
  <meta http-equiv="X-UA-Compatible" content="IE=edge,chrome=1" />
  <meta name="description" content="Description">
  <meta name="viewport"
    content="width=device-width, user-scalable=no, initial-scale=1.0, maximum-scale=1.0, minimum-scale=1.0">
  <link rel="stylesheet" href="//cdn.jsdelivr.net/npm/docsify@latest/lib/themes/vue.css">
</head>

<body>
  <div id="app"></div>
  <script src="//cdn.jsdelivr.net/npm/mermaid@latest/dist/mermaid.min.js"></script>
  <script>
    window.$docsify = {
      name: '💻 动手学 Ollama 🦙',
      repo: 'https://github.com/AXYZdong/handy-ollama',
      loadSidebar: true,
      auto2top: true,
      subMaxLevel: 2,
      alias: {
        '/.*/_sidebar.md': '/_sidebar.md'
      },
      pagination: {
        previousText: 'Previous chapter',
        nextText: 'Next chapter',
      },
      count: {
        countable: true,
        fontsize: '0.9em',
        color: 'rgb(90,90,90)',
        language: 'chinese'
      }
    }
  </script>
  <!-- Put them above docsify.min.js -->
  <script src="//cdn.jsdelivr.net/npm/docsify@latest/lib/docsify.min.js"></script>
  <!-- code rendering -->
  <script src="//cdn.jsdelivr.net/npm/prismjs@latest/components/prism-bash.js"></script>
  <script src="//cdn.jsdelivr.net/npm/prismjs@latest/components/prism-python.js"></script>
  <script src="//cdn.jsdelivr.net/npm/docsify-pagination@latest/dist/docsify-pagination.min.js"></script>
  <script src="//cdn.jsdelivr.net/npm/docsify-copy-code"></script>

  <script src="https://cdn.jsdelivr.net/npm/katex@latest/dist/katex.min.js"></script>
  <link rel="stylesheet" href="//cdn.jsdelivr.net/npm/katex@latest/dist/katex.min.css" />
  <script src="https://cdn.jsdelivr.net/npm/marked@3"></script>
  <!-- CDN files for docsify-katex -->
  <script src="//cdn.jsdelivr.net/npm/docsify-katex@latest/dist/docsify-katex.js"></script>
  <!-- word count -->
  <script src="//unpkg.com/docsify-count/dist/countable.js"></script>
</body>

</html>