Commit

Add self-hosted docs
mosquito committed Nov 28, 2024
1 parent e610621 commit a3b9bfe
Showing 6 changed files with 917 additions and 485 deletions.
91 changes: 91 additions & 0 deletions .github/workflows/docs.yml
@@ -0,0 +1,91 @@
name: Deploy Documentation to Cloudflare R2

# Trigger the workflow on pushes to the main branch
on:
  push:
    branches:
      - main

jobs:
  build-and-deploy:
    runs-on: ubuntu-latest

    steps:
      # 1. Checkout the repository
      - uses: actions/checkout@v2

      # 2. Set up Python 3.12
      - name: Setup Python 3.12
        uses: actions/setup-python@v2
        with:
          python-version: "3.12"

      # 3. Cache virtualenv
      - name: Cache virtualenv
        id: venv-cache
        uses: actions/cache@v3
        with:
          path: .venv
          key: venv-${{ runner.os }}-${{ github.job }}-${{ github.ref }}-3.12
          restore-keys: |
            venv-${{ runner.os }}-${{ github.job }}-${{ github.ref }}-
            venv-${{ runner.os }}-${{ github.job }}-
            venv-${{ runner.os }}-

      # 4. Install Poetry
      - name: Install Poetry
        run: python -m pip install poetry

      # 5. Cache Poetry and pip dependencies (Optional but recommended)
      - name: Cache Poetry and pip
        uses: actions/cache@v3
        with:
          path: |
            ~/.cache/pypoetry
            ~/.cache/pip
          key: poetry-pip-${{ runner.os }}-${{ hashFiles('**/poetry.lock') }}
          restore-keys: |
            poetry-pip-${{ runner.os }}-

      # 6. Install project dependencies using Poetry
      - name: Install Dependencies with Poetry
        run: poetry install --no-interaction --no-ansi

      # 7. Install additional requirements for documentation
      - name: Install Documentation Requirements
        run: |
          cd docs
          pip install -r requirements.txt

      # 8. Build the documentation
      - name: Build Documentation
        run: |
          cd docs
          make html
          # Adjust the build command if you're using a different tool

      # 9. Install AWS CLI (required for S3-compatible APIs like Cloudflare R2)
      - name: Install AWS CLI
        run: |
          curl "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o "awscliv2.zip"
          unzip awscliv2.zip
          sudo ./aws/install

      # 10. Configure AWS CLI for Cloudflare R2
      - name: Configure AWS CLI for Cloudflare R2
        run: |
          aws configure set aws_access_key_id ${{ secrets.CF_R2_ACCESS_KEY_ID }}
          aws configure set aws_secret_access_key ${{ secrets.CF_R2_SECRET_ACCESS_KEY }}
          aws configure set default.region us-east-1  # R2 ignores the region; us-east-1 is a safe placeholder
          aws configure set default.output json

      # 11. Sync the built documentation to the Cloudflare R2 bucket
      - name: Sync to Cloudflare R2
        env:
          CF_R2_ENDPOINT: ${{ secrets.CF_R2_ENDPOINT }}
          CF_R2_BUCKET_NAME: ${{ secrets.CF_R2_BUCKET_NAME }}
        run: |
          aws s3 sync docs/build/html s3://$CF_R2_BUCKET_NAME \
            --delete \
            --acl public-read \
            --endpoint-url $CF_R2_ENDPOINT
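
Not part of the commit: a minimal local check of the sync result, sketched under the assumption that boto3 is installed and that the same R2 endpoint, credentials and bucket name are exported as environment variables (the CF_R2_* names below mirror the workflow secrets and are an assumption).

    # Hypothetical local check that the sync step actually populated the bucket.
    import os

    import boto3

    s3 = boto3.client(
        "s3",
        endpoint_url=os.environ["CF_R2_ENDPOINT"],
        aws_access_key_id=os.environ["CF_R2_ACCESS_KEY_ID"],
        aws_secret_access_key=os.environ["CF_R2_SECRET_ACCESS_KEY"],
        region_name="us-east-1",
    )

    # List a few keys to confirm the HTML build landed in the bucket
    response = s3.list_objects_v2(Bucket=os.environ["CF_R2_BUCKET_NAME"], MaxKeys=10)
    for item in response.get("Contents", []):
        print(item["Key"], item["Size"])
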
13 changes: 0 additions & 13 deletions .readthedocs.yaml

This file was deleted.

4 changes: 4 additions & 0 deletions docs/source/conf.py
@@ -45,8 +45,12 @@
"sphinx.ext.doctest",
"sphinx.ext.coverage",
"sphinx.ext.viewcode",
"sphinxcontrib.googleanalytics",
]

googleanalytics_id = "G-VNYV7TYPS6"
googleanalytics_enabled = True

# Add any paths that contain templates here, relative to this directory.
templates_path = ["_templates"]

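Not shown in these hunks: the new sphinxcontrib.googleanalytics extension has to be importable when Sphinx runs, so the documentation requirements presumably gain the matching PyPI package. The file and entry below are an assumption for illustration, not taken from this commit.

    # docs/requirements.txt (hypothetical addition, not part of this diff)
    sphinxcontrib-googleanalytics
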
6 changes: 3 additions & 3 deletions docs/source/index.rst
@@ -410,7 +410,7 @@ And the caller side might be written like this:
`FastStream`_
~~~~~~~~~
~~~~~~~~~~~~~

**FastStream** is a powerful and easy-to-use Python library for building asynchronous services that interact with event streams.

@@ -420,10 +420,10 @@ If you need no deep dive into **RabbitMQ** details, you can use more high-level
from faststream import FastStream
from faststream.rabbit import RabbitBroker

broker = RabbitBroker("amqp://guest:guest@localhost:5672/")
app = FastStream(broker)


@broker.subscriber("user")
async def user_created(user_id: int):
    assert isinstance(user_id, int)
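
Not part of the diff: a matching publisher side for the subscriber above, sketched under the assumption that FastStream's RabbitBroker works as an async context manager and exposes broker.publish(message, queue), as in its current documentation.

    import asyncio

    from faststream.rabbit import RabbitBroker


    async def main() -> None:
        # Publish one message to the "user" queue handled by the subscriber above
        async with RabbitBroker("amqp://guest:guest@localhost:5672/") as broker:
            await broker.publish(42, "user")


    asyncio.run(main())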
