final readme touches

stelios-ritual committed Oct 7, 2024
1 parent c11be3a commit 77d6b86

Showing 10 changed files with 98 additions and 370 deletions.
15 changes: 12 additions & 3 deletions projects/gpt4/container/README.md

In this example, we will run a minimalist container that makes use of the OpenAI [completions API](https://platform.openai.com/docs/api-reference/chat) to serve text generation requests.

Check out the full tutorial [here](https://learn.ritual.net/examples/running_gpt_4).

## Requirements

To use the model, you'll need an OpenAI API key. Get one on [OpenAI](https://openai.com/)'s website.
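This README does not show how the key is wired into the container; a common pattern (an assumption here, not something this tutorial specifies) is to expose it as an environment variable that the container's configuration then references:

```shell
# Hypothetical setup: export the key so the container tooling can read it.
# "sk-your-key-here" is a placeholder, not a real key.
export OPENAI_API_KEY="sk-your-key-here"
```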

## Build the Container

Simply run the following command to build the container:

```bash
make build
```

## Run the Container

To run the container, you can use the following command:

```bash
make run
```

## Test the Container

You can test the container by making inference requests directly via your terminal:

```bash
curl -X POST localhost:3000/service_output -H "Content-Type: application/json" \
  ...
```
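The request body is truncated in this diff. As a sketch, a request to the same endpoint could be constructed programmatically; note that the payload fields below are assumptions for illustration, not the container's confirmed schema:

```python
import json
import urllib.request

# Hypothetical payload -- the exact schema expected by the gpt4 container
# is not shown here, so treat the "prompt" field as a placeholder.
def build_request(prompt: str) -> urllib.request.Request:
    body = json.dumps({"prompt": prompt}).encode("utf-8")
    return urllib.request.Request(
        "http://localhost:3000/service_output",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

if __name__ == "__main__":
    req = build_request("Say hello!")
    # Actually sending the request requires the container to be running:
    # with urllib.request.urlopen(req) as resp:
    #     print(resp.read().decode())
    print(req.full_url)
```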
76 changes: 32 additions & 44 deletions projects/hello-world/container/README.md
# Creating a `hello-world` container

In this tutorial, we'll create a simple, Infernet-compatible `hello-world` container. Check out the full tutorial [here](https://learn.ritual.net/examples/hello_world).

**Note:** This directory already includes the final result of this tutorial. Therefore, we recommend you follow the tutorial in a new directory.

Let's get started! 🎉

## Step 1: Create a simple app

First, we'll create a simple flask-app that returns a hello-world message. We begin by creating a `src` directory:

```bash
mkdir src
```

Inside `src`, we'll create an `app.py` file with the following content:

```python
from typing import Any

...

def create_app() -> Flask:
    ...
    return app
```
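Most of `app.py` is collapsed in the diff above. For reference, here is a minimal sketch of what the full file might look like; the route bodies are an assumption, reconstructed from the endpoint descriptions and the sample output later in this tutorial:

```python
from typing import Any

from flask import Flask, request


def create_app() -> Flask:
    app = Flask(__name__)

    @app.route("/")
    def index() -> str:
        # Used to ping the service and check that it is alive
        return "Hello world service!"

    @app.route("/service_output", methods=["POST"])
    def inference() -> dict[str, Any]:
        # Called by the Infernet Node (or manually, for debugging)
        input: dict[str, Any] = request.get_json()
        return {"output": f"hello, world!, your input was: {input}"}

    return app
```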

As you can see, the app has two endpoints: `/` and `/service_output`. The first one is simply used to ping the service, while the second one is used for requesting jobs.

We can see that our app uses the `flask` package. Additionally, we'll need to install the `gunicorn` package to run the app. We'll create a `requirements.txt` file with the following content:

```
Flask>=3.0.0,<4.0.0
gunicorn>=22.0.0,<23.0.0
```

## Step 2: Create a Dockerfile

Next, we'll create a Dockerfile that builds and runs our app. At the top-level directory, create a `Dockerfile` with the following content:

```dockerfile
FROM python:3.11-slim as builder
# ...
```

This is a simple Dockerfile that:

1. Uses the slim `python:3.11` base image
2. Installs the dependencies
3. Copies the source code
4. Runs the app on port `3000`
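Most of the Dockerfile is collapsed in this diff. As a sketch of what a matching multi-stage Dockerfile could look like, with the exact paths and flags being assumptions rather than the file's actual contents:

```dockerfile
# Builder stage: install dependencies into a virtual environment
FROM python:3.11-slim as builder
WORKDIR /app
COPY src/requirements.txt .
RUN python -m venv /opt/venv && \
    /opt/venv/bin/pip install --no-cache-dir -r requirements.txt

# Runtime stage: copy only the virtual environment and the source code
FROM python:3.11-slim
ENV PATH="/opt/venv/bin:$PATH"
WORKDIR /app
COPY --from=builder /opt/venv /opt/venv
COPY src/ src/

# Infernet's orchestrator expects container apps on port 3000
EXPOSE 3000
CMD ["gunicorn", "--bind", "0.0.0.0:3000", "src.app:create_app()"]
```

Gunicorn accepts the `module:create_app()` factory syntax shown here, which matches the `create_app` function from Step 1.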

**Important:** The app must be exposed on port `3000`. Infernet's orchestrator always assumes that container apps are exposed on that port within the container. Users can then remap this port to any host port using the `port` parameter in the container specs.

By now, your project directory should look like this:

```
.
├── Dockerfile
├── README.md
└── src
    ├── __init__.py
    ├── app.py
    └── requirements.txt
```

## Step 3: Build the container

Now, we can build the container. At the top-level directory, run:

```bash
docker build -t hello-world .
```

## Step 4: Run the container

Finally, we can run the container. In one terminal, run:

```bash
docker run --rm -p 3000:3000 --name hello hello-world
```

## Step 5: Ping the container

In another terminal, run:

```bash
curl localhost:3000
```

It should return something like:

```
Hello world service!
```

Congratulations! You've created a simple hello-world container that can be used with Infernet. 🎉

## Step 6: Request a service output

Now, let's request a service output. Note that this endpoint is called by the Infernet Node, not the user. However, for development and debugging purposes, it's useful to call it directly.

In your terminal, run:

```bash
curl -X POST localhost:3000/service_output -H "Content-Type: application/json" -d '{"input": "hello"}'
```

The output should be something like:

```json
{"output": "hello, world!, your input was: {'input': 'hello'}"}
```

Your users will never call this endpoint directly. Instead, they will either:

1. Create an offchain [Job Request](#step-6-request-a-service-output) via the node API, or
2. Create an onchain [Subscription](https://docs.ritual.net/infernet/sdk/architecture#subscriptions) from their smart contract.
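As an illustration of option 1, an offchain job request submitted to a local Infernet Node's API might carry a payload like the following; the exact endpoint and schema are assumptions based on typical node API usage, so check the node documentation for specifics:

```json
{
  "containers": ["hello-world"],
  "data": {"input": "hello"}
}
```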

## Next steps

Check out the more advanced examples in this repository, as well as their walkthrough tutorials on [Ritual Learn](https://learn.ritual.net/examples/overview).