diff --git a/CHANGELOG.md b/CHANGELOG.md
index 6ac99fe..8b8e2ef 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -11,6 +11,12 @@ All notable changes to this project will be documented in this file.
 - Simplified examples to the minimum core functionality necessary and removed all dependencies on `infernet-ml`.
 - Updated images used for deploying the Infernet Node.
 
+## [1.0.2] - 2024-07-31
+
+### Changed
+- Set `trail_head_blocks` to `0` in `config.json` for all projects. This fixes an issue where the node would not start due to a lack of trailing blocks.
+- Updated `registry_address` to `0x663F3ad617193148711d28f5334eE4Ed07016602` to point to the correct registry address.
+
 ## [1.0.1] - 2024-07-31
 
 ### Fixed
diff --git a/README.md b/README.md
index e374c87..504dc16 100644
--- a/README.md
+++ b/README.md
@@ -14,6 +14,6 @@ model to infernet. Using this example will make it easier for you to deploy your
 4. [Prompt to NFT](projects/prompt-to-nft/prompt-to-nft.md): In this example, we use [stablediffusion](https://github.com/Stability-AI/stablediffusion)
 to mint NFTs on-chain using a prompt.
 5. [TGI Inference with Mistral-7b](projects/tgi-llm/tgi-llm.md): This example shows you how to deploy an arbitrary
-LLM model using [Huggingface's TGI](https://huggingface.co/docs/text-generation-inference/en/index), and use it with an infernet node.
+LLM model using [Huggingface's TGI](https://huggingface.co/docs/text-generation-inference/en/index), and use it with an Infernet Node.
 6. [Running OpenAI's GPT-4 on Infernet](projects/gpt4/gpt4.md): This example shows you how to deploy OpenAI's GPT-4
 model to infernet.
diff --git a/projects/gpt4/contracts/README.md b/projects/gpt4/contracts/README.md
index b36189f..2b3344e 100644
--- a/projects/gpt4/contracts/README.md
+++ b/projects/gpt4/contracts/README.md
@@ -2,7 +2,7 @@
 
 This is a minimalist foundry project that implements a [callback consumer](https://docs.ritual.net/infernet/sdk/consumers/Callback)
 that makes a prompt to the [container](../container/README.md), which then makes a call to OpenAI's GPT4. For an
-end-to-end flow of how this works, follow the [guide here](../gpt4.md).
+end-to-end flow of how this works, follow our [GPT4 tutorial](https://learn.ritual.net/examples/running_gpt_4).
 
 ## Deploying
 
diff --git a/projects/hello-world/contracts/README.md b/projects/hello-world/contracts/README.md
index 932e8b1..b1ac531 100644
--- a/projects/hello-world/contracts/README.md
+++ b/projects/hello-world/contracts/README.md
@@ -6,11 +6,10 @@
 contract, [`SaysGm`](./src/SaysGM.sol). This readme explains how to compile and deploy the contract to the Infernet
 Anvil Testnet network. For a detailed tutorial on how to write a consumer contract, refer to the
 [tutorial doc](./Tutorial.md).
-
 > [!IMPORTANT]
 > Ensure that you are running the following scripts with the Infernet Anvil Testnet network.
-> The [tutorial](../hello-world) at the root of this repository explains how to
-> bring up an infernet node.
+> Check out the [hello-world tutorial](https://learn.ritual.net/examples/hello_world) for a walkthrough
+> of setting up and running an Infernet Node.
 
 ### Installing the libraries
 
@@ -29,15 +28,19 @@ The deploy script at `script/Deploy.s.sol` deploys the `SaysGM` contract to the
 We have the [following make target](./Makefile#L9) to deploy the contract. Refer to the Makefile
 for more understanding around the deploy scripts.
+
 ```bash
 make deploy
 ```
 
 ### Requesting a job
+
 We also have a script called `CallContract.s.sol` that requests a job to the `SaysGM` contract.
 Refer to the [script](./script/CallContract.s.sol) for more details.
 
 Similar to deployment, you can run that script using the following convenience make target.
+
 ```bash
 make call-contract
 ```
+
 Refer to the [Makefile](./Makefile#L14) for more details.
diff --git a/projects/onnx-iris/contracts/README.md b/projects/onnx-iris/contracts/README.md
index e90b151..f25e2dc 100644
--- a/projects/onnx-iris/contracts/README.md
+++ b/projects/onnx-iris/contracts/README.md
@@ -7,8 +7,8 @@ This readme explains how to compile and deploy the contract to the Infernet Anvi
 
 > [!IMPORTANT]
 > Ensure that you are running the following scripts with the Infernet Anvil Testnet network.
-> The [tutorial](../../hello-world/README.mdADME.md) at the root of this repository explains how to
-> bring up an infernet node.
+> Check out the [ONNX tutorial](https://learn.ritual.net/examples/running_an_onnx_model) for a walkthrough
+> of setting up and running an Infernet Node.
 
 ### Installing the libraries
 
diff --git a/projects/torch-iris/contracts/README.md b/projects/torch-iris/contracts/README.md
index 4fc0051..1a67a2f 100644
--- a/projects/torch-iris/contracts/README.md
+++ b/projects/torch-iris/contracts/README.md
@@ -7,8 +7,8 @@ This readme explains how to compile and deploy the contract to the Infernet Anvi
 
 > [!IMPORTANT]
 > Ensure that you are running the following scripts with the Infernet Anvil Testnet network.
-> The [tutorial](../../hello-world/README.mdADME.md) at the root of this repository explains how to
-> bring up an infernet node.
+> Check out the [Torch tutorial](https://learn.ritual.net/examples/running_a_torch_model) for a walkthrough
+> of setting up and running an Infernet Node.
 
 ### Installing the libraries
 