From 92664a2b034217bbf2af60fa06224caff121a2e1 Mon Sep 17 00:00:00 2001
From: jmoreira-valory
Date: Thu, 21 Mar 2024 13:28:04 +0100
Subject: [PATCH] doc: fix return type

---
 README.md | 45 ++++++++++++++++++++++++++++-----------------
 1 file changed, 28 insertions(+), 17 deletions(-)

diff --git a/README.md b/README.md
index 1379d69a..18ac94dc 100644
--- a/README.md
+++ b/README.md
@@ -11,14 +11,15 @@

- The execution of AI tasks, such as image generation using DALL-E, prompt processing with ChatGPT, or more intricate operations involving on-chain transactions, poses a number of challenges, including:

  - Access to proprietary APIs, which may come with associated fees/subscriptions.
  - Proficiency in the usage of the related open-source technologies, which may entail facing their inherent complexities.

AI Mechs run on the [Gnosis chain](https://www.gnosis.io/) and enable you to post *AI task requests* on-chain and have their results delivered back to you efficiently. An AI Mech will execute these tasks for you. All you need is some xDAI in your wallet to reward the worker service executing your task. AI Mechs are **hassle-free**, **crypto-native**, and **infinitely composable**.

> :bulb: These are just a few ideas on what capabilities can be brought on-chain with AI Mechs:
>
> - fetch real-time **web search** results
> - integrate **multi-sig wallets**
> - **simulate** chain transactions

@@ -61,7 +62,7 @@ The project consists of three components:

5. The Mech reads the request metadata from IPFS using its hash.

6. The Mech selects the appropriate tool to handle the request from the `tool` entry in the metadata, and runs the tool with the given arguments, usually a prompt. In this example, the Mech has been requested to interact with OpenAI's API, so it forwards the prompt to it, but the tool can implement any other desired behavior.

7. The Mech gets a response from the tool.

@@ -104,7 +105,7 @@ Follow these instructions to have your local environment prepared to run the demo

   ```bash
   poetry install && poetry shell
   ```

2. Fetch the software packages using the [Open Autonomy](https://docs.autonolas.network/open-autonomy/) CLI:

   ```bash
   autonomy packages sync --update-packages
   ```

@@ -137,16 +138,19 @@ Now, you have two options to run the worker: as a standalone agent or as a servi

### Option 1: Run the Mech as a standalone agent

1. Ensure you have a file with a private key (`ethereum_private_key.txt`). You can generate a new private key file using the Open Autonomy CLI:

   ```bash
   autonomy generate-key ethereum
   ```

2. From one terminal, run the agent:

   ```bash
   bash run_agent.sh
   ```

3. From another terminal, run the Tendermint node:

   ```bash
   bash run_tm.sh
   ```

@@ -154,16 +158,19 @@ Now, you have two options to run the worker: as a standalone agent or as a servi

### Option 2: Run the Mech as an agent service

1. Ensure you have a file with the agent address and private key (`keys.json`). You can generate a new `keys.json` file using the Open Autonomy CLI:

   ```bash
   autonomy generate-key ethereum -n 1
   ```
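Before moving on to step 2 below, you need the agent address from the `keys.json` you just generated. The helper below is a convenience sketch, not part of this repository; it assumes `keys.json` is a JSON list of entries with an `address` field (the layout that `autonomy generate-key ethereum -n 1` typically produces), so verify against your own file before relying on it.

```python
import json

# Assumption: keys.json is a JSON list of {"address": ..., "private_key": ...}
# entries, as typically written by `autonomy generate-key ethereum -n 1`.
with open("keys.json", "r", encoding="utf-8") as key_file:
    keys = json.load(key_file)

addresses = [entry["address"] for entry in keys]

# Paste the printed line into .1env (step 2 below).
print(f"ALL_PARTICIPANTS='{json.dumps(addresses)}'")
```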
2. Ensure that the variable `ALL_PARTICIPANTS` in the file `.1env` contains the agent address from `keys.json`:

   ```bash
   ALL_PARTICIPANTS='["your_agent_address"]'
   ```

3. Run the service:

   ```bash
   bash run_service.sh
   ```

@@ -178,21 +185,20 @@ Use the [mech-client](https://github.com/valory-xyz/mech-client), which can be u

To perform mech requests from your service, use the [mech_interact_abci skill](https://github.com/valory-xyz/IEKit/tree/main/packages/valory/skills/mech_interact_abci). This skill abstracts away all the IPFS and contract interactions, so you only need to take care of the following:

- Add the `mech_interact_abci` skill to your dependency list in `packages.json`, `aea-config.yaml`, and any composed `skill.yaml`.

- Import [MechInteractParams and MechResponseSpecs in your `models.py` file](https://github.com/valory-xyz/IEKit/blob/main/packages/valory/skills/impact_evaluator_abci/models.py#L88). You will also need to copy [some dataclasses to your rounds.py](https://github.com/valory-xyz/IEKit/blob/main/packages/valory/skills/twitter_scoring_abci/rounds.py#L66-L97).

- Add `mech_requests` and `mech_responses` to your skill's `SynchronizedData` class ([see here](https://github.com/valory-xyz/IEKit/blob/main/packages/valory/skills/twitter_scoring_abci/rounds.py#L181-193)).

- To send a request, [prepare the request metadata](https://github.com/valory-xyz/IEKit/blob/main/packages/valory/skills/twitter_scoring_abci/behaviours.py#L857), write it to [`synchronized_data.mech_requests`](https://github.com/valory-xyz/IEKit/blob/main/packages/valory/skills/twitter_scoring_abci/rounds.py#L535), and [transition into mech_interact](https://github.com/valory-xyz/IEKit/blob/main/packages/valory/skills/twitter_scoring_abci/rounds.py#L736).

- You will need to appropriately chain the `mech_interact_abci` skill with your other skills ([see here](https://github.com/valory-xyz/IEKit/blob/main/packages/valory/skills/impact_evaluator_abci/composition.py#L66)) and with `transaction_settlement_abci`.

- After the interaction finishes, the responses will be inside [`synchronized_data.mech_responses`](https://github.com/valory-xyz/IEKit/blob/main/packages/valory/skills/twitter_scoring_abci/behaviours.py#L903).

For a complete list of required changes, [use this PR as a reference](https://github.com/valory-xyz/market-creator/pull/91).
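Whether you go through mech-client or through `mech_interact_abci`, the request metadata mentioned above is ultimately a small JSON document stored on IPFS, with the on-chain request carrying only its hash. The sketch below shows its general shape for illustration only; of the fields shown, just `tool` and `prompt` are implied by this README, and the authoritative schema is defined by the Mech packages.

```python
import json

# Illustrative request metadata (not the authoritative schema): the Mech
# picks the tool from the "tool" entry and passes the remaining arguments,
# usually a prompt, to it.
request_metadata = {
    "prompt": "Generate an image of a robot reading a newspaper",
    "tool": "openai-gpt-3.5-turbo",  # must be a tool the target Mech serves
}

# This JSON document is what lives on IPFS; the on-chain request only
# references its hash.
print(json.dumps(request_metadata, indent=2, sort_keys=True))
```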
## Build your own

You can create and mint your own AI Mech that handles requests for tasks that you can define.

@@ -200,7 +206,7 @@ You can create and mint your own AI Mech that handles requests for tasks that yo

1. **Create a new tool.** Tools are the components that execute the requests for AI tasks submitted on [Mech Hub](https://aimechs.autonolas.network/mech). Tools are custom components and should be placed under the `customs` packages (e.g., [valory tools](./packages/valory/customs)). Each tool file must contain a `run` function satisfying the following interface (a minimal illustrative sketch of such a tool follows step 2 below):

   ```python
   def run(**kwargs) -> Tuple[str, Optional[str], Optional[Dict[str, Any]], Any]:
       """Run the task"""

       # Your code here
   ```

@@ -217,7 +223,7 @@ You can create and mint your own AI Mech that handles requests for tasks that yo

   The API keys configured for the service are passed in through the keyword arguments and can be read as, e.g., `kwargs["api_keys"]["openai"]`.

   - **Output**: It must **always** return a tuple (`Tuple[str, Optional[str], Optional[Dict[str, Any]], Any]`):
     - `result_str`: A string-serialized JSON object containing the result of the tool execution (custom format).
     - `prompt_used`: A string representing the prompt used internally by the tool. This output is only used for analytics and it can be set to `None`.
     - `generated_tx`: A dictionary containing the fields of a generated transaction to be submitted following the execution of the tool (e.g., a token transfer). It can be set to `None`. Template of a generated transaction:

@@ -242,22 +248,25 @@ You can create and mint your own AI Mech that handles requests for tasks that yo

   You should see an output similar to this:

   ```text
   Pushing: /home/ardian/vlr/mech/packages/valory/customs/openai_request
   Pushed component with:
       PublicId: valory/openai_request:0.1.0
       Package hash: bafybeibdcttrlgp5udygntka5fofi566pitkxhquke37ng7csvndhy4s2i
   ```

   Your tool will be available in [packages.json](packages/packages.json).
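Before configuring the service in step 3, here is the minimal illustrative tool sketch referenced in step 1. It is not one of the tools shipped in this repository; the echo behaviour and the assumption that `kwargs` carries `prompt` and `api_keys` entries are illustrative choices under the interface described above.

```python
"""A minimal, illustrative Mech tool; not one of the tools shipped in this repository."""
import json
from typing import Any, Dict, Optional, Tuple


def run(**kwargs: Any) -> Tuple[str, Optional[str], Optional[Dict[str, Any]], Any]:
    """Run the task: echo the prompt back as a string-serialized JSON result."""
    # Assumed inputs: the request prompt and the API keys configured for the service.
    prompt: Optional[str] = kwargs.get("prompt")
    api_keys: Dict[str, str] = kwargs.get("api_keys", {})

    if prompt is None:
        # result_str must always be a string-serialized JSON object.
        return json.dumps({"error": "No prompt provided."}), None, None, None

    # result_str: the tool's result in a custom, string-serialized JSON format.
    result_str = json.dumps({"echo": prompt, "openai_key_configured": "openai" in api_keys})
    # prompt_used: the prompt used internally (analytics only; may be None).
    prompt_used = prompt
    # generated_tx: an optional transaction to submit after the tool runs; None here.
    generated_tx = None
    # The fourth element is not described above; this sketch simply returns None.
    return result_str, prompt_used, generated_tx, None
```

A real tool would replace the echo logic with calls to the relevant API using the configured keys and, if needed, populate `generated_tx`.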
3. **Configure your service.** Edit the `.env` file. The demo service has this configuration:

   ```bash
   FILE_HASH_TO_TOOLS=[["bafybeiaodddyn4eruafqg5vldkkjfglj7jg76uvyi5xhi2cysktlu4w6r4",["openai-gpt-3.5-turbo-instruct","openai-gpt-3.5-turbo","openai-gpt-4"]],["bafybeiepc5v4ixwuu5m6p5stck5kf2ecgkydf6crj52i5umnl2qm5swb4i",["stabilityai-stable-diffusion-v1-5","stabilityai-stable-diffusion-xl-beta-v2-2-2","stabilityai-stable-diffusion-512-v2-1","stabilityai-stable-diffusion-768-v2-1"]]]
   API_KEYS=[["openai","dummy_api_key"],["stabilityai","dummy_api_key"]]
   ```

   To add your new tool with hash `<your_tool_hash>`, sub-tool list `[a, b, c]`, and API key `<your_api_key>`, simply update the variables above to:

   ```bash
   FILE_HASH_TO_TOOLS=[[<your_tool_hash>, [a, b, c]],["bafybeiaodddyn4eruafqg5vldkkjfglj7jg76uvyi5xhi2cysktlu4w6r4",["openai-gpt-3.5-turbo-instruct","openai-gpt-3.5-turbo","openai-gpt-4"]],["bafybeiepc5v4ixwuu5m6p5stck5kf2ecgkydf6crj52i5umnl2qm5swb4i",["stabilityai-stable-diffusion-v1-5","stabilityai-stable-diffusion-xl-beta-v2-2-2","stabilityai-stable-diffusion-512-v2-1","stabilityai-stable-diffusion-768-v2-1"]]]
   API_KEYS=[[openai, dummy_api_key],[<your_api_name>, <your_api_key>]]
   ```

@@ -269,6 +278,7 @@ You can create and mint your own AI Mech that handles requests for tasks that yo

> AI Mechs run on the [Gnosis chain](https://www.gnosis.io/). You must ensure that your wallet is connected to the [Gnosis chain](https://www.gnosis.io/) before using the [Autonolas Protocol](https://protocol.autonolas.network/services/mint) and [Mech Hub](https://aimechs.autonolas.network/factory).

Here is an example of the agent NFT metadata once you create the Mech:

```json
{
    "name": "Autonolas Mech III",
```

@@ -324,15 +334,16 @@ You can create and mint your own AI Mech that handles requests for tasks that yo

- `prediction-online`: In addition to training data, it also uses online information to improve the prediction.

## How key files look

A key file is just a file containing your Ethereum private key as a hex string, for example:

```text
0x0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcdef
```

Make sure you don't have any extra characters in the file, like newlines or spaces.

## Examples of deployed Mechs

| Network | Service |
| :---: | :---: |
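As a closing aside to the *How key files look* section above, the sketch below checks that a key file contains nothing but a hex-encoded private key. It is a convenience only; the accepted format (64 hex characters, with an optional `0x` prefix as in the example above) is an assumption based on that section, not a specification.

```python
import re
import sys

# Assumption: a valid key file holds exactly one hex-encoded 32-byte key,
# optionally prefixed with "0x" (as in the example above), and nothing else.
KEY_PATTERN = re.compile(r"^(0x)?[0-9a-fA-F]{64}$")


def check_key_file(path: str) -> bool:
    """Return True if the file contains only a well-formed private key."""
    with open(path, "r", encoding="utf-8") as key_file:
        content = key_file.read()
    if content != content.strip():
        # The README warns against extra characters such as newlines or spaces.
        print("Warning: the file contains extra whitespace or newlines.")
        return False
    return bool(KEY_PATTERN.match(content))


if __name__ == "__main__":
    path = sys.argv[1] if len(sys.argv) > 1 else "ethereum_private_key.txt"
    print("OK" if check_key_file(path) else "The key file looks malformed.")
```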