diff --git a/docs/quickstart/quickstart_chains/algorand.md b/docs/quickstart/quickstart_chains/algorand.md
index 78056149242..feec5ce6684 100644
--- a/docs/quickstart/quickstart_chains/algorand.md
+++ b/docs/quickstart/quickstart_chains/algorand.md
@@ -1,7 +1,5 @@
# Algorand Quick Start
-## Goals
-
The goal of this quick guide is to adapt the standard starter project and start indexing [all the PLANET token transfers](https://algoexplorer.io/address/ZW3ISEHZUHPO7OZGMKLKIIMKVICOUDRCERI454I3DB2BH52HGLSO67W754) from Algorand. Check out the video or follow the step-by-step instructions below.
@@ -9,13 +7,7 @@ The goal of this quick guide is to adapt the standard starter project and start
-::: warning Important
-Before we begin, make sure that you have initialised your project using the provided steps in the **[Start Here](../quickstart.md)** section.
-:::
-
-Now, let's move forward and update these configurations.
-
-Previously, in the [1. Create a New Project](../quickstart.md) section, you must have noted [3 key files](../quickstart.md#_3-make-changes-to-your-project). Let's begin updating them one by one.
+
::: tip Note
The final code of this project can be found [here](https://github.com/subquery/algorand-subql-starter/tree/main/Algorand/algorand-starter).
@@ -169,71 +161,11 @@ Here, the function receives a `AlgorandTransaction` which includes all transacti
Check out our [Mappings](../../build/mapping/algorand.md) documentation to get more information on mapping functions.
-## 4. Build Your Project
-
-Next, build your work to run your new SubQuery project. Run the build command from the project's root directory as given here:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn build
-```
-
-@tab npm
-
-```shell
-npm run-script build
-```
-
-:::
-
-::: warning Important
-Whenever you make changes to your mapping functions, you must rebuild your project.
-:::
-
-Now, you are ready to run your first SubQuery project. Let’s check out the process of running your project in detail.
-
-## 5. Run Your Project Locally with Docker
-
-Whenever you create a new SubQuery Project, first, you must run it locally on your computer and test it and using Docker is the easiest and quickiest way to do this.
-
-The `docker-compose.yml` file defines all the configurations that control how a SubQuery node runs. For a new project, which you have just initialised, you won't need to change anything.
-
-However, visit the [Running SubQuery Locally](../../run_publish/run.md) to get more information on the file and the settings.
+
-Run the following command under the project directory:
+
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn start:docker
-```
-
-@tab npm
-
-```shell
-npm run-script start:docker
-```
-
-:::
-
-::: tip Note
-It may take a few minutes to download the required images and start the various nodes and Postgres databases.
-:::
-
-## 6. Query your Project
-
-Next, let's query our project. Follow these three simple steps to query your SubQuery project:
-
-1. Open your browser and head to `http://localhost:3000`.
-
-2. You will see a GraphQL playground in the browser and the schemas which are ready to query.
-
-3. Find the _Docs_ tab on the right side of the playground which should open a documentation drawer. This documentation is automatically generated and it helps you find what entities and methods you can query.
-
-Try the following query to understand how it works for your new SubQuery starter project. Don’t forget to learn more about the [GraphQL Query language](../../run_publish/query.md).
+
```graphql
query {
@@ -431,14 +363,4 @@ You will see the result similar to below:
The final code of this project can be found [here](https://github.com/subquery/algorand-subql-starter/tree/main/Algorand/algorand-starter).
:::
-## What's next?
-
-Congratulations! You have now a locally running SubQuery project that accepts GraphQL API requests for transferring data.
-
-::: tip Tip
-
-Find out how to build a performant SubQuery project and avoid common mistakes in [Project Optimisation](../../build/optimisation.md).
-
-:::
-
-Click [here](../../quickstart/whats-next.md) to learn what should be your **next step** in your SubQuery journey.
+
diff --git a/docs/quickstart/quickstart_chains/arbitrum.md b/docs/quickstart/quickstart_chains/arbitrum.md
index eb1d944b3e8..36f9f7c7fbe 100644
--- a/docs/quickstart/quickstart_chains/arbitrum.md
+++ b/docs/quickstart/quickstart_chains/arbitrum.md
@@ -1,7 +1,5 @@
# Arbitrum Quick Start
-## Goals
-
The goal of this quick start guide is to index the total claimed dividends paid to users on the [WINR staking contract](https://arbiscan.io/address/0xddAEcf4B02A3e45b96FC2d7339c997E072b0d034) on Arbitrum. Check out the video or follow the step-by-step instructions below.
@@ -9,25 +7,15 @@ The goal of this quick start guide is to index the total claimed dividends paid
-::: warning
-Before we begin, **make sure that you have initialised your project** using the provided steps in the [Start Here](../quickstart.md) section. Please initialise an Arbitrum Nova project
-:::
-
-In every SubQuery project, there are [3 key files](../quickstart.md#_3-make-changes-to-your-project) to update. Let's begin updating them one by one.
+
::: tip Note
-The final code of this project can be found [here](https://github.com/subquery/ethereum-subql-starter/tree/main/Arbitrum/arbitrum-one-winr). We use Ethereum packages, runtimes, and handlers (e.g. `@subql/node-ethereum`, `ethereum/Runtime`, and `ethereum/*Hander`) for Arbitrum. Since Arbitrum is an EVM-compatible layer-2 scaling solution, we can use the core Ethereum framework to index it.
+The final code of this project can be found [here](https://github.com/subquery/ethereum-subql-starter/tree/main/Arbitrum/arbitrum-one-winr).
:::
-## 1. Your Project Manifest File
+We use Ethereum packages, runtimes, and handlers (e.g. `@subql/node-ethereum`, `ethereum/Runtime`, and `ethereum/*Handler`) for Arbitrum. Since Arbitrum is an EVM-compatible layer-2 scaling solution, we can use the core Ethereum framework to index it.
-The Project Manifest (`project.ts`) file works as an entry point to your Arbitrum project. It defines most of the details on how SubQuery will index and transform the chain data. For Arbitrum, there are three types of mapping handlers (and you can have more than one in each project):
-
-- [BlockHanders](../../build/manifest/arbitrum.md#mapping-handlers-and-filters): On each and every block, run a mapping function
-- [TransactionHandlers](../../build/manifest/arbitrum.md#mapping-handlers-and-filters): On each and every transaction that matches optional filter criteria, run a mapping function
-- [LogHanders](../../build/manifest/arbitrum.md#mapping-handlers-and-filters): On each and every log that matches optional filter criteria, run a mapping function
-
-Note that the manifest file has already been set up correctly and doesn’t require significant changes, but you need to import the correct contract definitions and update the datasource handlers.
+
As we are indexing all claimed dividends from the WINR contract, the first step is to import the contract abi definition which can be obtained from [here](https://arbiscan.io/address/0xddaecf4b02a3e45b96fc2d7339c997e072b0d034#code). Copy the entire contract ABI and save it as a file called `winr-staking.abi.json` in the `/abis` directory.
@@ -74,11 +62,9 @@ As we are indexing all claimed dividends from the WINR contract, the first step
The above code indicates that you will be running a `handleDividendBatch` mapping function whenever there is a `ClaimDividendBatch` log on any transaction from the [WINR contract](https://arbiscan.io/address/0xddaecf4b02a3e45b96fc2d7339c997e072b0d034#code).
-Check out our [Manifest File](../../build/manifest/arbitrum.md) documentation to get more information about the Project Manifest (`project.ts`) file.
+
-## 2. Update Your GraphQL Schema File
-
-The `schema.graphql` file determines the shape of your data from SubQuery due to the mechanism of the GraphQL query language. Hence, updating the GraphQL Schema file is the perfect place to start. It allows you to define your end goal right at the start.
+
Remove all existing entities and update the `schema.graphql` file as follows. Here you can see we are indexing block information such as the id, blockHeight and timestamp along with the user, the total rewards and the dividends.
@@ -98,48 +84,17 @@ type User @entity {
}
```
-SubQuery makes it easy and type-safe to work with your GraphQL entities, as well as smart contracts, events, transactions, and logs. SubQuery CLI will generate types from your project's GraphQL schema and any contract ABIs included in the data sources.
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn codegen
-```
-
-@tab npm
-
-```shell
-npm run-script codegen
-```
-
-:::
+
-This will create a new directory (or update the existing one) `src/types` which contains generated entity classes for each type you have defined previously in `schema.graphql`. These classes provide type-safe entity loading, and read and write access to entity fields - see more about this process in [the GraphQL Schema](../../build/graphql.md). All entities can be imported from the following directory:
+
```ts
import { Dividend, User } from "../types";
```
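+
+As a quick illustration of what these generated classes provide (a sketch only; the project's real handlers appear below, and the `totalRewards` field is an assumed name for illustration), every entity exposes static `get` and `create` helpers plus an instance `save` method:
+
+```ts
+import { User } from "../types";
+
+// Hypothetical helper: load a User by address, creating it on first sight.
+// The totalRewards field name is an assumption, not necessarily this project's schema.
+async function getOrCreateUser(address: string): Promise<User> {
+  let user = await User.get(address);
+  if (!user) {
+    user = User.create({ id: address, totalRewards: BigInt(0) });
+    await user.save();
+  }
+  return user;
+}
+```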
-As you're creating a new EVM based project, this command will also generate ABI types and save them into `src/types` using the `npx typechain --target=ethers-v5` command, allowing you to bind these contracts to specific addresses in the mappings and call read-only contract methods against the block being processed.
-
-It will also generate a class for every contract event to provide easy access to event parameters, as well as the block and transaction the event originated from. Read about how this is done in [EVM Codegen from ABIs](../../build/introduction.md#evm-codegen-from-abis).
+
-In this example SubQuery project, you would import these types like so.
-
-```ts
-import { ClaimDividendBatchLog } from "../types/abi-interfaces/WinrStakingAbi";
-```
-
-::: warning Important
-When you make any changes to the schema file, please ensure that you regenerate your types directory using the SubQuery CLI prompt `yarn codegen` or `npm run-script codegen`.
-:::
-
-Check out the [GraphQL Schema](../../build/graphql.md) documentation to get in-depth information on `schema.graphql` file.
-
-## 3. Add a Mapping Function
-
-Mapping functions define how chain data is transformed into the optimised GraphQL entities that we previously defined in the `schema.graphql` file.
+
Navigate to the default mapping function in the `src/mappings` directory. You will be able to see three exported functions: `handleBlock`, `handleLog`, and `handleTransaction`. Replace these functions with the following code:
@@ -186,71 +141,13 @@ The `handleDividendBatch` function receives a `batchDividendLog` parameter of ty
Check out our [Mappings](../../build/mapping/arbitrum.md) documentation to get more information on mapping functions.
-## 4. Build Your Project
-
-Next, build your work to run your new SubQuery project. Run the build command from the project's root directory as given here:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn build
-```
-
-@tab npm
-
-```shell
-npm run-script build
-```
-
-:::
-
-::: warning Important
-Whenever you make changes to your mapping functions, you must rebuild your project.
-:::
-
-Now, you are ready to run your first SubQuery project. Let’s check out the process of running your project in detail.
-
-## 5. Run Your Project Locally with Docker
-
-Whenever you create a new SubQuery Project, first, you must run it locally on your computer and test it and using Docker is the easiest and quickest way to do this.
+
-The `docker-compose.yml` file defines all the configurations that control how a SubQuery node runs. For a new project, which you have just initialised, you won't need to change anything.
+
-However, visit the [Running SubQuery Locally](../../run_publish/run.md) to get more information on the file and the settings.
+
-Run the following command under the project directory:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn start:docker
-```
-
-@tab npm
-
-```shell
-npm run-script start:docker
-```
-
-:::
-
-::: tip Note
-It may take a few minutes to download the required images and start the various nodes and Postgres databases.
-:::
-
-## 6. Query your Project
-
-Next, let's query our project. Follow these three simple steps to query your SubQuery project:
-
-1. Open your browser and head to `http://localhost:3000`.
-
-2. You will see a GraphQL playground in the browser and the schemas which are ready to query.
-
-3. Find the _Docs_ tab on the right side of the playground which should open a documentation drawer. This documentation is automatically generated and it helps you find what entities and methods you can query.
-
-Try the following query to understand how it works for your new SubQuery starter project. Don’t forget to learn more about the [GraphQL Query language](../../run_publish/query.md).
+
```graphql
# Write your query or mutation here
@@ -315,14 +212,4 @@ You will see the result similar to below:
The final code of this project can be found [here](https://github.com/subquery/ethereum-subql-starter/tree/main/Arbitrum/arbitrum-one-winr).
:::
-## What's next?
-
-Congratulations! You have now a locally running SubQuery project that accepts GraphQL API requests for transferring data.
-
-::: tip Tip
-
-Find out how to build a performant SubQuery project and avoid common mistakes in [Project Optimisation](../../build/optimisation.md).
-
-:::
-
-Click [here](../../quickstart/whats-next.md) to learn what should be your **next step** in your SubQuery journey.
+
diff --git a/docs/quickstart/quickstart_chains/astar-zkatana.md b/docs/quickstart/quickstart_chains/astar-zkatana.md
index 2de747f0277..45cd622a283 100644
--- a/docs/quickstart/quickstart_chains/astar-zkatana.md
+++ b/docs/quickstart/quickstart_chains/astar-zkatana.md
@@ -1,30 +1,16 @@
# Astar zKatana Testnet Quick Start
-## Goals
-
The goal of this quick start guide is to index all transfers and approval events from the [GACHA Token](https://zkatana.blockscout.com/token/0x28687c2A4638149745A0999D523f813f63b4786F) on Astar's zKatana Test Network.
-::: warning
-Before we begin, **make sure that you have initialised your project** using the provided steps in the [Start Here](../quickstart.md) section. Please initialise an Astar zKatana project.
-:::
-
-In every SubQuery project, there are [3 key files](../quickstart.md#_3-make-changes-to-your-project) to update. Let's begin updating them one by one.
+
::: tip Note
-The final code of this project can be found [here](https://github.com/subquery/ethereum-subql-starter/blob/main/Polygon/polygon-zkevm-starter).
-
-We use Ethereum packages, runtimes, and handlers (e.g. `@subql/node-ethereum`, `ethereum/Runtime`, and `ethereum/*Hander`) for Astar zKatana. Since Astar zKatana is an EVM-compatible layer-2 scaling solution, we can use the core Ethereum framework to index it.
+The final code of this project can be found [here](https://github.com/subquery/ethereum-subql-starter/tree/main/Astar/astar-zkevm-testnet-starter).
:::
-## 1. Your Project Manifest File
-
-The Project Manifest (`project.ts`) file works as an entry point to your Astar zKatana project. It defines most of the details on how SubQuery will index and transform the chain data. For Astar zKatana, there are three types of mapping handlers (and you can have more than one in each project):
-
-- [BlockHanders](../../build/manifest/ethereum.md#mapping-handlers-and-filters): On each and every block, run a mapping function
-- [TransactionHandlers](../../build/manifest/ethereum.md#mapping-handlers-and-filters): On each and every transaction that matches optional filter criteria, run a mapping function
-- [LogHanders](../../build/manifest/ethereum.md#mapping-handlers-and-filters): On each and every log that matches optional filter criteria, run a mapping function
+We use Ethereum packages, runtimes, and handlers (e.g. `@subql/node-ethereum`, `ethereum/Runtime`, and `ethereum/*Handler`) for Astar zKatana. Since Astar zKatana is an EVM-compatible layer-2 scaling solution, we can use the core Ethereum framework to index it.
-Note that the manifest file has already been set up correctly and doesn’t require significant changes, but you need to import the correct contract definitions and update the datasource handlers.
+
As we are indexing all transfers and approvals from the GACHA contract on Astar zKatana test network, the first step is to import the contract abi definition which can be obtained from any standard [ERC-20 contract](https://ethereum.org/en/developers/docs/standards/tokens/erc-20/). Copy the entire contract ABI and save it as a file called `erc20.abi.json` in the `/abis` directory.
@@ -82,11 +68,9 @@ The above code indicates that you will be running a `handleTransaction` mapping
The code also indicates that you will be running a `handleLog` mapping function whenever there is a `Transfer` event being emitted from the [GACHA Token](https://zkatana.blockscout.com/token/0x28687c2A4638149745A0999D523f813f63b4786F) on Astar's zKatana Test Network.
-Check out our [Manifest File](../../build/manifest/ethereum.md) documentation to get more information about the Project Manifest (`project.ts`) file.
+
-## 2. Update Your GraphQL Schema File
-
-The `schema.graphql` file determines the shape of your data from SubQuery due to the mechanism of the GraphQL query language. Hence, updating the GraphQL Schema file is the perfect place to start. It allows you to define your end goal right at the start.
+
Remove all existing entities and update the `schema.graphql` file as follows. Here you can see we are indexing block information such as the id, blockHeight, transfer receiver and transfer sender, along with approvals and all of the attributes related to them (such as owner, spender, etc.).
@@ -110,51 +94,21 @@ type Approval @entity {
}
```
-SubQuery makes it easy and type-safe to work with your GraphQL entities, as well as smart contracts, events, transactions, and logs. SubQuery CLI will generate types from your project's GraphQL schema and any contract ABIs included in the data sources.
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn codegen
-```
-
-@tab npm
-
-```shell
-npm run-script codegen
-```
+
-:::
-
-This will create a new directory (or update the existing one) `src/types` which contains generated entity classes for each type you have defined previously in `schema.graphql`. These classes provide type-safe entity loading, and read and write access to entity fields - see more about this process in [the GraphQL Schema](../../build/graphql.md). All entities can be imported from the following directory:
+
```ts
import { Approval, Transfer } from "../types";
-```
-
-As you're creating a new EVM based project, this command will also generate ABI types and save them into `src/types` using the `npx typechain --target=ethers-v5` command, allowing you to bind these contracts to specific addresses in the mappings and call read-only contract methods against the block being processed.
-
-It will also generate a class for every contract event to provide easy access to event parameters, as well as the block and transaction the event originated from. Read about how this is done in [EVM Codegen from ABIs](../../build/introduction.md#evm-codegen-from-abis).
-
-In this example SubQuery project, you would import these types like so.
-
-```ts
import {
ApproveTransaction,
TransferLog,
} from "../types/abi-interfaces/Erc20Abi";
```
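+
+To give a feel for these generated ABI types (a sketch only; the project's actual `handleLog` below is what creates and saves entities), a `TransferLog` carries the decoded event under `log.args`, which may be `undefined` if decoding fails, so it is worth guarding:
+
+```ts
+import assert from "assert";
+
+import { TransferLog } from "../types/abi-interfaces/Erc20Abi";
+
+// Sketch: read the decoded Transfer event fields from a typed log.
+function describeTransfer(log: TransferLog): string {
+  assert(log.args, "Transfer log did not decode any args");
+  const { from, to, value } = log.args;
+  return `${from} -> ${to}: ${value.toBigInt()} (tx ${log.transactionHash})`;
+}
+```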
-::: warning Important
-When you make any changes to the schema file, please ensure that you regenerate your types directory using the SubQuery CLI prompt `yarn codegen` or `npm run-script codegen`.
-:::
-
-Check out the [GraphQL Schema](../../build/graphql.md) documentation to get in-depth information on `schema.graphql` file.
-
-## 3. Add a Mapping Function
+
-Mapping functions define how chain data is transformed into the optimised GraphQL entities that we previously defined in the `schema.graphql` file.
+
Navigate to the default mapping function in the `src/mappings` directory. You will be able to see two exported functions `handleLog` and `handleTransaction`:
@@ -202,73 +156,13 @@ The `handleLog` function receives a `log` parameter of type `TransferLog` which
The `handleTransaction` function receives a `tx` parameter of type `ApproveTransaction` which includes transaction data in the payload. We extract this data and then save this to the store using the `.save()` function (_Note that SubQuery will automatically save this to the database_).
-Check out our [Mappings](../../build/mapping/ethereum.md) documentation to get more information on mapping functions.
-
-## 4. Build Your Project
-
-Next, build your work to run your new SubQuery project. Run the build command from the project's root directory as given here:
+
-::: code-tabs
-@tab:active yarn
+
-```shell
-yarn build
-```
-
-@tab npm
+
-```shell
-npm run-script build
-```
-
-:::
-
-::: warning Important
-Whenever you make changes to your mapping functions, you must rebuild your project.
-:::
-
-Now, you are ready to run your first SubQuery project. Let’s check out the process of running your project in detail.
-
-## 5. Run Your Project Locally with Docker
-
-Whenever you create a new SubQuery Project, first, you must run it locally on your computer and test it and using Docker is the easiest and quickest way to do this.
-
-The `docker-compose.yml` file defines all the configurations that control how a SubQuery node runs. For a new project, which you have just initialised, you won't need to change anything.
-
-However, visit the [Running SubQuery Locally](../../run_publish/run.md) to get more information on the file and the settings.
-
-Run the following command under the project directory:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn start:docker
-```
-
-@tab npm
-
-```shell
-npm run-script start:docker
-```
-
-:::
-
-::: tip Note
-It may take a few minutes to download the required images and start the various nodes and Postgres databases.
-:::
-
-## 6. Query your Project
-
-Next, let's query our project. Follow these three simple steps to query your SubQuery project:
-
-1. Open your browser and head to `http://localhost:3000`.
-
-2. You will see a GraphQL playground in the browser and the schemas which are ready to query.
-
-3. Find the _Docs_ tab on the right side of the playground which should open a documentation drawer. This documentation is automatically generated and it helps you find what entities and methods you can query.
-
-Try the following query to understand how it works for your new SubQuery starter project. Don’t forget to learn more about the [GraphQL Query language](../../run_publish/query.md).
+
```graphql
# Write your query or mutation here
@@ -349,14 +243,4 @@ You will see the result similar to below:
The final code of this project can be found [here](https://github.com/subquery/ethereum-subql-starter/tree/main/Astar/astar-zkevm-testnet-starter).
:::
-## What's next?
-
-Congratulations! You have now a locally running SubQuery project that accepts GraphQL API requests for transferring data.
-
-::: tip Tip
-
-Find out how to build a performant SubQuery project and avoid common mistakes in [Project Optimisation](../../build/optimisation.md).
-
-:::
-
-Click [here](../../quickstart/whats-next.md) to learn what should be your **next step** in your SubQuery journey.
+
diff --git a/docs/quickstart/quickstart_chains/avalanche-crabada.md b/docs/quickstart/quickstart_chains/avalanche-crabada.md
index 593bcdf49db..9e328577bff 100644
--- a/docs/quickstart/quickstart_chains/avalanche-crabada.md
+++ b/docs/quickstart/quickstart_chains/avalanche-crabada.md
@@ -1,35 +1,18 @@
# Avalanche Quick Start - Crabada NFTs
-## Goals
-
The goal of this quick start guide is to index all Crabada NFTs on Avalanche's C-chain.
-::: warning
-Before we begin, make sure that you have initialised your project using the provided steps in the [Start Here](../quickstart.md) section. **Please initialise a Avalanche project**
-:::
-
-Now, let's move forward and update these configurations.
-
-Previously, in the [1. Create a New Project](../quickstart.md) section, you must have noted [3 key files](../quickstart.md#_3-make-changes-to-your-project). Let's begin updating them one by one.
+
::: tip Note
The final code of this project can be found [here](https://github.com/subquery/ethereum-subql-starter/tree/main/Avalanche/crabada-nft).
:::
-## 1. Update Your Project Manifest File
+
::: warning Important
We use Ethereum packages, runtimes, and handlers (e.g. `@subql/node-ethereum`, `ethereum/Runtime`, and `ethereum/*Handler`) for Avalanche. Since Avalanche's C-chain is built on Ethereum's EVM, we can use the core Ethereum framework to index it.
:::
-
-The Project Manifest (`project.ts`) file works as an entry point to your Avalanche project. It defines most of the details on how SubQuery will index and transform the chain data. For Avalanche, there are three types of mapping handlers (and you can have more than one in each project):
-
-- [BlockHanders](../../build/manifest/avalanche.md#mapping-handlers-and-filters): On each and every block, run a mapping function
-- [TransactionHandlers](../../build/manifest/avalanche.md#mapping-handlers-and-filters): On each and every transaction that matches optional filter criteria, run a mapping function
-- [LogHanders](../../build/manifest/avalanche.md#mapping-handlers-and-filters): On each and every log that matches optional filter criteria, run a mapping function
-
-Note that the manifest file has already been set up correctly and doesn’t require significant changes, but you need to import the correct contract definitions and update the datasource handlers.
-
We are indexing actions from the Crabada NFT contract, so first you will need to import the contract abi definition from https://snowtrace.io/address/0xe48b3a0dc82be39bba7b895c9ff1d788a54edc47#code. You can copy the entire JSON and save it as a file `./abis/crabada.json` in the root directory.
This section in the Project Manifest now imports all the correct definitions and lists the triggers that we look for on the blockchain when indexing.
@@ -80,11 +63,9 @@ The above code indicates that you will be running a `handleNewCrab` mapping func
Additionally, whenever there is a `Transfer` log that relates to any token from the [Crabada Legacy Contract](https://snowtrace.io/address/0xCB7569a6Fe3843c32512d4F3AB35eAE65bd1D50c), we run a `handleERC721` mapping function.
-Check out our [Manifest File](../../build/manifest/avalanche.md) documentation to get more information about the Project Manifest (`project.ts`) file.
-
-## 2. Update Your GraphQL Schema File
+
-The `schema.graphql` file determines the shape of your data from SubQuery due to the mechanism of the GraphQL query language. Hence, updating the GraphQL Schema file is the perfect place to start. It allows you to define your end goal right at the start.
+
Remove all existing entities and update the `schema.graphql` file as follows. Here you can see we are indexing three entities, `Crab`, `Transfer`, and `Address`, linked by [foreign key relationships](../../build/graphql.md#entity-relationships).
@@ -120,50 +101,18 @@ type Address @entity {
}
```
-::: warning Important
-When you make any changes to the schema file, please ensure that you regenerate your types directory.
-:::
-
-SubQuery makes it easy and type-safe to work with your GraphQL entities, as well as smart contracts, events, transactions, and logs. SubQuery CLI will generate types from your project's GraphQL schema and any contract ABIs included in the data sources.
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn codegen
-```
-
-@tab npm
-
-```shell
-npm run-script codegen
-```
-
-:::
+
-This will create a new directory (or update the existing) `src/types` which contain generated entity classes for each type you have defined previously in `schema.graphql`. These classes provide type-safe entity loading, read and write access to entity fields - see more about this process in [the GraphQL Schema](../../build/graphql.md). All entites can be imported from the following directory:
+
```ts
import { Crab, Transfer, Address } from "../types";
-```
-
-As you're creating a new EVM based project, this command will also generate ABI types and save them into `src/types` using the `npx typechain --target=ethers-v5` command, allowing you to bind these contracts to specific addresses in the mappings and call read-only contract methods against the block being processed.
-
-It will also generate a class for every contract event to provide easy access to event parameters, as well as the block and transaction the event originated from. Read about how this is done in [EVM Codegen from ABIs](../../build/introduction.md#evm-codegen-from-abis).
-
-In the example Crabada SubQuery project, you would import these types like so.
-
-```ts
import { NewCrabLog, TransferLog } from "../types/abi-interfaces/Crabada";
```
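+
+One pattern worth keeping in mind before writing the handlers (a sketch, not code from this project): a single transaction can emit several `Transfer` logs, so a common way to build a unique entity id is to combine the transaction hash with the log index exposed on the typed log:
+
+```ts
+import { TransferLog } from "../types/abi-interfaces/Crabada";
+
+// Sketch: derive a collision-free id for one log within a transaction.
+function transferId(log: TransferLog): string {
+  return `${log.transactionHash}-${log.logIndex}`;
+}
+```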
-Check out the [GraphQL Schema](../../build/graphql.md) documentation to get in-depth information on `schema.graphql` file.
+
-Now that you have made essential changes to the GraphQL Schema file, let’s proceed ahead with the Mapping Function’s configuration.
-
-## 3. Add a Mapping Function
-
-Mapping functions define how chain data is transformed into the optimised GraphQL entities that we previously defined in the `schema.graphql` file.
+
Follow these steps to add a mapping function:
@@ -281,73 +230,13 @@ For the `handleNewCrab` mapping function, it receives a `NewCrabLog` which inclu
In `handleERC721`, we receive a `TransferLog` from the token transfer, and then retrieve the `nftId`, `fromAddress`, and `toAddress` from it. After checking that we have a Crab entity (and creating one if not), we then create a new `Transfer` entity that we defined in our `schema.graphql` and then save this to the store using the `.save()` function (_Note that SubQuery will automatically save this to the database_).
-Check out our [Mappings](../../build/mapping/avalanche.md) documentation to get more information on mapping functions.
-
-## 4. Build Your Project
-
-Next, build your work to run your new SubQuery project. Run the build command from the project's root directory as given here:
+
-::: code-tabs
-@tab:active yarn
+
-```shell
-yarn build
-```
-
-@tab npm
+
-```shell
-npm run-script build
-```
-
-:::
-
-::: warning Important
-Whenever you make changes to your mapping functions, you must rebuild your project.
-:::
-
-Now, you are ready to run your first SubQuery project. Let’s check out the process of running your project in detail.
-
-## 5. Run Your Project Locally with Docker
-
-Whenever you create a new SubQuery Project, first, you must run it locally on your computer and test it and using Docker is the easiest and quickiest way to do this.
-
-The `docker-compose.yml` file defines all the configurations that control how a SubQuery node runs. For a new project, which you have just initialised, you won't need to change anything.
-
-However, visit the [Running SubQuery Locally](../../run_publish/run.md) to get more information on the file and the settings.
-
-Run the following command under the project directory:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn start:docker
-```
-
-@tab npm
-
-```shell
-npm run-script start:docker
-```
-
-:::
-
-::: tip Note
-It may take a few minutes to download the required images and start the various nodes and Postgres databases.
-:::
-
-## 6. Query your Project
-
-Next, let's query our project. Follow these three simple steps to query your SubQuery project:
-
-1. Open your browser and head to `http://localhost:3000`.
-
-2. You will see a GraphQL playground in the browser and the schemas which are ready to query.
-
-3. Find the _Docs_ tab on the right side of the playground which should open a documentation drawer. This documentation is automatically generated and it helps you find what entities and methods you can query.
-
-Try the following query to understand how it works for your new SubQuery starter project. Don’t forget to learn more about the [GraphQL Query language](../../run_publish/query.md).
+
```graphql
query {
@@ -517,14 +406,4 @@ You will see the result similar to below:
The final code of this project can be found [here](https://github.com/subquery/ethereum-subql-starter/tree/main/Avalanche/crabada-nft).
:::
-## What's next?
-
-Congratulations! You have now a locally running SubQuery project that accepts GraphQL API requests for transferring data.
-
-::: tip Tip
-
-Find out how to build a performant SubQuery project and avoid common mistakes in [Project Optimisation](../../build/optimisation.md).
-
-:::
-
-Click [here](../../quickstart/whats-next.md) to learn what should be your **next step** in your SubQuery journey.
+
diff --git a/docs/quickstart/quickstart_chains/avalanche.md b/docs/quickstart/quickstart_chains/avalanche.md
index 335fe187c74..249933e92f3 100644
--- a/docs/quickstart/quickstart_chains/avalanche.md
+++ b/docs/quickstart/quickstart_chains/avalanche.md
@@ -1,7 +1,5 @@
# Avalanche Quick Start - Pangolin Rewards
-## Goals
-
The goal of this quick start guide is to index all token deposits and transfers from the Avalanche's [Pangolin token](https://snowtrace.io/address/0x88afdae1a9f58da3e68584421937e5f564a0135b).
@@ -9,32 +7,18 @@ The goal of this quick start guide is to index all token deposits and transfers
-::: warning
-Before we begin, make sure that you have initialised your project using the provided steps in the [Start Here](../quickstart.md) section. **Please initialise a Avalanche project**
-:::
-
-Now, let's move forward and update these configurations.
-
-Previously, in the [1. Create a New Project](../quickstart.md) section, you must have noted [3 key files](../quickstart.md#_3-make-changes-to-your-project). Let's begin updating them one by one.
+
::: tip Note
The final code of this project can be found [here](https://github.com/subquery/ethereum-subql-starter/tree/main/Avalanche/pangolin-rewards-tutorial).
:::
-## 1. Update Your Project Manifest File
+
::: warning Important
We use Ethereum packages, runtimes, and handlers (e.g. `@subql/node-ethereum`, `ethereum/Runtime`, and `ethereum/*Handler`) for Avalanche. Since Avalanche's C-chain is built on Ethereum's EVM, we can use the core Ethereum framework to index it.
:::
-The Project Manifest (`project.ts`) file works as an entry point to your Avalanche project. It defines most of the details on how SubQuery will index and transform the chain data. For Avalanche, there are three types of mapping handlers (and you can have more than one in each project):
-
-- [BlockHanders](../../build/manifest/avalanche.md#mapping-handlers-and-filters): On each and every block, run a mapping function
-- [TransactionHandlers](../../build/manifest/avalanche.md#mapping-handlers-and-filters): On each and every transaction that matches optional filter criteria, run a mapping function
-- [LogHanders](../../build/manifest/avalanche.md#mapping-handlers-and-filters): On each and every log that matches optional filter criteria, run a mapping function
-
-Note that the manifest file has already been set up correctly and doesn’t require significant changes, but you need to import the correct contract definitions and update the datasource handlers.
-
We are indexing actions from the Pangolin Rewards contract, so first you will need to import the contract abi definition from [here](https://snowtrace.io/token/0x88afdae1a9f58da3e68584421937e5f564a0135b). You can copy the entire JSON and save it as a file `./abis/PangolinRewards.json` in the root directory.
This section in the Project Manifest now imports all the correct definitions and lists the triggers that we look for on the blockchain when indexing.
@@ -74,11 +58,9 @@ This section in the Project Manifest now imports all the correct definitions and
The above code indicates that you will be running a `handleLog` mapping function whenever there is a `RewardPaid` log on any transaction from the [Pangolin Rewards contract](https://snowtrace.io/token/0x88afdae1a9f58da3e68584421937e5f564a0135b).
-Check out our [Manifest File](../../build/manifest/avalanche.md) documentation to get more information about the Project Manifest (`project.ts`) file.
+
-## 2. Update Your GraphQL Schema File
-
-The `schema.graphql` file determines the shape of your data from SubQuery due to the mechanism of the GraphQL query language. Hence, updating the GraphQL Schema file is the perfect place to start. It allows you to define your end goal right at the start.
+
Remove all existing entities and update the `schema.graphql` file as follows. Here you can see we are indexing two entities, `PangolinRewards` and `User`, where `receiver` is of type `User` and `rewards` contains a reverse lookup to the receiver field.
@@ -99,50 +81,18 @@ type User @entity {
}
```
-::: warning Important
-When you make any changes to the schema file, please ensure that you regenerate your types directory.
-:::
-
-SubQuery makes it easy and type-safe to work with your GraphQL entities, as well as smart contracts, events, transactions, and logs. SubQuery CLI will generate types from your project's GraphQL schema and any contract ABIs included in the data sources.
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn codegen
-```
-
-@tab npm
+
-```shell
-npm run-script codegen
-```
-
-:::
-
-This will create a new directory (or update the existing) `src/types` which contain generated entity classes for each type you have defined previously in `schema.graphql`. These classes provide type-safe entity loading, read and write access to entity fields - see more about this process in [the GraphQL Schema](../../build/graphql.md). All entites can be imported from the following directory:
+
```ts
import { PangolinRewards, User } from "../types";
-```
-
-As you're creating a new EVM based project, this command will also generate ABI types and save them into `src/types` using the `npx typechain --target=ethers-v5` command, allowing you to bind these contracts to specific addresses in the mappings and call read-only contract methods against the block being processed.
-
-It will also generate a class for every contract event to provide easy access to event parameters, as well as the block and transaction the event originated from. Read about how this is done in [EVM Codegen from ABIs](../../build/introduction.md#evm-codegen-from-abis).
-
-In this example Avalanche SubQuery project, you would import these types like so.
-
-```ts
import { RewardPaidLog } from "../types/abi-interfaces/PangolinRewards";
```
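+
+Note that decoded numeric event arguments arrive as ethers `BigNumber` values, while `BigInt` fields in the schema are stored as native `bigint`s, so mappings typically convert before saving. A minimal sketch, assuming the standard `RewardPaid(address user, uint256 reward)` event shape:
+
+```ts
+import { RewardPaidLog } from "../types/abi-interfaces/PangolinRewards";
+
+// Sketch: convert the decoded reward (an ethers BigNumber) into a bigint for storage.
+function rewardAmount(log: RewardPaidLog): bigint {
+  return log.args ? log.args.reward.toBigInt() : BigInt(0);
+}
+```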
-Check out the [GraphQL Schema](../../build/graphql.md) documentation to get in-depth information on `schema.graphql` file.
+
-Now that you have made essential changes to the GraphQL Schema file, let’s proceed ahead with the Mapping Function’s configuration.
-
-## 3. Add a Mapping Function
-
-Mapping functions define how chain data is transformed into the optimised GraphQL entities that we previously defined in the `schema.graphql` file.
+
Follow these steps to add a mapping function:
@@ -191,73 +141,13 @@ Let’s understand how the above code works.
The mapping function here receives a `RewardPaidLog` which includes transaction log data in the payload. We extract this data and first read and confirm that we have a `User` record via `checkGetUser`. We then create a new `PangolinRewards` entity that we defined in our `schema.graphql` and then save this to the store using the `.save()` function (_Note that SubQuery will automatically save this to the database_).
-Check out our [Mappings](../../build/mapping/avalanche.md) documentation to get more information on mapping functions.
-
-## 4. Build Your Project
-
-Next, build your work to run your new SubQuery project. Run the build command from the project's root directory as given here:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn build
-```
-
-@tab npm
+
-```shell
-npm run-script build
-```
-
-:::
-
-::: warning Important
-Whenever you make changes to your mapping functions, you must rebuild your project.
-:::
-
-Now, you are ready to run your first SubQuery project. Let’s check out the process of running your project in detail.
-
-## 5. Run Your Project Locally with Docker
-
-Whenever you create a new SubQuery Project, first, you must run it locally on your computer and test it and using Docker is the easiest and quickiest way to do this.
+
-The `docker-compose.yml` file defines all the configurations that control how a SubQuery node runs. For a new project, which you have just initialised, you won't need to change anything.
+
-However, visit the [Running SubQuery Locally](../../run_publish/run.md) to get more information on the file and the settings.
-
-Run the following command under the project directory:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn start:docker
-```
-
-@tab npm
-
-```shell
-npm run-script start:docker
-```
-
-:::
-
-::: tip Note
-It may take a few minutes to download the required images and start the various nodes and Postgres databases.
-:::
-
-## 6. Query your Project
-
-Next, let's query our project. Follow these three simple steps to query your SubQuery project:
-
-1. Open your browser and head to `http://localhost:3000`.
-
-2. You will see a GraphQL playground in the browser and the schemas which are ready to query.
-
-3. Find the _Docs_ tab on the right side of the playground which should open a documentation drawer. This documentation is automatically generated and it helps you find what entities and methods you can query.
-
-Try the following query to understand how it works for your new SubQuery starter project. Don’t forget to learn more about the [GraphQL Query language](../../run_publish/query.md).
+
```graphql
query {
@@ -355,14 +245,4 @@ You will see the result similar to below:
The final code of this project can be found [here](https://github.com/subquery/ethereum-subql-starter/tree/main/Avalanche/pangolin-rewards-tutorial).
:::
-## What's next?
-
-Congratulations! You have now a locally running SubQuery project that accepts GraphQL API requests for transferring data.
-
-::: tip Tip
-
-Find out how to build a performant SubQuery project and avoid common mistakes in [Project Optimisation](../../build/optimisation.md).
-
-:::
-
-Click [here](../../quickstart/whats-next.md) to learn what should be your **next step** in your SubQuery journey.
+
diff --git a/docs/quickstart/quickstart_chains/base-goerli.md b/docs/quickstart/quickstart_chains/base-goerli.md
index c57ba935c7a..08ba04d66a9 100644
--- a/docs/quickstart/quickstart_chains/base-goerli.md
+++ b/docs/quickstart/quickstart_chains/base-goerli.md
@@ -1,31 +1,17 @@
# Base Goerli Quick Start
-## Goals
-
The goal of this quick start guide is to index the total faucet drips sent to users from the [USDC Faucet contract](https://goerli.basescan.org/address/0x298e0b0a38ff8b99bf1a3b697b0efb2195cfe47d) on [Base Goerli Testnet](https://docs.base.org/using-base/).
-::: warning
-Before we begin, **make sure that you have initialised your project** using the provided steps in the [Start Here](../quickstart.md) section. Please initialise an a Base project.
-:::
-
-In every SubQuery project, there are [3 key files](../quickstart.md#_3-make-changes-to-your-project) to update. Let's begin updating them one by one.
+
::: tip Note
The final code of this project can be found [here](https://github.com/subquery/ethereum-subql-starter/tree/main/Base/base-goerli-faucet).
+
+
We use Ethereum packages, runtimes, and handlers (e.g. `@subql/node-ethereum`, `ethereum/Runtime`, and `ethereum/*Handler`) for Base. Since Base is an EVM-compatible layer-2 scaling solution, we can use the core Ethereum framework to index it.
:::
-## 1. Your Project Manifest File
-
-The Project Manifest (`project.ts`) file works as an entry point to your Base project. It defines most of the details on how SubQuery will index and transform the chain data. For Base, there are three types of mapping handlers (and you can have more than one in each project):
-
-- [BlockHanders](../../build/manifest/ethereum.md#mapping-handlers-and-filters): On each and every block, run a mapping function
-- [TransactionHandlers](../../build/manifest/ethereum.md#mapping-handlers-and-filters): On each and every transaction that matches optional filter criteria, run a mapping function
-- [LogHanders](../../build/manifest/ethereum.md#mapping-handlers-and-filters): On each and every log that matches optional filter criteria, run a mapping function
-
-Note that the manifest file has already been set up correctly and doesn’t require significant changes, but you need to import the correct contract definitions and update the datasource handlers.
-
As we are indexing all faucet drips from the USDC Faucet contract, the first step is to import the contract abi definition which can be obtained from [here](https://goerli.basescan.org/address/0x298e0b0a38ff8b99bf1a3b697b0efb2195cfe47d). Copy the entire contract ABI and save it as a file called `faucet.abi.json` in the `/abis` directory.
**Update the `datasources` section as follows:**
@@ -83,9 +69,9 @@ The above code indicates that you will be running a `handleDrip` mapping functio
Check out our [Manifest File](../../build/manifest/ethereum.md) documentation to get more information about the Project Manifest (`project.ts`) file.
-## 2. Update Your GraphQL Schema File
+
-The `schema.graphql` file determines the shape of your data from SubQuery due to the mechanism of the GraphQL query language. Hence, updating the GraphQL Schema file is the perfect place to start. It allows you to define your end goal right at the start.
+
Remove all existing entities and update the `schema.graphql` file as follows. Here you can see we are indexing block information such as the id, blockHeight and drip receiver along with an aggregation of the total value of the drip per day.
@@ -106,48 +92,18 @@ type DailyUSDCDrips @entity {
}
```
-SubQuery makes it easy and type-safe to work with your GraphQL entities, as well as smart contracts, events, transactions, and logs. SubQuery CLI will generate types from your project's GraphQL schema and any contract ABIs included in the data sources.
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn codegen
-```
-
-@tab npm
-
-```shell
-npm run-script codegen
-```
-
-:::
+
-This will create a new directory (or update the existing one) `src/types` which contains generated entity classes for each type you have defined previously in `schema.graphql`. These classes provide type-safe entity loading, and read and write access to entity fields - see more about this process in [the GraphQL Schema](../../build/graphql.md). All entities can be imported from the following directory:
+
```ts
import { Drip, DailyUSDCDrips } from "../types";
-```
-
-As you're creating a new EVM based project, this command will also generate ABI types and save them into `src/types` using the `npx typechain --target=ethers-v5` command, allowing you to bind these contracts to specific addresses in the mappings and call read-only contract methods against the block being processed.
-
-It will also generate a class for every contract event to provide easy access to event parameters, as well as the block and transaction the event originated from. Read about how this is done in [EVM Codegen from ABIs](../../build/introduction.md#evm-codegen-from-abis).
-
-In this example SubQuery project, you would import these types like so.
-
-```ts
import { DripTransaction } from "../types/abi-interfaces/FaucetAbi";
```
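+
+Because this project aggregates drips per day, the mappings need a stable day-bucket id. A minimal sketch of one way to derive it from a block timestamp in seconds (the project's `handleDailyDrips` below maintains the actual `DailyUSDCDrips` totals):
+
+```ts
+// Sketch: turn a block timestamp (seconds since epoch) into a "YYYY-MM-DD" bucket id.
+function dayId(timestampSeconds: bigint): string {
+  return new Date(Number(timestampSeconds) * 1000).toISOString().slice(0, 10);
+}
+```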
-::: warning Important
-When you make any changes to the schema file, please ensure that you regenerate your types directory using the SubQuery CLI prompt `yarn codegen` or `npm run-script codegen`.
-:::
-
-Check out the [GraphQL Schema](../../build/graphql.md) documentation to get in-depth information on `schema.graphql` file.
+
-## 3. Add a Mapping Function
-
-Mapping functions define how chain data is transformed into the optimised GraphQL entities that we previously defined in the `schema.graphql` file.
+
Navigate to the default mapping function in the `src/mappings` directory. You will be able to see two exported functions `handleDrip` and `handleDailyDrips`:
@@ -195,73 +151,13 @@ export async function handleDailyDrips(
The `handleDrip` function receives a `tx` parameter of type `DripTransaction` which includes transaction data in the payload. We extract this data and then save this to the store using the `.save()` function (_Note that SubQuery will automatically save this to the database_).
-Check out our [Mappings](../../build/mapping/ethereum.md) documentation to get more information on mapping functions.
-
-## 4. Build Your Project
-
-Next, build your work to run your new SubQuery project. Run the build command from the project's root directory as given here:
-
-::: code-tabs
-@tab:active yarn
+
-```shell
-yarn build
-```
-
-@tab npm
+
-```shell
-npm run-script build
-```
+
-:::
-
-::: warning Important
-Whenever you make changes to your mapping functions, you must rebuild your project.
-:::
-
-Now, you are ready to run your first SubQuery project. Let’s check out the process of running your project in detail.
-
-## 5. Run Your Project Locally with Docker
-
-Whenever you create a new SubQuery Project, first, you must run it locally on your computer and test it and using Docker is the easiest and quickest way to do this.
-
-The `docker-compose.yml` file defines all the configurations that control how a SubQuery node runs. For a new project, which you have just initialised, you won't need to change anything.
-
-However, visit the [Running SubQuery Locally](../../run_publish/run.md) to get more information on the file and the settings.
-
-Run the following command under the project directory:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn start:docker
-```
-
-@tab npm
-
-```shell
-npm run-script start:docker
-```
-
-:::
-
-::: tip Note
-It may take a few minutes to download the required images and start the various nodes and Postgres databases.
-:::
-
-## 6. Query your Project
-
-Next, let's query our project. Follow these three simple steps to query your SubQuery project:
-
-1. Open your browser and head to `http://localhost:3000`.
-
-2. You will see a GraphQL playground in the browser and the schemas which are ready to query.
-
-3. Find the _Docs_ tab on the right side of the playground which should open a documentation drawer. This documentation is automatically generated and it helps you find what entities and methods you can query.
-
-Try the following query to understand how it works for your new SubQuery starter project. Don’t forget to learn more about the [GraphQL Query language](../../run_publish/query.md).
+
```graphql
# Write your query or mutation here
@@ -357,14 +253,4 @@ You will see the result similar to below:
The final code of this project can be found [here](https://github.com/subquery/ethereum-subql-starter/tree/main/Base/base-goerli-faucet).
:::
-## What's next?
-
-Congratulations! You have now a locally running SubQuery project that accepts GraphQL API requests for transferring data.
-
-::: tip Tip
-
-Find out how to build a performant SubQuery project and avoid common mistakes in [Project Optimisation](../../build/optimisation.md).
-
-:::
-
-Click [here](../../quickstart/whats-next.md) to learn what should be your **next step** in your SubQuery journey.
+
diff --git a/docs/quickstart/quickstart_chains/base.md b/docs/quickstart/quickstart_chains/base.md
index d813ab884fc..cf510d3ddf2 100644
--- a/docs/quickstart/quickstart_chains/base.md
+++ b/docs/quickstart/quickstart_chains/base.md
@@ -1,34 +1,16 @@
# Base Quick Start
-## Goals
-
The goal of this quick start guide is to index all the claims from the [Bridge to Base NFT contract](https://basescan.org/token/0xEa2a41c02fA86A4901826615F9796e603C6a4491) on [Base Mainnet](https://docs.base.org/using-base/).
Here is a description from Base team about this NFT collection: _"This NFT commemorates you being early — you’re one of the first to teleport into the next generation of the internet as we work to bring billions of people onchain."_
-::: warning
-Before we begin, **make sure that you have initialised your project** using the provided steps in the [Start Here](../quickstart.md) section. Please initialise an a Base project.
-:::
-
-In every SubQuery project, there are [3 key files](../quickstart.md#_3-make-changes-to-your-project) to update. Let's begin updating them one by one.
+
::: tip Code
The final code of this project can be found [here](https://github.com/subquery/ethereum-subql-starter/tree/main/Base/base-nft).
:::
-## 1. Your Project Manifest File
-
-::: tip Etheruem
-We use Ethereum packages, runtimes, and handlers (e.g. `@subql/node-ethereum`, `ethereum/Runtime`, and `ethereum/*Hander`) for Base. Since Base is an EVM-compatible layer-2 scaling solution, we can use the core Ethereum framework to index it.
-:::
-
-The Project Manifest (`project.ts`) file works as an entry point to your Base project. It defines most of the details on how SubQuery will index and transform the chain data. For Base, there are three types of mapping handlers (and you can have more than one in each project):
-
-- [BlockHanders](../../build/manifest/ethereum.md#mapping-handlers-and-filters): On each and every block, run a mapping function
-- [TransactionHandlers](../../build/manifest/ethereum.md#mapping-handlers-and-filters): On each and every transaction that matches optional filter criteria, run a mapping function
-- [LogHanders](../../build/manifest/ethereum.md#mapping-handlers-and-filters): On each and every log that matches optional filter criteria, run a mapping function
-
-Note that the manifest file has already been set up correctly and doesn’t require significant changes, but you need to import the correct contract definitions and update the datasource handlers.
+
As we are indexing all user claims from the Bridge to Base NFT contract, the first step is to import the contract abi definition which can be obtained from [here](https://basescan.org/token/0xEa2a41c02fA86A4901826615F9796e603C6a4491#code). Copy the entire contract ABI and save it as a file called `erc721base.abi.json` in the `/abis` directory.
@@ -72,11 +54,13 @@ As we are indexing all user claims from the Bridge to Base NFT contract, the fir
The above code indicates that you will be running a `handleNftClaim` mapping function whenever there is a `TokensClaimed` event being logged on any transaction from the [Bridge to Base NFT contract](https://basescan.org/token/0xEa2a41c02fA86A4901826615F9796e603C6a4491).
-Check out our [Manifest File](../../build/manifest/ethereum.md) documentation to get more information about the Project Manifest (`project.ts`) file.
+
-## 2. Update Your GraphQL Schema File
+
-The `schema.graphql` file determines the shape of your data from SubQuery due to the mechanism of the GraphQL query language. Hence, updating the GraphQL Schema file is the perfect place to start. It allows you to define your end goal right at the start.
+::: tip Ethereum
+We use Ethereum packages, runtimes, and handlers (e.g. `@subql/node-ethereum`, `ethereum/Runtime`, and `ethereum/*Handler`) for Base. Since Base is an EVM-compatible layer-2 scaling solution, we can use the core Ethereum framework to index it.
+:::
Remove all existing entities and update the `schema.graphql` file as follows. Here you can see we are indexing block information such as the id, blockHeight, claimer and claim receiver along with an aggregation of the total quantity of NFTs claimed per day.
@@ -98,48 +82,18 @@ type DailyAggregation @entity {
}
```
-SubQuery makes it easy and type-safe to work with your GraphQL entities, as well as smart contracts, events, transactions, and logs. SubQuery CLI will generate types from your project's GraphQL schema and any contract ABIs included in the data sources.
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn codegen
-```
-
-@tab npm
+
-```shell
-npm run-script codegen
-```
-
-:::
-
-This will create a new directory (or update the existing one) `src/types` which contains generated entity classes for each type you have defined previously in `schema.graphql`. These classes provide type-safe entity loading, and read and write access to entity fields - see more about this process in [the GraphQL Schema](../../build/graphql.md). All entities can be imported from the following directory:
+
```ts
import { Claim, DailyAggregation } from "../types";
-```
-
-As you're creating a new EVM based project, this command will also generate ABI types and save them into `src/types` using the `npx typechain --target=ethers-v5` command, allowing you to bind these contracts to specific addresses in the mappings and call read-only contract methods against the block being processed.
-
-It will also generate a class for every contract event to provide easy access to event parameters, as well as the block and transaction the event originated from. Read about how this is done in [EVM Codegen from ABIs](../../build/introduction.md#evm-codegen-from-abis).
-
-In this example SubQuery project, you would import these types like so.
-
-```ts
import { TokensClaimedLog } from "../types/abi-interfaces/Erc721baseAbi";
```
-::: warning Important
-When you make any changes to the schema file, please ensure that you regenerate your types directory using the SubQuery CLI prompt `yarn codegen` or `npm run-script codegen`.
-:::
-
-Check out the [GraphQL Schema](../../build/graphql.md) documentation to get in-depth information on `schema.graphql` file.
-
-## 3. Add a Mapping Function
+
-Mapping functions define how chain data is transformed into the optimised GraphQL entities that we previously defined in the `schema.graphql` file.
+
Navigate to the default mapping function in the `src/mappings` directory. You will be able to see two exported functions `handleNftClaim` and `handleDailyAggregation`:
@@ -188,73 +142,13 @@ export async function handleDailyAggregation(
The `handleNftClaim` function receives a `log` parameter of type `TokensClaimedLog` which includes log data in the payload. We extract this data and then save this to the store using the `.save()` function (_Note that SubQuery will automatically save this to the database_).
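To make this concrete, here is a hedged sketch of the idea: persist a `Claim`, then roll it up into the `DailyAggregation` row for that calendar day. The real project splits this work across `handleNftClaim` and `handleDailyAggregation`, and the entity field names and decoded argument names (`claimer`, `receiver`, `quantityClaimed`) below are assumptions; the final code linked at the end of this page is the source of truth.

```ts
// Hedged sketch only: entity fields and event argument names are assumptions.
import assert from "assert";
import { Claim, DailyAggregation } from "../types";
import { TokensClaimedLog } from "../types/abi-interfaces/Erc721baseAbi";

export async function handleNftClaim(log: TokensClaimedLog): Promise<void> {
  assert(log.args, "Expected decoded TokensClaimed arguments");

  // Persist the individual claim
  const claim = Claim.create({
    id: log.transactionHash,
    blockHeight: BigInt(log.blockNumber),
    claimer: log.args.claimer,
    receiver: log.args.receiver,
    quantity: log.args.quantityClaimed.toBigInt(),
  });
  await claim.save();

  // Roll the claim up into a per-day aggregation keyed by the calendar date
  // (block timestamps are in seconds)
  const day = new Date(Number(log.block.timestamp) * 1000).toISOString().slice(0, 10);
  const aggregation =
    (await DailyAggregation.get(day)) ??
    DailyAggregation.create({ id: day, totalQuantity: BigInt(0) });
  aggregation.totalQuantity += claim.quantity;
  await aggregation.save();
}
```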
-Check out our [Mappings](../../build/mapping/ethereum.md) documentation to get more information on mapping functions.
+
-## 4. Build Your Project
+
-Next, build your work to run your new SubQuery project. Run the build command from the project's root directory as given here:
+
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn build
-```
-
-@tab npm
-
-```shell
-npm run-script build
-```
-
-:::
-
-::: warning Important
-Whenever you make changes to your mapping functions, you must rebuild your project.
-:::
-
-Now, you are ready to run your first SubQuery project. Let’s check out the process of running your project in detail.
-
-## 5. Run Your Project Locally with Docker
-
-Whenever you create a new SubQuery Project, first, you must run it locally on your computer and test it and using Docker is the easiest and quickest way to do this.
-
-The `docker-compose.yml` file defines all the configurations that control how a SubQuery node runs. For a new project, which you have just initialised, you won't need to change anything.
-
-However, visit the [Running SubQuery Locally](../../run_publish/run.md) to get more information on the file and the settings.
-
-Run the following command under the project directory:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn start:docker
-```
-
-@tab npm
-
-```shell
-npm run-script start:docker
-```
-
-:::
-
-::: tip Note
-It may take a few minutes to download the required images and start the various nodes and Postgres databases.
-:::
-
-## 6. Query your Project
-
-Next, let's query our project. Follow these three simple steps to query your SubQuery project:
-
-1. Open your browser and head to `http://localhost:3000`.
-
-2. You will see a GraphQL playground in the browser and the schemas which are ready to query.
-
-3. Find the _Docs_ tab on the right side of the playground which should open a documentation drawer. This documentation is automatically generated and it helps you find what entities and methods you can query.
-
-Try the following query to understand how it works for your new SubQuery starter project. Don’t forget to learn more about the [GraphQL Query language](../../run_publish/query.md).
+
```graphql
# Write your query or mutation here
@@ -350,14 +244,4 @@ You will see the result similar to below:
The final code of this project can be found [here](https://github.com/subquery/ethereum-subql-starter/tree/main/Base/base-nft).
:::
-## What's next?
-
-Congratulations! You have now a locally running SubQuery project that accepts GraphQL API requests for transferring data.
-
-::: tip Tip
-
-Find out how to build a performant SubQuery project and avoid common mistakes in [Project Optimisation](../../build/optimisation.md).
-
-:::
-
-Click [here](../../quickstart/whats-next.md) to learn what should be your **next step** in your SubQuery journey.
+
diff --git a/docs/quickstart/quickstart_chains/boba-bnb.md b/docs/quickstart/quickstart_chains/boba-bnb.md
index 7ee6321cd93..04f742a77df 100644
--- a/docs/quickstart/quickstart_chains/boba-bnb.md
+++ b/docs/quickstart/quickstart_chains/boba-bnb.md
@@ -1,32 +1,15 @@
# Boba BNB Quick Start
-## Goals
-
Boba is a multichain Layer-2 network that currently has two blockchains: Boba Mainnet (i.e. Boba ETH) and Boba BNB Chain (i.e. Boba BNB). This guide will focus on [Boba BNB](https://chainlist.org/chain/56288). For [Boba ETH](https://chainlist.org/chain/288), please refer to [Boba ETH Quick Start](./boba-eth.md).
The goal of this quick start guide is to index all transfers and approval events from the [Wrapped BOBA](https://bobascan.com/token/0xC58aaD327D6D58D979882601ba8DDa0685B505eA?chainid=56288) token on the [Boba BNB](https://bobascan.com/) network.
-::: warning
-Before we begin, **make sure that you have initialised your project** using the provided steps in the [Start Here](../quickstart.md) section. Please initialise an a Boba BNB project.
-:::
-
-In every SubQuery project, there are [3 key files](../quickstart.md#_3-make-changes-to-your-project) to update. Let's begin updating them one by one.
-
-::: tip Note
-The final code of this project can be found [here](https://github.com/subquery/ethereum-subql-starter/blob/main/Boba/boba-bnb-starter).
+::: tip Ethereum
We use Ethereum packages, runtimes, and handlers (e.g. `@subql/node-ethereum`, `ethereum/Runtime`, and `ethereum/*Handler`) for Boba BNB. Since Boba BNB is an EVM-compatible layer-2 scaling solution, we can use the core Ethereum framework to index it.
:::
-## 1. Your Project Manifest File
-
-The Project Manifest (`project.ts`) file works as an entry point to your Boba BNB project. It defines most of the details on how SubQuery will index and transform the chain data. For Poltgon zkEVM, there are three types of mapping handlers (and you can have more than one in each project):
-
-- [BlockHanders](../../build/manifest/ethereum.md#mapping-handlers-and-filters): On each and every block, run a mapping function
-- [TransactionHandlers](../../build/manifest/ethereum.md#mapping-handlers-and-filters): On each and every transaction that matches optional filter criteria, run a mapping function
-- [LogHanders](../../build/manifest/ethereum.md#mapping-handlers-and-filters): On each and every log that matches optional filter criteria, run a mapping function
-
-Note that the manifest file has already been set up correctly and doesn’t require significant changes, but you need to import the correct contract definitions and update the datasource handlers.
+
As we are indexing all transfers and approvals from the Wrapped BOBA contract on the Boba BNB network, the first step is to import the contract ABI definition, which can be obtained from any standard [ERC-20 contract](https://ethereum.org/en/developers/docs/standards/tokens/erc-20/). Copy the entire contract ABI and save it as a file called `erc20.abi.json` in the `/abis` directory.
@@ -85,11 +68,9 @@ The above code indicates that you will be running a `handleTransaction` mapping
The code also indicates that you will be running a `handleLog` mapping function whenever there is a `Transfer` event being emitted from the [WBOBA contract](https://bobascan.com/token/0xC58aaD327D6D58D979882601ba8DDa0685B505eA?chainid=56288).
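For reference, the sketch below shows how those two handlers are typically declared: a call handler filtered on the `approve` function signature and an event handler filtered on the `Transfer` topic. The exact filter strings generated in your `project.ts` may differ slightly, so treat these as illustrative.

```ts
// Illustrative handler entries only; compare with the handlers generated in your project.ts.
import { EthereumHandlerKind } from "@subql/types-ethereum";

export const handlers = [
  {
    kind: EthereumHandlerKind.Call, // runs on matching transactions
    handler: "handleTransaction",
    filter: { function: "approve(address,uint256)" },
  },
  {
    kind: EthereumHandlerKind.Event, // runs on matching logs
    handler: "handleLog",
    filter: {
      topics: ["Transfer(address indexed from, address indexed to, uint256 value)"],
    },
  },
];
```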
-Check out our [Manifest File](../../build/manifest/ethereum.md) documentation to get more information about the Project Manifest (`project.ts`) file.
+
-## 2. Update Your GraphQL Schema File
-
-The `schema.graphql` file determines the shape of your data from SubQuery due to the mechanism of the GraphQL query language. Hence, updating the GraphQL Schema file is the perfect place to start. It allows you to define your end goal right at the start.
+
Remove all existing entities and update the `schema.graphql` file as follows. Here you can see we are indexing block information such as the id, blockHeight, transfer receiver and transfer sender, along with approvals and all of the attributes related to them (such as owner, spender, etc.).
@@ -113,51 +94,21 @@ type Approval @entity {
}
```
-SubQuery makes it easy and type-safe to work with your GraphQL entities, as well as smart contracts, events, transactions, and logs. SubQuery CLI will generate types from your project's GraphQL schema and any contract ABIs included in the data sources.
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn codegen
-```
-
-@tab npm
-
-```shell
-npm run-script codegen
-```
-
-:::
+
-This will create a new directory (or update the existing one) `src/types` which contains generated entity classes for each type you have defined previously in `schema.graphql`. These classes provide type-safe entity loading, and read and write access to entity fields - see more about this process in [the GraphQL Schema](../../build/graphql.md). All entities can be imported from the following directory:
+
```ts
import { Approval, Transfer } from "../types";
-```
-
-As you're creating a new EVM based project, this command will also generate ABI types and save them into `src/types` using the `npx typechain --target=ethers-v5` command, allowing you to bind these contracts to specific addresses in the mappings and call read-only contract methods against the block being processed.
-
-It will also generate a class for every contract event to provide easy access to event parameters, as well as the block and transaction the event originated from. Read about how this is done in [EVM Codegen from ABIs](../../build/introduction.md#evm-codegen-from-abis).
-
-In this example SubQuery project, you would import these types like so.
-
-```ts
import {
ApproveTransaction,
TransferLog,
} from "../types/abi-interfaces/Erc20Abi";
```
-::: warning Important
-When you make any changes to the schema file, please ensure that you regenerate your types directory using the SubQuery CLI prompt `yarn codegen` or `npm run-script codegen`.
-:::
-
-Check out the [GraphQL Schema](../../build/graphql.md) documentation to get in-depth information on `schema.graphql` file.
-
-## 3. Add a Mapping Function
+
-Mapping functions define how chain data is transformed into the optimised GraphQL entities that we previously defined in the `schema.graphql` file.
+
Navigate to the default mapping function in the `src/mappings` directory. You will be able to see two exported functions `handleLog` and `handleTransaction`:
@@ -205,73 +156,13 @@ The `handleLog` function receives a `log` parameter of type `TransferLog` which
The `handleTransaction` function receives a `tx` parameter of type `ApproveTransaction` which includes transaction data in the payload. We extract this data and then save this to the store using the `.save()` function (_Note that SubQuery will automatically save this to the database_).
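As a reference point, here is a hedged sketch of what the `Transfer` log handler can look like. The entity field names are assumptions based on the schema described above; the linked final code is the definitive version.

```ts
// Hedged sketch: entity field names are assumptions based on the schema above.
import assert from "assert";
import { Transfer } from "../types";
import { TransferLog } from "../types/abi-interfaces/Erc20Abi";

export async function handleLog(log: TransferLog): Promise<void> {
  assert(log.args, "Expected decoded Transfer arguments");

  const transfer = Transfer.create({
    id: log.transactionHash,
    blockHeight: BigInt(log.blockNumber),
    from: log.args.from,
    to: log.args.to,
    value: log.args.value.toBigInt(),
    contractAddress: log.address,
  });

  await transfer.save();
}
```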
-Check out our [Mappings](../../build/mapping/ethereum.md) documentation to get more information on mapping functions.
-
-## 4. Build Your Project
-
-Next, build your work to run your new SubQuery project. Run the build command from the project's root directory as given here:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn build
-```
-
-@tab npm
+
-```shell
-npm run-script build
-```
-
-:::
-
-::: warning Important
-Whenever you make changes to your mapping functions, you must rebuild your project.
-:::
-
-Now, you are ready to run your first SubQuery project. Let’s check out the process of running your project in detail.
-
-## 5. Run Your Project Locally with Docker
-
-Whenever you create a new SubQuery Project, first, you must run it locally on your computer and test it and using Docker is the easiest and quickest way to do this.
-
-The `docker-compose.yml` file defines all the configurations that control how a SubQuery node runs. For a new project, which you have just initialised, you won't need to change anything.
-
-However, visit the [Running SubQuery Locally](../../run_publish/run.md) to get more information on the file and the settings.
-
-Run the following command under the project directory:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn start:docker
-```
-
-@tab npm
-
-```shell
-npm run-script start:docker
-```
-
-:::
-
-::: tip Note
-It may take a few minutes to download the required images and start the various nodes and Postgres databases.
-:::
-
-## 6. Query your Project
-
-Next, let's query our project. Follow these three simple steps to query your SubQuery project:
+
-1. Open your browser and head to `http://localhost:3000`.
+
-2. You will see a GraphQL playground in the browser and the schemas which are ready to query.
-
-3. Find the _Docs_ tab on the right side of the playground which should open a documentation drawer. This documentation is automatically generated and it helps you find what entities and methods you can query.
-
-Try the following query to understand how it works for your new SubQuery starter project. Don’t forget to learn more about the [GraphQL Query language](../../run_publish/query.md).
+
```graphql
# Write your query or mutation here
@@ -341,14 +232,4 @@ You will see the result similar to below:
The final code of this project can be found [here](https://github.com/subquery/ethereum-subql-starter/blob/main/Boba/boba-bnb-starter/).
:::
-## What's next?
-
-Congratulations! You have now a locally running SubQuery project that accepts GraphQL API requests for transferring data.
-
-::: tip Tip
-
-Find out how to build a performant SubQuery project and avoid common mistakes in [Project Optimisation](../../build/optimisation.md).
-
-:::
-
-Click [here](../../quickstart/whats-next.md) to learn what should be your **next step** in your SubQuery journey.
+
diff --git a/docs/quickstart/quickstart_chains/boba-eth.md b/docs/quickstart/quickstart_chains/boba-eth.md
index 001d55cf9e9..a43d51a6e3e 100644
--- a/docs/quickstart/quickstart_chains/boba-eth.md
+++ b/docs/quickstart/quickstart_chains/boba-eth.md
@@ -1,32 +1,15 @@
# Boba ETH Quick Start
-## Goals
-
Boba is a multichain Layer-2 network that currently has two blockchains: Boba Mainnet (i.e. Boba ETH) and Boba BNB Chain (i.e. Boba BNB). This guide will focus on [Boba ETH](https://chainlist.org/chain/288). For [Boba BNB](https://chainlist.org/chain/56288), please refer to [Boba BNB Quick Start](./boba-bnb.md).
The goal of this quick start guide is to index all transfers and approval events from the [Wrapped Eth](https://bobascan.com/address/0xDeadDeAddeAddEAddeadDEaDDEAdDeaDDeAD0000) token on the [Boba Mainnet](https://bobascan.com/) network.
-::: warning
-Before we begin, **make sure that you have initialised your project** using the provided steps in the [Start Here](../quickstart.md) section. Please initialise an a Boba ETH project.
-:::
-
-In every SubQuery project, there are [3 key files](../quickstart.md#_3-make-changes-to-your-project) to update. Let's begin updating them one by one.
-
-::: tip Note
-The final code of this project can be found [here](https://github.com/subquery/ethereum-subql-starter/blob/main/Boba/boba-eth-starter).
+::: tip Ethereum
We use Ethereum packages, runtimes, and handlers (e.g. `@subql/node-ethereum`, `ethereum/Runtime`, and `ethereum/*Handler`) for Boba ETH. Since Boba ETH is an EVM-compatible layer-2 scaling solution, we can use the core Ethereum framework to index it.
:::
-## 1. Your Project Manifest File
-
-The Project Manifest (`project.ts`) file works as an entry point to your Boba ETH project. It defines most of the details on how SubQuery will index and transform the chain data. For Poltgon zkEVM, there are three types of mapping handlers (and you can have more than one in each project):
-
-- [BlockHanders](../../build/manifest/ethereum.md#mapping-handlers-and-filters): On each and every block, run a mapping function
-- [TransactionHandlers](../../build/manifest/ethereum.md#mapping-handlers-and-filters): On each and every transaction that matches optional filter criteria, run a mapping function
-- [LogHanders](../../build/manifest/ethereum.md#mapping-handlers-and-filters): On each and every log that matches optional filter criteria, run a mapping function
-
-Note that the manifest file has already been set up correctly and doesn’t require significant changes, but you need to import the correct contract definitions and update the datasource handlers.
+
As we are indexing all transfers and approvals from the Wrapped ETH contract on the Boba ETH network, the first step is to import the contract ABI definition, which can be obtained from any standard [ERC-20 contract](https://ethereum.org/en/developers/docs/standards/tokens/erc-20/). Copy the entire contract ABI and save it as a file called `erc20.abi.json` in the `/abis` directory.
@@ -85,11 +68,9 @@ The above code indicates that you will be running a `handleTransaction` mapping
The code also indicates that you will be running a `handleLog` mapping function whenever there is a `Transfer` event being emitted from the [WETH contract](https://bobascan.com/address/0xDeadDeAddeAddEAddeadDEaDDEAdDeaDDeAD0000).
-Check out our [Manifest File](../../build/manifest/ethereum.md) documentation to get more information about the Project Manifest (`project.ts`) file.
+
-## 2. Update Your GraphQL Schema File
-
-The `schema.graphql` file determines the shape of your data from SubQuery due to the mechanism of the GraphQL query language. Hence, updating the GraphQL Schema file is the perfect place to start. It allows you to define your end goal right at the start.
+
Remove all existing entities and update the `schema.graphql` file as follows. Here you can see we are indexing block information such as the id, blockHeight, transfer receiver and transfer sender, along with approvals and all of the attributes related to them (such as owner, spender, etc.).
@@ -113,51 +94,21 @@ type Approval @entity {
}
```
-SubQuery makes it easy and type-safe to work with your GraphQL entities, as well as smart contracts, events, transactions, and logs. SubQuery CLI will generate types from your project's GraphQL schema and any contract ABIs included in the data sources.
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn codegen
-```
-
-@tab npm
-
-```shell
-npm run-script codegen
-```
-
-:::
+
-This will create a new directory (or update the existing one) `src/types` which contains generated entity classes for each type you have defined previously in `schema.graphql`. These classes provide type-safe entity loading, and read and write access to entity fields - see more about this process in [the GraphQL Schema](../../build/graphql.md). All entities can be imported from the following directory:
+
```ts
import { Approval, Transfer } from "../types";
-```
-
-As you're creating a new EVM based project, this command will also generate ABI types and save them into `src/types` using the `npx typechain --target=ethers-v5` command, allowing you to bind these contracts to specific addresses in the mappings and call read-only contract methods against the block being processed.
-
-It will also generate a class for every contract event to provide easy access to event parameters, as well as the block and transaction the event originated from. Read about how this is done in [EVM Codegen from ABIs](../../build/introduction.md#evm-codegen-from-abis).
-
-In this example SubQuery project, you would import these types like so.
-
-```ts
import {
ApproveTransaction,
TransferLog,
} from "../types/abi-interfaces/Erc20Abi";
```
-::: warning Important
-When you make any changes to the schema file, please ensure that you regenerate your types directory using the SubQuery CLI prompt `yarn codegen` or `npm run-script codegen`.
-:::
-
-Check out the [GraphQL Schema](../../build/graphql.md) documentation to get in-depth information on `schema.graphql` file.
-
-## 3. Add a Mapping Function
+
-Mapping functions define how chain data is transformed into the optimised GraphQL entities that we previously defined in the `schema.graphql` file.
+
Navigate to the default mapping function in the `src/mappings` directory. You will be able to see two exported functions `handleLog` and `handleTransaction`:
@@ -205,73 +156,13 @@ The `handleLog` function receives a `log` parameter of type `TransferLog` which
The `handleTransaction` function receives a `tx` parameter of type `ApproveTransaction` which includes transaction data in the payload. We extract this data and then save this to the store using the `.save()` function (_Note that SubQuery will automatically save this to the database_).
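For the approval side, below is a hedged sketch of a `handleTransaction` implementation modelled on the standard ERC-20 starter. The entity field names and the way `tx.args` is decoded are assumptions; check the linked final code for the exact version.

```ts
// Hedged sketch: field names and argument decoding are assumptions.
import assert from "assert";
import { Approval } from "../types";
import { ApproveTransaction } from "../types/abi-interfaces/Erc20Abi";

export async function handleTransaction(tx: ApproveTransaction): Promise<void> {
  assert(tx.args, "Expected decoded approve(spender, amount) arguments");

  const approval = Approval.create({
    id: tx.hash,
    owner: tx.from,
    spender: await tx.args[0], // spender address
    value: BigInt(await tx.args[1].toString()), // approved amount
    contractAddress: tx.to ?? "",
  });

  await approval.save();
}
```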
-Check out our [Mappings](../../build/mapping/ethereum.md) documentation to get more information on mapping functions.
-
-## 4. Build Your Project
-
-Next, build your work to run your new SubQuery project. Run the build command from the project's root directory as given here:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn build
-```
-
-@tab npm
+
-```shell
-npm run-script build
-```
-
-:::
-
-::: warning Important
-Whenever you make changes to your mapping functions, you must rebuild your project.
-:::
-
-Now, you are ready to run your first SubQuery project. Let’s check out the process of running your project in detail.
-
-## 5. Run Your Project Locally with Docker
-
-Whenever you create a new SubQuery Project, first, you must run it locally on your computer and test it and using Docker is the easiest and quickest way to do this.
-
-The `docker-compose.yml` file defines all the configurations that control how a SubQuery node runs. For a new project, which you have just initialised, you won't need to change anything.
-
-However, visit the [Running SubQuery Locally](../../run_publish/run.md) to get more information on the file and the settings.
-
-Run the following command under the project directory:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn start:docker
-```
-
-@tab npm
-
-```shell
-npm run-script start:docker
-```
-
-:::
-
-::: tip Note
-It may take a few minutes to download the required images and start the various nodes and Postgres databases.
-:::
-
-## 6. Query your Project
-
-Next, let's query our project. Follow these three simple steps to query your SubQuery project:
+
-1. Open your browser and head to `http://localhost:3000`.
+
-2. You will see a GraphQL playground in the browser and the schemas which are ready to query.
-
-3. Find the _Docs_ tab on the right side of the playground which should open a documentation drawer. This documentation is automatically generated and it helps you find what entities and methods you can query.
-
-Try the following query to understand how it works for your new SubQuery starter project. Don’t forget to learn more about the [GraphQL Query language](../../run_publish/query.md).
+
```graphql
# Write your query or mutation here
@@ -382,14 +273,4 @@ You will see the result similar to below:
The final code of this project can be found [here](https://github.com/subquery/ethereum-subql-starter/blob/main/Boba/boba-eth-starter/).
:::
-## What's next?
-
-Congratulations! You have now a locally running SubQuery project that accepts GraphQL API requests for transferring data.
-
-::: tip Tip
-
-Find out how to build a performant SubQuery project and avoid common mistakes in [Project Optimisation](../../build/optimisation.md).
-
-:::
-
-Click [here](../../quickstart/whats-next.md) to learn what should be your **next step** in your SubQuery journey.
+
diff --git a/docs/quickstart/quickstart_chains/bsc-pancakeswap-v3.md b/docs/quickstart/quickstart_chains/bsc-pancakeswap-v3.md
index 8ab8407d284..21f2669b42f 100644
--- a/docs/quickstart/quickstart_chains/bsc-pancakeswap-v3.md
+++ b/docs/quickstart/quickstart_chains/bsc-pancakeswap-v3.md
@@ -1,7 +1,5 @@
# BSC Quick Start - PancakeSwap
-## Goals
-
PancakeSwap, a prominent decentralized exchange (DEX) in the web3 ecosystem, relies on indexers to facilitate data retrieval for its user interface, enabling seamless interactions. Indexers like SubQuery systematically organise data related to tokens, liquidity pools, transactions, and other critical information, offering users an efficient and rapid means to search, discover, and analyze data within PancakeSwap.
The main objective of this article is to provide a comprehensive, step-by-step guide on configuring a SubQuery indexer for the PancakeSwap v3 protocol. This guide will encompass all the necessary settings and explore the intricacies of the underlying logic. It serves as an excellent illustration of how to perform indexing for a complex DEX like PancakeSwap.
@@ -24,9 +22,9 @@ In this PancakeSwap indexing project, our main focus is on configuring the index
To gain a deeper understanding of how these core mechanisms work, you can refer to the official [PancakeSwap documentation](https://docs.pancakeswap.finance/developers/smart-contracts/pancakeswap-exchange/v3-contracts).
-
+
-In the earlier section titled "Create a New Project" (refer to [quickstart.md](../quickstart.md)), you should have taken note of three crucial files. To initiate the setup of a project from scratch, you can proceed to follow the steps outlined in the [initialization description](../quickstart.md#2-initialise-a-new-subquery-project). As a prerequisite, you will need to generate types from the ABI files of each smart contract. You can obtain these ABI files by searching for the ABIs of the mentioned smart contract addresses on Etherscan. For instance, you can locate the ABI for **PancakeSwapV3Factory** at the bottom of [this page](https://bscscan.com/address/0x0bfbcf9fa4f9c56b0f40a671ad40e0805a091865#code). Additionally, you can kickstart your project by using the EVM Scaffolding approach (detailed [here](../quickstart.md#evm-project-scaffolding)). You'll find all the relevant events to be scaffolded in the documentation for each type of smart contract, as described in sections [1](#1-configuring-the-manifest-file), [2](#1-configuring-the-manifest-file-1), and [3](#1configuring-the-manifest-file).
+You can obtain these ABI files by searching for the ABIs of the mentioned smart contract addresses on BscScan. For instance, you can locate the ABI for **PancakeSwapV3Factory** at the bottom of [this page](https://bscscan.com/address/0x0bfbcf9fa4f9c56b0f40a671ad40e0805a091865#code).
::: tip Note
The code snippets provided below have been simplified for clarity. You can find the full and detailed code [here](https://github.com/subquery/ethereum-subql-starter/tree/main/BNB%20Smart%20Chain/bsc-pancake-swap/) to see all the intricate details.
@@ -36,7 +34,7 @@ The code snippets provided further have been simplified for clarity. You can fin
The core role of the factory contract is to generate liquidity pool smart contracts. Each pool comprises a pair of two tokens, uniting to create an asset pair, and is associated with a specific fee rate. It's important to emphasize that multiple pools can exist with the same asset pair, distinguished solely by their unique swap fees.
-#### 1.Configuring the Manifest File
+
In simple terms, there's only one event that requires configuration, and that's the `PoolCreated` event. After adding this event to the manifest file, it will be represented as follows:
@@ -78,11 +76,9 @@ In simple terms, there's only one event that requires configuration, and that's
}
```
-::: tip Note
-Check out our [Manifest File](../../build/manifest/ethereum.md) documentation to get more information about the Project Manifest (`project.ts`) file.
-:::
+
-#### 2. Updating the GraphQL Schema File
+
Now, let's consider the entities that we can extract from the factory smart contract for subsequent querying. The most obvious ones include:
@@ -166,45 +162,18 @@ The attributes mentioned above represent only a subset of the available attribut
As you explore these attributes, you may notice the relationship between the `Pool` and `Token` entities. Additionally, you'll find numerous derived attributes like `mints` or `swaps`.
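As a small illustration of how such a relationship is written from the mapping side, foreign keys are stored by id on the referencing entity, while the reverse (derived) fields such as `mints` or `swaps` are resolved at query time. The `token0`/`token1` field names below are assumptions based on the schema sketched above.

```ts
// Hypothetical sketch: foreign keys are written as <field>Id on the referencing entity.
import { Pool, Token } from "../types";

export async function linkPoolToTokens(
  pool: Pool,
  token0: Token,
  token1: Token
): Promise<void> {
  pool.token0Id = token0.id;
  pool.token1Id = token1.id;
  await pool.save();
}
```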
-::: tip Note
-Importantly, these relationships not only establish one-to-many connections but also extend to include many-to-many associations. To delve deeper into entity relationships, you can refer to [this section](../../build/graphql.md#entity-relationships). If you prefer a more example-based approach, our dedicated [Hero Course Module](../../academy/herocourse/module3.md) can provide further insights.
-:::
-
-SubQuery simplifies and ensures type-safety when working with GraphQL entities, smart contracts, events, transactions, and logs. The SubQuery CLI will generate types based on your project's GraphQL schema and any contract ABIs included in the data sources.
-
-::: code-tabs
-@tab:active yarn
+
-```shell
-yarn codegen
-```
-
-@tab npm
-
-```shell
-npm run-script codegen
-```
-
-:::
-
-This action will generate a new directory (or update the existing one) named `src/types`. Inside this directory, you will find automatically generated entity classes corresponding to each type defined in your `schema.graphql`. These classes facilitate type-safe operations for loading, reading, and writing entity fields. You can learn more about this process in [the GraphQL Schema section](../../build/graphql.md).
-
-You can conveniently import all these entities from the following directory:
+
```ts
// Import entity types generated from the GraphQL schema
import { Factory, Pool, Token } from "../types";
```
-It will also generate a class for every contract event, offering convenient access to event parameters, as well as information about the block and transaction from which the event originated. You can find detailed information on how this is achieved in the [EVM Codegen from ABIs](../../build/introduction.md#evm-codegen-from-abis) section. All of these types are stored in the `src/types/abi-interfaces` and `src/types/contracts` directories.
+
-#### 3. Writing the Mappings
-
-Mapping functions define how blockchain data is transformed into the optimized GraphQL entities that we previously defined in the `schema.graphql` file.
-
-::: tip Note
-For more information on mapping functions, please refer to our [Mappings](../../build/mapping/ethereum.md) documentation.
-:::
+
Writing mappings for the factory smart contract is a straightforward process. To provide better context, we've included this handler in a separate file `factory.ts` within the `src/mappings` directory. Let's start by importing the necessary modules.
@@ -320,7 +289,7 @@ Throughout this mapping and those that follow, numerous utility functions are em
As we discussed in the introduction of [Configuring the Indexer](#configuring-the-indexer), a new contract is created by the [factory contract](#pancakeswapv3factory) for each newly created pool.
-#### 1. Configuring the Manifest File
+
The contract factory generates fresh contract instances for each new pool, therefore we use [dynamic data sources](../../build/dynamicdatasources.md) to create indexers for each new contract:
@@ -391,7 +360,7 @@ The contract factory generates fresh contract instances for each new pool, there
}
```
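On the mapping side, the factory's `PoolCreated` handler instantiates this template so that every freshly deployed pool contract gets its own indexer. The sketch below is hypothetical: the generated helper name (`createPoolDatasource`), the event type name, and the import paths are assumptions, so refer to the linked repository for the exact factory mapping.

```ts
// Hypothetical sketch only: helper name, event type, and import paths are assumptions.
import { createPoolDatasource } from "../types"; // assumed codegen helper for the "Pool" template
import { PoolCreatedEvent } from "../types/contracts/Factory"; // assumed typechain output path

export async function handlePoolCreated(event: PoolCreatedEvent): Promise<void> {
  // Start indexing the newly deployed pool contract at its own address
  await createPoolDatasource({ address: event.args.pool });
}
```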
-#### 2. Updating the GraphQL Schema File
+
Numerous entities can be derived from each newly created pool smart contract. To highlight some of the most crucial ones, you'll need to extend the `schema.graphql` file with the following entities:
@@ -498,24 +467,7 @@ type Transaction @entity {
Similar to the previously imported entities, we observe various relationships here. In this case, each new entity references both the `Token` and `Pool` entities, establishing a one-to-one relationship. Additionally, each new entity references a `Transaction` entity, which is the only one among the newly added entities not derived from logs. Instead, it is derived from the transaction that a given event belongs to, showcasing the capabilities of the SubQuery SDK.
-Now, the next step involves instructing the SubQuery CLI to generate types based on your project's updated GraphQL schema:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn codegen
-```
-
-@tab npm
-
-```shell
-npm run-script codegen
-```
-
-:::
-
-This will create update the existing `src/types` directory. All new entites can now be imported from the following directory:
+
```ts
import { Burn, Mint, Swap } from "../types";
@@ -528,7 +480,7 @@ import {
} from "../types/contracts/Pool";
```
-#### 3. Writing the Mappings
+
In this scenario, the mapping process involves two substeps:
@@ -643,7 +595,7 @@ Finally, the function saves the updated data for the swap, factory, pool, token0
As you may already know, swaps in PancakeSwap V3 are executed within the context of pools. To enable swaps, these pools must be liquid, and users provide liquidity to each specific pool. Each liquidity provision results in a Liquidity Position, essentially an NFT. This design enables a broader range of DeFi use cases. And the contract responsible for managing these provisions is known as the NonfungiblePositionManager.
-#### 1. Configuring the Manifest File
+
For the NonfungiblePositionManager smart contract, we want to introduce the following updates to the manifest file:
@@ -716,7 +668,7 @@ For the NonfungiblePositionManager smart contract, we want to introduce the foll
The configuration process closely resembles what we've seen earlier. However, we now have a completely new smart contract that we'll be handling events from. This entails different ABI, address, and start block values. Naturally, it also introduces new events, which are listed under the `handlers` object.
-#### 2. Updating the GraphQL Schema File
+
From this smart contract, the only new entity we'll emphasize is the `Position`:
@@ -744,24 +696,7 @@ type Position @entity {
Once more, we encounter connections to various entities like `Pool` and `Token`.
-Now, the next step involves instructing the SubQuery CLI to generate types based on your project's updated GraphQL schema:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn codegen
-```
-
-@tab npm
-
-```shell
-npm run-script codegen
-```
-
-:::
-
-This will create update the existing `src/types` directory. All new entites can now be imported from the following directory:
+
```ts
import { Position } from "../types";
@@ -773,7 +708,7 @@ import {
} from "../types/contracts/NonfungiblePositionManager";
```
-#### 3. Writing the Mappings
+
For this contract, we will craft the mappings in a file named `position-manager.ts`. Once again, this separation provides context and clarity.
@@ -858,73 +793,11 @@ To briefly clarify the code provided above: the handler function `handleIncrease
Check the final code repository [here](https://github.com/subquery/ethereum-subql-starter/tree/main/BNB%20Smart%20Chain/bsc-pancake-swap/) to observe the integration of all previously mentioned configurations into a unified codebase.
:::
-## Build Your Project
-
-Next, build your work to run your new SubQuery project. Run the build command from the project's root directory as given here:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn build
-```
-
-@tab npm
-
-```shell
-npm run-script build
-```
-
-:::
-
-::: warning Important
-Whenever you make changes to your mapping functions, you must rebuild your project.
-:::
-
-Now, you are ready to run your first SubQuery project. Let’s check out the process of running your project in detail.
-
-## Run Your Project Locally with Docker
-
-Whenever you create a new SubQuery Project, first, you must run it locally on your computer and test it and using Docker is the easiest and quickiest way to do this.
-
-The `docker-compose.yml` file defines all the configurations that control how a SubQuery node runs. For a new project, which you have just initialised, you won't need to change anything.
-
-However, visit the [Running SubQuery Locally](../../run_publish/run.md) to get more information on the file and the settings.
-
-Run the following command under the project directory:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn start:docker
-```
-
-@tab npm
-
-```shell
-npm run-script start:docker
-```
-
-:::
-
-::: tip Note
-It may take a few minutes to download the required images and start the various nodes and Postgres databases.
-:::
-
-## Query your Project
-
-Next, let's query our project. Follow these three simple steps to query your SubQuery project:
+
-1. Open your browser and head to `http://localhost:3000`.
+
-2. You will see a GraphQL playground in the browser and the schemas which are ready to query.
-
-3. Find the _Docs_ tab on the right side of the playground which should open a documentation drawer. This documentation is automatically generated and it helps you find what entities and methods you can query.
-
-Try the following queries to understand how it works for your new SubQuery starter project. Don’t forget to learn more about the [GraphQL Query language](../../run_publish/query.md).
-
-
+
:::details Pools
@@ -1170,14 +1043,4 @@ query {
:::
-## What's next?
-
-Congratulations! You have now a locally running SubQuery project that indexes the major PancakeSwap V3 entities and accepts GraphQL API requests for transferring data.
-
-::: tip Tip
-
-Find out how to build a performant SubQuery project and avoid common mistakes in [Project Optimisation](../../build/optimisation.md).
-
-:::
-
-Click [here](../../quickstart/whats-next.md) to learn what should be your **next step** in your SubQuery journey.
+
diff --git a/docs/quickstart/quickstart_chains/bsc.md b/docs/quickstart/quickstart_chains/bsc.md
index 2654d751efc..ca7da2f7aad 100644
--- a/docs/quickstart/quickstart_chains/bsc.md
+++ b/docs/quickstart/quickstart_chains/bsc.md
@@ -1,16 +1,8 @@
# BNB Smart Chain (BSC) Quick Start
-## Goals
-
The goal of this quick start guide is to index all deposits to and withdrawals from MOBOX pools. [MOBOX](https://www.mobox.io/) has built a unique infrastructure that builds on the growing DeFi ecosystem and combines it with Gaming through unique NFTs. Using Liquidity Pools, Yield Farming, and NFTs, the GameFi infrastructure will not just find the best yield strategies for users but also generate unique NFTs that can be used across a multitude of games.
-::: warning
-Before we begin, make sure that you have initialised your project using the provided steps in the [Start Here](../quickstart.md) section. **Please initialise a BNB Smart Chain (BSC) project**
-:::
-
-Now, let's move forward and update these configurations.
-
-Previously, in the [1. Create a New Project](../quickstart.md) section, you must have noted [3 key files](../quickstart.md#_3-make-changes-to-your-project). Let's begin updating them one by one.
+
::: tip Note
The final code of this project can be found [here](https://github.com/subquery/ethereum-subql-starter/tree/main/BNB%20Smart%20Chain/bsc-mobox-rewards).
@@ -22,18 +14,12 @@ The final code of this project can be found [here](https://github.com/subquery/e
-## 1. Update Your Project Manifest File
+
::: warning Important
We use Ethereum packages, runtimes, and handlers (e.g. `@subql/node-ethereum`, `ethereum/Runtime`, and `ethereum/*Handler`) for BNB Smart Chain (BSC). Since BSC is an EVM-compatible chain, we can use the core Ethereum framework to index it.
:::
-The Project Manifest (`project.ts`) file works as an entry point to your BSC project. It defines most of the details on how SubQuery will index and transform the chain data. For BSC, there are three types of mapping handlers (and you can have more than one in each project):
-
-- [BlockHanders](../../build/manifest/bsc.md#mapping-handlers-and-filters): On each and every block, run a mapping function
-- [TransactionHandlers](../../build/manifest/bsc.md#mapping-handlers-and-filters): On each and every transaction that matches optional filter criteria, run a mapping function
-- [LogHanders](../../build/manifest/bsc.md#mapping-handlers-and-filters): On each and every log that matches optional filter criteria, run a mapping function
-
Note that the manifest file has already been set up correctly and doesn’t require significant changes, but you need to import the correct contract definitions and update the datasource handlers.
As we are indexing actions from the MOBOX Farming contract, the first step is to import the contract ABI definition from [here](https://bscscan.com/address/0xa5f8c5dbd5f286960b9d90548680ae5ebff07652#code). Copy the entire JSON and save it as a file called `mobox.abi.json` in the root directory.
@@ -86,11 +72,9 @@ This section in the Project Manifest now imports all the correct definitions and
The above code indicates that you will be running a `handleDeposit` mapping function whenever there is a `Deposit` log on any transaction from the [MOBOX Farming contract](https://bscscan.com/address/0xa5f8c5dbd5f286960b9d90548680ae5ebff07652). Similarly, you'll be running a `handleWithdraw` mapping function whenever there is a `Withdraw` log.
-Check out our [Manifest File](../../build/manifest/bsc.md) documentation to get more information about the Project Manifest (`project.ts`) file.
-
-## 2. Update Your GraphQL Schema File
+
-The `schema.graphql` file determines the shape of your data from SubQuery due to the mechanism of the GraphQL query language. Hence, updating the GraphQL Schema file is the perfect place to start. It allows you to define your end goal right at the start.
+
Remove all existing entities and update the `schema.graphql` file as follows. Here you can see we are indexing three entities: a `Deposit` and a `Withdrawl`, each with a [foreign key relationship](../../build/graphql.md#entity-relationships) to the `User`.
@@ -111,50 +95,19 @@ type Pool @entity {
}
```
-::: warning Important
-When you make any changes to the schema file, please ensure that you regenerate your types directory.
-:::
+
-SubQuery makes it easy and type-safe to work with your GraphQL entities, as well as smart contracts, events, transactions, and logs. SubQuery CLI will generate types from your project's GraphQL schema and any contract ABIs included in the data sources.
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn codegen
-```
-
-@tab npm
-
-```shell
-npm run-script codegen
-```
-
-:::
+
-This will create a new directory (or update the existing) `src/types` which contain generated entity classes for each type you have defined previously in `schema.graphql`. These classes provide type-safe entity loading, read and write access to entity fields - see more about this process in [the GraphQL Schema](../../build/graphql.md). All entites can be imported from the following directory:
-
-```ts
-import { Pool, PoolEvent } from "../types";
-```
-
-As you're creating a new EVM based project, this command will also generate ABI types and save them into `src/types` using the `npx typechain --target=ethers-v5` command, allowing you to bind these contracts to specific addresses in the mappings and call read-only contract methods against the block being processed.
-
-It will also generate a class for every contract event to provide easy access to event parameters, as well as the block and transaction the event originated from. Read about how this is done in [EVM Codegen from ABIs](../../build/introduction.md#evm-codegen-from-abis).
-
-In the example Polygon SubQuery project, you would import these types like so.
+In the example BSC SubQuery project, you would import these types like so.
```ts
import { DepositLog, WithdrawLog } from "../types/abi-interfaces/MoboxAbi";
```
-Check out the [GraphQL Schema](../../build/graphql.md) documentation to get in-depth information on `schema.graphql` file.
-
-Now that you have made essential changes to the GraphQL Schema file, let’s proceed ahead with the Mapping Function’s configuration.
+
-## 3. Add a Mapping Function
-
-Mapping functions define how chain data is transformed into the optimised GraphQL entities that we previously defined in the `schema.graphql` file.
+
Navigate to the default mapping function in the `src/mappings` directory. You will be able to see three exported functions: `handleBlock`, `handleLog`, and `handleTransaction`. Replace these functions with the following code (**note the additional imports**):
@@ -229,73 +182,13 @@ For `handleDeposit`, the function here receives an `DepositLog` which includes t
For `handleWithdraw`, the function here receives a `WithdrawLog` which includes transaction log data in the payload. We extract this data and first confirm if we have a `Pool` record via `checkGetPool`. We then create a new `PoolEvent` entity that we defined in our `schema.graphql` and then save this to the store using the `.save()` function (_Note that SubQuery will automatically save this to the database_). We also decrease the total pool size by the new withdrawal value.
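To make the get-or-create pattern concrete, here is a hedged sketch of what `checkGetPool` and `handleWithdraw` could look like. The entity field names and the decoded `Withdraw` argument names (`user`, `pid`, `amount`) are assumptions; the final code linked below is the exact implementation.

```ts
// Hedged sketch: entity fields and event argument names are assumptions.
import assert from "assert";
import { Pool, PoolEvent } from "../types";
import { WithdrawLog } from "../types/abi-interfaces/MoboxAbi";

// Get-or-create helper so every event can safely reference its pool
async function checkGetPool(poolId: string): Promise<Pool> {
  let pool = await Pool.get(poolId);
  if (!pool) {
    pool = Pool.create({ id: poolId, totalSize: BigInt(0) });
    await pool.save();
  }
  return pool;
}

export async function handleWithdraw(log: WithdrawLog): Promise<void> {
  assert(log.args, "Expected decoded Withdraw arguments");

  const pool = await checkGetPool(log.args.pid.toString());

  // Record the withdrawal itself
  const poolEvent = PoolEvent.create({
    id: `${log.transactionHash}-${log.logIndex}`,
    poolId: pool.id, // foreign key to the Pool entity
    user: log.args.user,
    type: "WITHDRAW",
    value: log.args.amount.toBigInt(),
    blockHeight: BigInt(log.blockNumber),
  });
  await poolEvent.save();

  // Decrease the running pool size by the withdrawn amount
  pool.totalSize -= poolEvent.value;
  await pool.save();
}
```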
-Check out our [Mappings](../../build/mapping/ethereum.md) documentation to get more information on mapping functions.
-
-## 4. Build Your Project
-
-Next, build your work to run your new SubQuery project. Run the build command from the project's root directory as given here:
-
-::: code-tabs
-@tab:active yarn
+
-```shell
-yarn build
-```
-
-@tab npm
+
-```shell
-npm run-script build
-```
+
-:::
-
-::: warning Important
-Whenever you make changes to your mapping functions, you must rebuild your project.
-:::
-
-Now, you are ready to run your first SubQuery project. Let’s check out the process of running your project in detail.
-
-## 5. Run Your Project Locally with Docker
-
-Whenever you create a new SubQuery Project, first, you must run it locally on your computer and test it and using Docker is the easiest and quickiest way to do this.
-
-The `docker-compose.yml` file defines all the configurations that control how a SubQuery node runs. For a new project, which you have just initialised, you won't need to change anything.
-
-However, visit the [Running SubQuery Locally](../../run_publish/run.md) to get more information on the file and the settings.
-
-Run the following command under the project directory:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn start:docker
-```
-
-@tab npm
-
-```shell
-npm run-script start:docker
-```
-
-:::
-
-::: tip Note
-It may take a few minutes to download the required images and start the various nodes and Postgres databases.
-:::
-
-## 6. Query your Project
-
-Next, let's query our project. Follow these three simple steps to query your SubQuery project:
-
-1. Open your browser and head to `http://localhost:3000`.
-
-2. You will see a GraphQL playground in the browser and the schemas which are ready to query.
-
-3. Find the _Docs_ tab on the right side of the playground which should open a documentation drawer. This documentation is automatically generated and it helps you find what entities and methods you can query.
-
-Try the following query to understand how it works for your new SubQuery starter project. Don’t forget to learn more about the [GraphQL Query language](../../run_publish/query.md).
+
```graphql
query {
@@ -483,14 +376,4 @@ You will see the result similar to below:
The final code of this project can be found [here](https://github.com/subquery/ethereum-subql-starter/tree/main/BNB%20Smart%20Chain/bsc-mobox-rewards).
:::
-## What's next?
-
-Congratulations! You have now a locally running SubQuery project that accepts GraphQL API requests for transferring data.
-
-::: tip Tip
-
-Find out how to build a performant SubQuery project and avoid common mistakes in [Project Optimisation](../../build/optimisation.md).
-
-:::
-
-Click [here](../../quickstart/whats-next.md) to learn what should be your **next step** in your SubQuery journey.
+
diff --git a/docs/quickstart/quickstart_chains/celo.md b/docs/quickstart/quickstart_chains/celo.md
index bdc4df5a67b..ac36f4ca802 100644
--- a/docs/quickstart/quickstart_chains/celo.md
+++ b/docs/quickstart/quickstart_chains/celo.md
@@ -1,30 +1,13 @@
# Celo Quick Start
-## Goals
-
The goal of this quick start guide is to index all transfers and approval events from the [Wrapped Eth](https://explorer.celo.org/mainnet/token/0x66803FB87aBd4aaC3cbB3fAd7C3aa01f6F3FB207) on Celo Mainnet.
-::: warning
-Before we begin, **make sure that you have initialised your project** using the provided steps in the [Start Here](../quickstart.md) section. Please initialise an a Celo project.
-:::
-
-In every SubQuery project, there are [3 key files](../quickstart.md#_3-make-changes-to-your-project) to update. Let's begin updating them one by one.
-
-::: tip Note
-The final code of this project can be found [here](https://github.com/subquery/ethereum-subql-starter/tree/main/Celo/celo-starter).
+::: tip Ethereum
We use Ethereum packages, runtimes, and handlers (e.g. `@subql/node-ethereum`, `ethereum/Runtime`, and `ethereum/*Handler`) for Celo. Since Celo is an EVM-compatible layer-1, we can use the core Ethereum framework to index it.
:::
-## 1. Your Project Manifest File
-
-The Project Manifest (`project.ts`) file works as an entry point to your Celo project. It defines most of the details on how SubQuery will index and transform the chain data. For Celo, there are three types of mapping handlers (and you can have more than one in each project):
-
-- [BlockHanders](../../build/manifest/ethereum.md#mapping-handlers-and-filters): On each and every block, run a mapping function
-- [TransactionHandlers](../../build/manifest/ethereum.md#mapping-handlers-and-filters): On each and every transaction that matches optional filter criteria, run a mapping function
-- [LogHanders](../../build/manifest/ethereum.md#mapping-handlers-and-filters): On each and every log that matches optional filter criteria, run a mapping function
-
-Note that the manifest file has already been set up correctly and doesn’t require significant changes, but you need to import the correct contract definitions and update the datasource handlers.
+
As we are indexing all transfers and approvals from the Wrapped ETH contract on the Celo network, the first step is to import the contract ABI definition, which can be obtained from any standard [ERC-20 contract](https://ethereum.org/en/developers/docs/standards/tokens/erc-20/). Copy the entire contract ABI and save it as a file called `erc20.abi.json` in the `/abis` directory.
@@ -80,11 +63,9 @@ The above code indicates that you will be running a `handleTransaction` mapping
The code also indicates that you will be running a `handleLog` mapping function whenever there is a `Transfer` event being emitted from the [WETH contract](https://explorer.celo.org/mainnet/token/0x66803FB87aBd4aaC3cbB3fAd7C3aa01f6F3FB207).
-Check out our [Manifest File](../../build/manifest/ethereum.md) documentation to get more information about the Project Manifest (`project.ts`) file.
-
-## 2. Update Your GraphQL Schema File
+
-The `schema.graphql` file determines the shape of your data from SubQuery due to the mechanism of the GraphQL query language. Hence, updating the GraphQL Schema file is the perfect place to start. It allows you to define your end goal right at the start.
+
Remove all existing entities and update the `schema.graphql` file as follows. Here you can see we are indexing block information such as the id, blockHeight, transfer receiver and transfer sender, along with approvals and all of the attributes related to them (such as owner, spender, etc.).
@@ -108,47 +89,21 @@ type Approval @entity {
}
```
-SubQuery makes it easy and type-safe to work with your GraphQL entities, as well as smart contracts, events, transactions, and logs. SubQuery CLI will generate types from your project's GraphQL schema and any contract ABIs included in the data sources.
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn codegen
-```
-
-@tab npm
-
-```shell
-npm run-script codegen
-```
-
-:::
+
-This will create a new directory (or update the existing one) `src/types` which contains generated entity classes for each type you have defined previously in `schema.graphql`. These classes provide type-safe entity loading, and read and write access to entity fields - see more about this process in [the GraphQL Schema](../../build/graphql.md). All entities can be imported from the following directory:
+
```ts
import { Approval, Transfer } from "../types";
-```
-
-If you're creating a new EVM-based project, this command will also generate ABI types and save them into `src/types` using the `npx typechain --target=ethers-v5` command, allowing you to bind these contracts to specific addresses in the mappings and call read-only contract methods against the block being processed. It will also generate a class for every contract event to provide easy access to event parameters, as well as the block and transaction the event originated from. All of these types are written to `src/types/abi-interfaces` and `src/types/contracts` directories. In this example SubQuery project, you would import these types like so.
-
-```ts
import {
ApproveTransaction,
TransferLog,
} from "../types/abi-interfaces/Erc20Abi";
```
-::: warning Important
-When you make any changes to the schema file, please ensure that you regenerate your types directory using the SubQuery CLI prompt `yarn codegen` or `npm run-script codegen`.
-:::
-
-Check out the [GraphQL Schema](../../build/graphql.md) documentation to get in-depth information on `schema.graphql` file.
+
-## 3. Add a Mapping Function
-
-Mapping functions define how chain data is transformed into the optimised GraphQL entities that we previously defined in the `schema.graphql` file.
+
Navigate to the default mapping function in the `src/mappings` directory. You will be able to see two exported functions `handleLog` and `handleTransaction`:
@@ -189,73 +144,13 @@ The `handleLog` function receives a `log` parameter of type `TransferLog` which
The `handleTransaction` function receives a `tx` parameter of type `ApproveTransaction` which includes transaction data in the payload. We extract this data and then save this to the store using the `.save()` function (_Note that SubQuery will automatically save this to the database_).
-Check out our [Mappings](../../build/mapping/ethereum.md) documentation to get more information on mapping functions.
-
-## 4. Build Your Project
-
-Next, build your work to run your new SubQuery project. Run the build command from the project's root directory as given here:
-
-::: code-tabs
-@tab:active yarn
+
-```shell
-yarn build
-```
-
-@tab npm
+
-```shell
-npm run-script build
-```
+
-:::
-
-::: warning Important
-Whenever you make changes to your mapping functions, you must rebuild your project.
-:::
-
-Now, you are ready to run your first SubQuery project. Let’s check out the process of running your project in detail.
-
-## 5. Run Your Project Locally with Docker
-
-Whenever you create a new SubQuery Project, first, you must run it locally on your computer and test it and using Docker is the easiest and quickest way to do this.
-
-The `docker-compose.yml` file defines all the configurations that control how a SubQuery node runs. For a new project, which you have just initialised, you won't need to change anything.
-
-However, visit the [Running SubQuery Locally](../../run_publish/run.md) to get more information on the file and the settings.
-
-Run the following command under the project directory:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn start:docker
-```
-
-@tab npm
-
-```shell
-npm run-script start:docker
-```
-
-:::
-
-::: tip Note
-It may take a few minutes to download the required images and start the various nodes and Postgres databases.
-:::
-
-## 6. Query your Project
-
-Next, let's query our project. Follow these three simple steps to query your SubQuery project:
-
-1. Open your browser and head to `http://localhost:3000`.
-
-2. You will see a GraphQL playground in the browser and the schemas which are ready to query.
-
-3. Find the _Docs_ tab on the right side of the playground which should open a documentation drawer. This documentation is automatically generated and it helps you find what entities and methods you can query.
-
-Try the following query to understand how it works for your new SubQuery starter project. Don’t forget to learn more about the [GraphQL Query language](../../run_publish/query.md).
+
```graphql
# Write your query or mutation here
@@ -390,14 +285,4 @@ You will see the result similar to below:
The final code of this project can be found [here](https://github.com/subquery/ethereum-subql-starter/tree/main/Celo/celo-starter).
:::
-## What's next?
-
-Congratulations! You have now a locally running SubQuery project that accepts GraphQL API requests for transferring data.
-
-::: tip Tip
-
-Find out how to build a performant SubQuery project and avoid common mistakes in [Project Optimisation](../../build/optimisation.md).
-
-:::
-
-Click [here](../../quickstart/whats-next.md) to learn what should be your **next step** in your SubQuery journey.
+
diff --git a/docs/quickstart/quickstart_chains/concordium.md b/docs/quickstart/quickstart_chains/concordium.md
index ab615f58ce3..9e3eb176aa3 100644
--- a/docs/quickstart/quickstart_chains/concordium.md
+++ b/docs/quickstart/quickstart_chains/concordium.md
@@ -1,7 +1,5 @@
# Concordium Quick Start
-## Goals
-
The goal of this quick start guide is to give a quick intro to all features of our Concordium indexer. This SubQuery project indexes all transfer transactions, updated transaction events, and block rewards on the Concordium Test Network - it's a great way to quickly learn how SubQuery works with a real-world, hands-on example.
::: warning Important
@@ -222,71 +220,13 @@ For the `handleTransactionEvent` mapping function, the functions receives a new
Check out our [Mappings](../../build/mapping/concordium.md) documentation to get more information on mapping functions.
-## 4. Build Your Project
+
-Next, build your work to run your new SubQuery project. Run the build command from the project's root directory as given here:
+
-::: code-tabs
-@tab:active yarn
+
-```shell
-yarn build
-```
-
-@tab npm
-
-```shell
-npm run-script build
-```
-
-:::
-
-::: warning Important
-Whenever you make changes to your mapping functions, you must rebuild your project.
-:::
-
-Now, you are ready to run your first SubQuery project. Let’s check out the process of running your project in detail.
-
-## 5. Run Your Project Locally with Docker
-
-Whenever you create a new SubQuery Project, first, you must run it locally on your computer and test it and using Docker is the easiest and quickiest way to do this.
-
-The `docker-compose.yml` file defines all the configurations that control how a SubQuery node runs. For a new project, which you have just initialised, you won't need to change anything.
-
-However, visit the [Running SubQuery Locally](../../run_publish/run.md) to get more information on the file and the settings.
-
-Run the following command under the project directory:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn start:docker
-```
-
-@tab npm
-
-```shell
-npm run-script start:docker
-```
-
-:::
-
-::: tip Note
-It may take a few minutes to download the required images and start the various nodes and Postgres databases.
-:::
-
-## 6. Query your Project
-
-Next, let's query our project. Follow these three simple steps to query your SubQuery project:
-
-1. Open your browser and head to `http://localhost:3000`.
-
-2. You will see a GraphQL playground in the browser and the schemas which are ready to query.
-
-3. Find the _Docs_ tab on the right side of the playground which should open a documentation drawer. This documentation is automatically generated and it helps you find what entities and methods you can query.
-
-Try the following query to understand how it works for your new SubQuery starter project. Don’t forget to learn more about the [GraphQL Query language](../../run_publish/query.md). The query shows a list of the most recent prices, and the most active oracles(by number of prices submitted).
+
```graphql
{
@@ -406,14 +346,4 @@ You will see the result similar to below:
The final code of this project can be found [here](https://github.com/subquery/concordium-subql-starter/tree/main/Concordium/concordium-testnet-starter).
:::
-## What's next?
-
-Congratulations! You have now a locally running SubQuery project that accepts GraphQL API requests for transferring data.
-
-::: tip Tip
-
-Find out how to build a performant SubQuery project and avoid common mistakes in [Project Optimisation](../../build/optimisation.md).
-
-:::
-
-Click [here](../../quickstart/whats-next.md) to learn what should be your **next step** in your SubQuery journey.
+
diff --git a/docs/quickstart/quickstart_chains/cosmos-agoric.md b/docs/quickstart/quickstart_chains/cosmos-agoric.md
index f42dd0a7d8d..36352c44461 100644
--- a/docs/quickstart/quickstart_chains/cosmos-agoric.md
+++ b/docs/quickstart/quickstart_chains/cosmos-agoric.md
@@ -1,14 +1,8 @@
# Agoric Quick Start
-## Goals
-
The goal of this quick start guide is to index all transfer events and messages on the [Agoric network](https://agoric.com/).
-::: info
-The Agoric network is a chain based on the Cosmos SDK, which means you can index chain data via the standard Cosmos RPC interface.
-
-Before we begin, make sure that you have initialised your project using the provided steps in the **[Start Here](../quickstart.md)** section. You must complete the suggested [4 steps](https://github.com/subquery/cosmos-subql-starter#readme) for Cosmos users.
-:::
+
In every SubQuery project, there are 3 key files to update. Let's begin updating them one by one.
@@ -16,16 +10,7 @@ In every SubQuery project, there are 3 key files to update. Let's begin updating
The final code of this project can be found [here](https://github.com/subquery/cosmos-subql-starter/tree/main/Agoric/agoric-starter).
:::
-## 1. Your Project Manifest File
-
-The Project Manifest (`project.ts`) file is an entry point to your project. It defines most of the details on how SubQuery will index and transform the chain data. For Cosmos chains, there are four types of mapping handlers (and you can have more than one in each project):
-
-- [BlockHanders](../../build/manifest/cosmos.md#mapping-handlers-and-filters): On each and every block, run a mapping function
-- [TransactionHandlers](../../build/manifest/cosmos.md#mapping-handlers-and-filters): On each and every transaction, run a mapping function
-- [MessageHandlers](../../build/manifest/cosmos.md#mapping-handlers-and-filters): On each and every message that matches optional filter criteria, run a mapping function
-- [EventHanders](../../build/manifest/cosmos.md#mapping-handlers-and-filters): On each and every event that matches optional filter criteria, run a mapping function
-
-Note that the manifest file has already been set up correctly and doesn’t require significant changes, but you need to change the datasource handlers. This section lists the triggers that the manifest file looks for on the blockchain to start indexing.
+
```ts
{
@@ -62,13 +47,9 @@ Note that the manifest file has already been set up correctly and doesn’t requ
In the code above we have defined two handlers. `handleEvent` will be executed whenever a `transfer` event is detected that matches the message filter of type `/cosmos.bank.v1beta1.MsgSend`. `handleMessage` is the other handler, and it will be triggered whenever a message of type `/cosmos.bank.v1beta1.MsgSend` is encountered. These handlers are used to track the transfers and messages within the Agoric network.
-Check out our [Manifest File](../../build/manifest/cosmos.md) documentation to get more information about the Project Manifest (`project.ts`) file.
-
-## 2. Update Your GraphQL Schema File
-
-The `schema.graphql` file determines the shape of the data that you are using SubQuery to index, hence it's a great place to start. The shape of your data is defined in a GraphQL Schema file with various [GraphQL entities](../../build/graphql.md).
+
-index all transfer events and messages
+
For this project, you'll need to modify your `schema.graphql` file as follows. Since we're indexing all [transfer events & messages](https://agoric.explorers.guru/transaction/69D296C6E643621429959A5B25D2F3DE1F1A67A5481FC1B7986322DBEA61BF8D) on the Agoric network, we have `TransferEvent` and `Message` entities that contain a number of fields, including blockHeight, recipient/to, sender/from, and amount.
@@ -92,36 +73,13 @@ type Message @entity {
}
```
-::: warning Important
-When you make any changes to the schema file, do not forget to regenerate your types directory.
-:::
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn codegen
-```
+
-@tab npm
+
-```shell
-npm run-script codegen
-```
-
-:::
+
-You will find the generated models in the `/src/types/models` directory.
-
-As you're creating a new CosmWasm based project, this command will also generate types for your listed protobufs and save them into `src/types` directory, providing you with more typesafety. Read about how this is done in [Cosmos Codegen from CosmWasm Protobufs](../../build/introduction.md#cosmos-codegen-from-cosmwasm-protobufs).
-
-Check out our [GraphQL Schema](../../build/graphql.md) documentation to get more information on `schema.graphql` file.
-
-Now that you have made essential changes to the GraphQL Schema file, let’s go ahead with the next configuration.
-
-## 3. Add a Mapping Function
-
-Mapping functions determine how chain data is transformed into the optimised GraphQL entities that you previously defined in the `schema.graphql` file.
+
Navigate to the default mapping function in the `src/mappings` directory and update your mapping files to match the following (**note the additional imports**):
@@ -178,75 +136,15 @@ In the Agoric SubQuery project, we have two main functions, namely `handleMessag
The `handleMessage` function is triggered when a `/cosmos.bank.v1beta1.MsgSend` type message is detected. It receives a message of type `CosmosMessage`, and then extracts key data points such as blockHeight, transaction hash, from, to and amount from the `msg` object.
-The `handleEvent` function is also triggered when a `/cosmos.bank.v1beta1.MsgSend` type message is detected for a transfer. It receives an event of type `CosmosEvent`, and then it also extracts blockHeight, transaction hash, from, to and amount from the `event` object.
-
-Check out our [Mappings](../../build/mapping/cosmos.md) documentation and get information on the mapping functions in detail.
-
-## 4. Build Your Project
+The `handleEvent` function is also triggered when a `/cosmos.bank.v1beta1.MsgSend` type message is detected for a transfer. It receives an event of type `CosmosEvent`, and then it also extracts blockHeight, transaction hash, from, to and amount from the `event` object.
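To make the `handleMessage` flow described above concrete, here is a minimal sketch of such a handler. The `Message` entity fields and the decoded `MsgSend` shape are assumptions based on the standard Cosmos starter, not necessarily this project's exact code.

```ts
import { CosmosMessage } from "@subql/types-cosmos";
import { Message } from "../types";

// Assumed shape of the decoded /cosmos.bank.v1beta1.MsgSend payload
interface MsgSend {
  fromAddress: string;
  toAddress: string;
  amount: { denom: string; amount: string }[];
}

export async function handleMessage(msg: CosmosMessage<MsgSend>): Promise<void> {
  const record = Message.create({
    id: `${msg.tx.hash}-${msg.idx}`,
    blockHeight: BigInt(msg.block.block.header.height),
    txHash: msg.tx.hash,
    from: msg.msg.decodedMsg.fromAddress,
    to: msg.msg.decodedMsg.toAddress,
    amount: JSON.stringify(msg.msg.decodedMsg.amount),
  });
  // SubQuery persists the entity to the database when save() is called
  await record.save();
}
```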
-Next, build your work to run your new SubQuery project. Run the build command from the project's root directory as given here:
+
-::: code-tabs
-@tab:active yarn
+
-```shell
-yarn build
-```
-
-@tab npm
-
-```shell
-npm run-script build
-```
+
-:::
-
-::: warning Important
-Whenever you make changes to your mapping functions, you must rebuild your project.
-:::
-
-Now, you are ready to run your first SubQuery project. Let’s check out the process of running your project in detail.
-
-## 5. Run Your Project Locally with Docker
-
-Whenever you create a new SubQuery Project, first, you must run it locally on your computer and test it and using Docker is the easiest and quickest way to do this.
-
-The `docker-compose.yml` file defines all the configurations that control how a SubQuery node runs. For a new project, which you have just initialised, you won't need to change anything.
-
-However, visit the [Running SubQuery Locally](../../run_publish/run.md) to get more information on the file and the settings.
-
-Run the following command under the project directory:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn start:docker
-```
-
-@tab npm
-
-```shell
-npm run-script start:docker
-```
-
-:::
-
-::: tip Note
-It may take a few minutes to download the required images and start the various nodes and Postgres databases.
-:::
-
-## 6. Query your Project
-
-Next, let's query our project. Follow these three simple steps to query your SubQuery project:
-
-1. Open your browser and head to `http://localhost:3000`.
-
-2. You will see a GraphQL playground in the browser and the schemas which are ready to query.
-
-3. Find the _Docs_ tab on the right side of the playground which should open a documentation drawer. This documentation is automatically generated and it helps you find what entities and methods you can query.
-
-Try the following queries to understand how it works for your new SubQuery starter project. Don’t forget to learn more about the [GraphQL Query language](../../run_publish/query.md).
+
```graphql
{
@@ -330,14 +228,4 @@ You will see the result similar to below:
The final code of this project can be found [here](https://github.com/subquery/cosmos-subql-starter/tree/main/Agoric/agoric-starter).
:::
-## What’s Next?
-
-Congratulations! You have now a locally running SubQuery project that accepts GraphQL API requests for indexing Agoric transfer events.
-
-::: tip Tip
-
-Find out how to build a performant SubQuery project and avoid common mistakes in [Project Optimisation](../../build/optimisation.md).
-
-:::
-
-Click [here](../../quickstart/whats-next.md) to learn what should be your **next step** in your SubQuery journey.
+
diff --git a/docs/quickstart/quickstart_chains/cosmos-akash.md b/docs/quickstart/quickstart_chains/cosmos-akash.md
index 4f446aa192c..8916f3765f9 100644
--- a/docs/quickstart/quickstart_chains/cosmos-akash.md
+++ b/docs/quickstart/quickstart_chains/cosmos-akash.md
@@ -1,31 +1,14 @@
# Akash Quick Start
-## Goals
-
The goal of this quick start guide is to index all [reward transactions](https://www.mintscan.io/akash/txs/808FED7F3FE680EEF8E005EC1927C0CF00D2975E4B26CEE7A098D5DA7DEA8217?height=11797219) for delegators in the [Akash network](https://akash.network/).
-::: info
-The Akash network is a chain based on the Cosmos SDK, which means you can index chain data via the standard Cosmos RPC interface.
-
-Before we begin, make sure that you have initialised your project using the provided steps in the **[Start Here](../quickstart.md)** section. You must complete the suggested [4 steps](https://github.com/subquery/cosmos-subql-starter#readme) for Cosmos users.
-:::
-
-In every SubQuery project, there are 3 key files to update. Let's begin updating them one by one.
+
::: tip
The final code of this project can be found [here](https://github.com/subquery/cosmos-subql-starter/tree/main/Akash/akash-starter).
:::
-## 1. Your Project Manifest File
-
-The Project Manifest (`project.ts`) file is an entry point to your project. It defines most of the details on how SubQuery will index and transform the chain data. For Cosmos chains, there are four types of mapping handlers (and you can have more than one in each project):
-
-- [BlockHanders](../../build/manifest/cosmos.md#mapping-handlers-and-filters): On each and every block, run a mapping function
-- [TransactionHandlers](../../build/manifest/cosmos.md#mapping-handlers-and-filters): On each and every transaction, run a mapping function
-- [MessageHandlers](../../build/manifest/cosmos.md#mapping-handlers-and-filters): On each and every message that matches optional filter criteria, run a mapping function
-- [EventHanders](../../build/manifest/cosmos.md#mapping-handlers-and-filters): On each and every event that matches optional filter criteria, run a mapping function
-
-Note that the manifest file has already been set up correctly and doesn’t require significant changes, but you need to change the datasource handlers. This section lists the triggers that the manifest file looks for on the blockchain to start indexing.
+
```ts
{
@@ -60,11 +43,9 @@ Note that the manifest file has already been set up correctly and doesn’t requ
In the code above, we have defined a single handler, `handleReward`, that will be executed whenever a `withdraw_rewards` type is detected within a `MsgWithdrawDelegatorReward` type message. This handler is used to track the reward transactions of delegators in the Akash network.
-Check out our [Manifest File](../../build/manifest/cosmos.md) documentation to get more information about the Project Manifest (`project.ts`) file.
-
-## 2. Update Your GraphQL Schema File
+
-The `schema.graphql` file determines the shape of the data that you are using SubQuery to index, hence it's a great place to start. The shape of your data is defined in a GraphQL Schema file with various [GraphQL entities](../../build/graphql.md).
+
For this project, you'll need to modify your schema.graphql file as follows. Since we're indexing all [reward transactions](https://www.mintscan.io/akash/txs/808FED7F3FE680EEF8E005EC1927C0CF00D2975E4B26CEE7A098D5DA7DEA8217?height=11797219) for delegators in the Akash network, we have a DelegatorReward entity that comprises a number of properties, including reward amount, delegator information, validator's address, and so forth.
We also have a Delegator entity, which keeps track of the total rewards of each delegator.
@@ -87,36 +68,13 @@ type Delegator @entity {
}
```
-::: warning Important
-When you make any changes to the schema file, do not forget to regenerate your types directory.
-:::
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn codegen
-```
+
-@tab npm
+
-```shell
-npm run-script codegen
-```
-
-:::
+
-You will find the generated models in the `/src/types/models` directory.
-
-As you're creating a new CosmWasm based project, this command will also generate types for your listed protobufs and save them into `src/types` directory, providing you with more typesafety. Read about how this is done in [Cosmos Codegen from CosmWasm Protobufs](../../build/introduction.md#cosmos-codegen-from-cosmwasm-protobufs).
-
-Check out our [GraphQL Schema](../../build/graphql.md) documentation to get more information on `schema.graphql` file.
-
-Now that you have made essential changes to the GraphQL Schema file, let’s go ahead with the next configuration.
-
-## 3. Add a Mapping Function
-
-Mapping functions determine how chain data is transformed into the optimised GraphQL entities that you previously defined in the `schema.graphql` file.
+
Navigate to the default mapping function in the `src/mappings` directory and update your mapping files to match the following (**note the additional imports**):
@@ -203,73 +161,13 @@ The `handleDelegator` function is invoked within the `handleReward` function. Th
This way, we're able to track all delegator rewards on the Akash network, along with the validator from whom the reward came. It's crucial to note that the `handleDelegator` function handles the Delegator entity creation and updates, whilst the `handleReward` function creates the DelegatorReward entity.
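The create-or-update behaviour described here usually boils down to a get-or-create pattern. Below is a hypothetical sketch; the helper name, its signature, and the `totalRewards` field are illustrative assumptions, not the project's exact code.

```ts
import { Delegator } from "../types";

// Hypothetical helper: create the Delegator on first sight, otherwise
// add the new reward to its running total. Field names are assumptions.
async function upsertDelegator(address: string, reward: bigint): Promise<void> {
  let delegator = await Delegator.get(address);
  if (!delegator) {
    delegator = Delegator.create({ id: address, totalRewards: reward });
  } else {
    delegator.totalRewards += reward;
  }
  await delegator.save();
}
```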
-Check out our [Mappings](../../build/mapping/cosmos.md) documentation and get information on the mapping functions in detail.
-
-## 4. Build Your Project
-
-Next, build your work to run your new SubQuery project. Run the build command from the project's root directory as given here:
+
-::: code-tabs
-@tab:active yarn
+
-```shell
-yarn build
-```
-
-@tab npm
-
-```shell
-npm run-script build
-```
+
-:::
-
-::: warning Important
-Whenever you make changes to your mapping functions, you must rebuild your project.
-:::
-
-Now, you are ready to run your first SubQuery project. Let’s check out the process of running your project in detail.
-
-## 5. Run Your Project Locally with Docker
-
-Whenever you create a new SubQuery Project, first, you must run it locally on your computer and test it and using Docker is the easiest and quickest way to do this.
-
-The `docker-compose.yml` file defines all the configurations that control how a SubQuery node runs. For a new project, which you have just initialised, you won't need to change anything.
-
-However, visit the [Running SubQuery Locally](../../run_publish/run.md) to get more information on the file and the settings.
-
-Run the following command under the project directory:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn start:docker
-```
-
-@tab npm
-
-```shell
-npm run-script start:docker
-```
-
-:::
-
-::: tip Note
-It may take a few minutes to download the required images and start the various nodes and Postgres databases.
-:::
-
-## 6. Query your Project
-
-Next, let's query our project. Follow these three simple steps to query your SubQuery project:
-
-1. Open your browser and head to `http://localhost:3000`.
-
-2. You will see a GraphQL playground in the browser and the schemas which are ready to query.
-
-3. Find the _Docs_ tab on the right side of the playground which should open a documentation drawer. This documentation is automatically generated and it helps you find what entities and methods you can query.
-
-Try the following queries to understand how it works for your new SubQuery starter project. Don’t forget to learn more about the [GraphQL Query language](../../run_publish/query.md).
+
```graphql
{
@@ -308,14 +206,4 @@ You will see the result similar to below:
}
```
-## What’s Next?
-
-Congratulations! You have now a locally running SubQuery project that accepts GraphQL API requests for indexing Akash delegator rewards.
-
-::: tip Tip
-
-Find out how to build a performant SubQuery project and avoid common mistakes in [Project Optimisation](../../build/optimisation.md).
-
-:::
-
-Click [here](../../quickstart/whats-next.md) to learn what should be your **next step** in your SubQuery journey.
+
diff --git a/docs/quickstart/quickstart_chains/cosmos-archway.md b/docs/quickstart/quickstart_chains/cosmos-archway.md
index ba73553e869..a09405ea26f 100644
--- a/docs/quickstart/quickstart_chains/cosmos-archway.md
+++ b/docs/quickstart/quickstart_chains/cosmos-archway.md
@@ -1,31 +1,14 @@
# Archway Quick Start
-## Goals
-
The goal of this quick start guide is to index all [Archway contract metadata](https://docs.archway.io/developers/rewards/managing-rewards#contract-metadata) as well as all rewards paid out to contract developers.
-::: info
-Archway is a chain based on the Cosmos SDK. Therefore you can index chain data via the standard Cosmos RPC interface.
-
-Before we begin, make sure that you have initialised your project using the provided steps in the **[Create a New Project](../quickstart.md)** section.
-:::
-
-In every SubQuery project, there are 3 key files to update. Let's begin updating them one by one.
+
::: tip
The final code of this project can be found [here](https://github.com/subquery/cosmos-subql-starter/tree/main/Archway/archway-starter).
:::
-## 1. Your Project Manifest File
-
-The Project Manifest (`project.ts`) file is an entry point to your project. It defines most of the details on how SubQuery will index and transform the chain data. For Cosmos chains, there are four types of mapping handlers (and you can have more than one in each project):
-
-- [BlockHanders](../../build/manifest/cosmos.md#mapping-handlers-and-filters): On each and every block, run a mapping function
-- [TransactionHandlers](../../build/manifest/cosmos.md#mapping-handlers-and-filters): On each and every transaction, run a mapping function
-- [MessageHandlers](../../build/manifest/cosmos.md#mapping-handlers-and-filters): On each and every message that matches optional filter criteria, run a mapping function
-- [EventHanders](../../build/manifest/cosmos.md#mapping-handlers-and-filters): On each and every event that matches optional filter criteria, run a mapping function
-
-Note that the manifest file has already been set up correctly and doesn’t require significant changes, but you need to change the datasource handlers. This section lists the triggers that the manifest file looks for on the blockchain to start indexing.
+
```ts
{
@@ -71,11 +54,9 @@ Note that the manifest file has already been set up correctly and doesn’t requ
The above code defines two handlers: a `handleRewardsWithdrawEvent` handler, which will be triggered when a `RewardsWithdrawEvent` event type is encountered with a `MsgWithdrawRewards` messageFilter type, and a `handleSetContractMetadata` handler, which will be triggered when a `MsgSetContractMetadata` type is encountered.
-Check out our [Manifest File](../../build/manifest/cosmos.md) documentation to get more information about the Project Manifest (`project.ts`) file.
-
-## 2. Update Your GraphQL Schema File
+
-The `schema.graphql` file determines the shape of the data that you are using SubQuery to index, hence it's a great place to start. The shape of your data is defined in a GraphQL Schema file with various [GraphQL entities](../../build/graphql.md).
+
Update the `schema.graphql` file as follows. In this project, since we are indexing all [Archway's contract metadata](https://docs.archway.io/developers/rewards/managing-rewards#contract-metadata) as well as all rewards paid to contract developers, we define one entity for each to record each instance of these. Each entity has a number of properties, including id, block height, transaction hash, and the timestamp; we are also indexing the contract, owner, and reward addresses.
@@ -101,36 +82,13 @@ type RewardWithdrawl @entity {
}
```
-::: warning Important
-When you make any changes to the schema file, do not forget to regenerate your types directory.
-:::
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn codegen
-```
+
-@tab npm
+
-```shell
-npm run-script codegen
-```
-
-:::
+
-You will find the generated models in the `/src/types/models` directory.
-
-As you're creating a new CosmWasm based project, this command will also generate types for your listed protobufs and save them into `src/types` directory, providing you with more typesafety. Read about how this is done in [Cosmos Codegen from CosmWasm Protobufs](../../build/introduction.md#cosmos-codegen-from-cosmwasm-protobufs).
-
-Check out our [GraphQL Schema](../../build/graphql.md) documentation to get more information on `schema.graphql` file.
-
-Now that you have made essential changes to the GraphQL Schema file, let’s go ahead with the next configuration.
-
-## 3. Add a Mapping Function
-
-Mapping functions determine how chain data is transformed into the optimised GraphQL entities that you previously defined in the `schema.graphql` file.
+
Navigate to the default mapping function in the `src/mappings` directory and update your mapping files to match the following (**note the additional imports**):
@@ -221,73 +179,13 @@ Here we have two functions, `handleSetContractMetadata` and `handleRewardsWithdr
The `handleRewardsWithdrawEvent` function works in a similar way: the event of type `CosmosEvent` is passed into the function, and we then look for certain event attributes to index by searching through the attribute keys. Finally, the fields of the `RewardWithdrawl` entity are populated appropriately.
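Searching event attributes by key generally looks like the hedged sketch below. The attribute keys, the value handling, and the entity fields are assumptions; depending on the chain and SDK version, attribute values may need extra decoding.

```ts
import { CosmosEvent } from "@subql/types-cosmos";
import { RewardWithdrawl } from "../types";

export async function handleRewardsWithdrawEvent(e: CosmosEvent): Promise<void> {
  // Pull out the attributes we care about by searching the attribute keys
  // (the keys used here are illustrative assumptions)
  const owner = e.event.attributes.find((a) => a.key === "owner")?.value ?? "";
  const rewards = e.event.attributes.find((a) => a.key === "rewards")?.value ?? "";

  const withdrawl = RewardWithdrawl.create({
    id: `${e.tx.hash}-${e.idx}`,
    blockHeight: BigInt(e.block.block.header.height),
    txHash: e.tx.hash,
    owner,
    rewards,
  });
  await withdrawl.save();
}
```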
-Check out our [Mappings](../../build/mapping/cosmos.md) documentation and get information on the mapping functions in detail.
-
-## 4. Build Your Project
-
-Next, build your work to run your new SubQuery project. Run the build command from the project's root directory as given here:
+
-::: code-tabs
-@tab:active yarn
+
-```shell
-yarn build
-```
-
-@tab npm
-
-```shell
-npm run-script build
-```
+
-:::
-
-::: warning Important
-Whenever you make changes to your mapping functions, you must rebuild your project.
-:::
-
-Now, you are ready to run your first SubQuery project. Let’s check out the process of running your project in detail.
-
-## 5. Run Your Project Locally with Docker
-
-Whenever you create a new SubQuery Project, first, you must run it locally on your computer and test it and using Docker is the easiest and quickest way to do this.
-
-The `docker-compose.yml` file defines all the configurations that control how a SubQuery node runs. For a new project, which you have just initialised, you won't need to change anything.
-
-However, visit the [Running SubQuery Locally](../../run_publish/run.md) to get more information on the file and the settings.
-
-Run the following command under the project directory:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn start:docker
-```
-
-@tab npm
-
-```shell
-npm run-script start:docker
-```
-
-:::
-
-::: tip Note
-It may take a few minutes to download the required images and start the various nodes and Postgres databases.
-:::
-
-## 6. Query your Project
-
-Next, let's query our project. Follow these three simple steps to query your SubQuery project:
-
-1. Open your browser and head to `http://localhost:3000`.
-
-2. You will see a GraphQL playground in the browser and the schemas which are ready to query.
-
-3. Find the _Docs_ tab on the right side of the playground which should open a documentation drawer. This documentation is automatically generated and it helps you find what entities and methods you can query.
-
-Try the following queries to understand how it works for your new SubQuery starter project. Don’t forget to learn more about the [GraphQL Query language](../../run_publish/query.md).
+
```graphql
query {
@@ -340,14 +238,4 @@ You will see the result similar to below:
}
```
-## What’s Next?
-
-Congratulations! You have now a locally running SubQuery project that accepts GraphQL API requests for indexing Archway contract metadata and the rewards paid to contract developers.
-
-::: tip Tip
-
-Find out how to build a performant SubQuery project and avoid common mistakes in [Project Optimisation](../../build/optimisation.md).
-
-:::
-
-Click [here](../../quickstart/whats-next.md) to learn what should be your **next step** in your SubQuery journey.
+
diff --git a/docs/quickstart/quickstart_chains/cosmos-cronos.md b/docs/quickstart/quickstart_chains/cosmos-cronos.md
index 0e1f27798f5..fbb87c03745 100644
--- a/docs/quickstart/quickstart_chains/cosmos-cronos.md
+++ b/docs/quickstart/quickstart_chains/cosmos-cronos.md
@@ -1,7 +1,5 @@
# Cronos Quick Start
-## Goals
-
The goal of this quick start guide is to adapt the standard starter project in the Cronos Network and then begin indexing all transfers of [Cro Crow Token](https://www.crocrow.com/).
@@ -11,28 +9,15 @@ The goal of this quick start guide is to adapt the standard starter project in t
::: warning Important
Cronos is an EVM-compatible (Ethermint) chain, so there are two options for indexing Cronos data: you can index chain data via the standard Cosmos RPC interface, or via the Ethereum APIs. For Cronos, we provide a starter project for each.
-
-Before we begin, make sure that you have initialised your project using the provided steps in the **[Start Here](../quickstart.md)** section. You must complete the suggested [4 steps](https://github.com/subquery/cosmos-subql-starter#readme) for Cosmos users.
:::
-Now, let's move ahead in the process and update these configurations.
-
-Previously, in the [1. Create a New Project](../quickstart.md) section, you must have noted [3 key files](../quickstart.md#_3-make-changes-to-your-project). Let's begin updating them one by one.
+
::: tip
The final code of this project can be found [here](https://github.com/deverka/cronos_crow_token_transfers).
:::
-## 1. Your Project Manifest File
-
-The Project Manifest (`project.ts`) file is an entry point to your project. It defines most of the details on how SubQuery will index and transform the chain data. For Cosmos chains, there are four types of mapping handlers (and you can have more than one in each project):
-
-- [BlockHanders](../../build/manifest/cosmos.md#mapping-handlers-and-filters): On each and every block, run a mapping function
-- [TransactionHandlers](../../build/manifest/cosmos.md#mapping-handlers-and-filters): On each and every transaction, run a mapping function
-- [MessageHandlers](../../build/manifest/cosmos.md#mapping-handlers-and-filters): On each and every message that matches optional filter criteria, run a mapping function
-- [EventHanders](../../build/manifest/cosmos.md#mapping-handlers-and-filters): On each and every event that matches optional filter criteria, run a mapping function
-
-Note that the manifest file has already been set up correctly and doesn’t require significant changes, but you need to change the datasource handlers. This section lists the triggers that look for on the blockchain to start indexing.
+
::: warning Important
There are two versions of this file depending on your choice to index data via the ETH or Cosmos RPC
@@ -148,9 +133,9 @@ Please note that Cro Crow token requires a specific ABI interface. You need to:
- Link this file as an erc20 asset in the manifest file.
:::
-## 2. Update Your GraphQL Schema File
+
-The `schema.graphql` file determines the shape of the data that you are using SubQuery to index, hence it's a great place to start. The shape of your data is defined in a GraphQL Schema file with various [GraphQL entities](../../build/graphql.md).
+
Update the `schema.graphql` file as follows. The aim is to index all transfers of [Cro Crow Token](https://www.crocrow.com/).
@@ -163,26 +148,9 @@ type Transfer @entity {
}
```
-::: warning Important
-When you make any changes to the schema file, do not forget to regenerate your types directory.
-:::
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn codegen
-```
-
-@tab npm
-
-```shell
-npm run-script codegen
-```
-
-:::
+
-You will find the generated models in the `/src/types/models` directory.
+
If you're creating a CosmWasm-based project, this command will also generate types for your listed protobufs and save them into the `src/types` directory, providing you with more type safety. Read about how this is done in [Cosmos Codegen from CosmWasm Protobufs](../../build/introduction.md#cosmos-codegen-from-cosmwasm-protobufs).
@@ -190,13 +158,9 @@ If you're creating as an EVM based project, this command will also generate ABI
It will also generate a class for every contract event to provide easy access to event parameters, as well as the block and transaction the event originated from. Read about how this is done in [EVM Codegen from ABIs](../../build/introduction.md#evm-codegen-from-abis).
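As a rough illustration of what those generated classes provide, here is a small sketch. It assumes an ERC-20 style `Transfer(address,address,uint256)` event and an ABI named `Erc20Abi` purely for illustration; your actual generated names will follow the ABI you link in the manifest.

```ts
import { TransferLog } from "../types/abi-interfaces/Erc20Abi";

// The generated log type exposes typed event parameters plus the block and
// transaction context the event originated from.
export function describeTransfer(log: TransferLog): string {
  const from = log.args!.from;
  const to = log.args!.to;
  const value = log.args!.value;
  return `Transfer of ${value.toString()} from ${from} to ${to} in block ${log.blockNumber} (tx ${log.transactionHash})`;
}
```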
-Check out our [GraphQL Schema](../../build/graphql.md) documentation to get more information on `schema.graphql` file.
-
-Now that you have made essential changes to the GraphQL Schema file, let’s go ahead with the next configuration.
+
-## 3. Add a Mapping Function
-
-Mapping functions determine how chain data is transformed into the optimised GraphQL entities that you previously defined in the `schema.graphql` file.
+
Navigate to the default mapping function in the `src/mappings` directory. You will see setup types for the ABI `TransferEventArgs` and `ApproveCallArgs`. Delete the ones for approvals. You will also see two exported functions, either `handleEthermintEvmEvent` & `handleEthermintEvmCall` or `handleLog` & `handleTransaction`. Delete them as well.
@@ -267,73 +231,13 @@ export async function handleTransfer(
:::
-Let’s understand how the above code works. Here, the function receives an `EthereumLog` or `EthermintEvmEvent` which includes data on the payload. We extract this data and then create a new `Transfer` entity defined earlier in the `schema.graphql` file. After that we use the `.save()` function to save the new entity (SubQuery will automatically save this to the database). Check out our [Mappings](../../build/mapping/cosmos.md) documentation and get information on the mapping functions in detail.
-
-## 4. Build Your Project
-
-Next, build your work to run your new SubQuery project. Run the build command from the project's root directory as given here:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn build
-```
-
-@tab npm
-
-```shell
-npm run-script build
-```
-
-:::
-
-::: warning Important
-Whenever you make changes to your mapping functions, you must rebuild your project.
-:::
-
-Now, you are ready to run your first SubQuery project. Let’s check out the process of running your project in detail.
-
-## 5. Run Your Project Locally with Docker
-
-Whenever you create a new SubQuery Project, never forget to run it locally on your computer and test it. And using Docker is the most hassle-free way to do this.
-
-`docker-compose.yml` file defines all the configurations that control how a SubQuery node runs. For a new project, which you have just initialised, no major changes are needed.
-
-However, visit the [Running SubQuery Locally](../../run_publish/run.md) to get more information on the file and the settings.
-
-Run the following command under the project directory:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn start:docker
-```
-
-@tab npm
-
-```shell
-npm run-script start:docker
-```
-
-:::
-
-::: tip Note
-It may take a few minutes to download the required images and start the various nodes and Postgres databases.
-:::
-
-## 6. Query your Project
-
-Next, let's query our project. Follow these three simple steps to query your SubQuery project:
+
-1. Open your browser and head to `http://localhost:3000`.
+
-2. You will see a GraphQL playground in the browser and the schemas which are ready to query.
+
-3. Find the _Docs_ tab on the right side of the playground which should open a documentation drawer. This documentation is automatically generated and it helps you find what entities and methods you can query.
-
-Try the following query to understand how it works for your new SubQuery starter project. Don’t forget to learn more about the [GraphQL Query language](../../run_publish/query.md).
+
```graphql
{
@@ -385,8 +289,4 @@ You will see the result similar to below:
The final code of this project can be found [here](https://github.com/deverka/cronos_crow_token_transfers).
:::
-## What’s Next?
-
-Congratulations! You have now a locally running SubQuery project that accepts GraphQL API requests for transferring data from bLuna.
-
-Click [here](../../quickstart/whats-next.md) to learn what should be your **next step** in your SubQuery journey.
+
diff --git a/docs/quickstart/quickstart_chains/cosmos-juno.md b/docs/quickstart/quickstart_chains/cosmos-juno.md
index c2bc78a3577..932a8c9888c 100644
--- a/docs/quickstart/quickstart_chains/cosmos-juno.md
+++ b/docs/quickstart/quickstart_chains/cosmos-juno.md
@@ -1,31 +1,14 @@
# Juno Quick Start
-## Goals
-
The goal of this quick start guide is to adapt the standard starter project in the Juno Network and then begin indexing all votes on the [Terra Developer Fund](https://daodao.zone/multisig/juno1lgnstas4ruflg0eta394y8epq67s4rzhg5anssz3rc5zwvjmmvcql6qps2) (which also contributed to SubQuery) from Cosmos.
-::: warning Important
-Before we begin, make sure that you have initialised your project using the provided steps in the **[Start Here](../quickstart.md)** section. You must complete the suggested [4 steps](https://github.com/subquery/cosmos-subql-starter#readme) for Cosmos users.
-:::
-
-Now, let's move ahead in the process and update these configurations.
-
-Previously, in the [1. Create a New Project](../quickstart.md) section, you must have noted [3 key files](../quickstart.md#_3-make-changes-to-your-project). Let's begin updating them one by one.
+
::: tip Note
The final code of this project can be found [here](https://github.com/jamesbayly/juno-terra-developer-fund-votes).
:::
-## 1. Update Your Project Manifest File
-
-The Project Manifest (`project.ts`) file is an entry point to your project. It defines most of the details on how SubQuery will index and transform the chain data. For Cosmos chains, there are four types of mapping handlers (and you can have more than one in each project):
-
-- [BlockHanders](../../build/manifest/cosmos.md#mapping-handlers-and-filters): On each and every block, run a mapping function
-- [TransactionHandlers](../../build/manifest/cosmos.md#mapping-handlers-and-filters): On each and every transaction, run a mapping function
-- [MessageHandlers](../../build/manifest/cosmos.md#mapping-handlers-and-filters): On each and every message that matches optional filter criteria, run a mapping function
-- [EventHanders](../../build/manifest/cosmos.md#mapping-handlers-and-filters): On each and every event that matches optional filter criteria, run a mapping function
-
-Note that the manifest file has already been set up correctly and doesn’t require significant changes, but you need to change the datasource handlers. This section lists the triggers to look for on the blockchain to start indexing.
+
```ts
{
@@ -62,11 +45,9 @@ Note that the manifest file has already been set up correctly and doesn’t requ
The above code defines that you will be running a `handleTerraDeveloperFund` mapping function whenever there is a message with a `vote` contract call from the [Terra Developer Fund](https://daodao.zone/multisig/juno1lgnstas4ruflg0eta394y8epq67s4rzhg5anssz3rc5zwvjmmvcql6qps2) smart contract.
-Check out our [Manifest File](../../build/manifest/cosmos.md) documentation to get more information about the Project Manifest (`project.ts`) file.
-
-## 2. Update Your GraphQL Schema File
+
-The `schema.graphql` file determines the shape of the data that you are using SubQuery to index, hence it's a great place to start. The shape of your data is defined in a GraphQL Schema file with various [GraphQL entities](../../build/graphql.md).
+
Update the `schema.graphql` file as follows. The aim is to index all votes on the [Terra Developer Fund](https://daodao.zone/multisig/juno1lgnstas4ruflg0eta394y8epq67s4rzhg5anssz3rc5zwvjmmvcql6qps2).
@@ -80,36 +61,13 @@ type Vote @entity {
}
```
-::: warning Important
-When you make any changes to the schema file, do not forget to regenerate your types directory.
-:::
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn codegen
-```
+
-@tab npm
+
-```shell
-npm run-script codegen
-```
-
-:::
+
-You will find the generated models in the `/src/types/models` directory.
-
-As you're creating a new CosmWasm based project, this command will also generate types for your listed protobufs and save them into `src/types` directory, providing you with more typesafety. Read about how this is done in [Cosmos Codegen from CosmWasm Protobufs](../../build/introduction.md#cosmos-codegen-from-cosmwasm-protobufs).
-
-Check out our [GraphQL Schema](../../build/graphql.md) documentation to get more information on `schema.graphql` file.
-
-Now that you have made essential changes to the GraphQL Schema file, let’s go ahead with the next configuration.
-
-## 3. Add a Mapping Function
-
-Mapping functions determine how chain data is transformed into the optimised GraphQL entities that you previously defined in the `schema.graphql` file.
+
Navigate to the default mapping function in the `src/mappings` directory. You will see four exported functions: `handleBlock`, `handleEvent`, `handleMessage`, and `handleTransaction`. Delete the `handleBlock`, `handleEvent`, and `handleTransaction` functions as you will only deal with the `handleMessage` function.
@@ -141,73 +99,13 @@ Let’s understand how the above code works.
Here, the function receives a `CosmosMessage`, which includes message data in the payload. We extract this data and then instantiate a new `Vote` entity, defined earlier in the `schema.graphql` file. After that, we add additional information and use the `.save()` function to save the new entity (SubQuery will automatically save this to the database).
-Check out our [Mappings](../../build/mapping/cosmos.md) documentation and get information on the mapping functions in detail.
-
-## 4. Build Your Project
-
-Next, build your work to run your new SubQuery project. Run the build command from the project's root directory as given here:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn build
-```
-
-@tab npm
-
-```shell
-npm run-script build
-```
-
-:::
-
-::: warning Important
-Whenever you make changes to your mapping functions, you must rebuild your project.
-:::
-
-Now, you are ready to run your first SubQuery project. Let’s check out the process of running your project in detail.
+
-## 5. Run Your Project Locally with Docker
+
-Whenever you create a new SubQuery Project, never forget to run it locally on your computer and test it. And using Docker is the most hassle-free way to do this.
+
-`docker-compose.yml` file defines all the configurations that control how a SubQuery node runs. For a new project, which you have just initialised, no major changes are needed.
-
-However, visit the [Running SubQuery Locally](../../run_publish/run.md) to get more information on the file and the settings.
-
-Run the following command under the project directory:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn start:docker
-```
-
-@tab npm
-
-```shell
-npm run-script start:docker
-```
-
-:::
-
-::: tip Note
-It may take a few minutes to download the required images and start the various nodes and Postgres databases.
-:::
-
-## 6. Query your Project
-
-Next, let's query our project. Follow these three simple steps to query your SubQuery project:
-
-1. Open your browser and head to `http://localhost:3000`.
-
-2. You will see a GraphQL playground in the browser and the schemas which are ready to query.
-
-3. Find the _Docs_ tab on the right side of the playground which should open a documentation drawer. This documentation is automatically generated and it helps you find what entities and methods you can query.
-
-Try the following query to understand how it works for your new SubQuery starter project. Don’t forget to learn more about the [GraphQL Query language](../../run_publish/query.md).
+
```graphql
query {
@@ -257,14 +155,4 @@ You will see the result similar to below:
The final code of this project can be found [here](https://github.com/jamesbayly/juno-terra-developer-fund-votes).
:::
-## What’s Next?
-
-Congratulations! You have now a locally running SubQuery project that accepts GraphQL API requests for transferring data from bLuna.
-
-::: tip Tip
-
-Find out how to build a performant SubQuery project and avoid common mistakes in [Project Optimisation](../../build/optimisation.md).
-
-:::
-
-Click [here](../../quickstart/whats-next.md) to learn what should be your **next step** in your SubQuery journey.
+
diff --git a/docs/quickstart/quickstart_chains/cosmos-neutron.md b/docs/quickstart/quickstart_chains/cosmos-neutron.md
index 02db205fec0..f80257b9ecc 100644
--- a/docs/quickstart/quickstart_chains/cosmos-neutron.md
+++ b/docs/quickstart/quickstart_chains/cosmos-neutron.md
@@ -1,31 +1,14 @@
# Neutron Quick Start
-## Goals
-
The goal of this quick start guide is to index all [airdrop claims](https://www.mintscan.io/neutron/wasm/contract/neutron198sxsrjvt2v2lln2ajn82ks76k97mj72mtgl7309jehd0vy8rezs7e6c56) on [Neutron Network](https://www.mintscan.io/neutron/).
-::: info
-Neutron Network is a chain based on the Cosmos SDK. Therefore you can index chain data via the standard Cosmos RPC interface.
-
-Before we begin, make sure that you have initialised your project using the provided steps in the **[Start Here](../quickstart.md)** section. You must complete the suggested [4 steps](https://github.com/subquery/cosmos-subql-starter#readme) for Cosmos users.
-:::
-
-In every SubQuery project, there are 3 key files to update. Let's begin updating them one by one.
+
::: tip
The final code of this project can be found [here](https://github.com/subquery/cosmos-subql-starter/tree/main/Neutron/neutron-starter).
:::
-## 1. Your Project Manifest File
-
-The Project Manifest (`project.ts`) file is an entry point to your project. It defines most of the details on how SubQuery will index and transform the chain data. For Cosmos chains, there are four types of mapping handlers (and you can have more than one in each project):
-
-- [BlockHanders](../../build/manifest/cosmos.md#mapping-handlers-and-filters): On each and every block, run a mapping function
-- [TransactionHandlers](../../build/manifest/cosmos.md#mapping-handlers-and-filters): On each and every transaction, run a mapping function
-- [MessageHandlers](../../build/manifest/cosmos.md#mapping-handlers-and-filters): On each and every message that matches optional filter criteria, run a mapping function
-- [EventHanders](../../build/manifest/cosmos.md#mapping-handlers-and-filters): On each and every event that matches optional filter criteria, run a mapping function
-
-Note that the manifest file has already been set up correctly and doesn’t require significant changes, but you need to change the datasource handlers. This section lists the triggers that the manifest file looks for on the blockchain to start indexing.
+
```ts
{
@@ -57,11 +40,9 @@ Note that the manifest file has already been set up correctly and doesn’t requ
The above code defines that you will be running one handler: a `handleAirdropClaim` message handler, which will be triggered when a `claim` message is encountered on a `MsgExecuteContract` type. The `contract` value is the address of the Neutron airdrop contract.
-Check out our [Manifest File](../../build/manifest/cosmos.md) documentation to get more information about the Project Manifest (`project.ts`) file.
-
-## 2. Update Your GraphQL Schema File
+
-The `schema.graphql` file determines the shape of the data that you are using SubQuery to index, hence it's a great place to start. The shape of your data is defined in a GraphQL Schema file with various [GraphQL entities](../../build/graphql.md).
+
Update the `schema.graphql` file as follows. In this project, since we are indexing all [airdrop claims](https://www.mintscan.io/neutron/wasm/contract/neutron198sxsrjvt2v2lln2ajn82ks76k97mj72mtgl7309jehd0vy8rezs7e6c56) on Neutron, we have a `Claim` entity that includes a number of properties, including transaction hash and block data as well as date, amount and receiver data.
@@ -83,36 +64,13 @@ type DailyClaimSummary @entity {
}
```
-::: warning Important
-When you make any changes to the schema file, do not forget to regenerate your types directory.
-:::
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn codegen
-```
+
-@tab npm
+
-```shell
-npm run-script codegen
-```
-
-:::
+
-You will find the generated models in the `/src/types/models` directory.
-
-As you're creating a new CosmWasm based project, this command will also generate types for your listed protobufs and save them into `src/types` directory, providing you with more typesafety. Read about how this is done in [Cosmos Codegen from CosmWasm Protobufs](../../build/introduction.md#cosmos-codegen-from-cosmwasm-protobufs).
-
-Check out our [GraphQL Schema](../../build/graphql.md) documentation to get more information on `schema.graphql` file.
-
-Now that you have made essential changes to the GraphQL Schema file, let’s go ahead with the next configuration.
-
-## 3. Add a Mapping Function
-
-Mapping functions determine how chain data is transformed into the optimised GraphQL entities that you previously defined in the `schema.graphql` file.
+
Navigate to the default mapping function in the `src/mappings` directory and update your mapping files to match the following (**note the additional imports**):
@@ -184,73 +142,13 @@ An airdrop claim object is then created, provided that it hasn't been created al
The `checkGetDailyClaim` function is called by the previous function to determine the total number of claims made during the day. It is called whenever a new claim object is created or updated.
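A day-level aggregate like this is typically maintained with a get-or-create helper keyed by the date. The sketch below is hypothetical; the id format and the `totalClaimed` field are assumptions based on the `DailyClaimSummary` entity above.

```ts
import { DailyClaimSummary } from "../types";

// Hypothetical helper: one DailyClaimSummary row per calendar day,
// created on first use and updated on every subsequent claim.
async function checkGetDailyClaim(date: Date): Promise<DailyClaimSummary> {
  const id = date.toISOString().substring(0, 10); // e.g. "2024-01-31"
  let summary = await DailyClaimSummary.get(id);
  if (!summary) {
    summary = DailyClaimSummary.create({ id, totalClaimed: BigInt(0) });
  }
  return summary;
}
```

The calling handler would then increment `totalClaimed` and `save()` the summary alongside the new claim.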
-Check out our [Mappings](../../build/mapping/cosmos.md) documentation and get information on the mapping functions in detail.
-
-## 4. Build Your Project
-
-Next, build your work to run your new SubQuery project. Run the build command from the project's root directory as given here:
+
-::: code-tabs
-@tab:active yarn
+
-```shell
-yarn build
-```
-
-@tab npm
-
-```shell
-npm run-script build
-```
+
-:::
-
-::: warning Important
-Whenever you make changes to your mapping functions, you must rebuild your project.
-:::
-
-Now, you are ready to run your first SubQuery project. Let’s check out the process of running your project in detail.
-
-## 5. Run Your Project Locally with Docker
-
-Whenever you create a new SubQuery Project, first, you must run it locally on your computer and test it and using Docker is the easiest and quickest way to do this.
-
-The `docker-compose.yml` file defines all the configurations that control how a SubQuery node runs. For a new project, which you have just initialised, you won't need to change anything.
-
-However, visit the [Running SubQuery Locally](../../run_publish/run.md) to get more information on the file and the settings.
-
-Run the following command under the project directory:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn start:docker
-```
-
-@tab npm
-
-```shell
-npm run-script start:docker
-```
-
-:::
-
-::: tip Note
-It may take a few minutes to download the required images and start the various nodes and Postgres databases.
-:::
-
-## 6. Query your Project
-
-Next, let's query our project. Follow these three simple steps to query your SubQuery project:
-
-1. Open your browser and head to `http://localhost:3000`.
-
-2. You will see a GraphQL playground in the browser and the schemas which are ready to query.
-
-3. Find the _Docs_ tab on the right side of the playground which should open a documentation drawer. This documentation is automatically generated and it helps you find what entities and methods you can query.
-
-Try the following queries to understand how it works for your new SubQuery starter project. Don’t forget to learn more about the [GraphQL Query language](../../run_publish/query.md).
+
```graphql
{
@@ -340,14 +238,4 @@ You will see the result similar to below:
}
```
-## What’s Next?
-
-Congratulations! You have now a locally running SubQuery project that accepts GraphQL API requests for transferring data from bLuna.
-
-::: tip Tip
-
-Find out how to build a performant SubQuery project and avoid common mistakes in [Project Optimisation](../../build/optimisation.md).
-
-:::
-
-Click [here](../../quickstart/whats-next.md) to learn what should be your **next step** in your SubQuery journey.
+
diff --git a/docs/quickstart/quickstart_chains/cosmos-osmosis.md b/docs/quickstart/quickstart_chains/cosmos-osmosis.md
index 10c1df1e64a..8c3b074dc39 100644
--- a/docs/quickstart/quickstart_chains/cosmos-osmosis.md
+++ b/docs/quickstart/quickstart_chains/cosmos-osmosis.md
@@ -2,17 +2,9 @@
[Osmosis](https://osmosis.zone/) is a DEX built on the Cosmos ecosystem. It is designed to allow users to trade tokens from different blockchains that are part of the Cosmos ecosystem. Osmosis uses the IBC protocol to enable the transfer of assets between different blockchains, including [Cosmos Hub](https://github.com/subquery/cosmos-subql-starter/tree/main/CosmosHub/cosmoshub-starter), [Akash](./cosmos-akash.md), and others.
-## Goals
-
This guide is a detailed tutorial for configuring a SubQuery indexer specifically designed to index swaps occurring on Osmosis. Upon completing this guide, you will have a solid understanding of how to index data for a complex DEX such as Osmosis.
-::: info
-Osmosis based on the Cosmos SDK, which means you can index chain data via the standard Cosmos RPC interface.
-
-Before we begin, make sure that you have initialised your project using the provided steps in the **[Start Here](../quickstart.md)** section. You must complete the suggested [4 steps](https://github.com/subquery/cosmos-subql-starter#readme) for Cosmos users.
-:::
-
-In every SubQuery project, there are 3 key files to update. Let's begin updating them one by one.
+
::: tip
The final code of this project can be found [here](https://github.com/subquery/cosmos-subql-starter/tree/main/Osmosis/osmosis-starter). We also offer a [pre-recorded workshop](https://www.youtube.com/watch?v=Rp4d4NbVzo4) for this sample project, which makes it easier to follow along.
@@ -24,17 +16,7 @@ The final code of this project can be found [here](https://github.com/subquery/c
-
-## 1. Your Project Manifest File
-
-The Project Manifest (`project.ts`) file is an entry point to your project. It defines most of the details on how SubQuery will index and transform the chain data. For Cosmos chains, there are four types of mapping handlers (and you can have more than one in each project):
-
-- [BlockHanders](../../build/manifest/cosmos.md#mapping-handlers-and-filters): On each and every block, run a mapping function
-- [TransactionHandlers](../../build/manifest/cosmos.md#mapping-handlers-and-filters): On each and every transaction, run a mapping function
-- [MessageHandlers](../../build/manifest/cosmos.md#mapping-handlers-and-filters): On each and every message that matches optional filter criteria, run a mapping function
-- [EventHanders](../../build/manifest/cosmos.md#mapping-handlers-and-filters): On each and every event that matches optional filter criteria, run a mapping function
-
-Note that the manifest file has already been set up correctly and doesn’t require significant changes, but you need to change the datasource handlers. This section lists the triggers that the manifest file looks for on the blockchain to start indexing.
+
```ts
dataSources: [
@@ -59,9 +41,9 @@ dataSources: [
Within the provided code snippet, we've established a single handler `handleMessage`, which will execute every time a message of the `MsgSwapExactAmountIn` type is detected. This handler is sufficient to monitor and record swaps within Osmosis. Check out our [Manifest File](../../build/manifest/cosmos.md) documentation to get more information about the Project Manifest (`project.ts`) file.
-## 2. Update Your GraphQL Schema File
+
-The `schema.graphql` file determines the shape of the data that you are using SubQuery to index, hence it's a great place to start. The shape of your data is defined in a GraphQL Schema file with various [GraphQL entities](../../build/graphql.md).
+
For this project, you'll need to modify your `schema.graphql` file as follows.
@@ -93,28 +75,9 @@ type Pool @entity {
Since we're indexing all swaps, we have a Swap entity that comprises a number of properties, including the sender, amounts, swapRoutes, and so forth. We also derive `SwapRoute` as a separate entity because it carries important business information. Finally, we declare a `Pool` entity and connect it to `SwapRoute` so that we can retrieve all the routes in which a particular pool takes part.
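
To make these relationships concrete, here is a small, hypothetical sketch of how the generated models could express them in a mapping: a `SwapRoute` row stores a foreign key to both its `Swap` and its `Pool` (exposed as `swapId` and `poolId` on the generated classes). Everything beyond the foreign-key pattern is an assumption; the schema above and the full mapping later in this guide are authoritative.

```ts
import { Pool, Swap, SwapRoute } from "../types";

// Link one route of a swap to the pool it passes through. Assumes Pool and
// SwapRoute can be created with only these fields; in the real project the
// remaining required fields from schema.graphql must be set as well.
export async function recordRoute(
  swap: Swap,
  poolId: string,
  routeIndex: number
): Promise<void> {
  // Make sure the referenced Pool row exists before pointing at it
  let pool = await Pool.get(poolId);
  if (!pool) {
    pool = Pool.create({ id: poolId });
    await pool.save();
  }

  // The SwapRoute row carries both foreign keys, connecting Swap and Pool
  const route = SwapRoute.create({
    id: `${swap.id}-${routeIndex}`,
    swapId: swap.id,
    poolId: pool.id,
  });
  await route.save();
}
```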
-::: tip Note
-Importantly, these relationships not only establish one-to-many connections but also extend to include many-to-many associations. To delve deeper into entity relationships, you can refer to [this section](../../build/graphql.md#entity-relationships). If you prefer a more example-based approach, our dedicated [Hero Course Module](../../academy/herocourse/module3.md) can provide further insights.
-:::
-
-::: warning Important
-When you make any changes to the schema file, do not forget to regenerate your types directory.
-:::
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn codegen
-```
-
-@tab npm
-
-```shell
-npm run-script codegen
-```
+
-:::
+
You will find the generated models in the `/src/types/models` directory. You can conveniently import all these entities from the following directory:
@@ -160,13 +123,9 @@ The relevant types can be imported from the directory with the newly generated c
import { MsgSwapExactAmountInMessage } from "../types/CosmosMessageTypes";
```
-Check out our [GraphQL Schema](../../build/graphql.md) documentation to get more information on `schema.graphql` file.
+
-Now that you have made essential changes to the GraphQL Schema file, let’s go ahead with the next configuration.
-
-## 3. Add a Mapping Function
-
-Mapping functions determine how chain data is transformed into the optimised GraphQL entities that you previously defined in the `schema.graphql` file.
+
Navigate to the default mapping function in the `src/mappings` directory and update your mapping files to match the following (**note the additional imports**):
@@ -232,71 +191,13 @@ The provided code has a single handler `handleMessage` - the main function respo
🎉 Now, you've effectively developed the handling logic for the Osmosis swaps and populated queryable entities, such as `Swap`, `Pool` and `SwapRoute`. This means you're ready to move on to the [construction phase](#build-your-project) to test the indexer's functionality thus far.
-## 4. Build Your Project
-
-Next, build your work to run your new SubQuery project. Run the build command from the project's root directory as given here:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn build
-```
-
-@tab npm
-
-```shell
-npm run-script build
-```
-
-:::
-
-::: warning Important
-Whenever you make changes to your mapping functions, you must rebuild your project.
-:::
-
-Now, you are ready to run your first SubQuery project. Let’s check out the process of running your project in detail.
-
-## 5. Run Your Project Locally with Docker
-
-Whenever you create a new SubQuery Project, first, you must run it locally on your computer and test it and using Docker is the easiest and quickest way to do this.
-
-The `docker-compose.yml` file defines all the configurations that control how a SubQuery node runs. For a new project, which you have just initialised, you won't need to change anything.
-
-However, visit the [Running SubQuery Locally](../../run_publish/run.md) to get more information on the file and the settings.
-
-Run the following command under the project directory:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn start:docker
-```
-
-@tab npm
-
-```shell
-npm run-script start:docker
-```
-
-:::
-
-::: tip Note
-It may take a few minutes to download the required images and start the various nodes and Postgres databases.
-:::
-
-## 6. Query your Project
-
-Next, let's query our project. Follow these three simple steps to query your SubQuery project:
+
-1. Open your browser and head to `http://localhost:3000`.
+
-2. You will see a GraphQL playground in the browser and the schemas which are ready to query.
+
-3. Find the _Docs_ tab on the right side of the playground which should open a documentation drawer. This documentation is automatically generated and it helps you find what entities and methods you can query.
-
-Try the following query to understand how it works for your new SubQuery starter project. Don’t forget to learn more about the [GraphQL Query language](../../run_publish/query.md).
+
:::details Swaps, Routes and Pools
@@ -378,14 +279,4 @@ Try the following query to understand how it works for your new SubQuery starter
:::
-## What’s Next?
-
-Congratulations! You have now a locally running SubQuery project that accepts GraphQL API requests for indexing Osmosis swaps.
-
-::: tip Tip
-
-Find out how to build a performant SubQuery project and avoid common mistakes in [Project Optimisation](../../build/optimisation.md).
-
-:::
-
-Click [here](../../quickstart/whats-next.md) to learn what should be your **next step** in your SubQuery journey.
+
diff --git a/docs/quickstart/quickstart_chains/cosmos-other.md b/docs/quickstart/quickstart_chains/cosmos-other.md
index 3c71e197d4f..bbf7f183b0c 100644
--- a/docs/quickstart/quickstart_chains/cosmos-other.md
+++ b/docs/quickstart/quickstart_chains/cosmos-other.md
@@ -72,14 +72,4 @@ Please contribute your new chain back to our example projects by making a PR bac
We really appreciate it, and will absolutely give you a shout-out to our community on social media for your effort!
-## What’s Next?
-
-Congratulations! You have now a locally running Cosmos SubQuery project that accepts GraphQL API requests.
-
-::: tip Tip
-
-Find out how to build a performant SubQuery project and avoid common mistakes in [Project Optimisation](../../build/optimisation.md).
-
-:::
-
-Click [here](../../quickstart/whats-next.md) to learn what should be your **next step** in your SubQuery journey.
+
diff --git a/docs/quickstart/quickstart_chains/cosmos-sei.md b/docs/quickstart/quickstart_chains/cosmos-sei.md
index 4ed2b24152a..340190d8337 100644
--- a/docs/quickstart/quickstart_chains/cosmos-sei.md
+++ b/docs/quickstart/quickstart_chains/cosmos-sei.md
@@ -1,31 +1,14 @@
# Sei Quick Start
-## Goals
-
The goal of this quick start guide is to index all ETH-USD exchange rates provided to [Levana’s Sei DEX protocol](https://blog.levana.finance/levana-perpetual-swap-beta-now-live-on-sei-networks-testnet-a-new-era-for-decentralized-crypto-fc0930ea4b9) by the Pyth price oracle.
-::: info
-Sei Network is a chain based on the Cosmos SDK. Therefore you can index chain data via the standard Cosmos RPC interface.
-
-Before we begin, make sure that you have initialised your project using the provided steps in the **[Start Here](../quickstart.md)** section. You must complete the suggested [4 steps](https://github.com/subquery/cosmos-subql-starter#readme) for Cosmos users.
-:::
-
-In every SubQuery project, there are 3 key files to update. Let's begin updating them one by one.
+
::: tip
The final code of this project can be found [here](https://github.com/subquery/cosmos-subql-starter/tree/main/Sei/sei-starter).
:::
-## 1. Update Your Project Manifest File
-
-The Project Manifest (`project.ts`) file is an entry point to your project. It defines most of the details on how SubQuery will index and transform the chain data. For Cosmos chains, there are four types of mapping handlers (and you can have more than one in each project):
-
-- [BlockHanders](../../build/manifest/cosmos.md#mapping-handlers-and-filters): On each and every block, run a mapping function
-- [TransactionHandlers](../../build/manifest/cosmos.md#mapping-handlers-and-filters): On each and every transaction, run a mapping function
-- [MessageHandlers](../../build/manifest/cosmos.md#mapping-handlers-and-filters): On each and every message that matches optional filter criteria, run a mapping function
-- [EventHanders](../../build/manifest/cosmos.md#mapping-handlers-and-filters): On each and every event that matches optional filter criteria, run a mapping function
-
-Note that the manifest file has already been set up correctly and doesn’t require significant changes, but you need to change the datasource handlers. This section lists the triggers that the manifest file looks for on the blockchain to start indexing.
+
```ts
{
@@ -66,11 +49,9 @@ Note that the manifest file has already been set up correctly and doesn’t requ
The above code defines two handlers: a `handleFundingRateChangeEvent` handler, which is triggered when a `wasm-funding-rate-change` event type is encountered on a `MsgExecuteContract` message type, and a `handleSpotPriceEvent` handler, which is triggered when a `wasm-spot-price` event type is encountered on a `MsgExecuteContract` message type.
-Check out our [Manifest File](../../build/manifest/cosmos.md) documentation to get more information about the Project Manifest (`project.ts`) file.
-
-## 2. Update Your GraphQL Schema File
+
-The `schema.graphql` file determines the shape of the data that you are using SubQuery to index, hence it's a great place to start. The shape of your data is defined in a GraphQL Schema file with various [GraphQL entities](../../build/graphql.md).
+
Update the `schema.graphql` file as follows. In this project, since we are indexing all ETH-USD exchange rates provided to [Levana’s Sei DEX protocol](https://blog.levana.finance/levana-perpetual-swap-beta-now-live-on-sei-networks-testnet-a-new-era-for-decentralized-crypto-fc0930ea4b9) by the Pyth price oracle, we have an `ExchangeRate` entity that includes a number of properties, such as exchange rate data (the notional and USD price, and the long and short rates) as well as contract details.
@@ -100,36 +81,13 @@ type DailyAggregation @entity {
}
```
-::: warning Important
-When you make any changes to the schema file, do not forget to regenerate your types directory.
-:::
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn codegen
-```
+
-@tab npm
+
-```shell
-npm run-script codegen
-```
-
-:::
+
-You will find the generated models in the `/src/types/models` directory.
-
-As you're creating a new CosmWasm based project, this command will also generate types for your listed protobufs and save them into `src/types` directory, providing you with more typesafety. Read about how this is done in [Cosmos Codegen from CosmWasm Protobufs](../../build/introduction.md#cosmos-codegen-from-cosmwasm-protobufs).
-
-Check out our [GraphQL Schema](../../build/graphql.md) documentation to get more information on `schema.graphql` file.
-
-Now that you have made essential changes to the GraphQL Schema file, let’s go ahead with the next configuration.
-
-## 3. Add a Mapping Function
-
-Mapping functions determine how chain data is transformed into the optimised GraphQL entities that you previously defined in the `schema.graphql` file.
+
Navigate to the default mapping function in the `src/mappings` directory and update your mapping files to match the following (**note the additional imports**):
@@ -262,73 +220,13 @@ The `handleSpotPriceEvent` handler function works in the same way.
The `updateDailyAggregation` function is called by the previous two functions to determine the day's highest and lowest prices along with its opening and closing prices. It is called whenever a new exchange rate object is created or updated.
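
As a rough sketch of that aggregation logic, the function below shows one way a daily open/high/low/close record could be maintained. The `DailyAggregation` field names (`open`, `close`, `high`, `low`) and the plain-number price are assumptions made for illustration; refer to the entity defined in this project's `schema.graphql` and the full mapping above for the actual shape.

```ts
import { DailyAggregation } from "../types";

// Maintain one aggregation row per calendar day, updating it as prices arrive.
export async function updateDailyAggregation(
  date: Date,
  price: number
): Promise<void> {
  const id = date.toISOString().slice(0, 10); // e.g. "2023-08-14"
  let aggregation = await DailyAggregation.get(id);
  if (!aggregation) {
    // The first price of the day is simultaneously the open, close, high and low
    aggregation = DailyAggregation.create({
      id,
      open: price,
      close: price,
      high: price,
      low: price,
    });
  } else {
    aggregation.close = price; // the most recent price becomes the close
    aggregation.high = Math.max(aggregation.high, price);
    aggregation.low = Math.min(aggregation.low, price);
  }
  await aggregation.save();
}
```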
-Check out our [Mappings](../../build/mapping/cosmos.md) documentation and get information on the mapping functions in detail.
-
-## 4. Build Your Project
-
-Next, build your work to run your new SubQuery project. Run the build command from the project's root directory as given here:
+
-::: code-tabs
-@tab:active yarn
+
-```shell
-yarn build
-```
-
-@tab npm
-
-```shell
-npm run-script build
-```
+
-:::
-
-::: warning Important
-Whenever you make changes to your mapping functions, you must rebuild your project.
-:::
-
-Now, you are ready to run your first SubQuery project. Let’s check out the process of running your project in detail.
-
-## 5. Run Your Project Locally with Docker
-
-Whenever you create a new SubQuery Project, first, you must run it locally on your computer and test it and using Docker is the easiest and quickest way to do this.
-
-The `docker-compose.yml` file defines all the configurations that control how a SubQuery node runs. For a new project, which you have just initialised, you won't need to change anything.
-
-However, visit the [Running SubQuery Locally](../../run_publish/run.md) to get more information on the file and the settings.
-
-Run the following command under the project directory:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn start:docker
-```
-
-@tab npm
-
-```shell
-npm run-script start:docker
-```
-
-:::
-
-::: tip Note
-It may take a few minutes to download the required images and start the various nodes and Postgres databases.
-:::
-
-## 6. Query your Project
-
-Next, let's query our project. Follow these three simple steps to query your SubQuery project:
-
-1. Open your browser and head to `http://localhost:3000`.
-
-2. You will see a GraphQL playground in the browser and the schemas which are ready to query.
-
-3. Find the _Docs_ tab on the right side of the playground which should open a documentation drawer. This documentation is automatically generated and it helps you find what entities and methods you can query.
-
-Try the following queries to understand how it works for your new SubQuery starter project. Don’t forget to learn more about the [GraphQL Query language](../../run_publish/query.md).
+
```graphql
query {
@@ -411,14 +309,4 @@ You will see the result similar to below:
}
```
-## What’s Next?
-
-Congratulations! You have now a locally running SubQuery project that accepts GraphQL API requests for transferring data from bLuna.
-
-::: tip Tip
-
-Find out how to build a performant SubQuery project and avoid common mistakes in [Project Optimisation](../../build/optimisation.md).
-
-:::
-
-Click [here](../../quickstart/whats-next.md) to learn what should be your **next step** in your SubQuery journey.
+
diff --git a/docs/quickstart/quickstart_chains/cosmos-thorchain.md b/docs/quickstart/quickstart_chains/cosmos-thorchain.md
index cefa87166a8..0c2d3f75283 100644
--- a/docs/quickstart/quickstart_chains/cosmos-thorchain.md
+++ b/docs/quickstart/quickstart_chains/cosmos-thorchain.md
@@ -1,16 +1,8 @@
# Thorchain Quick Start
-## Goals
-
The goal of this quick start guide is to index all deposit messages of Thorchain.
-::: warning Important
-Thorchain is an chain based on the Cosmos SDK. You can index chain data via the standard Cosmos RPC interface but there is no smart contract layer on Thorchain yet.
-
-Before we begin, make sure that you have initialised your project using the provided steps in the **[Start Here](../quickstart.md)** section. You must complete the suggested [4 steps](https://github.com/subquery/cosmos-subql-starter#readme) for Cosmos users.
-:::
-
-Now, let's move ahead in the process and update these configurations.
+
Previously, in the [1. Create a New Project](../quickstart.md) section, you must have noted [3 key files](../quickstart.md#_3-make-changes-to-your-project). Let's begin updating them one by one.
@@ -18,16 +10,7 @@ Previously, in the [1. Create a New Project](../quickstart.md) section, you must
The final code of this project can be found [here](https://github.com/subquery/cosmos-subql-starter/tree/main/Thorchain/thorchain-starter).
:::
-## 1. Update Your Project Manifest File
-
-The Project Manifest (`project.ts`) file is an entry point to your project. It defines most of the details on how SubQuery will index and transform the chain data. For Cosmos chains, there are four types of mapping handlers (and you can have more than one in each project):
-
-- [BlockHanders](../../build/manifest/cosmos.md#mapping-handlers-and-filters): On each and every block, run a mapping function
-- [TransactionHandlers](../../build/manifest/cosmos.md#mapping-handlers-and-filters): On each and every transaction, run a mapping function
-- [MessageHandlers](../../build/manifest/cosmos.md#mapping-handlers-and-filters): On each and every message that matches optional filter criteria, run a mapping function
-- [EventHanders](../../build/manifest/cosmos.md#mapping-handlers-and-filters): On each and every event that matches optional filter criteria, run a mapping function
-
-Note that the manifest file has already been set up correctly and doesn’t require significant changes, but you need to change the datasource handlers. This section lists the triggers that look for on the blockchain to start indexing.
+
```ts
{
@@ -54,9 +37,9 @@ Note that the manifest file has already been set up correctly and doesn’t requ
The above code defines that you will be running a `handleMessage` mapping function whenever a message with the `/types.MsgDeposit` type is emitted. Check out our [Manifest File](../../build/manifest/cosmos.md) documentation to get more information about the Project Manifest (`project.ts`) file.
-## 2. Update Your GraphQL Schema File
+
-The `schema.graphql` file determines the shape of the data that you are using SubQuery to index, hence it's a great place to start. The shape of your data is defined in a GraphQL Schema file with various [GraphQL entities](../../build/graphql.md).
+
Update the `schema.graphql` file as follows. The aim is to index all deposit messages. Since each deposit can include multiple tokens, we need to define a [many-to-many relationship](../../build/graphql.md#man) between the Deposit and Coin entities - we use the DepositCoin entity to link the two.
@@ -85,36 +68,13 @@ type Coin @entity {
}
```
-::: warning Important
-When you make any changes to the schema file, do not forget to regenerate your types directory.
-:::
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn codegen
-```
-
-@tab npm
-
-```shell
-npm run-script codegen
-```
-
-:::
-
-You will find the generated models in the `/src/types/models` directory.
+
-As you're creating a new CosmWasm based project, this command will also generate types for your listed protobufs and save them into `src/types` directory, providing you with more typesafety. Read about how this is done in [Cosmos Codegen from CosmWasm Protobufs](../../build/introduction.md#cosmos-codegen-from-cosmwasm-protobufs).
+
-Check out our [GraphQL Schema](../../build/graphql.md) documentation to get more information on `schema.graphql` file.
+
-Now that you have made essential changes to the GraphQL Schema file, let’s go ahead with the next configuration.
-
-## 3. Add a Mapping Function
-
-Mapping functions determine how chain data is transformed into the optimised GraphQL entities that you previously defined in the `schema.graphql` file.
+
Navigate to the default mapping function in the `src/mappings` directory and update your mapping files to match the following (**note the additional imports**):
@@ -165,71 +125,13 @@ export async function handleMessage(
Let’s understand how the above code works. Here, the function receives a `CosmosMessage`, which includes data on the payload that we decode using the supplied `` type definition. We extract this data and then create a new `Deposit` entity, defined earlier in the `schema.graphql` file. For each `coin` in the deposit message, we then check whether the coin is known, and then link it to the `Deposit` entity using a `DepositCoin`. After that, we use the `.save()` function to save the new entity (SubQuery will automatically save this to the database). Check out our [Mappings](../../build/mapping/cosmos.md) documentation for detailed information on mapping functions.
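
Two details of that flow are worth isolating: the `Deposit` id is built from the transaction hash plus the message index so that multiple deposit messages in one transaction never collide, and a `Coin` row is only created the first time a denomination is seen. The sketch below illustrates both, assuming `msg.idx` from `@subql/types-cosmos` and a `Coin` entity that can be created from its id alone; the full mapping above is authoritative.

```ts
import { CosmosMessage } from "@subql/types-cosmos";
import { Coin } from "../types";

// Derive a collision-free entity id from the tx hash and the message's
// position within the transaction (idx).
export function depositId(msg: CosmosMessage): string {
  return `${msg.tx.hash}-${msg.idx}`;
}

// Create the Coin row only if this denomination has not been indexed before.
export async function getOrCreateCoin(denom: string): Promise<Coin> {
  let coin = await Coin.get(denom);
  if (!coin) {
    coin = Coin.create({ id: denom });
    await coin.save();
  }
  return coin;
}
```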
-## 4. Build Your Project
-
-Next, build your work to run your new SubQuery project. Run the build command from the project's root directory as given here:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn build
-```
-
-@tab npm
-
-```shell
-npm run-script build
-```
-
-:::
-
-::: warning Important
-Whenever you make changes to your mapping functions, you must rebuild your project.
-:::
-
-Now, you are ready to run your first SubQuery project. Let’s check out the process of running your project in detail.
-
-## 5. Run Your Project Locally with Docker
-
-Whenever you create a new SubQuery Project, never forget to run it locally on your computer and test it. And using Docker is the most hassle-free way to do this.
-
-`docker-compose.yml` file defines all the configurations that control how a SubQuery node runs. For a new project, which you have just initialised, no major changes are needed.
-
-However, visit the [Running SubQuery Locally](../../run_publish/run.md) to get more information on the file and the settings.
-
-Run the following command under the project directory:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn start:docker
-```
-
-@tab npm
-
-```shell
-npm run-script start:docker
-```
+
-:::
-
-::: tip
-It may take a few minutes to download the required images and start the various nodes and Postgres databases.
-:::
+
-## 6. Query your Project
+
-Next, let's query our project. Follow these three simple steps to query your SubQuery project:
-
-1. Open your browser and head to `http://localhost:3000`.
-
-2. You will see a GraphQL playground in the browser and the schemas which are ready to query.
-
-3. Find the _Docs_ tab on the right side of the playground which should open a documentation drawer. This documentation is automatically generated and it helps you find what entities and methods you can query.
-
-Try the following queries to understand how it works for your new SubQuery starter project. Don’t forget to learn more about the [GraphQL Query language](../../run_publish/query.md).
+
```graphql
query {
@@ -278,8 +180,4 @@ You will see the result in JSON
The final code of this project can be found [here](https://github.com/subquery/cosmos-subql-starter/tree/main/Thorchain/thorchain-starter).
:::
-## What’s Next?
-
-Congratulations! You have now a locally running SubQuery project that accepts GraphQL API requests for transferring data from bLuna.
-
-Click [here](../../quickstart/whats-next.md) to learn what should be your **next step** in your SubQuery journey.
+
diff --git a/docs/quickstart/quickstart_chains/ethereum-bayc.md b/docs/quickstart/quickstart_chains/ethereum-bayc.md
index 09b53c702a2..91cc8404d2b 100644
--- a/docs/quickstart/quickstart_chains/ethereum-bayc.md
+++ b/docs/quickstart/quickstart_chains/ethereum-bayc.md
@@ -1,7 +1,5 @@
# Ethereum Quick Start - BAYC (Simple)
-## Goals
-
The goal of this article is to provide a comprehensive guide to setting up an indexer for the Bored Ape Yacht Club (BAYC) smart contract. By the end of this guide, readers will have a clear understanding of what BAYC is and why its smart contract data is valuable for indexing. This guide also shows how to set up an indexer, step by step, to track and index data from the BAYC smart contract on the Ethereum blockchain.
**This guide is designed to seamlessly lead you through the steps of configuring your personal BAYC SubQuery indexer.**
@@ -12,13 +10,13 @@ In this BAYC indexing project, our main goal is to set up the indexer to only co
The BAYC contract builds on [OpenZeppelin's ERC721](https://github.com/OpenZeppelin/openzeppelin-contracts/blob/master/contracts/token/ERC721/ERC721.sol) with special BAYC features. You can find the contract's source code on [Etherscan](https://etherscan.io/address/0xBC4CA0EdA7647A8aB7C2061c2E118A18a936f13D#code) or [Github](https://github.com/OpenZeppelin/openzeppelin-contracts/blob/master/contracts/token/ERC721/ERC721.sol) for easier reading.
-In the earlier section titled "Create a New Project" (refer to [quickstart.md](../quickstart.md)), you should have taken note of three crucial files. To initiate the setup of a project from scratch, you can proceed to follow the steps outlined in the [initialization description](../quickstart.md#2-initialise-a-new-subquery-project). As a prerequisite, you will need to generate types from the ABI files of each smart contract. You can obtain these ABI files by searching for the ABIs of the mentioned smart contract addresses on Etherscan. For instance, you can locate the ABI for the main BAYC smart contract at the bottom of [this page](https://etherscan.io/address/0xBC4CA0EdA7647A8aB7C2061c2E118A18a936f13D#code). Additionally, you can kickstart your project by using the EVM Scaffolding approach (detailed [here](../quickstart.md#evm-project-scaffolding)). You'll find all the relevant events to be scaffolded in the documentation for each type of smart contract.
+
::: tip Note
You can find the full and detailed code [here](https://github.com/subquery/ethereum-subql-starter/tree/main/Ethereum/ethereum-bayc) to see all the intricate details.
:::
-### 1.Configuring the Manifest File
+
You need to set up two handlers for this contract: a transaction handler for the `mintApe` function and a log handler for the `Transfer` log. Update your manifest file to look like this:
@@ -94,11 +92,9 @@ export default project;
As evident in the manifest file, this project includes two handlers: firstly, a transaction handler responsible for capturing the `mintApe` function, and secondly, a log handler tasked with indexing the `Transfer` log.
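
If you want to confirm that both handlers are wired up before adding the real logic (shown later in this guide), a pair of no-op skeletons is enough. The handler names below are placeholders - use whatever names your manifest declares - and the snippet assumes the ABI-generated types and the global `logger` provided by the SubQuery sandbox.

```ts
import {
  MintApeTransaction,
  TransferLog,
} from "../types/abi-interfaces/BaycAbi";

// Skeleton for the transaction handler filtering on the mintApe function.
export async function handleMint(tx: MintApeTransaction): Promise<void> {
  logger.info(`mintApe called in transaction ${tx.hash}`);
}

// Skeleton for the log handler filtering on the Transfer event.
export async function handleTransfer(log: TransferLog): Promise<void> {
  logger.info(`Transfer log found at block ${log.blockNumber}`);
}
```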
-::: tip Note
-Check out our [Manifest File](../../build/manifest/ethereum.md) documentation to get more information about the Project Manifest (`project.ts`) file.
-:::
+
-### 2. Updating the GraphQL Schema File
+
Now, let's think about what information we can get from this smart contract for later searching.
@@ -145,46 +141,19 @@ type Properties @jsonField {
Three entities are derived from the handlers mentioned earlier: `BoredApes`, `Mint` (used to store data associated with BoredApe transaction creation), and `Transfers` of BoredApes. The Bored Ape entity features a `currentOwner`, which changes with each transfer, and it includes properties like metadata stored on IPFS. Clearly, these apes were initially created using a specific function and may have been transferred, and this project monitors both types of transaction entities. Both the `Transfer` and `Mint` entities are associated with a `BoredApe`, enabling retrieval of all transfers and the `Mint` entity within the Bored Ape entity.
-::: tip Note
-Importantly, these relationships not only establish one-to-many connections but also extend to include many-to-many associations. To delve deeper into entity relationships, you can refer to [this section](../../build/graphql.md#entity-relationships). If you prefer a more example-based approach, our dedicated [Hero Course Module](../../academy/herocourse/module3.md) can provide further insights.
-:::
-
-SubQuery simplifies and ensures type-safety when working with GraphQL entities, smart contracts, events, transactions, and logs. The SubQuery CLI will generate types based on your project's GraphQL schema and any contract ABIs included in the data sources.
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn codegen
-```
-
-@tab npm
-
-```shell
-npm run-script codegen
-```
-
-:::
-
-This action will generate a new directory (or update the existing one) named `src/types`. Inside this directory, you will find automatically generated entity classes corresponding to each type defined in your `schema.graphql`. These classes facilitate type-safe operations for loading, reading, and writing entity fields. You can learn more about this process in [the GraphQL Schema section](../../build/graphql.md).
+
-You can conveniently import all these entities from the following directory:
+
```ts
import { Transfer, BoredApe, Properties, Mint } from "../types";
-```
-
-It will also generate a class for every contract event, offering convenient access to event parameters, as well as information about the block and transaction from which the event originated. You can find detailed information on how this is achieved in the [EVM Codegen from ABIs](../../build/introduction.md#evm-codegen-from-abis) section. All of these types are stored in the `src/types/abi-interfaces` and `src/types/contracts` directories.
-
-```ts
-// Import a smart contract event class generated from provided ABIs
import { TransferLog } from "../types/abi-interfaces/BaycAbi";
import { MintApeTransaction } from "../types/abi-interfaces/BaycAbi";
```
-### 3. Writing the Mappings
+
-Mapping functions define how blockchain data is transformed into the optimized GraphQL entities that we previously defined in the `schema.graphql` file.
+
::: tip Note
For more information on mapping functions, please refer to our [Mappings](../../build/mapping/ethereum.md) documentation.
@@ -337,71 +306,13 @@ Both handlers use `getOrCreateApe` function. It attempts to retrieve a `BoredApe
Check the final code repository [here](https://github.com/subquery/ethereum-subql-starter/tree/main/Ethereum/ethereum-bayc) to observe the integration of all previously mentioned configurations into a unified codebase.
:::
-## Build Your Project
-
-Next, build your work to run your new SubQuery project. Run the build command from the project's root directory as given here:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn build
-```
-
-@tab npm
-
-```shell
-npm run-script build
-```
-
-:::
-
-::: warning Important
-Whenever you make changes to your mapping functions, you must rebuild your project.
-:::
-
-Now, you are ready to run your first SubQuery project. Let’s check out the process of running your project in detail.
-
-## Run Your Project Locally with Docker
-
-Whenever you create a new SubQuery Project, first, you must run it locally on your computer and test it and using Docker is the easiest and quickiest way to do this.
-
-The `docker-compose.yml` file defines all the configurations that control how a SubQuery node runs. For a new project, which you have just initialised, you won't need to change anything.
-
-However, visit the [Running SubQuery Locally](../../run_publish/run.md) to get more information on the file and the settings.
-
-Run the following command under the project directory:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn start:docker
-```
-
-@tab npm
-
-```shell
-npm run-script start:docker
-```
-
-:::
-
-::: tip Note
-It may take a few minutes to download the required images and start the various nodes and Postgres databases.
-:::
-
-## Query your Project
-
-Next, let's query our project. Follow these three simple steps to query your SubQuery project:
+
-1. Open your browser and head to `http://localhost:3000`.
+
-2. You will see a GraphQL playground in the browser and the schemas which are ready to query.
+
-3. Find the _Docs_ tab on the right side of the playground which should open a documentation drawer. This documentation is automatically generated and it helps you find what entities and methods you can query.
-
-Try the following query to understand how it works for your new SubQuery starter project. Don’t forget to learn more about the [GraphQL Query language](../../run_publish/query.md).
+
```graphql
query {
@@ -436,14 +347,4 @@ query {
}
```
-## What's next?
-
-Congratulations! You have now a locally running SubQuery project that indexes the major BAYC entities and accepts GraphQL API requests for transferring data.
-
-::: tip Tip
-
-Find out how to build a performant SubQuery project and avoid common mistakes in [Project Optimisation](../../build/optimisation.md).
-
-:::
-
-Click [here](../../quickstart/whats-next.md) to learn what should be your **next step** in your SubQuery journey.
+
diff --git a/docs/quickstart/quickstart_chains/ethereum-chainlink.md b/docs/quickstart/quickstart_chains/ethereum-chainlink.md
index d750c0d485b..da9b102744a 100644
--- a/docs/quickstart/quickstart_chains/ethereum-chainlink.md
+++ b/docs/quickstart/quickstart_chains/ethereum-chainlink.md
@@ -2,8 +2,6 @@
[Chainlink](https://chain.link/) is a groundbreaking decentralized oracle network that empowers smart contracts to interact with real-world data seamlessly. At the heart of Chainlink's capabilities lies its [Data Feeds](https://chain.link/data-feeds), an essential component that bridges the gap between blockchain and external data sources.
-## Goals
-
This guide is a comprehensive, step-by-step walkthrough of setting up a SubQuery indexer specifically tailored to index data from Chainlink Data Feeds. We'll delve deep into the necessary configurations and explore the intricacies of the underlying logic. By the end of this guide, you'll have a clear understanding of how to index data for a complex oracle network like Chainlink.
This project is an excellent example of [SubQuery's Dynamic Data Sources](../../build/dynamicdatasources.md). Chainlink has a `ChainlinkFeedRegistry`, a factory contract that creates other Chainlink aggregator contracts. It is also a real-life example of how you can use SubQuery's contract type-generation to call contract functions defined in the smart contract ABIs.
@@ -18,9 +16,11 @@ In this ChainLink indexing project, our primary focus centers on configuring the
For a more comprehensive understanding of how these fundamental mechanisms operate, you can consult the official [Chainlink documentation](https://docs.chain.link/data-feeds/feed-registry).
-
+
+
+
-In the earlier section titled "Create a New Project" (refer to [quickstart.md](../quickstart.md)), you should have taken note of three crucial files. To initiate the setup of a project from scratch, you can proceed to follow the steps outlined in the [initialization description](../quickstart.md#2-initialise-a-new-subquery-project). As a prerequisite, you will need to generate types from the ABI files of each smart contract. You can obtain these ABI files by searching for the ABIs of the mentioned smart contract addresses on Etherscan. For instance, you can locate the ABI for **ChainlinkFeedRegistry** at the bottom of [this page](https://etherscan.io/address/0x47fb2585d2c56fe188d0e6ec628a38b74fceeedf#code). Additionally, you can kickstart your project by using the EVM Scaffolding approach (detailed [here](../quickstart.md#evm-project-scaffolding)). You'll find all the relevant events to be scaffolded in the documentation for each type of smart contract.
+You can locate the ABI for **ChainlinkFeedRegistry** at the bottom of [this page](https://etherscan.io/address/0x47fb2585d2c56fe188d0e6ec628a38b74fceeedf#code). Additionally, you can kickstart your project by using the EVM Scaffolding approach (detailed [here](../quickstart.md#evm-project-scaffolding)). You'll find all the relevant events to be scaffolded in the documentation for each type of smart contract.
::: tip Note
The code snippets provided further have been simplified for clarity. You can find the full and detailed code [here](https://github.com/subquery/ethereum-subql-starter/tree/main/Ethereum/ethereum-chainlink) to see all the intricate details.
@@ -30,7 +30,7 @@ The code snippets provided further have been simplified for clarity. You can fin
Consider the registry smart contract as a dictionary that comprehensively maps all the available feeds. When a new feed is added, this smart contract triggers an event, from which you can extract the address of the respective feed's smart contract.
-#### 1.Configuring the Manifest File
+
In plain language, you only need to set up one handler to index a specific type of log from this contract, which is the `FeedConfirmed` log. Update your manifest file to look like this:
@@ -72,11 +72,9 @@ In plain language, you only need to set up one handler to index a specific type
}
```
-::: tip Note
-Check out our [Manifest File](../../build/manifest/ethereum.md) documentation to get more information about the Project Manifest (`project.ts`) file.
-:::
+
-#### 2. Updating the GraphQL Schema File
+
Now, let's think about what information we can get from this smart contract for later searching. The only piece of information we can obtain is the 'DataFeed':
@@ -100,50 +98,19 @@ type DataFeed @entity {
As you look into these fields, you'll see that there's only one connection mentioned, which is in the `prices` field, and it links to another entity called `DataPoint`. We'll talk about this entity in the [next section](#data-feed-aggregator-contracts).
-::: tip Note
-Importantly, these relationships not only establish one-to-many connections but also extend to include many-to-many associations. To delve deeper into entity relationships, you can refer to [this section](../../build/graphql.md#entity-relationships). If you prefer a more example-based approach, our dedicated [Hero Course Module](../../academy/herocourse/module3.md) can provide further insights.
-:::
-
-SubQuery simplifies and ensures type-safety when working with GraphQL entities, smart contracts, events, transactions, and logs. The SubQuery CLI will generate types based on your project's GraphQL schema and any contract ABIs included in the data sources.
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn codegen
-```
-
-@tab npm
+
-```shell
-npm run-script codegen
-```
-
-:::
-
-This action will generate a new directory (or update the existing one) named `src/types`. Inside this directory, you will find automatically generated entity classes corresponding to each type defined in your `schema.graphql`. These classes facilitate type-safe operations for loading, reading, and writing entity fields. You can learn more about this process in [the GraphQL Schema section](../../build/graphql.md).
-
-You can conveniently import all these entities from the following directory:
+
```ts
// Import entity types generated from the GraphQL schema
import { DataFeed } from "./types";
-```
-
-It will also generate a class for every contract event, offering convenient access to event parameters, as well as information about the block and transaction from which the event originated. You can find detailed information on how this is achieved in the [EVM Codegen from ABIs](../../build/introduction.md#evm-codegen-from-abis) section. All of these types are stored in the `src/types/abi-interfaces` and `src/types/contracts` directories.
-
-```ts
-// Import a smart contract event class generated from provided ABIs
import { FeedConfirmedEvent } from "./types/contracts/FeedRegistry";
```
-#### 3. Writing the Mappings
-
-Mapping functions define how blockchain data is transformed into the optimized GraphQL entities that we previously defined in the `schema.graphql` file.
+
-::: tip Note
-For more information on mapping functions, please refer to our [Mappings](../../build/mapping/ethereum.md) documentation.
-:::
+
Writing mappings for this smart contract is a straightforward process. To provide better context, we've included this handler in a separate file `feed-registry.ts` within the `src/mappings` directory. Let's start by importing the necessary modules.
@@ -189,13 +156,13 @@ Explaining the code provided above, the `handleFeedConfirmed` function accepts a
It first retrieves the previous data feed using the address provided in the event. Then, it checks if the previous data feed is null (meaning it doesn't exist) or if the latest aggregator address in the event is not a zero address. If either condition is true, it creates a new data feed with various attributes and saves it to the database. If the previous data feed exists and the latest aggregator address is not a zero address, it updates the live status of the previous data feed and, if it's set to false, records the deprecation time.
-🎉 Now, you've effectively developed the handling logic for the data feed registry smart contract and populated queryable entity `DataFeed`. This means you're ready to move on to the [construction phase](#build-your-project) to test the indexer's functionality thus far.
+🎉 Now, you've effectively developed the handling logic for the data feed registry smart contract and populated the queryable `DataFeed` entity. This means you're ready to move on to the [build phase](#build-your-project) to test the indexer's functionality thus far.
### Data Feed Aggregator Contracts
As mentioned in the introduction to [Indexer Configuration](#setting-up-the-indexer), a fresh contract is linked to the [feed registry smart contract](#chainlinkfeedregistry) whenever a new feed is confirmed. We use SubQuery's [Dynamic Data Sources](../../build/dynamicdatasources.md) to create a new listener for each new price feed using the following template.
-#### 1. Configuring the Manifest File
+
The feed registry smart contract establishes a connection with a data feed contract for each new data feed. Consequently, we utilize [dynamic data sources](../../build/dynamicdatasources.md) to generate indexers for each new contract:
@@ -232,7 +199,7 @@ The feed registry smart contract establishes a connection with a data feed contr
}
```
-#### 2. Updating the GraphQL Schema File
+
Once more, from each newly linked smart contract, we will extract a single entity known as a `DataPoint`. You can expand the `schema.graphql` file to include it in the following way:
@@ -249,31 +216,14 @@ type DataPoint @entity {
In addition to the core attributes, we can observe the connection to the entity established in the [previous step](#chainlinkfeedregistry), which is `DataFeed`. As evident, there exists only one data feed for each data point, while multiple data points are associated with a single data feed.
-Now, the next step involves instructing the SubQuery CLI to generate types based on your project's updated GraphQL schema:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn codegen
-```
-
-@tab npm
-
-```shell
-npm run-script codegen
-```
-
-:::
-
-This will create update the existing `src/types` directory. All new entites can now be imported from the following directory:
+
```ts
import { AnswerUpdatedEvent } from "./types/contracts/AccessControlledOffchainAggregator";
import { DataFeed, DataPoint } from "./types";
```
-#### 3. Writing the Mappings
+
In this scenario, the mapping process involves two substeps:
@@ -349,71 +299,11 @@ This code defines a function, `handleAnswerUpdated`, which handles events relate
Check the final code repository [here](https://github.com/subquery/ethereum-subql-starter/tree/main/Ethereum/ethereum-chainlink) to observe the integration of all previously mentioned configurations into a unified codebase.
:::
-## Build Your Project
-
-Next, build your work to run your new SubQuery project. Run the build command from the project's root directory as given here:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn build
-```
-
-@tab npm
-
-```shell
-npm run-script build
-```
-
-:::
-
-::: warning Important
-Whenever you make changes to your mapping functions, you must rebuild your project.
-:::
-
-Now, you are ready to run your first SubQuery project. Let’s check out the process of running your project in detail.
+
-## Run Your Project Locally with Docker
+
-Whenever you create a new SubQuery Project, first, you must run it locally on your computer and test it and using Docker is the easiest and quickiest way to do this.
-
-The `docker-compose.yml` file defines all the configurations that control how a SubQuery node runs. For a new project, which you have just initialised, you won't need to change anything.
-
-However, visit the [Running SubQuery Locally](../../run_publish/run.md) to get more information on the file and the settings.
-
-Run the following command under the project directory:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn start:docker
-```
-
-@tab npm
-
-```shell
-npm run-script start:docker
-```
-
-:::
-
-::: tip Note
-It may take a few minutes to download the required images and start the various nodes and Postgres databases.
-:::
-
-## Query your Project
-
-Next, let's query our project. Follow these three simple steps to query your SubQuery project:
-
-1. Open your browser and head to `http://localhost:3000`.
-
-2. You will see a GraphQL playground in the browser and the schemas which are ready to query.
-
-3. Find the _Docs_ tab on the right side of the playground which should open a documentation drawer. This documentation is automatically generated and it helps you find what entities and methods you can query.
-
-Try the following queries to understand how it works for your new SubQuery starter project. Don’t forget to learn more about the [GraphQL Query language](../../run_publish/query.md).
+
::: details Query `dataFeeds`
@@ -587,14 +477,4 @@ If you take the value listed under `price` and divide it by 10 to the power of t
:::
-## What's next?
-
-Congratulations! You have now a locally running SubQuery project that indexes the major Chainlink Data Feeds entities and accepts GraphQL API requests for transferring data.
-
-::: tip Tip
-
-Find out how to build a performant SubQuery project and avoid common mistakes in [Project Optimisation](../../build/optimisation.md).
-
-:::
-
-Click [here](../../quickstart/whats-next.md) to learn what should be your **next step** in your SubQuery journey.
+
diff --git a/docs/quickstart/quickstart_chains/ethereum-ens.md b/docs/quickstart/quickstart_chains/ethereum-ens.md
index 3eb1f84e10a..654d42c670e 100644
--- a/docs/quickstart/quickstart_chains/ethereum-ens.md
+++ b/docs/quickstart/quickstart_chains/ethereum-ens.md
@@ -1,20 +1,14 @@
# Ethereum Quick Start - ENS (Complex)
-## Goals
+This project can be used as a starting point for developing your new Ethereum SubQuery project. It indexes all ENS Records in the ENS registry.
-This project can be use as a starting point for developing your new Ethereum SubQuery project, it indexes all ENS Records in the ENS registry
+
-
+
Now, let's move forward and fork the example code for this project from [here](https://github.com/subquery/ethereum-subql-starter/tree/main/Ethereum/ethereum-ens).
-## 1. Your Project Manifest File
-
-The Project Manifest (`project.ts`) file works as an entry point to your Ethereum project. It defines most of the details on how SubQuery will index and transform the chain data. For Ethereum, there are three types of mapping handlers (and you can have more than one in each project):
-
-- [BlockHanders](../../build/manifest/ethereum.md#mapping-handlers-and-filters): On each and every block, run a mapping function
-- [TransactionHandlers](../../build/manifest/ethereum.md#mapping-handlers-and-filters): On each and every transaction that matches optional filter criteria, run a mapping function
-- [LogHanders](../../build/manifest/ethereum.md#mapping-handlers-and-filters): On each and every log that matches optional filter criteria, run a mapping function
+
The main concept in this ENS project is that it only indexes logs from ENS' various smart contracts. LogHandlers are the most common type of handler for Ethereum, as this example project shows: there are a total of 31 different log handlers in this project.
@@ -145,11 +139,9 @@ Secondly, note that there are 7 different ABIs imported into this project. We gi
}
```
-Check out our [Manifest File](../../build/manifest/ethereum.md) documentation to get more information about the Project Manifest (`project.ts`) file.
+
-## 2. Update Your GraphQL Schema File
-
-The `schema.graphql` file determines the shape of your data from SubQuery due to the mechanism of the GraphQL query language. Hence, updating the GraphQL Schema file is the perfect place to start. It allows you to define your end goal right at the start.
+
You'll see that there are 33 GraphQL entities in the ENS project, with many foreign key relationships between them. Take, for example, the `Domain` and `DomainEvent` entities. There is a one-to-many relationship between `Domain` and `DomainEvent`, and there is also a one-to-many relationship that `Domain` has with itself (via the `parent` property). We've even created a virtual `subdomains` field that can be used to navigate between the GraphQL entities.
@@ -170,28 +162,11 @@ type DomainEvent @entity {
}
```
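
In the mappings, this self-relation is nothing more than a foreign key: a subdomain stores its parent's id, and the virtual `subdomains` field on the parent is derived from those rows at query time. Below is a minimal, hypothetical sketch, assuming the generated `Domain` model exposes a `parentId` property for the `parent` relation.

```ts
import { Domain } from "../types";

// Attach an already-indexed domain to its parent; the parent's derived
// `subdomains` field will then include it automatically when queried.
export async function linkSubdomain(
  childId: string,
  parentId: string
): Promise<void> {
  const child = await Domain.get(childId);
  if (!child) return; // nothing to link if the child has not been indexed yet
  child.parentId = parentId;
  await child.save();
}
```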
-::: warning Important
-When you make any changes to the schema file, please ensure that you regenerate your types directory.
-:::
-
-SubQuery makes it easy and type-safe to work with your GraphQL entities, as well as smart contracts, events, transactions, and logs. SubQuery CLI will generate types from your project's GraphQL schema and any contract ABIs included in the data sources.
-
-::: code-tabs
-@tab:active yarn
+
-```shell
-yarn codegen
-```
+
-@tab npm
-
-```shell
-npm run-script codegen
-```
-
-:::
-
-This will create a new directory (or update the existing) `src/types` which contain generated entity classes for each type you have defined previously in `schema.graphql`. These classes provide type-safe entity loading, read and write access to entity fields - see more about this process in [the GraphQL Schema](../../build/graphql.md). All entites can be imported from the following directory:
+All entities can be imported from the following directory:
```ts
// Import entity types generated from the GraphQL schema
@@ -204,16 +179,6 @@ import {
NewResolver,
NewTTL,
} from "../types";
-```
-
-As you're creating a new Etheruem based project, this command will also generate ABI types and save them into `src/types` using the `npx typechain --target=ethers-v5` command, allowing you to bind these contracts to specific addresses in the mappings and call read-only contract methods against the block being processed.
-
-It will also generate a class for every contract event to provide easy access to event parameters, as well as the block and transaction the event originated from. Read about how this is done in [EVM Codegen from ABIs](../../build/introduction.md#evm-codegen-from-abis).
-
-All of these types are written to `src/typs/abi-interfaces` and `src/typs/contracts` directories. In the example Gravatar SubQuery project, you would import these types like so.
-
-```ts
-// Import event types from the registry contract ABI
import {
NewOwnerEvent,
TransferEvent,
@@ -222,81 +187,19 @@ import {
} from "../types/contracts/Registry";
```
-Check out the [GraphQL Schema](../../build/graphql.md) documentation to get in-depth information on `schema.graphql` file.
-
-## 3. Add a Mapping Function
+
-Mapping functions define how chain data is transformed into the optimised GraphQL entities that we previously defined in the `schema.graphql` file.
+
They operate in a similar way to SubGraphs, and with ENS you can see that they are contained in 4 different files, with the addition of a helper `utils.ts`.
-Check out our [Mappings](../../build/mapping/ethereum.md) documentation to get more information on mapping functions.
-
-## 4. Build Your Project
-
-Next, build your work to run your new SubQuery project. Run the build command from the project's root directory as given here:
-
-::: code-tabs
-@tab:active yarn
+
-```shell
-yarn build
-```
-
-@tab npm
-
-```shell
-npm run-script build
-```
+
-:::
-
-::: warning Important
-Whenever you make changes to your mapping functions, you must rebuild your project.
-:::
+
-Now, you are ready to run your first SubQuery project. Let’s check out the process of running your project in detail.
-
-## 5. Run Your Project Locally with Docker
-
-Whenever you create a new SubQuery Project, first, you must run it locally on your computer and test it and using Docker is the easiest and quickiest way to do this.
-
-The `docker-compose.yml` file defines all the configurations that control how a SubQuery node runs. For a new project, which you have just initialised, you won't need to change anything.
-
-However, visit the [Running SubQuery Locally](../../run_publish/run.md) to get more information on the file and the settings.
-
-Run the following command under the project directory:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn start:docker
-```
-
-@tab npm
-
-```shell
-npm run-script start:docker
-```
-
-:::
-
-::: tip Note
-It may take a few minutes to download the required images and start the various nodes and Postgres databases.
-:::
-
-## 6. Query your Project
-
-Next, let's query our project. Follow these three simple steps to query your SubQuery project:
-
-1. Open your browser and head to `http://localhost:3000`.
-
-2. You will see a GraphQL playground in the browser and the schemas which are ready to query.
-
-3. Find the _Docs_ tab on the right side of the playground which should open a documentation drawer. This documentation is automatically generated and it helps you find what entities and methods you can query.
-
-Try the following query to understand how it works for your new SubQuery starter project. Don’t forget to learn more about the [GraphQL Query language](../../run_publish/query.md).
+
```graphql
query {
@@ -389,14 +292,4 @@ query {
The final code of this project can be found [here](https://github.com/subquery/ethereum-subql-starter/tree/main/Ethereum/ethereum-ens).
:::
-## What's next?
-
-Congratulations! You have now a locally running SubQuery project that accepts GraphQL API requests for transferring data.
-
-::: tip Tip
-
-Find out how to build a performant SubQuery project and avoid common mistakes in [Project Optimisation](../../build/optimisation.md).
-
-:::
-
-Click [here](../../quickstart/whats-next.md) to learn what should be your **next step** in your SubQuery journey.
+
diff --git a/docs/quickstart/quickstart_chains/ethereum-gravatar.md b/docs/quickstart/quickstart_chains/ethereum-gravatar.md
index 6c3bc724a4e..b658f31548e 100644
--- a/docs/quickstart/quickstart_chains/ethereum-gravatar.md
+++ b/docs/quickstart/quickstart_chains/ethereum-gravatar.md
@@ -1,16 +1,8 @@
# Ethereum Quick Start - Gravatar (Simple)
-## Goals
-
The goal of this quick start guide is to index all Ethereum Gravatars created or updated on the Ethereum mainnet.
-::: warning
-Before we begin, make sure that you have initialised your project using the provided steps in the [Start Here](../quickstart.md) section. **Please initialise a Ethereum project**
-:::
-
-Now, let's move forward and update these configurations.
-
-Previously, in the [1. Create a New Project](../quickstart.md) section, you must have noted [3 key files](../quickstart.md#_3-make-changes-to-your-project). Let's begin updating them one by one.
+
::: tip Note
The final code of this project can be found [here](https://github.com/subquery/ethereum-subql-starter/tree/main/Ethereum/ethereum-gravatar).
@@ -22,15 +14,7 @@ The final code of this project can be found [here](https://github.com/subquery/e
-## 1. Update Your Project Manifest File
-
-The Project Manifest (`project.ts`) file works as an entry point to your Ethereum project. It defines most of the details on how SubQuery will index and transform the chain data. For Ethereum, there are three types of mapping handlers (and you can have more than one in each project):
-
-- [BlockHanders](../../build/manifest/ethereum.md#mapping-handlers-and-filters): On each and every block, run a mapping function
-- [TransactionHandlers](../../build/manifest/ethereum.md#mapping-handlers-and-filters): On each and every transaction that matches optional filter criteria, run a mapping function
-- [LogHanders](../../build/manifest/ethereum.md#mapping-handlers-and-filters): On each and every log that matches optional filter criteria, run a mapping function
-
-Note that the manifest file has already been set up correctly and doesn’t require significant changes, but you need to import the correct contract definitions and update the datasource handlers.
+
Since we are indexing all Gravatars from the Gravatar contract, the first step is to import the contract ABI definition. Copy the entire JSON and save it as a file called `./Gravity.json` in the `/abis` directory.
@@ -77,11 +61,9 @@ This section in the Project Manifest now imports all the correct definitions and
The above code indicates that you will be running a `handleLog` mapping function whenever there is a `NewGravatar` or `UpdatedGravatar` log on any transaction from the [Gravatar contract](https://etherscan.io/address/0x2E645469f354BB4F5c8a05B3b30A929361cf77eC).
-Check out our [Manifest File](../../build/manifest/ethereum.md) documentation to get more information about the Project Manifest (`project.ts`) file.
-
-## 2. Update Your GraphQL Schema File
+
-The `schema.graphql` file determines the shape of your data from SubQuery due to the mechanism of the GraphQL query language. Hence, updating the GraphQL Schema file is the perfect place to start. It allows you to define your end goal right at the start.
+
Remove all existing entities and update the `schema.graphql` file as follows. Here you can see we are indexing the id, owner, display name, image URL and the block the gravatar was created in.
@@ -95,47 +77,21 @@ type Gravatar @entity {
}
```
-::: warning Important
-When you make any changes to the schema file, please ensure that you regenerate your types directory.
-:::
-
-SubQuery makes it easy and type-safe to work with your GraphQL entities, as well as smart contracts, events, transactions, and logs. SubQuery CLI will generate types from your project's GraphQL schema and any contract ABIs included in the data sources.
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn codegen
-```
-
-@tab npm
-
-```shell
-npm run-script codegen
-```
-
-:::
+
-This will create a new directory (or update the existing) `src/types` which contain generated entity classes for each type you have defined previously in `schema.graphql`. These classes provide type-safe entity loading, read and write access to entity fields - see more about this process in [the GraphQL Schema](../../build/graphql.md). All entities can be imported from the following directory:
+
```ts
import { Gravatar } from "../types";
-```
-
-As you're creating a new EVM based project, this command will also generate ABI types and save them into `src/types` using the `npx typechain --target=ethers-v5` command, allowing you to bind these contracts to specific addresses in the mappings and call read-only contract methods against the block being processed.
-
-It will also generate a class for every contract event to provide easy access to event parameters, as well as the block and transaction the event originated from. Read about how this is done in [EVM Codegen from ABIs](../../build/introduction.md#evm-codegen-from-abis).
-
-In the example Gravatar SubQuery project, you would import these types like so.
-
-```ts
import {
NewGravatarLog,
UpdatedGravatarLog,
} from "../types/abi-interfaces/Gravity";
```
-Check out the [GraphQL Schema](../../build/graphql.md) documentation to get in-depth information on `schema.graphql` file.
+
+
+
Now that you have made the essential changes to the GraphQL Schema file, let’s proceed with configuring the mapping functions.
@@ -205,73 +161,13 @@ For `handleNewGravatar`, the function here receives an `NewGravatarEvent` which
For `handleUpdatedGravatar`, the function receives an `UpdatedGravatarEvent` which includes transaction log data in the payload. We extract this data and first check whether the Gravatar already exists; if not, we instantiate a new one, and then update that Gravatar with the correct details. This is then saved to the store using the `.save()` function (_Note that SubQuery will automatically save this to the database_).
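To make that flow concrete, here is a minimal sketch of what the two handlers can look like, assuming the generated `Gravatar` entity and the typed log classes imported above (field names follow the schema described earlier; check the linked final code for the authoritative implementation):

```ts
import { Gravatar } from "../types";
import {
  NewGravatarLog,
  UpdatedGravatarLog,
} from "../types/abi-interfaces/Gravity";

export async function handleNewGravatar(log: NewGravatarLog): Promise<void> {
  // The log args are decoded from the Gravity ABI: id, owner, displayName, imageUrl
  if (!log.args) return;
  const gravatar = Gravatar.create({
    id: log.args.id.toHexString(),
    owner: log.args.owner,
    displayName: log.args.displayName,
    imageUrl: log.args.imageUrl,
    createdBlock: BigInt(log.blockNumber),
  });
  await gravatar.save();
}

export async function handleUpdatedGravatar(log: UpdatedGravatarLog): Promise<void> {
  if (!log.args) return;
  const id = log.args.id.toHexString();
  // Load the existing Gravatar, or instantiate one if it has not been seen before
  let gravatar = await Gravatar.get(id);
  if (!gravatar) {
    gravatar = Gravatar.create({
      id,
      owner: log.args.owner,
      displayName: log.args.displayName,
      imageUrl: log.args.imageUrl,
      createdBlock: BigInt(log.blockNumber),
    });
  } else {
    gravatar.owner = log.args.owner;
    gravatar.displayName = log.args.displayName;
    gravatar.imageUrl = log.args.imageUrl;
  }
  await gravatar.save();
}
```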
-Check out our [Mappings](../../build/mapping/ethereum.md) documentation to get more information on mapping functions.
-
-## 4. Build Your Project
-
-Next, build your work to run your new SubQuery project. Run the build command from the project's root directory as given here:
-
-::: code-tabs
-@tab:active yarn
+
-```shell
-yarn build
-```
-
-@tab npm
+
-```shell
-npm run-script build
-```
+
-:::
-
-::: warning Important
-Whenever you make changes to your mapping functions, you must rebuild your project.
-:::
-
-Now, you are ready to run your first SubQuery project. Let’s check out the process of running your project in detail.
-
-## 5. Run Your Project Locally with Docker
-
-Whenever you create a new SubQuery Project, first, you must run it locally on your computer and test it and using Docker is the easiest and quickiest way to do this.
-
-The `docker-compose.yml` file defines all the configurations that control how a SubQuery node runs. For a new project, which you have just initialised, you won't need to change anything.
-
-However, visit the [Running SubQuery Locally](../../run_publish/run.md) to get more information on the file and the settings.
-
-Run the following command under the project directory:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn start:docker
-```
-
-@tab npm
-
-```shell
-npm run-script start:docker
-```
-
-:::
-
-::: tip Note
-It may take a few minutes to download the required images and start the various nodes and Postgres databases.
-:::
-
-## 6. Query your Project
-
-Next, let's query our project. Follow these three simple steps to query your SubQuery project:
-
-1. Open your browser and head to `http://localhost:3000`.
-
-2. You will see a GraphQL playground in the browser and the schemas which are ready to query.
-
-3. Find the _Docs_ tab on the right side of the playground which should open a documentation drawer. This documentation is automatically generated and it helps you find what entities and methods you can query.
-
-Try the following query to understand how it works for your new SubQuery starter project. Don’t forget to learn more about the [GraphQL Query language](../../run_publish/query.md).
+
```graphql
query {
@@ -318,14 +214,4 @@ You will see the result similar to below:
The final code of this project can be found [here](https://github.com/subquery/ethereum-subql-starter/tree/main/Ethereum/ethereum-gravatar).
:::
-## What's next?
-
-Congratulations! You have now a locally running SubQuery project that accepts GraphQL API requests for transferring data.
-
-::: tip Tip
-
-Find out how to build a performant SubQuery project and avoid common mistakes in [Project Optimisation](../../build/optimisation.md).
-
-:::
-
-Click [here](../../quickstart/whats-next.md) to learn what should be your **next step** in your SubQuery journey.
+
diff --git a/docs/quickstart/quickstart_chains/ethereum-opensea.md b/docs/quickstart/quickstart_chains/ethereum-opensea.md
index c4e8318723f..3cc89f7d450 100644
--- a/docs/quickstart/quickstart_chains/ethereum-opensea.md
+++ b/docs/quickstart/quickstart_chains/ethereum-opensea.md
@@ -1,7 +1,5 @@
# Ethereum Quick Start - Opensea (Medium)
-## Goals
-
Welcome to our comprehensive step-by-step guide dedicated to constructing a SubQuery indexer tailored for the OpenSea marketplace. OpenSea has emerged as a global epicenter for NFTs, establishing itself as a vibrant ecosystem for creators, collectors, and traders alike.
**This guide is designed to seamlessly lead you through the steps of configuring your personal OpenSea SubQuery indexer.**
@@ -14,15 +12,17 @@ In this Seaport indexing project, our main goal is to set up the indexer to only
For a more comprehensive understanding of how these fundamental protocol mechanisms operate, you can consult the official [Seaport documentation](https://docs.opensea.io/reference/seaport-overview).
-
+
+
+
-In the earlier section titled "Create a New Project" (refer to [quickstart.md](../quickstart.md)), you should have taken note of three crucial files. To initiate the setup of a project from scratch, you can proceed to follow the steps outlined in the [initialization description](../quickstart.md#2-initialise-a-new-subquery-project). As a prerequisite, you will need to generate types from the ABI files of each smart contract. You can obtain these ABI files by searching for the ABIs of the mentioned smart contract addresses on Etherscan. For instance, you can locate the ABI for the main Seaport smart contract at the bottom of [this page](https://etherscan.io/address/0x00000000006c3852cbef3e08e8df289169ede581#code). Additionally, you can kickstart your project by using the EVM Scaffolding approach (detailed [here](../quickstart.md#evm-project-scaffolding)). You'll find all the relevant events to be scaffolded in the documentation for each type of smart contract.
+You can locate the ABI for the main Seaport smart contract at the bottom of [this page](https://etherscan.io/address/0x00000000006c3852cbef3e08e8df289169ede581#code). Additionally, you can kickstart your project by using the EVM Scaffolding approach (detailed [here](../quickstart.md#evm-project-scaffolding)). You'll find all the relevant events to be scaffolded in the documentation for each type of smart contract.
::: tip Note
The code snippets provided below have been simplified for clarity. You can find the full and detailed code [here](https://github.com/subquery/ethereum-subql-starter/tree/main/Ethereum/ethereum-opensea) to see all the intricate details.
:::
-### 1.Configuring the Manifest File
+
You only need to set up one handler to index a specific type of log from this contract, which is the `OrderFulfilled` log. Update your manifest file to look like this:
@@ -62,11 +62,9 @@ You only need to set up one handler to index a specific type of log from this co
}
```
-::: tip Note
-Check out our [Manifest File](../../build/manifest/ethereum.md) documentation to get more information about the Project Manifest (`project.ts`) file.
-:::
+
-### 2. Updating the GraphQL Schema File
+
Now, let's think about what information we can get from this smart contract for later searching.
@@ -155,30 +153,9 @@ type _Item @entity {
From the single log we're working with, there's a wealth of information to extract. Notably, there's the `Trade` entity, which signifies the buying and selling activities within a protocol. This entity, as shown in the schema, serves as a link to other entities like `Collection` and `SaleStrategy`. Furthermore, we've included statistical entities, such as `CollectionDailySnapshot` and `MarketplaceDailySnapshot`, to streamline the overall analysis of the protocol's economic dynamics.
-::: tip Note
-Importantly, these relationships not only establish one-to-many connections but also extend to include many-to-many associations. To delve deeper into entity relationships, you can refer to [this section](../../build/graphql.md#entity-relationships). If you prefer a more example-based approach, our dedicated [Hero Course Module](../../academy/herocourse/module3.md) can provide further insights.
-:::
-
-SubQuery simplifies and ensures type-safety when working with GraphQL entities, smart contracts, events, transactions, and logs. The SubQuery CLI will generate types based on your project's GraphQL schema and any contract ABIs included in the data sources.
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn codegen
-```
-
-@tab npm
-
-```shell
-npm run-script codegen
-```
-
-:::
-
-This action will generate a new directory (or update the existing one) named `src/types`. Inside this directory, you will find automatically generated entity classes corresponding to each type defined in your `schema.graphql`. These classes facilitate type-safe operations for loading, reading, and writing entity fields. You can learn more about this process in [the GraphQL Schema section](../../build/graphql.md).
+
-You can conveniently import all these entities from the following directory:
+
```ts
// Import entity types generated from the GraphQL schema
@@ -193,22 +170,12 @@ import {
_OrderFulfillmentMethod,
NftStandard,
} from "../types";
-```
-
-It will also generate a class for every contract event, offering convenient access to event parameters, as well as information about the block and transaction from which the event originated. You can find detailed information on how this is achieved in the [EVM Codegen from ABIs](../../build/introduction.md#evm-codegen-from-abis) section. All of these types are stored in the `src/types/abi-interfaces` and `src/types/contracts` directories.
-
-```ts
-// Import a smart contract event class generated from provided ABIs
import { OrderFulfilledLog } from "../types/abi-interfaces/SeaportExchangeAbi";
```
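With these imports in place, the typed `OrderFulfilledLog` becomes the parameter type of the log handler. Below is a skeletal example of what such a handler signature can look like; the handler name `handleOrderFulfilled` is an assumption for illustration, and the real implementation in the linked repository performs the entity creation described below:

```ts
import { OrderFulfilledLog } from "../types/abi-interfaces/SeaportExchangeAbi";

export async function handleOrderFulfilled(log: OrderFulfilledLog): Promise<void> {
  // log.args exposes the decoded OrderFulfilled parameters from the Seaport ABI
  // (orderHash, offerer, recipient, and the offer/consideration item arrays),
  // which the mapping turns into Trade and related entities.
  if (!log.args) return;
  logger.info(`OrderFulfilled in tx ${log.transactionHash}`);
}
```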
-### 3. Writing the Mappings
+
-Mapping functions define how blockchain data is transformed into the optimized GraphQL entities that we previously defined in the `schema.graphql` file.
-
-::: tip Note
-For more information on mapping functions, please refer to our [Mappings](../../build/mapping/ethereum.md) documentation.
-:::
+
Writing mappings for this smart contract is a straightforward process. To provide better context, we've included this handler in a separate file `mapping.ts` within the `src/mappings` directory. Let's start by importing the necessary modules.
@@ -342,71 +309,11 @@ This code snippet demonstrates how trade events within the Seaport marketplace a
Check the final code repository [here](https://github.com/subquery/ethereum-subql-starter/tree/main/Ethereum/ethereum-opensea) to observe the integration of all previously mentioned configurations into a unified codebase.
:::
-## Build Your Project
+
-Next, build your work to run your new SubQuery project. Run the build command from the project's root directory as given here:
+
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn build
-```
-
-@tab npm
-
-```shell
-npm run-script build
-```
-
-:::
-
-::: warning Important
-Whenever you make changes to your mapping functions, you must rebuild your project.
-:::
-
-Now, you are ready to run your first SubQuery project. Let’s check out the process of running your project in detail.
-
-## Run Your Project Locally with Docker
-
-Whenever you create a new SubQuery Project, first, you must run it locally on your computer and test it and using Docker is the easiest and quickiest way to do this.
-
-The `docker-compose.yml` file defines all the configurations that control how a SubQuery node runs. For a new project, which you have just initialised, you won't need to change anything.
-
-However, visit the [Running SubQuery Locally](../../run_publish/run.md) to get more information on the file and the settings.
-
-Run the following command under the project directory:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn start:docker
-```
-
-@tab npm
-
-```shell
-npm run-script start:docker
-```
-
-:::
-
-::: tip Note
-It may take a few minutes to download the required images and start the various nodes and Postgres databases.
-:::
-
-## Query your Project
-
-Next, let's query our project. Follow these three simple steps to query your SubQuery project:
-
-1. Open your browser and head to `http://localhost:3000`.
-
-2. You will see a GraphQL playground in the browser and the schemas which are ready to query.
-
-3. Find the _Docs_ tab on the right side of the playground which should open a documentation drawer. This documentation is automatically generated and it helps you find what entities and methods you can query.
-
-Try the following queries to understand how it works for your new SubQuery starter project. Don’t forget to learn more about the [GraphQL Query language](../../run_publish/query.md).
+
::: details Trades
@@ -507,14 +414,4 @@ query {
:::
-## What's next?
-
-Congratulations! You have now a locally running SubQuery project that indexes the major Opensea Seaport entities and accepts GraphQL API requests for transferring data.
-
-::: tip Tip
-
-Find out how to build a performant SubQuery project and avoid common mistakes in [Project Optimisation](../../build/optimisation.md).
-
-:::
-
-Click [here](../../quickstart/whats-next.md) to learn what should be your **next step** in your SubQuery journey.
+
diff --git a/docs/quickstart/quickstart_chains/ethereum-uniswap.md b/docs/quickstart/quickstart_chains/ethereum-uniswap.md
index e70ea8cd7bd..6382a5ac326 100644
--- a/docs/quickstart/quickstart_chains/ethereum-uniswap.md
+++ b/docs/quickstart/quickstart_chains/ethereum-uniswap.md
@@ -1,7 +1,5 @@
# Ethereum Quick Start - Uniswap (Complex)
-## Goals
-
Uniswap is one of the leading decentralised exchanges (DEXs) in web3 and one that relies on indexers to serve data to its UI so users can interact with it. By systematically organizing tokens, liquidity pools, transactions, and other essential information, indexers like SubQuery provide users with a quick and efficient means to search, find, and analyze data within Uniswap.
The objective of this article is to offer a detailed, step-by-step guide on setting up a SubQuery indexer for the Uniswap v3 protocol. We will comprehensively cover the necessary configurations and delve into the intricacies of the underlying logic. It's an excellent example of how to index a complex DEX like Uniswap.
@@ -18,9 +16,9 @@ In this Uniswap indexing project, our main focus is on configuring the indexer t
To gain a deeper understanding of how these core mechanisms work, you can refer to the official [Uniswap documentation](https://docs.uniswap.org/contracts/v3/reference/deployments).
-
+
-In the earlier section titled "Create a New Project" (refer to [quickstart.md](../quickstart.md)), you should have taken note of three crucial files. To initiate the setup of a project from scratch, you can proceed to follow the steps outlined in the [initialization description](../quickstart.md#2-initialise-a-new-subquery-project). As a prerequisite, you will need to generate types from the ABI files of each smart contract. You can obtain these ABI files by searching for the ABIs of the mentioned smart contract addresses on Etherscan. For instance, you can locate the ABI for **UniswapV3Factory** at the bottom of [this page](https://etherscan.io/address/0x1F98431c8aD98523631AE4a59f267346ea31F984#code). Additionally, you can kickstart your project by using the EVM Scaffolding approach (detailed [here](../quickstart.md#evm-project-scaffolding)). You'll find all the relevant events to be scaffolded in the documentation for each type of smart contract, as described in sections [1](#1-configuring-the-manifest-file), [2](#1-configuring-the-manifest-file-1), and [3](#1configuring-the-manifest-file).
+
::: tip Note
The code snippets provided below have been simplified for clarity. You can find the full and detailed code [here](https://github.com/subquery/ethereum-subql-starter/tree/main/Ethereum/ethereum-uniswap-v3) to see all the intricate details.
@@ -30,7 +28,7 @@ The code snippets provided further have been simplified for clarity. You can fin
The core role of the factory contract is to generate liquidity pool smart contracts. Each pool comprises a pair of two tokens, uniting to create an asset pair, and is associated with a specific fee rate. It's important to emphasize that multiple pools can exist with the same asset pair, distinguished solely by their unique swap fees.
-#### 1.Configuring the Manifest File
+
In simple terms, there's only one event that requires configuration, and that's the `PoolCreated` event. After adding this event to the manifest file, it will be represented as follows:
@@ -73,11 +71,9 @@ In simple terms, there's only one event that requires configuration, and that's
}
```
-::: tip Note
-Check out our [Manifest File](../../build/manifest/ethereum.md) documentation to get more information about the Project Manifest (`project.ts`) file.
-:::
+
-#### 2. Updating the GraphQL Schema File
+
Now, let's consider the entities that we can extract from the factory smart contract for subsequent querying. The most obvious ones include:
@@ -161,50 +157,13 @@ The attributes mentioned above represent only a subset of the available attribut
As you explore these attributes, you may notice the relationship between the `Pool` and `Token` entities. Additionally, you'll find numerous derived attributes like `mints` or `swaps`.
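In the generated entity classes, a relation such as `Pool.token0` is persisted by setting the corresponding foreign-key property, while `@derivedFrom` fields like `mints` or `swaps` are reverse lookups that are resolved at query time rather than written from mappings. Here is a small, hedged illustration, assuming the entity and field names from the schema excerpt above:

```ts
import { Pool } from "../types";

// Link an existing Pool to an existing Token by id. Relation fields are written
// through their generated `<field>Id` properties; derived fields are never set here.
async function linkPoolToToken0(poolId: string, tokenId: string): Promise<void> {
  const pool = await Pool.get(poolId);
  if (!pool) return;
  pool.token0Id = tokenId;
  await pool.save();
}
```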
-::: tip Note
-Importantly, these relationships not only establish one-to-many connections but also extend to include many-to-many associations. To delve deeper into entity relationships, you can refer to [this section](../../build/graphql.md#entity-relationships). If you prefer a more example-based approach, our dedicated [Hero Course Module](../../academy/herocourse/module3.md) can provide further insights.
-:::
-
-SubQuery simplifies and ensures type-safety when working with GraphQL entities, smart contracts, events, transactions, and logs. The SubQuery CLI will generate types based on your project's GraphQL schema and any contract ABIs included in the data sources.
+
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn codegen
-```
-
-@tab npm
-
-```shell
-npm run-script codegen
-```
-
-:::
-
-This action will generate a new directory (or update the existing one) named `src/types`. Inside this directory, you will find automatically generated entity classes corresponding to each type defined in your `schema.graphql`. These classes facilitate type-safe operations for loading, reading, and writing entity fields. You can learn more about this process in [the GraphQL Schema section](../../build/graphql.md).
-
-You can conveniently import all these entities from the following directory:
+
```ts
// Import entity types generated from the GraphQL schema
import { Factory, Pool, Token } from "../types";
-```
-
-It will also generate a class for every contract event, offering convenient access to event parameters, as well as information about the block and transaction from which the event originated. You can find detailed information on how this is achieved in the [EVM Codegen from ABIs](../../build/introduction.md#evm-codegen-from-abis) section. All of these types are stored in the `src/types/abi-interfaces` and `src/types/contracts` directories.
-
-#### 3. Writing the Mappings
-
-Mapping functions define how blockchain data is transformed into the optimized GraphQL entities that we previously defined in the `schema.graphql` file.
-
-::: tip Note
-For more information on mapping functions, please refer to our [Mappings](../../build/mapping/ethereum.md) documentation.
-:::
-
-Writing mappings for the factory smart contract is a straightforward process. To provide better context, we've included this handler in a separate file `factory.ts` within the `src/mappings` directory. Let's start by importing the necessary modules.
-
-```ts
-// Import event types from the registry contract ABI
import {
Pool,
Token,
@@ -215,6 +174,10 @@ import { EthereumLog } from "@subql/types-ethereum";
import { PoolCreatedEvent } from "../types/contracts/Factory";
```
+
+
+
+
`Pool`, `Factory`, and `Token` are models that were generated in a [prior step](#2-updating-graphql-schema-file). `PoolCreatedEvent` and `EthereumLog` are TypeScript models generated by the SubQuery SDK to facilitate event handling.
As a reminder from the configuration step outlined in the [Manifest File](#1configuring-manifest-file), we have a single handler called `handlePoolCreated`. Now, let's proceed with its implementation:
@@ -315,7 +278,7 @@ Throughout this mapping and those that follow, numerous utility functions are em
As we discussed in the introduction of [Configuring the Indexer](#configuring-the-indexer), a new contract is created by the [factory contract](#uniswapv3factory) for each newly created pool.
-#### 1. Configuring the Manifest File
+
The contract factory generates a fresh contract instance for each new pool, so we use [dynamic data sources](../../build/dynamicdatasources.md) to create an indexer for each new contract:
@@ -386,7 +349,7 @@ The contract factory generates fresh contract instances for each new pool, there
}
```
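To illustrate how this template is used at runtime, below is a hedged sketch of the factory handler instantiating a new `Pool` data source for every `PoolCreated` log. It assumes the `createPoolDatasource` helper that SubQuery codegen generates for a template named `Pool`; the full implementation is in the linked repository:

```ts
import { EthereumLog } from "@subql/types-ethereum";
import { PoolCreatedEvent } from "../types/contracts/Factory";
// Assumption: codegen exposes a create<TemplateName>Datasource helper for the "Pool" template
import { createPoolDatasource } from "../types";

export async function handlePoolCreated(
  event: EthereumLog<PoolCreatedEvent["args"]>
): Promise<void> {
  if (!event.args) return;

  // Instantiate the "Pool" template for the newly deployed pool contract so that
  // its own Mint/Burn/Swap logs are indexed from this block onwards.
  await createPoolDatasource({
    address: event.args.pool,
  });

  // The full handler also creates the Factory, Pool and Token entities described
  // earlier in this guide.
}
```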
-#### 2. Updating the GraphQL Schema File
+
Numerous entities can be derived from each newly created pool smart contract. To highlight some of the most crucial ones, you'll need to extend the `schema.graphql` file with the following entities:
@@ -493,24 +456,7 @@ type Transaction @entity {
Similar to the previously imported entities, we observe various relationships here. In this case, each new entity references both the `Token` and `Pool` entities, establishing a one-to-one relationship. Additionally, each new entity references a `Transaction` entity, which is the only one among the newly added entities not derived from logs. Instead, it's derived from the transaction that a given event belongs to, showcasing the capabilities of the SubQuery SDK.
-Now, the next step involves instructing the SubQuery CLI to generate types based on your project's updated GraphQL schema:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn codegen
-```
-
-@tab npm
-
-```shell
-npm run-script codegen
-```
-
-:::
-
-This will create update the existing `src/types` directory. All new entites can now be imported from the following directory:
+
```ts
import { Burn, Mint, Swap } from "../types";
@@ -523,7 +469,7 @@ import {
} from "../types/contracts/Pool";
```
-#### 3. Writing the Mappings
+
In this scenario, the mapping process involves two substeps:
@@ -638,7 +584,7 @@ Finally, the function saves the updated data for the swap, factory, pool, token0
As you may already know, swaps in UniswapV3 are executed within the context of pools. To enable swaps, these pools must be liquid, and users provide liquidity to each specific pool. Each liquidity provision results in a Liquidity Position, essentially an NFT. This design enables a broader range of DeFi use cases. And the contract responsible for managing these provisions is known as the NonfungiblePositionManager.
-#### 1. Configuring the Manifest File
+
For the NonfungiblePositionManager smart contract, we want to introduce the following updates to the manifest file:
@@ -711,7 +657,7 @@ For the NonfungiblePositionManager smart contract, we want to introduce the foll
The configuration process closely resembles what we've seen earlier. However, we now have a completely new smart contract that we'll be handling events from. This entails different ABI, address, and start block values. Naturally, it also introduces new events, which are listed under the `handlers` object.
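As a hedged illustration of one such entry, the sketch below shows how a handler for the `IncreaseLiquidity` event could be declared. The event signature is taken from the public NonfungiblePositionManager ABI; the handler name is illustrative, and the exact list of events lives in the linked repository:

```ts
import { EthereumHandlerKind } from "@subql/types-ethereum";

// A single entry for the handlers list of the NonfungiblePositionManager data source
const increaseLiquidityHandler = {
  kind: EthereumHandlerKind.Event,
  handler: "handleIncreaseLiquidity",
  filter: {
    topics: [
      "IncreaseLiquidity(uint256 tokenId, uint128 liquidity, uint256 amount0, uint256 amount1)",
    ],
  },
};
```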
-#### 2. Updating the GraphQL Schema File
+
From this smart contract, the only new entity we'll emphasize is the `Position`:
@@ -737,26 +683,7 @@ type Position @entity {
}
```
-Once more, we encounter connections to various entities like `Pool` and `Token`.
-
-Now, the next step involves instructing the SubQuery CLI to generate types based on your project's updated GraphQL schema:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn codegen
-```
-
-@tab npm
-
-```shell
-npm run-script codegen
-```
-
-:::
-
-This will create update the existing `src/types` directory. All new entites can now be imported from the following directory:
+
```ts
import { Position } from "../types";
@@ -768,7 +695,7 @@ import {
} from "../types/contracts/NonfungiblePositionManager";
```
-#### 3. Writing the Mappings
+
For this contract, we will craft the mappings in a file named `position-manager.ts`. Once again, this separation provides context and clarity.
@@ -853,71 +780,11 @@ To briefly clarify the code provided above: the handler function `handleIncrease
Check the final code repository [here](https://github.com/subquery/ethereum-subql-starter/tree/main/Ethereum/ethereum-uniswap-v3) to observe the integration of all previously mentioned configurations into a unified codebase.
:::
-## Build Your Project
-
-Next, build your work to run your new SubQuery project. Run the build command from the project's root directory as given here:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn build
-```
-
-@tab npm
-
-```shell
-npm run-script build
-```
-
-:::
-
-::: warning Important
-Whenever you make changes to your mapping functions, you must rebuild your project.
-:::
-
-Now, you are ready to run your first SubQuery project. Let’s check out the process of running your project in detail.
-
-## Run Your Project Locally with Docker
-
-Whenever you create a new SubQuery Project, first, you must run it locally on your computer and test it and using Docker is the easiest and quickiest way to do this.
-
-The `docker-compose.yml` file defines all the configurations that control how a SubQuery node runs. For a new project, which you have just initialised, you won't need to change anything.
-
-However, visit the [Running SubQuery Locally](../../run_publish/run.md) to get more information on the file and the settings.
-
-Run the following command under the project directory:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn start:docker
-```
-
-@tab npm
-
-```shell
-npm run-script start:docker
-```
-
-:::
-
-::: tip Note
-It may take a few minutes to download the required images and start the various nodes and Postgres databases.
-:::
+
-## Query your Project
+
-Next, let's query our project. Follow these three simple steps to query your SubQuery project:
-
-1. Open your browser and head to `http://localhost:3000`.
-
-2. You will see a GraphQL playground in the browser and the schemas which are ready to query.
-
-3. Find the _Docs_ tab on the right side of the playground which should open a documentation drawer. This documentation is automatically generated and it helps you find what entities and methods you can query.
-
-Try the following queries to understand how it works for your new SubQuery starter project. Don’t forget to learn more about the [GraphQL Query language](../../run_publish/query.md).
+
::: details Pools
@@ -1163,14 +1030,4 @@ query {
:::
-## What's next?
-
-Congratulations! You have now a locally running SubQuery project that indexes the major Uniswap entities and accepts GraphQL API requests for transferring data.
-
-::: tip Tip
-
-Find out how to build a performant SubQuery project and avoid common mistakes in [Project Optimisation](../../build/optimisation.md).
-
-:::
-
-Click [here](../../quickstart/whats-next.md) to learn what should be your **next step** in your SubQuery journey.
+
diff --git a/docs/quickstart/quickstart_chains/fantom.md b/docs/quickstart/quickstart_chains/fantom.md
index 0e574c93f60..e8ddb245ebf 100644
--- a/docs/quickstart/quickstart_chains/fantom.md
+++ b/docs/quickstart/quickstart_chains/fantom.md
@@ -1,30 +1,17 @@
# Fantom Opera Quick Start
-## Goals
-
The goal of this quick start guide is to index all transfers and approval events from the [Wrapped FTM](https://ftmscan.com/token/0x21be370d5312f44cb42ce377bc9b8a0cef1a4c83) token on the [Fantom Opera](https://ftmscan.com/) network.
::: warning
Before we begin, **make sure that you have initialised your project** using the provided steps in the [Start Here](../quickstart.md) section. Please initialise a Fantom project.
:::
-In every SubQuery project, there are [3 key files](../quickstart.md#_3-make-changes-to-your-project) to update. Let's begin updating them one by one.
-
-::: tip Note
-The final code of this project can be found [here](https://github.com/subquery/ethereum-subql-starter/blob/main/Fantom/fantom-starter).
+::: warning Important
We use Ethereum packages, runtimes, and handlers (e.g. `@subql/node-ethereum`, `ethereum/Runtime`, and `ethereum/*Handler`) for Fantom Opera. Since Fantom is an EVM-compatible layer-1, we can use the core Ethereum framework to index it.
:::
-## 1. Your Project Manifest File
-
-The Project Manifest (`project.ts`) file works as an entry point to your Fantom project. It defines most of the details on how SubQuery will index and transform the chain data. For Fantom, there are three types of mapping handlers (and you can have more than one in each project):
-
-- [BlockHanders](../../build/manifest/ethereum.md#mapping-handlers-and-filters): On each and every block, run a mapping function
-- [TransactionHandlers](../../build/manifest/ethereum.md#mapping-handlers-and-filters): On each and every transaction that matches optional filter criteria, run a mapping function
-- [LogHanders](../../build/manifest/ethereum.md#mapping-handlers-and-filters): On each and every log that matches optional filter criteria, run a mapping function
-
-Note that the manifest file has already been set up correctly and doesn’t require significant changes, but you need to import the correct contract definitions and update the datasource handlers.
+
As we are indexing all transfers and approvals from the Wrapped FTM contract on the Fantom Opera network, the first step is to import the contract ABI definition, which can be obtained from any standard [ERC-20 contract](https://ethereum.org/en/developers/docs/standards/tokens/erc-20/). Copy the entire contract ABI and save it as a file called `erc20.abi.json` in the `/abis` directory.
@@ -83,11 +70,9 @@ The above code indicates that you will be running a `handleTransaction` mapping
The code also indicates that you will be running a `handleLog` mapping function whenever there is a `Transfer` event being emitted from the [WFTM contract](https://ftmscan.com/token/0x21be370d5312f44cb42ce377bc9b8a0cef1a4c83).
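For reference, the two handler entries described above generally take the shape below in `project.ts`. This is a condensed sketch: the filter strings assume a standard ERC-20 ABI, and the exact values are in the linked final code:

```ts
import { EthereumHandlerKind } from "@subql/types-ethereum";

// Condensed sketch of the two datasource handlers described above
export const handlers = [
  {
    // Run handleTransaction for every call to approve() on the WFTM contract
    kind: EthereumHandlerKind.Call,
    handler: "handleTransaction",
    filter: {
      function: "approve(address spender, uint256 rawAmount)",
    },
  },
  {
    // Run handleLog for every Transfer event emitted by the WFTM contract
    kind: EthereumHandlerKind.Event,
    handler: "handleLog",
    filter: {
      topics: [
        "Transfer(address indexed from, address indexed to, uint256 amount)",
      ],
    },
  },
];
```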
-Check out our [Manifest File](../../build/manifest/ethereum.md) documentation to get more information about the Project Manifest (`project.ts`) file.
-
-## 2. Update Your GraphQL Schema File
+
-The `schema.graphql` file determines the shape of your data from SubQuery due to the mechanism of the GraphQL query language. Hence, updating the GraphQL Schema file is the perfect place to start. It allows you to define your end goal right at the start.
+
Remove all existing entities and update the `schema.graphql` file as follows. Here you can see we are indexing block information such as the id, blockHeight, transfer receiver, and transfer sender, along with approvals and all of the attributes related to them (such as owner, spender, etc.).
@@ -111,47 +96,21 @@ type Approval @entity {
}
```
-SubQuery makes it easy and type-safe to work with your GraphQL entities, as well as smart contracts, events, transactions, and logs. SubQuery CLI will generate types from your project's GraphQL schema and any contract ABIs included in the data sources.
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn codegen
-```
-
-@tab npm
-
-```shell
-npm run-script codegen
-```
-
-:::
+
-This will create a new directory (or update the existing one) `src/types` which contains generated entity classes for each type you have defined previously in `schema.graphql`. These classes provide type-safe entity loading, and read and write access to entity fields - see more about this process in [the GraphQL Schema](../../build/graphql.md). All entities can be imported from the following directory:
+
```ts
import { Approval, Transfer } from "../types";
-```
-
-If you're creating a new EVM-based project, this command will also generate ABI types and save them into `src/types` using the `npx typechain --target=ethers-v5` command, allowing you to bind these contracts to specific addresses in the mappings and call read-only contract methods against the block being processed. It will also generate a class for every contract event to provide easy access to event parameters, as well as the block and transaction the event originated from. All of these types are written to `src/types/abi-interfaces` and `src/types/contracts` directories. In this example SubQuery project, you would import these types like so.
-
-```ts
import {
ApproveTransaction,
TransferLog,
} from "../types/abi-interfaces/Erc20Abi";
```
-::: warning Important
-When you make any changes to the schema file, please ensure that you regenerate your types directory using the SubQuery CLI prompt `yarn codegen` or `npm run-script codegen`.
-:::
-
-Check out the [GraphQL Schema](../../build/graphql.md) documentation to get in-depth information on `schema.graphql` file.
+
-## 3. Add a Mapping Function
-
-Mapping functions define how chain data is transformed into the optimised GraphQL entities that we previously defined in the `schema.graphql` file.
+
Navigate to the default mapping function in the `src/mappings` directory. You will be able to see two exported functions `handleLog` and `handleTransaction`:
@@ -192,73 +151,13 @@ The `handleLog` function receives a `log` parameter of type `TransferLog` which
The `handleTransaction` function receives a `tx` parameter of type `ApproveTransaction` which includes transaction data in the payload. We extract this data and then save this to the store using the `.save()` function (_Note that SubQuery will automatically save this to the database_).
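A condensed sketch of what these two handlers can look like, using the entities and typed classes imported above, is shown below (entity field names are assumptions based on the schema description; the exact code is in the linked starter):

```ts
import { Approval, Transfer } from "../types";
import {
  ApproveTransaction,
  TransferLog,
} from "../types/abi-interfaces/Erc20Abi";

export async function handleLog(log: TransferLog): Promise<void> {
  // Record each WFTM Transfer event as a Transfer entity
  if (!log.args) return;
  const transfer = Transfer.create({
    id: log.transactionHash,
    blockHeight: BigInt(log.blockNumber),
    from: log.args.from,
    to: log.args.to,
    value: log.args.value.toBigInt(),
    contractAddress: log.address,
  });
  await transfer.save();
}

export async function handleTransaction(tx: ApproveTransaction): Promise<void> {
  // Record each approve() call as an Approval entity; call arguments are decoded lazily
  if (!tx.args) return;
  const approval = Approval.create({
    id: tx.hash,
    owner: tx.from,
    spender: await tx.args[0],
    value: BigInt(await tx.args[1].toString()),
    contractAddress: tx.to,
  });
  await approval.save();
}
```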
-Check out our [Mappings](../../build/mapping/ethereum.md) documentation to get more information on mapping functions.
+
-## 4. Build Your Project
+
-Next, build your work to run your new SubQuery project. Run the build command from the project's root directory as given here:
+
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn build
-```
-
-@tab npm
-
-```shell
-npm run-script build
-```
-
-:::
-
-::: warning Important
-Whenever you make changes to your mapping functions, you must rebuild your project.
-:::
-
-Now, you are ready to run your first SubQuery project. Let’s check out the process of running your project in detail.
-
-## 5. Run Your Project Locally with Docker
-
-Whenever you create a new SubQuery Project, first, you must run it locally on your computer and test it and using Docker is the easiest and quickest way to do this.
-
-The `docker-compose.yml` file defines all the configurations that control how a SubQuery node runs. For a new project, which you have just initialised, you won't need to change anything.
-
-However, visit the [Running SubQuery Locally](../../run_publish/run.md) to get more information on the file and the settings.
-
-Run the following command under the project directory:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn start:docker
-```
-
-@tab npm
-
-```shell
-npm run-script start:docker
-```
-
-:::
-
-::: tip Note
-It may take a few minutes to download the required images and start the various nodes and Postgres databases.
-:::
-
-## 6. Query your Project
-
-Next, let's query our project. Follow these three simple steps to query your SubQuery project:
-
-1. Open your browser and head to `http://localhost:3000`.
-
-2. You will see a GraphQL playground in the browser and the schemas which are ready to query.
-
-3. Find the _Docs_ tab on the right side of the playground which should open a documentation drawer. This documentation is automatically generated and it helps you find what entities and methods you can query.
-
-Try the following query to understand how it works for your new SubQuery starter project. Don’t forget to learn more about the [GraphQL Query language](../../run_publish/query.md).
+
```graphql
# Write your query or mutation here
@@ -369,14 +268,4 @@ You will see the result similar to below:
The final code of this project can be found [here](https://github.com/subquery/ethereum-subql-starter/blob/main/Fantom/fantom-starter).
:::
-## What's next?
-
-Congratulations! You have now a locally running SubQuery project that accepts GraphQL API requests for transferring data.
-
-::: tip Tip
-
-Find out how to build a performant SubQuery project and avoid common mistakes in [Project Optimisation](../../build/optimisation.md).
-
-:::
-
-Click [here](../../quickstart/whats-next.md) to learn what should be your **next step** in your SubQuery journey.
+
diff --git a/docs/quickstart/quickstart_chains/flare.md b/docs/quickstart/quickstart_chains/flare.md
index d3b7dfe7521..669e780374d 100644
--- a/docs/quickstart/quickstart_chains/flare.md
+++ b/docs/quickstart/quickstart_chains/flare.md
@@ -1,80 +1,17 @@
# Flare Quick Start
-## Goals
-
The goal of this quick start guide is to index all rewards from the Flare FTSO Reward Manager on Flare's Songbird network.
-::: warning Important
-Before we begin, make sure that you have initialised your project using the provided steps in the [Start Here](../quickstart.md) section. **Please initialise a Flare Songbird Network, not Flare Network**
-:::
-
-Now, let's move forward and update these configurations.
+
-Previously, in the [1. Create a New Project](../quickstart.md) section, you must have noted [3 key files](../quickstart.md#_3-make-changes-to-your-project). Let's begin updating them one by one.
+::: warning Important
+**Please initialise a Flare Songbird Network project, not a Flare Network project.**
+:::
+
::: tip Note
The final code of this project can be found [here](https://github.com/jamesbayly/subql-flare-ftso-rewards).
:::
-## 1. Update Your GraphQL Schema File
-
-The `schema.graphql` file determines the shape of your data from SubQuery due to the mechanism of the GraphQL query language. Hence, updating the GraphQL Schema file is the perfect place to start. It allows you to define your end goal right at the start.
-
-Remove all existing entities and update the `schema.graphql` file as follows, here you can see we are indexing all rewards and also addresses that those rewards go to/are claimed from:
-
-```graphql
-type Reward @entity {
- id: ID! # Transaction has
- recipient: Address!
- dataProvider: String! @index
- whoClaimed: Address!
- rewardEpoch: BigInt! @index
- amount: BigInt!
-}
-
-type Address @entity {
- id: ID! # accountIDs
- receivedRewards: [Reward] @derivedFrom(field: "recipient")
- claimedRewards: [Reward] @derivedFrom(field: "whoClaimed")
-}
-```
-
-Since we have a [many-to-many relationship](../../build/graphql.md#many-to-many-relationships), we add the `@derivedFrom` annotation to ensure that we are mapping to the right foreign key.
-
-::: warning Important
-When you make any changes to the schema file, please ensure that you regenerate your types directory.
-:::
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn codegen
-```
-
-@tab npm
-
-```shell
-npm run-script codegen
-```
-
-:::
-
-You will find the generated models in the `/src/types/models` directory.
-
-Check out the [GraphQL Schema](../../build/graphql.md) documentation to get in-depth information on `schema.graphql` file.
-
-Now that you have made essential changes to the GraphQL Schema file, let’s move forward to the next file.
-
-## 2. Update Your Project Manifest File
-
-The Project Manifest (`project.ts`) file works as an entry point to your Flare project. It defines most of the details on how SubQuery will index and transform the chain data. For Flare, there are three types of mapping handlers (and you can have more than one in each project):
-
-- [BlockHanders](../../build/manifest/flare.md#mapping-handlers-and-filters): On each and every block, run a mapping function
-- [TransactionHandlers](../../build/manifest/flare.md#mapping-handlers-and-filters): On each and every transaction that matches optional filter criteria, run a mapping function
-- [LogHanders](../../build/manifest/flare.md#mapping-handlers-and-filters): On each and every log that matches optional filter criteria, run a mapping function
-
-Note that the manifest file has already been set up correctly and doesn’t require significant changes, but you need to import the correct contract definitions and update the datasource handlers.
+
We are indexing all `RewardClaimed` logs from the FTSORewardManager contract. First, you will need to import the contract ABI definition from [here](https://songbird-explorer.flare.network/address/0xc5738334b972745067fFa666040fdeADc66Cb925/contracts#address-tabs). Copy the entire JSON and save it as a file called `ftsoRewardManager.abi.json` in the root directory.
@@ -106,13 +43,38 @@ dataSources:
The above code indicates that you will be running a `handleLog` mapping function whenever there is a `RewardClaimed` log on any transaction from the [FTSO Reward Manager contract](https://songbird-explorer.flare.network/address/0xc5738334b972745067fFa666040fdeADc66Cb925).
-Check out our [Manifest File](../../build/manifest/flare.md) documentation to get more information about the Project Manifest (`project.ts`) file.
+
+
+
+
+Remove all existing entities and update the `schema.graphql` file as follows. Here you can see we are indexing all rewards, as well as the addresses that those rewards go to or are claimed from:
+
+```graphql
+type Reward @entity {
+  id: ID! # Transaction hash
+ recipient: Address!
+ dataProvider: String! @index
+ whoClaimed: Address!
+ rewardEpoch: BigInt! @index
+ amount: BigInt!
+}
+
+type Address @entity {
+ id: ID! # accountIDs
+ receivedRewards: [Reward] @derivedFrom(field: "recipient")
+ claimedRewards: [Reward] @derivedFrom(field: "whoClaimed")
+}
+```
+
+Since we have a [many-to-many relationship](../../build/graphql.md#many-to-many-relationships), we add the `@derivedFrom` annotation to ensure that we are mapping to the right foreign key.
+
+
-Next, let’s proceed ahead with the Mapping Function’s configuration.
+
-## 3. Add a Mapping Function
+
-Mapping functions define how chain data is transformed into the optimised GraphQL entities that we previously defined in the `schema.graphql` file.
+
Follow these steps to add a mapping function:
@@ -176,73 +138,13 @@ Let’s understand how the above code works.
The function here receives a `FlareLog` which includes transaction log data in the payload. We extract this data and first ensure that our account entities (the `Address` foreign keys) exist. We then instantiate a new `Reward` entity, defined earlier in the `schema.graphql` file, add the remaining information, and then use the `.save()` function to save the new entity (_Note that SubQuery will automatically save this to the database_).
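A minimal sketch of that flow is shown below. It assumes the typed `RewardClaimed` arguments (`dataProvider`, `whoClaimed`, `sentTo`, `rewardEpoch`, `amount`, taken from the FTSO Reward Manager ABI), the entities defined in the schema above, and a `FlareLog` type exposed by the Flare SDK; the authoritative version is in the linked repository:

```ts
import { Reward, Address } from "../types";
// Assumption: the Flare SDK exposes a FlareLog type for typed log handlers
import { FlareLog } from "@subql/types-flare";

// Make sure an Address entity (a foreign-key target) exists before it is referenced
async function ensureAddress(id: string): Promise<void> {
  if (!(await Address.get(id))) {
    await Address.create({ id }).save();
  }
}

export async function handleLog(log: FlareLog): Promise<void> {
  // Argument names follow the RewardClaimed event of the FTSO Reward Manager ABI
  const { dataProvider, whoClaimed, sentTo, rewardEpoch, amount } = log.args as any;

  await ensureAddress(sentTo.toLowerCase());
  await ensureAddress(whoClaimed.toLowerCase());

  const reward = Reward.create({
    id: log.transactionHash,
    recipientId: sentTo.toLowerCase(),
    whoClaimedId: whoClaimed.toLowerCase(),
    dataProvider,
    rewardEpoch: BigInt(rewardEpoch.toString()),
    amount: BigInt(amount.toString()),
  });
  await reward.save();
}
```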
-Check out our [Mappings](../../build/mapping/flare.md) documentation to get more information on mapping functions.
-
-## 4. Build Your Project
-
-Next, build your work to run your new SubQuery project. Run the build command from the project's root directory as given here:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn build
-```
-
-@tab npm
-
-```shell
-npm run-script build
-```
-
-:::
-
-::: warning Important
-Whenever you make changes to your mapping functions, you must rebuild your project.
-:::
-
-Now, you are ready to run your first SubQuery project. Let’s check out the process of running your project in detail.
-
-## 5. Run Your Project Locally with Docker
-
-Whenever you create a new SubQuery Project, first, you must run it locally on your computer and test it and using Docker is the easiest and quickiest way to do this.
-
-The `docker-compose.yml` file defines all the configurations that control how a SubQuery node runs. For a new project, which you have just initialised, you won't need to change anything.
-
-However, visit the [Running SubQuery Locally](../../run_publish/run.md) to get more information on the file and the settings.
-
-Run the following command under the project directory:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn start:docker
-```
-
-@tab npm
-
-```shell
-npm run-script start:docker
-```
-
-:::
-
-::: tip Note
-It may take a few minutes to download the required images and start the various nodes and Postgres databases.
-:::
-
-## 6. Query your Project
-
-Next, let's query our project. Follow these three simple steps to query your SubQuery project:
-
-1. Open your browser and head to `http://localhost:3000`.
+
-2. You will see a GraphQL playground in the browser and the schemas which are ready to query.
+
-3. Find the _Docs_ tab on the right side of the playground which should open a documentation drawer. This documentation is automatically generated and it helps you find what entities and methods you can query.
+
-Try the following query to understand how it works for your new SubQuery starter project. Don’t forget to learn more about the [GraphQL Query language](../../run_publish/query.md).
+
```graphql
query {
@@ -305,14 +207,4 @@ You will see the result similar to below:
The final code of this project can be found [here](https://github.com/jamesbayly/subql-flare-ftso-rewards).
:::
-## What's next?
-
-Congratulations! You have now a locally running SubQuery project that accepts GraphQL API requests for transferring data.
-
-::: tip Tip
-
-Find out how to build a performant SubQuery project and avoid common mistakes in [Project Optimisation](../../build/optimisation.md).
-
-:::
-
-Click [here](../../quickstart/whats-next.md) to learn what should be your **next step** in your SubQuery journey.
+
diff --git a/docs/quickstart/quickstart_chains/gnosis.md b/docs/quickstart/quickstart_chains/gnosis.md
index 9515d190a37..95258039a42 100644
--- a/docs/quickstart/quickstart_chains/gnosis.md
+++ b/docs/quickstart/quickstart_chains/gnosis.md
@@ -2,34 +2,20 @@
[Gnosis Chain](https://www.gnosis.io/) is an EVM compatible, community owned network that prioritizes credible neutrality and resiliency, open to everyone without privilege or prejudice. It aims to provide infrastructure and tools for creating, trading, and governing decentralized finance (DeFi) applications. There are several components that make up Gnosis such as the CoW Protocol, Safe, Gnosis Beacon Chain and GnosisDAO.
-## Goals
-
The goal of this quick start guide is to index all [POAP](https://poap.xyz/) mints and transactions on the Gnosis mainnet.
-::: warning
-Before we begin, **make sure that you have initialised your project** using the provided steps in the [Start Here](../quickstart.md) section. Please initialise a Gnosis project.
-:::
-
-In every SubQuery project, there are [3 key files](../quickstart.md#_3-make-changes-to-your-project) to update. Let's begin updating them one by one.
+
::: tip Note
The final code of this project can be found [here](https://github.com/subquery/ethereum-subql-starter/tree/main/Gnosis/gnosis-poap).
:::
-## 1. Your Project Manifest File
+
::: warning Important
We use Ethereum packages, runtimes, and handlers (e.g. `@subql/node-ethereum`, `ethereum/Runtime`, and `ethereum/*Handler`) for Gnosis. Since Gnosis is an EVM implementation, we can use the core Ethereum framework to index it.
:::
-The Project Manifest (`project.ts`) file works as an entry point to your Gnosis project. It defines most of the details on how SubQuery will index and transform the chain data. For Gnosis, there are three types of mapping handlers (and you can have more than one in each project):
-
-- [BlockHanders](../../build/manifest/gnosis.md#mapping-handlers-and-filters): On each and every block, run a mapping function
-- [TransactionHandlers](../../build/manifest/gnosis.md#mapping-handlers-and-filters): On each and every transaction that matches optional filter criteria, run a mapping function
-- [LogHanders](../../build/manifest/gnosis.md#mapping-handlers-and-filters): On each and every log that matches optional filter criteria, run a mapping function
-
-Note that the manifest file has already been set up correctly and doesn’t require significant changes, but you need to import the correct contract definitions and update the datasource handlers.
-
As we are indexing all transfers and mints for the POAP ERC721 contract, the first step is to import the contract ABI definition, which can be obtained from [here](https://gnosisscan.io/token/0x22c1f6050e56d2876009903609a2cc3fef83b415#code). Copy the entire contract ABI and save it as a file called `poap.abi.json` in the `/abis` directory.
This section in the Project Manifest now imports all the correct definitions and lists the triggers that we look for on the blockchain when indexing.
@@ -86,11 +72,9 @@ This section in the Project Manifest now imports all the correct definitions and
The above code indicates that you will be running `handleTokenMint` and `handleTokenTransfer` mapping functions whenever there is a transaction with the function `mintToken` or a log with the signature `Transfer` on any transaction from the [POAP smart contract](https://gnosisscan.io/token/0x22c1f6050e56d2876009903609a2cc3fef83b415).
-Check out our [Manifest File](../../build/manifest/gnosis.md) documentation to get more information about the Project Manifest (`project.ts`) file.
-
-## 2. Update Your GraphQL Schema File
+
-The `schema.graphql` file determines the shape of your data from SubQuery due to the mechanism of the GraphQL query language. Hence, updating the GraphQL Schema file is the perfect place to start. It allows you to define your end goal right at the start.
+
Remove all existing entities and update the `schema.graphql` file as follows. Here you can see we are indexing token information such as the `id` and the `mintBlockHeight` along with all transfers of that token. There are [foreign keys](../../build/graphql.md#entity-relationships) between all entities.
@@ -124,40 +108,12 @@ type TokenTransfer @entity {
}
```
-::: warning Important
-When you make any changes to the schema file, please ensure that you regenerate your types directory.
-:::
+
-SubQuery makes it easy and type-safe to work with your GraphQL entities, as well as smart contracts, events, transactions, and logs. SubQuery CLI will generate types from your project's GraphQL schema and any contract ABIs included in the data sources.
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn codegen
-```
-
-@tab npm
-
-```shell
-npm run-script codegen
-```
-
-:::
-
-This will create a new directory (or update the existing one) `src/types` which contains generated entity classes for each type you have defined previously in `schema.graphql`. These classes provide type-safe entity loading, and read and write access to entity fields - see more about this process in [the GraphQL Schema](../../build/graphql.md). All entities can be imported from the following directory:
+
```ts
import { Token, Event, Address, TokenTransfer } from "../types";
-```
-
-As you're creating a new EVM based project, this command will also generate ABI types and save them into `src/types` using the `npx typechain --target=ethers-v5` command, allowing you to bind these contracts to specific addresses in the mappings and call read-only contract methods against the block being processed.
-
-It will also generate a class for every contract event to provide easy access to event parameters, as well as the block and transaction the event originated from. Read about how this is done in [EVM Codegen from ABIs](../../build/introduction.md#evm-codegen-from-abis).
-
-In this example SubQuery project, you would import these types like so.
-
-```ts
import {
EventTokenLog,
MintTokenTransaction,
@@ -165,13 +121,9 @@ import {
} from "../types/abi-interfaces/PoapAbi";
```
-Check out the [GraphQL Schema](../../build/graphql.md) documentation to get in-depth information on `schema.graphql` file.
-
-Now that you have made essential changes to the GraphQL Schema file, let’s proceed ahead with the Mapping Function’s configuration.
+
-## 3. Add a Mapping Function
-
-Mapping functions define how chain data is transformed into the optimised GraphQL entities that we previously defined in the `schema.graphql` file.
+
Navigate to the default mapping function in the `src/mappings` directory. You will be able to see two exported functions: `handleLog`, and `handleTransaction`.
@@ -286,73 +238,13 @@ The `handleTokenMint` function receives a `tx` parameter of type `MintTokenTrans
The `handleTokenTransfer` function receives a typed `TransferLog` that contains information about a transfer event for a specific POAP token. It extracts this data, ignores the event if the transfer comes from the root account (`0x0000000000000000000000000000000000000000`), and then saves the transfer data. It also retrieves the token itself and updates its `currentHolderId`.
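To make that concrete, here is a hedged sketch of `handleTokenTransfer`, using the typed `TransferLog` imported above. Entity field names such as `fromId`, `toId`, and `blockHeight` are assumptions based on the schema description; the exact code is in the linked final project:

```ts
import { Token, TokenTransfer } from "../types";
import { TransferLog } from "../types/abi-interfaces/PoapAbi";

const NULL_ADDRESS = "0x0000000000000000000000000000000000000000";

export async function handleTokenTransfer(log: TransferLog): Promise<void> {
  if (!log.args) return;

  // Mints are emitted as transfers from the null address and are handled by
  // handleTokenMint instead, so they are skipped here.
  if (log.args.from === NULL_ADDRESS) return;

  const tokenId = log.args.tokenId.toString();

  // Record the transfer itself
  const transfer = TokenTransfer.create({
    id: `${log.transactionHash}-${log.logIndex}`,
    tokenId,
    fromId: log.args.from,
    toId: log.args.to,
    blockHeight: BigInt(log.blockNumber),
  });
  await transfer.save();

  // Keep the token's current holder up to date
  const token = await Token.get(tokenId);
  if (token) {
    token.currentHolderId = log.args.to;
    await token.save();
  }
}
```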
-Check out our [Mappings](../../build/mapping/gnosis.md) documentation to get more information on mapping functions.
-
-## 4. Build Your Project
-
-Next, build your work to run your new SubQuery project. Run the build command from the project's root directory as given here:
-
-::: code-tabs
-@tab:active yarn
+
-```shell
-yarn build
-```
-
-@tab npm
+
-```shell
-npm run-script build
-```
+
-:::
-
-::: warning Important
-Whenever you make changes to your mapping functions, you must rebuild your project.
-:::
-
-Now, you are ready to run your first SubQuery project. Let’s check out the process of running your project in detail.
-
-## 5. Run Your Project Locally with Docker
-
-Whenever you create a new SubQuery Project, first, you must run it locally on your computer and test it and using Docker is the easiest and quickest way to do this.
-
-The `docker-compose.yml` file defines all the configurations that control how a SubQuery node runs. For a new project, which you have just initialised, you won't need to change anything.
-
-However, visit the [Running SubQuery Locally](../../run_publish/run.md) to get more information on the file and the settings.
-
-Run the following command under the project directory:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn start:docker
-```
-
-@tab npm
-
-```shell
-npm run-script start:docker
-```
-
-:::
-
-::: tip Note
-It may take a few minutes to download the required images and start the various nodes and Postgres databases.
-:::
-
-## 6. Query your Project
-
-Next, let's query our project. Follow these three simple steps to query your SubQuery project:
-
-1. Open your browser and head to `http://localhost:3000`.
-
-2. You will see a GraphQL playground in the browser and the schemas which are ready to query.
-
-3. Find the _Docs_ tab on the right side of the playground which should open a documentation drawer. This documentation is automatically generated and it helps you find what entities and methods you can query.
-
-Try the following query to understand how it works for your new SubQuery starter project. Don’t forget to learn more about the [GraphQL Query language](../../run_publish/query.md).
+
```graphql
# Write your query or mutation here
@@ -491,14 +383,4 @@ You will see the result similar to below:
The final code of this project can be found [here](https://github.com/subquery/ethereum-subql-starter/tree/main/Gnosis/gnosis-poap).
:::
-## What's next?
-
-Congratulations! You have now a locally running SubQuery project that accepts GraphQL API requests for transferring data.
-
-::: tip Tip
-
-Find out how to build a performant SubQuery project and avoid common mistakes in [Project Optimisation](../../build/optimisation.md).
-
-:::
-
-Click [here](../../quickstart/whats-next.md) to learn what should be your **next step** in your SubQuery journey.
+
diff --git a/docs/quickstart/quickstart_chains/harmony.md b/docs/quickstart/quickstart_chains/harmony.md
index 52fc8dc267c..3527fc8e217 100644
--- a/docs/quickstart/quickstart_chains/harmony.md
+++ b/docs/quickstart/quickstart_chains/harmony.md
@@ -1,30 +1,15 @@
# Harmony Quick Start
-## Goals
-
The goal of this quick start guide is to index all transfers and approval events from the [Wrapped Eth](https://explorer.harmony.one/tx/0xd611c8cf745d85527348218ccd793e5126a5ebecd4340802b8540ee992e3d3bb) on [Harmony](https://explorer.harmony.one/) Network.
-::: warning
-Before we begin, **make sure that you have initialised your project** using the provided steps in the [Start Here](../quickstart.md) section. Please initialise an a Harmony project.
-:::
-
-In every SubQuery project, there are [3 key files](../quickstart.md#_3-make-changes-to-your-project) to update. Let's begin updating them one by one.
+
-::: tip Note
-The final code of this project can be found [here](https://github.com/subquery/ethereum-subql-starter/tree/main/Harmony/harmony-starter).
+Please initialise a Harmony project.
We use Ethereum packages, runtimes, and handlers (e.g. `@subql/node-ethereum`, `ethereum/Runtime`, and `ethereum/*Handler`) for Harmony. Since Harmony is an EVM-compatible layer-1, we can use the core Ethereum framework to index it.
:::
-## 1. Your Project Manifest File
-
-The Project Manifest (`project.ts`) file works as an entry point to your Harmony project. It defines most of the details on how SubQuery will index and transform the chain data. For Harmony, there are three types of mapping handlers (and you can have more than one in each project):
-
-- [BlockHanders](../../build/manifest/ethereum.md#mapping-handlers-and-filters): On each and every block, run a mapping function
-- [TransactionHandlers](../../build/manifest/ethereum.md#mapping-handlers-and-filters): On each and every transaction that matches optional filter criteria, run a mapping function
-- [LogHanders](../../build/manifest/ethereum.md#mapping-handlers-and-filters): On each and every log that matches optional filter criteria, run a mapping function
-
-Note that the manifest file has already been set up correctly and doesn’t require significant changes, but you need to import the correct contract definitions and update the datasource handlers.
+
As we are indexing all transfers and approvals from the Wrapped ETH contract on the Harmony network, the first step is to import the contract ABI definition, which can be obtained from any standard [ERC-20 contract](https://ethereum.org/en/developers/docs/standards/tokens/erc-20/). Copy the entire contract ABI and save it as a file called `erc20.abi.json` in the `/abis` directory.
@@ -82,11 +67,9 @@ The above code indicates that you will be running a `handleTransaction` mapping
The code also indicates that you will be running a `handleLog` mapping function whenever there is a `Transfer` event being emitted from the [WETH contract](https://explorer.harmony.one/address/0x6983d1e6def3690c4d616b13597a09e6193ea013).
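
For orientation, the event-handler entry that produces this behaviour in `project.ts` looks roughly like the sketch below (an illustrative excerpt only; the generated starter already contains the full datasource definition):

```ts
import { EthereumHandlerKind } from "@subql/types-ethereum";

// Sketch of the log (event) handler entry inside the datasource's mapping.handlers
export const transferLogHandler = {
  kind: EthereumHandlerKind.Event,
  handler: "handleLog",
  filter: {
    // Only Transfer events emitted by the configured contract reach handleLog
    topics: ["Transfer(address indexed from, address indexed to, uint256 amount)"],
  },
};
```
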
-Check out our [Manifest File](../../build/manifest/ethereum.md) documentation to get more information about the Project Manifest (`project.ts`) file.
+
-## 2. Update Your GraphQL Schema File
-
-The `schema.graphql` file determines the shape of your data from SubQuery due to the mechanism of the GraphQL query language. Hence, updating the GraphQL Schema file is the perfect place to start. It allows you to define your end goal right at the start.
+
Remove all existing entities and update the `schema.graphql` file as follows. Here you can see we are indexing block information such as the id, blockHeight, transfer receiver and transfer sender, along with approvals and all of the attributes related to them (such as owner and spender).
@@ -110,47 +93,21 @@ type Approval @entity {
}
```
-SubQuery makes it easy and type-safe to work with your GraphQL entities, as well as smart contracts, events, transactions, and logs. SubQuery CLI will generate types from your project's GraphQL schema and any contract ABIs included in the data sources.
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn codegen
-```
-
-@tab npm
-
-```shell
-npm run-script codegen
-```
+
-:::
-
-This will create a new directory (or update the existing one) `src/types` which contains generated entity classes for each type you have defined previously in `schema.graphql`. These classes provide type-safe entity loading, and read and write access to entity fields - see more about this process in [the GraphQL Schema](../../build/graphql.md). All entities can be imported from the following directory:
+
```ts
import { Approval, Transfer } from "../types";
-```
-
-If you're creating a new EVM-based project, this command will also generate ABI types and save them into `src/types` using the `npx typechain --target=ethers-v5` command, allowing you to bind these contracts to specific addresses in the mappings and call read-only contract methods against the block being processed. It will also generate a class for every contract event to provide easy access to event parameters, as well as the block and transaction the event originated from. All of these types are written to `src/types/abi-interfaces` and `src/types/contracts` directories. In this example SubQuery project, you would import these types like so.
-
-```ts
import {
ApproveTransaction,
TransferLog,
} from "../types/abi-interfaces/Erc20Abi";
```
-::: warning Important
-When you make any changes to the schema file, please ensure that you regenerate your types directory using the SubQuery CLI prompt `yarn codegen` or `npm run-script codegen`.
-:::
-
-Check out the [GraphQL Schema](../../build/graphql.md) documentation to get in-depth information on `schema.graphql` file.
-
-## 3. Add a Mapping Function
+
-Mapping functions define how chain data is transformed into the optimised GraphQL entities that we previously defined in the `schema.graphql` file.
+
Navigate to the default mapping function in the `src/mappings` directory. You will be able to see two exported functions `handleLog` and `handleTransaction`:
@@ -191,73 +148,13 @@ The `handleLog` function receives a `log` parameter of type `TransferLog` which
The `handleTransaction` function receives a `tx` parameter of type `ApproveTransaction` which includes transaction data in the payload. We extract this data and then save this to the store using the `.save()` function (_Note that SubQuery will automatically save this to the database_).
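
A minimal sketch of that transaction handler, in the spirit of the EVM starter template, is shown below. The `Approval` fields are assumed to match the schema above, and the final code linked below remains the authoritative version.

```ts
import assert from "assert";
import { Approval } from "../types";
import { ApproveTransaction } from "../types/abi-interfaces/Erc20Abi";

export async function handleTransaction(tx: ApproveTransaction): Promise<void> {
  // approve(address spender, uint256 rawAmount): args[0] is the spender, args[1] the amount
  assert(tx.args, "Expected decoded transaction arguments");

  const approval = Approval.create({
    id: tx.hash,
    owner: tx.from,
    spender: await tx.args[0],
    value: BigInt(await tx.args[1].toString()),
    contractAddress: tx.to,
  });

  await approval.save();
}
```
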
-Check out our [Mappings](../../build/mapping/ethereum.md) documentation to get more information on mapping functions.
-
-## 4. Build Your Project
-
-Next, build your work to run your new SubQuery project. Run the build command from the project's root directory as given here:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn build
-```
-
-@tab npm
-
-```shell
-npm run-script build
-```
-
-:::
-
-::: warning Important
-Whenever you make changes to your mapping functions, you must rebuild your project.
-:::
-
-Now, you are ready to run your first SubQuery project. Let’s check out the process of running your project in detail.
-
-## 5. Run Your Project Locally with Docker
-
-Whenever you create a new SubQuery Project, first, you must run it locally on your computer and test it and using Docker is the easiest and quickest way to do this.
+
-The `docker-compose.yml` file defines all the configurations that control how a SubQuery node runs. For a new project, which you have just initialised, you won't need to change anything.
+
-However, visit the [Running SubQuery Locally](../../run_publish/run.md) to get more information on the file and the settings.
+
-Run the following command under the project directory:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn start:docker
-```
-
-@tab npm
-
-```shell
-npm run-script start:docker
-```
-
-:::
-
-::: tip Note
-It may take a few minutes to download the required images and start the various nodes and Postgres databases.
-:::
-
-## 6. Query your Project
-
-Next, let's query our project. Follow these three simple steps to query your SubQuery project:
-
-1. Open your browser and head to `http://localhost:3000`.
-
-2. You will see a GraphQL playground in the browser and the schemas which are ready to query.
-
-3. Find the _Docs_ tab on the right side of the playground which should open a documentation drawer. This documentation is automatically generated and it helps you find what entities and methods you can query.
-
-Try the following query to understand how it works for your new SubQuery starter project. Don’t forget to learn more about the [GraphQL Query language](../../run_publish/query.md).
+
```graphql
# Write your query or mutation here
@@ -392,14 +289,4 @@ You will see the result similar to below:
The final code of this project can be found [here](https://github.com/subquery/ethereum-subql-starter/tree/main/Harmony/harmony-starter).
:::
-## What's next?
-
-Congratulations! You have now a locally running SubQuery project that accepts GraphQL API requests for transferring data.
-
-::: tip Tip
-
-Find out how to build a performant SubQuery project and avoid common mistakes in [Project Optimisation](../../build/optimisation.md).
-
-:::
-
-Click [here](../../quickstart/whats-next.md) to learn what should be your **next step** in your SubQuery journey.
+
diff --git a/docs/quickstart/quickstart_chains/heco.md b/docs/quickstart/quickstart_chains/heco.md
index 006a60de962..a36aef6f491 100644
--- a/docs/quickstart/quickstart_chains/heco.md
+++ b/docs/quickstart/quickstart_chains/heco.md
@@ -1,30 +1,19 @@
# Heco Chain Quick Start
-## Goals
-
The goal of this quick start guide is to index all transfers and approval events from the [Wrapped HT](https://www.hecoinfo.com/en-us/token/0x5545153ccfca01fbd7dd11c0b23ba694d9509a6f) on [Heco](https://www.hecoinfo.com) Network.
-::: warning
-Before we begin, **make sure that you have initialised your project** using the provided steps in the [Start Here](../quickstart.md) section. Please initialise an a Heco Chain project.
-:::
+Please initialise a Heco Chain project.
-In every SubQuery project, there are [3 key files](../quickstart.md#_3-make-changes-to-your-project) to update. Let's begin updating them one by one.
-
-::: tip Note
-The final code of this project can be found [here](https://github.com/subquery/ethereum-subql-starter/blob/main/Heco/heco-starter).
+
We use Ethereum packages, runtimes, and handlers (e.g. `@subql/node-ethereum`, `ethereum/Runtime`, and `ethereum/*Handler`) for Heco Chain. Since Heco Chain is EVM-compatible, we can use the core Ethereum framework to index it.
:::
-## 1. Your Project Manifest File
-
-The Project Manifest (`project.ts`) file works as an entry point to your Heco Chain project. It defines most of the details on how SubQuery will index and transform the chain data. For Poltgon zkEVM, there are three types of mapping handlers (and you can have more than one in each project):
-
-- [BlockHanders](../../build/manifest/ethereum.md#mapping-handlers-and-filters): On each and every block, run a mapping function
-- [TransactionHandlers](../../build/manifest/ethereum.md#mapping-handlers-and-filters): On each and every transaction that matches optional filter criteria, run a mapping function
-- [LogHanders](../../build/manifest/ethereum.md#mapping-handlers-and-filters): On each and every log that matches optional filter criteria, run a mapping function
+::: tip Note
+The final code of this project can be found [here](https://github.com/subquery/ethereum-subql-starter/blob/main/Heco/heco-starter/).
+:::
-Note that the manifest file has already been set up correctly and doesn’t require significant changes, but you need to import the correct contract definitions and update the datasource handlers.
+
As we are indexing all transfers and approvals from the Wrapped HT contract on the Heco Chain network, the first step is to import the contract ABI definition, which can be obtained from any standard [ERC-20 contract](https://ethereum.org/en/developers/docs/standards/tokens/erc-20/). Copy the entire contract ABI and save it as a file called `erc20.abi.json` in the `/abis` directory.
@@ -82,11 +71,9 @@ The above code indicates that you will be running a `handleTransaction` mapping
The code also indicates that you will be running a `handleLog` mapping function whenever there is a `Transfer` event being emitted from the [Wrapped HT contract](https://www.hecoinfo.com/en-us/token/0x5545153ccfca01fbd7dd11c0b23ba694d9509a6f).
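
To make the `handleTransaction` side concrete, the corresponding call-handler entry in `project.ts` is typically filtered on the ERC-20 `approve` function signature, roughly as sketched here (an illustrative excerpt, not the full datasource definition):

```ts
import { EthereumHandlerKind } from "@subql/types-ethereum";

// Sketch of the transaction (call) handler entry inside the datasource's mapping.handlers
export const approveCallHandler = {
  kind: EthereumHandlerKind.Call,
  handler: "handleTransaction",
  filter: {
    // Only transactions calling the ERC-20 approve function reach handleTransaction
    function: "approve(address spender, uint256 rawAmount)",
  },
};
```
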
-Check out our [Manifest File](../../build/manifest/ethereum.md) documentation to get more information about the Project Manifest (`project.ts`) file.
+
-## 2. Update Your GraphQL Schema File
-
-The `schema.graphql` file determines the shape of your data from SubQuery due to the mechanism of the GraphQL query language. Hence, updating the GraphQL Schema file is the perfect place to start. It allows you to define your end goal right at the start.
+
Remove all existing entities and update the `schema.graphql` file as follows. Here you can see we are indexing block information such as the id, blockHeight, transfer receiver and transfer sender, along with approvals and all of the attributes related to them (such as owner and spender).
@@ -110,51 +97,21 @@ type Approval @entity {
}
```
-SubQuery makes it easy and type-safe to work with your GraphQL entities, as well as smart contracts, events, transactions, and logs. SubQuery CLI will generate types from your project's GraphQL schema and any contract ABIs included in the data sources.
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn codegen
-```
-
-@tab npm
-
-```shell
-npm run-script codegen
-```
+
-:::
-
-This will create a new directory (or update the existing one) `src/types` which contains generated entity classes for each type you have defined previously in `schema.graphql`. These classes provide type-safe entity loading, and read and write access to entity fields - see more about this process in [the GraphQL Schema](../../build/graphql.md). All entities can be imported from the following directory:
+
```ts
import { Approval, Transfer } from "../types";
-```
-
-As you're creating a new EVM based project, this command will also generate ABI types and save them into `src/types` using the `npx typechain --target=ethers-v5` command, allowing you to bind these contracts to specific addresses in the mappings and call read-only contract methods against the block being processed.
-
-It will also generate a class for every contract event to provide easy access to event parameters, as well as the block and transaction the event originated from. Read about how this is done in [EVM Codegen from ABIs](../../build/introduction.md#evm-codegen-from-abis).
-
-In this example SubQuery project, you would import these types like so.
-
-```ts
import {
ApproveTransaction,
TransferLog,
} from "../types/abi-interfaces/Erc20Abi";
```
-::: warning Important
-When you make any changes to the schema file, please ensure that you regenerate your types directory using the SubQuery CLI prompt `yarn codegen` or `npm run-script codegen`.
-:::
-
-Check out the [GraphQL Schema](../../build/graphql.md) documentation to get in-depth information on `schema.graphql` file.
-
-## 3. Add a Mapping Function
+
-Mapping functions define how chain data is transformed into the optimised GraphQL entities that we previously defined in the `schema.graphql` file.
+
Navigate to the default mapping function in the `src/mappings` directory. You will be able to see two exported functions `handleLog` and `handleTransaction`:
@@ -202,73 +159,13 @@ The `handleLog` function receives a `log` parameter of type `TransferLog` which
The `handleTransaction` function receives a `tx` parameter of type `ApproveTransaction` which includes transaction data in the payload. We extract this data and then save this to the store using the `.save()` function (_Note that SubQuery will automatically save this to the database_).
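
The `handleLog` counterpart follows the same pattern. A hedged sketch, assuming the `Transfer` entity fields match the schema above, looks like this:

```ts
import assert from "assert";
import { Transfer } from "../types";
import { TransferLog } from "../types/abi-interfaces/Erc20Abi";

export async function handleLog(log: TransferLog): Promise<void> {
  // The ERC-20 Transfer event carries from, to and value as decoded args
  assert(log.args, "Expected decoded log arguments");

  const transfer = Transfer.create({
    id: log.transactionHash,
    blockHeight: BigInt(log.blockNumber),
    to: log.args.to,
    from: log.args.from,
    value: log.args.value.toBigInt(),
    contractAddress: log.address,
  });

  await transfer.save();
}
```
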
-Check out our [Mappings](../../build/mapping/ethereum.md) documentation to get more information on mapping functions.
-
-## 4. Build Your Project
-
-Next, build your work to run your new SubQuery project. Run the build command from the project's root directory as given here:
+
-::: code-tabs
-@tab:active yarn
+
-```shell
-yarn build
-```
-
-@tab npm
+
-```shell
-npm run-script build
-```
-
-:::
-
-::: warning Important
-Whenever you make changes to your mapping functions, you must rebuild your project.
-:::
-
-Now, you are ready to run your first SubQuery project. Let’s check out the process of running your project in detail.
-
-## 5. Run Your Project Locally with Docker
-
-Whenever you create a new SubQuery Project, first, you must run it locally on your computer and test it and using Docker is the easiest and quickest way to do this.
-
-The `docker-compose.yml` file defines all the configurations that control how a SubQuery node runs. For a new project, which you have just initialised, you won't need to change anything.
-
-However, visit the [Running SubQuery Locally](../../run_publish/run.md) to get more information on the file and the settings.
-
-Run the following command under the project directory:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn start:docker
-```
-
-@tab npm
-
-```shell
-npm run-script start:docker
-```
-
-:::
-
-::: tip Note
-It may take a few minutes to download the required images and start the various nodes and Postgres databases.
-:::
-
-## 6. Query your Project
-
-Next, let's query our project. Follow these three simple steps to query your SubQuery project:
-
-1. Open your browser and head to `http://localhost:3000`.
-
-2. You will see a GraphQL playground in the browser and the schemas which are ready to query.
-
-3. Find the _Docs_ tab on the right side of the playground which should open a documentation drawer. This documentation is automatically generated and it helps you find what entities and methods you can query.
-
-Try the following query to understand how it works for your new SubQuery starter project. Don’t forget to learn more about the [GraphQL Query language](../../run_publish/query.md).
+
```graphql
# Write your query or mutation here
@@ -362,14 +259,4 @@ You will see the result similar to below:
The final code of this project can be found [here](https://github.com/subquery/ethereum-subql-starter/blob/main/Heco/heco-starter/).
:::
-## What's next?
-
-Congratulations! You have now a locally running SubQuery project that accepts GraphQL API requests for transferring data.
-
-::: tip Tip
-
-Find out how to build a performant SubQuery project and avoid common mistakes in [Project Optimisation](../../build/optimisation.md).
-
-:::
-
-Click [here](../../quickstart/whats-next.md) to learn what should be your **next step** in your SubQuery journey.
+
diff --git a/docs/quickstart/quickstart_chains/immutable-testnet.md b/docs/quickstart/quickstart_chains/immutable-testnet.md
index 48306d82040..dc6b2dffec4 100644
--- a/docs/quickstart/quickstart_chains/immutable-testnet.md
+++ b/docs/quickstart/quickstart_chains/immutable-testnet.md
@@ -1,14 +1,10 @@
# Immutable (Testnet) Quick Start
-## Goals
-
The goal of this quick start guide is to index all transfers and approval events from the Immutable Testnet [Gas Token](https://immutable-testnet.blockscout.com/token/0x0000000000000000000000000000000000001010) on [Immutable Testnet](https://immutable-testnet.blockscout.com).
-::: warning
-Before we begin, **make sure that you have initialised your project** using the provided steps in the [Start Here](../quickstart.md) section. Please initialise an Immutable Testnet project.
-:::
+
-In every SubQuery project, there are [3 key files](../quickstart.md#_3-make-changes-to-your-project) to update. Let's begin updating them one by one.
+Please initialise an Immutable Testnet project.
::: tip Note
The final code of this project can be found [here](https://github.com/subquery/ethereum-subql-starter/blob/main/Immutable/immutable-testnet-starter/).
@@ -16,15 +12,7 @@ The final code of this project can be found [here](https://github.com/subquery/e
We use Ethereum packages, runtimes, and handlers (e.g. `@subql/node-ethereum`, `ethereum/Runtime`, and `ethereum/*Handler`) for Immutable Testnet. Since Immutable Testnet is an EVM-compatible layer-2 scaling solution, we can use the core Ethereum framework to index it.
:::
-## 1. Your Project Manifest File
-
-The Project Manifest (`project.ts`) file works as an entry point to your Immutable project. It defines most of the details on how SubQuery will index and transform the chain data. For Immutable Testnet, there are three types of mapping handlers (and you can have more than one in each project):
-
-- [BlockHanders](../../build/manifest/ethereum.md#mapping-handlers-and-filters): On each and every block, run a mapping function
-- [TransactionHandlers](../../build/manifest/ethereum.md#mapping-handlers-and-filters): On each and every transaction that matches optional filter criteria, run a mapping function
-- [LogHanders](../../build/manifest/ethereum.md#mapping-handlers-and-filters): On each and every log that matches optional filter criteria, run a mapping function
-
-Note that the manifest file has already been set up correctly and doesn’t require significant changes, but you need to import the correct contract definitions and update the datasource handlers.
+
As we are indexing all transfers and approvals from the Gas Token contract on Immutable Testnet, the first step is to import the contract ABI definition, which can be obtained from any standard [ERC-20 contract](https://ethereum.org/en/developers/docs/standards/tokens/erc-20/). Copy the entire contract ABI and save it as a file called `erc20.abi.json` in the `/abis` directory.
@@ -81,11 +69,9 @@ The above code indicates that you will be running a `handleTransaction` mapping
The code also indicates that you will be running a `handleLog` mapping function whenever there is a `Transfer` event being emitted from the Immutable [Gas Token](https://immutable-testnet.blockscout.com/token/0x0000000000000000000000000000000000001010).
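
Because `TransferLog` is generated from the ERC-20 ABI, the decoded event arguments are available as typed properties. A tiny illustrative helper (the `describeTransfer` name is hypothetical and not part of the starter) shows the shape of that data:

```ts
import { TransferLog } from "../types/abi-interfaces/Erc20Abi";

// Hypothetical helper: summarise a decoded Transfer event for logging or debugging
export function describeTransfer(log: TransferLog): string {
  if (!log.args) {
    // args is only populated when the log was decoded against a matching ABI
    return `Undecoded log in tx ${log.transactionHash}`;
  }
  const { from, to, value } = log.args;
  return `${from} -> ${to}: ${value.toString()} at block ${log.blockNumber}`;
}
```
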
-Check out our [Manifest File](../../build/manifest/ethereum.md) documentation to get more information about the Project Manifest (`project.ts`) file.
-
-## 2. Update Your GraphQL Schema File
+
-The `schema.graphql` file determines the shape of your data from SubQuery due to the mechanism of the GraphQL query language. Hence, updating the GraphQL Schema file is the perfect place to start. It allows you to define your end goal right at the start.
+
Remove all existing entities and update the `schema.graphql` file as follows. Here you can see we are indexing block information such as the id, blockHeight, transfer receiver and transfer sender, along with approvals and all of the attributes related to them (such as owner and spender).
@@ -109,51 +95,21 @@ type Approval @entity {
}
```
-SubQuery makes it easy and type-safe to work with your GraphQL entities, as well as smart contracts, events, transactions, and logs. SubQuery CLI will generate types from your project's GraphQL schema and any contract ABIs included in the data sources.
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn codegen
-```
-
-@tab npm
-
-```shell
-npm run-script codegen
-```
-
-:::
+
-This will create a new directory (or update the existing one) `src/types` which contains generated entity classes for each type you have defined previously in `schema.graphql`. These classes provide type-safe entity loading, and read and write access to entity fields - see more about this process in [the GraphQL Schema](../../build/graphql.md). All entities can be imported from the following directory:
+
```ts
import { Approval, Transfer } from "../types";
-```
-
-As you're creating a new EVM based project, this command will also generate ABI types and save them into `src/types` using the `npx typechain --target=ethers-v5` command, allowing you to bind these contracts to specific addresses in the mappings and call read-only contract methods against the block being processed.
-
-It will also generate a class for every contract event to provide easy access to event parameters, as well as the block and transaction the event originated from. Read about how this is done in [EVM Codegen from ABIs](../../build/introduction.md#evm-codegen-from-abis).
-
-In this example SubQuery project, you would import these types like so.
-
-```ts
import {
ApproveTransaction,
TransferLog,
} from "../types/abi-interfaces/Erc20Abi";
```
-::: warning Important
-When you make any changes to the schema file, please ensure that you regenerate your types directory using the SubQuery CLI prompt `yarn codegen` or `npm run-script codegen`.
-:::
-
-Check out the [GraphQL Schema](../../build/graphql.md) documentation to get in-depth information on `schema.graphql` file.
+
-## 3. Add a Mapping Function
-
-Mapping functions define how chain data is transformed into the optimised GraphQL entities that we previously defined in the `schema.graphql` file.
+
Navigate to the default mapping function in the `src/mappings` directory. You will be able to see two exported functions `handleLog` and `handleTransaction`:
@@ -194,73 +150,13 @@ The `handleLog` function receives a `log` parameter of type `TransferLog` which
The `handleTransaction` function receives a `tx` parameter of type `ApproveTransaction` which includes transaction data in the payload. We extract this data and then save this to the store using the `.save()` function (_Note that SubQuery will automatically save this to the database_).
-Check out our [Mappings](../../build/mapping/ethereum.md) documentation to get more information on mapping functions.
+
-## 4. Build Your Project
+
-Next, build your work to run your new SubQuery project. Run the build command from the project's root directory as given here:
+
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn build
-```
-
-@tab npm
-
-```shell
-npm run-script build
-```
-
-:::
-
-::: warning Important
-Whenever you make changes to your mapping functions, you must rebuild your project.
-:::
-
-Now, you are ready to run your first SubQuery project. Let’s check out the process of running your project in detail.
-
-## 5. Run Your Project Locally with Docker
-
-Whenever you create a new SubQuery Project, first, you must run it locally on your computer and test it and using Docker is the easiest and quickest way to do this.
-
-The `docker-compose.yml` file defines all the configurations that control how a SubQuery node runs. For a new project, which you have just initialised, you won't need to change anything.
-
-However, visit the [Running SubQuery Locally](../../run_publish/run.md) to get more information on the file and the settings.
-
-Run the following command under the project directory:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn start:docker
-```
-
-@tab npm
-
-```shell
-npm run-script start:docker
-```
-
-:::
-
-::: tip Note
-It may take a few minutes to download the required images and start the various nodes and Postgres databases.
-:::
-
-## 6. Query your Project
-
-Next, let's query our project. Follow these three simple steps to query your SubQuery project:
-
-1. Open your browser and head to `http://localhost:3000`.
-
-2. You will see a GraphQL playground in the browser and the schemas which are ready to query.
-
-3. Find the _Docs_ tab on the right side of the playground which should open a documentation drawer. This documentation is automatically generated and it helps you find what entities and methods you can query.
-
-Try the following query to understand how it works for your new SubQuery starter project. Don’t forget to learn more about the [GraphQL Query language](../../run_publish/query.md).
+
```graphql
# Write your query or mutation here
@@ -354,14 +250,4 @@ You will see the result similar to below:
The final code of this project can be found [here](https://github.com/subquery/ethereum-subql-starter/blob/main/Immutable).
:::
-## What's next?
-
-Congratulations! You have now a locally running SubQuery project that accepts GraphQL API requests for transferring data.
-
-::: tip Tip
-
-Find out how to build a performant SubQuery project and avoid common mistakes in [Project Optimisation](../../build/optimisation.md).
-
-:::
-
-Click [here](../../quickstart/whats-next.md) to learn what should be your **next step** in your SubQuery journey.
+
diff --git a/docs/quickstart/quickstart_chains/klaytn.md b/docs/quickstart/quickstart_chains/klaytn.md
index bac69c85fd0..b524187819b 100644
--- a/docs/quickstart/quickstart_chains/klaytn.md
+++ b/docs/quickstart/quickstart_chains/klaytn.md
@@ -1,14 +1,10 @@
# Klaytn Quick Start
-## Goals
-
The goal of this quick start guide is to index all transfers and approval events from the [Orbit ETH](https://scope.klaytn.com/token/0x34d21b1e550d73cee41151c77f3c73359527a396) on [Klaytn](https://scope.klaytn.com) Network.
-::: warning
-Before we begin, **make sure that you have initialised your project** using the provided steps in the [Start Here](../quickstart.md) section. Please initialise an a Klaytn project.
-:::
+
-In every SubQuery project, there are [3 key files](../quickstart.md#_3-make-changes-to-your-project) to update. Let's begin updating them one by one.
+Please initialise a Klaytn project.
::: tip Note
The final code of this project can be found [here](https://github.com/subquery/ethereum-subql-starter/blob/main/Klaytn/klaytn-starter).
@@ -16,15 +12,7 @@ The final code of this project can be found [here](https://github.com/subquery/e
We use Ethereum packages, runtimes, and handlers (e.g. `@subql/node-ethereum`, `ethereum/Runtime`, and `ethereum/*Handler`) for Klaytn Network. Since Klaytn is an EVM-compatible layer-1, we can use the core Ethereum framework to index it.
:::
-## 1. Your Project Manifest File
-
-The Project Manifest (`project.ts`) file works as an entry point to your Klaytn project. It defines most of the details on how SubQuery will index and transform the chain data. For Klaytn, there are three types of mapping handlers (and you can have more than one in each project):
-
-- [BlockHanders](../../build/manifest/ethereum.md#mapping-handlers-and-filters): On each and every block, run a mapping function
-- [TransactionHandlers](../../build/manifest/ethereum.md#mapping-handlers-and-filters): On each and every transaction that matches optional filter criteria, run a mapping function
-- [LogHanders](../../build/manifest/ethereum.md#mapping-handlers-and-filters): On each and every log that matches optional filter criteria, run a mapping function
-
-Note that the manifest file has already been set up correctly and doesn’t require significant changes, but you need to import the correct contract definitions and update the datasource handlers.
+
As we are indexing all transfers and approvals from the Orbit ETH contract on the Klaytn network, the first step is to import the contract ABI definition, which can be obtained from any standard [ERC-20 contract](https://ethereum.org/en/developers/docs/standards/tokens/erc-20/). Copy the entire contract ABI and save it as a file called `erc20.abi.json` in the `/abis` directory.
@@ -83,11 +71,9 @@ The above code indicates that you will be running a `handleTransaction` mapping
The code also indicates that you will be running a `handleLog` mapping function whenever there is a `Transfer` event being emitted from the [Orbit ETH](https://scope.klaytn.com/token/0x34d21b1e550d73cee41151c77f3c73359527a396).
-Check out our [Manifest File](../../build/manifest/ethereum.md) documentation to get more information about the Project Manifest (`project.ts`) file.
-
-## 2. Update Your GraphQL Schema File
+
-The `schema.graphql` file determines the shape of your data from SubQuery due to the mechanism of the GraphQL query language. Hence, updating the GraphQL Schema file is the perfect place to start. It allows you to define your end goal right at the start.
+
Remove all existing entities and update the `schema.graphql` file as follows. Here you can see we are indexing block information such as the id, blockHeight, transfer receiver and transfer sender, along with approvals and all of the attributes related to them (such as owner and spender).
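
After codegen, each entity defined here becomes a generated class with type-safe `get` and `save` methods. A small hedged illustration follows; the helper itself is hypothetical, and only the `Transfer` class and its `blockHeight` field come from the schema described above.

```ts
import { Transfer } from "../types";

// Hypothetical helper: load a previously indexed Transfer by id and update one field
export async function touchTransfer(id: string, blockHeight: bigint): Promise<void> {
  const transfer = await Transfer.get(id);
  if (!transfer) return; // nothing has been indexed under this id yet

  transfer.blockHeight = blockHeight; // entity fields are plain typed properties
  await transfer.save(); // persists the change back to the store
}
```
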
@@ -111,47 +97,21 @@ type Approval @entity {
}
```
-SubQuery makes it easy and type-safe to work with your GraphQL entities, as well as smart contracts, events, transactions, and logs. SubQuery CLI will generate types from your project's GraphQL schema and any contract ABIs included in the data sources.
+
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn codegen
-```
-
-@tab npm
-
-```shell
-npm run-script codegen
-```
-
-:::
-
-This will create a new directory (or update the existing one) `src/types` which contains generated entity classes for each type you have defined previously in `schema.graphql`. These classes provide type-safe entity loading, and read and write access to entity fields - see more about this process in [the GraphQL Schema](../../build/graphql.md). All entities can be imported from the following directory:
+
```ts
import { Approval, Transfer } from "../types";
-```
-
-If you're creating a new EVM-based project, this command will also generate ABI types and save them into `src/types` using the `npx typechain --target=ethers-v5` command, allowing you to bind these contracts to specific addresses in the mappings and call read-only contract methods against the block being processed. It will also generate a class for every contract event to provide easy access to event parameters, as well as the block and transaction the event originated from. All of these types are written to `src/types/abi-interfaces` and `src/types/contracts` directories. In this example SubQuery project, you would import these types like so.
-
-```ts
import {
ApproveTransaction,
TransferLog,
} from "../types/abi-interfaces/Erc20Abi";
```
-::: warning Important
-When you make any changes to the schema file, please ensure that you regenerate your types directory using the SubQuery CLI prompt `yarn codegen` or `npm run-script codegen`.
-:::
-
-Check out the [GraphQL Schema](../../build/graphql.md) documentation to get in-depth information on `schema.graphql` file.
-
-## 3. Add a Mapping Function
+
-Mapping functions define how chain data is transformed into the optimised GraphQL entities that we previously defined in the `schema.graphql` file.
+
Navigate to the default mapping function in the `src/mappings` directory. You will be able to see two exported functions `handleLog` and `handleTransaction`:
@@ -192,73 +152,13 @@ The `handleLog` function receives a `log` parameter of type `TransferLog` which
The `handleTransaction` function receives a `tx` parameter of type `ApproveTransaction` which includes transaction data in the payload. We extract this data and then save this to the store using the `.save()` function (_Note that SubQuery will automatically save this to the database_).
-Check out our [Mappings](../../build/mapping/ethereum.md) documentation to get more information on mapping functions.
+
-## 4. Build Your Project
+
-Next, build your work to run your new SubQuery project. Run the build command from the project's root directory as given here:
+
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn build
-```
-
-@tab npm
-
-```shell
-npm run-script build
-```
-
-:::
-
-::: warning Important
-Whenever you make changes to your mapping functions, you must rebuild your project.
-:::
-
-Now, you are ready to run your first SubQuery project. Let’s check out the process of running your project in detail.
-
-## 5. Run Your Project Locally with Docker
-
-Whenever you create a new SubQuery Project, first, you must run it locally on your computer and test it and using Docker is the easiest and quickest way to do this.
-
-The `docker-compose.yml` file defines all the configurations that control how a SubQuery node runs. For a new project, which you have just initialised, you won't need to change anything.
-
-However, visit the [Running SubQuery Locally](../../run_publish/run.md) to get more information on the file and the settings.
-
-Run the following command under the project directory:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn start:docker
-```
-
-@tab npm
-
-```shell
-npm run-script start:docker
-```
-
-:::
-
-::: tip Note
-It may take a few minutes to download the required images and start the various nodes and Postgres databases.
-:::
-
-## 6. Query your Project
-
-Next, let's query our project. Follow these three simple steps to query your SubQuery project:
-
-1. Open your browser and head to `http://localhost:3000`.
-
-2. You will see a GraphQL playground in the browser and the schemas which are ready to query.
-
-3. Find the _Docs_ tab on the right side of the playground which should open a documentation drawer. This documentation is automatically generated and it helps you find what entities and methods you can query.
-
-Try the following query to understand how it works for your new SubQuery starter project. Don’t forget to learn more about the [GraphQL Query language](../../run_publish/query.md).
+
```graphql
# Write your query or mutation here
@@ -361,14 +261,4 @@ You will see the result similar to below:
The final code of this project can be found [here](https://github.com/subquery/ethereum-subql-starter/blob/main/Klaytn/klaytn-starter).
:::
-## What's next?
-
-Congratulations! You have now a locally running SubQuery project that accepts GraphQL API requests for transferring data.
-
-::: tip Tip
-
-Find out how to build a performant SubQuery project and avoid common mistakes in [Project Optimisation](../../build/optimisation.md).
-
-:::
-
-Click [here](../../quickstart/whats-next.md) to learn what should be your **next step** in your SubQuery journey.
+
diff --git a/docs/quickstart/quickstart_chains/mantle.md b/docs/quickstart/quickstart_chains/mantle.md
index e7f542a3a0a..f0f11e857a4 100644
--- a/docs/quickstart/quickstart_chains/mantle.md
+++ b/docs/quickstart/quickstart_chains/mantle.md
@@ -1,14 +1,10 @@
# Mantle Quick Start
-## Goals
-
The goal of this quick start guide is to index all transfers and approval events from the [Mantle Native token](https://explorer.mantle.xyz/token/0xDeadDeAddeAddEAddeadDEaDDEAdDeaDDeAD0000/token-transfers) on Mantle Network.
-::: warning
-Before we begin, **make sure that you have initialised your project** using the provided steps in the [Start Here](../quickstart.md) section. Please initialise an a Mantle project.
-:::
+
-In every SubQuery project, there are [3 key files](../quickstart.md#_3-make-changes-to-your-project) to update. Let's begin updating them one by one.
+Please initialise a Mantle project.
::: tip Note
The final code of this project can be found [here](https://github.com/subquery/ethereum-subql-starter/tree/main/Mantle/mantle-starter).
@@ -16,15 +12,7 @@ The final code of this project can be found [here](https://github.com/subquery/e
We use Ethereum packages, runtimes, and handlers (e.g. `@subql/node-ethereum`, `ethereum/Runtime`, and `ethereum/*Handler`) for Mantle. Since Mantle is an EVM-compatible layer-2 scaling solution, we can use the core Ethereum framework to index it.
:::
-## 1. Your Project Manifest File
-
-The Project Manifest (`project.ts`) file works as an entry point to your Mantle project. It defines most of the details on how SubQuery will index and transform the chain data. For Mantle, there are three types of mapping handlers (and you can have more than one in each project):
-
-- [BlockHanders](../../build/manifest/ethereum.md#mapping-handlers-and-filters): On each and every block, run a mapping function
-- [TransactionHandlers](../../build/manifest/ethereum.md#mapping-handlers-and-filters): On each and every transaction that matches optional filter criteria, run a mapping function
-- [LogHanders](../../build/manifest/ethereum.md#mapping-handlers-and-filters): On each and every log that matches optional filter criteria, run a mapping function
-
-Note that the manifest file has already been set up correctly and doesn’t require significant changes, but you need to import the correct contract definitions and update the datasource handlers.
+
As we are indexing all transfers and approvals from the [Mantle Native token](https://explorer.mantle.xyz/token/0xDeadDeAddeAddEAddeadDEaDDEAdDeaDDeAD0000/token-transfers) on Mantle Network, the first step is to import the contract ABI definition, which can be obtained from any standard [ERC-20 contract](https://ethereum.org/en/developers/docs/standards/tokens/erc-20/). Copy the entire contract ABI and save it as a file called `erc20.abi.json` in the `/abis` directory.
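
For reference, the saved ABI file and the token address are wired together through the datasource's `options` and `assets` in `project.ts`, roughly as sketched below (an illustrative excerpt; the `startBlock` shown is a placeholder):

```ts
import { EthereumDatasourceKind } from "@subql/types-ethereum";

// Sketch of how the ABI file and the Mantle Native token address are referenced
export const datasourceExcerpt = {
  kind: EthereumDatasourceKind.Runtime,
  startBlock: 1, // placeholder: use a block near the contract's deployment
  options: {
    abi: "erc20", // must match the key used in assets below
    address: "0xDeadDeAddeAddEAddeadDEaDDEAdDeaDDeAD0000", // Mantle Native token
  },
  assets: new Map([["erc20", { file: "./abis/erc20.abi.json" }]]),
};
```
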
@@ -83,11 +71,9 @@ The above code indicates that you will be running a `handleTransaction` mapping
The code also indicates that you will be running a `handleLog` mapping function whenever there is a `Transfer` event being emitted from the [Mantle Native token](https://explorer.mantle.xyz/token/0xDeadDeAddeAddEAddeadDEaDDEAdDeaDDeAD0000/token-transfers).
-Check out our [Manifest File](../../build/manifest/ethereum.md) documentation to get more information about the Project Manifest (`project.ts`) file.
-
-## 2. Update Your GraphQL Schema File
+
-The `schema.graphql` file determines the shape of your data from SubQuery due to the mechanism of the GraphQL query language. Hence, updating the GraphQL Schema file is the perfect place to start. It allows you to define your end goal right at the start.
+
Remove all existing entities and update the `schema.graphql` file as follows. Here you can see we are indexing block information such as the id, blockHeight, transfer receiver and transfer sender, along with approvals and all of the attributes related to them (such as owner and spender).
@@ -111,47 +97,21 @@ type Approval @entity {
}
```
-SubQuery makes it easy and type-safe to work with your GraphQL entities, as well as smart contracts, events, transactions, and logs. SubQuery CLI will generate types from your project's GraphQL schema and any contract ABIs included in the data sources.
+
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn codegen
-```
-
-@tab npm
-
-```shell
-npm run-script codegen
-```
-
-:::
-
-This will create a new directory (or update the existing one) `src/types` which contains generated entity classes for each type you have defined previously in `schema.graphql`. These classes provide type-safe entity loading, and read and write access to entity fields - see more about this process in [the GraphQL Schema](../../build/graphql.md). All entities can be imported from the following directory:
+
```ts
import { Approval, Transfer } from "../types";
-```
-
-If you're creating a new EVM-based project, this command will also generate ABI types and save them into `src/types` using the `npx typechain --target=ethers-v5` command, allowing you to bind these contracts to specific addresses in the mappings and call read-only contract methods against the block being processed. It will also generate a class for every contract event to provide easy access to event parameters, as well as the block and transaction the event originated from. All of these types are written to `src/types/abi-interfaces` and `src/types/contracts` directories. In this example SubQuery project, you would import these types like so.
-
-```ts
import {
ApproveTransaction,
TransferLog,
} from "../types/abi-interfaces/Erc20Abi";
```
-::: warning Important
-When you make any changes to the schema file, please ensure that you regenerate your types directory using the SubQuery CLI prompt `yarn codegen` or `npm run-script codegen`.
-:::
-
-Check out the [GraphQL Schema](../../build/graphql.md) documentation to get in-depth information on `schema.graphql` file.
-
-## 3. Add a Mapping Function
+
-Mapping functions define how chain data is transformed into the optimised GraphQL entities that we previously defined in the `schema.graphql` file.
+
Navigate to the default mapping function in the `src/mappings` directory. You will be able to see two exported functions `handleLog` and `handleTransaction`:
@@ -192,73 +152,13 @@ The `handleLog` function receives a `log` parameter of type `TransferLog` which
The `handleTransaction` function receives a `tx` parameter of type `ApproveTransaction` which includes transaction data in the payload. We extract this data and then save this to the store using the `.save()` function (_Note that SubQuery will automatically save this to the database_).
-Check out our [Mappings](../../build/mapping/ethereum.md) documentation to get more information on mapping functions.
+
-## 4. Build Your Project
+
-Next, build your work to run your new SubQuery project. Run the build command from the project's root directory as given here:
+
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn build
-```
-
-@tab npm
-
-```shell
-npm run-script build
-```
-
-:::
-
-::: warning Important
-Whenever you make changes to your mapping functions, you must rebuild your project.
-:::
-
-Now, you are ready to run your first SubQuery project. Let’s check out the process of running your project in detail.
-
-## 5. Run Your Project Locally with Docker
-
-Whenever you create a new SubQuery Project, first, you must run it locally on your computer and test it and using Docker is the easiest and quickest way to do this.
-
-The `docker-compose.yml` file defines all the configurations that control how a SubQuery node runs. For a new project, which you have just initialised, you won't need to change anything.
-
-However, visit the [Running SubQuery Locally](../../run_publish/run.md) to get more information on the file and the settings.
-
-Run the following command under the project directory:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn start:docker
-```
-
-@tab npm
-
-```shell
-npm run-script start:docker
-```
-
-:::
-
-::: tip Note
-It may take a few minutes to download the required images and start the various nodes and Postgres databases.
-:::
-
-## 6. Query your Project
-
-Next, let's query our project. Follow these three simple steps to query your SubQuery project:
-
-1. Open your browser and head to `http://localhost:3000`.
-
-2. You will see a GraphQL playground in the browser and the schemas which are ready to query.
-
-3. Find the _Docs_ tab on the right side of the playground which should open a documentation drawer. This documentation is automatically generated and it helps you find what entities and methods you can query.
-
-Try the following query to understand how it works for your new SubQuery starter project. Don’t forget to learn more about the [GraphQL Query language](../../run_publish/query.md).
+
```graphql
# Write your query or mutation here
@@ -393,14 +293,4 @@ You will see the result similar to below:
The final code of this project can be found [here](https://github.com/subquery/ethereum-subql-starter/tree/main/Mantle/mantle-starter).
:::
-## What's next?
-
-Congratulations! You have now a locally running SubQuery project that accepts GraphQL API requests for transferring data.
-
-::: tip Tip
-
-Find out how to build a performant SubQuery project and avoid common mistakes in [Project Optimisation](../../build/optimisation.md).
-
-:::
-
-Click [here](../../quickstart/whats-next.md) to learn what should be your **next step** in your SubQuery journey.
+
diff --git a/docs/quickstart/quickstart_chains/meter.md b/docs/quickstart/quickstart_chains/meter.md
index 7f898c42c48..1f9d7a516cf 100644
--- a/docs/quickstart/quickstart_chains/meter.md
+++ b/docs/quickstart/quickstart_chains/meter.md
@@ -1,14 +1,8 @@
# Meter Quick Start
-## Goals
-
The goal of this quick start guide is to index all transfers and approval events from the [Wrapped Eth](https://scan.meter.io/address/0x983147fb73a45fc7f8b4dfa1cd61bdc7b111e5b6) on [Meter Network](https://scan.meter.io).
-::: warning
-Before we begin, **make sure that you have initialised your project** using the provided steps in the [Start Here](../quickstart.md) section. Please initialise an a Meter project.
-:::
-
-In every SubQuery project, there are [3 key files](../quickstart.md#_3-make-changes-to-your-project) to update. Let's begin updating them one by one.
+Please initialise a Meter project.
::: tip Note
The final code of this project can be found [here](https://github.com/subquery/ethereum-subql-starter/blob/main/Meter/meter-starter).
@@ -16,16 +10,7 @@ The final code of this project can be found [here](https://github.com/subquery/e
We use Ethereum packages, runtimes, and handlers (e.g. `@subql/node-ethereum`, `ethereum/Runtime`, and `ethereum/*Handler`) for Meter. Since Meter is an EVM-compatible layer-2 scaling solution, we can use the core Ethereum framework to index it.
:::
-## 1. Your Project Manifest File
-
-The Project Manifest (`project.ts`) file works as an entry point to your Meter project. It defines most of the details on how SubQuery will index and transform the chain data. For
-Meter, there are three types of mapping handlers (and you can have more than one in each project):
-
-- [BlockHanders](../../build/manifest/ethereum.md#mapping-handlers-and-filters): On each and every block, run a mapping function
-- [TransactionHandlers](../../build/manifest/ethereum.md#mapping-handlers-and-filters): On each and every transaction that matches optional filter criteria, run a mapping function
-- [LogHanders](../../build/manifest/ethereum.md#mapping-handlers-and-filters): On each and every log that matches optional filter criteria, run a mapping function
-
-Note that the manifest file has already been set up correctly and doesn’t require significant changes, but you need to import the correct contract definitions and update the datasource handlers.
+
As we are indexing all transfers and approvals from the Wrapped ETH contract on the Meter network, the first step is to import the contract ABI definition, which can be obtained from any standard [ERC-20 contract](https://ethereum.org/en/developers/docs/standards/tokens/erc-20/). Copy the entire contract ABI and save it as a file called `erc20.abi.json` in the `/abis` directory.
@@ -83,11 +68,9 @@ The above code indicates that you will be running a `handleTransaction` mapping
The code also indicates that you will be running a `handleLog` mapping function whenever there is a `Transfer` event being emitted from the [WETH contract](https://scan.meter.io/address/0x983147fb73a45fc7f8b4dfa1cd61bdc7b111e5b6).
-Check out our [Manifest File](../../build/manifest/ethereum.md) documentation to get more information about the Project Manifest (`project.ts`) file.
+
-## 2. Update Your GraphQL Schema File
-
-The `schema.graphql` file determines the shape of your data from SubQuery due to the mechanism of the GraphQL query language. Hence, updating the GraphQL Schema file is the perfect place to start. It allows you to define your end goal right at the start.
+
Remove all existing entities and update the `schema.graphql` file as follows. Here you can see we are indexing block information such as the id, blockHeight, transfer receiver and transfer sender, along with approvals and all of the attributes related to them (such as owner and spender).
@@ -111,51 +94,21 @@ type Approval @entity {
}
```
-SubQuery makes it easy and type-safe to work with your GraphQL entities, as well as smart contracts, events, transactions, and logs. SubQuery CLI will generate types from your project's GraphQL schema and any contract ABIs included in the data sources.
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn codegen
-```
-
-@tab npm
-
-```shell
-npm run-script codegen
-```
-
-:::
+
-This will create a new directory (or update the existing one) `src/types` which contains generated entity classes for each type you have defined previously in `schema.graphql`. These classes provide type-safe entity loading, and read and write access to entity fields - see more about this process in [the GraphQL Schema](../../build/graphql.md). All entities can be imported from the following directory:
+
```ts
import { Approval, Transfer } from "../types";
-```
-
-As you're creating a new EVM based project, this command will also generate ABI types and save them into `src/types` using the `npx typechain --target=ethers-v5` command, allowing you to bind these contracts to specific addresses in the mappings and call read-only contract methods against the block being processed.
-
-It will also generate a class for every contract event to provide easy access to event parameters, as well as the block and transaction the event originated from. Read about how this is done in [EVM Codegen from ABIs](../../build/introduction.md#evm-codegen-from-abis).
-
-In this example SubQuery project, you would import these types like so.
-
-```ts
import {
ApproveTransaction,
TransferLog,
} from "../types/abi-interfaces/Erc20Abi";
```
-::: warning Important
-When you make any changes to the schema file, please ensure that you regenerate your types directory using the SubQuery CLI prompt `yarn codegen` or `npm run-script codegen`.
-:::
-
-Check out the [GraphQL Schema](../../build/graphql.md) documentation to get in-depth information on `schema.graphql` file.
-
-## 3. Add a Mapping Function
+
-Mapping functions define how chain data is transformed into the optimised GraphQL entities that we previously defined in the `schema.graphql` file.
+
Navigate to the default mapping function in the `src/mappings` directory. You will be able to see two exported functions `handleLog` and `handleTransaction`:
@@ -196,73 +149,13 @@ The `handleLog` function receives a `log` parameter of type `TransferLog` which
The `handleTransaction` function receives a `tx` parameter of type `ApproveTransaction` which includes transaction data in the payload. We extract this data and then save this to the store using the `.save()` function (_Note that SubQuery will automatically save this to the database_).
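
One practical detail when saving entities this way: each `id` must be unique for its entity type, so if a single transaction can emit several events of interest you may want to derive the id from both the transaction hash and the log index. A hypothetical helper (not part of the starter):

```ts
// Hypothetical helper: build a unique entity id when one transaction emits many events
export function eventEntityId(txHash: string, logIndex: number): string {
  return `${txHash}-${logIndex}`;
}
```
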
-Check out our [Mappings](../../build/mapping/ethereum.md) documentation to get more information on mapping functions.
-
-## 4. Build Your Project
-
-Next, build your work to run your new SubQuery project. Run the build command from the project's root directory as given here:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn build
-```
-
-@tab npm
-
-```shell
-npm run-script build
-```
-
-:::
-
-::: warning Important
-Whenever you make changes to your mapping functions, you must rebuild your project.
-:::
-
-Now, you are ready to run your first SubQuery project. Let’s check out the process of running your project in detail.
-
-## 5. Run Your Project Locally with Docker
-
-Whenever you create a new SubQuery Project, first, you must run it locally on your computer and test it and using Docker is the easiest and quickest way to do this.
+
-The `docker-compose.yml` file defines all the configurations that control how a SubQuery node runs. For a new project, which you have just initialised, you won't need to change anything.
+
-However, visit the [Running SubQuery Locally](../../run_publish/run.md) to get more information on the file and the settings.
+
-Run the following command under the project directory:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn start:docker
-```
-
-@tab npm
-
-```shell
-npm run-script start:docker
-```
-
-:::
-
-::: tip Note
-It may take a few minutes to download the required images and start the various nodes and Postgres databases.
-:::
-
-## 6. Query your Project
-
-Next, let's query our project. Follow these three simple steps to query your SubQuery project:
-
-1. Open your browser and head to `http://localhost:3000`.
-
-2. You will see a GraphQL playground in the browser and the schemas which are ready to query.
-
-3. Find the _Docs_ tab on the right side of the playground which should open a documentation drawer. This documentation is automatically generated and it helps you find what entities and methods you can query.
-
-Try the following query to understand how it works for your new SubQuery starter project. Don’t forget to learn more about the [GraphQL Query language](../../run_publish/query.md).
+
```graphql
# Write your query or mutation here
@@ -332,14 +225,4 @@ You will see the result similar to below:
The final code of this project can be found [here](https://github.com/subquery/ethereum-subql-starter/blob/main/Meter/meter-starter).
:::
-## What's next?
-
-Congratulations! You have now a locally running SubQuery project that accepts GraphQL API requests for transferring data.
-
-::: tip Tip
-
-Find out how to build a performant SubQuery project and avoid common mistakes in [Project Optimisation](../../build/optimisation.md).
-
-:::
-
-Click [here](../../quickstart/whats-next.md) to learn what should be your **next step** in your SubQuery journey.
+
diff --git a/docs/quickstart/quickstart_chains/metis.md b/docs/quickstart/quickstart_chains/metis.md
index 2da150515e4..d3672a6f607 100644
--- a/docs/quickstart/quickstart_chains/metis.md
+++ b/docs/quickstart/quickstart_chains/metis.md
@@ -1,14 +1,10 @@
# Metis Quick Start
-## Goals
-
The goal of this quick start guide is to index all transfers and approval events from the [METIS Token](https://andromeda-explorer.metis.io/token/0xDeadDeAddeAddEAddeadDEaDDEAdDeaDDeAD0000/token-transfers) on [Metis](https://metis.io/) Network.
-::: warning
-Before we begin, **make sure that you have initialised your project** using the provided steps in the [Start Here](../quickstart.md) section. Please initialise an a Metis project.
-:::
+Please initialise a Metis project.
-In every SubQuery project, there are [3 key files](../quickstart.md#_3-make-changes-to-your-project) to update. Let's begin updating them one by one.
+
::: tip Note
The final code of this project can be found [here](https://github.com/subquery/ethereum-subql-starter/blob/main/Metis/metis-starter).
@@ -16,15 +12,7 @@ The final code of this project can be found [here](https://github.com/subquery/e
We use Ethereum packages, runtimes, and handlers (e.g. `@subql/node-ethereum`, `ethereum/Runtime`, and `ethereum/*Handler`) for Metis. Since Metis is an EVM-compatible layer-2 scaling solution, we can use the core Ethereum framework to index it.
:::
-## 1. Your Project Manifest File
-
-The Project Manifest (`project.ts`) file works as an entry point to your Metis project. It defines most of the details on how SubQuery will index and transform the chain data. For Poltgon zkEVM, there are three types of mapping handlers (and you can have more than one in each project):
-
-- [BlockHanders](../../build/manifest/ethereum.md#mapping-handlers-and-filters): On each and every block, run a mapping function
-- [TransactionHandlers](../../build/manifest/ethereum.md#mapping-handlers-and-filters): On each and every transaction that matches optional filter criteria, run a mapping function
-- [LogHanders](../../build/manifest/ethereum.md#mapping-handlers-and-filters): On each and every log that matches optional filter criteria, run a mapping function
-
-Note that the manifest file has already been set up correctly and doesn’t require significant changes, but you need to import the correct contract definitions and update the datasource handlers.
+
As we are indexing all transfers and approvals from the METIS token contract on the Metis network, the first step is to import the contract ABI definition, which can be obtained from any standard [ERC-20 contract](https://ethereum.org/en/developers/docs/standards/tokens/erc-20/). Copy the entire contract ABI and save it as a file called `erc20.abi.json` in the `/abis` directory.
@@ -82,11 +70,9 @@ The above code indicates that you will be running a `handleTransaction` mapping
The code also indicates that you will be running a `handleLog` mapping function whenever there is a `Transfer` event being emitted from the [METIS Token contract](https://andromeda-explorer.metis.io/token/0xDeadDeAddeAddEAddeadDEaDDEAdDeaDDeAD0000/token-transfers).
-Check out our [Manifest File](../../build/manifest/ethereum.md) documentation to get more information about the Project Manifest (`project.ts`) file.
-
-## 2. Update Your GraphQL Schema File
+
-The `schema.graphql` file determines the shape of your data from SubQuery due to the mechanism of the GraphQL query language. Hence, updating the GraphQL Schema file is the perfect place to start. It allows you to define your end goal right at the start.
+
Remove all existing entities and update the `schema.graphql` file as follows. Here you can see we are indexing block information such as the id, blockHeight, transfer receiver, and transfer sender, along with approvals and all of the attributes related to them (such as owner and spender).
@@ -110,51 +96,21 @@ type Approval @entity {
}
```
-SubQuery makes it easy and type-safe to work with your GraphQL entities, as well as smart contracts, events, transactions, and logs. SubQuery CLI will generate types from your project's GraphQL schema and any contract ABIs included in the data sources.
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn codegen
-```
-
-@tab npm
-
-```shell
-npm run-script codegen
-```
-
-:::
+
-This will create a new directory (or update the existing one) `src/types` which contains generated entity classes for each type you have defined previously in `schema.graphql`. These classes provide type-safe entity loading, and read and write access to entity fields - see more about this process in [the GraphQL Schema](../../build/graphql.md). All entities can be imported from the following directory:
+
```ts
import { Approval, Transfer } from "../types";
-```
-
-As you're creating a new EVM based project, this command will also generate ABI types and save them into `src/types` using the `npx typechain --target=ethers-v5` command, allowing you to bind these contracts to specific addresses in the mappings and call read-only contract methods against the block being processed.
-
-It will also generate a class for every contract event to provide easy access to event parameters, as well as the block and transaction the event originated from. Read about how this is done in [EVM Codegen from ABIs](../../build/introduction.md#evm-codegen-from-abis).
-
-In this example SubQuery project, you would import these types like so.
-
-```ts
import {
ApproveTransaction,
TransferLog,
} from "../types/abi-interfaces/Erc20Abi";
```
-::: warning Important
-When you make any changes to the schema file, please ensure that you regenerate your types directory using the SubQuery CLI prompt `yarn codegen` or `npm run-script codegen`.
-:::
-
-Check out the [GraphQL Schema](../../build/graphql.md) documentation to get in-depth information on `schema.graphql` file.
+
-## 3. Add a Mapping Function
-
-Mapping functions define how chain data is transformed into the optimised GraphQL entities that we previously defined in the `schema.graphql` file.
+
Navigate to the default mapping function in the `src/mappings` directory. You will be able to see two exported functions `handleLog` and `handleTransaction`:
@@ -195,73 +151,13 @@ The `handleLog` function receives a `log` parameter of type `TransferLog` which
The `handleTransaction` function receives a `tx` parameter of type `ApproveTransaction` which includes transaction data in the payload. We extract this data and then save this to the store using the `.save()` function (_Note that SubQuery will automatically save this to the database_).
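As a rough illustration, a `handleLog` implementation along these lines would populate the `Transfer` entity described above; treat this as a hedged sketch based on the surrounding text rather than the exact starter code.

```ts
import { Transfer } from "../types";
import { TransferLog } from "../types/abi-interfaces/Erc20Abi";

export async function handleLog(log: TransferLog): Promise<void> {
  // The generated types mark the decoded event arguments as optional
  if (!log.args) return;

  // Map the decoded Transfer(from, to, value) event onto our Transfer entity
  const transfer = Transfer.create({
    id: log.transactionHash,
    blockHeight: BigInt(log.blockNumber),
    from: log.args.from,
    to: log.args.to,
    value: log.args.value.toBigInt(),
    contractAddress: log.address,
  });

  await transfer.save();
}
```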
-Check out our [Mappings](../../build/mapping/ethereum.md) documentation to get more information on mapping functions.
+
-## 4. Build Your Project
+
-Next, build your work to run your new SubQuery project. Run the build command from the project's root directory as given here:
+
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn build
-```
-
-@tab npm
-
-```shell
-npm run-script build
-```
-
-:::
-
-::: warning Important
-Whenever you make changes to your mapping functions, you must rebuild your project.
-:::
-
-Now, you are ready to run your first SubQuery project. Let’s check out the process of running your project in detail.
-
-## 5. Run Your Project Locally with Docker
-
-Whenever you create a new SubQuery Project, first, you must run it locally on your computer and test it and using Docker is the easiest and quickest way to do this.
-
-The `docker-compose.yml` file defines all the configurations that control how a SubQuery node runs. For a new project, which you have just initialised, you won't need to change anything.
-
-However, visit the [Running SubQuery Locally](../../run_publish/run.md) to get more information on the file and the settings.
-
-Run the following command under the project directory:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn start:docker
-```
-
-@tab npm
-
-```shell
-npm run-script start:docker
-```
-
-:::
-
-::: tip Note
-It may take a few minutes to download the required images and start the various nodes and Postgres databases.
-:::
-
-## 6. Query your Project
-
-Next, let's query our project. Follow these three simple steps to query your SubQuery project:
-
-1. Open your browser and head to `http://localhost:3000`.
-
-2. You will see a GraphQL playground in the browser and the schemas which are ready to query.
-
-3. Find the _Docs_ tab on the right side of the playground which should open a documentation drawer. This documentation is automatically generated and it helps you find what entities and methods you can query.
-
-Try the following query to understand how it works for your new SubQuery starter project. Don’t forget to learn more about the [GraphQL Query language](../../run_publish/query.md).
+
```graphql
# Write your query or mutation here
@@ -396,14 +292,4 @@ You will see the result similar to below:
The final code of this project can be found [here](https://github.com/subquery/ethereum-subql-starter/blob/main/Metis/metis-starter/).
:::
-## What's next?
-
-Congratulations! You have now a locally running SubQuery project that accepts GraphQL API requests for transferring data.
-
-::: tip Tip
-
-Find out how to build a performant SubQuery project and avoid common mistakes in [Project Optimisation](../../build/optimisation.md).
-
-:::
-
-Click [here](../../quickstart/whats-next.md) to learn what should be your **next step** in your SubQuery journey.
+
diff --git a/docs/quickstart/quickstart_chains/near-aurora.md b/docs/quickstart/quickstart_chains/near-aurora.md
index c7b86906ade..132662a0da7 100644
--- a/docs/quickstart/quickstart_chains/near-aurora.md
+++ b/docs/quickstart/quickstart_chains/near-aurora.md
@@ -4,8 +4,6 @@
Since SubQuery fully supports NEAR and Aurora, you can index data from both execution environments in the same SubQuery project and into the same dataset.
-## Goals
-
The goal of this quick start guide is to index transfers and approvals for the [Wrapped NEAR smart contract](https://explorer.aurora.dev/address/0xC42C30aC6Cc15faC9bD938618BcaA1a1FaE8501d) on NEAR Aurora.
@@ -13,30 +11,20 @@ The goal of this quick start guide is to index transfers and approvals for the [
-::: warning
-Before we begin, **make sure that you have initialised your project** using the provided steps in the [Start Here](../quickstart.md) section. Please initialise an NEAR Aurora project
-:::
+
-In every SubQuery project, there are [3 key files](../quickstart.md#_3-make-changes-to-your-project) to update. Let's begin updating them one by one.
+
::: tip Note
The final code of this project can be found [here](https://github.com/subquery/near-subql-starter/tree/main/Near/near-aurora-starter).
:::
-## 1. Your Project Manifest File
+
::: warning Important
We use Ethereum packages, runtimes, and handlers (e.g. `@subql/node-ethereum`, `ethereum/Runtime`, and `ethereum/*Handler`) for NEAR Aurora. Since Aurora is an EVM implementation on NEAR, we can use the core Ethereum framework to index it.
:::
-The Project Manifest (`project.ts`) file works as an entry point to your Aurora project. It defines most of the details on how SubQuery will index and transform the chain data. For Aurora, there are three types of mapping handlers (and you can have more than one in each project). Note that these are different mapping handlers to that of [traditional NEAR projects](./near.md#2-update-your-project-manifest-file):
-
-- [BlockHanders](../../build/manifest/ethereum.md#mapping-handlers-and-filters): On each and every block, run a mapping function
-- [TransactionHandlers](../../build/manifest/ethereum.md#mapping-handlers-and-filters): On each and every transaction that matches optional filter criteria, run a mapping function
-- [LogHanders](../../build/manifest/ethereum.md#mapping-handlers-and-filters): On each and every log that matches optional filter criteria, run a mapping function
-
-Note that the manifest file has already been set up correctly and doesn’t require significant changes, but you need to import the correct contract definitions and update the datasource handlers.
-
As we are indexing all transfers and approvals for the Wrapped NEAR smart contract, the first step is to import the contract abi definition which can be obtained from [here](https://explorer.aurora.dev/address/0xC42C30aC6Cc15faC9bD938618BcaA1a1FaE8501d/contracts#address-tabs). Copy the entire contract ABI and save it as a file called `erc20.abi.json` in the `/abis` directory.
This section in the Project Manifest now imports all the correct definitions and lists the triggers that we look for on the blockchain when indexing.
@@ -88,11 +76,9 @@ This section in the Project Manifest now imports all the correct definitions and
The above code indicates that you will be running the `handleTransaction` and `handleLog` mapping functions whenever there is an `approve` call or a `Transfer` log on any transaction from the [Wrapped NEAR contract](https://explorer.aurora.dev/address/0xC42C30aC6Cc15faC9bD938618BcaA1a1FaE8501d/contracts#address-tabs).
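For orientation, a datasource entry along the following lines in `project.ts` would wire those two handlers up. It follows the usual `@subql/types-ethereum` manifest shape; the `startBlock` is a placeholder and the filter strings are assumptions based on a typical ERC-20 setup, not the verbatim starter file.

```ts
import {
  EthereumDatasourceKind,
  EthereumHandlerKind,
} from "@subql/types-ethereum";

// Hypothetical datasource for the Wrapped NEAR (wNEAR) contract on Aurora
const wrappedNearDatasource = {
  kind: EthereumDatasourceKind.Runtime,
  startBlock: 1, // placeholder: use a block close to the contract's deployment
  options: {
    abi: "erc20",
    address: "0xC42C30aC6Cc15faC9bD938618BcaA1a1FaE8501d",
  },
  assets: new Map([["erc20", { file: "./abis/erc20.abi.json" }]]),
  mapping: {
    file: "./dist/index.js",
    handlers: [
      {
        kind: EthereumHandlerKind.Call,
        handler: "handleTransaction",
        filter: { function: "approve(address spender, uint256 rawAmount)" },
      },
      {
        kind: EthereumHandlerKind.Event,
        handler: "handleLog",
        filter: {
          topics: [
            "Transfer(address indexed from, address indexed to, uint256 amount)",
          ],
        },
      },
    ],
  },
};
```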
-Check out our [Manifest File](../../build/manifest/ethereum.md) documentation to get more information about the Project Manifest (`project.ts`) file.
-
-## 2. Update Your GraphQL Schema File
+
-The `schema.graphql` file determines the shape of your data from SubQuery due to the mechanism of the GraphQL query language. Hence, updating the GraphQL Schema file is the perfect place to start. It allows you to define your end goal right at the start.
+
Remove all existing entities and update the `schema.graphql` file as follows. Here you can see we are indexing block information such as the id and the blockHeight, addresses such as to, from, owner, and spender, as well as the contract address and value.
@@ -116,53 +102,21 @@ type Approval @entity {
}
```
-::: warning Important
-When you make any changes to the schema file, please ensure that you regenerate your types directory.
-:::
-
-SubQuery makes it easy and type-safe to work with your GraphQL entities, as well as smart contracts, events, transactions, and logs. SubQuery CLI will generate types from your project's GraphQL schema and any contract ABIs included in the data sources.
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn codegen
-```
-
-@tab npm
-
-```shell
-npm run-script codegen
-```
-
-:::
+
-This will create a new directory (or update the existing one) `src/types` which contains generated entity classes for each type you have defined previously in `schema.graphql`. These classes provide type-safe entity loading, and read and write access to entity fields - see more about this process in [the GraphQL Schema](../../build/graphql.md). All entities can be imported from the following directory:
+
```ts
import { Approval, Transaction } from "../types";
-```
-
-As you're creating a new EVM based project, this command will also generate ABI types and save them into `src/types` using the `npx typechain --target=ethers-v5` command, allowing you to bind these contracts to specific addresses in the mappings and call read-only contract methods against the block being processed.
-
-It will also generate a class for every contract event to provide easy access to event parameters, as well as the block and transaction the event originated from. Read about how this is done in [EVM Codegen from ABIs](../../build/introduction.md#evm-codegen-from-abis).
-
-In this example SubQuery project, you would import these types like so.
-
-```ts
import {
ApproveTransaction,
TransferLog,
} from "../types/abi-interfaces/Erc20Abi";
```
-Check out the [GraphQL Schema](../../build/graphql.md) documentation to get in-depth information on `schema.graphql` file.
-
-Now that you have made essential changes to the GraphQL Schema file, let’s proceed ahead with the Mapping Function’s configuration.
+
-## 3. Add a Mapping Function
-
-Mapping functions define how chain data is transformed into the optimised GraphQL entities that we previously defined in the `schema.graphql` file.
+
Navigate to the default mapping function in the `src/mappings` directory. You will be able to see three exported functions: `handleBlock`, `handleLog`, and `handleTransaction`. Replace these functions with the following code:
@@ -210,71 +164,13 @@ The `handleTransaction` function receives a `tx` parameter of type `ApproveTrans
Check out our [Mappings](../../build/mapping/ethereum.md) documentation to get more information on mapping functions.
-## 4. Build Your Project
-
-Next, build your work to run your new SubQuery project. Run the build command from the project's root directory as given here:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn build
-```
+
-@tab npm
+
-```shell
-npm run-script build
-```
+
-:::
-
-::: warning Important
-Whenever you make changes to your mapping functions, you must rebuild your project.
-:::
-
-Now, you are ready to run your first SubQuery project. Let’s check out the process of running your project in detail.
-
-## 5. Run Your Project Locally with Docker
-
-Whenever you create a new SubQuery Project, first, you must run it locally on your computer and test it and using Docker is the easiest and quickest way to do this.
-
-The `docker-compose.yml` file defines all the configurations that control how a SubQuery node runs. For a new project, which you have just initialised, you won't need to change anything.
-
-However, visit the [Running SubQuery Locally](../../run_publish/run.md) to get more information on the file and the settings.
-
-Run the following command under the project directory:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn start:docker
-```
-
-@tab npm
-
-```shell
-npm run-script start:docker
-```
-
-:::
-
-::: tip Note
-It may take a few minutes to download the required images and start the various nodes and Postgres databases.
-:::
-
-## 6. Query your Project
-
-Next, let's query our project. Follow these three simple steps to query your SubQuery project:
-
-1. Open your browser and head to `http://localhost:3000`.
-
-2. You will see a GraphQL playground in the browser and the schemas which are ready to query.
-
-3. Find the _Docs_ tab on the right side of the playground which should open a documentation drawer. This documentation is automatically generated and it helps you find what entities and methods you can query.
-
-Try the following query to understand how it works for your new SubQuery starter project. Don’t forget to learn more about the [GraphQL Query language](../../run_publish/query.md).
+
```graphql
# Write your query or mutation here
@@ -322,14 +218,4 @@ You will see the result similar to below:
}
```
-## What's next?
-
-Congratulations! You have now a locally running SubQuery project that accepts GraphQL API requests for transferring data.
-
-::: tip Tip
-
-Find out how to build a performant SubQuery project and avoid common mistakes in [Project Optimisation](../../build/optimisation.md).
-
-:::
-
-Click [here](../../quickstart/whats-next.md) to learn what should be your **next step** in your SubQuery journey.
+
diff --git a/docs/quickstart/quickstart_chains/near-ref-finance.md b/docs/quickstart/quickstart_chains/near-ref-finance.md
index 653d225155f..bd965d78147 100644
--- a/docs/quickstart/quickstart_chains/near-ref-finance.md
+++ b/docs/quickstart/quickstart_chains/near-ref-finance.md
@@ -1,8 +1,6 @@
# NEAR Ref Finance Quickstart Guide
-## Goals
-
-The objective of this project is to catalog the `swap` actions performed by the `v2.ref-finance.near` contract on the NEAR mainnet. It serves as an excellent opportunity to gain practical experience in understanding Graph's functionality through a real-world example.
+The objective of this project is to catalog the `swap` actions performed by the `v2.ref-finance.near` contract on the NEAR mainnet. It serves as an excellent opportunity to gain practical experience in understanding SubQuery's functionality through a real-world example.
@@ -12,7 +10,7 @@ The objective of this project is to catalog the `swap` actions performed by the
The final code of this project can be found [here](https://github.com/subquery/near-subql-starter/tree/main/Near/near-ref-finance).
:::
-
+
::: code-tabs
@tab:active `schema.graphql`
@@ -42,9 +40,7 @@ The schema include `Swap` entity with a unique identifier, associated with a spe
-## 2. Update Your Project Manifest File
-
-
+
We are indexing all actions with a method name equal to `swap` and the `v2.ref-finance.near` contract as the recipient.
@@ -80,11 +76,11 @@ We are indexing all actions with a method name equal to `swap` and the `v2.ref-f
In the provided configuration, when the specified action is indexed, it will be forwarded to a handler known as `handleAction`.
-Check out our [Manifest File](../../build/manifest/near.md) documentation to get more information about the Project Manifest (`project.ts`) file.
+
Next, let’s proceed with the Mapping Function’s configuration.
-
+
The `handleAction` function receives event data whenever an event matches the filters, which you specified previously in the `project.ts`. Let’s make changes to it, process the relevant transaction action, and save them to the GraphQL entities created earlier.
@@ -149,6 +145,8 @@ The `handleAction` function processes a Near Protocol action, specifically relat
The `getOrCreateToken` and `getOrCreatePool` functions are used to retrieve existing tokens/pools or create new ones if they don't exist. These functions are utility functions used by `handleAction`.
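To illustrate the get-or-create pattern these helpers rely on, a `getOrCreateToken` could look like the sketch below; the `Token` entity and its fields are assumptions based on this guide, not the exact project code.

```ts
import { Token } from "../types";

// Return the Token entity for `id`, creating and saving it if it doesn't exist yet
async function getOrCreateToken(id: string): Promise<Token> {
  let token = await Token.get(id);
  if (!token) {
    token = Token.create({ id });
    await token.save();
  }
  return token;
}
```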
+
+
diff --git a/docs/quickstart/quickstart_chains/near.md b/docs/quickstart/quickstart_chains/near.md
index 26143866587..dee2accbfea 100644
--- a/docs/quickstart/quickstart_chains/near.md
+++ b/docs/quickstart/quickstart_chains/near.md
@@ -1,24 +1,16 @@
# NEAR Quick Start
-## Goals
-
The goal of this quick start guide is to index all price submissions from priceoracle.near on NEAR's mainnet - it's a great way to quickly learn how SubQuery works with a real-world, hands-on example.
-::: warning Important
-Before we begin, make sure that you have initialised your project using the provided steps in the [Start Here](../quickstart.md) section. **Please initialise a NEAR Network project**
-:::
-
-Now, let's move forward and update these configurations.
+
-Previously, in the [1. Create a New Project](../quickstart.md) section, you must have noted [3 key files](../quickstart.md#_3-make-changes-to-your-project). Let's begin updating them one by one.
+
::: tip Note
The final code of this project can be found [here](https://github.com/subquery/near-subql-starter/tree/main/Near/near-priceoracle-example).
:::
-## 1. Update Your GraphQL Schema File
-
-The `schema.graphql` file determines the shape of the data that you are using SubQuery to index, hence it's a great place to start. The shape of your data is defined in a GraphQL Schema file with various [GraphQL entities](../../build/graphql.md).
+
Remove all existing entities and update the `schema.graphql` file as follows. Here you can see we are indexing all oracles that submit prices on the chain, as well as each individual price submission made to NEAR's price oracle:
@@ -41,42 +33,11 @@ type Price @entity {
}
```
-Since we have a [one-to-many relationship](../../build/graphql.md#one-to-many-relationships), we define a foreign key using `oracle: Oracle! # The oracle that reported this price` in the `Price` entity.
-
-::: warning Important
-When you make any changes to the schema file, please ensure that you regenerate your types directory.
-:::
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn codegen
-```
-
-@tab npm
-
-```shell
-npm run-script codegen
-```
-
-:::
-
-You will find the generated models in the `/src/types/models` directory.
+
-Check out the [GraphQL Schema](../../build/graphql.md) documentation to get in-depth information on `schema.graphql` file.
+
-Now that you have made essential changes to the GraphQL Schema file, let’s move forward to the next file.
-
-## 2. Update Your Project Manifest File
-
-The Project Manifest (`project.ts`) file works as an entry point to your NEAR project. It defines most of the details on how SubQuery will index and transform the chain data. For NEAR, there are three types of mapping handlers (and you can have more than one in each project):
-
-- [BlockHandler](../../build/manifest/near.md#mapping-handlers-and-filters): On each and every block, run a mapping function
-- [TransactionHandlers](../../build/manifest/near.md#mapping-handlers-and-filters): On each and every transaction that matches optional filter criteria, run a mapping function
-- [ActionHandlers](../../build/manifest/near.md#mapping-handlers-and-filters): On each and every transaction action that matches optional filter criteria, run a mapping function
-
-Note that the manifest file has already been set up correctly and doesn’t require significant changes, but you need to update the datasource handlers.
+
We are indexing all transactions sent to the `priceoracle.near` address.
@@ -121,13 +82,11 @@ This section in the Project Manifest now imports all the correct definitions and
The above code indicates that you will be running a `handleNewPrice` mapping function whenever there is a transaction made to the `priceoracle.near` address that includes an action with the method name `report_prices`. Additionally, we run the `handleNewOracle` mapping function whenever there is a transaction made to the `priceoracle.near` address that includes an action with the method name `add_oracle`.
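As a rough guide, the relevant handler entries in `project.ts` can look like the sketch below (using `@subql/types-near`); the exact structure of the starter manifest may differ, so treat the filter values as illustrative.

```ts
import { NearHandlerKind } from "@subql/types-near";

// Hypothetical handler entries for the priceoracle.near datasource
const handlers = [
  {
    kind: NearHandlerKind.Action,
    handler: "handleNewPrice",
    filter: {
      type: "FunctionCall",
      methodName: "report_prices",
      receiver: "priceoracle.near",
    },
  },
  {
    kind: NearHandlerKind.Action,
    handler: "handleNewOracle",
    filter: {
      type: "FunctionCall",
      methodName: "add_oracle",
      receiver: "priceoracle.near",
    },
  },
];
```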
-Check out our [Manifest File](../../build/manifest/near.md) documentation to get more information about the Project Manifest (`project.ts`) file.
+
Next, let’s proceed with the Mapping Function’s configuration.
-## 3. Add a Mapping Function
-
-Mapping functions define how chain data is transformed into the optimised GraphQL entities that we previously defined in the `schema.graphql` file.
+
Follow these steps to add a mapping function:
@@ -214,73 +173,13 @@ For the `handleNewOracle` mapping function, the function receives a new `NearAct
For the `handleNewPrice` mapping function, the function receives a new `NearAction` payload. The data on this is a JSON payload, so we parse into the correct `NewPrices` type via JSON. We then run the `checkAndCreateOracle` to ensure that the oracle we are listing this price for is already known since it's a foreign key (it checks if it already exists before creating a new `Oracle` entity). Finally, for each price submission in the array, we create the price and save it to the store (_Note that SubQuery will automatically save this to the database_).
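For reference, `checkAndCreateOracle` can follow a simple get-or-create pattern; the `Oracle` field names below are assumptions drawn from the schema described above, not the verbatim example code.

```ts
import { Oracle } from "../types";

// Make sure an Oracle entity exists before prices reference it as a foreign key
async function checkAndCreateOracle(
  accountId: string,
  blockHeight: bigint
): Promise<void> {
  const existing = await Oracle.get(accountId);
  if (!existing) {
    const oracle = Oracle.create({ id: accountId, creationBlock: blockHeight });
    await oracle.save();
  }
}
```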
-Check out our [Mappings](../../build/mapping/near.md) documentation to get more information on mapping functions.
-
-## 4. Build Your Project
-
-Next, build your work to run your new SubQuery project. Run the build command from the project's root directory as given here:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn build
-```
-
-@tab npm
-
-```shell
-npm run-script build
-```
-
-:::
-
-::: warning Important
-Whenever you make changes to your mapping functions, you must rebuild your project.
-:::
-
-Now, you are ready to run your first SubQuery project. Let’s check out the process of running your project in detail.
+
-## 5. Run Your Project Locally with Docker
+
-Whenever you create a new SubQuery Project, first, you must run it locally on your computer and test it and using Docker is the easiest and quickiest way to do this.
+
-The `docker-compose.yml` file defines all the configurations that control how a SubQuery node runs. For a new project, which you have just initialised, you won't need to change anything.
-
-However, visit the [Running SubQuery Locally](../../run_publish/run.md) to get more information on the file and the settings.
-
-Run the following command under the project directory:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn start:docker
-```
-
-@tab npm
-
-```shell
-npm run-script start:docker
-```
-
-:::
-
-::: tip Note
-It may take a few minutes to download the required images and start the various nodes and Postgres databases.
-:::
-
-## 6. Query your Project
-
-Next, let's query our project. Follow these three simple steps to query your SubQuery project:
-
-1. Open your browser and head to `http://localhost:3000`.
-
-2. You will see a GraphQL playground in the browser and the schemas which are ready to query.
-
-3. Find the _Docs_ tab on the right side of the playground which should open a documentation drawer. This documentation is automatically generated and it helps you find what entities and methods you can query.
-
-Try the following query to understand how it works for your new SubQuery starter project. Don’t forget to learn more about the [GraphQL Query language](../../run_publish/query.md). The query shows a list of the most recent prices, and the most active oracles(by number of prices submitted).
+
```graphql
query {
@@ -382,14 +281,4 @@ You will see the result similar to below:
The final code of this project can be found [here](https://github.com/subquery/near-subql-starter/tree/main/Near/near-priceoracle-example).
:::
-## What's next?
-
-Congratulations! You have now a locally running SubQuery project that accepts GraphQL API requests for transferring data.
-
-::: tip Tip
-
-Find out how to build a performant SubQuery project and avoid common mistakes in [Project Optimisation](../../build/optimisation.md).
-
-:::
-
-Click [here](../../quickstart/whats-next.md) to learn what should be your **next step** in your SubQuery journey.
+
diff --git a/docs/quickstart/quickstart_chains/optimism.md b/docs/quickstart/quickstart_chains/optimism.md
index 570a6aa3007..ee7bb04d934 100644
--- a/docs/quickstart/quickstart_chains/optimism.md
+++ b/docs/quickstart/quickstart_chains/optimism.md
@@ -1,7 +1,5 @@
# Optimism Quick Start
-## Goals
-
The goal of this quick start guide is to index all claim events from the Optimism airdrop contract. Check out the video or follow the step by step instructions below.
@@ -9,25 +7,15 @@ The goal of this quick start guide is to index all claim events from the Optimis
-::: warning
-Before we begin, **make sure that you have initialised your project** using the provided steps in the [Start Here](../quickstart.md) section. Please initialise an Optimism project
-:::
+Please initialise an Optimism project.
-In every SubQuery project, there are [3 key files](../quickstart.md#_3-make-changes-to-your-project) to update. Let's begin updating them one by one.
+
::: tip Note
The final code of this project can be found [here](https://github.com/subquery/ethereum-subql-starter/tree/main/Optimism/optimism-airdrop). We use Ethereum packages, runtimes, and handlers (e.g. `@subql/node-ethereum`, `ethereum/Runtime`, and `ethereum/*Handler`) for Optimism. Since Optimism is a layer-2 scaling solution, we can use the core Ethereum framework to index it.
:::
-## 1. Your Project Manifest File
-
-The Project Manifest (`project.ts`) file works as an entry point to your Optimism project. It defines most of the details on how SubQuery will index and transform the chain data. For Optimism, there are three types of mapping handlers (and you can have more than one in each project):
-
-- [BlockHanders](../../build/manifest/optimism.md#mapping-handlers-and-filters): On each and every block, run a mapping function
-- [TransactionHandlers](../../build/manifest/optimism.md#mapping-handlers-and-filters): On each and every transaction that matches optional filter criteria, run a mapping function
-- [LogHanders](../../build/manifest/optimism.md#mapping-handlers-and-filters): On each and every log that matches optional filter criteria, run a mapping function
-
-Note that the manifest file has already been set up correctly and doesn’t require significant changes, but you need to import the correct contract definitions and update the datasource handlers.
+
As we are indexing all claim events from the Optimism airdrop contract, the first step is to import the contract abi definition which can be obtained from [here](https://optimistic.etherscan.io/address/0xfedfaf1a10335448b7fa0268f56d2b44dbd357de#code). Copy the entire contract ABI and save it as a file called `airdrop.abi.json` in the `/abis` directory.
@@ -67,11 +55,9 @@ As we are indexing all claim events from the Optimism airdrop contract, the firs
The above code indicates that you will be running a `handleClaim` mapping function whenever there is a `Claimed` log on any transaction from the [Optimism airdrop contract](https://optimistic.etherscan.io/address/0xfedfaf1a10335448b7fa0268f56d2b44dbd357de#code).
-Check out our [Manifest File](../../build/manifest/optimism.md) documentation to get more information about the Project Manifest (`project.ts`) file.
-
-## 2. Update Your GraphQL Schema File
+
-The `schema.graphql` file determines the shape of your data from SubQuery due to the mechanism of the GraphQL query language. Hence, updating the GraphQL Schema file is the perfect place to start. It allows you to define your end goal right at the start.
+
Remove all existing entities and update the `schema.graphql` file as follows. Here you can see we are indexing block information such as the id, blockHeight and timestamp along with the value and the total claimed amount.
@@ -92,48 +78,18 @@ type DailyClaimSummary @entity {
}
```
-::: warning Important
-When you make any changes to the schema file, please ensure that you regenerate your types directory.
-:::
-
-SubQuery makes it easy and type-safe to work with your GraphQL entities, as well as smart contracts, events, transactions, and logs. SubQuery CLI will generate types from your project's GraphQL schema and any contract ABIs included in the data sources.
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn codegen
-```
-
-@tab npm
-
-```shell
-npm run-script codegen
-```
-
-:::
+
-This will create a new directory (or update the existing one) `src/types` which contains generated entity classes for each type you have defined previously in `schema.graphql`. These classes provide type-safe entity loading, and read and write access to entity fields - see more about this process in [the GraphQL Schema](../../build/graphql.md). All entities can be imported from the following directory:
+
```ts
import { Claim, DailyClaimSummary } from "../types";
-```
-
-As you're creating a new EVM based project, this command will also generate ABI types and save them into `src/types` using the `npx typechain --target=ethers-v5` command, allowing you to bind these contracts to specific addresses in the mappings and call read-only contract methods against the block being processed.
-
-It will also generate a class for every contract event to provide easy access to event parameters, as well as the block and transaction the event originated from. Read about how this is done in [EVM Codegen from ABIs](../../build/introduction.md#evm-codegen-from-abis).
-
-In this example SubQuery project, you would import these types like so.
-
-```ts
import { ClaimedLog } from "../types/abi-interfaces/AirdropAbi";
```
-Check out the [GraphQL Schema](../../build/graphql.md) documentation to get in-depth information on `schema.graphql` file.
+
-## 3. Add a Mapping Function
-
-Mapping functions define how chain data is transformed into the optimised GraphQL entities that we previously defined in the `schema.graphql` file.
+
Navigate to the default mapping function in the `src/mappings` directory. You will be able to see two exported functions: `handleLog`, and `handleTransaction`. Replace these functions with the following code:
@@ -189,73 +145,13 @@ export async function handleClaim(log: ClaimedLog): Promise {
The `handleClaim` function receives a `log` parameter of type `ClaimedLog` which includes transaction log data in the payload. We extract this data, assign it to our newClaim object, and then save this to the store using the `.save()` function (_Note that SubQuery will automatically save this to the database_). We also call the `checkGetDailyClaim` function to retrieve the existing day aggregation (creating a new one if needed), and then update the `total_claimed` on it.
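The aggregation helper can be as simple as the sketch below; the `DailyClaimSummary` field name (`total_claimed`) follows this guide's description, while the day-bucketing logic is an assumption.

```ts
import { DailyClaimSummary } from "../types";

// Fetch the aggregation entity for the day of `timestamp`, creating it if needed
async function checkGetDailyClaim(timestamp: bigint): Promise<DailyClaimSummary> {
  // Bucket by calendar day, e.g. "2023-03-06"
  const dayId = new Date(Number(timestamp) * 1000).toISOString().slice(0, 10);
  let summary = await DailyClaimSummary.get(dayId);
  if (!summary) {
    summary = DailyClaimSummary.create({ id: dayId, total_claimed: BigInt(0) });
  }
  return summary;
}
```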
-Check out our [Mappings](../../build/mapping/optimism.md) documentation to get more information on mapping functions.
+
-## 4. Build Your Project
+
-Next, build your work to run your new SubQuery project. Run the build command from the project's root directory as given here:
+
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn build
-```
-
-@tab npm
-
-```shell
-npm run-script build
-```
-
-:::
-
-::: warning Important
-Whenever you make changes to your mapping functions, you must rebuild your project.
-:::
-
-Now, you are ready to run your first SubQuery project. Let’s check out the process of running your project in detail.
-
-## 5. Run Your Project Locally with Docker
-
-Whenever you create a new SubQuery Project, it is recommended to run it locally and test it. Using Docker is the easiest and quickest way to do this.
-
-The `docker-compose.yml` file defines all the configurations that control how a SubQuery node runs. For a new project which you have just initialised, you won't need to change anything.
-
-However, visit the [Running SubQuery Locally](../../run_publish/run.md) to get more information on the file and the settings.
-
-Run the following command under the project directory:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn start:docker
-```
-
-@tab npm
-
-```shell
-npm run-script start:docker
-```
-
-:::
-
-::: tip Note
-It may take a few minutes to download the required images and start the various nodes and Postgres databases.
-:::
-
-## 6. Query your Project
-
-Next, let's query our project. Follow these three simple steps to query your SubQuery project:
-
-1. Open your browser and head to `http://localhost:3000`.
-
-2. You will see a GraphQL playground in the browser and the schemas which are ready to query.
-
-3. Find the _Docs_ tab on the right side of the playground which should open a documentation drawer. This documentation is automatically generated and it helps you find what entities and methods you can query.
-
-Try the following query to understand how it works for your new SubQuery starter project. Don’t forget to learn more about the [GraphQL Query language](../../run_publish/query.md).
+
```graphql
# Write your query or mutation here
@@ -338,14 +234,4 @@ You will see the result similar to below:
The final code of this project can be found [here](https://github.com/subquery/ethereum-subql-starter/tree/main/Optimism/optimism-airdrop).
:::
-## What's next?
-
-Congratulations! You have now a locally running SubQuery project that accepts GraphQL API requests for transferring data.
-
-::: tip Tip
-
-Find out how to build a performant SubQuery project and avoid common mistakes in [Project Optimisation](../../build/optimisation.md).
-
-:::
-
-Click [here](../../quickstart/whats-next.md) to learn what should be your **next step** in your SubQuery journey.
+
diff --git a/docs/quickstart/quickstart_chains/polkadot-astar.md b/docs/quickstart/quickstart_chains/polkadot-astar.md
index c772d97956a..c64d48ad27e 100644
--- a/docs/quickstart/quickstart_chains/polkadot-astar.md
+++ b/docs/quickstart/quickstart_chains/polkadot-astar.md
@@ -1,7 +1,5 @@
# Astar (WASM) Quick Start
-## Goals
-
This quick start guide introduces SubQuery's Substrate WASM support by using an example project in Astar Network. The example project indexes all Transactions and Approvals from the [Astar Wasm based lottery contract](https://astar.subscan.io/account/bZ2uiFGTLcYyP8F88XzXa13xu5Mmp13VLiaW1gGn7rzxktc), as well as dApp staking events from [Astar's dApp Staking](https://docs.astar.network/docs/dapp-staking/) functions.
::: tip Note
@@ -12,11 +10,9 @@ The final code of this project can be found [here](https://github.com/subquery/s
This project is unique, as it indexes data from Astar's Substrate execution layer (native Astar pallets and runtime) together with smart contract data from Astar's WASM smart contract layer, within the same SubQuery project and into the same dataset. A very similar approach can be taken with indexing Astar's EVM layer too.
-Previously, in the [1. Create a New Project](../quickstart.md) section, [3 key files](../quickstart.md#_3-make-changes-to-your-project) were mentioned. Let's take a closer look at these files.
-
-## 1. GraphQL Schema File
+
-The `schema.graphql` file determines the shape of the data that you are using SubQuery to index, hence it's a great place to start. The shape of your data is defined in a GraphQL Schema file with various [GraphQL entities](../../build/graphql.md).
+
The Astar-wasm-starter project has four entities: Transaction, Approval, DApp, and DAppReward (which has a [foreign key](../../build/graphql.md#one-to-many-relationships) to DApp). These index basic block data such as the timestamp, height, and hash, along with the from and contract addresses and the value.
@@ -58,30 +54,11 @@ type DAppReward @entity {
}
```
-::: warning Important
-When you make any changes to the schema file, please ensure that you regenerate your types directory.
-:::
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn codegen
-```
-
-@tab npm
-
-```shell
-npm run-script codegen
-```
-
-:::
-
-You will find the generated models in the `/src/types/models` directory.
+
-Check out the [GraphQL Schema](../../build/graphql.md) documentation to get in-depth information on `schema.graphql` file.
+
-## 2. The Project Manifest File
+## The Project Manifest File
The Project Manifest (`project.ts`) file works as an entry point to your project. It defines most of the details on how SubQuery will index and transform the chain data. For Substrate/Polkadot chains, there are three types of mapping handlers:
@@ -149,7 +126,7 @@ For [EVM](../../build/substrate-evm.md) and [WASM](../../build/substrate-wasm.md
This indicates that you will be running a `handleNewContract` mapping function whenever there is an event emitted from the `NewContract` method on the `dappsStaking` pallet. Similarly, we map the three other events emitted from `dappsStaking` to their own mapping functions. This covers most interactions with the dApp staking feature that we are interested in.
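To show the general shape of a Substrate event handler here, a `handleNewContract` sketch could look like the following; the event data layout and the `DApp` entity fields are assumptions based on this guide, not the exact starter code.

```ts
import { SubstrateEvent } from "@subql/types";
import { DApp } from "../types";

export async function handleNewContract(event: SubstrateEvent): Promise<void> {
  // dappsStaking.NewContract emits the staker account and the registered contract
  const {
    event: {
      data: [account, smartContract],
    },
  } = event;

  const dApp = DApp.create({
    id: smartContract.toString(),
    accountID: account.toString(),
    blockHeight: event.block.block.header.number.toBigInt(),
  });

  await dApp.save();
}
```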
-Check out our [Manifest File](../../build/manifest/polkadot.md) documentation to get more information about the Project Manifest (`project.ts`) file.
+
### WASM Manifest Section
@@ -206,9 +183,7 @@ The above code indicates that you will be running a `handleWasmEvent` mapping fu
Check out our [Substrate Wasm](../../build/substrate-wasm.md) documentation to get more information about the Project Manifest (`project.ts`) file for Substrate WASM contracts.
-## 3. Mapping Functions
-
-Mapping functions define how chain data is transformed into the optimised GraphQL entities that we previously defined in the `schema.graphql` file.
+
Navigate to the default mapping function in the `src/mappings` directory. There are multiple other exported functions such as `handleWasmCall`, `handleWasmEvent`, `handleNewContract`, `handleBondAndStake`, `handleUnbondAndUnstake`, and `handleReward`.
@@ -265,61 +240,11 @@ The `handleBondAndStake` function receives Substrate event data from the native
Check out our mappings documentation for [Substrate](../../build/mapping/polkadot.md) and the [Substrate WASM data processor](../../build/substrate-wasm.md) to get detailed information on mapping functions for each type.
-## 4. Build Your Project
-
-Next, build your work to run your new SubQuery project. Run the build command from the project's root directory as given here:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn build
-```
-
-@tab npm
-
-```shell
-npm run-script build
-```
-
-:::
-
-::: warning Important
-Whenever you make changes to your mapping functions, make sure to rebuild your project.
-:::
-
-## 5. Run Your Project Locally with Docker
-
-SubQuery provides a Docker container to run projects very quickly and easily for development purposes.
-
-The docker-compose.yml file defines all the configurations that control how a SubQuery node runs. For a new project, which you have just initialised, you won't need to change anything.
-
-Run the following command under the project directory:
-
-::: code-tabs
-@tab:active yarn
+
-```shell
-yarn start:docker
-```
-
-@tab npm
-
-```shell
-npm run-script start:docker
-```
+
-:::
-
-::: tip
-It may take a few minutes to download the required images and start the various nodes and Postgres databases.
-:::
-
-Visit [Running SubQuery Locally](../../run_publish/run.md) to get more information on the file and the settings.
-
-## 6. Query Your Project
-
-Once the container is running, navigate to http://localhost:3000 in your browser and run the sample GraphQL command provided in the README file. Below is an example query from the Astar-wasm-starter project.
+
```graphql
query {
@@ -393,14 +318,4 @@ The final code of this project can be found [here](https://github.com/subquery/s
:::
-## What's next?
-
-Congratulations! You have now a locally running SubQuery project that accepts GraphQL API requests for events from the lottery smart contract at [`bZ2uiFGTLcYyP8F88XzXa13xu5Mmp13VLiaW1gGn7rzxktc`](https://astar.subscan.io/account/bZ2uiFGTLcYyP8F88XzXa13xu5Mmp13VLiaW1gGn7rzxktc?tab=wasm_transaction).
-
-::: tip Tip
-
-Find out how to build a performant SubQuery project and avoid common mistakes in [Project Optimisation](../../build/optimisation.md).
-
-:::
-
-Click [here](../../quickstart/whats-next.md) to learn what should be your **next step** in your SubQuery journey.
+
diff --git a/docs/quickstart/quickstart_chains/polkadot-humanode.md b/docs/quickstart/quickstart_chains/polkadot-humanode.md
index 3777d761286..f12c6aac11b 100644
--- a/docs/quickstart/quickstart_chains/polkadot-humanode.md
+++ b/docs/quickstart/quickstart_chains/polkadot-humanode.md
@@ -1,20 +1,10 @@
# Humanode Quick Start
-## Goals
-
This quick guide aims to adapt the standard starter project and index all transfers, bioauthentication events, and online validator nodes from the Humanode chain. Humanode is a standalone Substrate chain, but the same process applies to it as to a normal Polkadot parachain or relay chain.
-::: warning Important
-Before we begin, ensure that you have initialized your project using the steps in the [Start Here](../quickstart.md) section.
-:::
-
-Now, let's move forward and update these configurations.
-
-While Creating a [New Project](../quickstart.md), you must have noted [3 key files](../quickstart.md#_3-make-changes-to-your-project). Let's begin updating them one by one.
+
-## 1. Updating your GraphQL Schema File
-
-The `schema.graphql` file determines the shape of the data that you are using SubQuery to index, hence it's a great place to start. The shape of your data is defined in a GraphQL Schema file with various [GraphQL entities](../../build/graphql.md).
+
Remove all existing entities and update the `schema.graphql` file as follows. Here you can see we are indexing all transfers, bioauthentication events, and online validator nodes from Humanode:
@@ -34,40 +24,9 @@ type ImOnlineSomeOffline @entity {
}
```
-::: warning Important
-While making any changes to the schema file, make sure to regenerate your types directory
-:::
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn codegen
-```
-
-@tab npm
-
-```shell
-npm run-script codegen
-```
-
-:::
-
-You will find the generated models in the `/src/types/models` directory.
+
-Check out the [GraphQL Schema](../../build/graphql.md) documentation to get in-depth information on `schema.graphql` file.
-
-Now that you have made essential changes to the GraphQL Schema file let’s move forward to the next file.
-
-## 2. Updating Your Project Manifest File
-
-The Project Manifest (`project.ts`) file works as an entry point to your project. It defines most of the details on how SubQuery will index and transform the chain data. For Substrate/Polkadot chains, there are three types of mapping handlers (and you can have more than one in each project):
-
-- [BlockHanders](../../build/manifest/polkadot.md#mapping-handlers-and-filters): On each and every block, run a mapping function
-- [EventHandlers](../../build/manifest/polkadot.md#mapping-handlers-and-filters): On each and every event that matches optional filter criteria, run a mapping function
-- [CallHanders](../../build/manifest/polkadot.md#mapping-handlers-and-filters): On each and every extrinsic call that matches optional filter criteria, run a mapping function
-
-Note that the manifest file has already been set up correctly and doesn’t require significant changes, but you need to change the datasource handlers. This section lists the triggers that look for on the blockchain to start indexing.
+
**Since we are planning to index all transfers, bioauthentication events, and online nodes, we need to update the `datasources` section as follows:**
@@ -105,11 +64,9 @@ Note that the manifest file has already been set up correctly and doesn’t requ
This indicates that you will be running the `handleBioauthNewAuthenticationEvent` and `handleImonlineSomeOfflineEvent` mapping functions whenever there are events emitted from the `bioauth` and `imOnline` modules with the `NewAuthentication` and `SomeOffline` methods, respectively.
-Check out our [documentation](../../build/manifest/polkadot.md) to get more information about the Project Manifest (`project.ts`) file.
-
-Next, let’s proceed ahead with the Mapping Function’s configuration.
+
-## 3. Adding a Mapping Function
+
Mapping functions define how chain data is transformed into the optimized GraphQL entities that we previously defined in the `schema.graphql` file.
Navigate to the default mapping function in the `src/mappings` directory. You will see two exported functions: `handleBioauthNewAuthenticationEvent` and `handleImonlineSomeOfflineEvent`.
@@ -161,69 +118,13 @@ export async function handleImonlineSomeOfflineEvent(
}
```
-## 4. Building Your Project
-
-Next, build your work to run your new SubQuery project. Run the build command from the project's root directory as given here:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn build
-```
-
-@tab npm
-
-```shell
-npm run-script build
-```
-
-:::
-
-::: warning Important
-Make sure to rebuild your project when you change your mapping functions.
-:::
-
-Now, you are all set to run your first SubQuery project. Let’s dig out the process of running the project in detail.
-
-## 5. Running Your Project Locally with Docker
-
-When you create a new SubQuery Project, you must first run it locally on your computer and test it. Using Docker is the easiest and quickest way to do this.
-
-The `docker-compose.yml` file defines all the configurations that control how a SubQuery node runs. You won't need to change anything for a new project which you have just initialized.
-
-However, visit Running [SubQuery Locally](../../run_publish/run.html) to get more information on the file and the settings.
-
-Run the following command under the project directory:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn start:docker
-```
-
-@tab npm
-
-```shell
-npm run-script start:docker
-```
-
-:::
-
-::: warning Note
-It may take a few minutes to download the required images and start various nodes and Postgres databases.
-:::
-
-## 6. Querying Your Project
+
-Next, let's query our project. Follow these simple steps to query your SubQuery project:
+
-Open your browser and head to http://localhost:3000.
+
-You will see a GraphQL playground in the browser and the schemas, which are ready to query.
-
-Try the following query to understand how it works for your new SubQuery starter project. Don’t forget to learn more about the [GraphQL Query language](../../run_publish/query.md).
+
```graphql
query {
@@ -277,14 +178,4 @@ You will see the results similar to below:
}
```
-## What's next?
-
-Congratulations! You have now a locally running SubQuery project that accepts GraphQL API requests for transferring data.
-
-::: tip Tip
-
-Find out how to build a performant SubQuery project and avoid common mistakes in [Project Optimisation](../../build/optimisation.md).
-
-:::
-
-Click [here](../../quickstart/whats-next.md) to learn what should be your **next step** in your SubQuery journey.
+
diff --git a/docs/quickstart/quickstart_chains/polkadot-kilt.md b/docs/quickstart/quickstart_chains/polkadot-kilt.md
index fc6e3955b09..33d07d6fe98 100644
--- a/docs/quickstart/quickstart_chains/polkadot-kilt.md
+++ b/docs/quickstart/quickstart_chains/polkadot-kilt.md
@@ -1,16 +1,12 @@
# Kilt Quick Start
-## Goals
-
This quick start guide introduces SubQuery's Substrate Kilt Spiritnet support by using an example project in Kilt Spiritnet. The example project indexes all Attestations [created](https://spiritnet.subscan.io/event?module=Attestation&event=AttestationCreated) and [revoked](https://spiritnet.subscan.io/event?module=Attestation&event=AttestationRevoked) on the [Kilt Spiritnet Blockchain](https://www.kilt.io/).
-Previously, in the [1. Create a New Project](../quickstart.md) section, [3 key files](../quickstart.md#_3-make-changes-to-your-project) were mentioned. Let's take a closer look at these files.
+
The project that we are developing throughout this guide can be found [here](https://github.com/subquery/subql-starter/tree/main/Kilt/kilt-spiritnet-credentials-example).
-## 1. GraphQL Schema File
-
-The `schema.graphql` file determines the shape of the data that you are using SubQuery to index, hence it's a great place to start. The shape of your data is defined in a GraphQL Schema file with various [GraphQL entities](../../build/graphql.md).
+
The Kilt-spiritnet-credentials-example project has two entities: Attestation and Aggregation (which has a [foreign key](../../build/graphql.md#one-to-many-relationships) to Dapp). These index basic block data such as the timestamp, height, and hash along with some other attributes related to the event.
@@ -37,36 +33,9 @@ type Aggregation @entity {
}
```
-::: warning Important
-When you make any changes to the schema file, please ensure that you regenerate your types directory.
-:::
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn codegen
-```
-
-@tab npm
-
-```shell
-npm run-script codegen
-```
-
-:::
-
-You will find the generated models in the `/src/types/models` directory.
+
-Check out the [GraphQL Schema](../../build/graphql.md) documentation to get in-depth information on `schema.graphql` file.
-
-## 2. The Project Manifest File
-
-The Project Manifest (`project.ts`) file works as an entry point to your project. It defines most of the details on how SubQuery will index and transform the chain data. For Substrate/Polkadot chains, there are three types of mapping handlers:
-
-- [BlockHanders](../../build/manifest/polkadot.md#mapping-handlers-and-filters): On each and every block, run a mapping function
-- [EventHandlers](../../build/manifest/polkadot.md#mapping-handlers-and-filters): On each and every Event that matches optional filter criteria, run a mapping function
-- [CallHanders](../../build/manifest/polkadot.md#mapping-handlers-and-filters): On each and every extrinsic call that matches optional filter criteria, run a mapping function
+
We are indexing all attestation creation and revocation events from the Kilt Spiritnet blockchain. This section in the Project Manifest now imports all the correct definitions and lists the triggers that we look for on the blockchain when indexing.
@@ -106,9 +75,9 @@ The above code indicates that you will be running a `handleAttestationCreated` m
Check out our [Substrate](../../build/manifest/polkadot.md) documentation to get more information about the Project Manifest (`project.ts`) file for Polkadot chains.
-## 3. Mapping Functions
+
-Mapping functions define how chain data is transformed into the optimised GraphQL entities that we previously defined in the `schema.graphql` file.
+
Navigate to the default mapping function in the `src/mappings` directory. There is one additional function in the mapping file, `handleDailyUpdate`, which allows us to calculate daily aggregates of attestations created and revoked.
@@ -194,63 +163,13 @@ The `handleAttestationCreated` function receives event data from the Kilt execut
There is one more function in the mapping file, `handleDailyUpdate`, which allows us to calculate daily aggregates of attestations created and revoked.
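Going back to `handleAttestationCreated`, a rough sketch of that handler might look like this; the order of the event data fields and the `Attestation` entity fields are assumptions, so check them against the example repository.

```ts
import { SubstrateEvent } from "@subql/types";
import { Attestation } from "../types";

export async function handleAttestationCreated(event: SubstrateEvent): Promise<void> {
  // attestation.AttestationCreated emits (attester, claimHash, ctypeHash, ...)
  const {
    event: {
      data: [attester, claimHash, ctypeHash],
    },
  } = event;

  const attestation = Attestation.create({
    id: claimHash.toString(),
    creationBlock: event.block.block.header.number.toBigInt(),
    attester: attester.toString(),
    ctypeId: ctypeHash.toString(),
  });

  await attestation.save();
}
```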
-Check out our mappings documentation for [Substrate](../../build/mapping/polkadot.md) to get detailed information on mapping functions for each type.
-
-## 4. Build Your Project
-
-Next, build your work to run your new SubQuery project. Run the build command from the project's root directory as given here:
-
-::: code-tabs
-@tab:active yarn
+
-```shell
-yarn build
-```
-
-@tab npm
-
-```shell
-npm run-script build
-```
+
-:::
+
-::: warning Important
-Whenever you make changes to your mapping functions, make sure to rebuild your project.
-:::
-
-## 5. Run Your Project Locally with Docker
-
-SubQuery provides a Docker container to run projects very quickly and easily for development purposes.
-
-The docker-compose.yml file defines all the configurations that control how a SubQuery node runs. For a new project, which you have just initialised, you won't need to change anything.
-
-Run the following command under the project directory:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn start:docker
-```
-
-@tab npm
-
-```shell
-npm run-script start:docker
-```
-
-:::
-
-::: tip
-It may take a few minutes to download the required images and start the various nodes and Postgres databases.
-:::
-
-Visit [Running SubQuery Locally](../../run_publish/run.md) to get more information on the file and the settings.
-
-## 6. Query Your Project
-
-Once the container is running, navigate to http://localhost:3000 in your browser and run the sample GraphQL command provided in the README file. Below is an example query from the kilt-example project.
+
```graphql
query {
@@ -321,14 +240,6 @@ You should see results similar to below:
}
```
-## What's next?
-
Congratulations! You now have a locally running SubQuery project that accepts GraphQL API requests for events related to attestations [created](https://spiritnet.subscan.io/event?module=Attestation&event=AttestationCreated) and [revoked](https://spiritnet.subscan.io/event?module=Attestation&event=AttestationRevoked) on the [Kilt Spiritnet Blockchain](https://www.kilt.io/).
-::: tip Tip
-
-Find out how to build a performant SubQuery project and avoid common mistakes in [Project Optimisation](../../build/optimisation.md).
-
-:::
-
-Click [here](../../quickstart/whats-next.md) to learn what should be your **next step** in your SubQuery journey.
+
diff --git a/docs/quickstart/quickstart_chains/polkadot-moonbeam.md b/docs/quickstart/quickstart_chains/polkadot-moonbeam.md
index 3778311b41c..0d86c8da0cc 100644
--- a/docs/quickstart/quickstart_chains/polkadot-moonbeam.md
+++ b/docs/quickstart/quickstart_chains/polkadot-moonbeam.md
@@ -1,16 +1,12 @@
# Moonbeam (EVM) Quick Start
-## Goals
-
This quick start guide introduces SubQuery's Substrate EVM support by using an example project in Moonbeam Network. The example project indexes all Transfers from the [Moonbeam EVM FRAX ERC-20 contract](https://moonscan.io/token/0x322e86852e492a7ee17f28a78c663da38fb33bfb), as well as Collators joining and leaving events from [Moonbeam's Staking functions](https://docs.moonbeam.network/builders/pallets-precompiles/pallets/staking/).
This project is unique in that it indexes data from both Moonbeam's Substrate execution layer (native Moonbeam pallets and runtime) and Moonbeam's EVM smart contract layer, within the same SubQuery project and into the same dataset. A very similar approach was taken with [indexing Astar's WASM layer](https://academy.subquery.network/quickstart/quickstart_chains/polkadot-astar.html).
-Previously, in the [1. Create a New Project](../quickstart.md) section, [3 key files](../quickstart.md#_3-make-changes-to-your-project) were mentioned. Let's take a closer look at these files.
-
-## 1. GraphQL Schema File
+
-The `schema.graphql` file determines the shape of the data that you are using SubQuery to index, hence it's a great place to start. The shape of your data is defined in a GraphQL Schema file with various [GraphQL entities](../../build/graphql.md).
+
The Moonbeam-evm-substrate-starter project has two entities: `Erc20Transfer` and `Collator`. These two entities index ERC-20 transfers related to [the $FRAX contract](https://moonscan.io/token/0x322e86852e492a7ee17f28a78c663da38fb33bfb), as well as any [collators joining or leaving](https://docs.moonbeam.network/node-operators/networks/collators/activities/) the Moonbeam Parachain.
@@ -29,30 +25,13 @@ type Collator @entity {
}
```
-::: warning Important
-When you make any changes to the schema file, please ensure that you regenerate your types directory.
-:::
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn codegen
-```
-
-@tab npm
-
-```shell
-npm run-script codegen
-```
-
-:::
+
You will find the generated models in the `/src/types/models` directory.
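For example, assuming the two entities above, the generated classes could then be imported into a mapping file like this (a minimal sketch):

```ts
// Generated by codegen from schema.graphql
import { Erc20Transfer, Collator } from "../types";
```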
-Check out the [GraphQL Schema](../../build/graphql.md) documentation to get in-depth information on `schema.graphql` file.
+
-## 2. The Project Manifest File
+## The Project Manifest File
The Project Manifest (`project.ts`) file works as an entry point to your project. It defines most of the details on how SubQuery will index and transform the chain data. For Substrate/Polkadot chains, there are three types of mapping handlers:
@@ -103,7 +82,7 @@ For [EVM](../../build/substrate-evm.md) and [WASM](../../build/substrate-wasm.md
This indicates that you will be running a `handleCollatorJoined` mapping function whenever the method `joinCandidates` is called on the `staking` pallet. Similarly, we will run `handleCollatorLeft` whenever the method `executeLeaveCandidates` is called on the `staking` pallet. This covers the most basic actions that Collators can do (requesting to join the candidates pool & leaving the candidates pool). For more information about other methods possible under the `staking` pallet in Moonbeam, the Moonbeam documentation provides a [list of possible functions to call](https://docs.moonbeam.network/builders/pallets-precompiles/pallets/staking/).
-Check out our [Manifest File](../../build/manifest/polkadot.md) documentation to get more information about the Project Manifest (`project.ts`) file.
+
### EVM Manifest Section
@@ -151,9 +130,7 @@ The above code indicates that you will be running a `handleErc20Transfer` mappin
Check out our [Substrate EVM](../../build/substrate-evm.md) documentation to get more information about the Project Manifest (`project.ts`) file for Substrate EVM contracts.
-## 3. Mapping Functions
-
-Mapping functions define how chain data is transformed into the optimised GraphQL entities that we previously defined in the `schema.graphql` file.
+
Navigate to the default mapping functions in the `src/mappings` directory. You will see the exported functions `handleCollatorJoined`, `handleCollatorLeft`, and `handleErc20Transfer`.
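As a rough, hedged sketch of the Substrate side, a collator call handler typically just records the extrinsic's signer. The `Collator` entity shape shown below is an assumption — see the starter repository for the actual implementation.

```ts
import { SubstrateExtrinsic } from "@subql/types";
import { Collator } from "../types";

// Sketch only: store the account that called staking.joinCandidates.
export async function handleCollatorJoined(
  call: SubstrateExtrinsic,
): Promise<void> {
  const account = call.extrinsic.signer.toString();
  const collator = Collator.create({ id: account });
  await collator.save();
}
```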
@@ -211,63 +188,11 @@ The `handleErc20Transfer` function receives event data from the EVM execution en
Check out our mappings documentation for [Substrate](../../build/mapping/polkadot.md) and the [Substrate Frontier EVM data processor](../../build/substrate-evm.md) to get detailed information on mapping functions for each type.
-## 4. Build Your Project
-
-Next, build your work to run your new SubQuery project. Run the build command from the project's root directory as given here:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn build
-```
-
-@tab npm
-
-```shell
-npm run-script build
-```
-
-:::
-
-::: warning Important
-Whenever you make changes to your mapping functions, make sure to rebuild your project.
-:::
+
-## 5. Run Your Project Locally with Docker
+
-SubQuery provides a Docker container to run projects very quickly and easily for development purposes.
-
-The `docker-compose.yml` file defines all the configurations that control how a SubQuery node runs. For a new project, which you have just initialised, you won't need to change anything.
-
-Run the following command under the project directory:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn start:docker
-```
-
-@tab npm
-
-```shell
-npm run-script start:docker
-```
-
-:::
-
-::: tip
-It may take a few minutes to download the required images and start the various nodes and Postgres databases.
-:::
-
-Visit [Running SubQuery Locally](../../run_publish/run.md) to get more information on the file and the settings.
-
-## 6. Query Your Project
-
-Once the container is running, navigate to http://localhost:3000 in your browser and run the sample GraphQL command provided in the README file. Below is an example query from this project.
-
-Once the container is running, navigate to http://localhost:3000 in your browser and run the sample GraphQL command provided in the README file. Below is an example query from the Astar-wasm-starter project.
+
```graphql
query {
@@ -328,14 +253,6 @@ You should see results similar to below:
}
```
-## What's next?
-
Congratulations! You now have a locally running SubQuery project that accepts GraphQL API requests for transfer events from the $FRAX smart contract at [`0x322E86852e492a7Ee17f28a78c663da38FB33bfb`](https://moonscan.io/token/0x322e86852e492a7ee17f28a78c663da38fb33bfb).
-::: tip Tip
-
-Find out how to build a performant SubQuery project and avoid common mistakes in [Project Optimisation](../../build/optimisation.md).
-
-:::
-
-Click [here](../../quickstart/whats-next.md) to learn what should be your **next step** in your SubQuery journey.
+
diff --git a/docs/quickstart/quickstart_chains/polkadot.md b/docs/quickstart/quickstart_chains/polkadot.md
index 1725f977329..95f19f4217f 100644
--- a/docs/quickstart/quickstart_chains/polkadot.md
+++ b/docs/quickstart/quickstart_chains/polkadot.md
@@ -1,16 +1,8 @@
# Polkadot/Substrate Quick Start
-## Goals
-
The goal of this quick guide is to adapt the standard starter project and start indexing all transfers from Polkadot.
-::: warning Important
-Before we begin, make sure that you have initialised your project using the provided steps in the [Start Here](../quickstart.md) section.
-:::
-
-Now, let's move forward and update these configurations.
-
-Previously, in the [1. Create a New Project](../quickstart.md) section, you must have noted [3 key files](../quickstart.md#_3-make-changes-to-your-project). Let's begin updating them one by one.
+
#### Check out how to get the Polkadot starter project running
@@ -18,9 +10,7 @@ Previously, in the [1. Create a New Project](../quickstart.md) section, you must
-## 1. Update Your GraphQL Schema File
-
-The `schema.graphql` file determines the shape of the data that you are using SubQuery to index, hence it's a great place to start. The shape of your data is defined in a GraphQL Schema file with various [GraphQL entities](../../build/graphql.md).
+
Remove all existing entities and update the `schema.graphql` file as follows. Here you can see that we are indexing all transfers from Polkadot:
@@ -34,40 +24,11 @@ type Transfer @entity {
}
```
-::: warning Important
-When you make any changes to the schema file, please ensure that you regenerate your types directory.
-:::
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn codegen
-```
-
-@tab npm
-
-```shell
-npm run-script codegen
-```
-
-:::
-
-You will find the generated models in the `/src/types/models` directory.
-
-Check out the [GraphQL Schema](../../build/graphql.md) documentation to get in-depth information on `schema.graphql` file.
+
Now that you have made essential changes to the GraphQL Schema file, let’s move forward to the next file.
-## 2. Update Your Project Manifest File
-
-The Project Manifest (`project.ts`) file works as an entry point to your project. It defines most of the details on how SubQuery will index and transform the chain data. For Substrate/Polkadot chains, there are three types of mapping handlers (and you can have more than one in each project):
-
-- [BlockHanders](../../build/manifest/polkadot.md#mapping-handlers-and-filters): On each and every block, run a mapping function
-- [EventHandlers](../../build/manifest/polkadot.md#mapping-handlers-and-filters): On each and every Event that matches optional filter criteria, run a mapping function
-- [CallHanders](../../build/manifest/polkadot.md#mapping-handlers-and-filters): On each and every extrinsic call that matches optional filter criteria, run a mapping function
-
-Note that the manifest file has already been set up correctly and doesn’t require significant changes, but you need to change the datasource handlers. This section lists the triggers that look for on the blockchain to start indexing.
+
**Since we are planning to index all Polkadot transfers, we need to update the `datasources` section as follows:**
@@ -97,13 +58,9 @@ Note that the manifest file has already been set up correctly and doesn’t requ
This indicates that you will be running a `handleEvent` mapping function whenever there is an event emitted from the `balances` module with the `transfer` method.
-Check out our [Manifest File](../../build/manifest/polkadot.md) documentation to get more information about the Project Manifest (`project.ts`) file.
+
-Next, let’s proceed ahead with the Mapping Function’s configuration.
-
-## 3. Add a Mapping Function
-
-Mapping functions define how chain data is transformed into the optimised GraphQL entities that we previously defined in the `schema.graphql` file.
+
Navigate to the default mapping function in the `src/mappings` directory. You will see three exported functions: `handleBlock`, `handleEvent`, and `handleCall`. Delete both the `handleBlock` and `handleCall` functions as you will only deal with the `handleEvent` function.
@@ -140,73 +97,13 @@ Let’s understand how the above code works.
The function here receives a `SubstrateEvent` which includes transfer data in the payload. We extract this data and then instantiate a new `Transfer` entity defined earlier in the `schema.graphql` file. After that, we add additional information and then use the `.save()` function to save the new entity (_SubQuery will automatically save this to the database_).
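For reference, a handler along those lines looks roughly like the sketch below. The entity field names and the id format are assumptions based on the description above (the generated `Transfer` class comes from your own `schema.graphql`), so treat it as an outline rather than the exact starter code.

```ts
import { SubstrateEvent } from "@subql/types";
import { Balance } from "@polkadot/types/interfaces";
import { Transfer } from "../types";

export async function handleEvent(event: SubstrateEvent): Promise<void> {
  // A balances Transfer event carries [from, to, amount] in its data payload.
  const {
    event: {
      data: [from, to, amount],
    },
  } = event;

  const transfer = Transfer.create({
    id: `${event.block.block.header.number.toNumber()}-${event.idx}`, // assumed id format
    blockNumber: event.block.block.header.number.toBigInt(), // assumed field name
    from: from.toString(),
    to: to.toString(),
    amount: (amount as Balance).toBigInt(),
  });
  await transfer.save();
}
```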
-Check out our [Mappings](../../build/mapping/polkadot.md) documentation to get detailed information on mapping functions.
-
-## 4. Build Your Project
-
-Next, build your work to run your new SubQuery project. Run the build command from the project's root directory as given here:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn build
-```
-
-@tab npm
-
-```shell
-npm run-script build
-```
-
-:::
-
-::: warning Important
-Whenever you make changes to your mapping functions, make sure to rebuild your project.
-:::
-
-Now, you are all set to run your first SubQuery project. Let’s dig out the process of running the project in detail.
-
-## 5. Run Your Project Locally with Docker
-
-Whenever you create a new SubQuery Project, first, you must run it locally on your computer and test it. Using Docker is the easiest and quickiest way to do this.
-
-The `docker-compose.yml` file defines all the configurations that control how a SubQuery node runs. For a new project, which you have just initialised, you won't need to change anything.
+
-However, visit the [Running SubQuery Locally](../../run_publish/run.md) to get more information on the file and the settings.
+
-Run the following command under the project directory:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn start:docker
-```
-
-@tab npm
-
-```shell
-npm run-script start:docker
-```
+
-:::
-
-::: tip
-It may take a few minutes to download the required images and start the various nodes and Postgres databases.
-:::
-
-## 6. Query Your Project
-
-Next, let's query our project. Follow these three simple steps to query your SubQuery project:
-
-1. Open your browser and head to `http://localhost:3000`.
-
-2. You will see a GraphQL playground in the browser and the schemas which are ready to query.
-
-3. Find the _Docs_ tab on the right side of the playground which should open a documentation drawer. This documentation is automatically generated and it helps you find what entities and methods you can query.
-
-Try the following query to understand how it works for your new SubQuery starter project. Don’t forget to learn more about the [GraphQL Query language](../../run_publish/query.md).
+
```graphql
{
@@ -274,14 +171,4 @@ You will see the result similar to below:
}
```
-## What's next?
-
-Congratulations! You have now a locally running SubQuery project that accepts GraphQL API requests for transferring data.
-
-::: tip Tip
-
-Find out how to build a performant SubQuery project and avoid common mistakes in [Project Optimisation](../../build/optimisation.md).
-
-:::
-
-Click [here](../../quickstart/whats-next.md) to learn what should be your **next step** in your SubQuery journey.
+
diff --git a/docs/quickstart/quickstart_chains/polygon-lens.md b/docs/quickstart/quickstart_chains/polygon-lens.md
index b506e9c8149..9e4a6348c3d 100644
--- a/docs/quickstart/quickstart_chains/polygon-lens.md
+++ b/docs/quickstart/quickstart_chains/polygon-lens.md
@@ -1,22 +1,16 @@
# Polygon Quick Start - Lens Protocol
-## Goals
-
This article's purpose is to provide a clear, step-by-step guide on setting up an indexer for the Lens Protocol on the Polygon blockchain. By the end of this guide, you will understand what Lens Protocol is, why its smart contract data is valuable, and how to set up a SubQuery indexer to track and index events such as profile creation, posts, and follows.
-::: warning
-Before we begin, make sure that you have initialised your project using the provided steps in the [Start Here](../quickstart.md) section. **Please initialise a Polygon project**
-:::
-
-Now, let's move forward and update these configurations.
+**Please initialise a Polygon project**
-Previously, in the [1. Create a New Project](../quickstart.md) section, you must have noted [3 key files](../quickstart.md#_3-make-changes-to-your-project). Let's begin updating them one by one.
+
::: tip Note
The final code of this project can be found [here](https://github.com/subquery/ethereum-subql-starter/tree/main/Polygon/polygon-lens).
:::
-## 1. Update Your Project Manifest File
+
In this Lens Protocol indexing project, our primary objective is to configure the indexer to specifically gather data from a single smart contract: `0xDb46d1Dc155634FbC732f92E853b10B288AD5a1d`, which you can find on [this page](https://polygonscan.com/address/0xDb46d1Dc155634FbC732f92E853b10B288AD5a1d). You can copy the entire JSON and save it as a file `LensHub.abi.json` in the root directory.
@@ -26,12 +20,6 @@ We use Ethereum packages, runtimes, and handlers (e.g. `@subql/node-ethereum`, `
For a more comprehensive understanding of how the fundamental mechanisms of this protocol work, you can refer to the official [Lens documentation](https://docs.lens.xyz/v2/docs/what-is-lens).
-The Project Manifest (`project.ts`) file works as an entry point to your Polygon project. It defines most of the details on how SubQuery will index and transform the chain data. For Polygon, there are three types of mapping handlers (and you can have more than one in each project):
-
-- [BlockHanders](../../build/manifest/polygon.md#mapping-handlers-and-filters): On each and every block, run a mapping function
-- [TransactionHandlers](../../build/manifest/polygon.md#mapping-handlers-and-filters): On each and every transaction that matches optional filter criteria, run a mapping function
-- [LogHanders](../../build/manifest/polygon.md#mapping-handlers-and-filters): On each and every log that matches optional filter criteria, run a mapping function
-
Then, you need to update the `datasources` section as follows:
```ts
@@ -89,13 +77,9 @@ This setup establishes an manifest file to gather and manage information from a
3. `handleFollowed`: This handler is designed to handle data stemming from the `Followed` event, which tracks user interactions related to following other users (a rough sketch of this handler follows below).
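The sketch is illustrative only — the generated `FollowedLog` type name, the `Follow` entity fields, and the event argument names are assumptions; check the generated code in `src/types` and the final repository for the real implementation.

```ts
import { Follow } from "../types";
// Assumed generated log type; the actual name comes from codegen over LensHub.abi.json.
import { FollowedLog } from "../types/abi-interfaces/LensHubAbi";

export async function handleFollowed(log: FollowedLog): Promise<void> {
  // Sketch only: an id derived from the transaction, plus the block timestamp.
  // The follower account and the followed profile would be read from log.args
  // (argument names depend on the ABI) and mapped to the schema fields above.
  const follow = Follow.create({
    id: `${log.transactionHash}-${log.logIndex}`,
    timestamp: log.block.timestamp,
  });
  await follow.save();
}
```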
-::: tip Note
-Check out our [Manifest File](../../build/manifest/polygon.md) documentation to get more information about the Project Manifest (`project.ts`) file.
-:::
-
-## 2. Update Your GraphQL Schema File
+
-The `schema.graphql` file determines the shape of your data from SubQuery due to the mechanism of the GraphQL query language. Hence, updating the GraphQL Schema file is the perfect place to start. It allows you to define your end goal right at the start.
+
Remove all existing entities and update the `schema.graphql` file as follows:
@@ -149,37 +133,12 @@ From three logs we're working with, there's a wealth of data to extract. Notably
- `Follow` that represents a follow action. Attributes include an ID, the account that initiated the follow action, the profile that was followed, and a timestamp for when the follow action occurred.
-::: tip Note
-Importantly, these relationships not only establish one-to-many connections but also extend to include many-to-many associations. To delve deeper into entity relationships, you can refer to [this section](../../build/graphql.md#entity-relationships). If you prefer a more example-based approach, our dedicated [Hero Course Module](../../academy/herocourse/module3.md) can provide further insights.
-:::
-
-SubQuery makes it easy and type-safe to work with your GraphQL entities, as well as smart contracts, events, transactions, and logs. SubQuery CLI will generate types from your project's GraphQL schema and any contract ABIs included in the data sources.
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn codegen
-```
-
-@tab npm
-
-```shell
-npm run-script codegen
-```
-
-:::
+
-This will create a new directory (or update the existing) `src/types` which contain generated entity classes for each type you have defined previously in `schema.graphql`. These classes provide type-safe entity loading, read and write access to entity fields - see more about this process in [the GraphQL Schema](../../build/graphql.md). All entites can be imported from the following directory:
+
```ts
import { Account, Post, Profile, Follow } from "../types";
-```
-
-It will also generate a class for every contract event, offering convenient access to event parameters, as well as information about the block and transaction from which the event originated. You can find detailed information on how this is achieved in the [EVM Codegen from ABIs](../../build/introduction.md#evm-codegen-from-abis) section. All of these types are stored in the `src/types/abi-interfaces` and `src/types/contracts` directories.
-
-```ts
-// Import a smart contract event class generated from provided ABIs
import {
PostCreatedLog,
ProfileCreatedLog,
@@ -187,13 +146,9 @@ import {
} from "../types/abi-interfaces/LensHubAbi";
```
-Check out the [GraphQL Schema](../../build/graphql.md) documentation to get in-depth information on `schema.graphql` file.
-
-Now that you have made essential changes to the GraphQL Schema file, let’s proceed ahead with the Mapping Function’s configuration.
+
-## 3. Add a Mapping Function
-
-Mapping functions define how chain data is transformed into the optimised GraphQL entities that we previously defined in the `schema.graphql` file.
+
Follow these steps to add a mapping function:
@@ -320,71 +275,13 @@ Let's dive into an explanation of the code above. The code includes three distin
Check the final code repository [here](https://github.com/subquery/ethereum-subql-starter/tree/main/Polygon/polygon-lens) to observe the integration of all previously mentioned configurations into a unified codebase.
:::
-## 4. Build Your Project
-
-Next, build your work to run your new SubQuery project. Run the build command from the project's root directory as given here:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn build
-```
+
-@tab npm
+
-```shell
-npm run-script build
-```
+
-:::
-
-::: warning Important
-Whenever you make changes to your mapping functions, you must rebuild your project.
-:::
-
-Now, you are ready to run your first SubQuery project. Let’s check out the process of running your project in detail.
-
-## 5. Run Your Project Locally with Docker
-
-Whenever you create a new SubQuery Project, first, you must run it locally on your computer and test it and using Docker is the easiest and quickiest way to do this.
-
-The `docker-compose.yml` file defines all the configurations that control how a SubQuery node runs. For a new project, which you have just initialised, you won't need to change anything.
-
-However, visit the [Running SubQuery Locally](../../run_publish/run.md) to get more information on the file and the settings.
-
-Run the following command under the project directory:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn start:docker
-```
-
-@tab npm
-
-```shell
-npm run-script start:docker
-```
-
-:::
-
-::: tip Note
-It may take a few minutes to download the required images and start the various nodes and Postgres databases.
-:::
-
-## 6. Query your Project
-
-Next, let's query our project. Follow these three simple steps to query your SubQuery project:
-
-1. Open your browser and head to `http://localhost:3000`.
-
-2. You will see a GraphQL playground in the browser and the schemas which are ready to query.
-
-3. Find the _Docs_ tab on the right side of the playground which should open a documentation drawer. This documentation is automatically generated and it helps you find what entities and methods you can query.
-
-Try the following query to understand how it works for your new SubQuery starter project. Don’t forget to learn more about the [GraphQL Query language](../../run_publish/query.md).
+
::: details Posts and Profiles
@@ -498,14 +395,4 @@ Try the following query to understand how it works for your new SubQuery starter
The final code of this project can be found [here](https://github.com/subquery/ethereum-subql-starter/tree/main/Polygon/polygon-lens).
:::
-## What's next?
-
-Congratulations! You have now a locally running SubQuery project that accepts GraphQL API requests for Lens Protocol data.
-
-::: tip Tip
-
-Find out how to build a performant SubQuery project and avoid common mistakes in [Project Optimisation](../../build/optimisation.md).
-
-:::
-
-Click [here](../../quickstart/whats-next.md) to learn what should be your **next step** in your SubQuery journey.
+
diff --git a/docs/quickstart/quickstart_chains/polygon-zkevm.md b/docs/quickstart/quickstart_chains/polygon-zkevm.md
index 563f21166f2..f7aec841faa 100644
--- a/docs/quickstart/quickstart_chains/polygon-zkevm.md
+++ b/docs/quickstart/quickstart_chains/polygon-zkevm.md
@@ -1,14 +1,10 @@
# Polygon zkEVM Quick Start
-## Goals
-
The goal of this quick start guide is to index all transfers and approval events from the [Wrapped ETH](https://zkevm.polygonscan.com/token/0x4f9a0e7fd2bf6067db6994cf12e4495df938e6e9) contract on the [Polygon zkEVM](https://zkevm.polygonscan.com) network.
-::: warning
-Before we begin, **make sure that you have initialised your project** using the provided steps in the [Start Here](../quickstart.md) section. Please initialise an a Polygon zkEVM project.
-:::
+Please initialise a Polygon zkEVM project.
-In every SubQuery project, there are [3 key files](../quickstart.md#_3-make-changes-to-your-project) to update. Let's begin updating them one by one.
+
::: tip Note
The final code of this project can be found [here](https://github.com/subquery/ethereum-subql-starter/blob/main/Polygon/polygon-zkevm-starter).
@@ -16,15 +12,7 @@ The final code of this project can be found [here](https://github.com/subquery/e
We use Ethereum packages, runtimes, and handlers (e.g. `@subql/node-ethereum`, `ethereum/Runtime`, and `ethereum/*Handler`) for Polygon zkEVM. Since Polygon zkEVM is an EVM-compatible layer-2 scaling solution, we can use the core Ethereum framework to index it.
:::
-## 1. Your Project Manifest File
-
-The Project Manifest (`project.ts`) file works as an entry point to your Polygon zkEVM project. It defines most of the details on how SubQuery will index and transform the chain data. For Poltgon zkEVM, there are three types of mapping handlers (and you can have more than one in each project):
-
-- [BlockHanders](../../build/manifest/ethereum.md#mapping-handlers-and-filters): On each and every block, run a mapping function
-- [TransactionHandlers](../../build/manifest/ethereum.md#mapping-handlers-and-filters): On each and every transaction that matches optional filter criteria, run a mapping function
-- [LogHanders](../../build/manifest/ethereum.md#mapping-handlers-and-filters): On each and every log that matches optional filter criteria, run a mapping function
-
-Note that the manifest file has already been set up correctly and doesn’t require significant changes, but you need to import the correct contract definitions and update the datasource handlers.
+
As we are indexing all transfers and approvals from the Wrapped ETH contract on the Polygon zkEVM network, the first step is to import the contract ABI definition, which can be obtained from any standard [ERC-20 contract](https://ethereum.org/en/developers/docs/standards/tokens/erc-20/). Copy the entire contract ABI and save it as a file called `erc20.abi.json` in the `/abis` directory.
@@ -82,11 +70,9 @@ The above code indicates that you will be running a `handleTransaction` mapping
The code also indicates that you will be running a `handleLog` mapping function whenever there is a `Transfer` event being emitted from the [WETH contract](https://zkevm.polygonscan.com/token/0x4f9a0e7fd2bf6067db6994cf12e4495df938e6e9).
-Check out our [Manifest File](../../build/manifest/ethereum.md) documentation to get more information about the Project Manifest (`project.ts`) file.
-
-## 2. Update Your GraphQL Schema File
+
-The `schema.graphql` file determines the shape of your data from SubQuery due to the mechanism of the GraphQL query language. Hence, updating the GraphQL Schema file is the perfect place to start. It allows you to define your end goal right at the start.
+
Remove all existing entities and update the `schema.graphql` file as follows. Here you can see we are indexing block information such as the id, blockHeight, transfer receiver and transfer sender, along with approvals and all of the attributes related to them (such as owner and spender).
@@ -110,51 +96,21 @@ type Approval @entity {
}
```
-SubQuery makes it easy and type-safe to work with your GraphQL entities, as well as smart contracts, events, transactions, and logs. SubQuery CLI will generate types from your project's GraphQL schema and any contract ABIs included in the data sources.
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn codegen
-```
-
-@tab npm
-
-```shell
-npm run-script codegen
-```
-
-:::
+
-This will create a new directory (or update the existing one) `src/types` which contains generated entity classes for each type you have defined previously in `schema.graphql`. These classes provide type-safe entity loading, and read and write access to entity fields - see more about this process in [the GraphQL Schema](../../build/graphql.md). All entities can be imported from the following directory:
+
```ts
import { Approval, Transfer } from "../types";
-```
-
-As you're creating a new EVM based project, this command will also generate ABI types and save them into `src/types` using the `npx typechain --target=ethers-v5` command, allowing you to bind these contracts to specific addresses in the mappings and call read-only contract methods against the block being processed.
-
-It will also generate a class for every contract event to provide easy access to event parameters, as well as the block and transaction the event originated from. Read about how this is done in [EVM Codegen from ABIs](../../build/introduction.md#evm-codegen-from-abis).
-
-In this example SubQuery project, you would import these types like so.
-
-```ts
import {
ApproveTransaction,
TransferLog,
} from "../types/abi-interfaces/Erc20Abi";
```
-::: warning Important
-When you make any changes to the schema file, please ensure that you regenerate your types directory using the SubQuery CLI prompt `yarn codegen` or `npm run-script codegen`.
-:::
-
-Check out the [GraphQL Schema](../../build/graphql.md) documentation to get in-depth information on `schema.graphql` file.
+
-## 3. Add a Mapping Function
-
-Mapping functions define how chain data is transformed into the optimised GraphQL entities that we previously defined in the `schema.graphql` file.
+
Navigate to the default mapping function in the `src/mappings` directory. You will be able to see two exported functions `handleLog` and `handleTransaction`:
@@ -202,73 +158,13 @@ The `handleLog` function receives a `log` parameter of type `TransferLog` which
The `handleTransaction` function receives a `tx` parameter of type `ApproveTransaction` which includes transaction data in the payload. We extract this data and then save this to the store using the `.save()` function (_Note that SubQuery will automatically save this to the database_).
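In outline, that handler can look like the sketch below. The entity field names and the way the decoded arguments are read are assumptions based on the description above — the project's actual code may differ slightly.

```ts
import { Approval } from "../types";
import { ApproveTransaction } from "../types/abi-interfaces/Erc20Abi";

// Sketch only: persist an Approval entity from an approve(spender, value) call.
export async function handleTransaction(tx: ApproveTransaction): Promise<void> {
  const [spender, rawValue] = tx.args!; // decoded arguments of approve(spender, value)
  const approval = Approval.create({
    id: tx.hash,
    owner: tx.from,
    spender: String(spender),
    value: BigInt(rawValue.toString()),
    contractAddress: tx.to!, // the WETH contract address
  });
  await approval.save();
}
```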
-Check out our [Mappings](../../build/mapping/ethereum.md) documentation to get more information on mapping functions.
+
-## 4. Build Your Project
+
-Next, build your work to run your new SubQuery project. Run the build command from the project's root directory as given here:
+
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn build
-```
-
-@tab npm
-
-```shell
-npm run-script build
-```
-
-:::
-
-::: warning Important
-Whenever you make changes to your mapping functions, you must rebuild your project.
-:::
-
-Now, you are ready to run your first SubQuery project. Let’s check out the process of running your project in detail.
-
-## 5. Run Your Project Locally with Docker
-
-Whenever you create a new SubQuery Project, first, you must run it locally on your computer and test it and using Docker is the easiest and quickest way to do this.
-
-The `docker-compose.yml` file defines all the configurations that control how a SubQuery node runs. For a new project, which you have just initialised, you won't need to change anything.
-
-However, visit the [Running SubQuery Locally](../../run_publish/run.md) to get more information on the file and the settings.
-
-Run the following command under the project directory:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn start:docker
-```
-
-@tab npm
-
-```shell
-npm run-script start:docker
-```
-
-:::
-
-::: tip Note
-It may take a few minutes to download the required images and start the various nodes and Postgres databases.
-:::
-
-## 6. Query your Project
-
-Next, let's query our project. Follow these three simple steps to query your SubQuery project:
-
-1. Open your browser and head to `http://localhost:3000`.
-
-2. You will see a GraphQL playground in the browser and the schemas which are ready to query.
-
-3. Find the _Docs_ tab on the right side of the playground which should open a documentation drawer. This documentation is automatically generated and it helps you find what entities and methods you can query.
-
-Try the following query to understand how it works for your new SubQuery starter project. Don’t forget to learn more about the [GraphQL Query language](../../run_publish/query.md).
+
```graphql
# Write your query or mutation here
@@ -371,14 +267,4 @@ You will see the result similar to below:
The final code of this project can be found [here](https://github.com/subquery/ethereum-subql-starter/blob/main/Polygon/polygon-zkevm-starter/).
:::
-## What's next?
-
-Congratulations! You have now a locally running SubQuery project that accepts GraphQL API requests for transferring data.
-
-::: tip Tip
-
-Find out how to build a performant SubQuery project and avoid common mistakes in [Project Optimisation](../../build/optimisation.md).
-
-:::
-
-Click [here](../../quickstart/whats-next.md) to learn what should be your **next step** in your SubQuery journey.
+
diff --git a/docs/quickstart/quickstart_chains/polygon.md b/docs/quickstart/quickstart_chains/polygon.md
index 1c0b92c03b0..2f7e095be82 100644
--- a/docs/quickstart/quickstart_chains/polygon.md
+++ b/docs/quickstart/quickstart_chains/polygon.md
@@ -1,16 +1,8 @@
# Polygon Quick Start
-## Goals
-
The goal of this quick start guide is to index all token deposits from the Polygon Plasma Bridge.
-::: warning
-Before we begin, make sure that you have initialised your project using the provided steps in the [Start Here](../quickstart.md) section. **Please initialise a Polygon project**
-:::
-
-Now, let's move forward and update these configurations.
-
-Previously, in the [1. Create a New Project](../quickstart.md) section, you must have noted [3 key files](../quickstart.md#_3-make-changes-to-your-project). Let's begin updating them one by one.
+
::: tip Note
The final code of this project can be found [here](https://github.com/subquery/ethereum-subql-starter/tree/main/Polygon/polygon-plasma-bridge).
@@ -22,20 +14,12 @@ The final code of this project can be found [here](https://github.com/subquery/e
-## 1. Update Your Project Manifest File
+
::: warning Important
We use Ethereum packages, runtimes, and handlers (e.g. `@subql/node-ethereum`, `ethereum/Runtime`, and `ethereum/*Handler`) for Polygon. Since Polygon is a layer-2 scaling solution, we can use the core Ethereum framework to index it.
:::
-The Project Manifest (`project.ts`) file works as an entry point to your Polygon project. It defines most of the details on how SubQuery will index and transform the chain data. For Polygon, there are three types of mapping handlers (and you can have more than one in each project):
-
-- [BlockHanders](../../build/manifest/polygon.md#mapping-handlers-and-filters): On each and every block, run a mapping function
-- [TransactionHandlers](../../build/manifest/polygon.md#mapping-handlers-and-filters): On each and every transaction that matches optional filter criteria, run a mapping function
-- [LogHanders](../../build/manifest/polygon.md#mapping-handlers-and-filters): On each and every log that matches optional filter criteria, run a mapping function
-
-Note that the manifest file has already been set up correctly and doesn’t require significant changes, but you need to import the correct contract definitions and update the datasource handlers.
-
We are indexing actions from the Plasma bridge contract. First, you will need to import the contract ABI definition from https://polygonscan.com/tx/0x88e3ab569326b52e9dd8a5f72545d89d8426bbf536f3bfaf31e023fb459ca373. You can copy the entire JSON and save it as a file `plasma.abi.json` in the root directory.
This section in the Project Manifest now imports all the correct definitions and lists the triggers that we look for on the blockchain when indexing.
@@ -85,11 +69,9 @@ This section in the Project Manifest now imports all the correct definitions and
The above code indicates that you will be running a `handleDeposit` mapping function whenever there is a `TokenDeposited` log on any transaction from the [Plasma Bridge contract](https://polygonscan.com/tx/0x88e3ab569326b52e9dd8a5f72545d89d8426bbf536f3bfaf31e023fb459ca373). Similarly, you'll be running a `handleWithdrawl` mapping function whenever there is a `TokenWithdrawn` log.
-Check out our [Manifest File](../../build/manifest/polygon.md) documentation to get more information about the Project Manifest (`project.ts`) file.
+
-## 2. Update Your GraphQL Schema File
-
-The `schema.graphql` file determines the shape of your data from SubQuery due to the mechanism of the GraphQL query language. Hence, updating the GraphQL Schema file is the perfect place to start. It allows you to define your end goal right at the start.
+
Remove all existing entities and update the `schema.graphql` file as follows. Here you can see we are indexing three entities: a `Deposit` and a `Withdrawl`, each with a [foreign key relationship](../../build/graphql.md#entity-relationships) to a `User`.
@@ -119,53 +101,21 @@ type User @entity {
}
```
-::: warning Important
-When you make any changes to the schema file, please ensure that you regenerate your types directory.
-:::
-
-SubQuery makes it easy and type-safe to work with your GraphQL entities, as well as smart contracts, events, transactions, and logs. SubQuery CLI will generate types from your project's GraphQL schema and any contract ABIs included in the data sources.
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn codegen
-```
-
-@tab npm
+
-```shell
-npm run-script codegen
-```
-
-:::
-
-This will create a new directory (or update the existing) `src/types` which contain generated entity classes for each type you have defined previously in `schema.graphql`. These classes provide type-safe entity loading, read and write access to entity fields - see more about this process in [the GraphQL Schema](../../build/graphql.md). All entites can be imported from the following directory:
+
```ts
import { Deposit, User, Withdrawl } from "../types";
-```
-
-As you're creating a new EVM based project, this command will also generate ABI types and save them into `src/types` using the `npx typechain --target=ethers-v5` command, allowing you to bind these contracts to specific addresses in the mappings and call read-only contract methods against the block being processed.
-
-It will also generate a class for every contract event to provide easy access to event parameters, as well as the block and transaction the event originated from. Read about how this is done in [EVM Codegen from ABIs](../../build/introduction.md#evm-codegen-from-abis).
-
-In the example Polygon SubQuery project, you would import these types like so.
-
-```ts
import {
TokenWithdrawnLog,
TokenDepositedLog,
} from "../types/abi-interfaces/PlasmaAbi";
```
-Check out the [GraphQL Schema](../../build/graphql.md) documentation to get in-depth information on `schema.graphql` file.
+
-Now that you have made essential changes to the GraphQL Schema file, let’s proceed ahead with the Mapping Function’s configuration.
-
-## 3. Add a Mapping Function
-
-Mapping functions define how chain data is transformed into the optimised GraphQL entities that we previously defined in the `schema.graphql` file.
+
Follow these steps to add a mapping function:
@@ -240,73 +190,13 @@ For `handleDeposit`, the function here receives an `TokenDepositedLog` which inc
For `handleWithdrawl`, the function here receives a `TokenWithdrawnLog` which includes transaction log data in the payload. We extract this data and first confirm that we have a `User` record via `checkGetUser`. We then create a new `Withdrawl` entity that we defined in our `schema.graphql` and then save this to the store using the `.save()` function (_Note that SubQuery will automatically save this to the database_).
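The `checkGetUser` helper is a standard get-or-create lookup. Here is a hedged sketch of the pattern — the id format and any extra `User` fields are assumptions; see the final repository for the real helper.

```ts
import { User } from "../types";

// Sketch only: return the existing User record, or create one if it is new.
async function checkGetUser(address: string): Promise<User> {
  const id = address.toLowerCase(); // assumed id format
  let user = await User.get(id);
  if (!user) {
    user = User.create({
      id,
      // initialise any other required fields from schema.graphql here
    });
    await user.save();
  }
  return user;
}
```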
-Check out our [Mappings](../../build/mapping/ethereum.md) documentation to get more information on mapping functions.
-
-## 4. Build Your Project
-
-Next, build your work to run your new SubQuery project. Run the build command from the project's root directory as given here:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn build
-```
-
-@tab npm
+
-```shell
-npm run-script build
-```
-
-:::
-
-::: warning Important
-Whenever you make changes to your mapping functions, you must rebuild your project.
-:::
-
-Now, you are ready to run your first SubQuery project. Let’s check out the process of running your project in detail.
-
-## 5. Run Your Project Locally with Docker
-
-Whenever you create a new SubQuery Project, first, you must run it locally on your computer and test it and using Docker is the easiest and quickiest way to do this.
+
-The `docker-compose.yml` file defines all the configurations that control how a SubQuery node runs. For a new project, which you have just initialised, you won't need to change anything.
+
-However, visit the [Running SubQuery Locally](../../run_publish/run.md) to get more information on the file and the settings.
-
-Run the following command under the project directory:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn start:docker
-```
-
-@tab npm
-
-```shell
-npm run-script start:docker
-```
-
-:::
-
-::: tip Note
-It may take a few minutes to download the required images and start the various nodes and Postgres databases.
-:::
-
-## 6. Query your Project
-
-Next, let's query our project. Follow these three simple steps to query your SubQuery project:
-
-1. Open your browser and head to `http://localhost:3000`.
-
-2. You will see a GraphQL playground in the browser and the schemas which are ready to query.
-
-3. Find the _Docs_ tab on the right side of the playground which should open a documentation drawer. This documentation is automatically generated and it helps you find what entities and methods you can query.
-
-Try the following query to understand how it works for your new SubQuery starter project. Don’t forget to learn more about the [GraphQL Query language](../../run_publish/query.md).
+
```graphql
query {
@@ -480,14 +370,4 @@ You will see the result similar to below:
The final code of this project can be found [here](https://github.com/subquery/ethereum-subql-starter/tree/main/Polygon/polygon-plasma-bridge).
:::
-## What's next?
-
-Congratulations! You have now a locally running SubQuery project that accepts GraphQL API requests for transferring data.
-
-::: tip Tip
-
-Find out how to build a performant SubQuery project and avoid common mistakes in [Project Optimisation](../../build/optimisation.md).
-
-:::
-
-Click [here](../../quickstart/whats-next.md) to learn what should be your **next step** in your SubQuery journey.
+
diff --git a/docs/quickstart/quickstart_chains/scroll-sepolia.md b/docs/quickstart/quickstart_chains/scroll-sepolia.md
index e9eb0b3788d..4cf7f27a897 100644
--- a/docs/quickstart/quickstart_chains/scroll-sepolia.md
+++ b/docs/quickstart/quickstart_chains/scroll-sepolia.md
@@ -1,14 +1,10 @@
# Scroll (Sepolia Testnet) Quick Start
-## Goals
-
The goal of this quick start guide is to index all transfers and approval events from the [Wrapped ETH](https://sepolia-blockscout.scroll.io/address/0x5300000000000000000000000000000000000004) contract on Scroll's [Sepolia](https://sepolia-blockscout.scroll.io/) Test Network.
-::: warning
-Before we begin, **make sure that you have initialised your project** using the provided steps in the [Start Here](../quickstart.md) section. Please initialise an a Scroll Sepolia project.
-:::
+Please initialise a Scroll Sepolia project.
-In every SubQuery project, there are [3 key files](../quickstart.md#_3-make-changes-to-your-project) to update. Let's begin updating them one by one.
+
::: tip Note
The final code of this project can be found [here](https://github.com/subquery/ethereum-subql-starter/tree/main/Scroll/scroll-sepolia-starter).
@@ -16,16 +12,7 @@ The final code of this project can be found [here](https://github.com/subquery/e
We use Ethereum packages, runtimes, and handlers (e.g. `@subql/node-ethereum`, `ethereum/Runtime`, and `ethereum/*Handler`) for Scroll. Since Scroll is an EVM-compatible layer-2 scaling solution, we can use the core Ethereum framework to index it.
:::
-## 1. Your Project Manifest File
-
-The Project Manifest (`project.ts`) file works as an entry point to your Scroll project. It defines most of the details on how SubQuery will index and transform the chain data. For
-Scroll, there are three types of mapping handlers (and you can have more than one in each project):
-
-- [BlockHanders](../../build/manifest/ethereum.md#mapping-handlers-and-filters): On each and every block, run a mapping function
-- [TransactionHandlers](../../build/manifest/ethereum.md#mapping-handlers-and-filters): On each and every transaction that matches optional filter criteria, run a mapping function
-- [LogHanders](../../build/manifest/ethereum.md#mapping-handlers-and-filters): On each and every log that matches optional filter criteria, run a mapping function
-
-Note that the manifest file has already been set up correctly and doesn’t require significant changes, but you need to import the correct contract definitions and update the datasource handlers.
+
As we are indexing all transfers and approvals from the Wrapped ETH contract on Scroll's Sepolia network, the first step is to import the contract ABI definition, which can be obtained from any standard [ERC-20 contract](https://ethereum.org/en/developers/docs/standards/tokens/erc-20/). Copy the entire contract ABI and save it as a file called `erc20.abi.json` in the `/abis` directory.
@@ -84,11 +71,9 @@ The above code indicates that you will be running a `handleTransaction` mapping
The code also indicates that you will be running a `handleLog` mapping function whenever there is a `Transfer` event being emitted from the [WETH contract](https://sepolia-blockscout.scroll.io/address/0x5300000000000000000000000000000000000004).
-Check out our [Manifest File](../../build/manifest/ethereum.md) documentation to get more information about the Project Manifest (`project.ts`) file.
-
-## 2. Update Your GraphQL Schema File
+
-The `schema.graphql` file determines the shape of your data from SubQuery due to the mechanism of the GraphQL query language. Hence, updating the GraphQL Schema file is the perfect place to start. It allows you to define your end goal right at the start.
+
Remove all existing entities and update the `schema.graphql` file as follows. Here you can see we are indexing block information such as the id, blockHeight, transfer receiver and transfer sender, along with approvals and all of the attributes related to them (such as owner and spender).
@@ -112,51 +97,21 @@ type Approval @entity {
}
```
-SubQuery makes it easy and type-safe to work with your GraphQL entities, as well as smart contracts, events, transactions, and logs. SubQuery CLI will generate types from your project's GraphQL schema and any contract ABIs included in the data sources.
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn codegen
-```
-
-@tab npm
-
-```shell
-npm run-script codegen
-```
-
-:::
+
-This will create a new directory (or update the existing one) `src/types` which contains generated entity classes for each type you have defined previously in `schema.graphql`. These classes provide type-safe entity loading, and read and write access to entity fields - see more about this process in [the GraphQL Schema](../../build/graphql.md). All entities can be imported from the following directory:
+
```ts
import { Approval, Transfer } from "../types";
-```
-
-As you're creating a new EVM based project, this command will also generate ABI types and save them into `src/types` using the `npx typechain --target=ethers-v5` command, allowing you to bind these contracts to specific addresses in the mappings and call read-only contract methods against the block being processed.
-
-It will also generate a class for every contract event to provide easy access to event parameters, as well as the block and transaction the event originated from. Read about how this is done in [EVM Codegen from ABIs](../../build/introduction.md#evm-codegen-from-abis).
-
-In this example SubQuery project, you would import these types like so.
-
-```ts
import {
ApproveTransaction,
TransferLog,
} from "../types/abi-interfaces/Erc20Abi";
```
-::: warning Important
-When you make any changes to the schema file, please ensure that you regenerate your types directory using the SubQuery CLI prompt `yarn codegen` or `npm run-script codegen`.
-:::
-
-Check out the [GraphQL Schema](../../build/graphql.md) documentation to get in-depth information on `schema.graphql` file.
+
-## 3. Add a Mapping Function
-
-Mapping functions define how chain data is transformed into the optimised GraphQL entities that we previously defined in the `schema.graphql` file.
+
Navigate to the default mapping function in the `src/mappings` directory. You will be able to see two exported functions `handleLog` and `handleTransaction`:
@@ -197,73 +152,13 @@ The `handleLog` function receives a `log` parameter of type `TransferLog` which
The `handleTransaction` function receives a `tx` parameter of type `ApproveTransaction` which includes transaction data in the payload. We extract this data and then save this to the store using the `.save()` function (_Note that SubQuery will automatically save this to the database_).
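For orientation, `handleLog` can be sketched roughly as below. The `Transfer` field names are assumptions based on the schema described earlier in this guide, so check the final repository for the exact code.

```ts
import { Transfer } from "../types";
import { TransferLog } from "../types/abi-interfaces/Erc20Abi";

// Sketch only: persist a Transfer entity from an ERC-20 Transfer log.
export async function handleLog(log: TransferLog): Promise<void> {
  const transfer = Transfer.create({
    id: `${log.transactionHash}-${log.logIndex}`,
    blockHeight: BigInt(log.blockNumber),
    from: log.args!.from,
    to: log.args!.to,
    value: log.args!.value.toBigInt(),
    contractAddress: log.address,
  });
  await transfer.save();
}
```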
-Check out our [Mappings](../../build/mapping/ethereum.md) documentation to get more information on mapping functions.
+
-## 4. Build Your Project
+
-Next, build your work to run your new SubQuery project. Run the build command from the project's root directory as given here:
+
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn build
-```
-
-@tab npm
-
-```shell
-npm run-script build
-```
-
-:::
-
-::: warning Important
-Whenever you make changes to your mapping functions, you must rebuild your project.
-:::
-
-Now, you are ready to run your first SubQuery project. Let’s check out the process of running your project in detail.
-
-## 5. Run Your Project Locally with Docker
-
-Whenever you create a new SubQuery Project, first, you must run it locally on your computer and test it and using Docker is the easiest and quickest way to do this.
-
-The `docker-compose.yml` file defines all the configurations that control how a SubQuery node runs. For a new project, which you have just initialised, you won't need to change anything.
-
-However, visit the [Running SubQuery Locally](../../run_publish/run.md) to get more information on the file and the settings.
-
-Run the following command under the project directory:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn start:docker
-```
-
-@tab npm
-
-```shell
-npm run-script start:docker
-```
-
-:::
-
-::: tip Note
-It may take a few minutes to download the required images and start the various nodes and Postgres databases.
-:::
-
-## 6. Query your Project
-
-Next, let's query our project. Follow these three simple steps to query your SubQuery project:
-
-1. Open your browser and head to `http://localhost:3000`.
-
-2. You will see a GraphQL playground in the browser and the schemas which are ready to query.
-
-3. Find the _Docs_ tab on the right side of the playground which should open a documentation drawer. This documentation is automatically generated and it helps you find what entities and methods you can query.
-
-Try the following query to understand how it works for your new SubQuery starter project. Don’t forget to learn more about the [GraphQL Query language](../../run_publish/query.md).
+
```graphql
# Write your query or mutation here
@@ -398,14 +293,4 @@ You will see the result similar to below:
The final code of this project can be found [here](https://github.com/subquery/ethereum-subql-starter/tree/main/Scroll/scroll-sepolia-starter).
:::
-## What's next?
-
-Congratulations! You have now a locally running SubQuery project that accepts GraphQL API requests for transferring data.
-
-::: tip Tip
-
-Find out how to build a performant SubQuery project and avoid common mistakes in [Project Optimisation](../../build/optimisation.md).
-
-:::
-
-Click [here](../../quickstart/whats-next.md) to learn what should be your **next step** in your SubQuery journey.
+
diff --git a/docs/quickstart/quickstart_chains/scroll.md b/docs/quickstart/quickstart_chains/scroll.md
index 0dee5456558..2276bdd509d 100644
--- a/docs/quickstart/quickstart_chains/scroll.md
+++ b/docs/quickstart/quickstart_chains/scroll.md
@@ -1,14 +1,10 @@
# Scroll Quick Start
-## Goals
-
The goal of this quick start guide is to index all transfers and approval events from [USDC](https://scrollscan.com/token/0x06efdbff2a14a7c8e15944d1f4a48f9f95f663a4) on Scroll's Mainnet.
-::: warning
-Before we begin, **make sure that you have initialised your project** using the provided steps in the [Start Here](../quickstart.md) section. Please initialise an a Scroll project.
-:::
+Please initialise a Scroll project.
-In every SubQuery project, there are [3 key files](../quickstart.md#_3-make-changes-to-your-project) to update. Let's begin updating them one by one.
+
::: tip Note
The final code of this project can be found [here](https://github.com/subquery/ethereum-subql-starter/tree/main/Scroll/scroll-starter).
@@ -16,16 +12,7 @@ The final code of this project can be found [here](https://github.com/subquery/e
We use Ethereum packages, runtimes, and handlers (e.g. `@subql/node-ethereum`, `ethereum/Runtime`, and `ethereum/*Handler`) for Scroll. Since Scroll is an EVM-compatible layer-2 scaling solution, we can use the core Ethereum framework to index it.
:::
-## 1. Your Project Manifest File
-
-The Project Manifest (`project.ts`) file works as an entry point to your Scroll project. It defines most of the details on how SubQuery will index and transform the chain data. For
-Scroll, there are three types of mapping handlers (and you can have more than one in each project):
-
-- [BlockHanders](../../build/manifest/ethereum.md#mapping-handlers-and-filters): On each and every block, run a mapping function
-- [TransactionHandlers](../../build/manifest/ethereum.md#mapping-handlers-and-filters): On each and every transaction that matches optional filter criteria, run a mapping function
-- [LogHanders](../../build/manifest/ethereum.md#mapping-handlers-and-filters): On each and every log that matches optional filter criteria, run a mapping function
-
-Note that the manifest file has already been set up correctly and doesn’t require significant changes, but you need to import the correct contract definitions and update the datasource handlers.
+
As we are indexing all transfers and approvals from the USDC contract on Scroll's network, the first step is to import the contract ABI definition, which can be obtained from any standard [ERC-20 contract](https://ethereum.org/en/developers/docs/standards/tokens/erc-20/). Copy the entire contract ABI and save it as a file called `erc20.abi.json` in the `/abis` directory.
@@ -83,11 +70,9 @@ The above code indicates that you will be running a `handleTransaction` mapping
The code also indicates that you will be running a `handleLog` mapping function whenever there is a `Transfer` event being emitted from the [USDC contract](https://scrollscan.com/token/0x06efdbff2a14a7c8e15944d1f4a48f9f95f663a4).
-Check out our [Manifest File](../../build/manifest/ethereum.md) documentation to get more information about the Project Manifest (`project.ts`) file.
-
-## 2. Update Your GraphQL Schema File
+
-The `schema.graphql` file determines the shape of your data from SubQuery due to the mechanism of the GraphQL query language. Hence, updating the GraphQL Schema file is the perfect place to start. It allows you to define your end goal right at the start.
+
Remove all existing entities and update the `schema.graphql` file as follows. Here you can see we are indexing block information such as the id, blockHeight, transfer receiver and transfer sender, along with approvals and all of the attributes related to them (such as owner and spender, etc.).
@@ -111,51 +96,21 @@ type Approval @entity {
}
```
-SubQuery makes it easy and type-safe to work with your GraphQL entities, as well as smart contracts, events, transactions, and logs. SubQuery CLI will generate types from your project's GraphQL schema and any contract ABIs included in the data sources.
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn codegen
-```
-
-@tab npm
-
-```shell
-npm run-script codegen
-```
-
-:::
+
-This will create a new directory (or update the existing one) `src/types` which contains generated entity classes for each type you have defined previously in `schema.graphql`. These classes provide type-safe entity loading, and read and write access to entity fields - see more about this process in [the GraphQL Schema](../../build/graphql.md). All entities can be imported from the following directory:
+
```ts
import { Approval, Transfer } from "../types";
-```
-
-As you're creating a new EVM based project, this command will also generate ABI types and save them into `src/types` using the `npx typechain --target=ethers-v5` command, allowing you to bind these contracts to specific addresses in the mappings and call read-only contract methods against the block being processed.
-
-It will also generate a class for every contract event to provide easy access to event parameters, as well as the block and transaction the event originated from. Read about how this is done in [EVM Codegen from ABIs](../../build/introduction.md#evm-codegen-from-abis).
-
-In this example SubQuery project, you would import these types like so.
-
-```ts
import {
ApproveTransaction,
TransferLog,
} from "../types/abi-interfaces/Erc20Abi";
```
-::: warning Important
-When you make any changes to the schema file, please ensure that you regenerate your types directory using the SubQuery CLI prompt `yarn codegen` or `npm run-script codegen`.
-:::
-
-Check out the [GraphQL Schema](../../build/graphql.md) documentation to get in-depth information on `schema.graphql` file.
+
-## 3. Add a Mapping Function
-
-Mapping functions define how chain data is transformed into the optimised GraphQL entities that we previously defined in the `schema.graphql` file.
+
Navigate to the default mapping function in the `src/mappings` directory. You will be able to see two exported functions `handleLog` and `handleTransaction`:
@@ -196,73 +151,13 @@ The `handleLog` function receives a `log` parameter of type `TransferLog` which
The `handleTransaction` function receives a `tx` parameter of type `ApproveTransaction` which includes transaction data in the payload. We extract this data and then save this to the store using the `.save()` function (_Note that SubQuery will automatically save this to the database_).
-Check out our [Mappings](../../build/mapping/ethereum.md) documentation to get more information on mapping functions.
+
-## 4. Build Your Project
+
-Next, build your work to run your new SubQuery project. Run the build command from the project's root directory as given here:
+
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn build
-```
-
-@tab npm
-
-```shell
-npm run-script build
-```
-
-:::
-
-::: warning Important
-Whenever you make changes to your mapping functions, you must rebuild your project.
-:::
-
-Now, you are ready to run your first SubQuery project. Let’s check out the process of running your project in detail.
-
-## 5. Run Your Project Locally with Docker
-
-Whenever you create a new SubQuery Project, first, you must run it locally on your computer and test it and using Docker is the easiest and quickest way to do this.
-
-The `docker-compose.yml` file defines all the configurations that control how a SubQuery node runs. For a new project, which you have just initialised, you won't need to change anything.
-
-However, visit the [Running SubQuery Locally](../../run_publish/run.md) to get more information on the file and the settings.
-
-Run the following command under the project directory:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn start:docker
-```
-
-@tab npm
-
-```shell
-npm run-script start:docker
-```
-
-:::
-
-::: tip Note
-It may take a few minutes to download the required images and start the various nodes and Postgres databases.
-:::
-
-## 6. Query your Project
-
-Next, let's query our project. Follow these three simple steps to query your SubQuery project:
-
-1. Open your browser and head to `http://localhost:3000`.
-
-2. You will see a GraphQL playground in the browser and the schemas which are ready to query.
-
-3. Find the _Docs_ tab on the right side of the playground which should open a documentation drawer. This documentation is automatically generated and it helps you find what entities and methods you can query.
-
-Try the following query to understand how it works for your new SubQuery starter project. Don’t forget to learn more about the [GraphQL Query language](../../run_publish/query.md).
+
```graphql
# Write your query or mutation here
@@ -397,14 +292,4 @@ You will see the result similar to below:
The final code of this project can be found [here](https://github.com/subquery/ethereum-subql-starter/tree/main/Scroll/scroll-starter).
:::
-## What's next?
-
-Congratulations! You have now a locally running SubQuery project that accepts GraphQL API requests for transferring data.
-
-::: tip Tip
-
-Find out how to build a performant SubQuery project and avoid common mistakes in [Project Optimisation](../../build/optimisation.md).
-
-:::
-
-Click [here](../../quickstart/whats-next.md) to learn what should be your **next step** in your SubQuery journey.
+
diff --git a/docs/quickstart/quickstart_chains/skale.md b/docs/quickstart/quickstart_chains/skale.md
index 94c8c8e05a8..2643dab1a69 100644
--- a/docs/quickstart/quickstart_chains/skale.md
+++ b/docs/quickstart/quickstart_chains/skale.md
@@ -1,14 +1,10 @@
# Skale Europa Quick Start
-## Goals
-
The goal of this quick start guide is to index all transfers and approval events from the [SKL Token](https://elated-tan-skat.explorer.mainnet.skalenodes.com/token/0x871Bb56655376622A367ece74332C449e5bAc433) on [Skale Europa](https://skale.space/) Network.
-::: warning
-Before we begin, **make sure that you have initialised your project** using the provided steps in the [Start Here](../quickstart.md) section. Please initialise an a Skale project.
-:::
+Please initialise a Skale project.
-In every SubQuery project, there are [3 key files](../quickstart.md#_3-make-changes-to-your-project) to update. Let's begin updating them one by one.
+
::: tip Note
The final code of this project can be found [here](https://github.com/subquery/ethereum-subql-starter/blob/main/Skale/skale-starter).
@@ -16,15 +12,7 @@ The final code of this project can be found [here](https://github.com/subquery/e
We use Ethereum packages, runtimes, and handlers (e.g. `@subql/node-ethereum`, `ethereum/Runtime`, and `ethereum/*Handler`) for Skale. Since Skale Europa is an EVM-compatible layer-2 scaling solution, we can use the core Ethereum framework to index it.
:::
-## 1. Your Project Manifest File
-
-The Project Manifest (`project.ts`) file works as an entry point to your Skale project. It defines most of the details on how SubQuery will index and transform the chain data. For Poltgon zkEVM, there are three types of mapping handlers (and you can have more than one in each project):
-
-- [BlockHanders](../../build/manifest/ethereum.md#mapping-handlers-and-filters): On each and every block, run a mapping function
-- [TransactionHandlers](../../build/manifest/ethereum.md#mapping-handlers-and-filters): On each and every transaction that matches optional filter criteria, run a mapping function
-- [LogHanders](../../build/manifest/ethereum.md#mapping-handlers-and-filters): On each and every log that matches optional filter criteria, run a mapping function
-
-Note that the manifest file has already been set up correctly and doesn’t require significant changes, but you need to import the correct contract definitions and update the datasource handlers.
+
As we are indexing all transfers and approvals from the SKL Token contract on the Skale network, the first step is to import the contract ABI definition, which can be obtained from any standard [ERC-20 contract](https://ethereum.org/en/developers/docs/standards/tokens/erc-20/). Copy the entire contract ABI and save it as a file called `erc20.abi.json` in the `/abis` directory.
@@ -83,11 +71,9 @@ The above code indicates that you will be running a `handleTransaction` mapping
The code also indicates that you will be running a `handleLog` mapping function whenever there is a `Transfer` event being emitted from the [SKL Token](https://elated-tan-skat.explorer.mainnet.skalenodes.com/token/0x871Bb56655376622A367ece74332C449e5bAc433).
-Check out our [Manifest File](../../build/manifest/ethereum.md) documentation to get more information about the Project Manifest (`project.ts`) file.
-
-## 2. Update Your GraphQL Schema File
+
-The `schema.graphql` file determines the shape of your data from SubQuery due to the mechanism of the GraphQL query language. Hence, updating the GraphQL Schema file is the perfect place to start. It allows you to define your end goal right at the start.
+
Remove all existing entities and update the `schema.graphql` file as follows. Here you can see we are indexing block information such as the id, blockHeight, transfer receiver and transfer sender, along with approvals and all of the attributes related to them (such as owner and spender, etc.).
@@ -111,51 +97,21 @@ type Approval @entity {
}
```
-SubQuery makes it easy and type-safe to work with your GraphQL entities, as well as smart contracts, events, transactions, and logs. SubQuery CLI will generate types from your project's GraphQL schema and any contract ABIs included in the data sources.
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn codegen
-```
-
-@tab npm
-
-```shell
-npm run-script codegen
-```
-
-:::
+
-This will create a new directory (or update the existing one) `src/types` which contains generated entity classes for each type you have defined previously in `schema.graphql`. These classes provide type-safe entity loading, and read and write access to entity fields - see more about this process in [the GraphQL Schema](../../build/graphql.md). All entities can be imported from the following directory:
+
```ts
import { Approval, Transfer } from "../types";
-```
-
-As you're creating a new EVM based project, this command will also generate ABI types and save them into `src/types` using the `npx typechain --target=ethers-v5` command, allowing you to bind these contracts to specific addresses in the mappings and call read-only contract methods against the block being processed.
-
-It will also generate a class for every contract event to provide easy access to event parameters, as well as the block and transaction the event originated from. Read about how this is done in [EVM Codegen from ABIs](../../build/introduction.md#evm-codegen-from-abis).
-
-In this example SubQuery project, you would import these types like so.
-
-```ts
import {
ApproveTransaction,
TransferLog,
} from "../types/abi-interfaces/Erc20Abi";
```
-::: warning Important
-When you make any changes to the schema file, please ensure that you regenerate your types directory using the SubQuery CLI prompt `yarn codegen` or `npm run-script codegen`.
-:::
-
-Check out the [GraphQL Schema](../../build/graphql.md) documentation to get in-depth information on `schema.graphql` file.
+
-## 3. Add a Mapping Function
-
-Mapping functions define how chain data is transformed into the optimised GraphQL entities that we previously defined in the `schema.graphql` file.
+
Navigate to the default mapping function in the `src/mappings` directory. You will be able to see two exported functions `handleLog` and `handleTransaction`:
@@ -196,73 +152,13 @@ The `handleLog` function receives a `log` parameter of type `TransferLog` which
The `handleTransaction` function receives a `tx` parameter of type `ApproveTransaction` which includes transaction data in the payload. We extract this data and then save this to the store using the `.save()` function (_Note that SubQuery will automatically save this to the database_).
-Check out our [Mappings](../../build/mapping/ethereum.md) documentation to get more information on mapping functions.
+
-## 4. Build Your Project
+
-Next, build your work to run your new SubQuery project. Run the build command from the project's root directory as given here:
+
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn build
-```
-
-@tab npm
-
-```shell
-npm run-script build
-```
-
-:::
-
-::: warning Important
-Whenever you make changes to your mapping functions, you must rebuild your project.
-:::
-
-Now, you are ready to run your first SubQuery project. Let’s check out the process of running your project in detail.
-
-## 5. Run Your Project Locally with Docker
-
-Whenever you create a new SubQuery Project, first, you must run it locally on your computer and test it and using Docker is the easiest and quickest way to do this.
-
-The `docker-compose.yml` file defines all the configurations that control how a SubQuery node runs. For a new project, which you have just initialised, you won't need to change anything.
-
-However, visit the [Running SubQuery Locally](../../run_publish/run.md) to get more information on the file and the settings.
-
-Run the following command under the project directory:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn start:docker
-```
-
-@tab npm
-
-```shell
-npm run-script start:docker
-```
-
-:::
-
-::: tip Note
-It may take a few minutes to download the required images and start the various nodes and Postgres databases.
-:::
-
-## 6. Query your Project
-
-Next, let's query our project. Follow these three simple steps to query your SubQuery project:
-
-1. Open your browser and head to `http://localhost:3000`.
-
-2. You will see a GraphQL playground in the browser and the schemas which are ready to query.
-
-3. Find the _Docs_ tab on the right side of the playground which should open a documentation drawer. This documentation is automatically generated and it helps you find what entities and methods you can query.
-
-Try the following query to understand how it works for your new SubQuery starter project. Don’t forget to learn more about the [GraphQL Query language](../../run_publish/query.md).
+
```graphql
# Write your query or mutation here
@@ -332,14 +228,4 @@ You will see the result similar to below:
The final code of this project can be found [here](https://github.com/subquery/ethereum-subql-starter/blob/main/Skale/skale-starter/).
:::
-## What's next?
-
-Congratulations! You have now a locally running SubQuery project that accepts GraphQL API requests for transferring data.
-
-::: tip Tip
-
-Find out how to build a performant SubQuery project and avoid common mistakes in [Project Optimisation](../../build/optimisation.md).
-
-:::
-
-Click [here](../../quickstart/whats-next.md) to learn what should be your **next step** in your SubQuery journey.
+
diff --git a/docs/quickstart/quickstart_chains/snippets/ethereum-gravatar.md b/docs/quickstart/quickstart_chains/snippets/ethereum-gravatar.md
deleted file mode 100644
index f2ab13a4f7e..00000000000
--- a/docs/quickstart/quickstart_chains/snippets/ethereum-gravatar.md
+++ /dev/null
@@ -1,3 +0,0 @@
-::: warning Important
-We suggest starting with the [Ethereum Gravatar example](./ethereum-gravatar). This project is a lot more complicated and introduces some more advanced concepts
-:::
diff --git a/docs/quickstart/quickstart_chains/stellar.md b/docs/quickstart/quickstart_chains/stellar.md
index 019ecfb6b40..9f212b9ea45 100644
--- a/docs/quickstart/quickstart_chains/stellar.md
+++ b/docs/quickstart/quickstart_chains/stellar.md
@@ -1,17 +1,11 @@
# Stellar & Soroban Quick Start
-## Goals
-
The goal of this quick start guide is to give a quick intro to all features of our Stellar and Soroban indexer. The example project indexes all Soroban transfer events on Stellar's Futurenet. It also indexes all account payments, including credits and debits - it's a great way to quickly learn how SubQuery works through a real-world, hands-on example.
-::: warning Important
-Before we begin, make sure that you have initialised your project using the provided steps in the [Start Here](../quickstart.md) section. **Please initialise a Stellar Futurenet project**
-:::
+
Now, let's move forward and update these configurations.
-Previously, in the [1. Create a New Project](../quickstart.md) section, you must have noted [3 key files](../quickstart.md#_3-make-changes-to-your-project). Let's begin updating them one by one.
-
::: tip Note
The final code of this project can be found [here](https://github.com/subquery/stellar-subql-starter/tree/main/Stellar/soroban-futurenet-starter).
:::
@@ -301,71 +295,13 @@ The `handleEvent` mapping function is for Soroban smart contracts, and the paylo
Check out our [Mappings](../../build/mapping/stellar.md) documentation to get more information on mapping functions.
-## 4. Build Your Project
-
-Next, build your work to run your new SubQuery project. Run the build command from the project's root directory as given here:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn build
-```
-
-@tab npm
-
-```shell
-npm run-script build
-```
-
-:::
-
-::: warning Important
-Whenever you make changes to your mapping functions, you must rebuild your project.
-:::
-
-Now, you are ready to run your first SubQuery project. Let’s check out the process of running your project in detail.
-
-## 5. Run Your Project Locally with Docker
-
-Whenever you create a new SubQuery Project, first, you must run it locally on your computer and test it and using Docker is the easiest and quickiest way to do this.
-
-The `docker-compose.yml` file defines all the configurations that control how a SubQuery node runs. For a new project, which you have just initialised, you won't need to change anything.
-
-However, visit the [Running SubQuery Locally](../../run_publish/run.md) to get more information on the file and the settings.
+
-Run the following command under the project directory:
+
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn start:docker
-```
-
-@tab npm
-
-```shell
-npm run-script start:docker
-```
-
-:::
-
-::: tip Note
-It may take a few minutes to download the required images and start the various nodes and Postgres databases.
-:::
-
-## 6. Query your Project
-
-Next, let's query our project. Follow these three simple steps to query your SubQuery project:
-
-1. Open your browser and head to `http://localhost:3000`.
-
-2. You will see a GraphQL playground in the browser and the schemas which are ready to query.
-
-3. Find the _Docs_ tab on the right side of the playground which should open a documentation drawer. This documentation is automatically generated and it helps you find what entities and methods you can query.
+
-Try the following query to understand how it works for your new SubQuery starter project. Don’t forget to learn more about the [GraphQL Query language](../../run_publish/query.md). The query shows a list of the most recent prices, and the most active oracles(by number of prices submitted).
+
```graphql
{
@@ -410,14 +346,4 @@ You will see the result similar to below:
The final code of this project can be found [here](https://github.com/subquery/stellar-subql-starter/tree/main/Stellar/soroban-futurenet-starter).
:::
-## What's next?
-
-Congratulations! You have now a locally running SubQuery project that accepts GraphQL API requests for transferring data.
-
-::: tip Tip
-
-Find out how to build a performant SubQuery project and avoid common mistakes in [Project Optimisation](../../build/optimisation.md).
-
-:::
-
-Click [here](../../quickstart/whats-next.md) to learn what should be your **next step** in your SubQuery journey.
+
diff --git a/docs/quickstart/quickstart_chains/zksync-era.md b/docs/quickstart/quickstart_chains/zksync-era.md
index e6dc919c183..7f606a48d76 100644
--- a/docs/quickstart/quickstart_chains/zksync-era.md
+++ b/docs/quickstart/quickstart_chains/zksync-era.md
@@ -1,14 +1,10 @@
# ZkSync (Era) Quick Start
-## Goals
-
The goal of this quick start guide is to index all transfers and approval events from the [Wrapped ETH](https://explorer.zksync.io/address/0x3355df6D4c9C3035724Fd0e3914dE96A5a83aaf4) on the [ZkSync Era](https://explorer.zksync.io) network.
-::: warning
-Before we begin, **make sure that you have initialised your project** using the provided steps in the [Start Here](../quickstart.md) section. Please initialise an a ZkSync Era project.
-:::
+Please initialise a ZkSync Era project.
-In every SubQuery project, there are [3 key files](../quickstart.md#_3-make-changes-to-your-project) to update. Let's begin updating them one by one.
+
::: tip Note
The final code of this project can be found [here](https://github.com/subquery/ethereum-subql-starter/blob/main/Zksync/zksync-starter/).
@@ -16,16 +12,7 @@ The final code of this project can be found [here](https://github.com/subquery/e
We use Ethereum packages, runtimes, and handlers (e.g. `@subql/node-ethereum`, `ethereum/Runtime`, and `ethereum/*Handler`) for ZkSync Era. Since ZkSync Era is an EVM-compatible layer-2 scaling solution, we can use the core Ethereum framework to index it.
:::
-## 1. Your Project Manifest File
-
-The Project Manifest (`project.ts`) file works as an entry point to your ZkSync project. It defines most of the details on how SubQuery will index and transform the chain data. For
-ZkSync Era, there are three types of mapping handlers (and you can have more than one in each project):
-
-- [BlockHanders](../../build/manifest/ethereum.md#mapping-handlers-and-filters): On each and every block, run a mapping function
-- [TransactionHandlers](../../build/manifest/ethereum.md#mapping-handlers-and-filters): On each and every transaction that matches optional filter criteria, run a mapping function
-- [LogHanders](../../build/manifest/ethereum.md#mapping-handlers-and-filters): On each and every log that matches optional filter criteria, run a mapping function
-
-Note that the manifest file has already been set up correctly and doesn’t require significant changes, but you need to import the correct contract definitions and update the datasource handlers.
+
As we are indexing all transfers and approvals from the Wrapped ETH contract on the ZkSync network, the first step is to import the contract ABI definition, which can be obtained from any standard [ERC-20 contract](https://ethereum.org/en/developers/docs/standards/tokens/erc-20/). Copy the entire contract ABI and save it as a file called `erc20.abi.json` in the `/abis` directory.
@@ -83,11 +70,9 @@ The above code indicates that you will be running a `handleTransaction` mapping
The code also indicates that you will be running a `handleLog` mapping function whenever there is a `Transfer` event being emitted from the [WETH contract](https://explorer.zksync.io/address/0x3355df6D4c9C3035724Fd0e3914dE96A5a83aaf4).
-Check out our [Manifest File](../../build/manifest/ethereum.md) documentation to get more information about the Project Manifest (`project.ts`) file.
-
-## 2. Update Your GraphQL Schema File
+
-The `schema.graphql` file determines the shape of your data from SubQuery due to the mechanism of the GraphQL query language. Hence, updating the GraphQL Schema file is the perfect place to start. It allows you to define your end goal right at the start.
+
Remove all existing entities and update the `schema.graphql` file as follows. Here you can see we are indexing block information such as the id, blockHeight, transfer receiver and transfer sender, along with approvals and all of the attributes related to them (such as owner and spender, etc.).
@@ -111,51 +96,21 @@ type Approval @entity {
}
```
-SubQuery makes it easy and type-safe to work with your GraphQL entities, as well as smart contracts, events, transactions, and logs. SubQuery CLI will generate types from your project's GraphQL schema and any contract ABIs included in the data sources.
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn codegen
-```
-
-@tab npm
-
-```shell
-npm run-script codegen
-```
-
-:::
+
-This will create a new directory (or update the existing one) `src/types` which contains generated entity classes for each type you have defined previously in `schema.graphql`. These classes provide type-safe entity loading, and read and write access to entity fields - see more about this process in [the GraphQL Schema](../../build/graphql.md). All entities can be imported from the following directory:
+
```ts
import { Approval, Transfer } from "../types";
-```
-
-As you're creating a new EVM based project, this command will also generate ABI types and save them into `src/types` using the `npx typechain --target=ethers-v5` command, allowing you to bind these contracts to specific addresses in the mappings and call read-only contract methods against the block being processed.
-
-It will also generate a class for every contract event to provide easy access to event parameters, as well as the block and transaction the event originated from. Read about how this is done in [EVM Codegen from ABIs](../../build/introduction.md#evm-codegen-from-abis).
-
-In this example SubQuery project, you would import these types like so.
-
-```ts
import {
ApproveTransaction,
TransferLog,
} from "../types/abi-interfaces/Erc20Abi";
```
-::: warning Important
-When you make any changes to the schema file, please ensure that you regenerate your types directory using the SubQuery CLI prompt `yarn codegen` or `npm run-script codegen`.
-:::
-
-Check out the [GraphQL Schema](../../build/graphql.md) documentation to get in-depth information on `schema.graphql` file.
+
-## 3. Add a Mapping Function
-
-Mapping functions define how chain data is transformed into the optimised GraphQL entities that we previously defined in the `schema.graphql` file.
+
Navigate to the default mapping function in the `src/mappings` directory. You will be able to see two exported functions `handleLog` and `handleTransaction`:
@@ -196,73 +151,13 @@ The `handleLog` function receives a `log` parameter of type `TransferLog` which
The `handleTransaction` function receives a `tx` parameter of type `ApproveTransaction` which includes transaction data in the payload. We extract this data and then save this to the store using the `.save()` function (_Note that SubQuery will automatically save this to the database_).
-Check out our [Mappings](../../build/mapping/ethereum.md) documentation to get more information on mapping functions.
+
-## 4. Build Your Project
+
-Next, build your work to run your new SubQuery project. Run the build command from the project's root directory as given here:
+
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn build
-```
-
-@tab npm
-
-```shell
-npm run-script build
-```
-
-:::
-
-::: warning Important
-Whenever you make changes to your mapping functions, you must rebuild your project.
-:::
-
-Now, you are ready to run your first SubQuery project. Let’s check out the process of running your project in detail.
-
-## 5. Run Your Project Locally with Docker
-
-Whenever you create a new SubQuery Project, first, you must run it locally on your computer and test it and using Docker is the easiest and quickest way to do this.
-
-The `docker-compose.yml` file defines all the configurations that control how a SubQuery node runs. For a new project, which you have just initialised, you won't need to change anything.
-
-However, visit the [Running SubQuery Locally](../../run_publish/run.md) to get more information on the file and the settings.
-
-Run the following command under the project directory:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn start:docker
-```
-
-@tab npm
-
-```shell
-npm run-script start:docker
-```
-
-:::
-
-::: tip Note
-It may take a few minutes to download the required images and start the various nodes and Postgres databases.
-:::
-
-## 6. Query your Project
-
-Next, let's query our project. Follow these three simple steps to query your SubQuery project:
-
-1. Open your browser and head to `http://localhost:3000`.
-
-2. You will see a GraphQL playground in the browser and the schemas which are ready to query.
-
-3. Find the _Docs_ tab on the right side of the playground which should open a documentation drawer. This documentation is automatically generated and it helps you find what entities and methods you can query.
-
-Try the following query to understand how it works for your new SubQuery starter project. Don’t forget to learn more about the [GraphQL Query language](../../run_publish/query.md).
+
```graphql
# Write your query or mutation here
@@ -397,14 +292,4 @@ You will see the result similar to below:
The final code of this project can be found [here](https://github.com/subquery/ethereum-subql-starter/tree/main/Zksync/zksync-starter).
:::
-## What's next?
-
-Congratulations! You have now a locally running SubQuery project that accepts GraphQL API requests for transferring data.
-
-::: tip Tip
-
-Find out how to build a performant SubQuery project and avoid common mistakes in [Project Optimisation](../../build/optimisation.md).
-
-:::
-
-Click [here](../../quickstart/whats-next.md) to learn what should be your **next step** in your SubQuery journey.
+
diff --git a/docs/quickstart/quickstart_multichain/galxe-nft.md b/docs/quickstart/quickstart_multichain/galxe-nft.md
index 5b365dfeac2..7207878e6bc 100644
--- a/docs/quickstart/quickstart_multichain/galxe-nft.md
+++ b/docs/quickstart/quickstart_multichain/galxe-nft.md
@@ -6,19 +6,13 @@ By the end of this guide, you will gain a deep understanding of Galxe NFTs, gras
A vital aspect of the Galxe platform revolves around the concept of campaigns. These campaigns serve as a collaborative credential infrastructure, enabling brands to enhance their web3 communities and products. What Galxe essentially does is utilise both on-chain and off-chain credentials to assist brands and protocols in their growth hacking campaigns. Users who complete campaign tasks receive on-chain proof of their accomplishments, which allows them to mint a Galxe NFT OAT (On-Chain Achievement Token).
-## Setting Up the Indexer
+
Galxe has been deployed on different blockchain networks, sometimes with different contract addresses. But because the same smart contract code was used, each one has the same methods and events.
-::: warning Important
-**This project operates across multiple chains, making it more complex than other single chain examples.**
+
-If you are new to SubQuery, we recommend starting your learning journey with single-chain examples, such as the [Ethereum Gravatar example](../quickstart_chains/ethereum-gravatar.md). After understanding the fundamentals, you can then advance to exploring the multi-chain examples.
-:::
-
-Before we begin, make sure that you have initialised your project using the provided steps in the [Start Here](../quickstart.md) section. **Please initialise a Ethereum project**. Previously, in the [1. Create a New Project](../quickstart.md) section, you must have noted [3 key files](../quickstart.md#_3-make-changes-to-your-project). Let's begin updating them one by one.
-
-As a prerequisite, you will need to generate types from the ABI files of each smart contract. You can obtain these ABI files by searching for the ABIs of the mentioned smart contract addresses on blockchain scanners. For instance, you can locate the ABI for the Galxy Ethereum SpaceStationV2 smart contract at the bottom of [this page](https://etherscan.io/address/0x75cdA57917E9F73705dc8BCF8A6B2f99AdBdc5a5#code). Additionally, you can kickstart your project by using the EVM Scaffolding approach (detailed [here](../quickstart.md#evm-project-scaffolding)). You'll find all the relevant events to be scaffolded in the documentation for each type of smart contract.
+For instance, you can locate the ABI for the Galxe Ethereum SpaceStationV2 smart contract at the bottom of [this page](https://etherscan.io/address/0x75cdA57917E9F73705dc8BCF8A6B2f99AdBdc5a5#code). Additionally, you can kickstart your project by using the EVM Scaffolding approach (detailed [here](../quickstart.md#evm-project-scaffolding)). You'll find all the relevant events to be scaffolded in the documentation for each type of smart contract.
::: tip Note
The configuration code snippets shared below have been made simpler to improve clarity and will focus exclusively on the NFT claim handling logic.
@@ -26,9 +20,7 @@ The configuration code snippets shared below have been made simpler to improve c
Check the final code repository [here](https://github.com/subquery/ethereum-subql-starter/tree/main/Multi-Chain/galxe) to observe the integration of all previously mentioned configurations into a unified codebase.
:::
-### 1.Configuring the Manifest Files
-
-Let's start by setting up an Ethereum indexer that we can later use for different chains. To do this, you need to configure handlers to index specific logs from the contracts.
+
Because there are numerous handlers with various configurations for each network, involving differences in available smart contracts, their addresses, start blocks, and protocol versions, the manifest files will be quite extensive. As a solution, we've developed a script that can generate the manifest files with the correct configurations automatically. You can find the steps to do this [here](https://github.com/subquery/ethereum-subql-starter/blob/main/Multi-Chain/galxe/README.md#add-your-chain).
@@ -189,11 +181,9 @@ dataSources:
:::
-::: tip Note
-Check out our [Manifest File](../../build/manifest/ethereum.md) documentation to get more information about the Project Manifest file.
-:::
+
-Then, create a [multi-chain manifest file](../../build/multi-chain#1-create-a-multi-chain-manifest-file). By following the steps outlined [here](../../build/multi-chain#3-add-a-new-network-to-the-multi-chain-manifest) (using the `subql multi-chain:add` command), start adding the new networks. After you successfuly apply the correct entities for each chain, you will end up with a single `subquery-multichain.yaml` file that we'll map to the individual chain manifest files. This multi-chain manifest file will look something like this:
+
::: code-tabs
@@ -592,11 +582,9 @@ dataSources:
:::
-As evident from the examples above, we employ various handlers for different chains, while keeping the indexed event logs the same. This approach is adopted to facilitate the identification of the originating network for each specific event (refer to this [tip](../../build/multi-chain#handling-network-specific-logic)). This strategy will prove beneficial later, as it allows us to incorporate a `network` field into the entities. This will simplify the execution of filtering, aggregation, and other data manipulation tasks.
-
-### 2. Updating the GraphQL Schema File
+
-The schema will consist of several objects, which will appear as follows:
+
```graphql
type SpaceStation @entity {
@@ -653,35 +641,10 @@ type NFTMintTransaction @entity {
The configuration defines several types for managing space stations, star NFTs, NFTs, campaigns, claim records, and NFT mint transactions. These types include fields like ID, version, claim, network, number, owner, campaign, verifyID, CID, user, transaction, block, timestamp, and more, all of which are used to organise and store information related to NFTs, their ownership, and related transactions on a blockchain network.
-SubQuery simplifies and ensures type-safety when working with GraphQL entities, smart contracts, events, transactions, and logs. The SubQuery CLI will generate types based on your project's GraphQL schema and any contract ABIs included in the data sources.
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn codegen
-```
-
-@tab npm
-
-```shell
-npm run-script codegen
-```
-
-:::
-
-This action will generate a new directory (or update the existing one) named `src/types`. Inside this directory, you will find automatically generated entity classes corresponding to each type defined in your `schema.graphql`. These classes facilitate type-safe operations for loading, reading, and writing entity fields. You can learn more about this process in [the GraphQL Schema section](../../build/graphql.md).
-
-You can conveniently import all these entities from the following directory:
+
```ts
import { StarNFT, NFT, ClaimRecord } from "../types";
-```
-
-It will also generate a class for every contract event, offering convenient access to event parameters, as well as information about the block and transaction from which the event originated. You can find detailed information on how this is achieved in the [EVM Codegen from ABIs](../../build/introduction.md#evm-codegen-from-abis) section. All of these types are stored in the `src/types/abi-interfaces` and `src/types/contracts` directories.
-
-```ts
-// Import a smart contract event class generated from provided ABIs
import {
EventClaimLog,
EventClaimBatchLog,
@@ -691,13 +654,7 @@ import {
} from "../types/abi-interfaces/SpaceStationV2";
```
-### 3. Writing the Mappings
-
-Mapping functions define how blockchain data is transformed into the optimized GraphQL entities that we previously defined in the `schema.graphql` file.
-
-::: tip Note
-For more information on mapping functions, please refer to our [Mappings](../../build/mapping/ethereum.md) documentation.
-:::
+
Creating mappings for this smart contract is a simple procedure. For added clarity, we have organised individual files for each protocol version in the `src/mappings` directory, specifically `spacestationv2.ts` and `spacestationv1.ts`. In essence, these files are not fundamentally different; they primarily vary in how they manage on-chain data. Let's analyse them separately, beginning with `spacestationv2.ts` since the second version is more pertinent.
@@ -1313,71 +1270,11 @@ The code you provided is similar to the code for the second version of the proto
Check the final code repository [here](https://github.com/subquery/ethereum-subql-starter/tree/main/Multi-Chain/galxe) to observe the integration of all previously mentioned configurations into a unified codebase.
:::
-## Build Your Project
-
-Next, build your work to run your new SubQuery project. Run the build command from the project's root directory as given here:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn build
-```
-
-@tab npm
-
-```shell
-npm run-script build
-```
-
-:::
-
-::: warning Important
-Whenever you make changes to your mapping functions, you must rebuild your project.
-:::
-
-Now, you are ready to run your first SubQuery project. Let’s check out the process of running your project in detail.
-
-## Run Your Project Locally with Docker
-
-Whenever you create a new SubQuery Project, first, you must run it locally on your computer and test it and using Docker is the easiest and quickiest way to do this.
-
-The `docker-compose.yml` file defines all the configurations that control how a SubQuery node runs. For a new project, which you have just initialised, you won't need to change anything.
+
-However, visit the [Running SubQuery Locally](../../run_publish/run.md) to get more information on the file and the settings.
+
-Run the following command under the project directory:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn start:docker
-```
-
-@tab npm
-
-```shell
-npm run-script start:docker
-```
-
-:::
-
-::: tip Note
-It may take a few minutes to download the required images and start the various nodes and Postgres databases.
-:::
-
-## Query your Project
-
-Next, let's query our project. Follow these three simple steps to query your SubQuery project:
-
-1. Open your browser and head to `http://localhost:3000`.
-
-2. You will see a GraphQL playground in the browser and the schemas which are ready to query.
-
-3. Find the _Docs_ tab on the right side of the playground which should open a documentation drawer. This documentation is automatically generated and it helps you find what entities and methods you can query.
-
-Try the following queries to understand how it works for your new SubQuery starter project. Don’t forget to learn more about the [GraphQL Query language](../../run_publish/query.md).
+
::: details Network Metadatas
@@ -1547,14 +1444,4 @@ Try the following queries to understand how it works for your new SubQuery start
Check the final code repository [here](https://github.com/subquery/ethereum-subql-starter/tree/main/Multi-Chain/galxe).
:::
-## What's next?
-
-Congratulations! You have now a locally running SubQuery project that indexes the Galxe NFTs from multiple blockchains and accepts GraphQL API requests.
-
-::: tip Tip
-
-Find out how to build a performant SubQuery project and avoid common mistakes in [Project Optimisation](../../build/optimisation.md).
-
-:::
-
-Click [here](../../quickstart/whats-next.md) to learn what should be your **next step** in your SubQuery journey.
+
diff --git a/docs/quickstart/quickstart_multichain/ibc-transfers.md b/docs/quickstart/quickstart_multichain/ibc-transfers.md
index ab294ef6739..537991b9b09 100644
--- a/docs/quickstart/quickstart_multichain/ibc-transfers.md
+++ b/docs/quickstart/quickstart_multichain/ibc-transfers.md
@@ -1,18 +1,20 @@
# Multichain Quick Start - IBC Transfers
+
+
+[IBC Starter Example](https://github.com/subquery/cosmos-subql-starter/tree/main/Multi-Chain/osmosis-cosmos-bridge)
+
+
+
This tutorial provides a comprehensive guide on establishing a multi-chain indexer for indexing Inter-Blockchain Communication (IBC) activities among Cosmos Zones. The tutorial demonstrates the integration of bi-directional transfers between Osmosis and Cosmos Hub, while also highlighting the flexibility to effortlessly include additional chains.
Upon completing this guide, you will gain insights into effectively correlating event data across multiple networks. Furthermore, you'll acquire the knowledge to configure a SubQuery indexer, enabling the monitoring, tracking, and aggregation of events from various Cosmos blockchains within a unified entity.
-
+
-::: tip
-The final code of this project can be found [here](https://github.com/subquery/cosmos-subql-starter/tree/main/Multi-Chain/osmosis-cosmos-bridge).
-:::
-
-
+
Beginning with Osmosis, the manifest file for this will be as follows:
@@ -94,7 +96,7 @@ dataSources:
Here, again, we are relying on the data of the same events. Events of both chains will be processed asynchronously, without a specific order, and will be matched according to their data.
-
+
```graphql
type BridgeEvent @entity {
@@ -117,7 +119,7 @@ The primary event is the `BridgeEvent`, which contains information about the exe
-
+
::: code-tabs
@tab:active `mappingHandlers.ts`
diff --git a/docs/quickstart/quickstart_multichain/polygon-plasma-bridge.md b/docs/quickstart/quickstart_multichain/polygon-plasma-bridge.md
index 1ff2ecfc9d4..c35fefd8b8a 100644
--- a/docs/quickstart/quickstart_multichain/polygon-plasma-bridge.md
+++ b/docs/quickstart/quickstart_multichain/polygon-plasma-bridge.md
@@ -4,29 +4,17 @@ This page explains how to create an multi-chain indexer to index the bridge tran
There are two types of bridge on Polygon for asset transfer, the Proof of Stake (PoS) Bridge and the Plasma Bridge. The PoS Bridge, as the name suggests, adopts the Proof of Stake (PoS) consensus algorithm to secure its network. Deposits on the PoS Bridge are completed almost instantly, but withdrawals may take a while to confirm. On the other hand, the Plasma Bridge supports the transfer of Polygon's native token `MATIC` and certain Ethereum tokens (`ETH`, ERC-20, and ERC-721). It uses the Ethereum Plasma scaling solution to offer increased security.
-## Setting Up the Indexer
+
Plasma bridge contracts have been deployed on both networks. In order to establish an indexer, we will need to asynchronously align the events from both of those smart contracts.
-::: warning Important
-**This project operates across multiple chains, making it more complex than other single chain examples.**
-
-If you are new to SubQuery, we recommend starting your learning journey with single-chain examples, such as the [Polygon Plasma Bridge Deposits](../quickstart_chains/polygon), where we handle the same event, from Polygon, but don't proceed with its matching on the Ethereum side. After understanding the fundamentals, you can then advance to exploring the multi-chain examples.
-:::
-
-Before we begin, make sure that you have initialised your project using the provided steps in the [Start Here](../quickstart.md) section. **Please initialise a Ethereum project**. Previously, in the [1. Create a New Project](../quickstart.md) section, you must have noted [3 key files](../quickstart.md#_3-make-changes-to-your-project). Let's begin updating them one by one.
-
-As a prerequisite, you will need to generate types from the ABI files of each smart contract. You can obtain these ABI files by searching for the ABIs of the mentioned smart contract addresses on blockchain scanners. For instance, you can locate the ABI for the Plasma Ethereum smart contract at the bottom of [this page](https://etherscan.io/address/0x401F6c983eA34274ec46f84D70b31C151321188b#code). Additionally, you can kickstart your project by using the EVM Scaffolding approach (detailed [here](../quickstart.md#evm-project-scaffolding)). You'll find all the relevant events to be scaffolded in the documentation for each type of smart contract.
+
::: tip Note
Check the final code repository [here](https://github.com/subquery/ethereum-subql-starter/tree/main/Multi-Chain/polygon-plasma-bridge) to observe the integration of all previously mentioned configurations into a unified codebase.
:::
-In this Plasma indexing project, our main focus is on configuring the indexer to exclusively capture a two logs generated in both Plasma smart contracts:
-
-### 1. Configuring the Manifest Files
-
-To begin, we will establish an Polygon indexer manifest file since we are discussing a case for transfers that are initiated on the Polygon side:
+
::: code-tabs
@@ -58,13 +46,11 @@ dataSources:
As you can see, we are only looking for a single log - `TokenDeposited`. Data from this log will consequently be compared with the one emitted on the Ethereum network.
-::: tip Note
-Check out our [Manifest File](../../build/manifest/ethereum.md) documentation to get more information about the Project Manifest file.
-:::
+
Next, change the name of the file mentioned above to `polygon.yaml` to indicate that this file holds the Polygon configuration.
-Then, create a [multi-chain manifest file](../../build/multi-chain#1-create-a-multi-chain-manifest-file). After, following the steps outlined [here](../../build/multi-chain#3-add-a-new-network-to-the-multi-chain-manifest), start adding the new networks. After you successfuly apply the correct entities for each chain, you will end up with a single `subquery-multichain.yaml` file that we'll map to the individual chain manifest files. This multi-chain manifest file will look something like this:
+
::: code-tabs
@@ -114,7 +100,9 @@ dataSources:
Here, again, we are relying on the data of the single log, `NewDepositBlock`. Both logs will be processed asynchronously, without a specific order, and will be matched according to their data.
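+
+As a rough illustration of that matching (the project's complete `mappingHandlers.ts` is covered further below), a handler on either chain can look up a shared record keyed by the deposit data and only create it if the other side has not been processed yet. The snippet below is a minimal, hedged sketch only — the entity name `BridgeTransaction` comes from this project, but the helper name, its fields, and the key derivation are assumptions, not the actual code.
+
+```ts
+// Hedged sketch: match two asynchronously indexed deposits via a shared, deterministic id.
+import { BridgeTransaction } from "../types";
+
+async function upsertBridgeTransaction(depositId: string): Promise<void> {
+  // Both the Polygon and the Ethereum handlers derive the same id from the log data,
+  // so whichever side is processed first creates the record and the other completes it.
+  let bridgeTx = await BridgeTransaction.get(depositId);
+  if (!bridgeTx) {
+    bridgeTx = BridgeTransaction.create({ id: depositId }); // remaining fields omitted (assumed)
+  }
+  await bridgeTx.save();
+}
+```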
-### 2. Updating the GraphQL Schema File
+
+
+
From the aforementioned logs the following entities can be derived:
@@ -155,26 +143,7 @@ type User @entity {
These types help organise and query information about deposits, transactions, and users within the SubQuery bridge indexer.
-SubQuery simplifies and ensures type-safety when working with GraphQL entities, smart contracts, events, transactions, and logs. The SubQuery CLI will generate types based on your project's GraphQL schema and any contract ABIs included in the data sources.
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn codegen
-```
-
-@tab npm
-
-```shell
-npm run-script codegen
-```
-
-:::
-
-This action will generate a new directory (or update the existing one) named `src/types`. Inside this directory, you will find automatically generated entity classes corresponding to each type defined in your `schema.graphql`. These classes facilitate type-safe operations for loading, reading, and writing entity fields. You can learn more about this process in [the GraphQL Schema section](../../build/graphql.md).
-
-You can conveniently import all these entities from the following directory:
+
```ts
import {
@@ -183,22 +152,11 @@ import {
DepositOnPolygon,
BridgeTransaction,
} from "../types";
-```
-
-It will also generate a class for every contract event, offering convenient access to event parameters, as well as information about the block and transaction from which the event originated. You can find detailed information on how this is achieved in the [EVM Codegen from ABIs](../../build/introduction.md#evm-codegen-from-abis) section. All of these types are stored in the `src/types/abi-interfaces` and `src/types/contracts` directories.
-
-```ts
import { TokenDepositedLog } from "../types/abi-interfaces/PlasmaAbi";
import { NewDepositBlockLog } from "../types/abi-interfaces/PlasmaEthAbi";
```
-### 3. Writing the Mappings
-
-Mapping functions define how blockchain data is transformed into the optimized GraphQL entities that we previously defined in the `schema.graphql` file.
-
-::: tip Note
-For more information on mapping functions, please refer to our [Mappings](../../build/mapping/ethereum.md) documentation.
-:::
+
Setting up mappings for this smart contract is straightforward. In this instance, the mappings are stored within the `src/mappings` directory, with the sole mapping file being `mappingHandlers.ts`. Now, let's take a closer look at it:
@@ -311,71 +269,11 @@ The `checkGetUser` function is defined to retrieve a user's record from a databa
Check the final code repository [here](https://github.com/subquery/ethereum-subql-starter/tree/main/Multi-Chain/polygon-plasma-bridge) to observe the integration of all previously mentioned configurations into a unified codebase.
:::
-## Build Your Project
-
-Next, build your work to run your new SubQuery project. Run the build command from the project's root directory as given here:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn build
-```
-
-@tab npm
-
-```shell
-npm run-script build
-```
-
-:::
-
-::: warning Important
-Whenever you make changes to your mapping functions, you must rebuild your project.
-:::
-
-Now, you are ready to run your first SubQuery project. Let’s check out the process of running your project in detail.
-
-## Run Your Project Locally with Docker
-
-Whenever you create a new SubQuery Project, first, you must run it locally on your computer and test it and using Docker is the easiest and quickiest way to do this.
-
-The `docker-compose.yml` file defines all the configurations that control how a SubQuery node runs. For a new project, which you have just initialised, you won't need to change anything.
-
-However, visit the [Running SubQuery Locally](../../run_publish/run.md) to get more information on the file and the settings.
-
-Run the following command under the project directory:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn start:docker
-```
-
-@tab npm
-
-```shell
-npm run-script start:docker
-```
-
-:::
-
-::: tip Note
-It may take a few minutes to download the required images and start the various nodes and Postgres databases.
-:::
-
-## Query your Project
-
-Next, let's query our project. Follow these three simple steps to query your SubQuery project:
+
-1. Open your browser and head to `http://localhost:3000`.
+
-2. You will see a GraphQL playground in the browser and the schemas which are ready to query.
-
-3. Find the _Docs_ tab on the right side of the playground which should open a documentation drawer. This documentation is automatically generated and it helps you find what entities and methods you can query.
-
-Try the following queries to understand how it works for your new SubQuery starter project. Don’t forget to learn more about the [GraphQL Query language](../../run_publish/query.md).
+
::: details Bridge Transactions
@@ -493,14 +391,4 @@ Try the following queries to understand how it works for your new SubQuery start
Check the final code repository [here](https://github.com/subquery/ethereum-subql-starter/tree/main/Multi-Chain/polygon-plasma-bridge).
:::
-## What's next?
-
-Well done! You've successfully set up a SubQuery project that's locally running. This project indexes the Plasma Bridge smart contracts. What's even more impressive is that it accomplishes this from multiple blockchains and allows GraphQL API requests to be made from a single endpoint.
-
-::: tip Tip
-
-Find out how to build a performant SubQuery project and avoid common mistakes in [Project Optimisation](../../build/optimisation.md).
-
-:::
-
-Click [here](../../quickstart/whats-next.md) to learn what should be your **next step** in your SubQuery journey.
+
diff --git a/docs/quickstart/quickstart_multichain/safe.md b/docs/quickstart/quickstart_multichain/safe.md
index 67d889d9120..0a690777069 100644
--- a/docs/quickstart/quickstart_multichain/safe.md
+++ b/docs/quickstart/quickstart_multichain/safe.md
@@ -1,24 +1,18 @@
# Multichain Quick Start - Safe
-## Goals
-
This page explains how to create a multi-chain indexer for [Safe](https://safe.global/), a system that makes secure wallets requiring multiple authorisations. This boosts security and lowers the risk of unauthorised use.
After reading this guide, you'll understand the protocol, know about multi-signature setups, and learn how to set up a SubQuery indexer to monitor and track signed message events on different EVM blockchains.
-## Setting Up the Indexer
+
Safe factory contracts have been deployed on various blockchain networks, sometimes using different contract addresses. Nevertheless, as the same smart contract was utilised, every instance retains the same collection of functions and events.
-::: warning Important
-**This project operates across multiple chains, making it more complex than other single chain examples.**
-
-If you are new to SubQuery, we recommend starting your learning journey with single-chain examples, such as the [Ethereum Gravatar example](../quickstart_chains/ethereum-gravatar.md). After understanding the fundamentals, you can then advance to exploring the multi-chain examples.
-:::
+
-Before we begin, make sure that you have initialised your project using the provided steps in the [Start Here](../quickstart.md) section. **Please initialise a Ethereum project**. Previously, in the [1. Create a New Project](../quickstart.md) section, you must have noted [3 key files](../quickstart.md#_3-make-changes-to-your-project). Let's begin updating them one by one.
+As a prerequisite, you will need to generate types from the ABI files of each smart contract. You can obtain these ABI files by searching for the ABIs of the mentioned smart contract addresses on blockchain scanners.
-As a prerequisite, you will need to generate types from the ABI files of each smart contract. You can obtain these ABI files by searching for the ABIs of the mentioned smart contract addresses on blockchain scanners. For instance, you can locate the ABI for the Safe Ethereum smart contract at the bottom of [this page](https://etherscan.io/address/0x12302fE9c02ff50939BaAaaf415fc226C078613C#code). Additionally, you can kickstart your project by using the EVM Scaffolding approach (detailed [here](../quickstart.md#evm-project-scaffolding)). You'll find all the relevant events to be scaffolded in the documentation for each type of smart contract.
+For instance, you can locate the ABI for the Safe Ethereum smart contract at the bottom of [this page](https://etherscan.io/address/0x12302fE9c02ff50939BaAaaf415fc226C078613C#code). Additionally, you can kickstart your project by using the EVM Scaffolding approach (detailed [here](../quickstart.md#evm-project-scaffolding)). You'll find all the relevant events to be scaffolded in the documentation for each type of smart contract.
::: tip Note
Check the final code repository [here](https://github.com/subquery/ethereum-subql-starter/tree/main/Multi-Chain/safe) to observe the integration of all previously mentioned configurations into a unified codebase.
@@ -32,7 +26,7 @@ In this Safe indexing project, our primary focus lies in configuring the indexer
2. **Individual Safe Smart Contracts**: These contracts encompass all the essential functionality needed for establishing and executing Safe transactions.
-### 1. Configuring the Manifest Files
+
To begin, we will establish an Ethereum indexer. As Safe proxies have undergone multiple updates, the indexing process necessitates the configuration of three handlers. In this illustration, we introduce specific smart contracts along with their respective addresses and logs:
@@ -127,13 +121,11 @@ templates:
:::
-::: tip Note
-Check out our [Manifest File](../../build/manifest/ethereum.md) documentation to get more information about the Project Manifest file.
-:::
+
Next, change the name of the file mentioned above to `ethereum.yaml` to indicate that this file holds the Ethereum configuration.
-Then, create a [multi-chain manifest file](../../build/multi-chain#1-create-a-multi-chain-manifest-file). After, following the steps outlined [here](../../build/multi-chain#3-add-a-new-network-to-the-multi-chain-manifest), start adding the new networks. After you successfuly apply the correct entities for each chain, you will end up with a single `subquery-multichain.yaml` file that we'll map to the individual chain manifest files. This multi-chain manifest file will look something like this:
+
::: code-tabs
@@ -371,11 +363,9 @@ repository: https://github.com/subquery/ethereum-subql-starter
:::
-As evident from the examples above, we employ various handlers for different chains, while keeping the indexed event logs the same. This approach is adopted to facilitate the identification of the originating network for each specific event (refer to this [tip](../../build/multi-chain#handling-network-specific-logic)). This strategy will prove beneficial later, as it allows us to incorporate a `network` field into the entities. This will simplify the execution of filtering, aggregation, and other data manipulation tasks.
+
-### 2. Updating the GraphQL Schema File
-
-For the sake of simplicity, the schema will consist of just one object, which will appear as follows.
+
```graphql
type Sig @entity {
@@ -389,36 +379,10 @@ type Sig @entity {
This single object is `Sig`, containing several parameters to be filled from on-chain data. Additionally, it will include a `network` attribute explicitly provided through mapping logic.
-SubQuery simplifies and ensures type-safety when working with GraphQL entities, smart contracts, events, transactions, and logs. The SubQuery CLI will generate types based on your project's GraphQL schema and any contract ABIs included in the data sources.
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn codegen
-```
-
-@tab npm
-
-```shell
-npm run-script codegen
-```
-
-:::
-
-This action will generate a new directory (or update the existing one) named `src/types`. Inside this directory, you will find automatically generated entity classes corresponding to each type defined in your `schema.graphql`. These classes facilitate type-safe operations for loading, reading, and writing entity fields. You can learn more about this process in [the GraphQL Schema section](../../build/graphql.md).
-
-You can conveniently import all these entities from the following directory:
+
```ts
import { Sig } from "../types";
-```
-
-It will also generate a class for every contract event, offering convenient access to event parameters, as well as information about the block and transaction from which the event originated. You can find detailed information on how this is achieved in the [EVM Codegen from ABIs](../../build/introduction.md#evm-codegen-from-abis) section. All of these types are stored in the `src/types/abi-interfaces` and `src/types/contracts` directories.
-
-```ts
-// Import a smart contract event class generated from provided ABIs
-
import { ProxyCreationLog as ProxyCreation_v1_0_0 } from "../types/abi-interfaces/GnosisSafeProxyFactory_v100";
import { ProxyCreationLog as ProxyCreation_v1_1_1 } from "../types/abi-interfaces/GnosisSafeProxyFactory_v111";
import { ProxyCreationLog as ProxyCreation_v1_3_0 } from "../types/abi-interfaces/GnosisSafeProxyFactory_v130";
@@ -426,13 +390,7 @@ import { ProxyCreationLog as ProxyCreation_v1_3_0 } from "../types/abi-interface
import { SignMsgLog } from "../types/abi-interfaces/GnosisSafe";
```
-### 3. Writing the Mappings
-
-Mapping functions define how blockchain data is transformed into the optimized GraphQL entities that we previously defined in the `schema.graphql` file.
-
-::: tip Note
-For more information on mapping functions, please refer to our [Mappings](../../build/mapping/ethereum.md) documentation.
-:::
+
Setting up mappings for this smart contract is straightforward. In this instance, the mappings are stored within the `src/mappings` directory, with the sole mapping file being `factory.ts`. Now, let's take a closer look at it:
@@ -523,71 +481,11 @@ This code essentially centralises the handling of `SignMsg` events for various n
Check the final code repository [here](https://github.com/subquery/ethereum-subql-starter/tree/main/Multi-Chain/safe) to observe the integration of all previously mentioned configurations into a unified codebase.
:::
-## Build Your Project
-
-Next, build your work to run your new SubQuery project. Run the build command from the project's root directory as given here:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn build
-```
-
-@tab npm
-
-```shell
-npm run-script build
-```
-
-:::
-
-::: warning Important
-Whenever you make changes to your mapping functions, you must rebuild your project.
-:::
-
-Now, you are ready to run your first SubQuery project. Let’s check out the process of running your project in detail.
-
-## Run Your Project Locally with Docker
-
-Whenever you create a new SubQuery Project, first, you must run it locally on your computer and test it and using Docker is the easiest and quickiest way to do this.
-
-The `docker-compose.yml` file defines all the configurations that control how a SubQuery node runs. For a new project, which you have just initialised, you won't need to change anything.
+
-However, visit the [Running SubQuery Locally](../../run_publish/run.md) to get more information on the file and the settings.
+
-Run the following command under the project directory:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn start:docker
-```
-
-@tab npm
-
-```shell
-npm run-script start:docker
-```
-
-:::
-
-::: tip Note
-It may take a few minutes to download the required images and start the various nodes and Postgres databases.
-:::
-
-## Query your Project
-
-Next, let's query our project. Follow these three simple steps to query your SubQuery project:
-
-1. Open your browser and head to `http://localhost:3000`.
-
-2. You will see a GraphQL playground in the browser and the schemas which are ready to query.
-
-3. Find the _Docs_ tab on the right side of the playground which should open a documentation drawer. This documentation is automatically generated and it helps you find what entities and methods you can query.
-
-Try the following queries to understand how it works for your new SubQuery starter project. Don’t forget to learn more about the [GraphQL Query language](../../run_publish/query.md).
+
::: details Sigs
@@ -710,14 +608,4 @@ Try the following queries to understand how it works for your new SubQuery start
:::
-## What's next?
-
-Well done! You've successfully set up a SubQuery project that's locally running. This project indexes the Safe proxy smart contracts responsible for creating individual Safe contracts. For each created smart contract, it indexes the sign event and stores it in a dedicated entity. What's even more impressive is that it accomplishes this from multiple blockchains and allows GraphQL API requests to be made from a single endpoint.
-
-::: tip Tip
-
-Find out how to build a performant SubQuery project and avoid common mistakes in [Project Optimisation](../../build/optimisation.md).
-
-:::
-
-Click [here](../../quickstart/whats-next.md) to learn what should be your **next step** in your SubQuery journey.
+
diff --git a/docs/quickstart/quickstart_multichain/snapshot.md b/docs/quickstart/quickstart_multichain/snapshot.md
index 0b2e28bdb1a..ff39610f926 100644
--- a/docs/quickstart/quickstart_multichain/snapshot.md
+++ b/docs/quickstart/quickstart_multichain/snapshot.md
@@ -1,32 +1,22 @@
# Multichain Quick Start - Snapshot
-## Goals
-
The objective of this article is to provide a comprehensive, step-by-step manual for establishing a multi-chain indexer compatible with [Snapshot](https://docs.snapshot.org/), a voting platform that facilitates effortless and gas-free voting for DAOs, DeFi protocols, and NFT communities.
An integral component of this platform is the concept of delegations, which allows users to entrust their voting authority to another wallet. Unlike other actions within Snapshot, the act of delegation occurs directly on the blockchain.
By the conclusion of this guide, you will gain insights into Snapshot, understand the intricacies of delegation, and acquire the knowledge necessary to configure a SubQuery indexer capable of monitoring and indexing delegation-related events across multiple blockchains.
-## Setting Up the Indexer
+
Snapshot has been implemented across multiple blockchain networks, occasionally with distinct contract addresses. However, because the identical smart contract was employed, each instance maintains an identical set of methods and events.
-::: warning Important
-**This project operates across multiple chains, making it more complex than other single chain examples.**
-
-If you are new to SubQuery, we recommend starting your learning journey with single-chain examples, such as the [Ethereum Gravatar example](../quickstart_chains/ethereum-gravatar.md). After understanding the fundamentals, you can then advance to exploring the multi-chain examples.
-:::
-
-Before we begin, make sure that you have initialised your project using the provided steps in the [Start Here](../quickstart.md) section. **Please initialise a Ethereum project**. Previously, in the [1. Create a New Project](../quickstart.md) section, you must have noted [3 key files](../quickstart.md#_3-make-changes-to-your-project). Let's begin updating them one by one.
-
-As a prerequisite, you will need to generate types from the ABI files of each smart contract. You can obtain these ABI files by searching for the ABIs of the mentioned smart contract addresses on blockchain scanners. For instance, you can locate the ABI for the Snapshot Ethereum smart contract at the bottom of [this page](https://etherscan.io/address/0x469788fE6E9E9681C6ebF3bF78e7Fd26Fc015446#code). Additionally, you can kickstart your project by using the EVM Scaffolding approach (detailed [here](../quickstart.md#evm-project-scaffolding)). You'll find all the relevant events to be scaffolded in the documentation for each type of smart contract.
+
::: tip Note
Check the final code repository [here](https://github.com/subquery/ethereum-subql-starter/tree/main/Multi-Chain/snapshot) to observe the integration of all previously mentioned configurations into a unified codebase.
:::
-### 1.Configuring the Manifest Files
+
Let's start by setting up an Ethereum indexer that we can later use for different chains. To do this, you only need to configure two handlers to index specific logs from the contract, namely the `SetDelegate` and `ClearDelegate` logs. Update your manifest file as shown below:
@@ -61,13 +51,11 @@ dataSources:
:::
-::: tip Note
-Check out our [Manifest File](../../build/manifest/ethereum.md) documentation to get more information about the Project Manifest file.
-:::
+
Next, change the name of the file mentioned above to `ethereum.yaml` to indicate that this file holds the Ethereum configuration.
-Then, create a [multi-chain manifest file](../../build/multi-chain#1-create-a-multi-chain-manifest-file). After, following the steps outlined [here](../../build/multi-chain#3-add-a-new-network-to-the-multi-chain-manifest), start adding the new networks. After you Successfuly apply the correct entities for each chain, you will end up with a single `subquery-multichain.yaml` file that we'll map to the individual chain manifest files. This multi-chain manifest file will look something like this:
+
::: code-tabs
@@ -235,11 +223,9 @@ repository: https://github.com/subquery/ethereum-subql-starter
:::
-As evident from the examples above, we employ various handlers for different chains, while keeping the indexed event logs the same. This approach is adopted to facilitate the identification of the originating network for each specific event (refer to this [tip](../../build/multi-chain#handling-network-specific-logic)). This strategy will prove beneficial later, as it allows us to incorporate a `network` field into the entities. This will simplify the execution of filtering, aggregation, and other data manipulation tasks.
+
-### 2. Updating the GraphQL Schema File
-
-For the sake of simplicity, the schema will consist of just one object, which will appear as follows.
+
```graphql
type Delegation @entity {
@@ -254,45 +240,15 @@ type Delegation @entity {
This single object is `Delegation`, containing several parameters to be filled from on-chain data. Additionally, it will include a `network` attribute explicitly provided through mapping logic.
-SubQuery simplifies and ensures type-safety when working with GraphQL entities, smart contracts, events, transactions, and logs. The SubQuery CLI will generate types based on your project's GraphQL schema and any contract ABIs included in the data sources.
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn codegen
-```
-
-@tab npm
-
-```shell
-npm run-script codegen
-```
-
-:::
-
-This action will generate a new directory (or update the existing one) named `src/types`. Inside this directory, you will find automatically generated entity classes corresponding to each type defined in your `schema.graphql`. These classes facilitate type-safe operations for loading, reading, and writing entity fields. You can learn more about this process in [the GraphQL Schema section](../../build/graphql.md).
-
-You can conveniently import all these entities from the following directory:
+
```ts
import { Delegation } from "../types";
-```
-
-It will also generate a class for every contract event, offering convenient access to event parameters, as well as information about the block and transaction from which the event originated. You can find detailed information on how this is achieved in the [EVM Codegen from ABIs](../../build/introduction.md#evm-codegen-from-abis) section. All of these types are stored in the `src/types/abi-interfaces` and `src/types/contracts` directories.
-
-```ts
// Import a smart contract event class generated from provided ABIs
import { SetDelegateLog } from "../types/abi-interfaces/DelegateRegistry";
```
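
As a quick aside before the project's actual mappings are shown below, this sketch illustrates what such a generated log class provides: decoded event arguments plus block and transaction metadata, all typed. The handler name is a placeholder, and the `delegator`/`delegate` argument names are assumptions based on the `SetDelegate` event discussed above.

```ts
import { SetDelegateLog } from "../types/abi-interfaces/DelegateRegistry";

// Sketch only: reads decoded arguments and transaction metadata from a typed log.
// `logger` is the global logger injected by the SubQuery node at runtime.
export async function inspectSetDelegate(log: SetDelegateLog): Promise<void> {
  const delegator = log.args!.delegator; // assumed argument name
  const delegate = log.args!.delegate; // assumed argument name
  logger.info(
    `SetDelegate in tx ${log.transactionHash} (block ${log.blockNumber}): ${delegator} -> ${delegate}`
  );
}
```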
-### 3. Writing the Mappings
-
-Mapping functions define how blockchain data is transformed into the optimized GraphQL entities that we previously defined in the `schema.graphql` file.
-
-::: tip Note
-For more information on mapping functions, please refer to our [Mappings](../../build/mapping/ethereum.md) documentation.
-:::
+
Creating mappings for this smart contract is a simple procedure. For added clarity, we have organised individual files for each event in the `src/mappings` directory, specifically `clearDelegate.ts` and `setDelegate.ts`. Let's examine them individually.
@@ -474,71 +430,11 @@ Similar to the approach taken in the [`setDelegate.ts`](#setdelegratets) file, t
Check the final code repository [here](https://github.com/subquery/ethereum-subql-starter/tree/main/Multi-Chain/snapshot) to observe the integration of all previously mentioned configurations into a unified codebase.
:::
-## Build Your Project
-
-Next, build your work to run your new SubQuery project. Run the build command from the project's root directory as given here:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn build
-```
+
-@tab npm
+
-```shell
-npm run-script build
-```
-
-:::
-
-::: warning Important
-Whenever you make changes to your mapping functions, you must rebuild your project.
-:::
-
-Now, you are ready to run your first SubQuery project. Let’s check out the process of running your project in detail.
-
-## Run Your Project Locally with Docker
-
-Whenever you create a new SubQuery Project, first, you must run it locally on your computer and test it and using Docker is the easiest and quickiest way to do this.
-
-The `docker-compose.yml` file defines all the configurations that control how a SubQuery node runs. For a new project, which you have just initialised, you won't need to change anything.
-
-However, visit the [Running SubQuery Locally](../../run_publish/run.md) to get more information on the file and the settings.
-
-Run the following command under the project directory:
-
-::: code-tabs
-@tab:active yarn
-
-```shell
-yarn start:docker
-```
-
-@tab npm
-
-```shell
-npm run-script start:docker
-```
-
-:::
-
-::: tip Note
-It may take a few minutes to download the required images and start the various nodes and Postgres databases.
-:::
-
-## Query your Project
-
-Next, let's query our project. Follow these three simple steps to query your SubQuery project:
-
-1. Open your browser and head to `http://localhost:3000`.
-
-2. You will see a GraphQL playground in the browser and the schemas which are ready to query.
-
-3. Find the _Docs_ tab on the right side of the playground which should open a documentation drawer. This documentation is automatically generated and it helps you find what entities and methods you can query.
-
-Try the following queries to understand how it works for your new SubQuery starter project. Don’t forget to learn more about the [GraphQL Query language](../../run_publish/query.md).
+
::: details Delegations
@@ -666,14 +562,4 @@ Try the following queries to understand how it works for your new SubQuery start
:::
-## What's next?
-
-Congratulations! You have now a locally running SubQuery project that indexes the Snapshot delegation entitiy from multiple blockchains and accepts GraphQL API requests.
-
-::: tip Tip
-
-Find out how to build a performant SubQuery project and avoid common mistakes in [Project Optimisation](../../build/optimisation.md).
-
-:::
-
-Click [here](../../quickstart/whats-next.md) to learn what should be your **next step** in your SubQuery journey.
+
diff --git a/docs/quickstart/quickstart_multichain/snippets/multi-chain-cosmos-manifest-intro.md b/docs/quickstart/quickstart_multichain/snippets/multi-chain-cosmos-manifest-intro.md
new file mode 100644
index 00000000000..0f9155d3ad3
--- /dev/null
+++ b/docs/quickstart/quickstart_multichain/snippets/multi-chain-cosmos-manifest-intro.md
@@ -0,0 +1,29 @@
+
+
+
+
+The Multichain project contains multiple manifest files, with support for the following handlers:
+
+
+
+
+
+
+
+
+
+The Multichain project contains multiple manifest files, with support for the following handlers:
+
+
+
+
+
+
+
+
+
+The Multichain project contains multiple manifest files, with support for the following handlers:
+
+
+
+
diff --git a/docs/quickstart/quickstart_multichain/snippets/multi-chain-evm-manifest-intro.md b/docs/quickstart/quickstart_multichain/snippets/multi-chain-evm-manifest-intro.md
new file mode 100644
index 00000000000..0f9155d3ad3
--- /dev/null
+++ b/docs/quickstart/quickstart_multichain/snippets/multi-chain-evm-manifest-intro.md
@@ -0,0 +1,29 @@
+
+
+
+
+The Multichain project contains multiple manifest files, with support for the following handlers:
+
+
+
+
+
+
+
+
+
+The Multichain project contains multiple manifest files, with support for the following handlers:
+
+
+
+
+
+
+
+
+
+The Multichain project contains multiple manifest files, with support for the following handlers:
+
+
+
+
diff --git a/docs/quickstart/quickstart_multichain/snippets/multi-chain-manifest-cosmos-intro.md b/docs/quickstart/quickstart_multichain/snippets/multi-chain-manifest-cosmos-intro.md
deleted file mode 100644
index b2e8132c07b..00000000000
--- a/docs/quickstart/quickstart_multichain/snippets/multi-chain-manifest-cosmos-intro.md
+++ /dev/null
@@ -1,3 +0,0 @@
-
-
-
diff --git a/docs/quickstart/quickstart_multichain/snippets/multi-chain-manifest-intro.md b/docs/quickstart/quickstart_multichain/snippets/multi-chain-manifest-intro.md
deleted file mode 100644
index d4472ec43e3..00000000000
--- a/docs/quickstart/quickstart_multichain/snippets/multi-chain-manifest-intro.md
+++ /dev/null
@@ -1,3 +0,0 @@
-## Your Project Manifest Files
-
-The Project Manifests files is an entry point to your project. It defines most of the details on how SubQuery will index and transform the chain data. For Cosmos chains, there are four types of mapping handlers (and you can have more than one in each project):
diff --git a/docs/quickstart/quickstart_multichain/snippets/multi-chain-network-origin-note.md b/docs/quickstart/quickstart_multichain/snippets/multi-chain-network-origin-note.md
new file mode 100644
index 00000000000..e730afcbb84
--- /dev/null
+++ b/docs/quickstart/quickstart_multichain/snippets/multi-chain-network-origin-note.md
@@ -0,0 +1 @@
+As evident from the examples above, we employ various handlers for different chains, while keeping the indexed event logs the same. This approach is adopted to facilitate the identification of the originating network for each specific event (refer to this [tip](../../build/multi-chain#handling-network-specific-logic)). This strategy will prove beneficial later, as it allows us to incorporate a `network` field into the entities. This will simplify the execution of filtering, aggregation, and other data manipulation tasks.
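
A minimal sketch of that pattern, with illustrative handler and network names and the generic `EthereumLog` type standing in for the project-specific log classes:

```ts
import { EthereumLog } from "@subql/types-ethereum";

// Shared implementation: builds the entity and tags it with the originating
// network (sketch only; entity logic omitted). `logger` is the global logger
// injected by the SubQuery node at runtime.
async function handleLog(log: EthereumLog, network: string): Promise<void> {
  // create or update the relevant entity from `log` here, setting its `network` field
  logger.info(`Handling log ${log.transactionHash} from ${network}`);
}

// Thin per-network wrappers, each referenced from the corresponding chain's manifest file
export async function handleLogEthereum(log: EthereumLog): Promise<void> {
  await handleLog(log, "ethereum");
}

export async function handleLogGnosis(log: EthereumLog): Promise<void> {
  await handleLog(log, "gnosis");
}
```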
diff --git a/docs/quickstart/quickstart_multichain/snippets/multi-chain-quickstart-reference.md b/docs/quickstart/quickstart_multichain/snippets/multi-chain-quickstart-reference.md
index 1409a5a4732..923669e7867 100644
--- a/docs/quickstart/quickstart_multichain/snippets/multi-chain-quickstart-reference.md
+++ b/docs/quickstart/quickstart_multichain/snippets/multi-chain-quickstart-reference.md
@@ -1,4 +1,4 @@
-## Setting Up the Indexer
-
::: warning Important
**This project operates across multiple chains, making it more complex than other single chain examples.**
+
+:::
diff --git a/docs/quickstart/snippets/arbitrum-manifest-note.md b/docs/quickstart/snippets/arbitrum-manifest-note.md
new file mode 100644
index 00000000000..9a7b0ab139d
--- /dev/null
+++ b/docs/quickstart/snippets/arbitrum-manifest-note.md
@@ -0,0 +1 @@
+Check out our [Manifest File](../../build/manifest/arbitrum.md) documentation to get more information about the Project Manifest (`project.ts`) file.
diff --git a/docs/quickstart/snippets/arbitrum-mapping-note.md b/docs/quickstart/snippets/arbitrum-mapping-note.md
new file mode 100644
index 00000000000..15244f27c07
--- /dev/null
+++ b/docs/quickstart/snippets/arbitrum-mapping-note.md
@@ -0,0 +1 @@
+Check out our [Mappings](../../build/mapping/arbitrum.md) documentation to get more information on mapping functions.
diff --git a/docs/quickstart/snippets/avalanche-manifest-note.md b/docs/quickstart/snippets/avalanche-manifest-note.md
new file mode 100644
index 00000000000..67da3618e03
--- /dev/null
+++ b/docs/quickstart/snippets/avalanche-manifest-note.md
@@ -0,0 +1 @@
+Check out our [Manifest File](../../build/manifest/avalanche.md) documentation to get more information about the Project Manifest (`project.ts`) file.
diff --git a/docs/quickstart/snippets/avalanche-mapping-note.md b/docs/quickstart/snippets/avalanche-mapping-note.md
new file mode 100644
index 00000000000..28bb934d6e6
--- /dev/null
+++ b/docs/quickstart/snippets/avalanche-mapping-note.md
@@ -0,0 +1 @@
+Check out our [Mappings](../../build/mapping/avalanche.md) documentation to get more information on mapping functions.
diff --git a/docs/quickstart/snippets/bsc-manifest-note.md b/docs/quickstart/snippets/bsc-manifest-note.md
new file mode 100644
index 00000000000..dbb82c54785
--- /dev/null
+++ b/docs/quickstart/snippets/bsc-manifest-note.md
@@ -0,0 +1 @@
+Check out our [Manifest File](../../build/manifest/gnosis.md) documentation to get more information about the Project Manifest (`project.ts`) file.
diff --git a/docs/quickstart/snippets/bsc-mapping-note.md b/docs/quickstart/snippets/bsc-mapping-note.md
new file mode 100644
index 00000000000..96859d5c21a
--- /dev/null
+++ b/docs/quickstart/snippets/bsc-mapping-note.md
@@ -0,0 +1,3 @@
+::: tip Note
+For more information on mapping functions, please refer to our [Mappings](../../build/mapping/gnosis.md) documentation.
+:::
diff --git a/docs/quickstart/snippets/cosmos-manifest-intro.md b/docs/quickstart/snippets/cosmos-manifest-intro.md
new file mode 100644
index 00000000000..ad2f617cff2
--- /dev/null
+++ b/docs/quickstart/snippets/cosmos-manifest-intro.md
@@ -0,0 +1,23 @@
+
+
+
+
+For Cosmos chains, there are four types of mapping handlers (and you can have more than one in each project):
+
+
+
+Note that the manifest file has already been set up correctly and doesn’t require significant changes, but you need to change the datasource handlers. This section lists the triggers that the manifest file looks for on the blockchain to start indexing.
+
+
+
+
+
+
+
+For Cosmos chains, there are four types of mapping handlers (and you can have more than one in each project):
+
+
+
+Note that the manifest file has already been set up correctly and doesn’t require significant changes, but you need to change the datasource handlers. This section lists the triggers that the manifest file looks for on the blockchain to start indexing.
+
+
diff --git a/docs/quickstart/snippets/cosmos-manifest-note.md b/docs/quickstart/snippets/cosmos-manifest-note.md
new file mode 100644
index 00000000000..5d900091ba0
--- /dev/null
+++ b/docs/quickstart/snippets/cosmos-manifest-note.md
@@ -0,0 +1 @@
+Check out our [Manifest File](../../build/manifest/cosmos.md) documentation to get more information about the Project Manifest (`project.ts`) file.
diff --git a/docs/quickstart/snippets/cosmos-mapping-intro.md b/docs/quickstart/snippets/cosmos-mapping-intro.md
index 8e5a506c720..152c86d5fb5 100644
--- a/docs/quickstart/snippets/cosmos-mapping-intro.md
+++ b/docs/quickstart/snippets/cosmos-mapping-intro.md
@@ -1,7 +1,11 @@
-
+
+
+
::: tip Note
Check out our [Mappings](../../build/mapping/cosmos.md) documentation to get more information on mapping functions.
:::
Navigate to the default mapping function in the `src/mappings` directory. Setting up mappings for Cosmos chains is straightforward. In this instance, the sole mapping file is `mappingHandlers.ts`. Now, let's take a closer look at it:
+
+
diff --git a/docs/quickstart/snippets/cosmos-mapping-note.md b/docs/quickstart/snippets/cosmos-mapping-note.md
new file mode 100644
index 00000000000..0d2c75cc3fd
--- /dev/null
+++ b/docs/quickstart/snippets/cosmos-mapping-note.md
@@ -0,0 +1 @@
+Check out our [Mappings](../../build/mapping/cosmos.md) documentation to get more information on mapping functions.
diff --git a/docs/quickstart/snippets/cosmos-quickstart-reference.md b/docs/quickstart/snippets/cosmos-quickstart-reference.md
index b53a8661c4e..488c4000f1f 100644
--- a/docs/quickstart/snippets/cosmos-quickstart-reference.md
+++ b/docs/quickstart/snippets/cosmos-quickstart-reference.md
@@ -1,3 +1,5 @@
-Osmosis based on the Cosmos SDK, which means you can index chain data via the standard Cosmos RPC interface.
+::: info
+This network is based on the Cosmos SDK, which means you can index chain data via the standard Cosmos RPC interface.
Before we begin, make sure that you have initialised your project using the provided steps in the **[Start Here](../quickstart.md)** section. You must complete the suggested [4 steps](https://github.com/subquery/cosmos-subql-starter#readme) for Cosmos users.
+:::
diff --git a/docs/quickstart/snippets/ethereum-manifest-note.md b/docs/quickstart/snippets/ethereum-manifest-note.md
new file mode 100644
index 00000000000..d276fbc9dd4
--- /dev/null
+++ b/docs/quickstart/snippets/ethereum-manifest-note.md
@@ -0,0 +1 @@
+Check out our [Manifest File](../../build/manifest/ethereum.md) documentation to get more information about the Project Manifest (`project.ts`) file.
diff --git a/docs/quickstart/snippets/ethereum-mapping-note.md b/docs/quickstart/snippets/ethereum-mapping-note.md
new file mode 100644
index 00000000000..3276b13374f
--- /dev/null
+++ b/docs/quickstart/snippets/ethereum-mapping-note.md
@@ -0,0 +1,3 @@
+::: tip Note
+For more information on mapping functions, please refer to our [Mappings](../../build/mapping/ethereum.md) documentation.
+:::
diff --git a/docs/quickstart/snippets/evm-codegen.md b/docs/quickstart/snippets/evm-codegen.md
index 7566d261e21..b3a34aa2e6a 100644
--- a/docs/quickstart/snippets/evm-codegen.md
+++ b/docs/quickstart/snippets/evm-codegen.md
@@ -3,3 +3,7 @@ SubQuery simplifies and ensures type-safety when working with GraphQL entities,
This action will generate a new directory (or update the existing one) named `src/types`. Inside this directory, you will find automatically generated entity classes corresponding to each type defined in your `schema.graphql`. These classes facilitate type-safe operations for loading, reading, and writing entity fields. You can learn more about this process in [the GraphQL Schema section](../../build/graphql.md).
+
+It will also generate a class for every contract event, offering convenient access to event parameters, as well as information about the block and transaction from which the event originated. You can find detailed information on how this is achieved in the [EVM Codegen from ABIs](../../build/introduction.md#evm-codegen-from-abis) section. All of these types are stored in the `src/types/abi-interfaces` and `src/types/contracts` directories.
+
+You can conveniently import all these types:
diff --git a/docs/quickstart/snippets/evm-handlers.md b/docs/quickstart/snippets/evm-handlers.md
new file mode 100644
index 00000000000..e2b83b862f6
--- /dev/null
+++ b/docs/quickstart/snippets/evm-handlers.md
@@ -0,0 +1,3 @@
+- [BlockHandlers](../../build/manifest/ethereum.md#mapping-handlers-and-filters): On each and every block, run a mapping function
+- [TransactionHandlers](../../build/manifest/ethereum.md#mapping-handlers-and-filters): On each and every transaction that matches optional filter criteria, run a mapping function
+- [LogHandlers](../../build/manifest/ethereum.md#mapping-handlers-and-filters): On each and every log that matches optional filter criteria, run a mapping function
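
In mapping code, each of these handler kinds corresponds to an async function that receives the matching payload. A rough sketch, assuming the standard `@subql/types-ethereum` types and placeholder handler names:

```ts
import {
  EthereumBlock,
  EthereumTransaction,
  EthereumLog,
} from "@subql/types-ethereum";

// Placeholder handlers illustrating the payload each handler kind receives
export async function handleBlock(block: EthereumBlock): Promise<void> {
  // runs for every block
}

export async function handleTransaction(tx: EthereumTransaction): Promise<void> {
  // runs for every transaction that matches the manifest filter
}

export async function handleLog(log: EthereumLog): Promise<void> {
  // runs for every log that matches the manifest filter
}
```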
diff --git a/docs/quickstart/snippets/evm-manifest-intro.md b/docs/quickstart/snippets/evm-manifest-intro.md
new file mode 100644
index 00000000000..3f842e4b974
--- /dev/null
+++ b/docs/quickstart/snippets/evm-manifest-intro.md
@@ -0,0 +1,19 @@
+
+
+
+
+For EVM chains, there are three types of mapping handlers (and you can have more than one in each project):
+
+
+
+
+
+
+
+
+
+For EVM chains, there are three types of mapping handlers (and you can have more than one in each project):
+
+
+
+
diff --git a/docs/quickstart/snippets/evm-mapping-intro.md b/docs/quickstart/snippets/evm-mapping-intro.md
index 156a4e08ab9..71212cdcf64 100644
--- a/docs/quickstart/snippets/evm-mapping-intro.md
+++ b/docs/quickstart/snippets/evm-mapping-intro.md
@@ -1,3 +1,11 @@
+
+
+
+
::: tip Note
Check out our [Manifest File](../../build/manifest/ethereum.md) documentation to get more information about the Project Manifest (`project.ts`) file.
:::
+
+Navigate to the default mapping function in the `src/mappings` directory. Setting up mappings for this smart contract is straightforward. In this instance, the sole mapping file is `mappingHandlers.ts`. Now, let's take a closer look at it:
+
+
diff --git a/docs/quickstart/snippets/evm-quickstart-reference.md b/docs/quickstart/snippets/evm-quickstart-reference.md
index bd4ec5547b5..fbaaac8f5b9 100644
--- a/docs/quickstart/snippets/evm-quickstart-reference.md
+++ b/docs/quickstart/snippets/evm-quickstart-reference.md
@@ -1 +1,3 @@
-As a prerequisite, you will need to generate types from the ABI files of each smart contract. You can obtain these ABI files by searching for the ABIs of the mentioned smart contract addresses on Etherscan.
+In the earlier section titled "Create a New Project" (refer to [quickstart.md](../quickstart.md)), you should have taken note of three key files. To set up a project from scratch, follow the steps outlined in the [initialisation description](../quickstart.md#2-initialise-a-new-subquery-project). As a prerequisite, you will need to generate types from the ABI files of each smart contract.
+
+Additionally, you can kickstart your project by using the EVM Scaffolding approach (detailed [here](../quickstart.md#evm-project-scaffolding)). You'll find all the relevant events to be scaffolded in the documentation for each type of smart contract.
diff --git a/docs/quickstart/snippets/final-code.md b/docs/quickstart/snippets/final-code.md
new file mode 100644
index 00000000000..d6acdf6898e
--- /dev/null
+++ b/docs/quickstart/snippets/final-code.md
@@ -0,0 +1,14 @@
+Not included
+
+
+
+::: tip
+The final code of this project can be found here:
+
+
+
+
+
+:::
+
+
diff --git a/docs/quickstart/snippets/flare-manifest-note.md b/docs/quickstart/snippets/flare-manifest-note.md
new file mode 100644
index 00000000000..56540a1e21a
--- /dev/null
+++ b/docs/quickstart/snippets/flare-manifest-note.md
@@ -0,0 +1 @@
+Check out our [Manifest File](../../build/manifest/flare.md) documentation to get more information about the Project Manifest (`project.ts`) file.
diff --git a/docs/quickstart/snippets/flare-mapping-note.md b/docs/quickstart/snippets/flare-mapping-note.md
new file mode 100644
index 00000000000..dd262a20adc
--- /dev/null
+++ b/docs/quickstart/snippets/flare-mapping-note.md
@@ -0,0 +1,3 @@
+::: tip Note
+For more information on mapping functions, please refer to our [Mappings](../../build/mapping/flare.md) documentation.
+:::
diff --git a/docs/quickstart/snippets/gnosis-manifest-note.md b/docs/quickstart/snippets/gnosis-manifest-note.md
new file mode 100644
index 00000000000..dbb82c54785
--- /dev/null
+++ b/docs/quickstart/snippets/gnosis-manifest-note.md
@@ -0,0 +1 @@
+Check out our [Manifest File](../../build/manifest/gnosis.md) documentation to get more information about the Project Manifest (`project.ts`) file.
diff --git a/docs/quickstart/snippets/gnosis-mapping-note.md b/docs/quickstart/snippets/gnosis-mapping-note.md
new file mode 100644
index 00000000000..96859d5c21a
--- /dev/null
+++ b/docs/quickstart/snippets/gnosis-mapping-note.md
@@ -0,0 +1,3 @@
+::: tip Note
+For more information on mapping functions, please refer to our [Mappings](../../build/mapping/gnosis.md) documentation.
+:::
diff --git a/docs/quickstart/snippets/manifest-intro.md b/docs/quickstart/snippets/manifest-intro.md
new file mode 100644
index 00000000000..6dc77ced454
--- /dev/null
+++ b/docs/quickstart/snippets/manifest-intro.md
@@ -0,0 +1,15 @@
+
+
+## Your Project Manifest File
+
+The Project Manifest file is an entry point to your project. It defines most of the details on how SubQuery will index and transform the chain data.
+
+
+
+
+
+#### Your Project Manifest File
+
+The Project Manifest file is an entry point to your project. It defines most of the details on how SubQuery will index and transform the chain data.
+
+
diff --git a/docs/quickstart/snippets/mapping-intro.md b/docs/quickstart/snippets/mapping-intro.md
index 19f45c6eadb..b0bfe28a3fa 100644
--- a/docs/quickstart/snippets/mapping-intro.md
+++ b/docs/quickstart/snippets/mapping-intro.md
@@ -1,3 +1,15 @@
+
+
## Add a Mapping Function
-Mapping functions define how blockchain data is transformed into the optimized GraphQL entities that we previously defined in the `schema.graphql` file.
+Mapping functions define how blockchain data is transformed into the optimised GraphQL entities that we previously defined in the `schema.graphql` file.
+
+
+
+
+
+#### Add a Mapping Function
+
+Mapping functions define how blockchain data is transformed into the optimised GraphQL entities that we previously defined in the `schema.graphql` file.
+
+
diff --git a/docs/quickstart/snippets/near-handlers.md b/docs/quickstart/snippets/near-handlers.md
new file mode 100644
index 00000000000..e6c43d76139
--- /dev/null
+++ b/docs/quickstart/snippets/near-handlers.md
@@ -0,0 +1,3 @@
+- [BlockHandler](../../build/manifest/near.md#mapping-handlers-and-filters): On each and every block, run a mapping function
+- [TransactionHandlers](../../build/manifest/near.md#mapping-handlers-and-filters): On each and every transaction that matches optional filter criteria, run a mapping function
+- [ActionHandlers](../../build/manifest/near.md#mapping-handlers-and-filters): On each and every transaction action that matches optional filter criteria, run a mapping function
diff --git a/docs/quickstart/snippets/near-manifest-intro.md b/docs/quickstart/snippets/near-manifest-intro.md
index ecd65bf2384..ff2bce85f91 100644
--- a/docs/quickstart/snippets/near-manifest-intro.md
+++ b/docs/quickstart/snippets/near-manifest-intro.md
@@ -1,5 +1,19 @@
-The Project Manifest (`project.ts`) file works as an entry point to your NEAR project. It defines most of the details on how SubQuery will index and transform the chain data. For NEAR, there are three types of mapping handlers (and you can have more than one in each project):
+
-- [BlockHandler](../../build/manifest/near.md#mapping-handlers-and-filters): On each and every block, run a mapping function
-- [TransactionHandlers](../../build/manifest/near.md#mapping-handlers-and-filters): On each and every transaction that matches optional filter criteria, run a mapping function
-- [ActionHandlers](../../build/manifest/near.md#mapping-handlers-and-filters): On each and every transaction action that matches optional filter criteria, run a mapping function
+
+
+For NEAR, there are three types of mapping handlers (and you can have more than one in each project):
+
+
+
+
+
+
+
+
+
+For NEAR, there are three types of mapping handlers (and you can have more than one in each project):
+
+
+
+
diff --git a/docs/quickstart/snippets/near-manifest-note.md b/docs/quickstart/snippets/near-manifest-note.md
new file mode 100644
index 00000000000..fad82be51c4
--- /dev/null
+++ b/docs/quickstart/snippets/near-manifest-note.md
@@ -0,0 +1 @@
+Check out our [Manifest File](../../build/manifest/near.md) documentation to get more information about the Project Manifest (`project.ts`) file.
diff --git a/docs/quickstart/snippets/near-mapping-note.md b/docs/quickstart/snippets/near-mapping-note.md
new file mode 100644
index 00000000000..c9f7fc4e1ac
--- /dev/null
+++ b/docs/quickstart/snippets/near-mapping-note.md
@@ -0,0 +1,3 @@
+::: tip Note
+For more information on mapping functions, please refer to our [Mappings](../../build/mapping/near.md) documentation.
+:::
diff --git a/docs/quickstart/snippets/polkadot-handlers.md b/docs/quickstart/snippets/polkadot-handlers.md
new file mode 100644
index 00000000000..05b77cf69eb
--- /dev/null
+++ b/docs/quickstart/snippets/polkadot-handlers.md
@@ -0,0 +1,5 @@
+- [BlockHandlers](../../build/manifest/polkadot.md#mapping-handlers-and-filters): On each and every block, run a mapping function
+- [EventHandlers](../../build/manifest/polkadot.md#mapping-handlers-and-filters): On each and every Event that matches optional filter criteria, run a mapping function
+- [CallHandlers](../../build/manifest/polkadot.md#mapping-handlers-and-filters): On each and every extrinsic call that matches optional filter criteria, run a mapping function
+
+Note that the manifest file has already been set up correctly and doesn’t require significant changes, but you need to change the datasource handlers. This section lists the triggers that the manifest file looks for on the blockchain to start indexing.
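
On the mapping side, each of these handler kinds corresponds to an async function that receives the matching payload. A rough sketch, assuming the standard `@subql/types` Substrate types and placeholder handler names:

```ts
import {
  SubstrateBlock,
  SubstrateEvent,
  SubstrateExtrinsic,
} from "@subql/types";

// Placeholder handlers illustrating the payload each handler kind receives
export async function handleBlock(block: SubstrateBlock): Promise<void> {
  // runs for every block
}

export async function handleEvent(event: SubstrateEvent): Promise<void> {
  // runs for every event that matches the manifest filter
}

export async function handleCall(extrinsic: SubstrateExtrinsic): Promise<void> {
  // runs for every extrinsic call that matches the manifest filter
}
```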
diff --git a/docs/quickstart/snippets/polkadot-manifest-intro.md b/docs/quickstart/snippets/polkadot-manifest-intro.md
new file mode 100644
index 00000000000..8e86bfff89c
--- /dev/null
+++ b/docs/quickstart/snippets/polkadot-manifest-intro.md
@@ -0,0 +1,19 @@
+
+
+
+
+For Polkadot, there are three types of mapping handlers (and you can have more than one in each project):
+
+
+
+
+
+
+
+
+
+For Polkadot, there are three types of mapping handlers (and you can have more than one in each project):
+
+
+
+
diff --git a/docs/quickstart/snippets/polkadot-manifest-note.md b/docs/quickstart/snippets/polkadot-manifest-note.md
new file mode 100644
index 00000000000..e10ae195308
--- /dev/null
+++ b/docs/quickstart/snippets/polkadot-manifest-note.md
@@ -0,0 +1 @@
+Check out our [Manifest File](../../build/manifest/polkadot.md) documentation to get more information about the Project Manifest (`project.ts`) file.
diff --git a/docs/quickstart/snippets/polkadot-mapping-note.md b/docs/quickstart/snippets/polkadot-mapping-note.md
new file mode 100644
index 00000000000..49a9d101b91
--- /dev/null
+++ b/docs/quickstart/snippets/polkadot-mapping-note.md
@@ -0,0 +1,3 @@
+::: tip Note
+For more information on mapping functions, please refer to our [Mappings](../../build/mapping/polkadot.md) documentation.
+:::
diff --git a/docs/quickstart/snippets/polygon-manifest-note.md b/docs/quickstart/snippets/polygon-manifest-note.md
new file mode 100644
index 00000000000..6df39767f5f
--- /dev/null
+++ b/docs/quickstart/snippets/polygon-manifest-note.md
@@ -0,0 +1 @@
+Check out our [Manifest File](../../build/manifest/polygon.md) documentation to get more information about the Project Manifest (`project.ts`) file.
diff --git a/docs/quickstart/snippets/polygon-mapping-note.md b/docs/quickstart/snippets/polygon-mapping-note.md
new file mode 100644
index 00000000000..ba4a27f705b
--- /dev/null
+++ b/docs/quickstart/snippets/polygon-mapping-note.md
@@ -0,0 +1,3 @@
+::: tip Note
+For more information on mapping functions, please refer to our [Mappings](../../build/mapping/polygon.md) documentation.
+:::
diff --git a/docs/quickstart/snippets/schema-intro.md b/docs/quickstart/snippets/schema-intro.md
index fdbc5eeb57d..b7c45a8e25d 100644
--- a/docs/quickstart/snippets/schema-intro.md
+++ b/docs/quickstart/snippets/schema-intro.md
@@ -1,5 +1,23 @@
+
+
## Update Your GraphQL Schema File
The `schema.graphql` file determines the shape of your data from SubQuery due to the mechanism of the GraphQL query language. Hence, updating the GraphQL Schema file is the perfect place to start. It allows you to define your end goal right at the start.
-Remove all existing entities and update the `schema.graphql` file as follows:
+
+
+
+
+#### Update Your GraphQL Schema File
+
+The `schema.graphql` file determines the shape of your data from SubQuery due to the mechanism of the GraphQL query language. Hence, updating the GraphQL Schema file is the perfect place to start. It allows you to define your end goal right at the start.
+
+
+
+
+
+#### Update Your GraphQL Schema File
+
+The `schema.graphql` file determines the shape of your data from SubQuery due to the mechanism of the GraphQL query language. Hence, updating the GraphQL Schema file is the perfect place to start. It allows you to define your end goal right at the start.
+
+
diff --git a/docs/quickstart/snippets/schema-note.md b/docs/quickstart/snippets/schema-note.md
new file mode 100644
index 00000000000..ecafff92022
--- /dev/null
+++ b/docs/quickstart/snippets/schema-note.md
@@ -0,0 +1,3 @@
+Check out the [GraphQL Schema](../../build/graphql.md) documentation to get in-depth information on the `schema.graphql` file.
+
+Now that you have made the essential changes to the GraphQL Schema file, let’s move on to configuring the mapping functions.