Explore and implement your own efficient custom open-source API between your decentralised data and tools to query data faster and save you time.
+
Explore and implement your own efficient custom open-source API between your decentralised data and tools to query data faster and save you time.
SubQuery now supports Polkadot, Avalanche, Cosmos, Algorand, and Flare.
@@ -12,7 +12,7 @@
Get a Kick-Start With Our Quick Start Guide
-
Build your first SubQuery project in less than 10 mins with simple guided steps.
+
Build your first SubQuery project in less than 10 mins with simple guided steps.
Start querying data for your dApps on your most loved blockchain network using our starter projects. Explore and modify important files, and understand how SubQuery works.
@@ -134,9 +134,7 @@
-
-
-
+
diff --git a/docs/academy/herocourse/module1.md b/docs/academy/herocourse/module1.md
index dd009fa38ed..eb1b93f0bab 100644
--- a/docs/academy/herocourse/module1.md
+++ b/docs/academy/herocourse/module1.md
@@ -1,26 +1,25 @@
-## Module 1: Getting started
+# Module 1: Getting started
## Introduction
In this module, you will become familiar with SubQuery and get some hands-on experience by creating a simple Hello World SubQuery project. This project will use the **subql CLI** to create an empty project shell. Then code will be provided to query the Polkadot mainnet for the block height. Note that a Docker environment will be used to simplify the running process.
-
## Reference
-* [Hello World PDF workbook](/assets/pdf/Hello_World_Lab.pdf)
-* [Subql Starter Github](https://github.com/subquery/subql-starter)
+- [Hello World PDF workbook](/assets/pdf/Hello_World_Lab.pdf)
+- [Subql Starter Github](https://github.com/subquery/subql-starter)
## Pre-Requisites
You will require the following:
-* NPM package manager.
-* SubQuery CLI (@subql/cli).
-* Docker.
+- NPM package manager.
+- SubQuery CLI (@subql/cli).
+- Docker.
### NPM Package Manager
-First, you must check whether you have installed the latest version of node or not.
+First, check whether you have the latest version of Node installed.
Run this command:
@@ -30,17 +29,18 @@ It should return a result with the latest version of npm, if you have it install
`v18.2.0`
-::: info Note
-Node v12 or higher is required.
-:::
+::: tip Note
+Node v12 or higher is required.
+:::
-If you haven't installed the npm, please run the following command in your terminal and install the latest version of node.
+If you haven't installed npm, run the following command in your terminal to install the latest version of Node.
```
brew update
brew install node
node -v
```
+
You will get the latest Node version as the output at the end.
### SubQuery CLI
@@ -60,7 +60,6 @@ subql -v
You will get an output similar to this:
`@subql/cli/1.0.1 darwin-x64 node-v18.2.0`
-
### Docker
Please visit [Docker's official site](https://docs.docker.com/get-docker/) for instructions on how to install Docker for your specific operating system.
@@ -83,35 +82,33 @@ Please visit [Docker's official site](https://docs.docker.com/get-docker/) for i
The first step is to create a SubQuery project with the following command:
-
```
$ subql init
Project name [subql-starter]: HelloWorld
? Select a network family Substrate
? Select a network Polkadot
? Select a template project subql-starter Starter project for subquery
-RPC endpoint: [wss://polkadot.api.onfinality.io/public-ws]:
-Git repository [https://github.com/subquery/subql-starter]:
+RPC endpoint: [wss://polkadot.api.onfinality.io/public-ws]:
+Git repository [https://github.com/subquery/subql-starter]:
Fetching network genesis hash... done
Author [Ian He & Jay Ji]: Sean
-Description [This project can be use as a starting po...]:
-Version [1.0.0]:
-License [MIT]:
+Description [This project can be use as a starting po...]:
+Version [1.0.0]:
+License [MIT]:
Preparing project... done
HelloWorld is ready
```
Note that any text in square brackets is a default value, which will be used if nothing is provided.
-This creates a framework and the following directory structure, saving your time.
+This creates a framework and the following directory structure, saving you time.
#### Step 2: Update the Mappings File
-The initialisation command pre-creates a sample mappings file with 3 functions: `handleBlock, handleEvent and handleCall`. We will focus on the first function called `handleBlock` for this excerise. Hence, delete the remaining functions.
+The initialisation command pre-creates a sample mappings file with 3 functions: `handleBlock`, `handleEvent`, and `handleCall`. We will focus on the first function, `handleBlock`, for this exercise. Hence, delete the remaining functions.
- The `mappingHandler.ts` file should look like this:
-
```
import {SubstrateExtrinsic,SubstrateEvent,SubstrateBlock} from "@subql/types";
import {StarterEntity} from "../types";
@@ -128,10 +125,9 @@ export async function handleBlock(block: SubstrateBlock): Promise {
#### Step 3: Update the Manifest File (aka project.yaml)
-The initialisation command also pre-creates a sample manifest file and defines 3 handlers. Since you have removed `handleEvent` and `handleCall` from the mappings file, you have to remove them from the manifest file as well.
-
-- The ***updated*** part of the manifest file should look like this:
+The initialisation command also pre-creates a sample manifest file and defines 3 handlers. Since you have removed `handleEvent` and `handleCall` from the mappings file, you have to remove them from the manifest file as well.
+- The **_updated_** part of the manifest file should look like this:
```
@@ -145,14 +141,12 @@ dataSources:
kind: substrate/BlockHandler
```
-
-
#### Step 4: Update the Graphql Schema
-The default `schema.graphql` file will contain 5 fields. We can remove the fields from 2 to 5, because the `handleBlock` function in the mappings file only uses “field1”.
+The default `schema.graphql` file will contain 5 fields. We can remove fields 2 to 5, because the `handleBlock` function in the mappings file only uses “field1”.
-::: info Note
-Rename field1 to something more meaningful. Eg blockHeight. Note that if you do this, don’t forget to update the reference to field1 in the mappings file appropriately.
+::: tip Note
+Rename field1 to something more meaningful, e.g. blockHeight. Note that if you do this, don’t forget to update the reference to field1 in the mappings file accordingly.
:::
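+For example, if you rename `field1` to `blockHeight`, the corresponding assignment in `mappingHandlers.ts` changes along these lines (illustrative only; the exact starter code may differ slightly):
+
+```
+// before
+record.field1 = block.block.header.number.toNumber();
+// after
+record.blockHeight = block.block.header.number.toNumber();
+```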
The schema file should look like this:
@@ -168,46 +162,39 @@ type StarterEntity @entity {
Install the node dependencies by running the following commands:
+::: code-tabs
+@tab:active yarn
-
-
-
- ```shell
- yarn install
- ```
-
-
+```shell
+yarn install
+```
-
+@tab npm
- ```bash
- npm install
- ```
+```bash
+npm install
+```
-
-
+:::
#### Step 6: Generate the Associated Typescript
Next, we will generate the associated typescript with the following command:
-
-
+::: code-tabs
+@tab:active yarn
- ```shell
- yarn codegen
- ```
-
-
+```shell
+yarn codegen
+```
-
+@tab npm
- ```bash
- npm run-script codegen
- ```
+```bash
+npm run-script codegen
+```
-
-
+:::
You should see a new folder appear with 2 new files.
@@ -215,47 +202,40 @@ You should see a new folder appear with 2 new files.
The next step is to build the project with the following command:
-
-
+::: code-tabs
+@tab:active yarn
- ```shell
- yarn build
- ```
-
-
+```shell
+yarn build
+```
-
+@tab npm
- ```bash
- npm run-script build
- ```
+```bash
+npm run-script build
+```
-
-
+:::
This bundles the app into static files for production.
-
#### Step 8: Start the Docker Container
Run the docker command to pull the images and to start the container.
-
```
docker-compose pull && docker-compose up
```
::: warning Important
-You need to have Docker installed as noted in the prerequisite.
+You need to have Docker installed as noted in the prerequisite.
:::
-
#### Step 9: Run a Query
Once the docker container is up and running, which could take a few minutes, open up your browser, and navigate to `www.localhost:3000`.
-This will open up a “playground” where you can create your query. Copy the example below.
-
+This will open up a “playground” where you can create your query. Copy the example below.
```
{
@@ -269,6 +249,6 @@ This will open up a “playground” where you can create your query. Copy the e
}
```
-::: info Note
-If you renamed field1 something else, modify this query appropriately.
-:::
\ No newline at end of file
+::: tip Note
+If you renamed field1 to something else, modify this query accordingly.
+:::
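+For instance, if you renamed `field1` to `blockHeight`, the query could look like this (a sketch; the collection name follows your entity name):
+
+```
+query {
+  starterEntities(first: 10) {
+    nodes {
+      blockHeight
+    }
+  }
+}
+```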
diff --git a/docs/academy/herocourse/module2.md b/docs/academy/herocourse/module2.md
index f0f0855e6dc..93edc001cc3 100644
--- a/docs/academy/herocourse/module2.md
+++ b/docs/academy/herocourse/module2.md
@@ -1,8 +1,8 @@
# Module 2: SubQuery Basics
-This module explains the working of the basic files of a SubQuery Project with an example. The module is divided into 3 short video lessons, each describing the usage of these files an what modfications you may need to do.
+This module explains how the basic files of a SubQuery project work, with an example. The module is divided into 3 short video lessons, each describing the usage of these files and what modifications you may need to make.
-Refer to the documentation references, given at the end of the each lesson, for an in-depth explanation.
+Refer to the documentation references, given at the end of each lesson, for an in-depth explanation.
## Lesson 1: The Manifest File
@@ -64,7 +64,7 @@ Using the starter project for this exercise, we will use an **event handler** to
- **Completion of [Module 1](../herocourse/module1.md)**
-### Overview of Steps Involved
+### Overview of Steps Involved
1. Initialise the starter project.
2. Update your mappings, manifest file, and graphql schema file by removing all the default code except for the `handleEvent` function.
@@ -74,25 +74,23 @@ Using the starter project for this exercise, we will use an **event handler** to
### Detailed Steps
-
#### Step 1: Initialise Your Project
The first step is to create a SubQuery project with the following command:
-
```
$ subql init
Project name [subql-starter]: account-balance
? Select a network family Substrate
? Select a network Polkadot
? Select a template project subql-starter Starter project for subquery
-RPC endpoint: [wss://polkadot.api.onfinality.io/public-ws]:
-Git repository [https://github.com/subquery/subql-starter]:
+RPC endpoint: [wss://polkadot.api.onfinality.io/public-ws]:
+Git repository [https://github.com/subquery/subql-starter]:
Fetching network genesis hash... done
-Author [Ian He & Jay Ji]:
-Description [This project can be use as a starting po...]:
-Version [1.0.0]:
-License [MIT]:
+Author [Ian He & Jay Ji]:
+Description [This project can be use as a starting po...]:
+Version [1.0.0]:
+License [MIT]:
Preparing project... done
account-balance is ready
```
@@ -101,31 +99,29 @@ account-balance is ready
The default `schema.graphql` file contains 5 fields. Rename the field2 to `account` and field3 to `balance`. In addition, rename the entity to `Account`.
-::: info Note
+::: tip Note
Whenever you update the manifest file, don’t forget to update the reference to field1 in the `mappings` file and to regenerate the code via yarn codegen.
:::
- The schema file should look like this:
-
```
type Account @entity {
id: ID! #id is a required field
account: String #This is a Polkadot address
- balance: BigInt # This is the amount of DOT
+ balance: BigInt # This is the amount of DOT
}
```
-
#### Step 3: Update the Manifest File (aka project.yaml)
-The initialisation command also pre-creates a sample manifest file and defines 3 handlers. Because we are only focusing on Events, let’s remove `handleBlock`and `handleCall` from the mappings file.
+The initialisation command also pre-creates a sample manifest file and defines 3 handlers. Because we are only focusing on Events, let’s remove `handleBlock` and `handleCall` from the mappings file.
::: warning Important
Avoid messing with the auto-generated version names(as shown in the initial section of the manifest file).
:::
-- The ***updated*** part of the manifest file should look like this:
+- The **_updated_** part of the manifest file should look like this:
```
@@ -147,25 +143,22 @@ dataSources:
method: Deposit
```
-::: info Note
+::: tip Note
Comment out genesisHash by prefixing with #. This is not required for now.
:::
#### Step 4: Update the Mappings File
-The initialisation command pre-creates a sample mappings file with 3 functions: `handleBlock`, `handleEvent`, and `handleCall`. Since you will only focus on `handleEvent`, let’s delete the remaining functions.
-
-You need to make a few other changes as well. Since the Account entity (formally called the StarterEntity) was instantiated in the `handleBlock` function but you no longer have this, you have to instantiate this within your `handleEvent` function. You also need to update the argument that you pass to the constructor.
+The initialisation command pre-creates a sample mappings file with 3 functions: `handleBlock`, `handleEvent`, and `handleCall`. Since you will only focus on `handleEvent`, let’s delete the remaining functions.
+You need to make a few other changes as well. Since the Account entity (formerly called StarterEntity) was instantiated in the `handleBlock` function but you no longer have this function, you have to instantiate it within your `handleEvent` function. You also need to update the argument that you pass to the constructor.
```
let record = new Account(event.extrinsic.block.block.header.hash.toString());
```
-
- The mappingHandler.ts file should look like this:
-
```
import {SubstrateEvent} from "@subql/types";
import {Account} from "../types";
@@ -183,175 +176,156 @@ export async function handleEvent(event: SubstrateEvent): Promise {
}
```
-
-
#### Step 5: Install the Dependencies
Install the node dependencies by running the following commands:
+::: code-tabs
+@tab:active yarn
-
-
-
- ```shell
- yarn install
- ```
-
-
+```shell
+yarn install
+```
-
+@tab npm
- ```bash
- npm install
- ```
+```bash
+npm install
+```
-
-
+:::
#### Step 6: Generate the Associated Typescript
Next, let's generate the associated typescript with the following command:
-
-
-
- ```shell
- yarn codegen
- ```
+::: code-tabs
+@tab:active yarn
-
+```shell
+yarn codegen
+```
-
+@tab npm
- ```bash
- npm run-script codegen
- ```
+```bash
+npm run-script codegen
+```
-
-
+:::
#### Step 7: Build the Project
The next step is to build the project with the command as follows:
-
-
-
- ```shell
- yarn build
- ```
+::: code-tabs
+@tab:active yarn
-
+```shell
+yarn build
+```
-
+@tab npm
- ```bash
- npm run-script build
- ```
+```bash
+npm run-script build
+```
-
-
+:::
This code bundles the app into static files for production.
-
#### Step 8: Start the Docker Container
Run the docker command to pull the images and to start the container.
-
```
docker-compose pull && docker-compose up
```
+#### Step 9: Run a Query
+Once the docker container is up and running, which might take a few minutes, open up your browser and navigate to `www.localhost:3000`.
-#### Step 9: Run a Query
+This will open up a “playground” where you can create your query. Copy the example below:
-Once the docker container is up to date and starts running, which might take a few minutes, open up your browser and navigate to `www.localhost:3000`.
+::: code-tabs
-This will open up a “playground” where you can create your query. Copy the example below:
+@tab query
-
-
-
- ```
- query {
- accounts(first:10 orderBy:BALANCE_DESC){
- nodes{
- account
- balance
- }
+```
+query {
+ accounts(first:10 orderBy:BALANCE_DESC){
+ nodes{
+ account
+ balance
}
}
- ```
-
-
-
-
- ```
- {
- "data": {
- "accounts": {
- "nodes": [
- {
- "account": "13wY4rD88C3Xzd4brFMPkAMEMC3dSuAR2NC6PZ5BEsZ5t6rJ",
- "balance": "162804160"
- },
- {
- "account": "146YJHyD5cjFN77HrfKhxUFbU8WjApwk9ncGD6NbxE66vhMS",
- "balance": "130775360"
- },
- {
- "account": "146YJHyD5cjFN77HrfKhxUFbU8WjApwk9ncGD6NbxE66vhMS",
- "balance": "130644160"
- },
- {
- "account": "146YJHyD5cjFN77HrfKhxUFbU8WjApwk9ncGD6NbxE66vhMS",
- "balance": "117559360"
- },
- {
- "account": "12H7nsDUrJUSCQQJrTKAFfyCWSactiSdjoVUixqcd9CZHTGt",
- "balance": "117359360"
- },
- {
- "account": "146YJHyD5cjFN77HrfKhxUFbU8WjApwk9ncGD6NbxE66vhMS",
- "balance": "108648000"
- },
- {
- "account": "13wY4rD88C3Xzd4brFMPkAMEMC3dSuAR2NC6PZ5BEsZ5t6rJ",
- "balance": "108648000"
- },
- {
- "account": "12zSBXtK9evQRCG9Gsdr72RbqNzbNn2Suox2cTfugCLmWjqG",
- "balance": "108648000"
- },
- {
- "account": "15zF7zvdUiy2eYCgN6KWbv2SJPdbSP6vdHs1YTZDGjRcSMHN",
- "balance": "108448000"
- },
- {
- "account": "15zF7zvdUiy2eYCgN6KWbv2SJPdbSP6vdHs1YTZDGjRcSMHN",
- "balance": "108448000"
- }
- ]
- }
+}
+```
+
+@tab result
+
+```
+{
+ "data": {
+ "accounts": {
+ "nodes": [
+ {
+ "account": "13wY4rD88C3Xzd4brFMPkAMEMC3dSuAR2NC6PZ5BEsZ5t6rJ",
+ "balance": "162804160"
+ },
+ {
+ "account": "146YJHyD5cjFN77HrfKhxUFbU8WjApwk9ncGD6NbxE66vhMS",
+ "balance": "130775360"
+ },
+ {
+ "account": "146YJHyD5cjFN77HrfKhxUFbU8WjApwk9ncGD6NbxE66vhMS",
+ "balance": "130644160"
+ },
+ {
+ "account": "146YJHyD5cjFN77HrfKhxUFbU8WjApwk9ncGD6NbxE66vhMS",
+ "balance": "117559360"
+ },
+ {
+ "account": "12H7nsDUrJUSCQQJrTKAFfyCWSactiSdjoVUixqcd9CZHTGt",
+ "balance": "117359360"
+ },
+ {
+ "account": "146YJHyD5cjFN77HrfKhxUFbU8WjApwk9ncGD6NbxE66vhMS",
+ "balance": "108648000"
+ },
+ {
+ "account": "13wY4rD88C3Xzd4brFMPkAMEMC3dSuAR2NC6PZ5BEsZ5t6rJ",
+ "balance": "108648000"
+ },
+ {
+ "account": "12zSBXtK9evQRCG9Gsdr72RbqNzbNn2Suox2cTfugCLmWjqG",
+ "balance": "108648000"
+ },
+ {
+ "account": "15zF7zvdUiy2eYCgN6KWbv2SJPdbSP6vdHs1YTZDGjRcSMHN",
+ "balance": "108448000"
+ },
+ {
+ "account": "15zF7zvdUiy2eYCgN6KWbv2SJPdbSP6vdHs1YTZDGjRcSMHN",
+ "balance": "108448000"
+ }
+ ]
}
}
- ```
-
-
+}
+```
+:::
If you have nothing returned, wait for a few minutes and let your node index a few blocks.
-Here, we have queried for the balance of DOT tokens for all addresses (accounts) on the Polkadot Mainnet blockchain. We have limited this to the first 10 and sorted it by the “richest” account holders first.
-
+Here, we have queried for the balance of DOT tokens for all addresses (accounts) on the Polkadot Mainnet blockchain. We have limited this to the first 10 and sorted it by the “richest” account holders first.
#### Bonus
-Try to aggregate the balances across addresses and find the total balance of an address.
-
+Try to aggregate the balances across addresses and find the total balance of an address.
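+One possible way to approach this (a sketch only; `AccountTotal` and its `total` field are hypothetical additions to `schema.graphql` that you would regenerate with `yarn codegen`) is to keep a running total per address inside the event handler:
+
+```
+import {SubstrateEvent} from "@subql/types";
+import {AccountTotal} from "../types";
+import {Balance} from "@polkadot/types/interfaces";
+
+export async function handleEvent(event: SubstrateEvent): Promise<void> {
+    // The balances Deposit event carries [who, amount]
+    const [who, amount] = event.event.data;
+
+    // Load the running total for this address, or start a new one at zero
+    let record = await AccountTotal.get(who.toString());
+    if (!record) {
+        record = new AccountTotal(who.toString());
+        record.total = BigInt(0);
+    }
+
+    // Add this amount to the running total and persist it
+    record.total = record.total + (amount as Balance).toBigInt();
+    await record.save();
+}
+```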
### References
diff --git a/docs/academy/herocourse/module3.md b/docs/academy/herocourse/module3.md
index 8238b14fe99..06f1fd2843f 100644
--- a/docs/academy/herocourse/module3.md
+++ b/docs/academy/herocourse/module3.md
@@ -1,13 +1,12 @@
# Module 3: Relationships
-This module explains the different types of entity relations **(one-to-one, one-to-many, and many-to-many)** with guided examples. The module is divided into 3 video lessons for in-depth explanations.
+This module explains the different types of entity relations **(one-to-one, one-to-many, and many-to-many)** with guided examples. The module is divided into 3 video lessons for in-depth explanations.
:::info Note
-For a basic uderstanding of the terminologies related to entity relations, visit [GraphQL Schema Documentation.](../../build/graphql.md)
+For a basic understanding of the terminology related to entity relations, visit the [GraphQL Schema Documentation](../../build/graphql.md).
:::
-Let's have a look at each relationship one-by-one.
-
+Let's have a look at each relationship one-by-one.
## Lesson 1: One to Many Entities
@@ -15,16 +14,15 @@ Let's have a look at each relationship one-by-one.
-### Exercise - Balances Transfers (One-to-Many)
+### Exercise - Balances Transfers (One-to-Many)
-In these exercises, we will take the starter project and focus on understanding **one to many entity relationships**. We will create a project that allows us to query for accounts and determine how much was transferred to what receiving address.
+In these exercises, we will take the starter project and focus on understanding **one to many entity relationships**. We will create a project that allows us to query for accounts and determine how much was transferred to what receiving address.
### Pre-Requisites
Completion of [Module 2](../herocourse/module2.md).
-
-### Overview of Steps Involved
+### Overview of Steps Involved
1. Initialise the starter project.
2. Update your mappings, manifest file, and graphql schema file by removing all the default code except for the `handleEvent` function.
@@ -34,40 +32,35 @@ Completion of [Module 2](../herocourse/module2.md).
### Detailed Steps
-
#### Step 1: Initialise Your Project
The first step is to create a SubQuery project using the following command:
-
```
$ subql init
Project name [subql-starter]: account-transfers
? Select a network family Substrate
? Select a network Polkadot
? Select a template project subql-starter Starter project for subquery
-RPC endpoint: [wss://polkadot.api.onfinality.io/public-ws]:
-Git repository [https://github.com/subquery/subql-starter]:
+RPC endpoint: [wss://polkadot.api.onfinality.io/public-ws]:
+Git repository [https://github.com/subquery/subql-starter]:
Fetching network genesis hash... done
-Author [Ian He & Jay Ji]:
-Description [This project can be use as a starting po...]:
-Version [1.0.0]:
-License [MIT]:
+Author [Ian He & Jay Ji]:
+Description [This project can be use as a starting po...]:
+Version [1.0.0]:
+License [MIT]:
Preparing project... done
account-transfers is ready
```
-
-
#### Step 2: Update the Graphql Schema
Create an entity called `Account`. This account will contain multiple transfers. Here, an account can be considered as a **Polkadot address** owned by someone.
-Transfers can be considered as a transaction with an amount, a sender, and a receiver(let’s ignore the sender for now). Here, you will obtain the amount transferred, the blockNumber, and to whom it was sent(also known as the receiver).
+Transfers can be considered as transactions with an amount, a sender, and a receiver (let’s ignore the sender for now). Here, you will obtain the amount transferred, the blockNumber, and to whom it was sent (also known as the receiver).
- The schema file should look like this:
-
```
type Account @entity {
id: ID! #this primary key is set as the toAddress
@@ -83,14 +76,13 @@ type Transfer @entity {
#### Step 3: Update the Manifest File (aka project.yaml)
-Update the manifest file to only include the `handleEvent` handler and update the filter method to `Transfer`. The reason is that we only want to work with the "balance transfer events" in this example. These events will contain the data of those transactions, which are being transferred from one account to another.
+Update the manifest file to only include the `handleEvent` handler and update the filter method to `Transfer`. The reason is that we only want to work with the "balance transfer events" in this example. These events will contain the data of those transactions, which are being transferred from one account to another.
::: warning Important
Avoid messing with the auto-generated version names(as shown in the initial section of the manifest file).
:::
-- The ***updated*** part of the `project.yaml` file should look similar to as below:
-
+- The **_updated_** part of the `project.yaml` file should look similar to this:
```
network:
@@ -111,32 +103,28 @@ dataSources:
method: Transfer
```
-::: info Note
+::: tip Note
Note the inclusion of a `dictionary` and the exclusion of the `genesisHash`.
:::
-
#### Step 4: Update the Mappings File
-The initialisation command pre-creates a sample mappings file with 3 functions: `handleBlock`, `handleEvent`, and `handleCall`. As we are only focusing on `handleEvent`, delete the remaining functions.
+The initialisation command pre-creates a sample mappings file with 3 functions: `handleBlock`, `handleEvent`, and `handleCall`. As we are only focusing on `handleEvent`, delete the remaining functions.
-Note that you also need to make a few other changes. First, understand that the `balance.transfer` event gives access to an array of data in the following format: [from, to, value].
+Note that you also need to make a few other changes. First, understand that the `balances.transfer` event gives access to an array of data in the following format: [from, to, value].
This indicates that you can access the values as follows:
-
```
const fromAddress = event.event.data[0];
- const toAddress = event.event.data[1];
+ const toAddress = event.event.data[1];
const amount = event.event.data[2];
```
-
-Furthermore, as the `Account` entity (formally called the StarterEntity) was instantiated in the `handleBlock` function and you no longer have this, you need to instantiate it within the `handleEvent` function.
+Furthermore, as the `Account` entity (formerly called StarterEntity) was instantiated in the `handleBlock` function and you no longer have this function, you need to instantiate it within the `handleEvent` function.
However, you must first test and see if this value is already in your database. The reason is that an event can contain multiple transfers to the SAME `toAddress`. As a result, you get the `toAddress` if the value is present in the database. And if it does not exist, save it to the database.
-
```
const toAccount = await Account.get(toAddress.toString());
if (!toAccount) {
@@ -146,7 +134,6 @@ However, you must first test and see if this value is already in your database.
For the `Transfer` entity object, set the primary key as the `blocknumber+event.idx` (which guarantees uniqueness) and then set the other fields of the `Transfer` entity object accordingly.
-
```
const transfer = new Transfer(`${event.block.block.header.number.toNumber()}-${event.idx}`, );
transfer.blockNumber = event.block.block.header.number.toBigInt();
@@ -157,7 +144,6 @@ For the `Transfer` entity object, set the primary key as the `blocknumber+event.
- The `mappingHandler.ts` file should look like this:
-
```
import {SubstrateEvent} from "@subql/types";
import {Account, Transfer} from "../types";
@@ -166,291 +152,265 @@ import {Balance} from "@polkadot/types/interfaces";
export async function handleEvent(event: SubstrateEvent): Promise<void> {
{
// The balances.transfer event has the following payload [from, to, value] that we can access
-
+
// const fromAddress = event.event.data[0];
- const toAddress = event.event.data[1];
+ const toAddress = event.event.data[1];
const amount = event.event.data[2];
-
+
// query for toAddress from DB
const toAccount = await Account.get(toAddress.toString());
// if not in DB, instantiate a new Account object using the toAddress as a unique ID
if (!toAccount) {
await new Account(toAddress.toString()).save();
}
-
+
// instantiate a new Transfer object using the block number and event.idx as a unique ID
const transfer = new Transfer(`${event.block.block.header.number.toNumber()}-${event.idx}`, );
transfer.blockNumber = event.block.block.header.number.toBigInt();
transfer.toId = toAddress.toString();
transfer.amount = (amount as Balance).toBigInt();
await transfer.save();
-
+
}
}
```
-
#### Step 5: Install the Dependencies
Install the node dependencies by running the following commands:
-
-
-
- ```shell
- yarn install
- ```
+::: code-tabs
+@tab:active yarn
-
+```shell
+yarn install
+```
-
+@tab npm
- ```bash
- npm install
- ```
+```bash
+npm install
+```
-
-
+:::
#### Step 6: Generate the Associated Typescript
Next, we will generate the associated typescript with the following command:
-
-
-
- ```shell
- yarn codegen
- ```
-
-
+::: code-tabs
+@tab:active yarn
-
+```shell
+yarn codegen
+```
- ```bash
- npm run-script codegen
- ```
+@tab npm
-
-
+```bash
+npm run-script codegen
+```
+:::
#### Step 7: Build the Project
The next step is to build the project with the following command:
-
-
-
- ```shell
- yarn build
- ```
+::: code-tabs
+@tab:active yarn
-
+```shell
+yarn build
+```
-
+@tab npm
- ```bash
- npm run-script build
- ```
+```bash
+npm run-script build
+```
-
-
+:::
This code bundles the app into static files for production.
-
#### Step 8: Start the Docker Container
Run the docker command to pull the images and start the container.
-
```
docker-compose pull && docker-compose up
```
-
#### Step 9: Run a Query
Once the docker container is all set and running, which may take a few minutes, open up your browser and navigate to `www.localhost:3000`.
This will open up a “playground” where you can create your query. Copy the example below and see the results:
-
-
+::: code-tabs
+@tab:active Query
- ```
- query{
- accounts(first: 3){
- nodes{
- id
- }
- }
+```
+query{
+accounts(first: 3){
+ nodes{
+ id
+ }
}
+}
- ```
-
+```
-
+@tab result
- ```
- {
- "data": {
- "accounts": {
- "nodes": [
- {
- "id": "11k5GkWb9npuqWRq5Pyk51RSnRyskPrPtsyoCApteEUjNou"
- },
- {
- "id": "121dZJsfG7uNvszPSpYvBzwnrcF1P4ejjrE1G6FSWHqht5tC"
- },
- {
- "id": "121rwkQAH3yCD1EcaRgc3nELSoZn29RoTtCN55mcN7RkBA66"
- }
- ]
- }
+```
+{
+ "data": {
+ "accounts": {
+ "nodes": [
+ {
+ "id": "11k5GkWb9npuqWRq5Pyk51RSnRyskPrPtsyoCApteEUjNou"
+ },
+ {
+ "id": "121dZJsfG7uNvszPSpYvBzwnrcF1P4ejjrE1G6FSWHqht5tC"
+ },
+ {
+ "id": "121rwkQAH3yCD1EcaRgc3nELSoZn29RoTtCN55mcN7RkBA66"
+ }
+ ]
}
}
+}
- ```
-
-
-
+```
-The above code will query the `account` entity returning the id. We have defined the id here as the `toAddress`(also known as the receiving address).
+:::
+The above code will query the `account` entity, returning the id. We have defined the id here as the `toAddress` (also known as the receiving address).
- You can also query for all the **transfers**. Copy this given code and see the results:
-
-
+::: code-tabs
+@tab query
- ```
- query{
- transfers(first: 3){
- nodes{
- id
- amount
- blockNumber
- }
+```
+query{
+transfers(first: 3){
+ nodes{
+ id
+ amount
+ blockNumber
}
}
-
- ```
-
-
-
+}
- ```
- {
- "data": {
- "transfers": {
- "nodes": [
- {
- "id": "7280565-2",
- "amount": "400009691000",
- "blockNumber": "7280565"
- },
- {
- "id": "7280566-2",
- "amount": "23174700000000",
- "blockNumber": "7280566"
- },
- {
- "id": "7280570-5",
- "amount": "400000000000",
- "blockNumber": "7280570"
- }
- ]
- }
+```
+
+@tab result
+
+```
+ {
+ "data": {
+ "transfers": {
+ "nodes": [
+ {
+ "id": "7280565-2",
+ "amount": "400009691000",
+ "blockNumber": "7280565"
+ },
+ {
+ "id": "7280566-2",
+ "amount": "23174700000000",
+ "blockNumber": "7280566"
+ },
+ {
+ "id": "7280570-5",
+ "amount": "400000000000",
+ "blockNumber": "7280570"
+ }
+ ]
}
}
- ```
-
-
-
+}
+```
+:::
-- Note an amazing possibility here. We can even query the account id from within the transfer query. The example below shows that we are querying for transfers where we have an **associated amount** and **blockNumber**. After that we can link this to the receiving or `to` address as follows:
+- Note an amazing possibility here. We can even query the account id from within the transfer query. The example below shows that we are querying for transfers where we have an **associated amount** and **blockNumber**. After that we can link this to the receiving or `to` address as follows:
-
-
+::: code-tabs
+@tab query
- ```
- query{
- transfers(first: 3){
- nodes{
+```
+query{
+ transfers(first: 3){
+ nodes{
+ id
+ amount
+ blockNumber
+ to{
id
- amount
- blockNumber
- to{
- id
- }
- }
}
- }
- ```
-
-
-
- ```
- {
- "data": {
- "transfers": {
- "nodes": [
- {
- "id": "7280565-2",
- "amount": "400009691000",
- "blockNumber": "7280565",
- "to": {
- "id": "15kUt2i86LHRWCkE3D9Bg1HZAoc2smhn1fwPzDERTb1BXAkX"
- }
- },
- {
- "id": "7280566-2",
- "amount": "23174700000000",
- "blockNumber": "7280566",
- "to": {
- "id": "14uh77yjhC3TLAE6KaCLvkjN7yFeUkejm7o7fdaSsggwD1ua"
- }
- },
- {
- "id": "7280567-2",
- "amount": "3419269000000",
- "blockNumber": "7280567",
- "to": {
- "id": "12sj9HTNQ7aiQoRg5wLyuemgvmFcrWeUJRi3aEUnJLmAE56Y"
- }
- }
- ]
}
}
}
- ```
-
-
-
-
+```
+@tab result
-Let's have a look at the database schema and understand the working.
+```
+{
+ "data": {
+ "transfers": {
+ "nodes": [
+ {
+ "id": "7280565-2",
+ "amount": "400009691000",
+ "blockNumber": "7280565",
+ "to": {
+ "id": "15kUt2i86LHRWCkE3D9Bg1HZAoc2smhn1fwPzDERTb1BXAkX"
+ }
+ },
+ {
+ "id": "7280566-2",
+ "amount": "23174700000000",
+ "blockNumber": "7280566",
+ "to": {
+ "id": "14uh77yjhC3TLAE6KaCLvkjN7yFeUkejm7o7fdaSsggwD1ua"
+ }
+ },
+ {
+ "id": "7280567-2",
+ "amount": "3419269000000",
+ "blockNumber": "7280567",
+ "to": {
+ "id": "12sj9HTNQ7aiQoRg5wLyuemgvmFcrWeUJRi3aEUnJLmAE56Y"
+ }
+ }
+ ]
+ }
+ }
+}
+```
-The **accounts table** is a standalone table which contains only receiving addresses(`accounts.id`). The **transfer table** contains `to_id` which are links or points back to the accounts.
+:::
-Simply put, one account links to many transfers. In other words, each unique Polkadot address, stored in `accounts.id`, links to one or more than one Polkadot address, which has an associated amount and block number.
+Let's have a look at the database schema and understand the working.
+The **accounts table** is a standalone table which contains only receiving addresses (`accounts.id`). The **transfer table** contains `to_id`, which links back to the accounts.
+Simply put, one account links to many transfers. In other words, each unique Polkadot address, stored in `accounts.id`, links to one or more transfers, each with an associated amount and block number.
### References
-* [Account Transfers PDF workbook](/assets/pdf/Account_Transfers.pdf)
-* [Account Transfers Github](https://github.com/subquery/tutorials-account-transfers)
-* [One-to-many relationships](../../build/graphql.md#one-to-many-relationships)
+- [Account Transfers PDF workbook](/assets/pdf/Account_Transfers.pdf)
+- [Account Transfers Github](https://github.com/subquery/tutorials-account-transfers)
+- [One-to-many relationships](../../build/graphql.md#one-to-many-relationships)
---
-
## Lesson 2: Many to Many Entities
-
-
### Exercise - Account Transfer (With Reverse Lookup)
-In this exercise, we will take the starter project and learn about the reverse lookups.
+In this exercise, we will take the starter project and learn about reverse lookups.
### Pre-Requisites
Completion of [Module 3: Lesson 1 - One to many entities.](module3.md#lesson-1-one-to-many-entities)
-
### Overview of Steps Involved
1. Git clone the [tutorials-account-transfers](https://github.com/subquery/tutorials-account-transfers) project.
@@ -1019,26 +933,23 @@ Completion of [Module 3: Lesson 1 - One to many entities.](module3.md#lesson-1-o
#### Step 1: Clone Account Transfer Project
-Start by cloning the `tutorials-account-transfers` Github repository.
+Start by cloning the `tutorials-account-transfers` Github repository.
-::: info Note
-This github project was a part of the exercise for **Module 3 - Lesson 2** (See Reference in the end of the Lesson 2).
+::: tip Note
+This GitHub project was part of the exercise in **Module 3 - Lesson 1** (see the References at the end of Lesson 1).
:::
Run the following command:
-
```
git clone https://github.com/subquery/tutorials-account-transfers.git
```
-
-#### Step 2: Confirm that Project Works
+#### Step 2: Confirm that the Project Works
Run the basic commands to run the project and check if it's all set.
-
```
yarn install
yarn codegen
@@ -1047,123 +958,118 @@ docker-compose pull && docker-compose up
```
-Once the docker container is running, which may take a few minutes, open up your browser and navigate to `www.localhost:3000`.
+Once the docker container is running, which may take a few minutes, open up your browser and navigate to `www.localhost:3000`.
This will open up a **playground** where you can create your query. Copy the example below and see the result:
-
-
+::: code-tabs
+@tab:active Query
- ```
- query{
- accounts(first: 3){
- nodes{
- id
- }
- }
+```
+query{
+ accounts(first: 3){
+ nodes{
+ id
+ }
}
- ```
-
+ }
+```
-
-
- ```
- {
- "data": {
- "accounts": {
- "nodes": [
- {
- "id": "11k5GkWb9npuqWRq5Pyk51RSnRyskPrPtsyoCApteEUjNou"
- },
- {
- "id": "121dZJsfG7uNvszPSpYvBzwnrcF1P4ejjrE1G6FSWHqht5tC"
- },
- {
- "id": "121rwkQAH3yCD1EcaRgc3nELSoZn29RoTtCN55mcN7RkBA66"
- }
- ]
- }
+@tab result
+
+```
+{
+ "data": {
+ "accounts": {
+ "nodes": [
+ {
+ "id": "11k5GkWb9npuqWRq5Pyk51RSnRyskPrPtsyoCApteEUjNou"
+ },
+ {
+ "id": "121dZJsfG7uNvszPSpYvBzwnrcF1P4ejjrE1G6FSWHqht5tC"
+ },
+ {
+ "id": "121rwkQAH3yCD1EcaRgc3nELSoZn29RoTtCN55mcN7RkBA66"
+ }
+ ]
}
}
- ```
-
-
+}
+```
+:::
-The above code will query the account entity returning the id. Here. we have defined the id as the `toAddress`(also known as the **receiving address**).
+The above code will query the account entity, returning the id. Here, we have defined the id as the `toAddress` (also known as the **receiving address**).
+- As noted in a previous exercise (**Lesson 1**), we query the account id from within the **transfer entity**.
-- As noted in a previous exercise(**Lesson 1**), we query the account id from within the **transfer entity**.
+The example given below shows that we are querying for transfers where we have an associated amount and blockNumber. After that, we can link this to the receiving or `to` address as follows:
-The example given below shows that we are querying for transfers where we have an associated amount and blockNumber. After that, we can link this to the receiving or `to` address as follows:
+::: code-tabs
+@tab query
-
-
-
- ```
- query{
- transfers(first: 3){
- nodes{
+```
+query{
+ transfers(first: 3){
+ nodes{
+ id
+ amount
+ blockNumber
+ to{
id
- amount
- blockNumber
- to{
- id
- }
- }
+ }
}
}
- ```
-
+ }
+```
-
+@tab result
- ```
- {
- "data": {
- "transfers": {
- "nodes": [
- {
- "id": "7280565-2",
- "amount": "400009691000",
- "blockNumber": "7280565",
- "to": {
- "id": "15kUt2i86LHRWCkE3D9Bg1HZAoc2smhn1fwPzDERTb1BXAkX"
- }
- },
- {
- "id": "7280566-2",
- "amount": "23174700000000",
- "blockNumber": "7280566",
- "to": {
- "id": "14uh77yjhC3TLAE6KaCLvkjN7yFeUkejm7o7fdaSsggwD1ua"
- }
- },
- {
- "id": "7280567-2",
- "amount": "3419269000000",
- "blockNumber": "7280567",
- "to": {
- "id": "12sj9HTNQ7aiQoRg5wLyuemgvmFcrWeUJRi3aEUnJLmAE56Y"
- }
+```
+{
+ "data": {
+ "transfers": {
+ "nodes": [
+ {
+ "id": "7280565-2",
+ "amount": "400009691000",
+ "blockNumber": "7280565",
+ "to": {
+ "id": "15kUt2i86LHRWCkE3D9Bg1HZAoc2smhn1fwPzDERTb1BXAkX"
}
- ]
- }
+ },
+ {
+ "id": "7280566-2",
+ "amount": "23174700000000",
+ "blockNumber": "7280566",
+ "to": {
+ "id": "14uh77yjhC3TLAE6KaCLvkjN7yFeUkejm7o7fdaSsggwD1ua"
+ }
+ },
+ {
+ "id": "7280567-2",
+ "amount": "3419269000000",
+ "blockNumber": "7280567",
+ "to": {
+ "id": "12sj9HTNQ7aiQoRg5wLyuemgvmFcrWeUJRi3aEUnJLmAE56Y"
+ }
+ }
+ ]
}
}
- ```
-
-
-
+}
+```
+:::
#### Step 3: Add a Reverse Lookup
-Add an extra field to the **Account** entity called `myToAddress`. Assign it the type `Transfer`, and add the `@derived` annotation.
+Add an extra field to the **Account** entity called `myToAddress`. Assign it the type `Transfer`, and add the `@derivedFrom` annotation, as sketched below.
-This will create a **virtual field** called `myToAddress`, which can be accessed from the Account entity. Note that it is virtual because the database table structure does not change.
+This will create a **virtual field** called `myToAddress`, which can be accessed from the Account entity. Note that it is virtual because the database table structure does not change.
+
- Allows you to do a reverse lookup in Graphql.
- Adds a `GetElementByID()` on the child entities.
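+A sketch of how the `Account` entity could declare this derived field (illustrative; match the field types to the full schema used in this project):
+
+```
+type Account @entity {
+  id: ID! # the receiving address
+  myToAddress: [Transfer] @derivedFrom(field: "to") # virtual reverse-lookup field
+}
+```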
@@ -1183,108 +1089,105 @@ type Transfer @entity {
#### Step 4: Recompile and Test
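+To recompile and restart the indexer, rerun the usual build steps from the earlier exercises (shown here with yarn; adjust if you use npm):
+
+```shell
+yarn codegen
+yarn build
+docker-compose pull && docker-compose up
+```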
-
-
+::: code-tabs
+@tab:active Query
- ```
- query{
- accounts(first:5){
- nodes{
- id
- myToAddress{
- nodes{
- id
- amount
- }
+```
+query{
+ accounts(first:5){
+ nodes{
+ id
+ myToAddress{
+ nodes{
+ id
+ amount
}
}
}
}
- ```
-
+}
+```
-
+@tab result
- ```
- {
- "data": {
- "accounts": {
- "nodes": [
- {
- "id": "1112NRMkvMb5x3EwGsLSzXyw7kSLxug4uFH1ec3CnDe7ZoG",
- "myToAddress": {
- "nodes": [
- {
- "id": "1206531-14",
- "amount": "123000000000"
- },
- {
- "id": "1206533-9",
- "amount": "30000000000"
- },
- {
- "id": "1249840-2",
- "amount": "100000000000"
- }
- ]
- }
- },
- {
- "id": "1117zZ65F4sz3EH9hZdAivERch99XMXADHicJn7ZmKUrrxT",
- "myToAddress": {
- "nodes": [
- {
- "id": "1256968-5",
- "amount": "86880000000"
- },
- {
- "id": "1256984-5",
- "amount": "12299500000000"
- }
- ]
- }
- },
- {
- "id": "11212d8rV4pj73RLoXqiEJweNs2qU1SsfwbzzRWVzn2o5ZCt",
- "myToAddress": {
- "nodes": [
- {
- "id": "1212424-9",
- "amount": "50000000000"
- },
- {
- "id": "1212680-3",
- "amount": "150000000000"
- },
- {
- "id": "1212719-3",
- "amount": "22622363200000"
- },
- {
- "id": "1240252-2",
- "amount": "41055764800000"
- },
- {
- "id": "1258672-6",
- "amount": "49000000000"
- }
- ]
- }
+```
+{
+ "data": {
+ "accounts": {
+ "nodes": [
+ {
+ "id": "1112NRMkvMb5x3EwGsLSzXyw7kSLxug4uFH1ec3CnDe7ZoG",
+ "myToAddress": {
+ "nodes": [
+ {
+ "id": "1206531-14",
+ "amount": "123000000000"
+ },
+ {
+ "id": "1206533-9",
+ "amount": "30000000000"
+ },
+ {
+ "id": "1249840-2",
+ "amount": "100000000000"
+ }
+ ]
}
- ]
- }
+ },
+ {
+ "id": "1117zZ65F4sz3EH9hZdAivERch99XMXADHicJn7ZmKUrrxT",
+ "myToAddress": {
+ "nodes": [
+ {
+ "id": "1256968-5",
+ "amount": "86880000000"
+ },
+ {
+ "id": "1256984-5",
+ "amount": "12299500000000"
+ }
+ ]
+ }
+ },
+ {
+ "id": "11212d8rV4pj73RLoXqiEJweNs2qU1SsfwbzzRWVzn2o5ZCt",
+ "myToAddress": {
+ "nodes": [
+ {
+ "id": "1212424-9",
+ "amount": "50000000000"
+ },
+ {
+ "id": "1212680-3",
+ "amount": "150000000000"
+ },
+ {
+ "id": "1212719-3",
+ "amount": "22622363200000"
+ },
+ {
+ "id": "1240252-2",
+ "amount": "41055764800000"
+ },
+ {
+ "id": "1258672-6",
+ "amount": "49000000000"
+ }
+ ]
+ }
+ }
+ ]
}
}
- ```
-
-
-
+}
+```
+:::
-Adding the `@derivedFrom` keyword to the `myToAddress` field allows a **virtual field** to appear in the `Account` object. You can see this in the documentation tab. This allows a **Reverse Lookup** where the `Transfer.to` field can be accessed from `Account.myToAddress`.
+Adding the `@derivedFrom` keyword to the `myToAddress` field allows a **virtual field** to appear in the `Account` object. You can see this in the documentation tab. This allows a **Reverse Lookup** where the `Transfer.to` field can be accessed from `Account.myToAddress`.
### References
-* [Account Transfer with Reverse Lookups PDF Workbook](/assets/pdf/Account_Transfer_with_Reverse_Lookups.pdf)
-* [Account Transfer with Reverse Lookups Github](https://github.com/subquery/tutorials-account-transfer-reverse-lookups)
-* [Reverse lookups](../../build/graphql.md#reverse-lookups)
+- [Account Transfer with Reverse Lookups PDF Workbook](/assets/pdf/Account_Transfer_with_Reverse_Lookups.pdf)
+- [Account Transfer with Reverse Lookups Github](https://github.com/subquery/tutorials-account-transfer-reverse-lookups)
+- [Reverse lookups](../../build/graphql.md#reverse-lookups)
diff --git a/docs/academy/herocourse/module4.md b/docs/academy/herocourse/module4.md
index 206ad85342d..bfc30bed856 100644
--- a/docs/academy/herocourse/module4.md
+++ b/docs/academy/herocourse/module4.md
@@ -1,6 +1,6 @@
# Module 4: Aggregation
-This module explains how you can aggregate data with a video lesson. The module is further divided into 4 guided exercises.
+This module explains how you can aggregate data with a video lesson. The module is further divided into 4 guided exercises.
## Lesson 1: Aggregation Basics
@@ -10,19 +10,17 @@ This module explains how you can aggregate data with a video lesson. The module
## Exercises
-In these exercises, we will take the starter project and see how we can aggregate data. We will focus on indexing the staking rewards and then aggregating them over a particular account.
+In these exercises, we will take the starter project and see how we can aggregate data. We will focus on indexing the staking rewards and then aggregating them over a particular account.
-To summarise, we will determine how much reward an account has accumulated over time.
+To summarise, we will determine how much reward an account has accumulated over time.
## Pre-Requisites
Completion of [Module 3](../herocourse/module3.md).
-
## Exercise 1: Index Staking Rewards
-Before you aggregate all the staked rewards earned by a user, or to be precise a **DOT account owner**, you need to index those staking rewards.
-
+Before you aggregate all the staked rewards earned by a user, or to be precise a **DOT account owner**, you need to index those staking rewards.
### Overview of Steps Involved
@@ -38,26 +36,23 @@ Before you aggregate all the staked rewards earned by a user, or to be precise a
The first step to create a SubQuery project with the following command:
-
```
$ subql init
Project name [subql-starter]: staking-rewards
? Select a network family Substrate
? Select a network Polkadot
? Select a template project subql-starter Starter project for subquery
-RPC endpoint: [wss://polkadot.api.onfinality.io/public-ws]:
-Git repository [https://github.com/subquery/subql-starter]:
+RPC endpoint: [wss://polkadot.api.onfinality.io/public-ws]:
+Git repository [https://github.com/subquery/subql-starter]:
Fetching network genesis hash... done
-Author [Ian He & Jay Ji]:
-Description [This project can be use as a starting po...]:
-Version [1.0.0]:
-License [MIT]:
+Author [Ian He & Jay Ji]:
+Description [This project can be use as a starting po...]:
+Version [1.0.0]:
+License [MIT]:
Preparing project... done
staking-rewards is ready
```
-
-
#### Step 2: Update the Graphql Schema
Add an entity called `StakingReward`. This entity will allow you to record the account-reward along with the balance. Moreover, the block height will help you perform a cross check.
@@ -74,12 +69,9 @@ type StakingReward @entity{
}
```
-
-
#### Step 3: Update the Manifest File (aka project.yaml)
-Update the manifest file by including a `handleStakingRewarded` handler and updating the filter method to `staking/Rewarded`. This is the only event you require to capture for now. Hence, remove the `blockHandler` and `callHandler`.
-
+Update the manifest file by including a `handleStakingRewarded` handler and updating the filter method to `staking/Rewarded`. This is the only event you need to capture for now. Hence, remove the `blockHandler` and `callHandler`.
```
- handler: handleStakingRewarded
@@ -89,16 +81,14 @@ Update the manifest file by including a `handleStakingRewarded` handler and upda
method: Rewarded
```
-::: info Note
-The `Rewarded` method was recently introduced from the block [6,713,249](https://github.com/polkadot-js/api/blob/master/packages/types-known/src/upgrades/polkadot.ts) onwards. It was previously called `Reward`. For this exercise, we will use this the new format and use a startBlock of 7,000,000.
+::: tip Note
+The `Rewarded` method was recently introduced from block [6,713,249](https://github.com/polkadot-js/api/blob/master/packages/types-known/src/upgrades/polkadot.ts) onwards. It was previously called `Reward`. For this exercise, we will use the new format and a startBlock of 7,000,000.
:::
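+For reference, the start block is set on the data source in `project.yaml`, roughly as follows (a sketch; your generated file also contains the mapping handlers and other fields):
+
+```
+dataSources:
+  - kind: substrate/Runtime
+    startBlock: 7000000
+```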
-
::: warning Important
Avoid messing with the auto-generated version names(as shown in the initial section of the manifest file).
:::
-
- The updated part of the manifest file will look as follows:
```
@@ -140,12 +130,9 @@ bad indentation of a sequence entry (17:5)
error Command failed with exit code 1.
```
-
-
#### Step 4: Create `handleStakingRewarded` and Update Mapping File
-The initialisation of the project also pre-creates a sample mappings file with 3 functions: `handleBlock`, `handleEvent` and `handleCall`. Delete all of these functions as you need to create your own.
-
+The initialisation of the project also pre-creates a sample mappings file with 3 functions: `handleBlock`, `handleEvent` and `handleCall`. Delete all of these functions as you need to create your own.
```
export async function handleStakingRewarded(event: SubstrateEvent): Promise {
@@ -153,25 +140,19 @@ export async function handleStakingRewarded(event: SubstrateEvent): Promise
-
+::: code-tabs
+@tab:active yarn
- ```shell
- yarn install
- yarn codegen
- yarn build
- yarn start:docker
+```shell
+yarn install
+yarn codegen
+yarn build
+yarn start:docker
```
-
+@tab npm
-
-
- ```bash
- npm install
- npm run-script codegen
- npm run-script build
- ```
+```bash
+npm install
+npm run-script codegen
+npm run-script build
+```
-
-
+:::
#### Step 6: Query the Project
-Once the docker container is up and running successfully, which may take a few minutes, open up your browser and navigate to `www.localhost:3000`.
+Once the docker container is up and running successfully, which may take a few minutes, open up your browser and navigate to `www.localhost:3000`.
This will open up a “playground” where you can create your query. Copy the example below and see the results:
-
-
-
- ```
- query{
- stakingRewards(first: 3 orderBy:BLOCK_HEIGHT_ASC){
- nodes{
- blockHeight
- account
- date
- balance
- }
+::: code-tabs
+@tab:active Query
+
+```
+query{
+ stakingRewards(first: 3 orderBy:BLOCK_HEIGHT_ASC){
+ nodes{
+ blockHeight
+ account
+ date
+ balance
}
}
- ```
-
-
-
-
- ```
- {
- "data": {
- "stakingRewards": {
- "nodes": [
- {
- "blockHeight": 7000064,
- "account": "16jWQMBXZNxfgXJmVL61gMX4uqtc9WTXV3c8DGx6DUKejm7",
- "date": "2021-09-26T16:50:18.001",
- "balance": "2189068638"
- },
- {
- "blockHeight": 7000064,
- "account": "13MnytvGDqJLGZbizqd8CDKJUPa9UJyzXcdxRiJEv5g2hq47",
- "date": "2021-09-26T16:50:18.001",
- "balance": "2050030971"
- },
- {
- "blockHeight": 7000064,
- "account": "12L117g377z195J3WaPshEaFC8vsNMyiMi8CTWfWVJdmBAJ4",
- "date": "2021-09-26T16:50:18.001",
- "balance": "2007885451"
- },
- {
- "blockHeight": 7000064,
- "account": "13owVsvG3GtmDYfcnDNCVm54z6X6VgYf37QRMFywrVPpkJvv",
- "date": "2021-09-26T16:50:18.001",
- "balance": "1987101808"
- },
- }
- }
- ```
-
-
+}
+```
+@tab result
+
+```
+{
+ "data": {
+ "stakingRewards": {
+ "nodes": [
+ {
+ "blockHeight": 7000064,
+ "account": "16jWQMBXZNxfgXJmVL61gMX4uqtc9WTXV3c8DGx6DUKejm7",
+ "date": "2021-09-26T16:50:18.001",
+ "balance": "2189068638"
+ },
+ {
+ "blockHeight": 7000064,
+ "account": "13MnytvGDqJLGZbizqd8CDKJUPa9UJyzXcdxRiJEv5g2hq47",
+ "date": "2021-09-26T16:50:18.001",
+ "balance": "2050030971"
+ },
+ {
+ "blockHeight": 7000064,
+ "account": "12L117g377z195J3WaPshEaFC8vsNMyiMi8CTWfWVJdmBAJ4",
+ "date": "2021-09-26T16:50:18.001",
+ "balance": "2007885451"
+ },
+ {
+ "blockHeight": 7000064,
+ "account": "13owVsvG3GtmDYfcnDNCVm54z6X6VgYf37QRMFywrVPpkJvv",
+ "date": "2021-09-26T16:50:18.001",
+ "balance": "1987101808"
+      }
+      ]
+    }
+  }
+}
+```
+:::
-Congratulations! You have now indexed all staking rewards for all accounts from the block 7 Million onwards.
+Congratulations! You have now indexed all staking rewards for all accounts from block 7 million onwards.
In the next exercise, let’s aggregate or sum up these rewards for each account.
@@ -310,12 +281,10 @@ Compeletion of [Module 4 - Exercise 1](module4.md#exercise-1-index-staking-rewar
### Detailed Steps
-
#### Step 1: Add an Entity Called Sum Reward
Add a new entity called `SumReward` with extra fields as shown below:
-
```
type SumReward @entity{
id: ID! # AccountId
@@ -324,10 +293,8 @@ type SumReward @entity{
}
```
-
- **The new schema file should now look like this:**
-
```
type StakingReward @entity{
id: ID! #blockHeight-eventIdx
@@ -343,13 +310,10 @@ type SumReward @entity{
}
```
-
-
#### Step 2: Update the Manifest File(aka project.yaml)
Add an extra handler called `handleSumRewarded` and filter it by `staking/Rewarded`.
-
```
- handler: handleSumRewarded
kind: substrate/EventHandler
@@ -358,9 +322,7 @@ Add an extra handler called `handleSumRewarded` and filter it by `staking/Reward
method: Rewarded
```
-
-The ***latest and updated part*** of the manifest file should look like as below:
-
+The **_latest and updated part_** of the manifest file should look like this:
```
dataSources:
@@ -381,23 +343,20 @@ dataSources:
method: Rewarded
```
-
-
-::: info Note
-This is how more than one mapping handler can be added to a project. Also note that the order is very crucial.
+::: tip Note
+This is how more than one mapping handler can be added to a project. Also note that the order is crucial.
Otherwise you may encounter an error such as:
```
ERROR failed to index block at height 7000064 handleStakingRewarded() SequelizeForeignKeyConstraintError: insert or update on table "staking_rewards" violates foreign key constraint "staking_rewards_account_id_fkey"
```
-:::
+:::
-#### Step 3: Create `handleSumRewarded` Function and Update Mapping File
+#### Step 3: Create `handleSumRewarded` Function and Update Mapping File
Next, create a function called `handleSumRewarded` along with a helper function called `createSumReward`.
-
```
function createSumReward(accountId: string): SumReward {
const entity = new SumReward(accountId);
@@ -417,13 +376,12 @@ export async function handleSumRewarded(event: SubstrateEvent): Promise {
}
```
-::: info Note
+::: tip Note
Run `yarn codegen` and import the new entity to remove the errors.
:::
The complete and updated mapping file should now look like:
-
```
import {SubstrateEvent} from "@subql/types";
import {StakingReward, SumReward} from "../types";
@@ -457,80 +415,72 @@ export async function handleSumRewarded(event: SubstrateEvent): Promise {
}
```
-
-
#### Step 4: Rebuild the Project
See building a project in the [previous exercise](module4.md#step-5-install-dependencies-and-build-the-project).
-::: info Note
-Delete your database instance, i.e. the `.data folder`, as you have modified the schema file.
+::: tip Note
+Delete your database instance, i.e. the `.data folder`, as you have modified the schema file.
:::
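+For example (a typical sequence with yarn; `.data` is the database folder created by the starter's Docker setup):
+
+```shell
+rm -rf .data
+yarn codegen
+yarn build
+docker-compose pull && docker-compose up
+```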
-
#### Step 5: Query the Project
Run the following query to list out the total rewards for each account.
-
-
-
- ```
- query{
- sumRewards(first:3 orderBy:BLOCKHEIGHT_ASC){
- nodes{
- blockheight
- id
- totalReward
- }
+::: code-tabs
+@tab:active Query
+
+```
+query{
+ sumRewards(first:3 orderBy:BLOCKHEIGHT_ASC){
+ nodes{
+ blockheight
+ id
+ totalReward
}
}
- ```
-
-
-
-
- ```
- {
- "data": {
- "sumRewards": {
- "nodes": [
- {
- "blockheight": 7000064,
- "id": "121FXj85TuKfrQM1Pdcjj4ibbJNnfsqCtMsJ24rSvGEdWDdv",
- "totalReward": "10901386603"
- },
- {
- "blockheight": 7000064,
- "id": "123MFw5gAkCjcqEhapJ5zon4Ppyp59Rq2kyNQqEHbfwvM4Ni",
- "totalReward": "1023809925"
- },
- {
- "blockheight": 7000064,
- "id": "129N6sYY5r9LnfaMY2AG9px9yYyUhN6FERPXKLfirwBrjkJv",
- "totalReward": "980420660"
- }
- ]
- }
+}
+```
+
+@tab result
+
+```
+{
+ "data": {
+ "sumRewards": {
+ "nodes": [
+ {
+ "blockheight": 7000064,
+ "id": "121FXj85TuKfrQM1Pdcjj4ibbJNnfsqCtMsJ24rSvGEdWDdv",
+ "totalReward": "10901386603"
+ },
+ {
+ "blockheight": 7000064,
+ "id": "123MFw5gAkCjcqEhapJ5zon4Ppyp59Rq2kyNQqEHbfwvM4Ni",
+ "totalReward": "1023809925"
+ },
+ {
+ "blockheight": 7000064,
+ "id": "129N6sYY5r9LnfaMY2AG9px9yYyUhN6FERPXKLfirwBrjkJv",
+ "totalReward": "980420660"
+ }
+ ]
}
}
+}
- ```
-
-
-
-
+```
+:::
-What if not only could you display the `totalReward`, but also the show the individual rewards that made up this `totalReward`?
-That's what we will explore in our next exercise.
+What if you could display not only the `totalReward`, but also the individual rewards that make up this `totalReward`?
+That's what we will explore in our next exercise.
---
-
## Exercise 3: Viewing Both Aggregated and Individual Staking Rewards
-So far in this module, we have managed to query for all the staking rewards and aggregate them for each account. Now we will make an improvement, and view the aggregate amount as well as the individual amounts as a child set.
+So far in this module, we have managed to query for all the staking rewards and aggregate them for each account. Now we will make an improvement, and view the aggregate amount as well as the individual amounts as a child set.
### Pre-Requisites
@@ -538,11 +488,9 @@ Completion of **[Module 4 - Exercise 2](module4.md#exercise-2-aggregate-staking-
### Detailed Steps
-
#### Step 1: Modify the Schema File
-Update the graphql schema field called account to be type `SumReward`. We are creating a one-many entity relationship where one `sumReward` will comprise of many individual staking rewards.
-
+Update the graphql schema field called account to be of type `SumReward`. We are creating a one-to-many entity relationship where one `sumReward` comprises many individual staking rewards.
```
type StakingReward @entity{
@@ -570,37 +518,30 @@ type SumReward @entity{
}
```
-
-#### Step 2: Check the Manifest File
+#### Step 2: Check the Manifest File
The manifest file does not need to be modified.
-
#### Step 3: Update `handleStakingRewarded` in the Mapping File(aka project.yaml)
In `handleStakingRewarded`, modify:
-
```
entity.account = account.toString();
```
-
to:
-
```
entity.accountId = account.toString();
```
+Note that you are creating a relationship between two entities (tables) here. Hence, the `StakingReward` entity needs to have a column that contains the same value as the primary key column in the `SumReward` entity.
-Note that you are creating here a relationship between two entities or tables. Hence, he `StakingReward` entity needs to have a column that contains the same value as the primary key column in the `SumReward` entity.
-
-Because the `SumReward` entity has been assigned the **account value (account.toString())**, you must do the same here.
+Because the `SumReward` entity has been assigned the **account value (account.toString())**, you must do the same here.
- **Now, the whole updated mappings file should look like this:**
-
```
import {SubstrateEvent} from "@subql/types";
import {StakingReward, SumReward} from "../types";
@@ -633,78 +574,72 @@ export async function handleSumRewarded(event: SubstrateEvent): Promise {
}
```
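
The essential change sits in `handleStakingRewarded`. A minimal, self-contained sketch (entity and field names assumed from the earlier exercises) could look like this:

```ts
import { SubstrateEvent } from "@subql/types";
import { StakingReward } from "../types";
import { Balance } from "@polkadot/types/interfaces";

export async function handleStakingRewarded(event: SubstrateEvent): Promise<void> {
  const {
    event: {
      data: [account, newReward],
    },
  } = event;

  const entity = new StakingReward(
    `${event.block.block.header.number.toNumber()}-${event.idx}`
  );
  // accountId must hold the same value as the SumReward primary key (the account address)
  entity.accountId = account.toString();
  entity.balance = (newReward as Balance).toBigInt();
  await entity.save();
}
```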
-
-
#### Step 4: Rebuild the Project
-Refer the steps given in the previous exercise to [build the project](module4.md#step-5-install-dependencies-and-build-the-project).
+Refer to the steps given in the previous exercise to [build the project](module4.md#step-5-install-dependencies-and-build-the-project).
:::info Note
You may need to delete your database folder (the `.data` folder) because a new field will be created and included in your database schema.
:::
-
#### Step 5: Query the Project
-Now, run a query and utilise a `stakingRewardsByAccountId` field. This field is automatically created to find the individual staking rewards.
+Now, run a query that utilises the `stakingRewardsByAccountId` field. This field is automatically created and returns the individual staking rewards.
-Below is an example query of one specific account:
+Below is an example query for one specific account:
-
-
+::: code-tabs
+@tab:active Query
- ```
- query{
- sumRewards(filter: {id:{equalTo:"16jWQMBXZNxfgXJmVL61gMX4uqtc9WTXV3c8DGx6DUKejm7"}}){
- nodes{
- blockheight
- id
- totalReward
- stakingRewardsByAccountId{
- nodes{
- balance
- }
+```
+query{
+ sumRewards(filter: {id:{equalTo:"16jWQMBXZNxfgXJmVL61gMX4uqtc9WTXV3c8DGx6DUKejm7"}}){
+ nodes{
+ blockheight
+ id
+ totalReward
+ stakingRewardsByAccountId{
+ nodes{
+ balance
}
- }
+ }
}
}
- ```
-
-
-
-
- ```
- {
- "data": {
- "sumRewards": {
- "nodes": [
- {
- "blockheight": 7013941,
- "id": "16jWQMBXZNxfgXJmVL61gMX4uqtc9WTXV3c8DGx6DUKejm7",
- "totalReward": "4049635655",
- "stakingRewardsByAccountId": {
- "nodes": [
- {
- "balance": "2189068638"
- },
- {
- "balance": "1860567017"
- }
- ]
- }
+}
+```
+
+@tab result
+
+```
+{
+ "data": {
+ "sumRewards": {
+ "nodes": [
+ {
+ "blockheight": 7013941,
+ "id": "16jWQMBXZNxfgXJmVL61gMX4uqtc9WTXV3c8DGx6DUKejm7",
+ "totalReward": "4049635655",
+ "stakingRewardsByAccountId": {
+ "nodes": [
+ {
+ "balance": "2189068638"
+ },
+ {
+ "balance": "1860567017"
+ }
+ ]
}
- ]
- }
+ }
+ ]
}
}
+}
- ```
-
-
-
+```
-- Note that the result shows that a total reward of `4049635655` is made up of two balances.
+:::
+- Note that the result shows that a total reward of `4049635655` is made up of two balances.
---
@@ -712,9 +647,9 @@ Below is an example query of one specific account:
So far, we have used the `Rewarded` method in the manifest file.
-As mentioned in the previous exercise, `Rewarded` was only recently introduced from block [6713249](https://github.com/polkadot-js/api/blob/master/packages/types-known/src/upgrades/polkadot.ts) onwards. It was previously called `Reward`.
+As mentioned in the previous exercise, `Rewarded` was only recently introduced from block [6713249](https://github.com/polkadot-js/api/blob/master/packages/types-known/src/upgrades/polkadot.ts) onwards. It was previously called `Reward`.
-Hence, you need to update your code to capture all the staking rewards prior to this change.
+Hence, you need to update your code to capture all the staking rewards prior to this change.
### Pre-Requisites
@@ -724,9 +659,7 @@ Completion of **[Module 4 - Exercise 2](module4.md#exercise-2-aggregate-staking-
#### Step 1: Update the Manifest File (aka `project.yaml`)
-Add the following mapping filters to the manifest file. Note that we have removed the **“ed”** from the handler name and the method.
-
-
+Add the following mapping filters to the manifest file. Note that we have removed the **“ed”** from the handler name and the method.
```
- handler: handleSumReward
@@ -741,8 +674,7 @@ Add the following mapping filters to the manifest file. Note that we have remove
method: Reward
```
-- The ***updated part** of the manifest file should like this:
-
+- The **updated part** of the manifest file should look like this:
```
dataSources:
@@ -763,8 +695,7 @@ dataSources:
method: Reward
```
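
Put together, the full handlers list might now look something like the sketch below. The `handleStakingReward` name is an assumption that simply follows the "drop the ed" convention described above.

```yaml
handlers:
  - handler: handleStakingRewarded
    kind: substrate/EventHandler
    filter:
      module: staking
      method: Rewarded
  - handler: handleSumRewarded
    kind: substrate/EventHandler
    filter:
      module: staking
      method: Rewarded
  - handler: handleStakingReward
    kind: substrate/EventHandler
    filter:
      module: staking
      method: Reward
  - handler: handleSumReward
    kind: substrate/EventHandler
    filter:
      module: staking
      method: Reward
```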
-
-::: info Note
+::: tip Note
Also change the start block to 6,000,000, which should return the staking reward data.
When you change the starting block, don’t forget to delete the database and reindex.
@@ -774,7 +705,6 @@ When you change the starting block, don’t forget to delete the database and re
Create a redirect function from the old method so that it reuses the same code; the handling logic for this event has already been written.
-
```
export async function handleSumReward(event: SubstrateEvent): Promise<void> {
await handleSumRewarded(event)
@@ -829,68 +759,62 @@ export async function handleSumRewarded(event: SubstrateEvent): Promise {
```
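
For completeness, a hedged sketch of the full pair of redirect functions (the `handleStakingReward` counterpart is assumed, mirroring the manifest entries above) could look like this:

```ts
// Both functions live in the same mappings file as the existing "Rewarded" handlers,
// so no extra imports are needed beyond SubstrateEvent.
import { SubstrateEvent } from "@subql/types";

export async function handleStakingReward(event: SubstrateEvent): Promise<void> {
  await handleStakingRewarded(event);
}

export async function handleSumReward(event: SubstrateEvent): Promise<void> {
  await handleSumRewarded(event);
}
```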
-
-
-
#### Step 3: Rebuild the Project
-To build the project, refer the steps provided in the [previous exercise](module4.md#step-5-install-dependencies-and-build-the-project).
-
+To build the project, refer to the steps provided in the [previous exercise](module4.md#step-5-install-dependencies-and-build-the-project).
#### Step 4: Query the Project
-Re-run the previous queries. The data should appear for the blocks starting from 6 Million.
+Re-run the previous queries. The data should now appear for blocks starting from 6,000,000.
:::info Note
-You have to wait for a while until the relevant blocks get indexed.
+You may have to wait a while until the relevant blocks have been indexed.
:::
-
-
-
- ```
- query{
- sumRewards(first:3 orderBy:BLOCKHEIGHT_ASC){
- nodes{
- blockheight
- id
- totalReward
- }
+::: code-tabs
+@tab:active Query
+
+```
+query{
+ sumRewards(first:3 orderBy:BLOCKHEIGHT_ASC){
+ nodes{
+ blockheight
+ id
+ totalReward
}
}
- ```
-
-
-
-
- ```
- {
- "data": {
- "sumRewards": {
- "nodes": [
- {
- "blockheight": 6001338,
- "id": "11283CvjWWXesEPQxryZYxjBwTqFV7NMRw8reNGJfzQF4GvS",
- "totalReward": "5068047768"
- },
- {
- "blockheight": 6001338,
- "id": "112EHZp2Dn8jqW9iqpAUFW3ChmiiT6cMnN1arsqJtatnthfz",
- "totalReward": "503936239"
- },
- {
- "blockheight": 6001338,
- "id": "11agCcnJ8cYvKby6p27CiLxBaS1G1hnbRmwtUBAQ3beygUA",
- "totalReward": "1874696285"
- }
- ]
- }
+}
+```
+
+@tab result
+
+```
+{
+ "data": {
+ "sumRewards": {
+ "nodes": [
+ {
+ "blockheight": 6001338,
+ "id": "11283CvjWWXesEPQxryZYxjBwTqFV7NMRw8reNGJfzQF4GvS",
+ "totalReward": "5068047768"
+ },
+ {
+ "blockheight": 6001338,
+ "id": "112EHZp2Dn8jqW9iqpAUFW3ChmiiT6cMnN1arsqJtatnthfz",
+ "totalReward": "503936239"
+ },
+ {
+ "blockheight": 6001338,
+ "id": "11agCcnJ8cYvKby6p27CiLxBaS1G1hnbRmwtUBAQ3beygUA",
+ "totalReward": "1874696285"
+ }
+ ]
}
}
- ```
-
-
+}
+```
+:::
### References
diff --git a/docs/academy/herocourse/module6.md b/docs/academy/herocourse/module6.md
index 1eb36dcae4b..d426f4a8a7c 100644
--- a/docs/academy/herocourse/module6.md
+++ b/docs/academy/herocourse/module6.md
@@ -2,14 +2,14 @@
## Block v Events v Calls
-To process a SubQuery project and index data as fast and as efficient as possible, it is necessary to understand how things work under the covers.
+To process a SubQuery project and index data as quickly and efficiently as possible, it is necessary to understand how things work under the covers.
SubQuery has three handlers to process blockchain data: [block handlers](../../build/mapping/polkadot.md#block-handler), [event handlers](../../build/mapping/polkadot.md#event-handler), and [call handlers](../../build/mapping/polkadot.md#call-handler).
**Block handlers** are very inefficient. They inspect every single block to grab data to index. In a case with over seven million blocks, even at 100 ms per block this would take over eight (8) days to fully index the blockchain. Therefore, it is
advisable to avoid using block handlers if possible.
-**Event and call handlers** are the recommended handlers to use, in conjunction with mapping filters of course, as their performance is much better. The mapping filter allows the project to index only the blocks that satisfy the filter criteria.
+**Event and call handlers** are the recommended handlers to use, in conjunction with mapping filters of course, as their performance is much better. The mapping filter allows the project to index only the blocks that satisfy the filter criteria.
For example, below is a filter indexing the **staking** module and the **Rewarded** method.
@@ -21,20 +21,19 @@ For example, below is a filter indexing the **staking** module and the **Rewarde
method: Rewarded
```
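
Expanded into a full handler entry in `project.yaml`, that filter could look like the following sketch (the handler name is taken from the Module 5 exercises):

```yaml
handlers:
  - handler: handleSumRewarded
    kind: substrate/EventHandler
    filter:
      module: staking
      method: Rewarded
```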
-::: info Note
+::: tip Note
For even more performance gains, using a dictionary is highly recommended.
:::
-
## Using a Dictionary
-The concept of a **dictionary** was introduced in previous modules (For e.g. [Module 5 - Overriding Endpoints](../herocourse/module5.md#step-3-override-endpoints)).
+The concept of a **dictionary** was introduced in previous modules (For e.g. [Module 5 - Overriding Endpoints](../herocourse/module5.md#step-3-override-endpoints)).
Due to its importance, please review [Understanding how a dictionary works](../tutorials_examples/dictionary.md) and remember to include it in all your projects.
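
As a reminder, enabling a dictionary is usually just one extra line in the `network` section of the manifest. For Polkadot it might look like this sketch, using the endpoint and dictionary URL that appear elsewhere in these docs:

```yaml
network:
  genesisHash: "0x91b171bb158e2d3848fa23a9f1c25182fb8e20313b2c1eb49219da7a70ce90c3"
  endpoint: "wss://polkadot.api.onfinality.io/public-ws"
  # A full-chain dictionary endpoint dramatically speeds up indexing
  dictionary: "https://api.subquery.network/sq/subquery/dictionary-polkadot"
```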
## Event & Extrinsic Names
-A popular question while creating SubQuery projects is - how do you know what data you can extract from the Polkadot blockchain?
+A popular question while creating SubQuery projects is - how do you know what data you can extract from the Polkadot blockchain?
There are several resource options:
@@ -65,7 +64,7 @@ Note that not all explorers are equal. Some may be easier to use and some may re
The previous two methods of knowing what blockchain data is available, along with the **type** (which is just as important), are great. However, learning to connect directly to the **Polkadot API via command line** provides several advantages.
-To begin with, it provides access to the most up to date API specifications because the [documentation](https://polkadot.js.org/docs/substrate/events/) could be slightly outdated.
+To begin with, it provides access to the most up-to-date API specifications, because the [documentation](https://polkadot.js.org/docs/substrate/events/) could be slightly outdated.
Furthermore, it allows developers to understand the exact arguments and their types. This is essential when there are issues and debugging is required. And finally, it is very useful when integrating with custom chains where sometimes documentation is not available.
@@ -104,7 +103,7 @@ api = await ApiPromise.create({ provider });
#### Fetching a Block
-To get block hash at the height `h`, run:
+To get the block hash at height `h`, run:
**const blockHash = await api.rpc.chain.getBlockHash(h)**
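
For instance, a self-contained sketch of connecting to a public endpoint and fetching a block hash (assuming `@polkadot/api` is installed locally, and using an arbitrary block height) could be:

```ts
import { ApiPromise, WsProvider } from "@polkadot/api";

async function main(): Promise<void> {
  const provider = new WsProvider("wss://polkadot.api.onfinality.io/public-ws");
  const api = await ApiPromise.create({ provider });

  const h = 6001338; // any block height of interest
  const blockHash = await api.rpc.chain.getBlockHash(h);
  console.log(`Block hash at height ${h}: ${blockHash.toString()}`);

  await api.disconnect();
}

main().catch(console.error);
```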
@@ -138,13 +137,13 @@ To check the args (input for transaction) types, enter:
myExtrinsic.meta.args
```
-You should see a Vec/array. The size of the array means how many arg this extrinsics takes, and each arg metadata info should include 'name', 'type', 'typeName'.
+You should see a Vec/array. The size of the array indicates how many arguments this extrinsic takes, and each argument's metadata should include 'name', 'type', and 'typeName'.
We are looking for the `type`. For example, 'MultiAddress' is a type interface from **Polkadot/api**.
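
As a hedged example of what this inspection could look like in practice (using `balances.transfer` purely as an illustrative extrinsic, and assuming `api` was created as shown above):

```ts
const myExtrinsic = api.tx.balances.transfer;

myExtrinsic.meta.args.forEach((arg) => {
  // Prints the metadata for each argument, including its name, type and typeName
  console.log(arg.toHuman());
});
```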
#### Getting Events at a Certain Block Height
-Events cannot be extracted from a block, but they can be queried. Since we already have the `blockHash `(from above), we can **‘lock’** the current API to this particular block height.
+Events cannot be extracted from a block, but they can be queried. Since we already have the `blockHash` (from above), we can **‘lock’** the current API to this particular block height.
- Start with:
@@ -233,6 +232,6 @@ Using a smaller batch size can reduce memory usage and not leave users hanging f
Note that some events only start to occur at higher block heights. Hence, one way to test a mapping function faster is to adjust the starting block height. See [How to start at a different block height?](../tutorials_examples/block-height.md).
-
## Bonus Tutorial
-* [201 Lab List All Transaction Workbook](/assets/pdf/SubQuery_201_Lab_List_All_Transaction.pdf)
\ No newline at end of file
+
+- [201 Lab List All Transaction Workbook](/assets/pdf/SubQuery_201_Lab_List_All_Transaction.pdf)
diff --git a/docs/bg/README.md b/docs/bg/README.md
index c04866347f7..7628622107f 100644
--- a/docs/bg/README.md
+++ b/docs/bg/README.md
@@ -4,7 +4,7 @@
Build Faster dApps with SubQuery Academy
-
Explore and implement your own efficient custom open-source API between your decentralised data and tools to query data faster and save you time.
+
Explore and implement your own efficient custom open-source API between your decentralised data and tools to query data faster and save you time.
SubQuery now supports Polkadot, Avalanche, Cosmos, and Algorand.
@@ -12,7 +12,7 @@
Get a Kick-Start With Our Quick Start Guide
-
Build your first SubQuery project in less than 10 mins with simple guided steps.
+
Build your first SubQuery project in less than 10 mins with simple guided steps.
Start querying data for your dApps on your most loved blockchain network using our starter projects. Explore and modify important files, and understand how SubQuery works.
@@ -134,8 +134,7 @@
-
-
+
diff --git a/docs/bg/build/install.md b/docs/bg/build/install.md
index 3a828d5def1..cfbf1f5eeb1 100644
--- a/docs/bg/build/install.md
+++ b/docs/bg/build/install.md
@@ -8,21 +8,23 @@
Инсталирайте SubQuery CLI във вашия терминал, като използвате Yarn или NPM:
- ```bash npm install -g @subql/cli ``` ```shell yarn global add @subql/cli ``` You can then run help to see available commands and usage provide by CLI
+`bash npm install -g @subql/cli ` `shell yarn global add @subql/cli ` You can then run help to see available commands and usage provide by CLI
+
## Инсталиране на @subql/node
Нодата SubQuery е реализация, която извлича базирани върху субстрат блокчейн данни за проекта SubQuery и ги записва в базата данни на Postgres.
Инсталирайте нодата SubQuery node като използвате за целта терминалите Yarn или NPM:
- ```bash npm install -g @subql/node ```
- ```shell yarn global add @subql/node ```
+::: code-tabs @tab npm `bash npm install -g @subql/node `
+@tab:active yarn `shell yarn global add @subql/node ` :::
Once installed, you can can start a node with:
```shell
subql-node
```
+
> Забележка: Ако използвате Docker или хостинг на проекта в други SubQuery проекти, може да пропуснете тази стъпка. Причината е следната: нодата SubQuery вече е част от Docker контейнера и инфраструктурата на хостинга.
## Инсталиране на @subql/query
@@ -31,7 +33,7 @@ subql-node
Инсталирайте запитване SubQuery като използвате за целта терминалите Yarn или NPM:
- ```bash npm install -g @subql/query ```
- ```shell yarn global add @subql/query ```
+::: code-tabs @tab npm `bash npm install -g @subql/query `
+@tab:active yarn `shell yarn global add @subql/query ` :::
-> Забележка: Ако използвате Docker или хостинг на проекта в други SubQuery проекти, може да пропуснете тази стъпка. Причината е следната: нодата SubQuery вече е част от Docker контейнера и инфраструктурата на хостинга.
\ No newline at end of file
+> Забележка: Ако използвате Docker или хостинг на проекта в други SubQuery проекти, може да пропуснете тази стъпка. Причината е следната: нодата SubQuery вече е част от Docker контейнера и инфраструктурата на хостинга.
diff --git a/docs/bg/build/introduction.md b/docs/bg/build/introduction.md
index 1d6ec81f37b..bffe70d7dea 100644
--- a/docs/bg/build/introduction.md
+++ b/docs/bg/build/introduction.md
@@ -51,8 +51,8 @@ yarn codegen
Изпълнете командата за изграждане от основната директория на проекта.
- `shell yarn build `
- `bash npm run-script build `
+::: code-tabs @tab:active yarn `shell yarn build `
+@tab npm `bash npm run-script build ` :::
### Алтернативни опции за играждане
diff --git a/docs/bg/build/manifest.md b/docs/bg/build/manifest.md
index 7b398ff029f..3a22386841c 100644
--- a/docs/bg/build/manifest.md
+++ b/docs/bg/build/manifest.md
@@ -4,7 +4,7 @@
Манифестът може да бъде във формат YAML или JSON. В този документ ще използваме YAML във всички примери. По-долу е даден стандартен пример за основен `project.yaml`.
- ` yml specVersion: 0.2.0 name: example-project # version: 1.0.0 # Версия на проекта description: '' # Посочете името на проекта repository: 'https://github.com/subquery/subql-starter' # Git адрес на хранилището на вашия проект schema: file: ./schema.graphql # Местоположението на вашия файл със схема на GraphQL network: genesisHash: '0x91b171bb158e2d3848fa23a9f1c25182fb8e20313b2c1eb49219da7a70ce90c3' # Генезис хеш на мрежата endpoint: 'wss://polkadot.api.onfinality.io/public-ws' # По избор предоставете HTTP крайната точка на речник с пълна верига, за да ускорите обработката dictionary: 'https://api.subquery.network/sq/subquery/dictionary-polkadot' dataSources: - kind: substrate/Runtime startBlock: 1 # Това променя вашия начален блок за индексиране, задайте го по-високо, за да пропуснете първоначалните блокове с по-малко данни mapping: file: "./dist/index.js" handlers: - handler: handleBlock kind: substrate/BlockHandler - handler: handleEvent kind: substrate/EventHandler filter: #Filter is optional module: balances method: Deposit - handler: handleCall kind: substrate/CallHandler ```` ` yml specVersion: "0.0.1" description: '' # Описание на вашия проект repository: 'https://github.com/subquery/subql-starter' # Git адрес на хранилището на вашия проект schema: ./schema.graphql # Местоположението на вашия файл със схема на GraphQL network: endpoint: 'wss://polkadot.api.onfinality.io/public-ws' # По избор предоставете HTTP крайната точка на речник с пълна верига, за да ускорите обработката dictionary: 'https://api.subquery.network/sq/subquery/dictionary-polkadot' dataSources: - name: main kind: substrate/Runtime startBlock: 1 # Това променя вашия начален блок за индексиране, задайте го по-високо, за да пропуснете първоначалните блокове с по-малко данни mapping: handlers: - handler: handleBlock kind: substrate/BlockHandler - handler: handleEvent kind: substrate/EventHandler filter: #Филтърът е по избор, но се препоръчва за ускоряване на обработката на събития module: balances method: Deposit - handler: handleCall kind: substrate/CallHandler ````
+::: code-tabs @tab v0.2.0 ` yml specVersion: 0.2.0 name: example-project # version: 1.0.0 # Версия на проекта description: '' # Посочете името на проекта repository: 'https://github.com/subquery/subql-starter' # Git адрес на хранилището на вашия проект schema: file: ./schema.graphql # Местоположението на вашия файл със схема на GraphQL network: genesisHash: '0x91b171bb158e2d3848fa23a9f1c25182fb8e20313b2c1eb49219da7a70ce90c3' # Генезис хеш на мрежата endpoint: 'wss://polkadot.api.onfinality.io/public-ws' # По избор предоставете HTTP крайната точка на речник с пълна верига, за да ускорите обработката dictionary: 'https://api.subquery.network/sq/subquery/dictionary-polkadot' dataSources: - kind: substrate/Runtime startBlock: 1 # Това променя вашия начален блок за индексиране, задайте го по-високо, за да пропуснете първоначалните блокове с по-малко данни mapping: file: "./dist/index.js" handlers: - handler: handleBlock kind: substrate/BlockHandler - handler: handleEvent kind: substrate/EventHandler filter: #Filter is optional module: balances method: Deposit - handler: handleCall kind: substrate/CallHandler ```` @tab v0.0.1 ` yml specVersion: "0.0.1" description: '' # Описание на вашия проект repository: 'https://github.com/subquery/subql-starter' # Git адрес на хранилището на вашия проект schema: ./schema.graphql # Местоположението на вашия файл със схема на GraphQL network: endpoint: 'wss://polkadot.api.onfinality.io/public-ws' # По избор предоставете HTTP крайната точка на речник с пълна верига, за да ускорите обработката dictionary: 'https://api.subquery.network/sq/subquery/dictionary-polkadot' dataSources: - name: main kind: substrate/Runtime startBlock: 1 # Това променя вашия начален блок за индексиране, задайте го по-високо, за да пропуснете първоначалните блокове с по-малко данни mapping: handlers: - handler: handleBlock kind: substrate/BlockHandler - handler: handleEvent kind: substrate/EventHandler filter: #Филтърът е по избор, но се препоръчва за ускоряване на обработката на събития module: balances method: Deposit - handler: handleCall kind: substrate/CallHandler ```` :::
## Мигриране от v0.0.1 към v0.2.0
@@ -81,9 +81,9 @@ USAGE $ subql init [PROJECTNAME]
### Мапинг спецификации
-| Поле | v0.0.1 | v0.2.0 | Описание |
-| ---------------------- | --------------------------------------------------------------------------------- | ---------------------------------------------------------------------------------------------------------- | --------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
-| **file** | Низ | 𐄂 | Път до записа за мапинг |
+| Поле | v0.0.1 | v0.2.0 | Описание |
+| ---------------------- | --------------------------------------------------------------------------------- | ---------------------------------------------------------------------------------------------------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ |
+| **file** | Низ | 𐄂 | Път до записа за мапинг |
| **handlers & filters** | [Манипулатори и филтри по подразбиране](./manifest/#mapping-handlers-and-filters) | Манипулатори и филтри по подразбиране, [Персонализирани манипулатори и филтри](#custom-data-sources) | Избройте всички [мапинг функции](./mapping/polkadot.md) и съответните им типове манипулатори, с допълнителни филтри за мапинг.
За персонализирани манипулатори за мапинг по време на изпълнение, моля, вижте [Персонализирани източници на данни](#custom-data-sources) |
## Източници на данни и мапинг
@@ -104,8 +104,8 @@ dataSources:
**Вашият SubQuery проект ще бъде много по-ефективен, когато използвате само манипулатори на събития и повиквания с подходящи филтри за мапинг**
-| Манипулатор | Поддържан филтър |
-| ------------------------------------------ | ---------------------------- |
+| Манипулатор | Поддържан филтър |
+| --------------------------------------------------- | ---------------------------- |
| [BlockHandler](./mapping/polkadot.md#block-handler) | `specVersion` |
| [EventHandler](./mapping/polkadot.md#event-handler) | `module`,`method` |
| [CallHandler](./mapping/polkadot.md#call-handler) | `module`,`method` ,`success` |
@@ -153,8 +153,8 @@ dataSources:
В примера v0.2.0 по-долу, `network.chaintypes` сочат към файл, който включва всички персонализирани типове. Това е стандартен файл със спецификации на веригата, който декларира специфичните типове, поддържани от този блокчейн в `.json`, `.yaml` или `.js` формат.
- `yml network: genesisHash: '0x91b171bb158e2d3848fa23a9f1c25182fb8e20313b2c1eb49219da7a70ce90c3' endpoint: 'ws://host.kittychain.io/public-ws' chaintypes: file: ./types.json # относителният път на файла до мястото, където се съхраняват персонализирани типове ...`
- `yml ... network: endpoint: "ws://host.kittychain.io/public-ws" types: { "KittyIndex": "u32", "Kitty": "[u8; 16]" } # typesChain: { chain: { Type5: 'example' } } # typesSpec: { spec: { Type6: 'example' } } dataSources: - name: runtime kind: substrate/Runtime startBlock: 1 filter: #Опционално specName: kitty-chain mapping: handlers: - handler: handleKittyBred kind: substrate/CallHandler filter: module: kitties method: breed success: true`
+::: code-tabs @tab v0.2.0 `yml network: genesisHash: '0x91b171bb158e2d3848fa23a9f1c25182fb8e20313b2c1eb49219da7a70ce90c3' endpoint: 'ws://host.kittychain.io/public-ws' chaintypes: file: ./types.json # относителният път на файла до мястото, където се съхраняват персонализирани типове ...`
+@tab v0.0.1 `yml ... network: endpoint: "ws://host.kittychain.io/public-ws" types: { "KittyIndex": "u32", "Kitty": "[u8; 16]" } # typesChain: { chain: { Type5: 'example' } } # typesSpec: { spec: { Type6: 'example' } } dataSources: - name: runtime kind: substrate/Runtime startBlock: 1 filter: #Опционално specName: kitty-chain mapping: handlers: - handler: handleKittyBred kind: substrate/CallHandler filter: module: kitties method: breed success: true` :::
За да използвате машинопис за вашия файл с типове вериги, включете го в папка `src` (например `./src/types.ts`), стартирайте `yarn build` и след това посочете генерирания js файл, разположен в папка `dist`.
@@ -171,7 +171,7 @@ network:
Ето пример за файл с `.ts` типове вериги:
- `ts import { typesBundleDeprecated } from "moonbeam-types-bundle" export default { typesBundle: typesBundleDeprecated }; `
+::: code-tabs @tab types.ts `ts import { typesBundleDeprecated } from "moonbeam-types-bundle" export default { typesBundle: typesBundleDeprecated }; ` :::
## Персонализирани източници на данни
@@ -197,6 +197,6 @@ network:
По-долу е даден пример, който показва различни източници на данни за мрежите Polkadot и Kusama.
- `yaml --- network: endpoint: 'wss://polkadot.api.onfinality.io/public-ws' #Създайте шаблон, за да избегнете излишък определения: mapping: &mymapping handlers: - handler: handleBlock kind: substrate/BlockHandler dataSources: - name: polkadotRuntime kind: substrate/Runtime filter: #По избор specName: polkadot startBlock: 1000 mapping: *mymapping #use template here - name: kusamaRuntime kind: substrate/Runtime filter: specName: kusama startBlock: 12000 mapping: *mymapping # can reuse or change `
+::: code-tabs @tab v0.0.1 `yaml --- network: endpoint: 'wss://polkadot.api.onfinality.io/public-ws' #Създайте шаблон, за да избегнете излишък определения: mapping: &mymapping handlers: - handler: handleBlock kind: substrate/BlockHandler dataSources: - name: polkadotRuntime kind: substrate/Runtime filter: #По избор specName: polkadot startBlock: 1000 mapping: *mymapping #use template here - name: kusamaRuntime kind: substrate/Runtime filter: specName: kusama startBlock: 12000 mapping: *mymapping # can reuse or change `
-
+:::
diff --git a/docs/bg/build/mapping.md b/docs/bg/build/mapping.md
index 33a72e2788b..deef13ed41b 100644
--- a/docs/bg/build/mapping.md
+++ b/docs/bg/build/mapping.md
@@ -13,13 +13,13 @@
Можете да използвате манипулатори на блокове, за да улавяте информация всеки път, когато нов блок е прикачен към веригата на Substrate, напр. номер на блока. За да се постигне това, дефиниран BlockHandler ще бъде заявен веднъж за всеки блок.
```ts
-import {SubstrateBlock} from "@subql/types";
+import { SubstrateBlock } from "@subql/types";
export async function handleBlock(block: SubstrateBlock): Promise<void> {
- // Create a new StarterEntity with the block hash as it's ID
- const record = new starterEntity(block.block.header.hash.toString());
- record.field1 = block.block.header.number.toNumber();
- await record.save();
+ // Create a new StarterEntity with the block hash as it's ID
+ const record = new starterEntity(block.block.header.hash.toString());
+ record.field1 = block.block.header.number.toNumber();
+ await record.save();
}
```
@@ -51,25 +51,30 @@ export async function handleEvent(event: SubstrateEvent): Promise {
```ts
export async function handleCall(extrinsic: SubstrateExtrinsic): Promise<void> {
- const record = new starterEntity(extrinsic.block.block.header.hash.toString());
- record.field4 = extrinsic.block.timestamp;
- await record.save();
+ const record = new starterEntity(
+ extrinsic.block.block.header.hash.toString()
+ );
+ record.field4 = extrinsic.block.timestamp;
+ await record.save();
}
```
[SubstrateExtrinsic](https://github.com/OnFinality-io/subql/blob/a5ab06526dcffe5912206973583669c7f5b9fdc9/packages/types/src/interfaces.ts#L21) разширява [GenericExtrinsic](https://github.com/polkadot-js/api/blob/a9c9fb5769dec7ada8612d6068cf69de04aa15ed/packages/types/src/extrinsic/Extrinsic.ts#L170). Прилага му се `id` (блокът, към който принадлежи този външен елемент) и предоставя външен елемент, който разширява събитията между този блок. Освен това, той записва статуса на успех на този външен елемент.
## Състояния на заявка
+
Нашата цел е да покрием всички източници на данни за потребители за манипулатори на мапинг (повече от трите типа събития на интерфейса по-горе). Ето защо ние разкрихме някои от @polkadot/api интерфейсите, за да увеличим възможностите.
Това са интерфейсите, които в момента поддържаме:
-- [api.query.<module>.<method>()](https://polkadot.js.org/docs/api/start/api.query) ще направи заявка към текущия блок.
-- [api.query.<module>.<method>.multi()](https://polkadot.js.org/docs/api/start/api.query.multi/#multi-queries-same-type) ще направи множество заявки от един тип към текущия блок.
-- [api.queryMulti()](https://polkadot.js.org/docs/api/start/api.query.multi/#multi-queries-distinct-types) ще направи множество заявки от различен тип към текущия блок.
+
+- [api.query.<module>.<method>()](https://polkadot.js.org/docs/api/start/api.query) ще направи заявка към **текущия** блок.
+- [api.query.<module>.<method>.multi()](https://polkadot.js.org/docs/api/start/api.query.multi/#multi-queries-same-type) ще направи множество заявки от **един** тип към текущия блок.
+- [api.queryMulti()](https://polkadot.js.org/docs/api/start/api.query.multi/#multi-queries-distinct-types) ще направи множество заявки от **различен** тип към текущия блок.
Това са интерфейсите, които **НЕ ** поддържаме в момента:
-- ~~api.tx.*~~
-- ~~api.derive.*~~
+
+- ~~api.tx.\*~~
+- ~~api.derive.\*~~
- ~~api.query.<module>.<method>.at~~
- ~~api.query.<module>.<method>.entriesAt~~
- ~~api.query.<module>.<method>.entriesPaged~~
@@ -97,6 +102,7 @@ const b1 = await api.rpc.chain.getBlock(blockhash);
// Ще използва текущия блок по подразбиране, по този начин:
const b2 = await api.rpc.chain.getBlock();
```
+
- За [Персонализирани Substrate Вериги](#custom-substrate-chains) RPC повиквания, вижте [употреба](#usage).
## Модули и библиотеки
@@ -147,6 +153,7 @@ SubQuery може да се използва във всяка верига, б
```shell
curl -H "Content-Type: application/json" -d '{"id":"1", "jsonrpc":"2.0", "method": "state_getMetadata", "params":[]}' http://localhost:9933
```
+
или от неговата **websocket** крайна точка с помощта на [`websocat`](https://github.com/vi/websocat):
```shell
@@ -160,46 +167,49 @@ echo state_getMetadata | websocat 'ws://127.0.0.1:9944' --jsonrpc
След това копирайте и поставете резултата в JSON файл. В нашия пример [kitty](https://github.com/subquery/tutorials-kitty-chain), създадохме `api-interface/kitty.json`.
#### Дефиниции на типове
+
Предполагаме, че потребителят познава специфичните типове и RPC поддръжка от веригата, и това е дефинирано в [Манифеста](./manifest.md).
Следвайки [видовете сетъпи](https://polkadot.js.org/docs/api/examples/promise/typegen#metadata-setup), създаваме :
+
- `src/api-interfaces/definitions.ts` - това експортира всички дефиниции на подпапки
```ts
-export { default as kitties } from './kitties/definitions';
+export { default as kitties } from "./kitties/definitions";
```
- `src/api-interfaces/kitties/definitions.ts` - дефиниции на типа за модула Kitties
+
```ts
export default {
- // custom types
- types: {
- Address: "AccountId",
- LookupSource: "AccountId",
- KittyIndex: "u32",
- Kitty: "[u8; 16]"
+ // custom types
+ types: {
+ Address: "AccountId",
+ LookupSource: "AccountId",
+ KittyIndex: "u32",
+ Kitty: "[u8; 16]",
+ },
+ // custom rpc : api.rpc.kitties.getKittyPrice
+ rpc: {
+ getKittyPrice: {
+ description: "Get Kitty price",
+ params: [
+ {
+ name: "at",
+ type: "BlockHash",
+ isHistoric: true,
+ isOptional: false,
+ },
+ {
+ name: "kittyIndex",
+ type: "KittyIndex",
+ isOptional: false,
+ },
+ ],
+ type: "Balance",
},
- // custom rpc : api.rpc.kitties.getKittyPrice
- rpc: {
- getKittyPrice:{
- description: 'Get Kitty price',
- params: [
- {
- name: 'at',
- type: 'BlockHash',
- isHistoric: true,
- isOptional: false
- },
- {
- name: 'kittyIndex',
- type: 'KittyIndex',
- isOptional: false
- }
- ],
- type: 'Balance'
- }
- }
-}
+ },
+};
```
#### Пакети
@@ -251,28 +261,32 @@ yarn generate:meta
```json
{
"compilerOptions": {
- // this is the package name we use (in the interface imports, --package for generators) */
- "kitty-birthinfo/*": ["src/*"],
- // here we replace the @polkadot/api augmentation with our own, generated from chain
- "@polkadot/api/augment": ["src/interfaces/augment-api.ts"],
- // replace the augmented types with our own, as generated from definitions
- "@polkadot/types/augment": ["src/interfaces/augment-types.ts"]
- }
+ // this is the package name we use (in the interface imports, --package for generators) */
+ "kitty-birthinfo/*": ["src/*"],
+ // here we replace the @polkadot/api augmentation with our own, generated from chain
+ "@polkadot/api/augment": ["src/interfaces/augment-api.ts"],
+ // replace the augmented types with our own, as generated from definitions
+ "@polkadot/types/augment": ["src/interfaces/augment-types.ts"]
+ }
}
```
### Използване
Сега във функцията за преобразуване можем да покажем как метаданните и типовете всъщност декорират API. RPC крайната точка ще поддържа модулите и методите, които декларирахме по-горе. И за да използвате персонализирано rpc повикване, моля, вижте раздел [Персонализирани верижни rpc повиквания](#custom-chain-rpc-calls)
+
```typescript
export async function kittyApiHandler(): Promise<void> {
- //return the KittyIndex type
- const nextKittyId = await api.query.kitties.nextKittyId();
- // return the Kitty type, input parameters types are AccountId and KittyIndex
- const allKitties = await api.query.kitties.kitties('xxxxxxxxx',123)
- logger.info(`Next kitty id ${nextKittyId}`)
- //Custom rpc, set undefined to blockhash
- const kittyPrice = await api.rpc.kitties.getKittyPrice(undefined,nextKittyId);
+ //return the KittyIndex type
+ const nextKittyId = await api.query.kitties.nextKittyId();
+ // return the Kitty type, input parameters types are AccountId and KittyIndex
+ const allKitties = await api.query.kitties.kitties("xxxxxxxxx", 123);
+ logger.info(`Next kitty id ${nextKittyId}`);
+ //Custom rpc, set undefined to blockhash
+ const kittyPrice = await api.rpc.kitties.getKittyPrice(
+ undefined,
+ nextKittyId
+ );
}
```
@@ -281,6 +295,7 @@ export async function kittyApiHandler(): Promise {
### Rpc повиквания в персонализирана верига
За да поддържаме персонализирани верижни RPC преобразувания, трябва ръчно да вкараме RPC дефиниции за `typesBundle`, позволявайки конфигурация по спецификация. Можете да дефинирате `typesBundle` в `project.yml`. И моля, не забравяйте, че се поддържат само повиквания тип `isHistoric`.
+
```yaml
...
types: {
diff --git a/docs/bg/build/substrate-evm.md b/docs/bg/build/substrate-evm.md
index c17babea6cd..164979c6233 100644
--- a/docs/bg/build/substrate-evm.md
+++ b/docs/bg/build/substrate-evm.md
@@ -74,7 +74,7 @@
| ------ | --------------- | --------------------------------------------------------------- | -------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| topics | Масив на низове | Transfer(address indexed from,address indexed to,uint256 value) | Филтърът за теми следва филтрите за регистрационни файлове на Ethereum JSON-PRC, повече документация можете да намерите [тук](https://docs.ethers.io/v5/concepts/events/). |
-Бележка по теми:
+**Бележка по теми:**
Има няколко подобрения от основните филтри за регистрационни файлове:
- Темите не трябва да са подплатени с 0
@@ -97,17 +97,17 @@ dataSources:
- kind: substrate/Moonbeam
startBlock: 752073
processor:
- file: './node_modules/@subql/contract-processors/dist/moonbeam.js'
+ file: "./node_modules/@subql/contract-processors/dist/moonbeam.js"
options:
# Must be a key of assets
abi: erc20
# Contract address (or recipient if transfer) to filter, if `null` should be for contract creation
- address: '0x6bd193ee6d2104f14f94e2ca6efefae561a4334b'
+ address: "0x6bd193ee6d2104f14f94e2ca6efefae561a4334b"
assets:
erc20:
- file: './erc20.abi.json'
+ file: "./erc20.abi.json"
mapping:
- file: './dist/index.js'
+ file: "./dist/index.js"
handlers:
- handler: handleMoonriverEvent
kind: substrate/MoonbeamEvent
@@ -122,7 +122,7 @@ dataSources:
# function: '0x7ff36ab500000000000000000000000000000000000000000000000000000000'
# function: approve(address,uint256)
function: approve(address to,uint256 value)
- from: '0x6bd193ee6d2104f14f94e2ca6efefae561a4334b'
+ from: "0x6bd193ee6d2104f14f94e2ca6efefae561a4334b"
```
## Известни ограничения
diff --git a/docs/bg/faqs/faqs.md b/docs/bg/faqs/faqs.md
index 6084f81c607..1a3dc4df908 100644
--- a/docs/bg/faqs/faqs.md
+++ b/docs/bg/faqs/faqs.md
@@ -16,7 +16,7 @@ SubQuery също така предоставя безплатен хостин
**Мрежата на SubQuery**
-SubQuery Network позволява на разработчиците напълно да децентрализират своят инфраструктурен стек. Това е най-отворена, производителна, надеждна и мащабируема услуга за данни за dApps. SubQuery Network индексира и предоставя данни за глобалната общност по стимулиран начин, който подлежи на проверка. След като публикувате вашия проект в SubQuery Network, всеки получава възможност да го индексира и хоства - предоставяйки данни на потребителите по целия свят по-бързо и надеждно.
+SubQuery Network позволява на разработчиците напълно да децентрализират своят инфраструктурен стек. Това е най-отворена, производителна, надеждна и мащабируема услуга за данни за dApps. SubQuery Network индексира и предоставя данни за глобалната общност по стимулиран начин, който подлежи на проверка. След като публикувате вашия проект в SubQuery Network, всеки получава възможност да го индексира и хоства - предоставяйки данни на потребителите по целия свят по-бързо и надеждно.
Повече информация [тук](/subquery_network/introduction.md).
@@ -26,7 +26,7 @@ SubQuery Network позволява на разработчиците напъл
## По какъв начин мога да допринеса или да дам обратна връзка към SubQuery?
-Ние харесваме приноса и обратната връзка от общността. За да използвате с ваш код, направете "форк" на интересуващото ви хранилището и направете промените си. След това изпратете PR или Pull Request. Не забравяйте да тествате също. Проверете също така нашето ръководство за участие.
+Ние харесваме приноса и обратната връзка от общността. За да използвате с ваш код, направете "форк" на интересуващото ви хранилището и направете промените си. След това изпратете PR или Pull Request. Не забравяйте да тествате също. Проверете също така нашето [ръководство за участие](../miscellaneous/contributing.html).
За да дадете обратна връзка, свържете се с нас на hello@subquery.network или преминете към нашия [канал на discord](https://discord.com/invite/78zg8aBSMG).
@@ -76,7 +76,6 @@ subql-node -f . --force-clean --subquery-name=
Обърнете внимание, че се препоръчва да използвате `--force-clean`, когато променяте `startBlock` в манифеста на проекта (`project.yaml`), за да започнете преиндексиране от конфигурирания блок. Ако `startBlock` се промени без `--force-clean` на проекта, тогава индексаторът ще продължи да индексира с предварително конфигурирания `startBlock`.
-
## Как мога да оптимизирам проекта си, за да го направя по-бърз?
Производителността решаващ фактор във всеки един проект. За щастие има няколко неща, които можете да направите, за подобряването и. Представяме Ви списък с някои предложения:
@@ -89,13 +88,13 @@ subql-node -f . --force-clean --subquery-name=
- Задайте началния блок в момента, в който контракта е инициализиран.
- Винаги използвайте [dictionary](../tutorials_examples/dictionary.html#how-does-a-subquery-dictionary-work) (можем да ви помогнем да създадете такъв за вашата нова мрежа).
- Оптимизирайте дизайна на вашата схема, организирайте я по възможно най-опростен начин.
- - Опитайте се да намалите ненужните полета и колони.
- - Създайте толкова индекси, колкото е необхидимо.
+ - Опитайте се да намалите ненужните полета и колони.
+ - Създайте толкова индекси, колкото е необхидимо.
- Използвайте паралелна/групова обработка възможно най-често.
- - Използвайте `api.queryMulti()` за оптимизиране на Polkadot API повикванията вътре във функциите за картографиране и поисквайте ги периодично. Това е по-бързият начин.
- - Използвайте `Promise.all()`. В случай на множество асинхронни функции е по-добре да ги изпълните и разрешите паралелно.
- - Ако искате да създадете много обекти в рамките на един манипулатор, можете да използвате `store.bulkCreate(entityName: string, entities: Entity[])`. Можете да ги създавате паралелно, няма нужда да правите това по отделно.
+ - Използвайте `api.queryMulti()` за оптимизиране на Polkadot API повикванията вътре във функциите за картографиране и поисквайте ги периодично. Това е по-бързият начин.
+ - Използвайте `Promise.all()`. В случай на множество асинхронни функции е по-добре да ги изпълните и разрешите паралелно.
+ - Ако искате да създадете много обекти в рамките на един манипулатор, можете да използвате `store.bulkCreate(entityName: string, entities: Entity[])`. Можете да ги създавате паралелно, няма нужда да правите това по отделно.
- Осъществяването на API повиквания към състояние на заявка може да бъде бавно. Можете да опитате да сведете до минимум повикванията, където е възможно, и да използвате данни за `външни/транзакции/събития`.
- Използвайте `worker threads`, за да преместите обработката на блоковете в собствена работна нишка. Може да се ускори индексирането до 4 пъти (в зависимост от конкретния проект). Можете лесно да активирате това с помощта на flag `-workers=`. Имайте предвид, че броят на наличните процесорни ядра стриктно ограничава използването на работните потоци. Засега, тези функции са налични само за Substrate и Cosmos и скоро ще бъдат интегрирани към Avalanche.
- Забележете, че `JSON.stringify` не подкрепя нативния `BigInts`. Нашата библиотека за регистриране ще направи това вътрешно, при опит за регистрирате на обект. Търсим заобиколно решение за това.
-- Използвайте удобен филтър `modulo`, за да стартирате манипулатор само веднъж към определен блок. Този филтър позволява обработка на произволен брой блокове, което е изключително полезно за групиране и изчисляване на данни на зададен интервал. Например, ако модулът е зададен на 50, манипулаторът на блокове ще работи на всеки 50 блока. Той осигурява дори повече контрол върху индексирането на данни на разработчиците и може да бъде внедрен като такъв в манифеста на вашия проект.
\ No newline at end of file
+- Използвайте удобен филтър `modulo`, за да стартирате манипулатор само веднъж към определен блок. Този филтър позволява обработка на произволен брой блокове, което е изключително полезно за групиране и изчисляване на данни на зададен интервал. Например, ако модулът е зададен на 50, манипулаторът на блокове ще работи на всеки 50 блока. Той осигурява дори повече контрол върху индексирането на данни на разработчиците и може да бъде внедрен като такъв в манифеста на вашия проект.
diff --git a/docs/bg/quickstart/helloworld-localhost.md b/docs/bg/quickstart/helloworld-localhost.md
index 0861fbf269e..75d8a509781 100644
--- a/docs/bg/quickstart/helloworld-localhost.md
+++ b/docs/bg/quickstart/helloworld-localhost.md
@@ -88,8 +88,8 @@ cd subqlHelloWorld
Сега направете инсталация на yarn или node, за да инсталирате различните зависимости.
- ```shell yarn install ```
- ```bash npm install ```
+::: code-tabs @tab:active yarn `shell yarn install `
+@tab npm `bash npm install ` :::
Пример за `yarn install`
@@ -109,8 +109,8 @@ success Saved lockfile.
Сега стартирайте `yarn codegen`, за да генерирате Typescript от схемата GraphQL.
- ```shell yarn codegen ```
- ```bash npm run-script codegen ```
+::: code-tabs @tab:active yarn `shell yarn codegen `
+@tab npm `bash npm run-script codegen ` :::
Пример за `yarn codegen`
@@ -133,8 +133,8 @@ $ ./node_modules/.bin/subql codegen
Следващата стъпка е да създадете код с `yarn build`.
- ```shell yarn build ```
- ```bash npm run-script build ```
+::: code-tabs @tab:active yarn `shell yarn build `
+@tab npm `bash npm run-script build ` :::
Пример за `yarn build`
diff --git a/docs/bg/quickstart/quickstart-avalanche.md b/docs/bg/quickstart/quickstart-avalanche.md
index 21a2de51b4a..2fa7ff6eda0 100644
--- a/docs/bg/quickstart/quickstart-avalanche.md
+++ b/docs/bg/quickstart/quickstart-avalanche.md
@@ -59,8 +59,8 @@ subql init
И накрая, в директорията на проекта изпълнете следната команда, за да инсталирате зависимостите на новия проект.
- ```shell cd PROJECT_NAME yarn install ```
- ```shell cd PROJECT_NAME npm install ```
+::: code-tabs @tab:active yarn `shell cd PROJECT_NAME yarn install `
+@tab npm `shell cd PROJECT_NAME npm install ` :::
## Внасяне на промени във вашия проект
@@ -92,8 +92,8 @@ type PangolinApproval @entity {
**Важно: Когато правите промени във файла schema, моля, уверете се, че отново сте създали директорията си с типове със следната команда yarn codegen. Направете го сега.**
- ```shell yarn codegen ```
- ```shell npm run-script codegen ```
+::: code-tabs @tab:active yarn `shell yarn codegen `
+@tab npm `shell npm run-script codegen ` :::
Ще намерите генерираните модели в директорията `/src/types/models`. За повече информация относно файла `schema.graphql`, проверете нашата документация в раздела [Build/GraphQL Schema](../build/graphql.md)
@@ -169,7 +169,7 @@ export async function handleLog(event: AvalancheLog): Promise {
За да стартираме вашия нов проект SubQuery, първо трябва да изградим нашата работа. Изпълнете командата за изграждане от основната директория на проекта.
- ```shell yarn build ``` ```shell npm run-script build ```
+::: code-tabs @tab:active yarn `shell yarn build ` @tab npm `shell npm run-script build ` :::
**Важно: Всеки път, когато правите промени във вашите функции за картографиране, ще трябва да изградите отново своя проект**
@@ -183,13 +183,11 @@ export async function handleLog(event: AvalancheLog): Promise {
В директорията на проекта изпълнете следната команда:
- ```shell yarn start:docker ``` ```shell npm run-script start:docker ```
+::: code-tabs @tab:active yarn `shell yarn start:docker ` @tab npm `shell npm run-script start:docker ` :::
Може да отнеме известно време изтеглянето на необходимите пакети ([`@subql/node`](https://www.npmjs.com/package/@subql/node),
-`@subql/query`7 > и Postgres) за първи път, но скоро ще видите работещ нод на SubQuery. Бъдете търпеливи.
-
-
+`@subql/query` и Postgres) за първи път, но скоро ще видите работещ нод на SubQuery. Бъдете търпеливи.
### Направете заявка за вашият проект
@@ -199,8 +197,6 @@ export async function handleLog(event: AvalancheLog): Promise {
За нов стартов SubQuery проект можете да опитате следната заявка, за да получите представа как работи или [научете повече относно езика за заявки GraphQL ](../run_publish/graphql.md).
-
-
```graphql
query {
pangolinApprovals(first: 5) {
@@ -217,17 +213,12 @@ query {
}
```
-
-
-
### Публикувайте своя SubQuery проект
SubQuery предоставя безплатна управлявана услуга, с помощта който можете да разгърнете новия си проект. Може да го разгърнете в [SubQuery Projects](https://project.subquery.network) и да направите запитване с помощта на нашият [Explorer](https://explorer.subquery.network).
[Прочетете ръководството за публикуване на новия си проект в SubQuery Projects](../run_publish/publish.md), **Обърнете внимание, че трябва да внедрите чрез IPFS**.
-
-
## Следващите стъпки
Поздравления, вече имате локално работещ SubQuery проект, който приема заявки за GraphQL API за прехвърляне на данни от bLuna.
diff --git a/docs/bg/quickstart/quickstart-cosmos.md b/docs/bg/quickstart/quickstart-cosmos.md
index dedc07c4dbb..4bff2ca10cd 100644
--- a/docs/bg/quickstart/quickstart-cosmos.md
+++ b/docs/bg/quickstart/quickstart-cosmos.md
@@ -44,8 +44,8 @@ Cosmos is not yet supported in SubQuery's CLI (`subql`), to start with Juno clon
И накрая, в директорията на проекта изпълнете следната команда, за да инсталирате зависимостите на новия проект.
- ```shell cd PROJECT_NAME yarn install ```
- ```shell cd PROJECT_NAME npm install ```
+::: code-tabs @tab:active yarn `shell cd PROJECT_NAME yarn install `
+@tab npm `shell cd PROJECT_NAME npm install ` :::
## Промени във вашият проект
@@ -75,8 +75,8 @@ type Vote @entity {
**Важно: Когато правите промени във файла schema, моля, уверете се, че отново сте създали директорията си с типове със следната команда yarn codegen. Направете го сега.**
- ```shell yarn codegen ```
- ```shell npm run-script codegen ```
+::: code-tabs @tab:active yarn `shell yarn codegen `
+@tab npm `shell npm run-script codegen ` :::
Ще намерите генирираните модели в директорията `/src/types/models`. За повече информация относно файла `schema.graphql`, проверете нашата документация в раздела [Build/GraphQL Schema](../build/graphql.md)
@@ -145,7 +145,7 @@ What this is doing is receiving a CosmosMessage which includes message data on t
За да стартираме вашия нов проект SubQuery, първо трябва да изградим нашата работа. Изпълнете командата за изграждане от основната директория на проекта.
- ```shell yarn build ``` ```shell npm run-script build ```
+::: code-tabs @tab:active yarn `shell yarn build ` @tab npm `shell npm run-script build ` :::
** Важно: Всеки път, когато правите промени във вашите mapping функции, ще трябва да изградите отново своя проект**
@@ -159,7 +159,7 @@ What this is doing is receiving a CosmosMessage which includes message data on t
В директорията на проекта изпълнете следната команда:
- ```shell yarn start:docker ``` ```shell npm run-script start:docker ```
+::: code-tabs @tab:active yarn `shell yarn start:docker ` @tab npm `shell npm run-script start:docker ` :::
Изтеглянето на необходимите пакети може да отнеме известно време ([`@subql/node`](https://www.npmjs.com/package/@subql/node), [`@subql/query`](https://www.npmjs.com/package/@subql/query), and Postgres) за първи път, но скоро ще видите работеща нода SubQuery. Бъдете търпеливи.
@@ -173,10 +173,9 @@ What this is doing is receiving a CosmosMessage which includes message data on t
```graphql
query {
- votes(
+ votes(
first: 5
- orderBy: BLOCK_HEIGHT_DESC
- # filter: {proposalID: {equalTo: "4"}}
+ orderBy: BLOCK_HEIGHT_DESC # filter: {proposalID: {equalTo: "4"}}
) {
nodes {
id
diff --git a/docs/bg/quickstart/quickstart-polkadot.md b/docs/bg/quickstart/quickstart-polkadot.md
index dccfdc99577..d98fa55061f 100644
--- a/docs/bg/quickstart/quickstart-polkadot.md
+++ b/docs/bg/quickstart/quickstart-polkadot.md
@@ -43,10 +43,10 @@ subql init
Ще ви бъдат зададени някои въпроси, по време на инициализирането на проекта SubQuery:
- Име на проект: Име на проект за вашия проект SubQuery
-- Мрежово семейство: Семейството на блокчейн от слой 1, което този проект SubQuery ще бъде разработен за индексиране. Използвайте клавишите със стрелки, за да изберете от наличните опции. За това ръководство ще използваме *"Substrate"*
-- Мрежа: Конкретната мрежа, която този проект SubQuery ще бъде разработен за индексиране. Използвайте клавишите със стрелки, за да изберете от наличните опции. За това ръководство ще използваме *"Polkadot"*
-- Шаблонен проект: Изберете шаблонен проект на SubQuery, който ще осигури отправна точка за започване на разработка. Предлагаме да изберете проекта *"subql-starter"*.
-- RPC крайна точка: Предоставете HTTPS URL на работеща RPC крайна точка, която ще се използва по подразбиране за този проект. Можете бързо да получите достъп до публични крайни точки за различни мрежи на Polkadot, да създадете свой собствен частен специален Нод с помощта на [OnFinality](https://app.onfinality.io) или просто да използвате крайната точка на Polkadot по подразбиране. Този вид нода RPC трябва да представлява архивна нода (да има състояние на пълна веригата). За това ръководство ще използваме стойността по подразбиране *"https://polkadot.api.onfinality.io"*
+- Мрежово семейство: Семейството на блокчейн от слой 1, което този проект SubQuery ще бъде разработен за индексиране. Използвайте клавишите със стрелки, за да изберете от наличните опции. За това ръководство ще използваме _"Substrate"_
+- Мрежа: Конкретната мрежа, която този проект SubQuery ще бъде разработен за индексиране. Използвайте клавишите със стрелки, за да изберете от наличните опции. За това ръководство ще използваме _"Polkadot"_
+- Шаблонен проект: Изберете шаблонен проект на SubQuery, който ще осигури отправна точка за започване на разработка. Предлагаме да изберете проекта _"subql-starter"_.
+- RPC крайна точка: Предоставете HTTPS URL на работеща RPC крайна точка, която ще се използва по подразбиране за този проект. Можете бързо да получите достъп до публични крайни точки за различни мрежи на Polkadot, да създадете свой собствен частен специален Нод с помощта на [OnFinality](https://app.onfinality.io) или просто да използвате крайната точка на Polkadot по подразбиране. Този вид нода RPC трябва да представлява архивна нода (да има състояние на пълна веригата). За това ръководство ще използваме стойността по подразбиране _"https://polkadot.api.onfinality.io"_
- Git хранилище: Предоставете URL на Git към репозитория, в която този проект SubQuery ще бъде хостван (когато се хоства в SubQuery Explorer) или приемете предоставеното по подразбиране.
- Автори: Въведете собственика на този проект SubQuery тук (например вашето име!) или приемете предоставеното по подразбиране.
- Описание: Предоставете кратък параграф за вашия проект, който описва какви данни съдържа и какво могат да правят потребителите с него или да приемат предоставеното по подразбиране.
@@ -57,8 +57,8 @@ subql init
Накрая, под директорията на проекта, изпълнете следната команда, за да инсталирате зависимостите на новия проект.
- ```shell cd PROJECT_NAME yarn install ```
- ```shell cd PROJECT_NAME npm install ```
+::: code-tabs @tab:active yarn `shell cd PROJECT_NAME yarn install `
+@tab npm `shell cd PROJECT_NAME npm install ` :::
## Внасяне на промени във вашия проект
@@ -88,8 +88,8 @@ type Transfer @entity {
**Важно: Когато правите промени във файла schema, моля, уверете се, че отново сте създали директорията си с типове със следната команда yarn codegen.**
- ```shell yarn codegen ```
- ```shell npm run-script codegen ```
+::: code-tabs @tab:active yarn `shell yarn codegen `
+@tab npm `shell npm run-script codegen ` :::
Ще намерите генерираните модели в директорията `/src/types/models`. За повече информация относно файла `schema.graphql`, проверете нашата документация в раздела [Build/GraphQL Schema](../build/graphql.md)
@@ -133,22 +133,22 @@ import { Transfer } from "../types";
import { Balance } from "@polkadot/types/interfaces";
export async function handleEvent(event: SubstrateEvent): Promise<void> {
- // Get data from the event
- // The balances.transfer event has the following payload \[from, to, value\]
- // logger.info(JSON.stringify(event));
- const from = event.event.data[0];
- const to = event.event.data[1];
- const amount = event.event.data[2];
-
- // Create the new transfer entity
- const transfer = new Transfer(
- `${event.block.block.header.number.toNumber()}-${event.idx}`,
- );
- transfer.blockNumber = event.block.block.header.number.toBigInt();
- transfer.from = from.toString();
- transfer.to = to.toString();
- transfer.amount = (amount as Balance).toBigInt();
- await transfer.save();
+ // Get data from the event
+ // The balances.transfer event has the following payload \[from, to, value\]
+ // logger.info(JSON.stringify(event));
+ const from = event.event.data[0];
+ const to = event.event.data[1];
+ const amount = event.event.data[2];
+
+ // Create the new transfer entity
+ const transfer = new Transfer(
+ `${event.block.block.header.number.toNumber()}-${event.idx}`
+ );
+ transfer.blockNumber = event.block.block.header.number.toBigInt();
+ transfer.from = from.toString();
+ transfer.to = to.toString();
+ transfer.amount = (amount as Balance).toBigInt();
+ await transfer.save();
}
```
@@ -160,7 +160,7 @@ export async function handleEvent(event: SubstrateEvent): Promise {
За да стартираме вашия нов проект SubQuery, първо трябва да изградим нашата работа. Изпълнете командата за изграждане от основната директория на проекта.
- ```shell yarn build ``` ```shell npm run-script build ```
+::: code-tabs @tab:active yarn `shell yarn build ` @tab npm `shell npm run-script build ` :::
**Важно: Всеки път, когато правите промени във вашите функции за картографиране, ще трябва да изградите отново своя проект**
@@ -174,7 +174,7 @@ export async function handleEvent(event: SubstrateEvent): Promise {
Под директорията на проекта изпълнете следната команда:
- ```shell yarn start:docker ``` ```shell npm run-script start:docker ```
+::: code-tabs @tab:active yarn `shell yarn start:docker ` @tab npm `shell npm run-script start:docker ` :::
Може да отнеме известно време изтеглянето на необходимите пакети ([`@subql/node`](https://www.npmjs.com/package/@subql/node), [`@subql/query`](https://www.npmjs.com/package/@subql/query) и Postgres) за първи път, но скоро трябва да видите работещ нод на SubQuery на екрана на терминала.
@@ -189,10 +189,7 @@ export async function handleEvent(event: SubstrateEvent): Promise {
```graphql
{
query {
- transfers(
- first: 10,
- orderBy: AMOUNT_DESC
- ) {
+ transfers(first: 10, orderBy: AMOUNT_DESC) {
nodes {
id
amount
diff --git a/docs/bg/quickstart/quickstart-terra.md b/docs/bg/quickstart/quickstart-terra.md
index d721a55de40..88861659258 100644
--- a/docs/bg/quickstart/quickstart-terra.md
+++ b/docs/bg/quickstart/quickstart-terra.md
@@ -45,11 +45,11 @@ subql init
Ще ви бъдат зададени някои въпроси, по време на инициализирането на проекта SubQuery:
- Project Name: A name for your SubQuery project
-- Мрежово семейство: Блокчейн мрежовото семейство от слой 1, което този проект SubQuery ще бъде разработен за индексиране, използвайте клавишите със стрелки на клавиатурата си, за да изберете от опциите, за това ръководство ще използваме *„Terra“*
-- Мрежа: Конкретната мрежа, която този проект SubQuery ще бъде разработен за индексиране, използвайте клавишите със стрелки на клавиатурата си, за да изберете от опциите, за това ръководство ще използваме *„Terra“*
-- Template: Изберете шаблон за проекта SubQuery, който ще служи като начална точка за започване на разработка, предлагаме да изберете *"Starter project"*
+- Мрежово семейство: Блокчейн мрежовото семейство от слой 1, което този проект SubQuery ще бъде разработен за индексиране, използвайте клавишите със стрелки на клавиатурата си, за да изберете от опциите, за това ръководство ще използваме _„Terra“_
+- Мрежа: Конкретната мрежа, която този проект SubQuery ще бъде разработен за индексиране, използвайте клавишите със стрелки на клавиатурата си, за да изберете от опциите, за това ръководство ще използваме _„Terra“_
+- Template: Изберете шаблон за проекта SubQuery, който ще служи като начална точка за започване на разработка, предлагаме да изберете _"Starter project"_
- Git repository (опционално): посочете Git URL хранилище, в което ще се съхранява този проект SubQuery (при разполагане в SubQuery Explorer)
-- RPC endpoint (Необходимо): Укажете HTTPS URL за работеща крайна точка RPC която ще бъде използвана по подразбиране за този проект. Този вид нода RPC трябва да представлява архивна нода (да има състояние на пълна веригата). За това ръководство ще използваме стойността по подразбиране *"https://terra-columbus-5.beta.api.onfinality.io"*
+- RPC endpoint (Необходимо): Укажете HTTPS URL за работеща крайна точка RPC която ще бъде използвана по подразбиране за този проект. Този вид нода RPC трябва да представлява архивна нода (да има състояние на пълна веригата). За това ръководство ще използваме стойността по подразбиране _"https://terra-columbus-5.beta.api.onfinality.io"_
- Authors (задължително): въведете собственика на този проект за SubQuery тук (например вашето име!)
- Description (опционално): можете да предоставите кратко описание за вашия проект, който описва какви данни съдържа и какво могат да правят потребителите с него
- Version (Задължително): въведете свой персонализиран номер на версията или използвайте стойността по подразбиране (`1.0.0`)
@@ -59,8 +59,8 @@ subql init
И накрая, в директорията на проекта изпълнете следната команда, за да инсталирате зависимостите на новия проект.
- ```shell cd PROJECT_NAME yarn install ```
- ```shell cd PROJECT_NAME npm install ```
+::: code-tabs @tab:active yarn `shell cd PROJECT_NAME yarn install `
+@tab npm `shell cd PROJECT_NAME npm install ` :::
## Промени във вашият проект
@@ -91,8 +91,8 @@ type Transfer @entity {
**Важно: Когато правите промени във файла schema, моля, уверете се, че отново сте създали директорията си с типове със следната команда yarn codegen. Направете го сега.**
- ```shell yarn codegen ```
- ```shell npm run-script codegen ```
+::: code-tabs @tab:active yarn `shell yarn codegen `
+@tab npm `shell npm run-script codegen ` :::
Ще намерите генирираните модели в директорията `/src/types/models`. За повече информация относно файла `schema.graphql`, проверете нашата документация в раздела [Build/GraphQL Schema](../build/graphql.md)
@@ -143,30 +143,30 @@ import { MsgExecuteContract } from "@terra-money/terra.js";
export async function handleEvent(
  event: TerraEvent<MsgExecuteContract>
): Promise<void> {
- // Print debugging data from the event
- // logger.info(JSON.stringify(event));
-
- // Create the new transfer entity with a unique ID
- const transfer = new Transfer(
- `${event.tx.tx.txhash}-${event.msg.idx}-${event.idx}`
- );
- transfer.blockHeight = BigInt(event.block.block.block.header.height);
- transfer.txHash = event.tx.tx.txhash;
- for (const attr of event.event.attributes) {
- switch (attr.key) {
- case "sender":
- transfer.sender = attr.value;
- break;
- case "recipient":
- transfer.recipient = attr.value;
- break;
- case "amount":
- transfer.amount = attr.value;
- break;
- default:
- }
+ // Print debugging data from the event
+ // logger.info(JSON.stringify(event));
+
+ // Create the new transfer entity with a unique ID
+ const transfer = new Transfer(
+ `${event.tx.tx.txhash}-${event.msg.idx}-${event.idx}`
+ );
+ transfer.blockHeight = BigInt(event.block.block.block.header.height);
+ transfer.txHash = event.tx.tx.txhash;
+ for (const attr of event.event.attributes) {
+ switch (attr.key) {
+ case "sender":
+ transfer.sender = attr.value;
+ break;
+ case "recipient":
+ transfer.recipient = attr.value;
+ break;
+ case "amount":
+ transfer.amount = attr.value;
+ break;
+ default:
}
- await transfer.save();
+ }
+ await transfer.save();
}
```
@@ -178,7 +178,7 @@ export async function handleEvent(
За да стартираме вашия нов проект SubQuery, първо трябва да изградим нашата работа. Изпълнете командата за изграждане от основната директория на проекта.
- ```shell yarn build ``` ```shell npm run-script build ```
+::: code-tabs @tab:active yarn `shell yarn build ` @tab npm `shell npm run-script build ` :::
** Важно: Всеки път, когато правите промени във вашите mapping функции, ще трябва да изградите отново своя проект**
@@ -192,7 +192,7 @@ export async function handleEvent(
В директорията на проекта изпълнете следната команда:
- ```shell yarn start:docker ``` ```shell npm run-script start:docker ```
+::: code-tabs @tab:active yarn `shell yarn start:docker ` @tab npm `shell npm run-script start:docker ` :::
Изтеглянето на необходимите пакети може да отнеме известно време ([`@subql/node`](https://www.npmjs.com/package/@subql/node), [`@subql/query`](https://www.npmjs.com/package/@subql/query), and Postgres) за първи път, но скоро ще видите работеща нода SubQuery. Бъдете търпеливи.
@@ -207,10 +207,7 @@ export async function handleEvent(
```graphql
{
query {
- transfers(
- first: 10,
- orderBy: ID_DESC
- ) {
+ transfers(first: 10, orderBy: ID_DESC) {
nodes {
id
txHash
diff --git a/docs/bg/quickstart/quickstart.md b/docs/bg/quickstart/quickstart.md
index 0d41fc24502..6d44657b9f0 100644
--- a/docs/bg/quickstart/quickstart.md
+++ b/docs/bg/quickstart/quickstart.md
@@ -89,8 +89,8 @@ HelloWorld is ready
Накрая изпълнете следната команда, за да инсталирате зависимостите на новия проект от директорията на новия проект.
- ```shell cd PROJECT_NAME yarn install ```
- ```shell cd PROJECT_NAME npm install ```
+::: code-tabs @tab:active yarn `shell cd PROJECT_NAME yarn install `
+@tab npm `shell cd PROJECT_NAME npm install ` :::
Вече инициализирахте първия си проект на SubQuery само с няколко прости стъпки. Нека сега персонализираме стандартния шаблонен проект за конкретна блокова верига, която представлява интерес.
@@ -104,4 +104,4 @@ HelloWorld is ready
2. Манифест на проекта в `project.yaml`.
3. Mapping функциите в директорията `src/mappings/`.
-SubQuery supports various blockchain networks and provides a dedicated guide for each of them. Select your preferred blockchain under 2. Specific Chains and continue the quick start guide.
\ No newline at end of file
+SubQuery supports various blockchain networks and provides a dedicated guide for each of them. Select your preferred blockchain under 2. Specific Chains and continue the quick start guide.
diff --git a/docs/bg/run_publish/connect.md b/docs/bg/run_publish/connect.md
index 596373662f1..2908dcac435 100644
--- a/docs/bg/run_publish/connect.md
+++ b/docs/bg/run_publish/connect.md
@@ -2,10 +2,10 @@
Once your deployment has successfully completed and our nodes have indexed your data from the chain, you'll be able to connect to your project via the displayed Query endpoint.
-![Проектът се внедрява и се синхронизира](/assets/img/projects-deploy-sync.png)
+![Проектът се внедрява и се синхронизира](/assets/img/projects_deploy_sync.png)
Като алтернатива можете да щракнете върху трите точки до заглавието на вашия проект и да го видите в SubQuery Explorer. There you can use the in-browser playground to get started.
-![Projects in SubQuery Explorer](/assets/img/projects-explorer.png)
+![Projects in SubQuery Explorer](/assets/img/projects_explorer.png)
-::: info Note Learn more about the [GraphQL Query language.](./graphql.md) :::
+::: tip Note Learn more about the [GraphQL Query language.](./graphql.md) :::
diff --git a/docs/bg/run_publish/query.md b/docs/bg/run_publish/query.md
index 087a40ae2c4..f9cc64cd3b2 100644
--- a/docs/bg/run_publish/query.md
+++ b/docs/bg/run_publish/query.md
@@ -12,4 +12,4 @@ SubQuery explorer ще ви помогне да започнете работа.
On the top right of the playground, you'll find a _Docs_ button that will open a documentation drawer. Тази документация се генерира автоматично и ви помага да намерите за какви обекти и методи можете да направите заявка.
-::: info Note Learn more about the [GraphQL Query language.](./graphql.md) :::
+::: tip Note Learn more about the [GraphQL Query language.](./graphql.md) :::
diff --git a/docs/bg/run_publish/references.md b/docs/bg/run_publish/references.md
index 5edcfadc2f9..b6cdd430f70 100644
--- a/docs/bg/run_publish/references.md
+++ b/docs/bg/run_publish/references.md
@@ -21,11 +21,11 @@
Тази команда използва webpack за генериране на пакет от проект на subquery.
-| Опции | Описание |
-| ------------------ | ---------------------------------------------------------------------------------------------------------- |
-| -l, --location | локална папка на проекта subquery (ако все още не сте в папка) |
-| -o, --output | посочете изходната папка на build например build-folder |
-| --mode=(production | prod | development | dev) | [ default: production ] |
+| Опции | Описание |
+| ------------------ | -------------------------------------------------------------- | ----------- | ---- | ----------------------- |
+| -l, --location | локална папка на проекта subquery (ако все още не сте в папка) |
+| -o, --output | посочете изходната папка на build например build-folder |
+| --mode=(production | prod | development | dev) | [ default: production ] |
- With `subql build` you can specify additional entry points in exports field although it will always build `index.ts` automatically.
@@ -106,7 +106,7 @@ This displays the current version.
### reindex
-:::warning In order to use this command, you require `@subql/node:v1.10.0`/`@subql/node-:v1.10.0` or above. :::
+:::warning In order to use this command, you require `@subql/node:v1.10.0`/`@subql/node-YOURNETWORK:v1.10.0` or above. :::
When using the reindex command, historical must be enabled for the targeted project (`--disable-historical=false`). After starting the project, it will print out a log stating whether historical is enabled or not.
@@ -122,7 +122,7 @@ If the `targetHeight` is less than the declared starting height, it will execute
subql-node -f /example/subql-project reindex --targetHeight=30
```
-::: info Note
+::: tip Note
Once the command is executed and the state has been rolled back to the specified height, the application will exit. You can then start up the indexer to proceed again from this height.
:::
@@ -134,7 +134,7 @@ This command forces the project schemas and tables to be regenerated. It is help
`-f`, `--subquery` flag must be passed in, to set path of the targeted project.
-::: info Note Similar to `reindex` command, the application would exit upon completion. :::
+::: tip Note Similar to `reindex` command, the application would exit upon completion. :::
```shell
subql-node -f /example/subql-project force-clean
@@ -338,8 +338,6 @@ The port the subquery indexing service binds to. By default this is set to `3000
Disables automated historical state tracking, [see Historic State Tracking](./historical.md). By default this is set to `false`.
-
-
### -w, --workers
This will move block fetching and processing into a worker. By default, this feature is **disabled**. You can enable it with the `--workers=<number>` flag. Note that the number of available CPU cores strictly limits the usage of worker threads. So, when using the `--workers=<number>` flag, always specify the number of workers. With no flag provided, everything will run in the same thread.
@@ -348,7 +346,7 @@ This will move block fetching and processing into a worker. By default, this fea
It is at an early experimental stage at the moment, but we plan to enable it by default. :::
-::: info Note
+::: tip Note
This feature is available for Substrate and Cosmos, and soon will be integrated for Avalanche.
:::
diff --git a/docs/bg/run_publish/run.md b/docs/bg/run_publish/run.md
index 2d7d6e6c486..64a160878f5 100644
--- a/docs/bg/run_publish/run.md
+++ b/docs/bg/run_publish/run.md
@@ -4,7 +4,7 @@ This guide works through how to run a local SubQuery node on your infrastructure
## Using Docker
-An alternative solution is to run a Docker Container, defined by the `docker-compose.yml` file. For a new project that has been just initialised you won't need to change anything here.
+An alternative solution is to run a **Docker Container**, defined by the `docker-compose.yml` file. For a new project that has just been initialised, you won't need to change anything here.
Under the project directory run the following command:
@@ -12,7 +12,7 @@ Under the project directory run the following command:
docker-compose pull && docker-compose up
```
-::: info Note It may take some time to download the required packages ([`@subql/node`](https://www.npmjs.com/package/@subql/node), [`@subql/query`](https://www.npmjs.com/package/@subql/query), and Postgres) for the first time but soon you'll see a running SubQuery node. :::
+::: tip Note It may take some time to download the required packages ([`@subql/node`](https://www.npmjs.com/package/@subql/node), [`@subql/query`](https://www.npmjs.com/package/@subql/query), and Postgres) for the first time but soon you'll see a running SubQuery node. :::
## Running an Indexer (subql/node)
@@ -32,90 +32,80 @@ CREATE EXTENSION IF NOT EXISTS btree_gist;
### Installation
-
-
+::: code-tabs
+@tab Substrate/Polkadot
```shell
# NPM
npm install -g @subql/node
```
-
-
+@tab Terra
```shell
# NPM
npm install -g @subql/node-terra
```
-
-
+@tab Avalanche
```shell
# NPM
npm install -g @subql/node-avalanche
```
-
-
+@tab Cosmos
```shell
# NPM
npm install -g @subql/node-cosmos
```
-
-
+@tab Algorand
```shell
# NPM
npm install -g @subql/node-algorand
```
-
-
+:::
::: danger Please note that we **DO NOT** encourage the use of `yarn global` due to its poor dependency management which may lead to errors down the line. :::
Once installed, you can start a node with the following command:
-
-
+::: code-tabs
+@tab Substrate/Polkadot
```shell
subql-node
```
-
-
+@tab Terra
```shell
subql-node-terra
```
-
-
+@tab Avalanche
```shell
subql-node-avalanche
```
-
-
+@tab Cosmos
```shell
subql-node-cosmos
```
-
-
+@tab Algorand
```shell
subql-node-algorand
```
-
-
+:::
### Key Commands
@@ -123,43 +113,38 @@ The following commands will assist you to complete the configuration of a SubQue
#### Point to local project path
-
-
+::: code-tabs
+@tab Substrate/Polkadot
```shell
subql-node -f your-project-path
```
-
-
+@tab Terra
```shell
subql-node-terra -f your-project-path
```
-
-
+@tab Avalanche
```shell
subql-node-avalanche -f your-project-path
```
-
-
+@tab Cosmos
```shell
subql-node-cosmos -f your-project-path
```
-
-
+@tab Algorand
```shell
subql-node-algorand -f your-project-path
```
-
-
+:::
#### Connect to database
@@ -176,43 +161,38 @@ Depending on the configuration of your Postgres database (e.g. a different datab
#### Specify a configuration file
-
-
+::: code-tabs
+@tab Substrate/Polkadot
```shell
subql-node -c your-project-config.yml
```
-
-
+@tab Terra
```shell
subql-node-terra -c your-project-config.yml
```
-
-
+@tab Avalanche
```shell
subql-node-avalanche -c your-project-config.yml
```
-
-
+@tab Cosmos
```shell
subql-node-cosmos -c your-project-config.yml
```
-
-
+@tab Algorand
```shell
subql-node-algorand -c your-project-config.yml
```
-
-
+:::
This will point the query node to a manifest file which can be in YAML or JSON format.
@@ -230,43 +210,38 @@ When the indexer first indexes the chain, fetching single blocks will significan
#### Run in local mode
-
-
+::: code-tabs
+@tab Substrate/Polkadot
```shell
subql-node -f your-project-path --local
```
-
-
+@tab Terra
```shell
subql-node-terra -f your-project-path --local
```
-
-
+@tab Avalanche
```shell
subql-node-avalanche -f your-project-path --local
```
-
-
+@tab Cosmos
```shell
subql-node-cosmos -f your-project-path --local
```
-
-
+@tab Algorand
```shell
subql-node-algorand -f your-project-path --local
```
-
-
+:::
For debugging purposes, users can run the node in local mode. Switching to local mode will create Postgres tables in the default schema `public`.
diff --git a/docs/bg/run_publish/subscription.md b/docs/bg/run_publish/subscription.md
index 646a7a1627d..faba1ec79a8 100644
--- a/docs/bg/run_publish/subscription.md
+++ b/docs/bg/run_publish/subscription.md
@@ -6,7 +6,7 @@ SubQuery вече поддържа и Graphql абонаменти. Подобн
Абонаментите са много полезни, когато искате вашето клиентско приложение да промени данни или да покаже някои нови данни веднага щом тази промяна настъпи или новите данни са налични. Subscriptions allow you to _subscribe_ to your SubQuery project for changes.
-::: info Note Read more about [Subscriptions](https://www.apollographql.com/docs/react/data/subscriptions/). :::
+::: tip Note Read more about [Subscriptions](https://www.apollographql.com/docs/react/data/subscriptions/). :::
## Как да се абонирам за обект
diff --git a/docs/bg/run_publish/upgrade.md b/docs/bg/run_publish/upgrade.md
index cb699b1c73a..e3b5b29ab55 100644
--- a/docs/bg/run_publish/upgrade.md
+++ b/docs/bg/run_publish/upgrade.md
@@ -77,10 +77,10 @@ If you just want to upgrade to the latest indexer ([`@subql/node`](https://www.n
След като внедряването ви приключи успешно и нашите нодове са индексирали вашите данни от веригата, ще можете да се свържете с вашия проект чрез показания ендпойнт на GraphQL Query.
-![Проектът се внедрява и се синхронизира](/assets/img/projects-deploy-sync.png)
+![Проектът се внедрява и се синхронизира](/assets/img/projects_deploy_sync.png)
Като алтернатива можете да щракнете върху трите точки до заглавието на вашия проект и да го видите в SubQuery Explorer. There you can use the in-browser playground to get started - [read more about how to use our Explorer here](../run_publish/query.md).
-![Projects in SubQuery Explorer](/assets/img/projects-explorer.png)
+![Projects in SubQuery Explorer](/assets/img/projects_explorer.png)
-::: info Note Learn more about the [GraphQL Query language.](./graphql.md) :::
+::: tip Note Learn more about the [GraphQL Query language.](./graphql.md) :::
diff --git a/docs/bg/subquery_network/introduction.md b/docs/bg/subquery_network/introduction.md
index 295c7a86735..b03b437906c 100644
--- a/docs/bg/subquery_network/introduction.md
+++ b/docs/bg/subquery_network/introduction.md
@@ -18,22 +18,22 @@ There’s a role for everyone in the network, from highly technical developers t
Consumers will ask the SubQuery Network for specific data for their dApps or tools, and pay an advertised amount of SQT for each request.
-::: info Note Learn more about [Consumers](./consumers.md). :::
+::: tip Note Learn more about [Consumers](./consumers.md). :::
### Indexers
Indexers will run and maintain high quality SubQuery projects in their own infrastructure, running both the indexer and query service, and will be rewarded in SQT for the requests that they serve.
-::: info Note Learn more about [Indexers](./indexers.md). :::
+::: tip Note Learn more about [Indexers](./indexers.md). :::
### Делегатори
Delegators will participate in the Network by supporting their favourite Indexers to earn rewards based on the work those indexers do.
-::: info Note Learn more about [Delegators](./delegators.md). :::
+::: tip Note Learn more about [Delegators](./delegators.md). :::
### Architects
Architects are the builders of the SubQuery projects that the Network runs on. They author and publish SubQuery projects for the Network to index and run.
-::: info Note Learn more about [how to build your first SubQuery project](../build/introduction.md). :::
+::: tip Note Learn more about [how to build your first SubQuery project](../build/introduction.md). :::
diff --git a/docs/build/cosmos-evm.md b/docs/build/cosmos-evm.md
index 068ef043cc6..a78ca5f208f 100644
--- a/docs/build/cosmos-evm.md
+++ b/docs/build/cosmos-evm.md
@@ -2,7 +2,7 @@
We provide a custom data source processor for [Cosmos's Ethermint EVM](https://github.com/cosmos/ethermint). This offers a simple way to filter and index both EVM and Cosmos activity on many Cosmos networks within a single SubQuery project.
-::: info Note
+::: tip Note
Ethermint chains (e.g. Cronos) are usually fully EVM compatible, which means that you can use two options for indexing Ethermint data. You can index Ethermint contract data via the standard Cosmos RPC interface, or via Ethereum APIs. For Cronos, we provide a [starter project for each](https://github.com/subquery/cosmos-subql-starter/tree/main/Cronos) and you can compare the two different options in the [Cronos quick start guide](../quickstart/quickstart_chains/cosmos-cronos.md).
This document goes into detail about how to use the Ethermint Cosmos RPCs (rather than the Ethereum API)
diff --git a/docs/build/graph-migration.md b/docs/build/graph-migration.md
index 928630dfcb8..0847a6c5e1e 100644
--- a/docs/build/graph-migration.md
+++ b/docs/build/graph-migration.md
@@ -59,7 +59,7 @@ The manifest file contains the largest set of differences, but once you understa
![Difference between a SubGraph and a SubQuery project](/assets/img/subgraph-manifest-3.png)
-:::: code-group
+::: code-group
::: code-group-item SubGraph
@@ -166,7 +166,7 @@ dataSources:
:::
-::::
+:::
## Mapping
@@ -178,7 +178,7 @@ The functions are defined the same way. Moreover, entities can be instantiated,
![Difference between a SubGraph and a SubQuery project](/assets/img/subgraph-mapping.png)
-:::: code-group
+::: code-group
::: code-group-item SubGraph
@@ -227,7 +227,7 @@ export async function handleUnlockAttackNFTs(
:::
-::::
+:::
## Querying Contracts
diff --git a/docs/build/graphql.md b/docs/build/graphql.md
index 62c1a213027..b0f481b38d0 100644
--- a/docs/build/graphql.md
+++ b/docs/build/graphql.md
@@ -8,23 +8,20 @@ The `schema.graphql` file defines the various GraphQL schemas. Due to the way th
When you make any changes to the schema file, don't forget to regenerate your types directory.
:::
-
-
+::: code-tabs
+@tab:active yarn
```shell
yarn codegen
```
-
-
-
+@tab npm
```shell
npm run-script codegen
```
-
-
+:::
### Entities
diff --git a/docs/build/install.md b/docs/build/install.md
index e4fa0398e0f..8d1f85547c2 100644
--- a/docs/build/install.md
+++ b/docs/build/install.md
@@ -8,23 +8,20 @@ The [@subql/cli](https://github.com/subquery/subql/tree/main/packages/cli) tool
Install SubQuery CLI globally on your terminal by using Yarn or NPM:
-
-
+::: code-tabs
+@tab npm
```bash
npm install -g @subql/cli
```
-
-
-
+@tab:active yarn
```shell
yarn global add @subql/cli
```
-
-
+:::
You can then run help to see the available commands and usage provided by the CLI:
@@ -38,23 +35,20 @@ A SubQuery node is an implementation that extracts substrate-based blockchain da
Install SubQuery node globally on your terminal by using Yarn or NPM:
-
-
+::: code-tabs
+@tab npm
```bash
npm install -g @subql/node
```
-
-
-
+@tab:active yarn
```shell
yarn global add @subql/node
```
-
-
+:::
Once installed, you can start a node with:
@@ -62,7 +56,7 @@ Once installed, you can can start a node with:
subql-node
```
-::: info Note
+::: tip Note
If you are using Docker or hosting your project in SubQuery Projects, you can skip this step. This is because the SubQuery node is already provided in the Docker container and the hosting infrastructure.
:::
@@ -72,24 +66,21 @@ The SubQuery query library provides a service that allows you to query your proj
Install SubQuery query globally on your terminal by using Yarn or NPM:
-
-
+::: code-tabs
+@tab npm
```bash
npm install -g @subql/query
```
-
-
-
+@tab:active yarn
```shell
yarn global add @subql/query
```
-
-
+:::
-::: info Note
+::: tip Note
If you are using Docker or hosting your project in SubQuery Projects, you can skip this step also. This is because the SubQuery node is already provided in the Docker container and the hosting infrastructure.
:::
diff --git a/docs/build/introduction.md b/docs/build/introduction.md
index a9d469501ce..3e379068b8b 100644
--- a/docs/build/introduction.md
+++ b/docs/build/introduction.md
@@ -30,7 +30,7 @@ The following map provides an overview of the directory structure of a SubQuery
L README.md
L schema.graphql
L tsconfig.json
-
+
```
For example:
@@ -53,23 +53,20 @@ In order to run your SubQuery Project on a locally hosted SubQuery Node, you nee
Run the build command from the project's root directory.
-
-
-
- ```shell
- yarn build
- ```
+::: code-tabs
+@tab:active yarn
-
+```shell
+yarn build
+```
-
+@tab npm
- ```bash
- npm run-script build
- ```
+```bash
+npm run-script build
+```
-
-
+:::
### Alternative build options
diff --git a/docs/build/manifest/polkadot.md b/docs/build/manifest/polkadot.md
index c78f523945f..9a1421042a0 100644
--- a/docs/build/manifest/polkadot.md
+++ b/docs/build/manifest/polkadot.md
@@ -234,7 +234,7 @@ filter:
timestamp: "*/5 * * * *"
```
-::: info Note
+::: tip Note
We use the [cron-converter](https://github.com/roccivic/cron-converter) package to generate unix timestamps for iterations out of the given cron expression. So, make sure the format of the cron expression given in the `timestamp` filter is compatible with the package.
:::
diff --git a/docs/build/mapping/polkadot.md b/docs/build/mapping/polkadot.md
index b18fe2f2819..b8804a9850a 100644
--- a/docs/build/mapping/polkadot.md
+++ b/docs/build/mapping/polkadot.md
@@ -51,7 +51,7 @@ export async function handleEvent(event: SubstrateEvent): Promise {
A `SubstrateEvent` is an extended interface type of the [EventRecord](https://github.com/polkadot-js/api/blob/f0ce53f5a5e1e5a77cc01bf7f9ddb7fcf8546d11/packages/types/src/interfaces/system/types.ts#L149). Besides the event data, it also includes an `id` (the block to which this event belongs) and the extrinsic inside of this block.
-::: info Note
+::: tip Note
From `@subql/types` version `X.X.X` onwards `SubstrateEvent` is now generic. This can provide you with higher type safety when developing your project.
```ts
@@ -77,7 +77,7 @@ export async function handleCall(extrinsic: SubstrateExtrinsic): Promise {
The [SubstrateExtrinsic](https://github.com/OnFinality-io/subql/blob/a5ab06526dcffe5912206973583669c7f5b9fdc9/packages/types/src/interfaces.ts#L21) extends [GenericExtrinsic](https://github.com/polkadot-js/api/blob/a9c9fb5769dec7ada8612d6068cf69de04aa15ed/packages/types/src/extrinsic/Extrinsic.ts#L170). It is assigned an `id` (the block to which this extrinsic belongs) and provides an extrinsic property that extends the events among this block. Additionally, it records the success status of this extrinsic.
-::: info Note
+::: tip Note
From `@subql/types` version `X.X.X` onwards `SubstrateExtrinsic` is now generic. This can provide you with higher type safety when developing your project.
```ts
@@ -139,9 +139,9 @@ Our goal is to cover all data sources for users for mapping handlers (more than
These are the interfaces we currently support:
-- [api.query.<module>.<method>()](https://polkadot.js.org/docs/api/start/api.query) will query the current block.
-- [api.query.<module>.<method>.multi()](https://polkadot.js.org/docs/api/start/api.query.multi/#multi-queries-same-type) will make multiple queries of the same type at the current block.
-- [api.queryMulti()](https://polkadot.js.org/docs/api/start/api.query.multi/#multi-queries-distinct-types) will make multiple queries of different types at the current block.
+- [api.query.<module>.<method>()](https://polkadot.js.org/docs/api/start/api.query) will query the **current** block.
+- [api.query.<module>.<method>.multi()](https://polkadot.js.org/docs/api/start/api.query.multi/#multi-queries-same-type) will make multiple queries of the **same** type at the current block.
+- [api.queryMulti()](https://polkadot.js.org/docs/api/start/api.query.multi/#multi-queries-distinct-types) will make multiple queries of **different** types at the current block.
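
As an illustration of the supported calls above, here is a minimal, hypothetical sketch of a mapping handler that reads current-block state through the injected polkadot.js `api` object (alongside the injected `logger`); the `system.account` storage item and the handler name are illustrative assumptions, not part of any starter project.

```ts
import { SubstrateEvent } from "@subql/types";

export async function handleBalancesTransfer(event: SubstrateEvent): Promise<void> {
  const [from, to] = event.event.data;

  // api.query.<module>.<method>() — a single storage read at the current block
  const sender = await api.query.system.account(from.toString());
  logger.info(`sender free balance: ${sender.data.free.toString()}`);

  // api.query.<module>.<method>.multi() — several reads of the same type at the current block
  const [a, b] = await api.query.system.account.multi([
    from.toString(),
    to.toString(),
  ]);
  logger.info(`free balances: ${a.data.free.toString()}, ${b.data.free.toString()}`);
}
```
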
These are the interfaces we do **NOT** support currently:
diff --git a/docs/build/mapping/store.md b/docs/build/mapping/store.md
index b128b6d492d..7f1cc31dd0f 100644
--- a/docs/build/mapping/store.md
+++ b/docs/build/mapping/store.md
@@ -2,7 +2,7 @@
The SubQuery store is an injected class that allows users to interact with records in the database from within mapping functions. This comes in handy when you need to use multiple entity records as parameters in a mapping function, or to create/update multiple records in a single place.
-::: info Note
+::: tip Note
Note that there are additional methods autogenerated with your entities that also interact with the store. Most users will find those methods sufficient for their projects.
:::
@@ -15,7 +15,7 @@ export interface Store {
entity: string,
field: string,
value: any,
- options?: { limit?: number, offset?: number }
+ options?: { limit?: number; offset?: number }
  ): Promise<Entity[]>;
getOneByField(
entity: string,
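
To round out the excerpt above, here is a hedged sketch of calling `store.getByField` from inside a mapping function; the `Transfer` entity, its `from` field, and the handler name are assumptions used purely for illustration.

```ts
import { SubstrateEvent } from "@subql/types";

export async function handleTransferLookup(event: SubstrateEvent): Promise<void> {
  const from = event.event.data[0].toString();

  // Fetch previously indexed Transfer records sent by this address,
  // paging through the store 100 records at a time
  const earlier = await store.getByField("Transfer", "from", from, {
    limit: 100,
    offset: 0,
  });

  logger.info(`${from} already has ${earlier.length} indexed transfers`);
}
```
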
diff --git a/docs/build/multi-chain.md b/docs/build/multi-chain.md
index ab4a44c9c67..7a0067a12ce 100644
--- a/docs/build/multi-chain.md
+++ b/docs/build/multi-chain.md
@@ -6,11 +6,11 @@ You can use the same SubQuery project, which includes the same GraphQL schema an
For example, you could capture XCM transaction data from all Polkadot parachains or monitor IBC messages across Cosmos Zones in a single project, with a single database, and a single query endpoint.
-![Multi-chain](/assets/img/multi-chain.jpg)
+![Multi-chain](/assets/img/multi_chain.jpg)
## How it Works
-::: info Requirements for multi-chain indexing
+::: tip Requirements for multi-chain indexing
1. All projects must reference the same [GraphQL schema](./graphql.md) in their `project.yaml`
2. All projects must index to the same PostgreSQL table schema, this is set in your `docker-compose.yml`
diff --git a/docs/build/substrate-evm.md b/docs/build/substrate-evm.md
index ddcd69c7b48..c96fbca65d1 100644
--- a/docs/build/substrate-evm.md
+++ b/docs/build/substrate-evm.md
@@ -129,8 +129,8 @@ Changes from the `Log` type:
- `args` is added if the `abi` field is provided and the arguments can be successfully parsed. You can add a generic parameter like so to type `args`: `FrontierEvmEvent<{ from: string, to: string, value: BigNumber }>`.
-
-
+::: code-tabs
+@tab Frontier EVM
```ts
import { Approval, Transaction } from "../types";
@@ -178,8 +178,7 @@ export async function handleFrontierEvmCall(
}
```
-
-
+@tab Acala EVM+
```ts
import { Approval, Transaction } from "../types";
@@ -224,15 +223,14 @@ export async function handleAcalaEvmCall(
}
```
-
-
+:::
## Data Source Example
This is an extract from the `project.yaml` manifest file.
-
-
+::: code-tabs
+@tab Frontier EVM
```yaml
dataSources:
@@ -269,8 +267,7 @@ dataSources:
function: "approve(address to,uint256 value)"
```
-
-
+@tab Acala EVM+
```yaml
dataSources:
@@ -305,8 +302,7 @@ dataSources:
from: "0x6bd193ee6d2104f14f94e2ca6efefae561a4334b"
```
-
-
+:::
## Querying contracts
diff --git a/docs/de/README.md b/docs/de/README.md
index 77ad45ea5d1..b0e3816cff0 100644
--- a/docs/de/README.md
+++ b/docs/de/README.md
@@ -4,7 +4,7 @@
Build Faster dApps with SubQuery Academy
-
Explore and implement your own efficient custom open-source API between your decentralised data and tools to query data faster and save you time.
+
Explore and implement your own efficient custom open-source API between your decentralised data and tools to query data faster and save you time.
SubQuery now supports Polkadot, Avalanche, Cosmos, and Algorand.
@@ -12,7 +12,7 @@
Get a Kick-Start With Our Quick Start Guide
-
Build your first SubQuery project in less than 10 mins with simple guided steps.
+
Build your first SubQuery project in less than 10 mins with simple guided steps.
Start querying data for your dApps on your most loved blockchain network using our starter projects. Explore and modify important files, and understand how SubQuery works.
@@ -134,8 +134,7 @@
-
-
+
diff --git a/docs/de/build/install.md b/docs/de/build/install.md
index b13b5818870..5ea993606b8 100644
--- a/docs/de/build/install.md
+++ b/docs/de/build/install.md
@@ -8,28 +8,30 @@ Das Tool [@subql/cli](https://github.com/subquery/subql/tree/docs-new-section/pa
Installieren Sie SubQuery CLI global auf Ihrem Terminal, indem Sie Yarn oder NPM verwenden:
- ```bash npm install -g @subql/cli ```
- ```shell yarn global add @subql/cli ```
+::: code-tabs @tab npm `bash npm install -g @subql/cli `
+@tab:active yarn `shell yarn global add @subql/cli ` :::
Sie können dann help ausführen, um die verfügbaren Befehle und die Nutzung anzuzeigen, die von der CLI bereitgestellt werden:
```shell
subql help
```
+
## Installieren Sie bitte @subql/node
Eine SubQuery-Node ist eine Implementierung, die substratbasierte Blockchain-Daten pro SubQuery-Projekt extrahiert und in einer Postgres-Datenbank speichert.
Installieren Sie die SubQuery-Node global auf Ihrem Terminal, indem Sie Yarn oder NPM verwenden:
- ```bash npm install -g @subql/node ```
- ```shell yarn global add @subql/node ```
+::: code-tabs @tab npm `bash npm install -g @subql/node `
+@tab:active yarn `shell yarn global add @subql/node ` :::
Nach der Installation können Sie eine Node starten mit:
```shell
subql-node
```
+
> Hinweis: Wenn Sie Docker verwenden oder Ihr Projekt in SubQuery-Projekten hosten, können Sie diesen Schritt überspringen. Dies liegt daran, dass die SubQuery-Node bereits im Docker-Container und der Hosting-Infrastruktur bereitgestellt wird.
## Installieren Sie bitte @subql/query
@@ -38,7 +40,7 @@ Die SubQuery-Abfragebibliothek stellt einen Dienst bereit, mit dem Sie Ihr Proje
Installieren Sie die SubQuery-Abfrage global auf Ihrem Terminal, indem Sie Yarn oder NPM verwenden:
- ```bash npm install -g @subql/query ```
- ```shell yarn global add @subql/query ```
+::: code-tabs @tab npm `bash npm install -g @subql/query `
+@tab:active yarn `shell yarn global add @subql/query ` :::
-> Hinweis: Wenn Sie Docker verwenden oder Ihr Projekt in SubQuery-Projekten hosten, können Sie diesen Schritt auch überspringen. Dies liegt daran, dass die SubQuery-Node bereits im Docker-Container und der Hosting-Infrastruktur bereitgestellt wird.
\ No newline at end of file
+> Hinweis: Wenn Sie Docker verwenden oder Ihr Projekt in SubQuery-Projekten hosten, können Sie diesen Schritt auch überspringen. Dies liegt daran, dass die SubQuery-Node bereits im Docker-Container und der Hosting-Infrastruktur bereitgestellt wird.
diff --git a/docs/de/build/introduction.md b/docs/de/build/introduction.md
index a4fa6072867..ec1d724eb2f 100644
--- a/docs/de/build/introduction.md
+++ b/docs/de/build/introduction.md
@@ -51,8 +51,8 @@ Um Ihr SubQuery-Projekt auf einer lokal gehosteten SubQuery-Node auszuführen, m
Führen Sie den Build-Befehl im Stammverzeichnis des Projekts aus.
- `shell yarn build `
- `bash npm run-script build `
+::: code-tabs @tab:active yarn `shell yarn build `
+@tab npm `bash npm run-script build ` :::
### Alternative Build-Optionen
diff --git a/docs/de/build/manifest.md b/docs/de/build/manifest.md
index c45225e2852..49f1d18da22 100644
--- a/docs/de/build/manifest.md
+++ b/docs/de/build/manifest.md
@@ -4,7 +4,7 @@ Die Manifestdatei `project.yaml` kann als Einstiegspunkt Ihres Projekts angesehe
Das Manifest kann entweder im YAML- oder im JSON-Format vorliegen. In diesem Dokument verwenden wir YAML in allen Beispielen. Unten sehen Sie ein Standardbeispiel einer einfachen `project.yaml`.
- ` yml specVersion: 0.2.0 name: example-project # Geben Sie den Projektnamen an Version: 1.0.0 # Projektversion description: '' # Beschreibung Ihres Projekts repository: 'https://github.com/subquery/subql-starter' # Git-Repository-Adresse Ihres Projekts Schema: file: ./schema.graphql # Der Speicherort Ihrer GraphQL-Schemadatei Netzwerk: genesisHash: '0x91b171bb158e2d3848fa23a9f1c25182fb8e20313b2c1eb49219da7a70ce90c3' # Genesis-Hash des Netzwerks endpoint: 'wss://polkadot.api.onfinality.io/public-ws' # Geben Sie optional den HTTP-Endpunkt eines vollständigen Kettenwörterbuchs an, um die Verarbeitung zu beschleunigen dictionary: 'https://api.subquery.network/sq/subquery/dictionary-polkadot' dataSources: - kind: substrate/Runtime startBlock: 1 # Dies ändert Ihren Startblock für die Indizierung, stellen Sie diesen höher ein, um Anfangsblöcke mit weniger Daten zu überspringen mapping: file: "./dist/index.js" handlers: - handler: handleBlock kind: substrate/BlockHandler - handler: handleEvent kind: substrate/EventHandler filter: #Filter is optional module: balances method: Deposit - handler: handleCall kind: substrate/CallHandler ```` ` yml specVersion: "0.0.1" description: '' # Beschreibung Ihres Projekts repository: 'https://github.com/subquery/subql-starter' # Git-Repository-Adresse Ihres Projekts schema: ./schema.graphql # Der Speicherort Ihrer GraphQL-Schemadatei Netzwerk: endpoint: 'wss://polkadot.api.onfinality.io/public-ws' # Geben Sie optional den HTTP-Endpunkt eines vollständigen Kettenwörterbuchs an, um die Verarbeitung zu beschleunigen dictionary: 'https://api.subquery.network/sq/subquery/dictionary-polkadot' dataSources: - name: main kind: substrate/Runtime startBlock: 1 # Dies ändert Ihren Startblock für die Indizierung, stellen Sie diesen höher ein, um Anfangsblöcke mit weniger Daten zu überspringen mapping: handlers: - handler: handleBlock kind: substrate/BlockHandler - handler: handleEvent kind: substrate/EventHandler filter: #Filter ist optional, wird aber empfohlen, um die Ereignisverarbeitung zu beschleunigen module: balances method: Deposit - handler: handleCall kind: substrate/CallHandler ````
+::: code-tabs @tab v0.2.0 ` yml specVersion: 0.2.0 name: example-project # Geben Sie den Projektnamen an Version: 1.0.0 # Projektversion description: '' # Beschreibung Ihres Projekts repository: 'https://github.com/subquery/subql-starter' # Git-Repository-Adresse Ihres Projekts Schema: file: ./schema.graphql # Der Speicherort Ihrer GraphQL-Schemadatei Netzwerk: genesisHash: '0x91b171bb158e2d3848fa23a9f1c25182fb8e20313b2c1eb49219da7a70ce90c3' # Genesis-Hash des Netzwerks endpoint: 'wss://polkadot.api.onfinality.io/public-ws' # Geben Sie optional den HTTP-Endpunkt eines vollständigen Kettenwörterbuchs an, um die Verarbeitung zu beschleunigen dictionary: 'https://api.subquery.network/sq/subquery/dictionary-polkadot' dataSources: - kind: substrate/Runtime startBlock: 1 # Dies ändert Ihren Startblock für die Indizierung, stellen Sie diesen höher ein, um Anfangsblöcke mit weniger Daten zu überspringen mapping: file: "./dist/index.js" handlers: - handler: handleBlock kind: substrate/BlockHandler - handler: handleEvent kind: substrate/EventHandler filter: #Filter is optional module: balances method: Deposit - handler: handleCall kind: substrate/CallHandler ```` @tab v0.0.1 ` yml specVersion: "0.0.1" description: '' # Beschreibung Ihres Projekts repository: 'https://github.com/subquery/subql-starter' # Git-Repository-Adresse Ihres Projekts schema: ./schema.graphql # Der Speicherort Ihrer GraphQL-Schemadatei Netzwerk: endpoint: 'wss://polkadot.api.onfinality.io/public-ws' # Geben Sie optional den HTTP-Endpunkt eines vollständigen Kettenwörterbuchs an, um die Verarbeitung zu beschleunigen dictionary: 'https://api.subquery.network/sq/subquery/dictionary-polkadot' dataSources: - name: main kind: substrate/Runtime startBlock: 1 # Dies ändert Ihren Startblock für die Indizierung, stellen Sie diesen höher ein, um Anfangsblöcke mit weniger Daten zu überspringen mapping: handlers: - handler: handleBlock kind: substrate/BlockHandler - handler: handleEvent kind: substrate/EventHandler filter: #Filter ist optional, wird aber empfohlen, um die Ereignisverarbeitung zu beschleunigen module: balances method: Deposit - handler: handleCall kind: substrate/CallHandler ```` :::
## Migration von v0.0.1 auf v0.2.0
@@ -81,9 +81,9 @@ Definiert die Daten, die gefiltert und extrahiert werden, und den Speicherort de
### Mapping Spec
-| Bereich | v0.0.1 | v0.2.0 | Beschreibung |
-| -------------------- | ----------------------------------------------------------------------- | ------------------------------------------------------------------------------------------------ | ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
-| **Datei** | String | 𐄂 | Pfad zum Mapping-Eintrag |
+| Bereich | v0.0.1 | v0.2.0 | Beschreibung |
+| -------------------- | ----------------------------------------------------------------------- | ------------------------------------------------------------------------------------------------ | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
+| **Datei** | String | 𐄂 | Pfad zum Mapping-Eintrag |
| **handler & Filter** | [Standardhandler und -filter](./manifest/#mapping-handlers-and-filters) | Standardhandler und -filter, [Benutzerdefinierte Handler und Filter](#custom-data-sources) | Listen Sie alle [Zuordnungsfunktionen](./mapping/polkadot.md) und ihre entsprechenden Handlertypen mit zusätzlichen Zuordnungsfiltern auf.
Informationen zu benutzerdefinierten Laufzeit-Zuordnungshandlern finden Sie unter [Benutzerdefinierte Datenquellen](#custom-data-sources) |
## Data Sources und Mapping
@@ -104,8 +104,8 @@ In der folgenden Tabelle werden Filter erläutert, die von verschiedenen Handler
**Ihr SubQuery-Projekt wird viel effizienter, wenn Sie nur Ereignis- und Call-handler mit geeigneten Zuordnungsfiltern verwenden**
-| Handler | Unterstützte Filter: |
-| ------------------------------------------ | --------------------------- |
+| Handler | Unterstützte Filter: |
+| --------------------------------------------------- | --------------------------- |
| [Blockhandler](./mapping/polkadot.md#block-handler) | `specVersion` |
| [EventHandler](./mapping/polkadot.md#event-handler) | `module`,`method` |
| [CallHandler](./mapping/polkadot.md#call-handler) | `Modul`,`Methode` ,`Erfolg` |
@@ -151,12 +151,12 @@ Sie können Daten aus benutzerdefinierten Chains indizieren, indem Sie auch Chai
Wir unterstützen die zusätzlichen Typen, die von Substrat-Laufzeitmodulen verwendet werden, `typesAlias`, `typesBundle`, `typesChain` und `typesSpec` werden ebenfalls unterstützt .
-Im folgenden v0.2.0-Beispiel verweisen die `network.chaintypes` auf eine Datei, die alle benutzerdefinierten Typen enthält. Dies ist eine standardmäßige Chainspec-Datei, die die von dieser Blockchain unterstützten spezifischen Typen entweder in < 0>.json-, `.yaml`- oder `.js`-Format.
+Im folgenden v0.2.0-Beispiel verweisen die `network.chaintypes` auf eine Datei, die alle benutzerdefinierten Typen enthält. Dies ist eine standardmäßige Chainspec-Datei, die die von dieser Blockchain unterstützten spezifischen Typen entweder in `.json`, `.yaml`- oder `.js`-Format.
- `yml network: genesisHash: '0x91b171bb158e2d3848fa23a9f1c25182fb8e20313b2c1eb49219da7a70ce90c3' endpoint: 'ws://host.kittychain.io/public-ws' chaintypes: file: ./types.json # Der relative Dateipfad, in dem benutzerdefinierte Typen gespeichert werden`
- `yml ... network: endpoint: "ws://host.kittychain.io/public-ws" types: { "KittyIndex": "u32", "Kitty": "[u8; 16]" } # typesChain: { chain: { Type5: 'example' } } # typesSpec: { spec: { Type6: 'example' } } dataSources: - name: runtime kind: substrate/Runtime startBlock: 1 filter: #Optional specName: kitty-chain mapping: handlers: - handler: handleKittyBred kind: substrate/CallHandler filter: module: kitties method: breed success: true`
+::: code-tabs @tab v0.2.0 `yml network: genesisHash: '0x91b171bb158e2d3848fa23a9f1c25182fb8e20313b2c1eb49219da7a70ce90c3' endpoint: 'ws://host.kittychain.io/public-ws' chaintypes: file: ./types.json # Der relative Dateipfad, in dem benutzerdefinierte Typen gespeichert werden`
+@tab v0.0.1 `yml ... network: endpoint: "ws://host.kittychain.io/public-ws" types: { "KittyIndex": "u32", "Kitty": "[u8; 16]" } # typesChain: { chain: { Type5: 'example' } } # typesSpec: { spec: { Type6: 'example' } } dataSources: - name: runtime kind: substrate/Runtime startBlock: 1 filter: #Optional specName: kitty-chain mapping: handlers: - handler: handleKittyBred kind: substrate/CallHandler filter: module: kitties method: breed success: true` :::
-Um Typoskript für Ihre Chaintypendatei zu verwenden, fügen Sie es in den `src`-Ordner ein (z. B. `./src/types.ts`), führen Sie `yarn build aus. 4> und zeigen Sie dann auf die generierte js-Datei, die sich im Ordner dist` befindet.
+Um TypeScript für Ihre Chaintypendatei zu verwenden, fügen Sie es in den `src`-Ordner ein (z. B. `./src/types.ts`), führen Sie `yarn build` aus und zeigen Sie dann auf die generierte js-Datei, die sich im Ordner `dist` befindet.
```yml
network:
@@ -171,7 +171,7 @@ Beachten Sie Folgendes bei der Verwendung der Chaintypendatei mit der Erweiterun
Hier ist ein Beispiel für eine `.ts`-Chaintypdatei:
- `ts import { typesBundleDeprecated } from "moonbeam-types-bundle" export default { typesBundle: typesBundleDeprecated }; `
+::: code-tabs @tab types.ts `ts import { typesBundleDeprecated } from "moonbeam-types-bundle" export default { typesBundle: typesBundleDeprecated }; ` :::
## Benutzerdefinierte Datenquellen
@@ -197,6 +197,6 @@ Benutzer können `Datenquellen` einen `Filter` hinzufügen, um zu entscheiden, w
Unten sehen Sie ein Beispiel, das verschiedene Datenquellen für das Polkadot- und das Kusama-Netzwerk zeigt.
- ```yaml --- network: endpoint: 'wss://polkadot.api.onfinality.io/public-ws' #Erstellen Sie eine Vorlage, um Redundanzen zu vermeiden definitions: mapping: &mymapping handlers: - handler: handleBlock kind: substrate/BlockHandler dataSources: - name: polkadotRuntime kind: substrate/Runtime filter: #Optional specName: polkadot startBlock: 1000 mapping: *mymapping #verwenden Sie die Vorlage hier - name: kusamaRuntime kind: substrate/Runtime filter: specName: kusama startBlock: 12000 mapping: *mymapping # Man kann wiederverwenden oder ändern
+::: code-tabs @tab v0.0.1 ```yaml --- network: endpoint: 'wss://polkadot.api.onfinality.io/public-ws' #Erstellen Sie eine Vorlage, um Redundanzen zu vermeiden definitions: mapping: &mymapping handlers: - handler: handleBlock kind: substrate/BlockHandler dataSources: - name: polkadotRuntime kind: substrate/Runtime filter: #Optional specName: polkadot startBlock: 1000 mapping: *mymapping #verwenden Sie die Vorlage hier - name: kusamaRuntime kind: substrate/Runtime filter: specName: kusama startBlock: 12000 mapping: *mymapping # Man kann wiederverwenden oder ändern
-
+:::
diff --git a/docs/de/build/mapping.md b/docs/de/build/mapping.md
index cf0297360e6..b1997d9578d 100644
--- a/docs/de/build/mapping.md
+++ b/docs/de/build/mapping.md
@@ -67,9 +67,9 @@ Unser Ziel ist es, alle Datenquellen für Benutzer für das Mapping von Handlern
Dies sind die Schnittstellen, die wir derzeit unterstützen:
-- [api.query.<module>.<method>()](https://polkadot.js.org/docs/api/start/api.query) fragt den aktuellen Block ab.
-- [api.query.<module>.<method>.multi()](https://polkadot.js.org/docs/api/start/api.query.multi/#multi-queries-same-type) führt mehrere Abfragen des gleichen Typs im aktuellen Block durch.
-- [api.queryMulti()](https://polkadot.js.org/docs/api/start/api.query.multi/#multi-queries-distinct-types) führt im aktuellen Block mehrere Abfragen verschiedener Typen durch.
+- [api.query.<module>.<method>()](https://polkadot.js.org/docs/api/start/api.query) fragt den **aktuellen** Block ab.
+- [api.query.<module>.<method>.multi()](https://polkadot.js.org/docs/api/start/api.query.multi/#multi-queries-same-type) führt mehrere Abfragen des **gleichen** Typs im aktuellen Block durch.
+- [api.queryMulti()](https://polkadot.js.org/docs/api/start/api.query.multi/#multi-queries-distinct-types) führt im aktuellen Block mehrere Abfragen **verschiedener** Typen durch.
Dies sind die Schnittstellen, die wir derzeit **NICHT** unterstützen:
diff --git a/docs/de/build/substrate-evm.md b/docs/de/build/substrate-evm.md
index d632804469c..7e452eae777 100644
--- a/docs/de/build/substrate-evm.md
+++ b/docs/de/build/substrate-evm.md
@@ -74,7 +74,7 @@ Funktioniert genauso wie [substrate/EventHandler](../create/mapping/#event-handl
| ------- | ------------ | ---------------------------------------------------------------- | --------------------------------------------------------------------------------------------------------------------------------------------------- |
| Themen | String Array | Transfer(Addresse indexed von,Addresse indexed zu,uint256 value) | Der Themenfilter folgt den Ethereum JSON-PRC-Protokollfiltern, weitere Dokumentation finden Sie [hier](https://docs.ethers.io/v5/concepts/events/). |
-Hinweis zu Themen:
+**Hinweis zu Themen:**
Es gibt einige Verbesserungen der grundlegenden Protokollfilter:
- Themen müssen nicht 0 gepolstert sein
@@ -97,17 +97,17 @@ dataSources:
- kind: substrate/Moonbeam
startBlock: 752073
processor:
- file: './node_modules/@subql/contract-processors/dist/moonbeam.js'
+ file: "./node_modules/@subql/contract-processors/dist/moonbeam.js"
options:
# Must be a key of assets
abi: erc20
# Contract address (or recipient if transfer) to filter, if `null` should be for contract creation
- address: '0x6bd193ee6d2104f14f94e2ca6efefae561a4334b'
+ address: "0x6bd193ee6d2104f14f94e2ca6efefae561a4334b"
assets:
erc20:
- file: './erc20.abi.json'
+ file: "./erc20.abi.json"
mapping:
- file: './dist/index.js'
+ file: "./dist/index.js"
handlers:
- handler: handleMoonriverEvent
kind: substrate/MoonbeamEvent
@@ -122,7 +122,7 @@ dataSources:
# function: '0x7ff36ab500000000000000000000000000000000000000000000000000000000'
# function: approve(address,uint256)
function: approve(address to,uint256 value)
- from: '0x6bd193ee6d2104f14f94e2ca6efefae561a4334b'
+ from: "0x6bd193ee6d2104f14f94e2ca6efefae561a4334b"
```
## Bekannte Einschränkungen
diff --git a/docs/de/faqs/faqs.md b/docs/de/faqs/faqs.md
index af3a214b2ca..77abd56003e 100644
--- a/docs/de/faqs/faqs.md
+++ b/docs/de/faqs/faqs.md
@@ -16,7 +16,7 @@ SubQuery bietet Entwicklern außerdem kostenloses, produktionstaugliches Hosting
**SubQuery-Netzwerk**
-Das SubQuery-Netzwerk ermöglicht Entwicklern die vollständige Dezentralisierung ihres Infrastruktur-Stacks. Es ist der offenste, leistungsfähigste, zuverlässigste und skalierbarste Datendienst für dApps. Das SubQuery-Netzwerk indiziert und stellt Daten für die globale Gemeinschaft auf eine motivierte und überprüfbare Weise bereit. Nachdem Sie Ihr Projekt im SubQuery-Netzwerk veröffentlicht haben, kann jeder es indizieren und hosten, wodurch Daten schneller und zuverlässiger für Benutzer auf der ganzen Welt bereitgestellt werden.
+Das SubQuery-Netzwerk ermöglicht Entwicklern die vollständige Dezentralisierung ihres Infrastruktur-Stacks. Es ist der offenste, leistungsfähigste, zuverlässigste und skalierbarste Datendienst für dApps. Das SubQuery-Netzwerk indiziert und stellt Daten für die globale Gemeinschaft auf eine motivierte und überprüfbare Weise bereit. Nachdem Sie Ihr Projekt im SubQuery-Netzwerk veröffentlicht haben, kann jeder es indizieren und hosten, wodurch Daten schneller und zuverlässiger für Benutzer auf der ganzen Welt bereitgestellt werden.
Weitere Info finden Sie [hier](/subquery_network/introduction.md).
@@ -26,7 +26,7 @@ Der beste Einstieg in SubQuery ist unser [Hello World Tutorial](/assets/pdf/Hell
## Wie kann ich zu SubQuery beitragen oder Feedback geben?
-Wir lieben Beiträge und Feedback aus der Community. Um den Code beizutragen, forken Sie das Repository Ihres Interesses und nehmen Sie Ihre Änderungen vor. Senden Sie dann einen PR- oder Pull-Request. Vergessen Sie nicht, auch zu testen. Sehen Sie sich auch unsere Richtlinien für Beiträge an
+Wir lieben Beiträge und Feedback aus der Community. Um den Code beizutragen, forken Sie das Repository Ihres Interesses und nehmen Sie Ihre Änderungen vor. Senden Sie dann einen PR- oder Pull-Request. Vergessen Sie nicht, auch zu testen. Sehen Sie sich auch unsere [Richtlinien für Beiträge](../miscellaneous/contributing.html) an
Um Feedback zu geben, kontaktiere uns unter hello@subquery.net oder besuche unseren [Discord-Kanal](https://discord.com/invite/78zg8aBSMG).
@@ -76,7 +76,6 @@ subql-node -f . --force-clean --subquery-name=
Beachten Sie, dass empfohlen wird, `--force-clean` zu verwenden, wenn Sie den `startBlock` im Projektmanifest (`project.yaml`) ändern, um zu beginnen Neuindizierung aus dem konfigurierten Block. Wenn `startBlock` ohne `--force-clean` des Projekts geändert wird, dann wird der Indexer die Indizierung mit dem zuvor konfigurierten `startBlock` fortsetzen.
-
## Wie kann ich mein Projekt optimieren, um es zu beschleunigen?
Leistung ist ein entscheidender Faktor in jedem Projekt. Glücklicherweise gibt es mehrere Dinge, die Sie tun könnten, um es zu verbessern. Hier ist die Liste mit einigen Vorschlägen:
@@ -89,13 +88,13 @@ Leistung ist ein entscheidender Faktor in jedem Projekt. Glücklicherweise gibt
- Setzen Sie den Startblock auf den Zeitpunkt, an dem der Vertrag initialisiert wurde.
- Verwenden Sie immer ein [Wörterbuch](../tutorials_examples/dictionary.html#how-does-a-subquery-dictionary-work) (wir können Ihnen helfen, eines für Ihr neues Netzwerk zu erstellen).
- Optimieren Sie Ihr Schemadesign, halten Sie es so einfach wie möglich.
- - Versuchen Sie unnötige Felder und Spalten zu reduzieren.
- - Erstellen Sie nach Bedarf Indizes.
+ - Versuchen Sie unnötige Felder und Spalten zu reduzieren.
+ - Erstellen Sie nach Bedarf Indizes.
- Verwenden Sie so oft wie möglich Parallel-/Batch-Verarbeitung.
- - Verwenden Sie `api.queryMulti()`, um Polkadot-API-Aufrufe innerhalb von Mapping-Funktionen zu optimieren und sie parallel abzufragen. Dies ist ein schnellerer Weg als eine Schleife.
- - Verwende `Promise.all()`. Bei mehreren asynchronen Funktionen ist es besser, sie parallel auszuführen und aufzulösen.
- - Wenn Sie viele Entitäten in einem einzigen Handler erstellen möchten, können Sie `store.bulkCreate(entityName: string, entity: Entity[])` verwenden. Sie können sie parallel erstellen, ohne dass Sie dies einzeln tun müssen.
+ - Verwenden Sie `api.queryMulti()`, um Polkadot-API-Aufrufe innerhalb von Mapping-Funktionen zu optimieren und sie parallel abzufragen. Dies ist ein schnellerer Weg als eine Schleife.
+ - Verwende `Promise.all()`. Bei mehreren asynchronen Funktionen ist es besser, sie parallel auszuführen und aufzulösen.
+ - Wenn Sie viele Entitäten in einem einzigen Handler erstellen möchten, können Sie `store.bulkCreate(entityName: string, entity: Entity[])` verwenden. Sie können sie parallel erstellen, ohne dass Sie dies einzeln tun müssen.
- Das Ausführen von API-Aufrufen zum Abfragen des Zustands kann langsam sein. Versuchen Sie, solche Aufrufe nach Möglichkeit zu minimieren und stattdessen `extrinsische/Transaktions-/Ereignisdaten` zu verwenden.
- Verwenden Sie `Worker-Threads`, um den Blockabruf und die Blockverarbeitung in einen eigenen Worker-Thread zu verschieben. Das kann die Indizierung um bis zu das Vierfache beschleunigen (abhängig vom jeweiligen Projekt). Sie können dies einfach mit dem Flag `--workers=` aktivieren. Beachten Sie, dass die Anzahl der verfügbaren CPU-Kerne die Anzahl der nutzbaren Worker-Threads begrenzt. Derzeit ist die Funktion nur für Substrate und Cosmos verfügbar und wird bald für Avalanche integriert.
- Beachten Sie, dass `JSON.stringify` keine nativen `BigInts` unterstützt. Unsere Protokollierungsbibliothek wird dies intern tun, wenn Sie versuchen, ein Objekt zu protokollieren. Wir suchen nach einem Workaround dafür.
-- Verwenden Sie einen praktischen `Modulo`-Filter, um einen Handler nur einmal für einen bestimmten Block auszuführen. Dieser Filter ermöglicht die Verarbeitung einer beliebigen Anzahl von Blöcken, was äußerst nützlich ist, um Daten in einem festgelegten Intervall zu gruppieren und zu berechnen. Wenn Modulo beispielsweise auf 50 eingestellt ist, wird der Blockhandler alle 50 Blöcke ausgeführt. Es bietet Entwicklern noch mehr Kontrolle über die Indizierung von Daten und kann wie unten in Ihrem Projektmanifest implementiert werden.
\ No newline at end of file
+- Verwenden Sie einen praktischen `Modulo`-Filter, um einen Handler nur einmal für einen bestimmten Block auszuführen. Dieser Filter ermöglicht die Verarbeitung einer beliebigen Anzahl von Blöcken, was äußerst nützlich ist, um Daten in einem festgelegten Intervall zu gruppieren und zu berechnen. Wenn Modulo beispielsweise auf 50 eingestellt ist, wird der Blockhandler alle 50 Blöcke ausgeführt. Es bietet Entwicklern noch mehr Kontrolle über die Indizierung von Daten und kann wie unten in Ihrem Projektmanifest implementiert werden.
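Als Orientierung eine minimale Skizze eines solchen Modulo-Filters im Projektmanifest (der Handler-Name `handleBlock` ist hier nur beispielhaft gewählt):

```yml
handlers:
  - handler: handleBlock
    kind: substrate/BlockHandler
    filter:
      modulo: 50 # handleBlock wird nur alle 50 Blöcke ausgeführt
```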
diff --git a/docs/de/miscellaneous/contributing.md b/docs/de/miscellaneous/contributing.md
index 059e57264ff..f5cae8b4c6d 100644
--- a/docs/de/miscellaneous/contributing.md
+++ b/docs/de/miscellaneous/contributing.md
@@ -2,7 +2,7 @@
Herzlich Willkommen und vielen Dank, dass Sie in Erwägung ziehen, zu diesem SubQuery-Projekt beizutragen! Gemeinsam können wir den Weg in eine dezentralere Zukunft ebnen.
-::: info Hinweis Diese Dokumentation wird vom SubQuery-Team aktiv gepflegt. Wir freuen uns über Ihre Beiträge. Sie können dies tun, indem Sie unser GitHub-Projekt verzweigen und Änderungen an allen Dokumentations-Markdown-Dateien im Verzeichnis `docs` vornehmen. :::
+::: tip Hinweis Diese Dokumentation wird vom SubQuery-Team aktiv gepflegt. Wir freuen uns über Ihre Beiträge. Sie können dies tun, indem Sie unser GitHub-Projekt verzweigen und Änderungen an allen Dokumentations-Markdown-Dateien im Verzeichnis `docs` vornehmen. :::
Was folgt, ist eine Reihe von Richtlinien (keine Regeln) für das Mitwirken an SubQuery. Die Befolgung dieser Richtlinien hilft uns dabei, den Beitragsprozess für alle Beteiligten einfach und effektiv zu gestalten. Es teilt auch mit, dass Sie sich bereit erklären, die Zeit der Entwickler zu respektieren, die dieses Projekt verwalten und entwickeln. Im Gegenzug werden wir diesen Respekt erwidern, indem wir Ihr Problem angehen, Änderungen in Betracht ziehen, an Verbesserungen mitarbeiten und Ihnen helfen, Ihre Pull-Anfragen abzuschließen.
@@ -14,8 +14,8 @@ Wir nehmen unsere Open-Source-Community-Projekte und Verantwortung ernst und hal
Beiträge zu unseren Repositories erfolgen über Issues und Pull Requests (PRs). Einige allgemeine Richtlinien, die beides abdecken:
-* Suchen Sie nach bestehenden Problemen und PRs, bevor Sie Ihre eigenen erstellen.
-* Wir arbeiten hart daran, sicherzustellen, dass Probleme umgehend behoben werden, aber je nach Auswirkung kann es eine Weile dauern, die Ursache zu untersuchen. Eine freundliche @-Erwähnung im Kommentarthread an den Einreicher oder einen Beitragenden kann helfen, Aufmerksamkeit zu erregen, wenn Ihr Problem blockiert.
+- Suchen Sie nach bestehenden Problemen und PRs, bevor Sie Ihre eigenen erstellen.
+- Wir arbeiten hart daran, sicherzustellen, dass Probleme umgehend behoben werden, aber je nach Auswirkung kann es eine Weile dauern, die Ursache zu untersuchen. Eine freundliche @-Erwähnung im Kommentarthread an den Einreicher oder einen Beitragenden kann helfen, Aufmerksamkeit zu erregen, wenn Ihr Problem blockiert.
## Wie man einen Beitrag leistet
@@ -23,32 +23,32 @@ Beiträge zu unseren Repositories erfolgen über Issues und Pull Requests (PRs).
Fehler werden als GitHub-Probleme verfolgt. Wenn Sie ein Problem protokollieren, erklären Sie das Problem und fügen Sie zusätzliche Details hinzu, um den Betreuern zu helfen, das Problem zu reproduzieren:
-* Verwenden Sie einen klaren und beschreibenden Titel für das Problem, um das Problem zu identifizieren.
-* Beschreiben Sie die genauen Schritte, um das Problem zu reproduzieren.
-* Beschreiben Sie das Verhalten, das Sie beobachtet haben, nachdem Sie die Schritte ausgeführt haben.
-* Erklären Sie, welches Verhalten Sie stattdessen erwartet haben und warum.
-* Fügen Sie wenn möglich Screenshots hinzu.
+- Verwenden Sie einen klaren und beschreibenden Titel für das Problem, um das Problem zu identifizieren.
+- Beschreiben Sie die genauen Schritte, um das Problem zu reproduzieren.
+- Beschreiben Sie das Verhalten, das Sie beobachtet haben, nachdem Sie die Schritte ausgeführt haben.
+- Erklären Sie, welches Verhalten Sie stattdessen erwartet haben und warum.
+- Fügen Sie wenn möglich Screenshots hinzu.
### Senden von Pull-Requests
Im Allgemeinen folgen wir dem „Fork-and-Pull“-Git-Workflow:
-* Verzweigen Sie das Repository zu Ihrem eigenen Github-Konto.
-* Klonen Sie das Projekt auf Ihren Computer.
-* Erstellen Sie lokal einen Zweig mit einem prägnanten, aber aussagekräftigen Namen.
-* Übernehmen Sie Änderungen an der Verzweigung.
-* Befolgen Sie alle Formatierungs- und Testrichtlinien, die für dieses Repo spezifisch sind.
-* Pushen Sie Änderungen an Ihren Fork.
-* Öffnen Sie eine PR in unserem Repository.
+- Verzweigen Sie das Repository zu Ihrem eigenen Github-Konto.
+- Klonen Sie das Projekt auf Ihren Computer.
+- Erstellen Sie lokal einen Zweig mit einem prägnanten, aber aussagekräftigen Namen.
+- Übernehmen Sie Änderungen an der Verzweigung.
+- Befolgen Sie alle Formatierungs- und Testrichtlinien, die für dieses Repo spezifisch sind.
+- Pushen Sie Änderungen an Ihren Fork.
+- Öffnen Sie eine PR in unserem Repository.
## Kodierungskonventionen
### Git-Commit-Nachrichten
-* Verwenden Sie die Gegenwartsform ("Add feature" not "Added feature").
-* Verwenden Sie den Imperativ ("Move cursor to..." not "Moves cursor to...").
-* Beschränken Sie die erste Zeile auf 72 Zeichen oder weniger.
+- Verwenden Sie die Gegenwartsform ("Add feature" not "Added feature").
+- Verwenden Sie den Imperativ ("Move cursor to..." not "Moves cursor to...").
+- Beschränken Sie die erste Zeile auf 72 Zeichen oder weniger.
### JavaScript-Styleguide
-* Der gesamte JavaScript-Code ist mit Prettier und ESLint versehen.
+- Der gesamte JavaScript-Code ist mit Prettier und ESLint versehen.
diff --git a/docs/de/quickstart/helloworld-localhost.md b/docs/de/quickstart/helloworld-localhost.md
index 3d212fe4ce7..c07d1d5c6b3 100644
--- a/docs/de/quickstart/helloworld-localhost.md
+++ b/docs/de/quickstart/helloworld-localhost.md
@@ -88,8 +88,8 @@ cd subqlHelloWorld
Führen Sie nun eine Yarn- oder Nodeinstallation durch, um die verschiedenen Abhängigkeiten zu installieren.
- ```shell yarn install ```
- ```bash npm install ```
+::: code-tabs @tab:active yarn `shell yarn install `
+@tab npm `bash npm install ` :::
Beispiel von`yarn install`
@@ -109,8 +109,8 @@ success Saved lockfile.
Führen Sie nun `yarn codegen` aus, um Typescript aus dem GraphQL-Schema zu generieren.
- ```shell yarn codegen ```
- ```bash npm run-script codegen ```
+::: code-tabs @tab:active yarn `shell yarn codegen `
+@tab npm `bash npm run-script codegen ` :::
Beispiel von `yarn codegen`
@@ -133,8 +133,8 @@ $ ./node_modules/.bin/subql codegen
Der nächste Schritt besteht darin, den Code mit `yarn build` zu erstellen.
- ```shell yarn build ```
- ```bash npm run-script build ```
+::: code-tabs @tab:active yarn `shell yarn build `
+@tab npm `bash npm run-script build ` :::
Beispiel von`yarn build`
diff --git a/docs/de/quickstart/quickstart-avalanche.md b/docs/de/quickstart/quickstart-avalanche.md
index 3c7fba267e3..e5a68db9fea 100644
--- a/docs/de/quickstart/quickstart-avalanche.md
+++ b/docs/de/quickstart/quickstart-avalanche.md
@@ -59,8 +59,8 @@ Nachdem der Initialisierungsprozess abgeschlossen ist, sollten Sie sehen, dass e
Führen Sie zuletzt im Projektverzeichnis den folgenden Befehl aus, um die Abhängigkeiten des neuen Projekts zu installieren.
- ```shell cd PROJECT_NAME yarn install ```
- ```shell cd PROJECT_NAME npm install ```
+::: code-tabs @tab:active yarn `shell cd PROJECT_NAME yarn install `
+@tab npm `shell cd PROJECT_NAME npm install ` :::
## Änderungen an Ihrem Projekt vornehmen
@@ -92,8 +92,8 @@ type PangolinApproval @entity {
**Wichtig: Wenn Sie Änderungen an der Schemadatei vornehmen, stellen Sie bitte sicher, dass Sie Ihr Typenverzeichnis neu generieren. Tun Sie dies jetzt.**
- ```shell yarn codegen ```
- ```shell npm run-script codegen ```
+::: code-tabs @tab:active yarn `shell yarn codegen `
+@tab npm `shell npm run-script codegen ` :::
Sie finden die generierten Modelle im Verzeichnis `/src/types/models`. Weitere Informationen zur Datei `schema.graphql` finden Sie in unserer Dokumentation unter [Build/GraphQL Schema](../build/graphql.md)
@@ -169,7 +169,7 @@ Weitere Informationen zu Mapping-Funktionen finden Sie in unserer Dokumentation
Um Ihr neues SubQuery-Projekt auszuführen, müssen wir zuerst unsere Arbeit erstellen. Führen Sie den Build-Befehl im Stammverzeichnis des Projekts aus.
- ```shell yarn build ``` ```shell npm run-script build ```
+::: code-tabs @tab:active yarn `shell yarn build ` @tab npm `shell npm run-script build ` :::
**Wichtig: Wenn Sie Änderungen an Ihren Zuordnungsfunktionen vornehmen, müssen Sie Ihr Projekt neu erstellen**
@@ -183,13 +183,11 @@ Die gesamte Konfiguration, die steuert, wie ein SubQuery-Node ausgeführt wird,
Führen Sie im Projektverzeichnis den folgenden Befehl aus:
- ```shell yarn start:docker ``` ```shell npm run-script start:docker ```
+::: code-tabs @tab:active yarn `shell yarn start:docker ` @tab npm `shell npm run-script start:docker ` :::
Es kann einige Zeit dauern, die erforderlichen Pakete herunterzuladen ([`@subql/node`](https://www.npmjs.com/package/@subql/node),
-`@subql/query`7 > und Postgres) zum ersten Mal, aber bald werden Sie einen laufenden SubQuery-Node sehen. Seien Sie hier bitte geduldig.
-
-
+`@subql/query` und Postgres) zum ersten Mal, aber bald werden Sie einen laufenden SubQuery-Node sehen. Seien Sie hier bitte geduldig.
### Fragen Sie Ihr Projekt ab
@@ -199,8 +197,6 @@ Sie sollten sehen, dass im Explorer ein GraphQL-Playground angezeigt wird und di
Für ein neues SubQuery-Starterprojekt können Sie die folgende Abfrage ausprobieren, um einen Eindruck davon zu bekommen, wie sie funktioniert, oder [mehr über die GraphQL-Abfragesprache erfahren](../run_publish/graphql.md).
-
-
```graphql
query {
pangolinApprovals(first: 5) {
@@ -215,21 +211,14 @@ query {
}
}
}
-
-
```
-
-
-
### Veröffentlichen Sie Ihr SubQuery-Projekt
SubQuery bietet einen kostenlosen verwalteten Dienst, wenn Sie Ihr neues Projekt bereitstellen können. Sie können es in [SubQuery-Projekten](https://project.subquery.network) bereitstellen und mit unserem [Explorer](https://explorer.subquery.network) abfragen.
[Lesen Sie den Leitfaden zur Veröffentlichung Ihres neuen Projekts in SubQuery Projects](../run_publish/publish.md), **Beachten Sie, dass Sie es über IPFS bereitstellen müssen**.
-
-
## Weitere Schritte
Herzlichen Glückwunsch, Sie haben jetzt ein lokal ausgeführtes SubQuery-Projekt, das GraphQL-API-Anforderungen für die Übertragung von Daten von bLuna akzeptiert.
diff --git a/docs/de/quickstart/quickstart-cosmos.md b/docs/de/quickstart/quickstart-cosmos.md
index 1a9b834e783..1f24ffc14dd 100644
--- a/docs/de/quickstart/quickstart-cosmos.md
+++ b/docs/de/quickstart/quickstart-cosmos.md
@@ -44,8 +44,8 @@ Nachdem der Initialisierungsprozess abgeschlossen ist, sollten Sie sehen, dass e
Führen Sie zuletzt im Projektverzeichnis den folgenden Befehl aus, um die Abhängigkeiten des neuen Projekts zu installieren.
- ```shell cd PROJECT_NAME yarn install ```
- ```shell cd PROJECT_NAME npm install ```
+::: code-tabs @tab:active yarn `shell cd PROJECT_NAME yarn install `
+@tab npm `shell cd PROJECT_NAME npm install ` :::
## Änderungen an Ihrem Projekt vornehmen
@@ -75,8 +75,8 @@ type Vote @entity {
**Wichtig: Wenn Sie Änderungen an der Schemadatei vornehmen, stellen Sie bitte sicher, dass Sie Ihr Typenverzeichnis neu generieren. Tun Sie dies jetzt.**
- ```shell yarn codegen ```
- ```shell npm run-script codegen ```
+::: code-tabs @tab:active yarn `shell yarn codegen `
+@tab npm `shell npm run-script codegen ` :::
Sie finden die generierten Modelle im Verzeichnis `/src/types/models`. Weitere Informationen zur Datei `schema.graphql` finden Sie in unserer Dokumentation unter [Build/GraphQL Schema](../build/graphql.md)
@@ -145,7 +145,7 @@ Weitere Informationen zu Mapping-Funktionen finden Sie in unserer Dokumentation
Um Ihr neues SubQuery-Projekt auszuführen, müssen wir zuerst unsere Arbeit erstellen. Führen Sie den Build-Befehl im Stammverzeichnis des Projekts aus.
- ```shell yarn build ``` ```shell npm run-script build ```
+::: code-tabs @tab:active yarn `shell yarn build ` @tab npm `shell npm run-script build ` :::
**Wichtig: Wenn Sie Änderungen an Ihren Zuordnungsfunktionen vornehmen, müssen Sie Ihr Projekt neu erstellen**
@@ -159,13 +159,11 @@ Die gesamte Konfiguration, die steuert, wie ein SubQuery-Node ausgeführt wird,
Führen Sie im Projektverzeichnis den folgenden Befehl aus:
- ```shell yarn start:docker ``` ```shell npm run-script start:docker ```
+::: code-tabs @tab:active yarn `shell yarn start:docker ` @tab npm `shell npm run-script start:docker ` :::
Es kann einige Zeit dauern, die erforderlichen Pakete herunterzuladen ([`@subql/node`](https://www.npmjs.com/package/@subql/node),
-`@subql/query`7 > und Postgres) zum ersten Mal, aber bald werden Sie einen laufenden SubQuery-Knoten sehen. Seien Sie hier bitte geduldig.
-
-
+`@subql/query` und Postgres) zum ersten Mal, aber bald werden Sie einen laufenden SubQuery-Knoten sehen. Seien Sie hier bitte geduldig.
### Fragen Sie Ihr Projekt ab
@@ -175,14 +173,11 @@ Sie sollten sehen, dass im Explorer ein GraphQL-Playground angezeigt wird und di
Für ein neues SubQuery-Starterprojekt können Sie die folgende Abfrage ausprobieren, um einen Eindruck davon zu bekommen, wie sie funktioniert, oder [mehr über die GraphQL-Abfragesprache erfahren](../run_publish/graphql.md).
-
-
```graphql
query {
- votes(
+ votes(
first: 5
- orderBy: BLOCK_HEIGHT_DESC
- # filter: {proposalID: {equalTo: "4"}}
+ orderBy: BLOCK_HEIGHT_DESC # filter: {proposalID: {equalTo: "4"}}
) {
nodes {
id
@@ -194,19 +189,14 @@ query {
}
```
-
Den endgültigen Code dieses Projekts können Sie hier unter [https://github.com/jamesbayly/juno-terra-developer-fund-votes](https://github.com/jamesbayly/juno-terra-developer-fund-votes) einsehen
-
-
### Veröffentlichen Sie Ihr SubQuery-Projekt
SubQuery bietet einen kostenlosen verwalteten Dienst, wenn Sie Ihr neues Projekt bereitstellen können. Sie können es in [SubQuery-Projekten](https://project.subquery.network) bereitstellen und mit unserem [Explorer](https://explorer.subquery.network) abfragen.
[Lesen Sie die Anleitung zum Veröffentlichen Ihres neuen Projekts in SubQuery Projects](../publish/publish.md)
-
-
## Weitere Schritte
Herzlichen Glückwunsch, Sie haben jetzt ein lokal ausgeführtes SubQuery-Projekt, das GraphQL-API-Anforderungen für die Übertragung von Daten von bLuna akzeptiert.
diff --git a/docs/de/quickstart/quickstart-polkadot.md b/docs/de/quickstart/quickstart-polkadot.md
index 45c2c872d01..6479d41b359 100644
--- a/docs/de/quickstart/quickstart-polkadot.md
+++ b/docs/de/quickstart/quickstart-polkadot.md
@@ -43,10 +43,10 @@ subql init
Während das SubQuery-Projekt initialisiert wird, werden Ihnen bestimmte Fragen gestellt:
- Projektname: Ein Projektname für Ihr SubQuery-Projekt
-- Netzwerkfamilie: Die Layer-1-Blockchain-Netzwerkfamilie, für deren Indizierung dieses SubQuery-Projekt entwickelt wird. Verwenden Sie die Pfeiltasten, um aus den verfügbaren Optionen auszuwählen. Für diese Anleitung verwenden wir *"Substrat"*
-- Netzwerk: Das spezifische Netzwerk, für das dieses SubQuery-Projekt entwickelt wird, um es zu indizieren. Verwenden Sie die Pfeiltasten, um aus den verfügbaren Optionen auszuwählen. Für diese Anleitung verwenden wir *"Polkadot"*
-- Vorlagenprojekt: Wählen Sie ein SubQuery-Vorlagenprojekt aus, das als Ausgangspunkt für den Beginn der Entwicklung dient. Wir empfehlen, das Projekt *"subql-starter"* auszuwählen.
-- RPC-Endpunkt: Geben Sie eine HTTPS-URL zu einem ausgeführten RPC-Endpunkt an, der standardmäßig für dieses Projekt verwendet wird. Sie können schnell auf öffentliche Endpunkte für verschiedene Polkadot-Netzwerke zugreifen, Ihren eigenen privaten dedizierten Nodes mit [OnFinality](https://app.onfinality.io) erstellen oder einfach den standardmäßigen Polkadot-Endpunkt verwenden. Dieser RPC-Node muss ein Archivnode sein (den Zustand der vollständigen Chain haben). Für diese Anleitung verwenden wir den Standardwert *"https://polkadot.api.onfinality.io"*
+- Netzwerkfamilie: Die Layer-1-Blockchain-Netzwerkfamilie, für deren Indizierung dieses SubQuery-Projekt entwickelt wird. Verwenden Sie die Pfeiltasten, um aus den verfügbaren Optionen auszuwählen. Für diese Anleitung verwenden wir _"Substrate"_
+- Netzwerk: Das spezifische Netzwerk, für das dieses SubQuery-Projekt entwickelt wird, um es zu indizieren. Verwenden Sie die Pfeiltasten, um aus den verfügbaren Optionen auszuwählen. Für diese Anleitung verwenden wir _"Polkadot"_
+- Vorlagenprojekt: Wählen Sie ein SubQuery-Vorlagenprojekt aus, das als Ausgangspunkt für den Beginn der Entwicklung dient. Wir empfehlen, das Projekt _"subql-starter"_ auszuwählen.
+- RPC-Endpunkt: Geben Sie eine HTTPS-URL zu einem ausgeführten RPC-Endpunkt an, der standardmäßig für dieses Projekt verwendet wird. Sie können schnell auf öffentliche Endpunkte für verschiedene Polkadot-Netzwerke zugreifen, Ihren eigenen privaten dedizierten Nodes mit [OnFinality](https://app.onfinality.io) erstellen oder einfach den standardmäßigen Polkadot-Endpunkt verwenden. Dieser RPC-Node muss ein Archivnode sein (den Zustand der vollständigen Chain haben). Für diese Anleitung verwenden wir den Standardwert _"https://polkadot.api.onfinality.io"_
- Git-Repository: Geben Sie eine Git-URL zu einem Repository an, in dem dieses SubQuery-Projekt gehostet wird (wenn es in SubQuery Explorer gehostet wird), oder akzeptieren Sie die bereitgestellte Standardeinstellung.
- Autoren: Geben Sie hier den Eigentümer dieses SubQuery-Projekts ein (z. B. Ihren Namen!) oder übernehmen Sie die vorgegebene Vorgabe.
- Beschreibung: Geben Sie einen kurzen Absatz zu Ihrem Projekt an, der beschreibt, welche Daten es enthält und was Benutzer damit tun können, oder akzeptieren Sie die bereitgestellte Standardeinstellung.
@@ -57,8 +57,8 @@ Nachdem der Initialisierungsprozess abgeschlossen ist, sollten Sie sehen, dass e
Führen Sie zuletzt im Projektverzeichnis den folgenden Befehl aus, um die Abhängigkeiten des neuen Projekts zu installieren.
- ```shell cd PROJECT_NAME yarn install ```
- ```shell cd PROJECT_NAME npm install ```
+::: code-tabs @tab:active yarn `shell cd PROJECT_NAME yarn install `
+@tab npm `shell cd PROJECT_NAME npm install ` :::
## Änderungen an Ihrem Projekt vornehmen
@@ -88,8 +88,8 @@ type Transfer @entity {
**Wichtig: Wenn Sie Änderungen an der Schemadatei vornehmen, stellen Sie bitte sicher, dass Sie Ihr Typenverzeichnis neu generieren.**
- ```shell yarn codegen ```
- ```shell npm run-script codegen ```
+::: code-tabs @tab:active yarn `shell yarn codegen `
+@tab npm `shell npm run-script codegen ` :::
Sie finden die generierten Modelle im Verzeichnis `/src/types/models`. Weitere Informationen zur Datei `schema.graphql` finden Sie in unserer Dokumentation unter [Build/GraphQL Schema](../build/graphql.md)
@@ -111,7 +111,6 @@ dataSources:
filter:
module: balances
method: Transfer
-
```
Das bedeutet, dass wir jedes Mal, wenn ein `balances.Transfer`-Ereignis auftritt, eine `handleEvent`-Mapping-Funktion ausführen.
@@ -134,22 +133,22 @@ import { Transfer } from "../types";
import { Balance } from "@polkadot/types/interfaces";
export async function handleEvent(event: SubstrateEvent): Promise<void> {
- // Get data from the event
- // The balances.transfer event has the following payload \[from, to, value\]
- // logger.info(JSON.stringify(event));
- const from = event.event.data[0];
- const to = event.event.data[1];
- const amount = event.event.data[2];
-
- // Create the new transfer entity
- const transfer = new Transfer(
- `${event.block.block.header.number.toNumber()}-${event.idx}`,
- );
- transfer.blockNumber = event.block.block.header.number.toBigInt();
- transfer.from = from.toString();
- transfer.to = to.toString();
- transfer.amount = (amount as Balance).toBigInt();
- await transfer.save();
+ // Get data from the event
+ // The balances.transfer event has the following payload [from, to, value]
+ // logger.info(JSON.stringify(event));
+ const from = event.event.data[0];
+ const to = event.event.data[1];
+ const amount = event.event.data[2];
+
+ // Create the new transfer entity
+ const transfer = new Transfer(
+ `${event.block.block.header.number.toNumber()}-${event.idx}`
+ );
+ transfer.blockNumber = event.block.block.header.number.toBigInt();
+ transfer.from = from.toString();
+ transfer.to = to.toString();
+ transfer.amount = (amount as Balance).toBigInt();
+ await transfer.save();
}
```
@@ -161,7 +160,7 @@ Weitere Informationen zu Mapping-Funktionen finden Sie in unserer Dokumentation
Um Ihr neues SubQuery-Projekt auszuführen, müssen wir zuerst unsere Arbeit erstellen. Führen Sie den Build-Befehl im Stammverzeichnis des Projekts aus.
- ```shell yarn build ``` ```shell npm run-script build ```
+::: code-tabs @tab:active yarn `shell yarn build ` @tab npm `shell npm run-script build ` :::
**Wichtig: Wenn Sie Änderungen an Ihren Zuordnungsfunktionen vornehmen, müssen Sie Ihr Projekt neu erstellen**
@@ -175,13 +174,11 @@ Die gesamte Konfiguration, die steuert, wie ein SubQuery-Node ausgeführt wird,
Führen Sie im Projektverzeichnis den folgenden Befehl aus:
- ```shell yarn start:docker ``` ```shell npm run-script start:docker ```
+::: code-tabs @tab:active yarn `shell yarn start:docker ` @tab npm `shell npm run-script start:docker ` :::
Es kann einige Zeit dauern, die erforderlichen Pakete herunterzuladen ([`@subql/node`](https://www.npmjs.com/package/@subql/node),
-`@subql/query`7 > und Postgres) zum ersten Mal, aber bald sollten Sie einen laufenden SubQuery-Node auf dem Terminalbildschirm sehen.
-
-
+`@subql/query` und Postgres) zum ersten Mal, aber bald sollten Sie einen laufenden SubQuery-Node auf dem Terminalbildschirm sehen.
### Fragen Sie Ihr Projekt ab
@@ -191,15 +188,10 @@ Sie sollten einen GraphQL-Playground im Browser und die Schemas sehen, die zur A
Probieren Sie für ein neues SubQuery-Starterprojekt die folgende Abfrage aus, um zu verstehen, wie sie funktioniert, oder erfahren Sie mehr über die [GraphQL-Abfragesprache](../run_publish/graphql.md).
-
-
```graphql
{
query {
- transfers(
- first: 10,
- orderBy: AMOUNT_DESC
- ) {
+ transfers(first: 10, orderBy: AMOUNT_DESC) {
nodes {
id
amount
@@ -210,21 +202,14 @@ Probieren Sie für ein neues SubQuery-Starterprojekt die folgende Abfrage aus, u
}
}
}
-
-
```
-
-
-
### Veröffentlichen Sie Ihr SubQuery-Projekt
SubQuery bietet einen kostenlosen verwalteten Dienst, in dem Sie Ihr neues Projekt bereitstellen können. Sie können es in [SubQuery-Projekten](https://project.subquery.network) bereitstellen und mit unserem [Explorer](https://explorer.subquery.network) abfragen.
Lesen Sie den Leitfaden zum [Veröffentlichen Ihres neuen Projekts in SubQuery Projects](../run_publish/publish.md)
-
-
## Weitere Schritte
Herzlichen Glückwunsch, Sie haben jetzt ein lokal ausgeführtes SubQuery-Projekt, das GraphQL-API-Anforderungen für die Datenübertragung akzeptiert.
diff --git a/docs/de/quickstart/quickstart-terra.md b/docs/de/quickstart/quickstart-terra.md
index 1220736033e..a6458486893 100644
--- a/docs/de/quickstart/quickstart-terra.md
+++ b/docs/de/quickstart/quickstart-terra.md
@@ -45,11 +45,11 @@ subql init
Während das SubQuery-Projekt initialisiert wird, werden Ihnen bestimmte Fragen gestellt:
- Projektname: Ein Name für Ihr SubQuery-Projekt
-- Netzwerkfamilie: Die Layer-1-Blockchain-Netzwerkfamilie, für deren Indizierung dieses SubQuery-Projekt entwickelt wird, verwenden Sie die Pfeiltasten auf Ihrer Tastatur, um aus den Optionen auszuwählen. Für diese Anleitung verwenden wir *"Terra"*
-- Netzwerk: Das spezifische Netzwerk, für dessen Indexierung dieses SubQuery-Projekt entwickelt wird, verwenden Sie die Pfeiltasten auf Ihrer Tastatur, um aus den Optionen auszuwählen. Für diese Anleitung verwenden wir *"Terra"*
-- Vorlage: Wählen Sie eine SubQuery-Projektvorlage aus, die einen Ausgangspunkt für den Beginn der Entwicklung bietet. Wir empfehlen die Auswahl des *"Starter-Projekts"*
+- Netzwerkfamilie: Die Layer-1-Blockchain-Netzwerkfamilie, für deren Indizierung dieses SubQuery-Projekt entwickelt wird, verwenden Sie die Pfeiltasten auf Ihrer Tastatur, um aus den Optionen auszuwählen. Für diese Anleitung verwenden wir _"Terra"_
+- Netzwerk: Das spezifische Netzwerk, für dessen Indexierung dieses SubQuery-Projekt entwickelt wird, verwenden Sie die Pfeiltasten auf Ihrer Tastatur, um aus den Optionen auszuwählen. Für diese Anleitung verwenden wir _"Terra"_
+- Vorlage: Wählen Sie eine SubQuery-Projektvorlage aus, die einen Ausgangspunkt für den Beginn der Entwicklung bietet. Wir empfehlen die Auswahl des _"Starter-Projekts"_
- Git-Repository (optional): Geben Sie eine Git-URL zu einem Repository an, in dem dieses SubQuery-Projekt gehostet wird (wenn es in SubQuery Explorer gehostet wird).
-- RPC-Endpunkt (erforderlich): Geben Sie eine HTTPS-URL zu einem ausgeführten RPC-Endpunkt an, der standardmäßig für dieses Projekt verwendet wird. Dieser RPC-Node muss ein Archivnode sein (den Zustand der vollständigen Chain haben). Für diese Anleitung verwenden wir den Standardwert *"https://terra-columbus-5.beta.api.onfinality.io"*
+- RPC-Endpunkt (erforderlich): Geben Sie eine HTTPS-URL zu einem ausgeführten RPC-Endpunkt an, der standardmäßig für dieses Projekt verwendet wird. Dieser RPC-Node muss ein Archivnode sein (den Zustand der vollständigen Chain haben). Für diese Anleitung verwenden wir den Standardwert _"https://terra-columbus-5.beta.api.onfinality.io"_
- Autoren (erforderlich): Geben Sie hier den Eigentümer dieses SubQuery-Projekts ein (z. B. Ihren Namen!)
- Beschreibung (Optional): Sie können einen kurzen Absatz über Ihr Projekt bereitstellen, der beschreibt, welche Daten es enthält und was Benutzer damit tun können
- Version (erforderlich): Geben Sie eine benutzerdefinierte Versionsnummer ein oder verwenden Sie die Standardversion (`1.0.0`).
@@ -59,8 +59,8 @@ Nachdem der Initialisierungsprozess abgeschlossen ist, sollten Sie sehen, dass e
Führen Sie zuletzt im Projektverzeichnis den folgenden Befehl aus, um die Abhängigkeiten des neuen Projekts zu installieren.
- ```shell cd PROJECT_NAME yarn install ```
- ```shell cd PROJECT_NAME npm install ```
+::: code-tabs @tab:active yarn `shell cd PROJECT_NAME yarn install `
+@tab npm `shell cd PROJECT_NAME npm install ` :::
## Änderungen an Ihrem Projekt vornehmen
@@ -91,8 +91,8 @@ type Transfer @entity {
**Wichtig: Wenn Sie Änderungen an der Schemadatei vornehmen, stellen Sie bitte sicher, dass Sie Ihr Typenverzeichnis neu generieren. Tun Sie dies jetzt.**
- ```shell yarn codegen ```
- ```shell npm run-script codegen ```
+::: code-tabs @tab:active yarn `shell yarn codegen `
+@tab npm `shell npm run-script codegen ` :::
Sie finden die generierten Modelle im Verzeichnis `/src/types/models`. Weitere Informationen zur Datei `schema.graphql` finden Sie in unserer Dokumentation unter [Build/GraphQL Schema](../build/graphql.md)
@@ -143,30 +143,30 @@ import { MsgExecuteContract } from "@terra-money/terra.js";
export async function handleEvent(
event: TerraEvent
): Promise<void> {
- // Debugging-Daten aus dem Ereignis drucken
- // logger.info(JSON.stringify(event));
-
- // Erstellen Sie die neue Übertragungsentität mit einer eindeutigen ID
- const transfer = new Transfer(
- `${event.tx.tx.txhash}-${event.msg.idx}-${event.idx}`
- );
- transfer.blockHeight = BigInt(event.block.block.block.header.height);
- transfer.txHash = event.tx.tx.txhash;
- for (const attr of event.event.attributes) {
- switch (attr.key) {
- case "sender":
- transfer.sender = attr.value;
- break;
- case "recipient":
- transfer.recipient = attr.value;
- break;
- case "amount":
- transfer.amount = attr.value;
- break;
- default:
- }
+ // Debugging-Daten aus dem Ereignis drucken
+ // logger.info(JSON.stringify(event));
+
+ // Erstellen Sie die neue Übertragungsentität mit einer eindeutigen ID
+ const transfer = new Transfer(
+ `${event.tx.tx.txhash}-${event.msg.idx}-${event.idx}`
+ );
+ transfer.blockHeight = BigInt(event.block.block.block.header.height);
+ transfer.txHash = event.tx.tx.txhash;
+ for (const attr of event.event.attributes) {
+ switch (attr.key) {
+ case "sender":
+ transfer.sender = attr.value;
+ break;
+ case "recipient":
+ transfer.recipient = attr.value;
+ break;
+ case "amount":
+ transfer.amount = attr.value;
+ break;
+ default:
}
- await transfer.save();
+ }
+ await transfer.save();
}
```
@@ -178,7 +178,7 @@ Weitere Informationen zu Mapping-Funktionen finden Sie in unserer Dokumentation
Um Ihr neues SubQuery-Projekt auszuführen, müssen wir zuerst unsere Arbeit erstellen. Führen Sie den Build-Befehl im Stammverzeichnis des Projekts aus.
- ```shell yarn build ``` ```shell npm run-script build ```
+::: code-tabs @tab:active yarn `shell yarn build ` @tab npm `shell npm run-script build ` :::
**Wichtig: Wenn Sie Änderungen an Ihren Zuordnungsfunktionen vornehmen, müssen Sie Ihr Projekt neu erstellen**
@@ -192,13 +192,11 @@ Die gesamte Konfiguration, die steuert, wie ein SubQuery-Node ausgeführt wird,
Führen Sie im Projektverzeichnis den folgenden Befehl aus:
- ```shell yarn start:docker ``` ```shell npm run-script start:docker ```
+::: code-tabs @tab:active yarn `shell yarn start:docker ` @tab npm `shell npm run-script start:docker ` :::
Es kann einige Zeit dauern, die erforderlichen Pakete herunterzuladen ([`@subql/node`](https://www.npmjs.com/package/@subql/node),
-`@subql/query`7 > und Postgres) zum ersten Mal, aber bald werden Sie einen laufenden SubQuery-Knoten sehen. Seien Sie hier bitte geduldig.
-
-
+`@subql/query` und Postgres) zum ersten Mal, aber bald werden Sie einen laufenden SubQuery-Knoten sehen. Seien Sie hier bitte geduldig.
### Fragen Sie Ihr Projekt ab
@@ -208,15 +206,10 @@ Sie sollten sehen, dass im Explorer ein GraphQL-Playground angezeigt wird und di
Für ein neues SubQuery-Starterprojekt können Sie die folgende Abfrage ausprobieren, um einen Eindruck davon zu bekommen, wie sie funktioniert, oder [mehr über die GraphQL-Abfragesprache erfahren](../run_publish/graphql.md).
-
-
```graphql
{
query {
- transfers(
- first: 10,
- orderBy: ID_DESC
- ) {
+ transfers(first: 10, orderBy: ID_DESC) {
nodes {
id
txHash
@@ -230,17 +223,12 @@ Für ein neues SubQuery-Starterprojekt können Sie die folgende Abfrage ausprobi
}
```
-
-
-
### Veröffentlichen Sie Ihr SubQuery-Projekt
SubQuery bietet einen kostenlosen verwalteten Dienst, wenn Sie Ihr neues Projekt bereitstellen können. Sie können es in [SubQuery-Projekten](https://project.subquery.network) bereitstellen und mit unserem [Explorer](https://explorer.subquery.network) abfragen.
[Lesen Sie die Anleitung zum Veröffentlichen Ihres neuen Projekts in SubQuery Projects](../publish/publish.md)
-
-
## Weitere Schritte
Herzlichen Glückwunsch, Sie haben jetzt ein lokal ausgeführtes SubQuery-Projekt, das GraphQL-API-Anforderungen für die Übertragung von Daten von bLuna akzeptiert.
diff --git a/docs/de/quickstart/quickstart.md b/docs/de/quickstart/quickstart.md
index de9954bd6fc..43c496a3a19 100644
--- a/docs/de/quickstart/quickstart.md
+++ b/docs/de/quickstart/quickstart.md
@@ -89,8 +89,8 @@ Nachdem Sie den Initialisierungsprozess abgeschlossen haben, sehen Sie einen Ord
Führen Sie abschließend den folgenden Befehl aus, um die Abhängigkeiten des neuen Projekts aus dem Verzeichnis des neuen Projekts zu installieren.
- ```shell cd PROJECT_NAME yarn install ```
- ```shell cd PROJECT_NAME npm install ```
+::: code-tabs @tab:active yarn `shell cd PROJECT_NAME yarn install `
+@tab npm `shell cd PROJECT_NAME npm install ` :::
Mit wenigen Handgriffen haben Sie nun Ihr erstes SubQuery-Projekt initialisiert. Lassen Sie uns nun das Standardvorlagenprojekt für eine bestimmte Blockchain von Interesse anpassen.
@@ -104,4 +104,4 @@ Es gibt 3 wichtige Dateien, die geändert werden müssen. Dies sind:
2. Das Projektmanifest in `project.yaml`.
3. Die Mapping-Funktionen im Verzeichnis `src/mappings/`.
-SubQuery supports various blockchain networks and provides a dedicated guide for each of them. Select your preferred blockchain under 2. Specific Chains and continue the quick start guide.
\ No newline at end of file
+SubQuery supports various blockchain networks and provides a dedicated guide for each of them. Select your preferred blockchain under 2. Specific Chains and continue the quick start guide.
diff --git a/docs/de/run_publish/connect.md b/docs/de/run_publish/connect.md
index 1369cf327ae..1ed28c965dd 100644
--- a/docs/de/run_publish/connect.md
+++ b/docs/de/run_publish/connect.md
@@ -2,10 +2,10 @@
Sobald Ihre Bereitstellung erfolgreich abgeschlossen wurde und unsere Knoten Ihre Daten aus der Chain indiziert haben, können Sie über den angezeigten Abfrageendpunkt eine Verbindung zu Ihrem Projekt herstellen.
-![Projekt wird bereitgestellt und synchronisiert](/assets/img/projects-deploy-sync.png)
+![Projekt wird bereitgestellt und synchronisiert](/assets/img/projects_deploy_sync.png)
Alternativ können Sie auf die drei Punkte neben dem Titel Ihres Projekts klicken und es im SubQuery Explorer anzeigen. Dort können Sie den Playground im Browser verwenden, um loszulegen.
-![Projekte im SubQuery Explorer](/assets/img/projects-explorer.png)
+![Projekte im SubQuery Explorer](/assets/img/projects_explorer.png)
-::: info Hinweis Erfahren Sie mehr über die [GraphQL-Abfragesprache.](./graphql.md) :::
+::: tip Hinweis Erfahren Sie mehr über die [GraphQL-Abfragesprache.](./graphql.md) :::
diff --git a/docs/de/run_publish/query.md b/docs/de/run_publish/query.md
index b028e4dd4a5..08557f14219 100644
--- a/docs/de/run_publish/query.md
+++ b/docs/de/run_publish/query.md
@@ -12,4 +12,4 @@ Sie werden auch feststellen, dass der SubQuery-Explorer eine Spielwiese zum Auff
Oben rechts im Playground finden Sie eine Schaltfläche _Dokumente_, die eine ausklappbare Dokumentation öffnet. Diese Dokumentation wird automatisch generiert und hilft Ihnen dabei herauszufinden, welche Entitäten und Methoden Sie abfragen können.
-::: info Hinweis Erfahren Sie mehr über die [GraphQL-Abfragesprache.](./graphql.md) :::
+::: tip Hinweis Erfahren Sie mehr über die [GraphQL-Abfragesprache.](./graphql.md) :::
diff --git a/docs/de/run_publish/references.md b/docs/de/run_publish/references.md
index cc07f31b733..dc5f74cf201 100644
--- a/docs/de/run_publish/references.md
+++ b/docs/de/run_publish/references.md
@@ -21,10 +21,10 @@ COMMANDS
This command uses webpack to generate a bundle of a subquery project.
-| Options | Beschreibung |
-| ------------------ | ---------------------------------------------------------------------------------------------------------- |
-| -l, --location | local folder of subquery project (if not in folder already) |
-| -o, --output | specify output folder of build e.g. build-folder |
+| Options                                            | Beschreibung                                                 |
+| -------------------------------------------------- | ------------------------------------------------------------ |
+| -l, --location                                     | local folder of subquery project (if not in folder already) |
+| -o, --output                                       | specify output folder of build e.g. build-folder             |
| --mode=(production \| prod \| development \| dev)  | [ default: production ]                                      |
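Purely as an illustration of the flags above (run from the project root; the exact invocation may vary by CLI version):

```shell
# bundle the project, explicitly selecting the default production mode
subql build --mode=production
```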
- With `subql build` you can specify additional entry points in exports field although it will always build `index.ts` automatically.
@@ -106,7 +106,7 @@ This displays the current version.
### reindex
-:::warning In order to use this command, you require `@subql/node:v1.10.0`/`@subql/node-:v1.10.0` or above. :::
+:::warning In order to use this command, you require `@subql/node:v1.10.0`/`@subql/node-YOURNETWORK:v1.10.0` or above. :::
When using reindex command, historical must be enabled for the targeted project (`--disable-historical=false`). After starting the project, it would print out a log stating if historical is enabled or not.
@@ -122,7 +122,7 @@ If the `targetHeight` is less than the declared starting height, it will execute
subql-node -f /example/subql-project reindex --targetHeight=30
```
-::: info Note
+::: tip Note
Once the command is executed and the state has been rolled back to the specified height, the application will exit. You can then start up the indexer to proceed again from this height.
:::
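As an illustrative sketch of that two-step flow (the project path and target height are the placeholders from the example above):

```shell
# 1. roll the project state back to block 30 – the node exits once it is done
subql-node -f /example/subql-project reindex --targetHeight=30

# 2. start the indexer again; it resumes from the rolled-back height
subql-node -f /example/subql-project
```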
@@ -134,7 +134,7 @@ This command forces the project schemas and tables to be regenerated. It is help
`-f`, `--subquery` flag must be passed in, to set path of the targeted project.
-::: info Note Similar to `reindex` command, the application would exit upon completion. :::
+::: tip Note Similar to `reindex` command, the application would exit upon completion. :::
```shell
subql-node -f /example/subql-project force-clean
@@ -346,7 +346,7 @@ This will move block fetching and processing into a worker. By default, this fea
It is at an early experimental stage at the moment, but we plan to enable it by default. :::
-::: info Note
+::: tip Note
This feature is available for Substrate and Cosmos, and soon will be integrated for Avalanche.
:::
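For illustration only, enabling the feature might look like this (the project path and worker count are placeholders):

```shell
# move block fetching and processing into 4 worker threads
subql-node -f /example/subql-project --workers=4
```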
diff --git a/docs/de/run_publish/run.md b/docs/de/run_publish/run.md
index 851d03b9ef9..35bdb3d98de 100644
--- a/docs/de/run_publish/run.md
+++ b/docs/de/run_publish/run.md
@@ -4,7 +4,7 @@ This guide works through how to run a local SubQuery node on your infrastructure
## Using Docker
-An alternative solution is to run a Docker Container, defined by the `docker-compose.yml` file. For a new project that has been just initialised you won't need to change anything here.
+An alternative solution is to run a **Docker Container**, defined by the `docker-compose.yml` file. For a new project that has been just initialised you won't need to change anything here.
Under the project directory run the following command:
@@ -12,7 +12,7 @@ Under the project directory run the following command:
docker-compose pull && docker-compose up
```
-::: info Note It may take some time to download the required packages ([`@subql/node`](https://www.npmjs.com/package/@subql/node), [`@subql/query`](https://www.npmjs.com/package/@subql/query), and Postgres) for the first time but soon you'll see a running SubQuery node. :::
+::: tip Note It may take some time to download the required packages ([`@subql/node`](https://www.npmjs.com/package/@subql/node), [`@subql/query`](https://www.npmjs.com/package/@subql/query), and Postgres) for the first time but soon you'll see a running SubQuery node. :::
## Running an Indexer (subql/node)
@@ -32,90 +32,80 @@ CREATE EXTENSION IF NOT EXISTS btree_gist;
### Installation
-
-
+::: code-tabs
+@tab Substrate/Polkadot
```shell
# NPM
npm install -g @subql/node
```
-
-
+@tab Terra
```shell
# NPM
npm install -g @subql/node-terra
```
-
-
+@tab Avalanche
```shell
# NPM
npm install -g @subql/node-avalanche
```
-
-
+@tab Cosmos
```shell
# NPM
npm install -g @subql/node-cosmos
```
-
-
+@tab Algorand
```shell
# NPM
npm install -g @subql/node-algorand
```
-
-
+:::
::: danger Please note that we **DO NOT** encourage the use of `yarn global` due to its poor dependency management which may lead to errors down the line. :::
Once installed, you can start a node with the following command:
-
-
+::: code-tabs
+@tab Substrate/Polkadot
```shell
subql-node
```
-
-
+@tab Terra
```shell
subql-node-terra
```
-
-
+@tab Avalanche
```shell
subql-node-avalanche
```
-
-
+@tab Cosmos
```shell
subql-node-cosmos
```
-
-
+@tab Algorand
```shell
subql-node-algorand
```
-
-
+:::
### Key Commands
@@ -123,43 +113,38 @@ The following commands will assist you to complete the configuration of a SubQue
#### Point to local project path
-
-
+::: code-tabs
+@tab Substrate/Polkadot
```shell
subql-node -f your-project-path
```
-
-
+@tab Terra
```shell
subql-node-terra -f your-project-path
```
-
-
+@tab Avalanche
```shell
subql-node-avalanche -f your-project-path
```
-
-
+@tab Cosmos
```shell
subql-node-cosmos -f your-project-path
```
-
-
+@tab Algorand
```shell
subql-node-algorand -f your-project-path
```
-
-
+:::
#### Connect to database
@@ -176,43 +161,38 @@ Depending on the configuration of your Postgres database (e.g. a different datab
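As a hedged sketch of that configuration (the environment variable names below are the ones the indexer conventionally reads; adjust every value to your own Postgres setup):

```shell
export DB_USER=postgres
export DB_PASS=postgres
export DB_DATABASE=postgres
export DB_HOST=localhost
export DB_PORT=5432
subql-node -f your-project-path
```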
#### Specify a configuration file
-
-
+::: code-tabs
+@tab Substrate/Polkadot
```shell
subql-node -c your-project-config.yml
```
-
-
+@tab Terra
```shell
subql-node-terra -c your-project-config.yml
```
-
-
+@tab Avalanche
```shell
subql-node-avalanche -c your-project-config.yml
```
-
-
+@tab Cosmos
```shell
subql-node-cosmos -c your-project-config.yml
```
-
-
+@tab Algorand
```shell
subql-node-algorand -c your-project-config.yml
```
-
-
+:::
This will point the query node to a manifest file which can be in YAML or JSON format.
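A minimal sketch of what such a configuration file might contain (the keys mirror the CLI flags and the values are placeholders, not a definitive reference):

```yml
subquery: ./ # same as the -f/--subquery flag
subqueryName: hello-world # same as the --subquery-name flag
batchSize: 100
localMode: true
```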
@@ -230,43 +210,38 @@ When the indexer first indexes the chain, fetching single blocks will significan
#### Run in local mode
-
-
+::: code-tabs
+@tab Substrate/Polkadot
```shell
subql-node -f your-project-path --local
```
-
-
+@tab Terra
```shell
subql-node-terra -f your-project-path --local
```
-
-
+@tab Avalanche
```shell
subql-node-avalanche -f your-project-path --local
```
-
-
+@tab Cosmos
```shell
subql-node-cosmos -f your-project-path --local
```
-
-
+@tab Algorand
```shell
subql-node-algorand -f your-project-path --local
```
-
-
+:::
For debugging purposes, users can run the node in local mode. Switching to local mode will create Postgres tables in the default schema `public`.
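To illustrate what that means in practice (the table name below is hypothetical and depends on the entities defined in your schema):

```sql
-- in local mode, indexed entities are written to tables in the "public" schema
SELECT * FROM public.transfers LIMIT 5;
```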
diff --git a/docs/de/run_publish/subscription.md b/docs/de/run_publish/subscription.md
index 72a6d75aedb..44456f14117 100644
--- a/docs/de/run_publish/subscription.md
+++ b/docs/de/run_publish/subscription.md
@@ -6,7 +6,7 @@ SubQuery unterstützt jetzt auch Graphql-Abonnements. Wie Abfragen ermöglichen
Abonnements sind sehr nützlich, wenn Sie möchten, dass Ihre Clientanwendung Daten ändert oder einige neue Daten anzeigt, sobald diese Änderung eintritt oder die neuen Daten verfügbar sind. Mit Abonnements können Sie Ihr SubQuery-Projekt für Änderungen _abonnieren_.
-::: info Hinweis Lesen Sie mehr über [Abonnements](https://www.apollographql.com/docs/react/data/subscriptions/). :::
+::: tip Hinweis Lesen Sie mehr über [Abonnements](https://www.apollographql.com/docs/react/data/subscriptions/). :::
## So abonnieren Sie eine Entität
@@ -39,7 +39,7 @@ Es gibt zwei Arten von Filtern, die wir unterstützen:
- `id` : Filtern, um nur Änderungen zurückzugeben, die eine bestimmte Entität betreffen (durch die ID bezeichnet).
- `mutation_type`: Nur der gleiche Mutationstyp wird aktualisiert.
-Angenommen, wir haben eine Entität ` Balances `, die den Saldo jedes Kontos aufzeichnet.
+Angenommen, wir haben eine Entität `Balances`, die den Saldo jedes Kontos aufzeichnet.
```graphql
type Balances {
diff --git a/docs/de/run_publish/upgrade.md b/docs/de/run_publish/upgrade.md
index a0bbd39e2ea..b94541de1f1 100644
--- a/docs/de/run_publish/upgrade.md
+++ b/docs/de/run_publish/upgrade.md
@@ -71,18 +71,16 @@ jobs:
## Führen Sie ein Upgrade auf den neuesten Indexer- und Abfragedienst durch
-Wenn Sie nur auf den neuesten Indexer ([`@subql/node`](https://www.npmjs.com/package/@subql/node)) oder Abfragedienst (
-
-`@subql/query`) aktualisieren möchten 2>) Um von unseren regelmäßigen Leistungs- und Stabilitätsverbesserungen zu profitieren, wählen Sie einfach eine neuere Version unserer Pakete aus und speichern Sie. Dies verursacht nur wenige Minuten Ausfallzeit, da die Dienste, auf denen Ihr Projekt ausgeführt wird, neu gestartet werden.
+Wenn Sie nur auf den neuesten Indexer ([`@subql/node`](https://www.npmjs.com/package/@subql/node)) oder Abfragedienst (`@subql/query`) aktualisieren möchten, um von unseren regelmäßigen Leistungs- und Stabilitätsverbesserungen zu profitieren, wählen Sie einfach eine neuere Version unserer Pakete aus und speichern Sie. Dies verursacht nur wenige Minuten Ausfallzeit, da die Dienste, auf denen Ihr Projekt ausgeführt wird, neu gestartet werden.
## Nächste Schritte - Verbinden Sie sich mit Ihrem Projekt
Sobald Ihre Bereitstellung erfolgreich abgeschlossen wurde und unsere Nodes Ihre Daten aus der Chain indiziert haben, können Sie über den angezeigten GraphQL-Abfrageendpunkt eine Verbindung zu Ihrem Projekt herstellen.
-![Projekt wird bereitgestellt und synchronisiert](/assets/img/projects-deploy-sync.png)
+![Projekt wird bereitgestellt und synchronisiert](/assets/img/projects_deploy_sync.png)
Alternativ können Sie auf die drei Punkte neben dem Titel Ihres Projekts klicken und es im SubQuery Explorer anzeigen. Dort können Sie den Playground im Browser verwenden, um loszulegen - [lesen Sie hier mehr über die Verwendung unseres Explorers](../run_publish/query.md).
-![Projekte im SubQuery Explorer](/assets/img/projects-explorer.png)
+![Projekte im SubQuery Explorer](/assets/img/projects_explorer.png)
-::: info Hinweis Erfahren Sie mehr über die [GraphQL-Abfragesprache.](./graphql.md) :::
+::: tip Hinweis Erfahren Sie mehr über die [GraphQL-Abfragesprache.](./graphql.md) :::
diff --git a/docs/de/subquery_network/introduction.md b/docs/de/subquery_network/introduction.md
index b9d551d9fb0..0609a232c7e 100644
--- a/docs/de/subquery_network/introduction.md
+++ b/docs/de/subquery_network/introduction.md
@@ -18,22 +18,22 @@ Es gibt eine Rolle für jeden im Netzwerk, von hochtechnischen Entwicklern bis h
Verbraucher fragen das SubQuery-Netzwerk nach bestimmten Daten für ihre dApps oder Tools und zahlen für jede Anfrage einen beworbenen SQT-Betrag.
-::: info Note Learn more about [Consumers](./consumers.md). :::
+::: tip Note Learn more about [Consumers](./consumers.md). :::
### Indexers
Indexer führen und pflegen qualitativ hochwertige SubQuery-Projekte in ihrer eigenen Infrastruktur, führen sowohl den Indexer als auch den Abfragedienst aus und werden für die von ihnen bedienten Anforderungen mit SQT belohnt.
-::: info Note Learn more about [Indexers](./indexers.md). :::
+::: tip Note Learn more about [Indexers](./indexers.md). :::
### Delegators
Die Delegierenden beteiligen sich am Netzwerk, indem sie ihre bevorzugten Indexer dabei unterstützen, Belohnungen basierend auf der Arbeit dieser Indexer zu verdienen.
-::: info Note Learn more about [Delegators](./delegators.md). :::
+::: tip Note Learn more about [Delegators](./delegators.md). :::
### Architekten
Architekten sind die Ersteller der SubQuery-Projekte, auf denen das Netzwerk ausgeführt wird. Sie erstellen und veröffentlichen SubQuery-Projekte, die das Netzwerk indizieren und ausführen kann.
-::: info Note Learn more about [how to build your first SubQuery project](../build/introduction.md). :::
+::: tip Note Learn more about [how to build your first SubQuery project](../build/introduction.md). :::
diff --git a/docs/es/README.md b/docs/es/README.md
index 589da661cac..156377bb309 100644
--- a/docs/es/README.md
+++ b/docs/es/README.md
@@ -4,7 +4,7 @@
Build Faster dApps with SubQuery Academy
-
Explore and implement your own efficient custom open-source API between your decentralised data and tools to query data faster and save you time.
+
Explore and implement your own efficient custom open-source API between your decentralised data and tools to query data faster and save you time.
SubQuery now supports Polkadot, Avalanche, Cosmos, and Algorand.
@@ -12,7 +12,7 @@
Get a Kick-Start With Our Quick Start Guide
-
Build your first SubQuery project in less than 10 mins with simple guided steps.
+
Build your first SubQuery project in less than 10 mins with simple guided steps.
Start querying data for your dApps on your most loved blockchain network using our starter projects. Explore and modify important files, and understand how SubQuery works.
@@ -134,8 +134,7 @@
-
-
+
diff --git a/docs/es/build/install.md b/docs/es/build/install.md
index c265b416b74..1be74447bcd 100644
--- a/docs/es/build/install.md
+++ b/docs/es/build/install.md
@@ -8,28 +8,30 @@ La herramienta [@subql/cli](https://github.com/subquery/subql/tree/docs-new-sect
Instalar SubQuery CLI globalmente en tu terminal usando Yarn o NPM:
- ```bash npm install -g @subql/cli ```
- ```shell yarn global add @subql/cli ```
+::: code-tabs @tab npm `bash npm install -g @subql/cli `
+@tab:active yarn `shell yarn global add @subql/cli ` :::
You can then run help to see available commands and usage provided by the CLI:
```shell
subql help
```
+
## Install @subql/node
Un nodo de SubQuery es una implementación que extrae datos de blockchain basados en substrate por el proyecto SubQuery y lo guarda en una base de datos de Postgres.
Instala el nodo de SubQuery globalmente en tu terminal usando Yarn o NPM:
- ```bash npm install -g @subql/node ```
- ```shell yarn global add @subql/node ```
+::: code-tabs @tab npm `bash npm install -g @subql/node `
+@tab:active yarn `shell yarn global add @subql/node ` :::
Once installed, you can start a node with:
```shell
subql-node
```
+
> Nota: Si estás usando Docker o alojando tu proyecto en Proyectos de SubQuery, puedes saltarte este paso. Esto se debe a que el nodo SubQuery ya se proporciona en el contenedor Docker y en la infraestructura de alojamiento.
## Install @subql/query
@@ -38,7 +40,7 @@ La biblioteca de consultas de SubQuery proporciona un servicio que le permite co
Instala el servicio de consultas de SubQuery globalmente en tu terminal usando Yarn o NPM:
- ```bash npm install -g @subql/query ```
- ```shell yarn global add @subql/query ```
+::: code-tabs @tab npm `bash npm install -g @subql/query `
+@tab:active yarn `shell yarn global add @subql/query ` :::
-> Nota: Si estás usando Docker o alojando tu proyecto en Proyectos de SubQuery, puedes saltarte este paso también. Esto se debe a que el nodo SubQuery ya se proporciona en el contenedor Docker y en la infraestructura de alojamiento.
\ No newline at end of file
+> Nota: Si estás usando Docker o alojando tu proyecto en Proyectos de SubQuery, puedes saltarte este paso también. Esto se debe a que el nodo SubQuery ya se proporciona en el contenedor Docker y en la infraestructura de alojamiento.
diff --git a/docs/es/build/introduction.md b/docs/es/build/introduction.md
index 25013c5f849..3973cc4320d 100644
--- a/docs/es/build/introduction.md
+++ b/docs/es/build/introduction.md
@@ -51,8 +51,8 @@ Para ejecutar tu proyecto SubQuery en un nodo SubQuery alojado localmente, prime
Ejecuta el comando de compilación desde el directorio raíz del proyecto.
- `shell yarn build `
- `bash npm run-script build `
+::: code-tabs @tab:active yarn `shell yarn build `
+@tab npm `bash npm run-script build ` :::
### Alternative build options
diff --git a/docs/es/build/manifest.md b/docs/es/build/manifest.md
index 345f315783e..78a505e6b25 100644
--- a/docs/es/build/manifest.md
+++ b/docs/es/build/manifest.md
@@ -4,7 +4,7 @@ El Manifiesto `project.yaml` puede ser visto como un punto de entrada de tu proy
El manifiesto puede estar en formato YAML o JSON. En este documento, utilizaremos YAML en todos los ejemplos. A continuación se muestra un ejemplo estándar de un `project.yaml` básico.
- ``` yml specVersion: 0.2.0 name: example-project # Provide the project name version: 1.0.0 # Project version description: '' # Description of your project repository: 'https://github.com/subquery/subql-starter' # Git repository address of your project schema: file: ./schema.graphql # The location of your GraphQL schema file network: genesisHash: '0x91b171bb158e2d3848fa23a9f1c25182fb8e20313b2c1eb49219da7a70ce90c3' # Genesis hash of the network endpoint: 'wss://polkadot.api.onfinality.io/public-ws' # Optionally provide the HTTP endpoint of a full chain dictionary to speed up processing dictionary: 'https://api.subquery.network/sq/subquery/dictionary-polkadot' dataSources: - kind: substrate/Runtime startBlock: 1 # This changes your indexing start block, set this higher to skip initial blocks with less data mapping: file: "./dist/index.js" handlers: - handler: handleBlock kind: substrate/BlockHandler - handler: handleEvent kind: substrate/EventHandler filter: #Filter is optional module: balances method: Deposit - handler: handleCall kind: substrate/CallHandler ````. [Vea aquí](#cli-options) para más información
+``` yml specVersion: 0.2.0 name: example-project # Provide the project name version: 1.0.0 # Project version description: '' # Description of your project repository: 'https://github.com/subquery/subql-starter' # Git repository address of your project schema: file: ./schema.graphql # The location of your GraphQL schema file network: genesisHash: '0x91b171bb158e2d3848fa23a9f1c25182fb8e20313b2c1eb49219da7a70ce90c3' # Genesis hash of the network endpoint: 'wss://polkadot.api.onfinality.io/public-ws' # Optionally provide the HTTP endpoint of a full chain dictionary to speed up processing dictionary: 'https://api.subquery.network/sq/subquery/dictionary-polkadot' dataSources: - kind: substrate/Runtime startBlock: 1 # This changes your indexing start block, set this higher to skip initial blocks with less data mapping: file: "./dist/index.js" handlers: - handler: handleBlock kind: substrate/BlockHandler - handler: handleEvent kind: substrate/EventHandler filter: #Filter is optional module: balances method: Deposit - handler: handleCall kind: substrate/CallHandler ````. [Vea aquí](#cli-options) para más información
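Solo como referencia de lectura, la sección `network` del ejemplo anterior en forma expandida (los valores son los mismos del bloque aplanado de arriba):

```yml
network:
  genesisHash: '0x91b171bb158e2d3848fa23a9f1c25182fb8e20313b2c1eb49219da7a70ce90c3'
  endpoint: 'wss://polkadot.api.onfinality.io/public-ws'
  dictionary: 'https://api.subquery.network/sq/subquery/dictionary-polkadot'
```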
Bajo `red`:
@@ -77,9 +77,9 @@ Define los datos que serán filtrados y extraídos y la ubicación del manejador
### Especificación de mapeo
-| Campo | v0.0.1 | v0.2.0 | Descripción |
-| --------------------------- | --------------------------------------------------------------------------------- | --------------------------------------------------------------------------------------------------------- | ----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
-| **archivo** | String | 𐄂 | Ruta a la entrada de mapeo |
+| Campo | v0.0.1 | v0.2.0 | Descripción |
+| --------------------------- | --------------------------------------------------------------------------------- | --------------------------------------------------------------------------------------------------------- | -------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
+| **archivo** | String | 𐄂 | Ruta a la entrada de mapeo |
| **manipuladores y filtros** | [Controladores y filtros predeterminados](./manifestar/#mapeo-handlers-y-filtros) | Controladores y filtros por defecto, [Controladores y filtros personalizados](#custom-data-sources) | Lista todas las [funciones de mapeo](./mapping/polkadot.md) y sus correspondientes tipos de manejador, con filtros de mapeo adicionales. Para manejadores de mapeo de tiempo de ejecución personalizados, por favor vea [fuentes de datos personalizados](#custom-data-sources) |
## Fuentes de datos y mapeo
@@ -100,8 +100,8 @@ La siguiente tabla explica los filtros soportados por diferentes manejadores.
**Tu proyecto de SubQuery será mucho más eficiente cuando sólo utilices controladores de eventos y llamadas con filtros de mapeo apropiados**
-| Manejador | Filtro compatible |
-| -------------------------------------------------- | ---------------------------- |
+| Manejador | Filtro compatible |
+| ----------------------------------------------------------- | ---------------------------- |
| [Manejador de bloques](./mapping/polkadot.md#block-handler) | `specVersion` |
| [EventHandler](./mapping/polkadot.md#event-handler) | `module`,`method` |
| [CallHandler](./mapping/polkadot.md#call-handler) | `module`,`method` ,`success` |
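As a quick illustration of how these handlers and filters pair with mapping code, here is a minimal, hedged sketch (not the project's real mappings; it assumes the standard `@subql/types` interfaces and the global `logger` provided inside the SubQuery sandbox):

```ts
import { SubstrateBlock, SubstrateEvent, SubstrateExtrinsic } from "@subql/types";

// Matched by a substrate/BlockHandler (optionally filtered by specVersion).
export async function handleBlock(block: SubstrateBlock): Promise<void> {
  logger.info(`Block #${block.block.header.number.toString()} indexed`);
}

// Matched by a substrate/EventHandler with filter module: balances, method: Deposit.
export async function handleEvent(event: SubstrateEvent): Promise<void> {
  const [account, amount] = event.event.data;
  logger.info(`Deposit of ${amount.toString()} to ${account.toString()}`);
}

// Matched by a substrate/CallHandler (optionally filtered by module/method/success).
export async function handleCall(extrinsic: SubstrateExtrinsic): Promise<void> {
  logger.info(`Call seen in block #${extrinsic.block.block.header.number.toString()}`);
}
```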
@@ -149,8 +149,8 @@ Soportamos los tipos adicionales usados por módulos de tiempo de ejecución sub
En el ejemplo v0.2.0 de abajo, `network.chaintypes` apunta a un archivo que incluye todos los tipos personalizados. Este es un archivo chainspec estándar que declara los tipos específicos soportados por esta blockchain, en formato `.json`, `.yaml` o `.js`.
- `yml network: genesisHash: '0x91b171bb158e2d3848fa23a9f1c25182fb8e20313b2c1eb49219da7a70ce90c3' endpoint: 'ws://host.kittychain.io/public-ws' chaintypes: file: ./types.json # la ruta relativa al lugar donde se almacenan los tipos personalizados ...`
- `yml ... network: endpoint: "ws://host.kittychain.io/public-ws" types: { "KittyIndex": "u32", "Kitty": "[u8; 16]" } # typesChain: { chain: { Type5: 'example' } } # typesSpec: { spec: { Type6: 'example' } } dataSources: - name: runtime kind: substrate/Runtime startBlock: 1 filter: #Optional specName: kitty-chain mapping: handlers: - handler: handleKittyBred kind: substrate/CallHandler filter: module: kitties method: breed success: true`
+::: code-tabs @tab v0.2.0 `yml network: genesisHash: '0x91b171bb158e2d3848fa23a9f1c25182fb8e20313b2c1eb49219da7a70ce90c3' endpoint: 'ws://host.kittychain.io/public-ws' chaintypes: file: ./types.json # la ruta relativa al lugar donde se almacenan los tipos personalizados ...`
+@tab v0.0.1 `yml ... network: endpoint: "ws://host.kittychain.io/public-ws" types: { "KittyIndex": "u32", "Kitty": "[u8; 16]" } # typesChain: { chain: { Type5: 'example' } } # typesSpec: { spec: { Type6: 'example' } } dataSources: - name: runtime kind: substrate/Runtime startBlock: 1 filter: #Optional specName: kitty-chain mapping: handlers: - handler: handleKittyBred kind: substrate/CallHandler filter: module: kitties method: breed success: true` :::
To use typescript for your chain types file include it in the `src` folder (e.g. `./src/types.ts`), run `yarn build` and then point to the generated js file located in the `dist` folder.
@@ -167,7 +167,7 @@ Cosas a tener en cuenta sobre el uso del archivo de tipos de cadena con extensi
A continuación se muestra un ejemplo de un archivo de tipos de cadena `.ts `:
- ts importar {typesBundleDeprecated} desde "moonbeam-types-bundle" exportar predeterminado {typesBundle: typesBundleDeprecated}; ''
+::: code-tabs @tab types.ts `ts import { typesBundleDeprecated } from "moonbeam-types-bundle"; export default { typesBundle: typesBundleDeprecated }; ` :::
## Fuentes de datos personalizadas
@@ -179,29 +179,11 @@ Las fuentes de datos personalizadas se pueden utilizar con fuentes de datos norm
Aquí hay una lista de fuentes de datos personalizadas compatibles:
-
-
-
- Amable
-
-
-
- Controladores admitidos
-
. Entre redes, es probable que varias opciones sean diferentes (por ejemplo, el bloque de inicio del índice). Por lo tanto, permitimos a los usuarios definir diferentes detalles para cada fuente de datos, lo que significa que un proyecto de SubQuery puede ser utilizado en múltiples redes.
-
-
- Los usuarios pueden añadir un filtro en fuentes de datos para decidir qué fuente de datos ejecutar en cada red.
-
-
-
- A continuación hay un ejemplo que muestra diferentes fuentes de datos para las redes Polkadot y Kusama.
-
+Entre redes, es probable que varias opciones sean diferentes (por ejemplo, el bloque de inicio del índice). Por lo tanto, permitimos a los usuarios definir diferentes detalles para cada fuente de datos, lo que significa que un proyecto de SubQuery puede ser utilizado en múltiples redes.
+
+Los usuarios pueden añadir un `filtro` en `fuentes de datos` para decidir qué fuente de datos ejecutar en cada red.
+
+A continuación hay un ejemplo que muestra diferentes fuentes de datos para las redes Polkadot y Kusama.
+
+::: code-tabs @tab v0.0.1 `yaml --- network: endpoint: 'wss://polkadot.api.onfinality.io/public-ws' #Create a template to avoid redundancy definitions: mapping: &mymapping handlers: - handler: handleBlock kind: substrate/BlockHandler dataSources: - name: polkadotRuntime kind: substrate/Runtime filter: #Optional specName: polkadot startBlock: 1000 mapping: *mymapping #use template here - name: kusamaRuntime kind: substrate/Runtime filter: specName: kusama startBlock: 12000 mapping: *mymapping # can reuse or change ` :::
diff --git a/docs/es/build/mapping.md b/docs/es/build/mapping.md
index ee045de8d0f..ac178153cd2 100644
--- a/docs/es/build/mapping.md
+++ b/docs/es/build/mapping.md
@@ -67,9 +67,9 @@ Nuestro objetivo es cubrir todas las fuentes de datos para los usuarios de los m
Estas son las interfaces que actualmente soportamos:
-- [api.query.<module>.<method>()](https://polkadot.js.org/docs/api/start/api.query) consultará el bloque actual.
-- [api.query.<module>.<method>.multi()](https://polkadot.js.org/docs/api/start/api.query.multi/#multi-queries-same-type) hará múltiples consultas del mismo tipo en el bloque actual.
-- [api.queryMulti()](https://polkadot.js.org/docs/api/start/api.query.multi/#multi-queries-distinct-types) hará múltiples consultas de diferentes tipos en el bloque actual.
+- [api.query.<module>.<method>()](https://polkadot.js.org/docs/api/start/api.query) consultará el bloque actual.
+- [api.query.<module>.<method>.multi()](https://polkadot.js.org/docs/api/start/api.query.multi/#multi-queries-same-type) hará múltiples consultas del mismo tipo en el bloque actual.
+- [api.queryMulti()](https://polkadot.js.org/docs/api/start/api.query.multi/#multi-queries-distinct-types) hará múltiples consultas de **diferentes** tipos en el bloque actual.
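To illustrate the supported interfaces above, here is a minimal, hedged sketch (it assumes the global `api` and `logger` injected into mapping functions; the queried modules and the addresses are only examples):

```ts
import { SubstrateEvent } from "@subql/types";

export async function handleEvent(event: SubstrateEvent): Promise<void> {
  // Single storage query, evaluated at the block currently being indexed.
  const totalIssuance = await api.query.balances.totalIssuance();
  logger.info(`Total issuance at this block: ${totalIssuance.toString()}`);

  // Several queries of the same type at the current block.
  const [alice, bob] = await api.query.system.account.multi([
    "5GrwvaEF5zXb26Fz9rcQpDWS57CtERHpNehXCPcNoHGKutQY",
    "5FHneW46xGXgs5mUiveU4sbTyGBzmstUspZC92UhjJM694ty",
  ]);

  // Several queries of different types at the current block.
  const [issuance, propCount] = await api.queryMulti([
    api.query.balances.totalIssuance,
    api.query.democracy.publicPropCount,
  ]);

  logger.info(
    `issuance=${issuance.toString()}, alice free=${alice.data.free.toString()}, bob free=${bob.data.free.toString()}, proposals=${propCount.toString()}`
  );
}
```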
Estas son las interfaces que actualmente **NO** soportamos:
diff --git a/docs/es/build/substrate-evm.md b/docs/es/build/substrate-evm.md
index 6172ea8770f..e41abe5b3f9 100644
--- a/docs/es/build/substrate-evm.md
+++ b/docs/es/build/substrate-evm.md
@@ -74,7 +74,7 @@ Funciona de la misma manera que [substrate/EventHandler](../create/mapping/#even
| ----- | --------------------- | ------------------------------------------------------------------------- | --------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| topics | Arreglo de cadenas | Transfer(address indexed from,address indexed to,uint256 value) | El filtro de topics sigue los filtros de registro JSON-RPC de Ethereum; se puede encontrar más documentación [aquí](https://docs.ethers.io/v5/concepts/events/). |
-Nota sobre temas:
+**Nota sobre temas:**
Hay un par de mejoras en los filtros básicos de registro:
- Los temas no necesitan estar rellenados con ceros (0-padded)
@@ -97,17 +97,17 @@ dataSources:
- kind: substrate/Moonbeam
startBlock: 752073
processor:
- file: './node_modules/@subql/contract-processors/dist/moonbeam.js'
+ file: "./node_modules/@subql/contract-processors/dist/moonbeam.js"
options:
# Must be a key of assets
abi: erc20
# Contract address (or recipient if transfer) to filter, if `null` should be for contract creation
- address: '0x6bd193ee6d2104f14f94e2ca6efefae561a4334b'
+ address: "0x6bd193ee6d2104f14f94e2ca6efefae561a4334b"
assets:
erc20:
- file: './erc20.abi.json'
+ file: "./erc20.abi.json"
mapping:
- file: './dist/index.js'
+ file: "./dist/index.js"
handlers:
- handler: handleMoonriverEvent
kind: substrate/MoonbeamEvent
@@ -122,7 +122,7 @@ dataSources:
# function: '0x7ff36ab500000000000000000000000000000000000000000000000000000000'
# function: approve(address,uint256)
function: approve(address to,uint256 value)
- from: '0x6bd193ee6d2104f14f94e2ca6efefae561a4334b'
+ from: "0x6bd193ee6d2104f14f94e2ca6efefae561a4334b"
```
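A hedged sketch of what the matching mapping functions could look like (assumptions: the `MoonbeamEvent`/`MoonbeamCall` generic types shipped with `@subql/contract-processors`, argument shapes derived from the `erc20` ABI above, and a call handler name that is illustrative since it is not shown in this excerpt):

```ts
import { MoonbeamEvent, MoonbeamCall } from "@subql/contract-processors/dist/moonbeam";
import { BigNumber } from "ethers";

// Argument shapes decoded with the erc20 ABI declared under `assets`.
type TransferEventArgs = [string, string, BigNumber] & { from: string; to: string; value: BigNumber };
type ApproveCallArgs = [string, BigNumber] & { spender: string; value: BigNumber };

// Referenced by the substrate/MoonbeamEvent handler in the manifest above.
export async function handleMoonriverEvent(event: MoonbeamEvent<TransferEventArgs>): Promise<void> {
  logger.info(`Transfer of ${event.args.value.toString()} in tx ${event.transactionHash}`);
}

// Illustrative counterpart for the approve(address,uint256) call filter.
export async function handleMoonriverCall(call: MoonbeamCall<ApproveCallArgs>): Promise<void> {
  logger.info(`approve() sent from ${call.from} (success: ${call.success})`);
}
```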
## Limitantes conocidas
diff --git a/docs/es/faqs/faqs.md b/docs/es/faqs/faqs.md
index 9f82532871e..b182ff2d4d5 100644
--- a/docs/es/faqs/faqs.md
+++ b/docs/es/faqs/faqs.md
@@ -16,7 +16,7 @@ Subquery también provee alojamiento gratuito, de grado de producción de proyec
**La red de SubQuery**
-La red SubQuery permite a los desarrolladores descentralizar completamente su pila de infraestructura. Es el servicio de datos más abierto, eficiente, fiable y escalable para dApps. SubQuery Network indexa y da servicio a la comunidad global de una manera incentivada y verificable. Después de publicar tu proyecto en SubQuery Network, cualquiera puede indexarlo y alojarlo - proporcionando datos a los usuarios de todo el mundo de manera más rápida y fiable.
+La red SubQuery permite a los desarrolladores descentralizar completamente su pila de infraestructura. Es el servicio de datos más abierto, eficiente, fiable y escalable para dApps. SubQuery Network indexa y da servicio a la comunidad global de una manera incentivada y verificable. Después de publicar tu proyecto en SubQuery Network, cualquiera puede indexarlo y alojarlo - proporcionando datos a los usuarios de todo el mundo de manera más rápida y fiable.
Más información [aquí](/subquery_network/introduction.md).
@@ -26,7 +26,7 @@ La mejor manera de empezar con SubQuery es probar nuestro [tutorial de Hola Mund
## ¿Cómo puedo contribuir o dar comentarios a SubQuery?
-Nos encantan las contribuciones y comentarios de la comunidad. Para contribuir con el código, bifurca el repositorio de su interés y realice sus cambios. Luego envíe un PR o Pull Request. No te olvides de probar también. Also check out our contributions guidelines.
+Nos encantan las contribuciones y comentarios de la comunidad. Para contribuir con el código, bifurca el repositorio de su interés y realice sus cambios. Luego envíe un PR o Pull Request. No te olvides de probar también. Also check out our [contributions guidelines](../miscellaneous/contributing.html).
Para dar comentarios, contáctanos a hello@subquery.network o salta a nuestro [canal de discord](https://discord.com/invite/78zg8aBSMG).
@@ -76,7 +76,6 @@ subql-node -f . --force-clean --subquery-name=
Tenga en cuenta que se recomienda usar `--force-clean` al cambiar el `startBlock` dentro del manifiesto del proyecto (`project.yaml`) para comenzar a reindexar desde el bloque configurado. Si `startBlock` se cambia sin un `--force-clean` del proyecto, entonces el indexador continuará indexando con el `startBlock` previamente configurado.
-
## How can I optimise my project to speed it up?
Performance is a crucial factor in each project. Fortunately, there are several things you could do to improve it. Here is the list of some suggestions:
@@ -89,13 +88,13 @@ Performance is a crucial factor in each project. Fortunately, there are several
- Set the start block to when the contract was initialised.
- Always use a [dictionary](../tutorials_examples/dictionary.html#how-does-a-subquery-dictionary-work) (we can help create one for your new network).
- Optimise your schema design, keep it as simple as possible.
- - Try to reduce unnecessary fields and columns.
- - Create indexes as needed.
+ - Try to reduce unnecessary fields and columns.
+ - Create indexes as needed.
- Use parallel/batch processing as often as possible.
- - Use `api.queryMulti()` to optimise Polkadot API calls inside mapping functions and query them in parallel. This is a faster way than a loop.
- - Use `Promise.all()`. In case of multiple async functions, it is better to execute them and resolve in parallel.
- - If you want to create a lot of entities within a single handler, you can use `store.bulkCreate(entityName: string, entities: Entity[])`. You can create them in parallel, no need to do this one by one.
+ - Use `api.queryMulti()` to optimise Polkadot API calls inside mapping functions and query them in parallel. This is a faster way than a loop.
+ - Use `Promise.all()`. In case of multiple async functions, it is better to execute them and resolve in parallel.
+  - If you want to create a lot of entities within a single handler, you can use `store.bulkCreate(entityName: string, entities: Entity[])`. You can create them in parallel, no need to do this one by one (see the sketch after this list).
- Making API calls to query state can be slow. You could try to minimise calls where possible and to use `extrinsic/transaction/event` data.
- Use `worker threads` to move block fetching and block processing into its own worker thread. It could speed up indexing by up to 4 times (depending on the particular project). You can easily enable it using the `-workers=` flag. Note that the number of available CPU cores strictly limits the usage of worker threads. For now, it is only available for Substrate and Cosmos and will soon be integrated for Avalanche.
- Note that `JSON.stringify` doesn’t support native `BigInts`. Our logging library will do this internally if you attempt to log an object. We are looking at a workaround for this.
-- Use a convenient `modulo` filter to run a handler only once to a specific block. This filter allows handling any given number of blocks, which is extremely useful for grouping and calculating data at a set interval. For instance, if modulo is set to 50, the block handler will run on every 50 blocks. It provides even more control over indexing data to developers and can be implemented like so below in your project manifest.
\ No newline at end of file
+- Use the convenient `modulo` filter to run a block handler only on every Nth block. This is extremely useful for grouping and calculating data at a set interval; for instance, if `modulo` is set to 50, the block handler runs only on every 50th block. It gives developers even more control over indexing and is configured as a filter on a block handler in your project manifest.
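To illustrate the parallel-processing tips above (`Promise.all()` and `store.bulkCreate()`), here is a minimal, hedged sketch; `Transfer` is a hypothetical generated entity, its fields are illustrative, and the event layout is only an example:

```ts
import { SubstrateEvent } from "@subql/types";
import { Transfer } from "../types"; // hypothetical entity generated from your schema

export async function handleTransferEvent(event: SubstrateEvent): Promise<void> {
  const [from, to, amount] = event.event.data;

  // Resolve independent lookups in parallel instead of awaiting them one by one.
  const [fromAccount, toAccount] = await Promise.all([
    api.query.system.account(from.toString()),
    api.query.system.account(to.toString()),
  ]);
  logger.info(`free balances: ${fromAccount.data.free.toString()} / ${toAccount.data.free.toString()}`);

  // Create many entities in a single call rather than saving them one at a time.
  const blockHeight = event.block.block.header.number.toString();
  const records = [from, to].map((address, i) => {
    const record = new Transfer(`${blockHeight}-${event.idx}-${i}`);
    record.account = address.toString(); // illustrative field
    record.amount = amount.toString(); // illustrative field
    return record;
  });
  await store.bulkCreate("Transfer", records);
}
```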
diff --git a/docs/es/miscellaneous/contributing.md b/docs/es/miscellaneous/contributing.md
index 913ae7519d1..33047085d9c 100644
--- a/docs/es/miscellaneous/contributing.md
+++ b/docs/es/miscellaneous/contributing.md
@@ -2,7 +2,7 @@
¡Bienvenido y un gran agradecimiento por considerar contribuir a este proyecto de SubQuery! Juntos podemos allanar el camino hacia un futuro más descentralizado.
-::: info Note This documentation is actively maintained by the SubQuery team. We welcome your contributions. You can do so by forking our GitHub project and making changes to all the documentation markdown files under the `docs` directory. :::
+::: tip Note This documentation is actively maintained by the SubQuery team. We welcome your contributions. You can do so by forking our GitHub project and making changes to all the documentation markdown files under the `docs` directory. :::
What follows is a set of guidelines (not rules) for contributing to SubQuery. Following these guidelines will help us make the contribution process easy and effective for everyone involved. It also communicates that you agree to respect the time of the developers managing and developing this project. In return, we will reciprocate that respect by addressing your issue, considering changes, collaborating on improvements, and helping you finalise your pull requests.
@@ -14,8 +14,8 @@ We take our open source community projects and responsibility seriously and hold
Contributions to our repositories are made through Issues and Pull Requests (PRs). A few general guidelines that cover both:
-* Busca problemas y PRs existentes antes de crear los tuyos.
-* Trabajamos arduamente para garantizar que las cuestiones se traten con rapidez, pero dependiendo del impacto, podría tardar un tiempo en investigar la causa raíz. Una mención amigable @ en el hilo de comentarios para el envío o un colaborador puede ayudar a llamar la atención si su problema está bloqueando.
+- Busca problemas y PRs existentes antes de crear los tuyos.
+- Trabajamos arduamente para garantizar que las cuestiones se traten con rapidez, pero dependiendo del impacto, podría tardar un tiempo en investigar la causa raíz. Una mención amigable @ en el hilo de comentarios para el envío o un colaborador puede ayudar a llamar la atención si su problema está bloqueando.
## ¿Cómo contribuir?
@@ -23,32 +23,32 @@ Contributions to our repositories are made through Issues and Pull Requests (PRs
Bugs are tracked as GitHub issues. When logging an issue, explain the problem and include additional details to help maintainers reproduce the problem:
-* Utilice un título claro y descriptivo para identificar el problema.
-* Describa los pasos exactos para reproducir el problema.
-* Describa el comportamiento que observó después de seguir los pasos.
-* Explicar qué comportamiento esperabas ver en su lugar y por qué.
-* Incluye capturas de pantalla si es posible.
+- Utilice un título claro y descriptivo para identificar el problema.
+- Describa los pasos exactos para reproducir el problema.
+- Describa el comportamiento que observó después de seguir los pasos.
+- Explicar qué comportamiento esperabas ver en su lugar y por qué.
+- Incluye capturas de pantalla si es posible.
### Envío de Pull Request
In general, we follow the "fork-and-pull" Git workflow:
-* Fork the repository to your own Github account.
-* Clone the project to your machine.
-* Create a branch locally with a succinct but descriptive name.
-* Commit changes to the branch.
-* Following any formatting and testing guidelines specific to this repo.
-* Push changes to your fork.
-* Open a PR in our repository.
+- Fork the repository to your own Github account.
+- Clone the project to your machine.
+- Create a branch locally with a succinct but descriptive name.
+- Commit changes to the branch.
+- Following any formatting and testing guidelines specific to this repo.
+- Push changes to your fork.
+- Open a PR in our repository.
## Convenciones de Código
### Mensajes de Git Commit
-* Use the present tense ("Add feature" not "Added feature").
-* Use the imperative mood ("Move cursor to..." not "Moves cursor to...").
-* Limit the first line to 72 characters or less.
+- Use the present tense ("Add feature" not "Added feature").
+- Use the imperative mood ("Move cursor to..." not "Moves cursor to...").
+- Limit the first line to 72 characters or less.
### Guía de estilo de JavaScript
-* All JavaScript code is linted with Prettier and ESLint.
+- All JavaScript code is linted with Prettier and ESLint.
diff --git a/docs/es/quickstart/helloworld-localhost.md b/docs/es/quickstart/helloworld-localhost.md
index a73234062dc..0f0f4f9b928 100644
--- a/docs/es/quickstart/helloworld-localhost.md
+++ b/docs/es/quickstart/helloworld-localhost.md
@@ -70,7 +70,7 @@ RPC endpoint: [wss://polkadot. pi.onfinality.io/public-ws]:
Repositorio Git [https://github.com/subquery/subql-starter]:
Obteniendo la genesis de red hash.. done
Autor [Ian He & Jay Ji]:
-Descripción [Este proyecto puede ser utilizado como un inicio de...:
+Descripción [Este proyecto puede ser utilizado como un inicio de...]:
Versión [0.0.4]:
Licencia [MIT]:
proyecto de preparación... hecho
@@ -88,14 +88,16 @@ cd subqlHelloWorld
Ahora haga una instalación de yarn o node para instalar las distintas dependencias.
- ```shell yarn build ``` ```bash npm run-script build ```
+::: code-tabs @tab:active yarn `shell yarn build ` @tab npm `bash npm run-script build ` :::
An example of `yarn build`
-```shell
-# Yarn yarn install # NPM npm install
+```shell
+# Yarn
+yarn install
+
+# NPM
+npm install
+```
-
> yarn install
+```shell
+yarn install
yarn install v1.22.10
info No lockfile found.
[1/4] 🔍 Resolving packages...
@@ -104,13 +106,16 @@ info No lockfile found.
[4/4] 🔨 Building fresh packages...
success Saved lockfile.
✨ Done in 31.84s.
-```
+```
## 3. Step 3: Generador de codigo
Ahora ejecuta `yarn codegen` para generar Typescript desde el esquema GraphQL.
- # Yarn yarn codegen # NPM npm run-script codegen
+```shell
+# Yarn
+yarn codegen
+
+# NPM
+npm run-script codegen
+```
```shell
> yarn codegen
@@ -122,9 +127,11 @@ $ ./node_modules/.bin/subql codegen
* Schema StarterEntity generated !
* Models index generated !
* Types index generated !
-✨ Done in 1.02s
+✨ Done in 1.02s
+```
-
> yarn codegen
+```shell
+yarn codegen
yarn run v1.22.10
$ ./node_modules/.bin/subql codegen
===============================
@@ -138,7 +145,8 @@ $ ./node_modules/.bin/subql codegen
* Tipo de índice generado!
* Tipo de índice generado!
✨ Hecho en 0.06s.
- ✨ Hecho en 1.02s.
+ ✨ Hecho en 1.02s.
+```
**Advertencia** Cuando se hacen cambios en el archivo de schema, por favor recuerde volver a ejecutar `yarn codegen` para regenerar el directorio de tipos.
@@ -146,7 +154,7 @@ $ ./node_modules/.bin/subql codegen
El siguiente paso es construir el código con `yarn build`.
- # Yarn yarn build # NPM npm run-script build
+::: code-tabs @tab:active yarn `shell yarn build ` @tab npm `shell npm run-script build ` :::
```shell
> yarn build
@@ -199,3 +207,4 @@ El número de bloques en el patio de juego debe coincidir con el número de bloq
## Resúmen
En este inicio rápido, demostramos los pasos básicos para poner en marcha un proyecto inicial dentro de un entorno Docker y luego navegamos a localhost:3000 y ejecutamos una consulta para devolver el número de bloque de la red mainnet Polkadot.
diff --git a/docs/es/quickstart/quickstart-avalanche.md b/docs/es/quickstart/quickstart-avalanche.md
index beb72e2c0ce..65191f5e8c8 100644
--- a/docs/es/quickstart/quickstart-avalanche.md
+++ b/docs/es/quickstart/quickstart-avalanche.md
@@ -59,7 +59,11 @@ Después de completar el proceso de inicialización, debería ver una carpeta co
Por último, bajo el directorio del proyecto, ejecute el siguiente comando para instalar las dependencias del nuevo proyecto.
- shell cd PROJECT_NAME npm install ``` Hacer cambios en su proyecto En el paquete de inicio que acaba de inicializar, proporcionamos una configuración estándar para su nuevo proyecto. Usted trabajará principalmente en los siguientes archivos:
+```shell
+cd PROJECT_NAME
+npm install
+```
+
+**Hacer cambios en su proyecto.** En el paquete de inicio que acaba de inicializar, proporcionamos una configuración estándar para su nuevo proyecto. Usted trabajará principalmente en los siguientes archivos:
1. El esquema GraphQL en `schema.graphql`
2. El manifiesto del proyecto en `project.yaml`
@@ -87,8 +91,8 @@ type PangolinApproval @entity {
**Importante: Cuando realice cambios en el archivo de esquema, asegúrese de que regenera el directorio de sus tipos. Hágalo ahora.**
- ```shell yarn codegen ```
- ```shell npm run-script codegen ```
+::: code-tabs @tab:active yarn `shell yarn codegen `
+@tab npm `shell npm run-script codegen ` :::
You'll find the generated models in the `/src/types/models`. Para más información sobre el archivo `schema.graphql` , revisa nuestra documentación en [Esquema de Build/GraphQL](../build/graphql.md)
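As a hedged illustration of how those generated models are used (the helper name and the entity fields below are only examples; `PangolinApproval` is the entity defined in the schema above):

```ts
import { PangolinApproval } from "../types";

// Persisting data from any mapping handler is done through the generated entity classes.
export async function saveApproval(txHash: string, blockNumber: number, owner: string, spender: string): Promise<void> {
  const approval = new PangolinApproval(`${txHash}-approval`);
  approval.transactionHash = txHash; // illustrative field names –
  approval.blockNumber = BigInt(blockNumber); // use the fields from your own schema.graphql
  approval.ownerAddress = owner;
  approval.spenderAddress = spender;
  await approval.save();
}
```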
@@ -164,15 +168,15 @@ Para más información sobre las funciones de mapeo, revisa nuestra documentaci
Para ejecutar tu nuevo SubQuery Project primero necesitamos construir nuestro trabajo. Ejecuta el comando de compilación desde el directorio raíz del proyecto.
- ```shell yarn build ``` ```shell npm run-script build ```
+::: code-tabs @tab:active yarn `shell yarn build ` @tab npm `shell npm run-script build ` :::
-**Important: Whenever you make changes to your mapping functions, you'll need to rebuild your project**. La forma más sencilla de hacerlo es utilizando Docker.
+**Important: Whenever you make changes to your mapping functions, you'll need to rebuild your project**. La forma más sencilla de hacerlo es utilizando Docker.
Toda la configuración que controla cómo se ejecuta un nodo SubQuery se define en este archivo `docker-compose.yml`. Para un nuevo proyecto que ha sido inicializado no necesitarás cambiar nada aquí, pero puedes leer más sobre el archivo y la configuración en nuestra sección [Ejecutar un proyecto](../run_publish/run.md)
Bajo el directorio del proyecto ejecute el siguiente comando:
- ```shell yarn start:docker ``` ```shell npm run-script start:docker ```
+::: code-tabs @tab:active yarn `shell yarn start:docker ` @tab npm `shell npm run-script start:docker ` :::
It may take some time to download the required packages ([`@subql/node`](https://www.npmjs.com/package/@subql/node), [`@subql/query`](https://www.npmjs.com/package/@subql/query), and Postgres) for the first time but soon you'll see a running SubQuery node. Sea paciente aquí.
@@ -186,7 +190,7 @@ Para un nuevo proyecto inicial de SubQuery, puedes probar la siguiente consulta
```graphql
query {
- pangolinApprovals(first: 5) {
+ pangolinApprovals(first: 5) {
nodes {
id
blockNumber
diff --git a/docs/es/quickstart/quickstart-cosmos.md b/docs/es/quickstart/quickstart-cosmos.md
index 0a34a6d5ecb..95d056d2b3d 100644
--- a/docs/es/quickstart/quickstart-cosmos.md
+++ b/docs/es/quickstart/quickstart-cosmos.md
@@ -44,7 +44,7 @@ Después de completar el proceso de inicialización, debería ver una carpeta co
Por último, bajo el directorio del proyecto, ejecute el siguiente comando para instalar las dependencias del nuevo proyecto.
- shell cd PROJECT_NAME npm install ``` Hacer cambios en su proyecto En el paquete de inicio que acaba de inicializar, proporcionamos una configuración estándar para su nuevo proyecto. Usted trabajará principalmente en los siguientes archivos:
+`cd PROJECT_NAME && npm install`
+
+**Hacer cambios en su proyecto.** En el paquete de inicio que acaba de inicializar, proporcionamos una configuración estándar para su nuevo proyecto. Usted trabajará principalmente en los siguientes archivos:
1. El esquema GraphQL en `schema.graphql`
2. El manifiesto del proyecto en `project.yaml`
@@ -71,10 +71,10 @@ type Vote @entity {
**Importante: Cuando realice cambios en el archivo de esquema, asegúrese de que regenera el directorio de sus tipos. Hágalo ahora.**
- ```shell yarn codegen ```
- ```shell npm run-script codegen ```
+::: code-tabs @tab:active yarn `shell yarn codegen `
+@tab npm `shell npm run-script codegen ` :::
-You'll find the generated models in the `/src/types/models` Para más información sobre el archivo `schema.graphql` , revisa nuestra documentación en [Esquema de Build/GraphQL](../build/graphql.md)
+You'll find the generated models in the `/src/types/models` Para más información sobre el archivo `schema.graphql` , revisa nuestra documentación en [Esquema de Build/GraphQL](../build/graphql.md)
### Actualizando el archivo de manifiesto del proyecto
@@ -141,15 +141,15 @@ Para más información sobre las funciones de mapeo, revisa nuestra documentaci
Para ejecutar tu nuevo SubQuery Project primero necesitamos construir nuestro trabajo. Ejecuta el comando de compilación desde el directorio raíz del proyecto.
- ```shell yarn build ``` ```shell npm run-script build ```
+::: code-tabs @tab:active yarn `shell yarn build ` @tab npm `shell npm run-script build ` :::
-**Important: Whenever you make changes to your mapping functions, you'll need to rebuild your project** La forma más fácil de hacer esto es usando Docker.
+**Important: Whenever you make changes to your mapping functions, you'll need to rebuild your project** La forma más fácil de hacer esto es usando Docker.
Toda la configuración que controla cómo se ejecuta un nodo de SubQuery está definida en este archivo `docker-compose.yml`. Para un nuevo proyecto que ha sido inicializado no necesitarás cambiar nada aquí, pero puedes leer más sobre el archivo y la configuración en nuestra sección [Ejecutar un proyecto](../run_publish/run.md)
Bajo el directorio del proyecto ejecute el siguiente comando:
- ```shell yarn start:docker ``` ```shell npm run-script start:docker ```
+::: code-tabs @tab:active yarn `shell yarn start:docker ` @tab npm `shell npm run-script start:docker ` :::
It may take some time to download the required packages ([`@subql/node`](https://www.npmjs.com/package/@subql/node), [`@subql/query`](https://www.npmjs.com/package/@subql/query), and Postgres) for the first time but soon you'll see a running SubQuery node. Sea paciente aquí.
@@ -163,10 +163,9 @@ Para un nuevo proyecto inicial de SubQuery, puedes probar la siguiente consulta
```graphql
query {
- votes(
+ votes(
first: 5
- orderBy: BLOCK_HEIGHT_DESC
- # filter: {proposalID: {equalTo: "4"}}
+ orderBy: BLOCK_HEIGHT_DESC # filter: {proposalID: {equalTo: "4"}}
) {
nodes {
id
diff --git a/docs/es/quickstart/quickstart-polkadot.md b/docs/es/quickstart/quickstart-polkadot.md
index 089b17c8cc5..0048478f49d 100644
--- a/docs/es/quickstart/quickstart-polkadot.md
+++ b/docs/es/quickstart/quickstart-polkadot.md
@@ -43,10 +43,10 @@ subql init
Se le harán ciertas preguntas ya que el proyecto de SubQuery está initalizado:
- Nombre del proyecto: Un nombre de proyecto para su proyecto de SubQuery
-- Familia de redes: La familia de redes de blockchain de capa 1 para la que se desarrollará este proyecto de SubQuery. Utilice las teclas de flecha para seleccionar las opciones disponibles. Para esta guía, utilizaremos *"Substrate"*
-- Red: La red específica para la que se desarrollará este proyecto de SubQuery. Utilice las teclas de flecha para seleccionar las opciones disponibles. Para esta guía, utilizaremos *"Polkadot"*
-- Proyecto de plantilla: Seleccione un proyecto de plantilla de SubQuery que le proporcionará un punto de partida para comenzar el desarrollo. Sugerimos seleccionar el proyecto *"subql-starter"*.
-- Punto final RPC: Proporcione una URL HTTPS a un punto final RPC en ejecución que se utilizará por defecto para este proyecto. Puede acceder rápidamente a los puntos finales públicos para diferentes redes de Polkadot, crear tu propio nodo privado dedicado utilizando [OnFinality](https://app.onfinality.io) o simplemente utilizar el punto final predeterminado de Polkadot. Este nodo RPC debe ser un nodo de archivo (tienen el estado completo de cadena). Para esta guía, utilizaremos el valor por defecto *"https://polkadot.api.onfinality.io"*
+- Familia de redes: La familia de redes de blockchain de capa 1 para la que se desarrollará este proyecto de SubQuery. Utilice las teclas de flecha para seleccionar las opciones disponibles. Para esta guía, utilizaremos _"Substrate"_
+- Red: La red específica para la que se desarrollará este proyecto de SubQuery. Utilice las teclas de flecha para seleccionar las opciones disponibles. Para esta guía, utilizaremos _"Polkadot"_
+- Proyecto de plantilla: Seleccione un proyecto de plantilla de SubQuery que le proporcionará un punto de partida para comenzar el desarrollo. Sugerimos seleccionar el proyecto _"subql-starter"_.
+- Punto final RPC: Proporcione una URL HTTPS a un punto final RPC en ejecución que se utilizará por defecto para este proyecto. Puede acceder rápidamente a los puntos finales públicos para diferentes redes de Polkadot, crear tu propio nodo privado dedicado utilizando [OnFinality](https://app.onfinality.io) o simplemente utilizar el punto final predeterminado de Polkadot. Este nodo RPC debe ser un nodo de archivo (tienen el estado completo de cadena). Para esta guía, utilizaremos el valor por defecto _"https://polkadot.api.onfinality.io"_
- Git repository: Proporcione una URL Git a un repositorio en el que se alojará este proyecto SubQuery (cuando se aloje en el Explorador de SubQuery) o acepte el predeterminado proporcionado.
- Autores: Introduzca aquí el propietario de este proyecto de SubQuery (por ejemplo, su nombre) o acepte el valor predeterminado proporcionado.
- Descripción: Proporcione un breve párrafo sobre su proyecto que describa los datos que contiene y lo que los usuarios pueden hacer con ellos o acepte el valor predeterminado proporcionado.
@@ -57,7 +57,7 @@ Una vez completado el proceso de inicialización, deberías ver que se ha creado
Por último, en el directorio del proyecto, ejecute el siguiente comando para instalar las dependencias del nuevo proyecto.
- shell cd PROJECT_NAME npm install ``` Hacer cambios en su proyecto En el paquete de inicio que acaba de inicializar, proporcionamos una configuración estándar para su nuevo proyecto. Estos son:
+`cd PROJECT_NAME && npm install`
+
+**Hacer cambios en su proyecto.** En el paquete de inicio que acaba de inicializar, proporcionamos una configuración estándar para su nuevo proyecto. Estos son:
1. El esquema GraphQL en `schema.graphql`
2. El manifiesto del proyecto en `project.yaml`
@@ -83,8 +83,8 @@ type Transfer @entity {
**Importante: Cuando realice cambios en el archivo de esquema, asegúrese de que regenera el directorio de sus tipos.**
- ```shell yarn codegen ```
- ```shell npm run-script codegen ```
+::: code-tabs @tab:active yarn `shell yarn codegen `
+@tab npm `shell npm run-script codegen ` :::
You'll find the generated models in the `/src/types/models` directory. Para más información sobre el archivo `schema.graphql` , revisa nuestra documentación en [Esquema de Build/GraphQL](../build/graphql.md)
@@ -149,15 +149,15 @@ Para más información sobre las funciones de mapeo, revisa nuestra documentaci
Para ejecutar su nuevo Proyecto SubQuery, primero tenemos que construir nuestro trabajo. Ejecuta el comando de compilación desde el directorio raíz del proyecto.
- ```shell yarn build ``` ```shell npm run-script build ```
+::: code-tabs @tab:active yarn `shell yarn build ` @tab npm `shell npm run-script build ` :::
-**Important: Whenever you make changes to your mapping functions, you'll need to rebuild your project**. La forma más fácil de hacer esto es usando Docker.
+**Important: Whenever you make changes to your mapping functions, you'll need to rebuild your project**. La forma más fácil de hacer esto es usando Docker.
Toda la configuración que controla cómo se ejecuta un nodo SubQuery se define en el archivo `docker-compose.yml`. Para un nuevo proyecto que acaba de ser inicializado no necesitará cambiar nada, pero puede leer más sobre el archivo y los ajustes en nuestra sección [Ejecutar un proyecto](../run_publish/run.md).
En el directorio del proyecto, ejecute el siguiente comando:
- ```shell yarn start:docker ``` ```shell npm run-script start:docker ```
+::: code-tabs @tab:active yarn `shell yarn start:docker ` @tab npm `shell npm run-script start:docker ` :::
It may take some time to download the required packages ([`@subql/node`](https://www.npmjs.com/package/@subql/node), [`@subql/query`](https://www.npmjs.com/package/@subql/query), and Postgres) for the first time but soon you'll see a running SubQuery node.
@@ -172,10 +172,7 @@ Para un nuevo proyecto de inicio de SubQuery, pruebe la siguiente consulta para
```graphql
{
  query {
- transfers(
- first: 10,
- orderBy: AMOUNT_DESC
- ) {
+ transfers(first: 10, orderBy: AMOUNT_DESC) {
      nodes {
        id
        amount
diff --git a/docs/es/quickstart/quickstart-terra.md b/docs/es/quickstart/quickstart-terra.md
index 0ff6b5ac098..8cf07cc402b 100644
--- a/docs/es/quickstart/quickstart-terra.md
+++ b/docs/es/quickstart/quickstart-terra.md
@@ -45,11 +45,11 @@ subql init
Se le harán ciertas preguntas ya que el proyecto de SubQuery está initalizado:
- Nombre del proyecto: Un nombre para tu proyecto de Subconsulta
-- Familia de Red: La familia de red de blockchain capa 1 que este proyecto de SubQuery será desarrollado para indexar, usa las teclas de flecha de tu teclado para seleccionar entre las opciones, para esta guía usaremos *"Terra"*
-- Red: La red específica que este proyecto de Subconsulta será desarrollado para indexar, usa las teclas de flecha de tu teclado para seleccionar entre las opciones, para esta guía usaremos *"Terra"*
-- Plantilla: Seleccione una plantilla de proyecto de SubQuery que proporcionará un punto de partida para comenzar el desarrollo, le sugerimos seleccionar el *"Inicio del proyecto"*
+- Familia de Red: La familia de red de blockchain capa 1 que este proyecto de SubQuery será desarrollado para indexar, usa las teclas de flecha de tu teclado para seleccionar entre las opciones, para esta guía usaremos _"Terra"_
+- Red: La red específica que este proyecto de Subconsulta será desarrollado para indexar, usa las teclas de flecha de tu teclado para seleccionar entre las opciones, para esta guía usaremos _"Terra"_
+- Plantilla: Seleccione una plantilla de proyecto de SubQuery que proporcionará un punto de partida para comenzar el desarrollo, le sugerimos seleccionar el _"Inicio del proyecto"_
- Repositorio Git (opcional): Proporcione una URL Git a un repositorio en el que este proyecto de SubQuery será alojado (cuando esté alojado en SubQuery Explorer)
-- endpoint RPC (requerido): Proporcione una URL HTTPS a un endpoint RPC en ejecución que se utilizará por defecto para este proyecto. Este nodo RPC debe ser un nodo de archivo (tienen el estado completo de cadena). Para esta guía usaremos el valor predeterminado *"https://terra-columbus-5.beta.api.onfinality.io"*
+- endpoint RPC (requerido): Proporcione una URL HTTPS a un endpoint RPC en ejecución que se utilizará por defecto para este proyecto. Este nodo RPC debe ser un nodo de archivo (tienen el estado completo de cadena). Para esta guía usaremos el valor predeterminado _"https://terra-columbus-5.beta.api.onfinality.io"_
- Autores (Requeridos): Introduzca el propietario de este proyecto de Subconsulta aquí (por ejemplo, su nombre)
- Descripción (Opcional): Puede proporcionar un párrafo corto sobre su proyecto que describa qué datos contiene y qué pueden hacer los usuarios con él
- Versión (Requerida): Introduzca un número de versión personalizado o utilice el predeterminado (`1.0.0`)
@@ -59,7 +59,7 @@ Después de completar el proceso de inicialización, debería ver una carpeta co
Por último, bajo el directorio del proyecto, ejecute el siguiente comando para instalar las dependencias del nuevo proyecto.
- shell cd PROJECT_NAME npm install ``` Hacer cambios en su proyecto En el paquete de inicio que acaba de inicializar, proporcionamos una configuración estándar para su nuevo proyecto. Usted trabajará principalmente en los siguientes archivos:
+`cd PROJECT_NAME && npm install`
+
+**Hacer cambios en su proyecto.** En el paquete de inicio que acaba de inicializar, proporcionamos una configuración estándar para su nuevo proyecto. Estos son:
1. El esquema GraphQL en `schema.graphql`
2. El manifiesto del proyecto en `project.yaml`
@@ -86,10 +86,10 @@ type Transfer @entity {
**Importante: Cuando realice cambios en el archivo de esquema, asegúrese de que regenera el directorio de sus tipos. Hágalo ahora.**
- ```shell yarn codegen ```
- ```shell npm run-script codegen ```
+::: code-tabs @tab:active yarn `shell yarn codegen `
+@tab npm `shell npm run-script codegen ` :::
-You'll find the generated models in the `/src/types/models` Para más información sobre el archivo `schema.graphql` , revisa nuestra documentación en [Esquema de Build/GraphQL](../build/graphql.md)
+You'll find the generated models in the `/src/types/models` Para más información sobre el archivo `schema.graphql` , revisa nuestra documentación en [Esquema de Build/GraphQL](../build/graphql.md)
### Actualizando el archivo de manifiesto del proyecto
@@ -159,15 +159,15 @@ Para más información sobre las funciones de mapeo, revisa nuestra documentaci
Para ejecutar tu nuevo SubQuery Project primero necesitamos construir nuestro trabajo. Ejecuta el comando de compilación desde el directorio raíz del proyecto.
- ```shell yarn build ``` ```shell npm run-script build ```
+::: code-tabs @tab:active yarn `shell yarn build ` @tab npm `shell npm run-script build ` :::
-**Important: Whenever you make changes to your mapping functions, you'll need to rebuild your project** La forma más fácil de hacer esto es usando Docker.
+**Important: Whenever you make changes to your mapping functions, you'll need to rebuild your project** La forma más fácil de hacer esto es usando Docker.
Toda la configuración que controla cómo se ejecuta un nodo de SubQuery está definida en este archivo `docker-compose.yml`. Para un nuevo proyecto que ha sido inicializado no necesitarás cambiar nada aquí, pero puedes leer más sobre el archivo y la configuración en nuestra sección [Ejecutar un proyecto](../run_publish/run.md)
Bajo el directorio del proyecto ejecute el siguiente comando:
- ```shell yarn start:docker ``` ```shell npm run-script start:docker ```
+::: code-tabs @tab:active yarn `shell yarn start:docker ` @tab npm `shell npm run-script start:docker ` :::
It may take some time to download the required packages ([`@subql/node`](https://www.npmjs.com/package/@subql/node), [`@subql/query`](https://www.npmjs.com/package/@subql/query), and Postgres) for the first time but soon you'll see a running SubQuery node. Sea paciente aquí.
@@ -182,10 +182,7 @@ Para un nuevo proyecto inicial de SubQuery, puedes probar la siguiente consulta
```graphql
{
query {
- transfers(
- first: 10,
- orderBy: ID_DESC
- ) {
+ transfers(first: 10, orderBy: ID_DESC) {
nodes {
id
txHash
diff --git a/docs/es/quickstart/quickstart.md b/docs/es/quickstart/quickstart.md
index 87d0d832d7a..107a94b460e 100644
--- a/docs/es/quickstart/quickstart.md
+++ b/docs/es/quickstart/quickstart.md
@@ -89,8 +89,8 @@ Después de completar el proceso de inicialización, verá una carpeta con el no
Finalmente, ejecute el siguiente comando para instalar las dependencias del nuevo proyecto desde el directorio del nuevo proyecto.
- ```shell cd PROJECT_NAME yarn install ```
- ```shell cd PROJECT_NAME npm install ```
+::: code-tabs @tab:active yarn `shell cd PROJECT_NAME yarn install `
+@tab npm `shell cd PROJECT_NAME npm install ` :::
You have now initialised your first SubQuery project with just a few simple steps. Personalicemos ahora el proyecto de plantilla estándar para un blockchain específico de interés.
@@ -104,4 +104,4 @@ Hay 3 archivos importantes que necesitan ser modificados. These are:
2. The Project Manifest in `project.yaml`.
3. The Mapping functions in `src/mappings/` directory.
-SubQuery supports various blockchain networks and provides a dedicated guide for each of them. Select your preferred blockchain under 2. Specific Chains and continue the quick start guide.
\ No newline at end of file
+SubQuery supports various blockchain networks and provides a dedicated guide for each of them. Select your preferred blockchain under 2. Specific Chains and continue the quick start guide.
diff --git a/docs/es/run_publish/connect.md b/docs/es/run_publish/connect.md
index 9fa0abeab57..64aec0ae1db 100644
--- a/docs/es/run_publish/connect.md
+++ b/docs/es/run_publish/connect.md
@@ -2,10 +2,10 @@
Once your deployment has succesfully completed and our nodes have indexed your data from the chain, you'll be able to connect to your project via the displayed Query endpoint.
-![Proyecto en despliegue y sincronización](/assets/img/projects-deploy-sync.png)
+![Proyecto en despliegue y sincronización](/assets/img/projects_deploy_sync.png)
Alternativamente, puedes hacer clic en los tres puntos al lado del título de tu proyecto, y verlo en SubQuery Explorer. There you can use the in browser playground to get started.
-![Projects in SubQuery Explorer](/assets/img/projects-explorer.png)
+![Projects in SubQuery Explorer](/assets/img/projects_explorer.png)
-::: info Note Learn more about the [GraphQL Query language.](./graphql.md) :::
+::: tip Note Learn more about the [GraphQL Query language.](./graphql.md) :::
diff --git a/docs/es/run_publish/query.md b/docs/es/run_publish/query.md
index 88d42cae25a..11c65360cc4 100644
--- a/docs/es/run_publish/query.md
+++ b/docs/es/run_publish/query.md
@@ -12,4 +12,4 @@ También notará que el SubQuery Explorer proporciona un área de juego para des
On the top right of the playground, you'll find a _Docs_ button that will open a documentation draw. Esta documentación se genera automáticamente y le ayuda a encontrar qué entidades y métodos puede consultar.
-::: info Note Learn more about the [GraphQL Query language.](./graphql.md) :::
+::: tip Note Learn more about the [GraphQL Query language.](./graphql.md) :::
diff --git a/docs/es/run_publish/references.md b/docs/es/run_publish/references.md
index caf40cea3b4..b7e6ff2dfe4 100644
--- a/docs/es/run_publish/references.md
+++ b/docs/es/run_publish/references.md
@@ -21,10 +21,10 @@ COMMANDS
This command is uses webpack to generate a bundle of a subquery project.
-| Options | Descripción |
-| ------------------ | ---------------------------------------------------------------------------------------------------------- |
-| -l, --location | local folder of subquery project (if not in folder already) |
-| -o, --output | specify output folder of build e.g. build-folder |
+| Options                                            | Descripción                                                  |
+| --------------------------------------------------- | ------------------------------------------------------------ |
+| -l, --location                                       | local folder of subquery project (if not in folder already)  |
+| -o, --output                                         | specify output folder of build e.g. build-folder              |
+| --mode=(production \| prod \| development \| dev)    | [ default: production ]                                       |
- With `subql build` you can specify additional entry points in exports field although it will always build `index.ts` automatically.
@@ -106,7 +106,7 @@ This displays the current version.
### reindex
-:::warning In order to use this command, you require `@subql/node:v1.10.0`/`@subql/node-:v1.10.0` or above. :::
+:::warning In order to use this command, you require `@subql/node:v1.10.0`/`@subql/node-YOURNETWORK:v1.10.0` or above. :::
When using reindex command, historical must be enabled for the targeted project (`--disable-historical=false`). After starting the project, it would print out a log stating if historical is enabled or not.
@@ -122,7 +122,7 @@ If the `targetHeight` is less than the declared starting height, it will execute
subql-node -f /example/subql-project reindex --targetHeight=30
```
-::: info Note
+::: tip Note
Once the command is executed and the state has been rolled back the the specified height, the application will exit. You can then start up the indexer to proceed again from this height.
:::
@@ -134,7 +134,7 @@ This command forces the project schemas and tables to be regenerated. It is help
`-f`, `--subquery` flag must be passed in, to set path of the targeted project.
-::: info Note Similar to `reindex` command, the application would exit upon completion. :::
+::: tip Note Similar to `reindex` command, the application would exit upon completion. :::
```shell
subql-node -f /example/subql-project force-clean
@@ -346,7 +346,7 @@ This will move block fetching and processing into a worker. By default, this fea
It is at an early experimental stage at the moment, but we plan to enable it by default. :::
-::: info Note
+::: tip Note
This feature is available for Substrate and Cosmos, and soon will be integrated for Avalanche.
:::
diff --git a/docs/es/run_publish/run.md b/docs/es/run_publish/run.md
index bbbd9b160bc..db1a7889079 100644
--- a/docs/es/run_publish/run.md
+++ b/docs/es/run_publish/run.md
@@ -4,7 +4,7 @@ This guide works through how to run a local SubQuery node on your infrastructure
## Using Docker
-An alternative solution is to run a Docker Container, defined by the `docker-compose.yml` file. For a new project that has been just initialised you won't need to change anything here.
+An alternative solution is to run a **Docker Container**, defined by the `docker-compose.yml` file. For a new project that has been just initialised you won't need to change anything here.
Under the project directory run the following command:
@@ -12,7 +12,7 @@ Under the project directory run the following command:
docker-compose pull && docker-compose up
```
-::: info Note It may take some time to download the required packages ([`@subql/node`](https://www.npmjs.com/package/@subql/node), [`@subql/query`](https://www.npmjs.com/package/@subql/query), and Postgres) for the first time but soon you'll see a running SubQuery node. :::
+::: tip Note It may take some time to download the required packages ([`@subql/node`](https://www.npmjs.com/package/@subql/node), [`@subql/query`](https://www.npmjs.com/package/@subql/query), and Postgres) for the first time but soon you'll see a running SubQuery node. :::
## Running an Indexer (subql/node)
@@ -32,90 +32,80 @@ CREATE EXTENSION IF NOT EXISTS btree_gist;
### Installation
-
-
+::: code-tabs
+@tab Substrate/Polkadot
```shell
# NPM
npm install -g @subql/node
```
-
-
+@tab Terra
```shell
# NPM
npm install -g @subql/node-terra
```
-
-
+@tab Avalanche
```shell
# NPM
npm install -g @subql/node-avalanche
```
-
-
+@tab Cosmos
```shell
# NPM
npm install -g @subql/node-cosmos
```
-
-
+@tab Algorand
```shell
# NPM
npm install -g @subql/node-algorand
```
-
-
+:::
::: danger Please note that we **DO NOT** encourage the use of `yarn global` due to its poor dependency management which may lead to an errors down the line. :::
Once installed, you can start a node with the following command:
-
-
+::: code-tabs
+@tab Substrate/Polkadot
```shell
subql-node
```
-
-
+@tab Terra
```shell
subql-node-terra
```
-
-
+@tab Avalanche
```shell
subql-node-avalanche
```
-
-
+@tab Cosmos
```shell
subql-node-cosmos
```
-
-
+@tab Algorand
```shell
subql-node-algorand
```
-
-
+:::
### Key Commands
@@ -123,43 +113,38 @@ The following commands will assist you to complete the configuration of a SubQue
#### Point to local project path
-
-
+::: code-tabs
+@tab Substrate/Polkadot
```shell
subql-node -f your-project-path
```
-
-
+@tab Terra
```shell
subql-node-terra -f your-project-path
```
-
-
+@tab Avalanche
```shell
subql-node-avalanche -f your-project-path
```
-
-
+@tab Cosmos
```shell
subql-node-cosmos -f your-project-path
```
-
-
+@tab Algorand
```shell
subql-node-algorand -f your-project-path
```
-
-
+:::
#### Connect to database
@@ -176,43 +161,38 @@ Depending on the configuration of your Postgres database (e.g. a different datab
#### Specify a configuration file
-
-
+::: code-tabs
+@tab Substrate/Polkadot
```shell
subql-node -c your-project-config.yml
```
-
-
+@tab Terra
```shell
subql-node-terra -c your-project-config.yml
```
-
-
+@tab Avalanche
```shell
subql-node-avalanche -c your-project-config.yml
```
-
-
+@tab Cosmos
```shell
subql-node-cosmos -c your-project-config.yml
```
-
-
+@tab Algorand
```shell
subql-node-algorand -c your-project-config.yml
```
-
-
+:::
This will point the query node to a manifest file which can be in YAML or JSON format.
@@ -230,43 +210,38 @@ When the indexer first indexes the chain, fetching single blocks will significan
#### Run in local mode
-
-
+::: code-tabs
+@tab Substrate/Polkadot
```shell
subql-node -f your-project-path --local
```
-
-
+@tab Terra
```shell
subql-node-terra -f your-project-path --local
```
-
-
+@tab Avalanche
```shell
subql-node-avalanche -f your-project-path --local
```
-
-
+@tab Cosmos
```shell
subql-node-cosmos -f your-project-path --local
```
-
-
+@tab Algorand
```shell
subql-node-algorand -f your-project-path --local
```
-
-
+:::
For debugging purposes, users can run the node in local mode. Switching to local model will create Postgres tables in the default schema `public`.
diff --git a/docs/es/run_publish/subscription.md b/docs/es/run_publish/subscription.md
index 9c4eb48da19..e8edcf2f0ea 100644
--- a/docs/es/run_publish/subscription.md
+++ b/docs/es/run_publish/subscription.md
@@ -6,7 +6,7 @@ SubQuery ahora también soporta Suscripciones Graphql. Al igual que las consulta
Las suscripciones son muy útiles cuando desea que su aplicación cliente cambie datos o muestre algunos nuevos datos tan pronto como se produzca ese cambio o los nuevos datos estén disponibles. Subscriptions allow you to _subscribe_ to your SubQuery project for changes.
-::: info Note Read more about [Subscriptions](https://www.apollographql.com/docs/react/data/subscriptions/). :::
+::: tip Note Read more about [Subscriptions](https://www.apollographql.com/docs/react/data/subscriptions/). :::
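For readers coming from a client application, a hedged TypeScript sketch of wiring up a subscription with Apollo Client over `graphql-ws` (the endpoint URL and the subscription body are placeholders; the exact subscription syntax for your own entities is described in the sections below):

```ts
import { ApolloClient, InMemoryCache, gql } from "@apollo/client";
import { GraphQLWsLink } from "@apollo/client/link/subscriptions";
import { createClient } from "graphql-ws";

// Placeholder endpoint – use your own project's websocket query endpoint.
const wsLink = new GraphQLWsLink(createClient({ url: "wss://api.subquery.network/sq/ORG/PROJECT" }));
const client = new ApolloClient({ link: wsLink, cache: new InMemoryCache() });

// Placeholder subscription body – replace `transfer` with one of your own entities.
client
  .subscribe({
    query: gql`
      subscription {
        transfer {
          id
          mutation_type
          _entity
        }
      }
    `,
  })
  .subscribe({
    next: ({ data }) => console.log("entity changed:", data),
  });
```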
## Cómo suscribirse a una entidad
diff --git a/docs/es/run_publish/upgrade.md b/docs/es/run_publish/upgrade.md
index 09ba29cc68d..a12393e0ce3 100644
--- a/docs/es/run_publish/upgrade.md
+++ b/docs/es/run_publish/upgrade.md
@@ -77,10 +77,10 @@ If you just want to upgrade to the latest indexer ([`@subql/node`](https://www.n
Una vez que el despliegue se ha completado correctamente y nuestros nodos han indexado sus datos de la cadena, podrás conectarte a tu proyecto a través del punto final de la Consulta mostrada en GraphQL.
-![Proyecto en despliegue y sincronización](/assets/img/projects-deploy-sync.png)
+![Proyecto en despliegue y sincronización](/assets/img/projects_deploy_sync.png)
Alternativamente, puedes hacer clic en los tres puntos al lado del título de tu proyecto, y verlo en SubQuery Explorer. There you can use the in browser playground to get started - [read more about how to use our Explorer here](../run_publish/query.md).
-![Projects in SubQuery Explorer](/assets/img/projects-explorer.png)
+![Projects in SubQuery Explorer](/assets/img/projects_explorer.png)
-::: info Note Learn more about the [GraphQL Query language.](./graphql.md) :::
+::: tip Note Learn more about the [GraphQL Query language.](./graphql.md) :::
diff --git a/docs/es/subquery_network/introduction.md b/docs/es/subquery_network/introduction.md
index e6b0f6586bc..75618fbc75b 100644
--- a/docs/es/subquery_network/introduction.md
+++ b/docs/es/subquery_network/introduction.md
@@ -18,22 +18,22 @@ Hay un papel para todos en la red, desde desarrolladores altamente técnicos has
Los consumidores solicitarán a SubQuery Network datos específicos para sus dApps o herramientas, y pagarán una cantidad anunciada de SQT por cada solicitud.
-::: info Note Learn more about [Consumers](./consumers.md). :::
+::: tip Note Learn more about [Consumers](./consumers.md). :::
### Indexadores
Los indexadores ejecutarán y mantendrán proyectos de SubQuery de alta calidad en su propia infraestructura, ejecutando tanto el indexador como el servicio de consultas, y será recompensado en SQT por las peticiones que sirven.
-::: info Note Learn more about [Indexers](./indexers.md). :::
+::: tip Note Learn more about [Indexers](./indexers.md). :::
### Delegadores
Los dedores participarán en la Red apoyando a sus indexadores favoritos para ganar recompensas en función del trabajo que hagan esos indexadores.
-::: info Note Learn more about [Delegators](./delegators.md). :::
+::: tip Note Learn more about [Delegators](./delegators.md). :::
### Arquitectos
Los arquitectos son los constructores de los proyectos SubQuery en los que se ejecuta la red. Autorizan y publican proyectos de SubQuery para la Red para indexar y ejecutar.
-::: info Note Learn more about [how to build your first SubQuery project](../build/introduction.md). :::
+::: tip Note Learn more about [how to build your first SubQuery project](../build/introduction.md). :::
diff --git a/docs/faqs/faqs.md b/docs/faqs/faqs.md
index 7c4848a2a0f..9a35a11769f 100644
--- a/docs/faqs/faqs.md
+++ b/docs/faqs/faqs.md
@@ -2,9 +2,9 @@
## What is SubQuery?
-SubQuery is an open source blockchain data indexer for developers that provides fast, flexible, reliable, and decentralised APIs to power leading multi-chain apps.
+SubQuery is an open source blockchain data indexer for developers that provides fast, flexible, reliable, and decentralised APIs to power leading multi-chain apps.
-Our goal is to save developers' time and money by eliminating the need of building their own indexing solution. Now, they can fully focus on developing their applications. SubQuery helps developers create the decentralised products of the future.
+Our goal is to save developers' time and money by eliminating the need of building their own indexing solution. Now, they can fully focus on developing their applications. SubQuery helps developers create the decentralised products of the future.
, `.yaml` atau `.js` format.
+Dalam contoh v0.2.0 di bawah ini, `network.chaintypes` menunjuk ke file yang memiliki semua tipe kustom yang disertakan, Ini adalah file chainspec standar yang menyatakan tipe spesifik yang didukung oleh blockchain ini di `.json`, `.yaml` atau `.js` format.
- `yml network: genesisHash: '0x91b171bb158e2d3848fa23a9f1c25182fb8e20313b2c1eb49219da7a70ce90c3' endpoint: 'ws://host.kittychain.io/public-ws' chaintypes: file: ./types.json # Filepath relatif ke tempat jenis kustom disimpan ...`
- `yml ... network: endpoint: "ws://host.kittychain.io/public-ws" types: { "KittyIndex": "u32", "Kitty": "[u8; 16]" } # typesChain: { chain: { Type5: 'example' } } # typesSpec: { spec: { Type6: 'example' } } dataSources: - name: runtime kind: substrate/Runtime startBlock: 1 filter: #Optional specName: kitty-chain mapping: handlers: - handler: handleKittyBred kind: substrate/CallHandler filter: module: kitties method: breed success: true`
+::: code-tabs @tab v0.2.0 `yml network: genesisHash: '0x91b171bb158e2d3848fa23a9f1c25182fb8e20313b2c1eb49219da7a70ce90c3' endpoint: 'ws://host.kittychain.io/public-ws' chaintypes: file: ./types.json # Filepath relatif ke tempat jenis kustom disimpan ...`
+@tab v0.0.1 `yml ... network: endpoint: "ws://host.kittychain.io/public-ws" types: { "KittyIndex": "u32", "Kitty": "[u8; 16]" } # typesChain: { chain: { Type5: 'example' } } # typesSpec: { spec: { Type6: 'example' } } dataSources: - name: runtime kind: substrate/Runtime startBlock: 1 filter: #Optional specName: kitty-chain mapping: handlers: - handler: handleKittyBred kind: substrate/CallHandler filter: module: kitties method: breed success: true` :::
Untuk menggunakan TypeScript untuk file jenis rantai Anda, masukkan dalam folder `src` (misalnya `./src/types.ts`), jalankan `yarn build` dan lalu arahkan ke file js yang dihasilkan yang terletak di folder `dist`.
@@ -171,7 +171,7 @@ Hal-hal yang perlu diperhatikan tentang menggunakan file jenis rantai dengan eks
Berikut adalah contoh file jenis rantai `.ts`:
- `ts import { typesBundleDeprecated } from "moonbeam-types-bundle" export default { typesBundle: typesBundleDeprecated }; `
+::: code-tabs @tab types.ts `ts import { typesBundleDeprecated } from "moonbeam-types-bundle" export default { typesBundle: typesBundleDeprecated }; ` :::
## Sumber Data Khusus
@@ -197,6 +197,6 @@ Pengguna dapat menambahkan `filter` pada `dataSources` untuk memutuskan sumber d
Di bawah ini merupakan contoh yang menunjukkan sumber data berbeda untuk jaringan Polkadot dan Kusama.
- `yaml --- network: endpoint: 'wss://polkadot.api.onfinality.io/public-ws' #Buat template untuk menghindari redundansi definitions: mapping: &mymapping handlers: - handler: handleBlock kind: substrate/BlockHandler dataSources: - name: polkadotRuntime kind: substrate/Runtime filter: #Optional specName: polkadot startBlock: 1000 mapping: *mymapping #pakai template disini - name: kusamaRuntime kind: substrate/Runtime filter: specName: kusama startBlock: 12000 mapping: *mymapping # dapat digunakan kembali atau diubah `
+::: code-tabs @tab v0.0.1 `yaml --- network: endpoint: 'wss://polkadot.api.onfinality.io/public-ws' #Buat template untuk menghindari redundansi definitions: mapping: &mymapping handlers: - handler: handleBlock kind: substrate/BlockHandler dataSources: - name: polkadotRuntime kind: substrate/Runtime filter: #Optional specName: polkadot startBlock: 1000 mapping: *mymapping #pakai template disini - name: kusamaRuntime kind: substrate/Runtime filter: specName: kusama startBlock: 12000 mapping: *mymapping # dapat digunakan kembali atau diubah `
-
+:::
diff --git a/docs/id/build/mapping.md b/docs/id/build/mapping.md
index d76de7c853f..089089dd4ee 100644
--- a/docs/id/build/mapping.md
+++ b/docs/id/build/mapping.md
@@ -67,9 +67,9 @@ Tujuan kami adalah mencakup semua sumber data bagi pengguna untuk mapping handle
Ini adalah interface yang saat ini kami dukung:
-- [api.query.<module>.<method>()](https://polkadot.js.org/docs/api/start/api.query) akan mengkueri balok current.
-- [api.query.<module>.<method>.multi()](https://polkadot.js.org/docs/api/start/api.query.multi/#multi-queries-same-type) akan membuat beberapa jenis kueri yang sama di balok saat ini.
-- [api.queryMulti()](https://polkadot.js.org/docs/api/start/api.query.multi/#multi-queries-distinct-types) akan membuat beberapa jenis kueri berbeda di balok saat ini.
+- [api.query.<module>.<method>()](https://polkadot.js.org/docs/api/start/api.query) akan mengkueri balok **current**.
+- [api.query.<module>.<method>.multi()](https://polkadot.js.org/docs/api/start/api.query.multi/#multi-queries-same-type) akan membuat beberapa jenis kueri yang **sama** di balok saat ini.
+- [api.queryMulti()](https://polkadot.js.org/docs/api/start/api.query.multi/#multi-queries-distinct-types) akan membuat beberapa jenis kueri **berbeda** di balok saat ini.
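Sebagai ilustrasi kasar (sketsa TypeScript, bukan bagian dari halaman asli) dari ketiga pola kueri yang didukung di atas: objek global `api` dan `logger` disuntikkan oleh sandbox SubQuery, sedangkan alamat contoh serta item penyimpanan `system.account` dan `timestamp.now` hanyalah asumsi untuk keperluan ilustrasi.

```ts
import { SubstrateEvent } from "@subql/types";

// Sketch only: the three supported read patterns, all evaluated at the current block.
export async function handleExampleEvent(event: SubstrateEvent): Promise<void> {
  const alice = "5GrwvaEF5zXb26Fz9rcQpDWS57CtERHpNehXCPcNoHGKutQY"; // well-known dev address, example only
  const bob = "5FHneW46xGXgs5mUiveU4sbTyGBzmstUspZC92UhjJM694ty"; // well-known dev address, example only

  // 1. api.query.<module>.<method>() – a single storage read.
  const aliceAccount = await api.query.system.account(alice);

  // 2. api.query.<module>.<method>.multi() – several queries of the SAME type.
  const [accA, accB] = await api.query.system.account.multi([alice, bob]);

  // 3. api.queryMulti() – several queries of DIFFERENT types in one call.
  const [now, aliceAgain] = await api.queryMulti([
    api.query.timestamp.now, // entry without arguments
    [api.query.system.account, alice], // entry with an argument
  ]);

  logger.info(
    `block #${event.block.block.header.number.toString()}: ` +
      `${aliceAccount.toString()} | ${accA.toString()} | ${accB.toString()} | ts ${now.toString()} | ${aliceAgain.toString()}`
  );
}
```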
Ini adalah interface yang kami **TIDAK** dukung saat ini:
diff --git a/docs/id/build/substrate-evm.md b/docs/id/build/substrate-evm.md
index 25c35568579..49a014dbe40 100644
--- a/docs/id/build/substrate-evm.md
+++ b/docs/id/build/substrate-evm.md
@@ -74,7 +74,7 @@ Bekerja dengan cara yang sama seperti [substrate/EventHandler](../create/mapping
| ------ | ------------ | --------------------------------------------------------------- | ---------------------------------------------------------------------------------------------------------------------------------------------------- |
| topics | String array | Transfer(address indexed from,address indexed to,uint256 value) | Filter topik mengikuti filter log Ethereum JSON-RPC, dokumentasi selengkapnya dapat ditemukan [di sini](https://docs.ethers.io/v5/concepts/events/). |
-Catatan tentang topik:
+**Catatan tentang topik:**
Ada beberapa peningkatan dari filter log dasar:
- Topik tidak perlu diisi 0
@@ -97,17 +97,17 @@ dataSources:
- kind: substrate/Moonbeam
startBlock: 752073
processor:
- file: './node_modules/@subql/contract-processors/dist/moonbeam.js'
+ file: "./node_modules/@subql/contract-processors/dist/moonbeam.js"
options:
# Must be a key of assets
abi: erc20
# Alamat kontrak (atau penerima jika transfer) untuk difilter, jika `null` seharusnya untuk pembuatan kontrak
- address: '0x6bd193ee6d2104f14f94e2ca6efefae561a4334b'
+ address: "0x6bd193ee6d2104f14f94e2ca6efefae561a4334b"
assets:
erc20:
- file: './erc20.abi.json'
+ file: "./erc20.abi.json"
mapping:
- file: './dist/index.js'
+ file: "./dist/index.js"
handlers:
- handler: handleMoonriverEvent
kind: substrate/MoonbeamEvent
@@ -122,7 +122,7 @@ dataSources:
# function: '0x7ff36ab500000000000000000000000000000000000000000000000000000000'
# function: approve(address,uint256)
function: approve(address to,uint256 value)
- from: '0x6bd193ee6d2104f14f94e2ca6efefae561a4334b'
+ from: "0x6bd193ee6d2104f14f94e2ca6efefae561a4334b"
```
## Batasan yang Diketahui
diff --git a/docs/id/faqs/faqs.md b/docs/id/faqs/faqs.md
index f2d629fa149..dcbb92d6d82 100644
--- a/docs/id/faqs/faqs.md
+++ b/docs/id/faqs/faqs.md
@@ -16,7 +16,7 @@ SubQuery juga menyediakan hosting proyek kelas produksi gratis untuk para pengem
**SubQuery Network**
-Jaringan SubQuery memungkinkan pengembang untuk sepenuhnya mendesentralisasi tumpukan infrastruktur mereka. Ini adalah layanan data yang paling terbuka, berkinerja, andal, dan dapat diskalakan untuk dApps. Jaringan SubQuery mengindeks dan melayani data ke komunitas global dengan cara yang berinsentif dan dapat diverifikasi. Setelah memublikasikan proyek Anda ke Jaringan SubQuery, siapa pun dapat mengindeks dan menghostingnya - menyediakan data kepada pengguna di seluruh dunia dengan lebih cepat dan andal.
+Jaringan SubQuery memungkinkan pengembang untuk sepenuhnya mendesentralisasi tumpukan infrastruktur mereka. Ini adalah layanan data yang paling terbuka, berkinerja, andal, dan dapat diskalakan untuk dApps. Jaringan SubQuery mengindeks dan melayani data ke komunitas global dengan cara yang berinsentif dan dapat diverifikasi. Setelah memublikasikan proyek Anda ke Jaringan SubQuery, siapa pun dapat mengindeks dan menghostingnya - menyediakan data kepada pengguna di seluruh dunia dengan lebih cepat dan andal.
Informasi lebih lanjut [di sini](/subquery_network/introduction.md).
@@ -26,7 +26,7 @@ Cara terbaik untuk memulai SubQuery adalah mencoba [Hello World Tutorial](/asset
## Bagaimana saya bisa berkontribusi atau memberi masukan ke SubQuery?
-Kami menyukai kontribusi dan umpan balik dari komunitas. Untuk menyumbangkan kode, fork repositori yang Anda minati dan buat perubahan Anda. Kemudian kirimkan PR atau Pull Request. Jangan lupa untuk mengujinya juga. Lihat juga pedoman kontribusi kami.
+Kami menyukai kontribusi dan umpan balik dari komunitas. Untuk menyumbangkan kode, fork repositori yang Anda minati dan buat perubahan Anda. Kemudian kirimkan PR atau Pull Request. Jangan lupa untuk mengujinya juga. Lihat juga pedoman [kontribusi kami](../miscellaneous/contributing.html).
Untuk memberi umpan balik, hubungi kami di hello@subquery.network atau buka [discord channel](https://discord.com/invite/78zg8aBSMG) kami.
@@ -76,7 +76,6 @@ subql-node -f . --force-clean --subquery-name=
Perhatikan bahwa disarankan untuk menggunakan `--force-clean` saat mengubah `startBlock` dalam manifes proyek (`project.yaml`) untuk memulai pengindeksan ulang dari blok yang dikonfigurasi. Jika `startBlock` diubah tanpa `--force-clean` proyek, maka pengindeks akan melanjutkan pengindeksan dengan `startBlock` yang dikonfigurasi sebelumnya.
-
## Bagaimana saya bisa mengoptimalkan proyek saya untuk mempercepatnya?
Performa merupakan faktor krusial dalam setiap proyek. Untungnya, ada beberapa hal yang bisa Anda lakukan untuk memperbaikinya. Berikut ini daftar beberapa saran:
@@ -89,13 +88,13 @@ Performa merupakan faktor krusial dalam setiap proyek. Untungnya, ada beberapa h
- Atur blok awal ke saat kontrak diinisialisasi.
- Selalu gunakan [dictionary](../tutorials_examples/dictionary.html#how-does-a-subquery-dictionary-work) (kami dapat membantu membuatnya untuk jaringan baru Anda).
- Optimalkan desain skema Anda, buatlah sesederhana mungkin.
- - Cobalah untuk mengurangi bidang dan kolom yang tidak perlu.
- - Buat indeks sesuai kebutuhan.
+ - Cobalah untuk mengurangi bidang dan kolom yang tidak perlu.
+ - Buat indeks sesuai kebutuhan.
- Gunakan pemrosesan paralel/batch sesering mungkin.
- - Gunakan `api.queryMulti()` untuk mengoptimalkan panggilan API Polkadot di dalam fungsi pemetaan dan menanyakannya secara paralel. Ini adalah cara yang lebih cepat daripada loop.
- - Gunakan `Promise.all()`. Dalam kasus beberapa fungsi async, lebih baik mengeksekusinya dan menyelesaikannya secara paralel.
- - Jika Anda ingin membuat banyak entitas dalam satu handler, Anda dapat menggunakan `store.bulkCreate(entityName: string, entities: Entity[])`. Anda bisa membuatnya secara paralel, tidak perlu melakukannya satu per satu.
+ - Gunakan `api.queryMulti()` untuk mengoptimalkan panggilan API Polkadot di dalam fungsi pemetaan dan menanyakannya secara paralel. Ini adalah cara yang lebih cepat daripada loop.
+ - Gunakan `Promise.all()`. Dalam kasus beberapa fungsi async, lebih baik mengeksekusinya dan menyelesaikannya secara paralel.
+ - Jika Anda ingin membuat banyak entitas dalam satu handler, Anda dapat menggunakan `store.bulkCreate(entityName: string, entities: Entity[])`. Anda bisa membuatnya secara paralel, tidak perlu melakukannya satu per satu.
- Membuat panggilan API untuk menanyakan state bisa lambat. Anda bisa mencoba untuk meminimalkan pemanggilan jika memungkinkan dan menggunakan data `ekstrinsik/transaksi/event`.
- Gunakan `worker threads` untuk memindahkan pengambilan blok dan pemrosesan blok ke dalam thread pekerja sendiri. Ini bisa mempercepat pengindeksan hingga 4 kali lipat (tergantung pada proyek tertentu). Anda bisa dengan mudah mengaktifkannya dengan menggunakan flag `--workers=<number>`. Perhatikan bahwa jumlah core CPU yang tersedia sangat membatasi penggunaan thread pekerja. Untuk saat ini, ini hanya tersedia untuk Substrate dan Cosmos dan akan segera diintegrasikan untuk Avalanche.
- Perhatikan bahwa `JSON.stringify` tidak mendukung native `BigInts`. Pustaka logging kami akan melakukan hal ini secara internal jika Anda mencoba untuk mencatat sebuah objek. Kami sedang mencari solusi untuk ini.
-- Gunakan filter `modulo` yang mudah digunakan untuk menjalankan handler hanya sekali ke blok tertentu. Filter ini memungkinkan penanganan sejumlah blok tertentu, yang sangat berguna untuk mengelompokkan dan menghitung data pada interval yang ditetapkan. Sebagai contoh, jika modulo diatur ke 50, block handler akan berjalan pada setiap 50 blok. Ini memberikan lebih banyak kontrol atas data pengindeksan kepada pengembang dan dapat diimplementasikan seperti di bawah ini dalam manifes proyek Anda.
\ No newline at end of file
+- Gunakan filter `modulo` yang mudah digunakan untuk menjalankan handler hanya sekali ke blok tertentu. Filter ini memungkinkan penanganan sejumlah blok tertentu, yang sangat berguna untuk mengelompokkan dan menghitung data pada interval yang ditetapkan. Sebagai contoh, jika modulo diatur ke 50, block handler akan berjalan pada setiap 50 blok. Ini memberikan lebih banyak kontrol atas data pengindeksan kepada pengembang dan dapat diimplementasikan seperti di bawah ini dalam manifes proyek Anda.
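(The manifest snippet that the modulo bullet above refers to is not included in this hunk.) Separately, as a rough, hedged sketch of the batching tips earlier in this list (`Promise.all()` plus `store.bulkCreate()`), the TypeScript below shows a helper that a block handler might call with a block's transfer events. The `Transfer` entity and its fields are placeholders borrowed from the starter project, not something this FAQ defines.

```ts
import { SubstrateEvent } from "@subql/types";
import { Transfer } from "../types";

// Sketch only: build many entities in parallel, then persist them in one bulk write.
export async function saveTransfersInBulk(events: SubstrateEvent[]): Promise<void> {
  const transfers = await Promise.all(
    events.map(async (event) => {
      const [from, to, amount] = event.event.data;
      const transfer = new Transfer(
        `${event.block.block.header.number.toNumber()}-${event.idx}`
      );
      transfer.blockNumber = event.block.block.header.number.toBigInt();
      transfer.from = from.toString();
      transfer.to = to.toString();
      transfer.amount = BigInt(amount.toString());
      return transfer;
    })
  );

  // One bulk call instead of a save() per entity.
  await store.bulkCreate("Transfer", transfers);
}
```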
diff --git a/docs/id/miscellaneous/contributing.md b/docs/id/miscellaneous/contributing.md
index 5ecbdf0d86f..2afb238d60c 100644
--- a/docs/id/miscellaneous/contributing.md
+++ b/docs/id/miscellaneous/contributing.md
@@ -2,7 +2,7 @@
Selamat datang dan terima kasih banyak telah mempertimbangkan untuk berkontribusi pada proyek SubQuery ini! Bersama-sama kita dapat membuka jalan menuju masa depan yang lebih terdesentralisasi.
-:::: info Catatan Dokumentasi ini secara aktif dikelola oleh tim SubQuery. Kami juga menerima kontribusi. Anda dapat melakukannya dengan melakukan forking proyek GitHub kami dan membuat perubahan pada semua file markdown dokumentasi di bawah direktori `docs`. :::
+::: tip Catatan Dokumentasi ini secara aktif dikelola oleh tim SubQuery. Kami juga menerima kontribusi. Anda dapat melakukannya dengan melakukan forking proyek GitHub kami dan membuat perubahan pada semua file markdown dokumentasi di bawah direktori `docs`. :::
Berikut ini adalah seperangkat pedoman (bukan aturan) untuk berkontribusi pada SubQuery. Mengikuti panduan ini akan membantu kami membuat proses kontribusi menjadi mudah dan efektif untuk semua orang yang terlibat. Ini juga menyampaikan bahwa Anda setuju untuk menghormati waktu dari developer yang mengelola dan mengembangkan proyek ini. Sebagai imbalannya, kami akan membalas rasa hormat itu dengan mengatasi masalah Anda, mempertimbangkan perubahan, berkolaborasi dalam peningkatan, dan membantu Anda menyelesaikan pull request Anda.
@@ -14,8 +14,8 @@ Kami menganggap serius proyek dan tanggung jawab komunitas open source kami dan
Kontribusi ke repositori kami dilakukan melalui Issue and Pull Request (PR). Beberapa pedoman umum yang mencakup keduanya:
-* Cari Issue and PR yang ada terlebih dahulu sebelum membuat milik Anda sendiri.
-* Kami bekerja keras untuk memastikan issue ditangani dengan segera, tetapi tergantung pada dampaknya, mungkin bisa memakan waktu cukup lama untuk menyelidiki akar masalahnya. Sebuah @ sebutan ramah di utas komentar kepada pengirim atau kontributor dapat membantu menarik perhatian jika issue Anda terblokir.
+- Cari Issue and PR yang ada terlebih dahulu sebelum membuat milik Anda sendiri.
+- Kami bekerja keras untuk memastikan issue ditangani dengan segera, tetapi tergantung pada dampaknya, mungkin bisa memakan waktu cukup lama untuk menyelidiki akar masalahnya. Sebuah @ sebutan ramah di utas komentar kepada pengirim atau kontributor dapat membantu menarik perhatian jika issue Anda terblokir.
## Bagaimana Berkontribusi
@@ -23,32 +23,32 @@ Kontribusi ke repositori kami dilakukan melalui Issue and Pull Request (PR). Beb
Bug dilacak sebagai issue GitHub. Saat mencatatkan log issue, jelaskan masalahnya dan sertakan detail tambahan untuk membantu pengelola mereproduksi masalah itu:
-* Gunakan judul issue yang jelas dan deskriptif untuk mengidentifikasi masalah.
-* Jelaskan langkah-langkah yang akurat untuk mereproduksi masalah.
-* Jelaskan perilaku yang Anda amati setelah mengikuti langkah-langkah tersebut.
-* Jelaskan perilaku mana yang Anda harapkan untuk dilihat dan mengapa.
-* Sertakan screenshot jika memungkinkan.
+- Gunakan judul issue yang jelas dan deskriptif untuk mengidentifikasi masalah.
+- Jelaskan langkah-langkah yang akurat untuk mereproduksi masalah.
+- Jelaskan perilaku yang Anda amati setelah mengikuti langkah-langkah tersebut.
+- Jelaskan perilaku mana yang Anda harapkan untuk dilihat dan mengapa.
+- Sertakan screenshot jika memungkinkan.
### Mengirimkan Pull Request
Secara umum, kami mengikuti alur kerja "fork-and-pull" Git:
-* Fork repositori ke akun Github Anda sendiri.
-* Clone proyek ke mesin Anda.
-* Buat branch secara lokal dengan nama yang ringkas namun deskriptif.
-* Commit perubahan ke branch.
-* Ikuti pedoman pemformatan dan testing apa pun yang khusus untuk repo ini.
-* Push perubahan ke fork Anda.
-* Buka sebuah PR di repositori kami.
+- Fork repositori ke akun Github Anda sendiri.
+- Clone proyek ke mesin Anda.
+- Buat branch secara lokal dengan nama yang ringkas namun deskriptif.
+- Commit perubahan ke branch.
+- Ikuti pedoman pemformatan dan testing apa pun yang khusus untuk repo ini.
+- Push perubahan ke fork Anda.
+- Buka sebuah PR di repositori kami.
## Konvensi Coding
### Pesan Git Commit
-* Gunakan bentuk waktu kini ("Tambahkan fitur" bukan "Fitur yang ditambahkan").
-* Gunakan suasana perintah ("Pindahkan kursor ke..." bukan "Memindahkan kursor ke...").
-* Batasi baris pertama hingga 72 karakter atau kurang.
+- Gunakan bentuk waktu kini ("Tambahkan fitur" bukan "Fitur yang ditambahkan").
+- Gunakan suasana perintah ("Pindahkan kursor ke..." bukan "Memindahkan kursor ke...").
+- Batasi baris pertama hingga 72 karakter atau kurang.
### JavaScript Styleguide
-* Semua kode JavaScript diverifikasi dengan Prettier dan ESLint.
+- Semua kode JavaScript diverifikasi dengan Prettier dan ESLint.
diff --git a/docs/id/quickstart/helloworld-localhost.md b/docs/id/quickstart/helloworld-localhost.md
index d8b52f91c1a..08091c67bcc 100644
--- a/docs/id/quickstart/helloworld-localhost.md
+++ b/docs/id/quickstart/helloworld-localhost.md
@@ -88,8 +88,8 @@ cd subqlHelloWorld
Sekarang lakukan instal yarn atau node untuk menginstal berbagai dependencies.
- ```shell yarn install ```
- ```bash npm install ```
+::: code-tabs @tab:active yarn `shell yarn install `
+@tab npm `bash npm install ` :::
Sebagai Contoh `yarn install`
@@ -109,8 +109,8 @@ berhasil Menyimpan file kunci.
Sekarang jalankan `yarn codegen` untuk menghasilkan TypeScript dari skema GraphQL.
- ```shell yarn codegen ```
- ```bash npm run-script codegen ```
+::: code-tabs @tab:active yarn `shell yarn codegen `
+@tab npm `bash npm run-script codegen ` :::
An example of `yarn codegen`
@@ -133,8 +133,8 @@ $ ./node_modules/.bin/subql codegen
Langkah selanjutnya adalah membuat kode dengan `yarn build`.
- ```shell yarn build ```
- ```bash npm run-script build ```
+::: code-tabs @tab:active yarn `shell yarn build `
+@tab npm `bash npm run-script build ` :::
An example of `yarn build`
diff --git a/docs/id/quickstart/quickstart-avalanche.md b/docs/id/quickstart/quickstart-avalanche.md
index 1ab6b9c72d9..0b8078789b9 100644
--- a/docs/id/quickstart/quickstart-avalanche.md
+++ b/docs/id/quickstart/quickstart-avalanche.md
@@ -59,8 +59,8 @@ Setelah proses inisialisasi selesai, Anda akan melihat folder dengan nama proyek
Terakhir, di bawah direktori proyek, jalankan perintah berikut untuk menginstal dependensi proyek baru.
- ```shell cd PROJECT_NAME yarn install ```
- ```shell cd PROJECT_NAME npm install ```
+::: code-tabs @tab:active yarn `shell cd PROJECT_NAME yarn install `
+@tab npm `shell cd PROJECT_NAME npm install ` :::
## Membuat Perubahan pada Proyek Anda
@@ -92,8 +92,8 @@ type PangolinApproval @entity {
**Penting: Saat Anda membuat perubahan apa pun pada file skema, pastikan Anda membuat ulang direktori tipe Anda. Lakukan ini sekarang.**
- ```shell yarn codegen ```
- ```shell npm run-script codegen ```
+::: code-tabs @tab:active yarn `shell yarn codegen `
+@tab npm `shell npm run-script codegen ` :::
Anda akan menemukan model yang dihasilkan di direktori `/src/types/models`. Untuk informasi lebih lanjut tentang file `schema.graphql`, lihat dokumentasi kami di bawah [Build/GraphQL Schema](../build/graphql.md)
@@ -169,7 +169,7 @@ Untuk informasi lebih lanjut tentang fungsi pemetaan, lihat dokumentasi kami di
Untuk menjalankan Proyek SubQuery baru Anda, pertama-tama kita perlu membangun pekerjaan kita. Jalankan perintah build dari direktori root proyek.
- ```shell yarn build ``` ```shell npm run-script build ```
+::: code-tabs @tab:active yarn `shell yarn build ` @tab npm `shell npm run-script build ` :::
**Penting: Setiap kali Anda membuat perubahan pada fungsi pemetaan, Anda harus membangun kembali proyek Anda**
@@ -183,13 +183,11 @@ Semua konfigurasi yang mengontrol bagaimana node SubQuery dijalankan didefinisik
Di bawah direktori proyek jalankan perintah berikut:
- ```shell yarn start:docker ``` ```shell npm run-script start:docker ```
+::: code-tabs @tab:active yarn `shell yarn start:docker ` @tab npm `shell npm run-script start:docker ` :::
Mungkin perlu beberapa saat untuk mengunduh paket yang diperlukan ([`@subql/node`](https://www.npmjs.com/package/@subql/node),
-`@subql/query`7 >, dan Postgres) untuk pertama kalinya tetapi segera Anda akan melihat node SubQuery yang sedang berjalan. Sabar pada proses di sini.
-
-
+[`@subql/query`](https://www.npmjs.com/package/@subql/query), dan Postgres) untuk pertama kalinya tetapi segera Anda akan melihat node SubQuery yang sedang berjalan. Sabar pada proses di sini.
### Kueri Proyek Anda
@@ -199,8 +197,6 @@ Anda akan melihat taman bermain GraphQL ditampilkan di explorer dan skema yang s
Untuk proyek pemula SubQuery baru, Anda dapat mencoba kueri berikut untuk mengetahui cara kerjanya atau [pelajari lebih lanjut tentang bahasa Kueri GraphQL](../run_publish/graphql.md).
-
-
```graphql
query {
pangolinApprovals(first: 5) {
@@ -217,17 +213,12 @@ query {
}
```
-
-
-
### Publikasikan Proyek SubQuery Anda
SubQuery menyediakan layanan terkelola gratis saat Anda dapat menerapkan proyek baru Anda. Anda dapat menerapkannya ke [Proyek SubQuery](https://project.subquery.network) dan menanyakannya menggunakan [Explorer](https://explorer.subquery.network) kami.
[Baca panduan untuk memublikasikan proyek baru Anda ke Proyek SubQuery](../run_publish/publish.md), **Perhatikan bahwa Anda harus menerapkan melalui IPFS**.
-
-
## Langkah selanjutnya
Selamat, Anda sekarang memiliki proyek SubQuery yang berjalan secara lokal yang menerima permintaan GraphQL API untuk data persetujuan (approval) token dari Pangolin.
diff --git a/docs/id/quickstart/quickstart-cosmos.md b/docs/id/quickstart/quickstart-cosmos.md
index 7236f8edf11..b0c4be4d8e8 100644
--- a/docs/id/quickstart/quickstart-cosmos.md
+++ b/docs/id/quickstart/quickstart-cosmos.md
@@ -44,8 +44,8 @@ Setelah proses inisialisasi selesai, Anda akan melihat folder dengan nama proyek
Terakhir, di bawah direktori proyek, jalankan perintah berikut untuk menginstal dependensi proyek baru.
- ```shell cd PROJECT_NAME yarn install ```
- ```shell cd PROJECT_NAME npm install ```
+::: code-tabs @tab:active yarn `shell cd PROJECT_NAME yarn install `
+@tab npm `shell cd PROJECT_NAME npm install ` :::
## Membuat Perubahan pada Proyek Anda
@@ -75,8 +75,8 @@ type Vote @entity {
**Penting: Saat Anda membuat perubahan apa pun pada file skema, pastikan Anda membuat ulang direktori tipe Anda. Lakukan ini sekarang.**
- ```shell yarn codegen ```
- ```shell npm run-script codegen ```
+::: code-tabs @tab:active yarn `shell yarn codegen `
+@tab npm `shell npm run-script codegen ` :::
Anda akan menemukan model yang dihasilkan di direktori `/src/types/models`. Untuk informasi lebih lanjut tentang file `schema.graphql`, lihat dokumentasi kami di bawah [Build/GraphQL Schema](../build/graphql.md)
@@ -145,7 +145,7 @@ Untuk informasi lebih lanjut tentang fungsi pemetaan, lihat dokumentasi kami di
Untuk menjalankan Proyek SubQuery baru Anda, pertama-tama kita perlu membangun pekerjaan kita. Jalankan perintah build dari direktori root proyek.
- ```shell yarn build ``` ```shell npm run-script build ```
+::: code-tabs @tab:active yarn `shell yarn build ` @tab npm `shell npm run-script build ` :::
**Penting: Setiap kali Anda membuat perubahan pada fungsi pemetaan, Anda harus membangun kembali proyek Anda**
@@ -159,13 +159,11 @@ Semua konfigurasi yang mengontrol bagaimana node SubQuery dijalankan didefinisik
Di bawah direktori proyek jalankan perintah berikut:
- ```shell yarn start:docker ``` ```shell npm run-script start:docker ```
+::: code-tabs @tab:active yarn `shell yarn start:docker ` @tab npm `shell npm run-script start:docker ` :::
Mungkin perlu beberapa saat untuk mengunduh paket yang diperlukan ([`@subql/node`](https://www.npmjs.com/package/@subql/node),
-`@subql/query`7 >, dan Postgres) untuk pertama kalinya tetapi segera Anda akan melihat node SubQuery yang sedang berjalan. Sabar pada proses di sini.
-
-
+[`@subql/query`](https://www.npmjs.com/package/@subql/query), dan Postgres) untuk pertama kalinya tetapi segera Anda akan melihat node SubQuery yang sedang berjalan. Sabar pada proses di sini.
### Kueri Proyek Anda
@@ -175,14 +173,11 @@ Anda akan melihat taman bermain GraphQL ditampilkan di explorer dan skema yang s
Untuk proyek pemula SubQuery baru, Anda dapat mencoba kueri berikut untuk mengetahui cara kerjanya atau [pelajari lebih lanjut tentang bahasa Kueri GraphQL](../run_publish/graphql.md).
-
-
```graphql
query {
- votes(
+ votes(
first: 5
- orderBy: BLOCK_HEIGHT_DESC
- # filter: {proposalID: {equalTo: "4"}}
+ orderBy: BLOCK_HEIGHT_DESC # filter: {proposalID: {equalTo: "4"}}
) {
nodes {
id
@@ -194,19 +189,14 @@ query {
}
```
-
Anda dapat melihat kode akhir proyek ini di [https://github.com/jamesbayly/juno-terra-developer-fund-votes](https://github.com/jamesbayly/juno-terra-developer-fund-votes)
-
-
### Publikasikan Proyek SubQuery Anda
SubQuery menyediakan layanan terkelola gratis saat Anda dapat menerapkan proyek baru Anda. Anda dapat menerapkannya ke [Proyek SubQuery](https://project.subquery.network) dan menanyakannya menggunakan [Explorer](https://explorer.subquery.network) kami.
[Baca panduan untuk memublikasikan proyek baru Anda ke Proyek SubQuery](../run_publish/publish.md)
-
-
## Langkah selanjutnya
Selamat, Anda sekarang memiliki proyek SubQuery yang berjalan secara lokal yang menerima permintaan GraphQL API untuk data voting dari Juno.
diff --git a/docs/id/quickstart/quickstart-polkadot.md b/docs/id/quickstart/quickstart-polkadot.md
index 36abb554644..74021b97fa5 100644
--- a/docs/id/quickstart/quickstart-polkadot.md
+++ b/docs/id/quickstart/quickstart-polkadot.md
@@ -43,10 +43,10 @@ subql init
Anda akan ditanyai pertanyaan tertentu saat proyek SubQuery diinisialisasi:
- Nama Proyek: Nama untuk proyek SubQuery Anda
-- Jaringan: Jaringan blockchain yang akan diindeks oleh proyek SubQuery ini. Gunakan tombol panah untuk memilih dari opsi yang tersedia. Untuk panduan ini, kami akan menggunakan *"Substrat"*
-- Jaringan: Jaringan blockchain yang akan diindeks oleh proyek SubQuery ini. Gunakan tombol panah untuk memilih dari opsi yang tersedia. Untuk panduan ini, kami akan menggunakan *"Polkadot"*
-- Template: Pilih template proyek SubQuery yang akan memberikan titik awal untuk memulai pengembangan. Sebaiknya pilih proyek *"subql-starter"*.
-- Titik akhir RPC: Berikan URL HTTPS ke titik akhir RPC yang sedang berjalan yang akan digunakan secara default untuk proyek ini. Anda dapat dengan cepat mengakses titik akhir publik untuk jaringan Polkadot yang berbeda atau bahkan membuat simpul khusus pribadi Anda sendiri menggunakan [OnFinality](https://app.onfinality.io) atau cukup gunakan titik akhir Polkadot default. Node RPC ini harus berupa node arsip (memiliki status rantai penuh). Untuk panduan ini kami akan menggunakan nilai default *"https://polkadot.api.onfinality.io"*
+- Keluarga Jaringan: Keluarga jaringan blockchain yang akan diindeks oleh proyek SubQuery ini. Gunakan tombol panah untuk memilih dari opsi yang tersedia. Untuk panduan ini, kami akan menggunakan _"Substrat"_
+- Jaringan: Jaringan blockchain yang akan diindeks oleh proyek SubQuery ini. Gunakan tombol panah untuk memilih dari opsi yang tersedia. Untuk panduan ini, kami akan menggunakan _"Polkadot"_
+- Template: Pilih template proyek SubQuery yang akan memberikan titik awal untuk memulai pengembangan. Sebaiknya pilih proyek _"subql-starter"_.
+- Titik akhir RPC: Berikan URL HTTPS ke titik akhir RPC yang sedang berjalan yang akan digunakan secara default untuk proyek ini. Anda dapat dengan cepat mengakses titik akhir publik untuk jaringan Polkadot yang berbeda atau bahkan membuat simpul khusus pribadi Anda sendiri menggunakan [OnFinality](https://app.onfinality.io) atau cukup gunakan titik akhir Polkadot default. Node RPC ini harus berupa node arsip (memiliki status rantai penuh). Untuk panduan ini kami akan menggunakan nilai default _"https://polkadot.api.onfinality.io"_
- Repositori Git: Berikan URL Git ke repo tempat proyek SubQuery ini akan dihosting (ketika dihosting di SubQuery Explorer) atau terima default yang disediakan.
- Penulis: Masukkan pemilik proyek SubQuery ini di sini (mis. nama Anda!) atau terima default yang disediakan.
- Deskripsi: Berikan paragraf singkat tentang proyek Anda yang menjelaskan data apa yang dikandungnya dan apa yang dapat dilakukan pengguna dengannya atau menerima default yang disediakan.
@@ -57,8 +57,8 @@ Setelah proses inisialisasi selesai, Anda akan melihat bahwa folder dengan nama
Terakhir, di bawah direktori proyek, jalankan perintah berikut untuk menginstal dependensi proyek baru.
- ```shell cd PROJECT_NAME yarn install ```
- ```shell cd PROJECT_NAME npm install ```
+::: code-tabs @tab:active yarn `shell cd PROJECT_NAME yarn install `
+@tab npm `shell cd PROJECT_NAME npm install ` :::
## Membuat Perubahan pada Proyek Anda
@@ -88,8 +88,8 @@ type Transfer @entity {
**Penting: Saat Anda membuat perubahan apa pun pada file skema, pastikan Anda membuat ulang direktori tipe Anda.**
- ```shell yarn codegen ```
- ```shell npm run-script codegen ```
+::: code-tabs @tab:active yarn `shell yarn codegen `
+@tab npm `shell npm run-script codegen ` :::
Anda akan menemukan model yang dihasilkan di `/src/types/models` directory. Untuk informasi lebih lanjut tentang file `schema.graphql`, lihat dokumentasi kami di bawah [Build/GraphQL Schema](../build/graphql.md)
@@ -133,22 +133,22 @@ import { Transfer } from "../types";
import { Balance } from "@polkadot/types/interfaces";
export async function handleTransfer(event: SubstrateEvent): Promise<void> {
- // Dapatkan data dari event
- // Peristiwa balances.transfer memiliki muatan berikut \[from, to, value\]
- // logger.info(JSON.stringify(event));
- const from = event.event.data[0];
- const to = event.event.data[1];
- const amount = event.event.data[2];
-
- // Buat entitas transfer baru
- const transfer = new Transfer(
- `${event.block.block.header.number.toNumber()}-${event.idx}`,
- );
- transfer.blockNumber = event.block.block.header.number.toBigInt();
- transfer.from = from.toString();
- transfer.to = to.toString();
- transfer.amount = (amount as Balance).toBigInt();
- await transfer.save();
+ // Dapatkan data dari event
+ // Peristiwa balances.transfer memiliki muatan berikut \[from, to, value\]
+ // logger.info(JSON.stringify(event));
+ const from = event.event.data[0];
+ const to = event.event.data[1];
+ const amount = event.event.data[2];
+
+ // Buat entitas transfer baru
+ const transfer = new Transfer(
+ `${event.block.block.header.number.toNumber()}-${event.idx}`
+ );
+ transfer.blockNumber = event.block.block.header.number.toBigInt();
+ transfer.from = from.toString();
+ transfer.to = to.toString();
+ transfer.amount = (amount as Balance).toBigInt();
+ await transfer.save();
}
```
@@ -160,7 +160,7 @@ Untuk informasi lebih lanjut tentang fungsi pemetaan, lihat dokumentasi kami di
Untuk menjalankan Proyek SubQuery baru Anda, pertama-tama kita perlu membangun pekerjaan kita. Jalankan perintah build dari direktori root proyek.
- ```shell yarn build ``` ```shell npm run-script build ```
+::: code-tabs @tab:active yarn `shell yarn build ` @tab npm `shell npm run-script build ` :::
**Penting: Setiap kali Anda membuat perubahan pada fungsi pemetaan, Anda harus membangun kembali proyek Anda**
@@ -174,7 +174,7 @@ Semua konfigurasi yang mengontrol bagaimana node SubQuery dijalankan didefinisik
Di bawah direktori proyek jalankan perintah berikut:
- ```shell yarn start:docker ``` ```shell npm run-script start:docker ```
+::: code-tabs @tab:active yarn `shell yarn start:docker ` @tab npm `shell npm run-script start:docker ` :::
Mungkin perlu beberapa saat untuk mengunduh paket yang diperlukan ([`@subql/node`](https://www.npmjs.com/package/@subql/node), [`@subql/query`](https://www.npmjs.com/package/@subql/query), and Postgres) untuk pertama kalinya tetapi segera Anda akan melihat node SubQuery yang sedang berjalan.
@@ -189,10 +189,7 @@ Untuk proyek pemula SubQuery baru, coba kueri berikut untuk memahami cara kerjan
```graphql
{
query {
- transfers(
- first: 10,
- orderBy: AMOUNT_DESC
- ) {
+ transfers(first: 10, orderBy: AMOUNT_DESC) {
nodes {
id
amount
diff --git a/docs/id/quickstart/quickstart-terra.md b/docs/id/quickstart/quickstart-terra.md
index caa068c580b..a2669b6837f 100644
--- a/docs/id/quickstart/quickstart-terra.md
+++ b/docs/id/quickstart/quickstart-terra.md
@@ -45,11 +45,11 @@ subql init
Anda akan ditanyai pertanyaan tertentu saat proyek SubQuery diinisialisasi:
- Nama Proyek: Nama untuk proyek SubQuery Anda
-- Keluarga Jaringan: Keluarga jaringan blockchain layer-1 yang proyek SubQuery ini akan dikembangkan untuk diindeks, gunakan tombol panah pada keyboard Anda untuk memilih dari opsi, untuk panduan ini kita akan menggunakan *"Terra"*
-- Jaringan: Jaringan spesifik yang akan diindeks oleh proyek SubQuery ini, gunakan tombol panah pada keyboard Anda untuk memilih dari opsi, untuk panduan ini kami akan menggunakan *"Terra"*
-- Template: Pilih template proyek SubQuery yang akan memberikan titik awal untuk memulai pengembangan, sebaiknya pilih *"Proyek Pemula"*
+- Keluarga Jaringan: Keluarga jaringan blockchain layer-1 yang proyek SubQuery ini akan dikembangkan untuk diindeks, gunakan tombol panah pada keyboard Anda untuk memilih dari opsi, untuk panduan ini kita akan menggunakan _"Terra"_
+- Jaringan: Jaringan spesifik yang akan diindeks oleh proyek SubQuery ini, gunakan tombol panah pada keyboard Anda untuk memilih dari opsi, untuk panduan ini kami akan menggunakan _"Terra"_
+- Template: Pilih template proyek SubQuery yang akan memberikan titik awal untuk memulai pengembangan, sebaiknya pilih _"Proyek Pemula"_
- Git repository (Opsional): Berikan URL Git ke repo tempat proyek SubQuery ini akan dihosting (saat dihosting di SubQuery Explorer)
-- RPC endpoint (Diperlukan): Berikan URL HTTPS ke titik akhir RPC yang sedang berjalan yang akan digunakan secara default untuk proyek ini. Node RPC ini harus berupa node arsip (memiliki status rantai penuh). Untuk panduan ini kita akan menggunakan nilai default *"https://terra-columbus-5.beta.api.onfinality.io"*
+- RPC endpoint (Diperlukan): Berikan URL HTTPS ke titik akhir RPC yang sedang berjalan yang akan digunakan secara default untuk proyek ini. Node RPC ini harus berupa node arsip (memiliki status rantai penuh). Untuk panduan ini kita akan menggunakan nilai default _"https://terra-columbus-5.beta.api.onfinality.io"_
- Authors (Diperlukan): Masukkan pemilik proyek SubQuery ini di sini (misal. nama Anda!)
- Description (Opsional): Anda dapat memberikan paragraf singkat tentang proyek Anda yang menjelaskan data apa yang ada di dalamnya dan apa yang dapat dilakukan pengguna dengannya
- Version (Diperlukan): Masukkan nomor versi khusus atau gunakan default (`1.0.0`)
@@ -59,8 +59,8 @@ Setelah proses inisialisasi selesai, Anda akan melihat folder dengan nama proyek
Terakhir, di bawah direktori proyek, jalankan perintah berikut untuk menginstal dependensi proyek baru.
- ```shell cd PROJECT_NAME yarn install ```
- ```shell cd PROJECT_NAME npm install ```
+::: code-tabs @tab:active yarn `shell cd PROJECT_NAME yarn install `
+@tab npm `shell cd PROJECT_NAME npm install ` :::
## Membuat Perubahan pada Proyek Anda
@@ -91,8 +91,8 @@ type Transfer @entity {
**Penting: Saat Anda membuat perubahan apa pun pada file skema, pastikan Anda membuat ulang direktori tipe Anda. Lakukan ini sekarang.**
- ```shell yarn codegen ```
- ```shell npm run-script codegen ```
+::: code-tabs @tab:active yarn `shell yarn codegen `
+@tab npm `shell npm run-script codegen ` :::
Anda akan menemukan model yang dihasilkan di direktori `/src/types/models`. Untuk informasi lebih lanjut tentang file `schema.graphql`, lihat dokumentasi kami di bawah [Build/GraphQL Schema](../build/graphql.md)
@@ -143,30 +143,30 @@ import { MsgExecuteContract } from "@terra-money/terra.js";
export async function handleEvent(
event: TerraEvent
): Promise<void> {
- // Print debugging data from the event
- // logger.info(JSON.stringify(event));
-
- // Create the new transfer entity with a unique ID
- const transfer = new Transfer(
- `${event.tx.tx.txhash}-${event.msg.idx}-${event.idx}`
- );
- transfer.blockHeight = BigInt(event.block.block.block.header.height);
- transfer.txHash = event.tx.tx.txhash;
- for (const attr of event.event.attributes) {
- switch (attr.key) {
- case "sender":
- transfer.sender = attr.value;
- break;
- case "recipient":
- transfer.recipient = attr.value;
- break;
- case "amount":
- transfer.amount = attr.value;
- break;
- default:
- }
+ // Print debugging data from the event
+ // logger.info(JSON.stringify(event));
+
+ // Create the new transfer entity with a unique ID
+ const transfer = new Transfer(
+ `${event.tx.tx.txhash}-${event.msg.idx}-${event.idx}`
+ );
+ transfer.blockHeight = BigInt(event.block.block.block.header.height);
+ transfer.txHash = event.tx.tx.txhash;
+ for (const attr of event.event.attributes) {
+ switch (attr.key) {
+ case "sender":
+ transfer.sender = attr.value;
+ break;
+ case "recipient":
+ transfer.recipient = attr.value;
+ break;
+ case "amount":
+ transfer.amount = attr.value;
+ break;
+ default:
}
- await transfer.save();
+ }
+ await transfer.save();
}
```
@@ -178,7 +178,7 @@ Untuk informasi lebih lanjut tentang fungsi pemetaan, lihat dokumentasi kami di
Untuk menjalankan Proyek SubQuery baru Anda, pertama-tama kita perlu membangun pekerjaan kita. Jalankan perintah build dari direktori root proyek.
- ```shell yarn build ``` ```shell npm run-script build ```
+::: code-tabs @tab:active yarn `shell yarn build ` @tab npm `shell npm run-script build ` :::
**Penting: Setiap kali Anda membuat perubahan pada fungsi pemetaan, Anda harus membangun kembali proyek Anda**
@@ -192,13 +192,11 @@ Semua konfigurasi yang mengontrol bagaimana node SubQuery dijalankan didefinisik
Di bawah direktori proyek jalankan perintah berikut:
- ```shell yarn start:docker ``` ```shell npm run-script start:docker ```
+::: code-tabs @tab:active yarn `shell yarn start:docker ` @tab npm `shell npm run-script start:docker ` :::
Mungkin perlu beberapa saat untuk mengunduh paket yang diperlukan ([`@subql/node`](https://www.npmjs.com/package/@subql/node),
-`@subql/query`7 >, dan Postgres) untuk pertama kalinya tetapi segera Anda akan melihat node SubQuery yang sedang berjalan. Sabar pada proses di sini.
-
-
+[`@subql/query`](https://www.npmjs.com/package/@subql/query), dan Postgres) untuk pertama kalinya tetapi segera Anda akan melihat node SubQuery yang sedang berjalan. Sabar pada proses di sini.
### Kueri Proyek Anda
@@ -208,15 +206,10 @@ Anda akan melihat taman bermain GraphQL ditampilkan di explorer dan skema yang s
Untuk proyek pemula SubQuery baru, Anda dapat mencoba kueri berikut untuk mengetahui cara kerjanya atau [pelajari lebih lanjut tentang bahasa Kueri GraphQL](../run_publish/graphql.md).
-
-
```graphql
{
query {
- transfers(
- first: 10,
- orderBy: ID_DESC
- ) {
+ transfers(first: 10, orderBy: ID_DESC) {
nodes {
id
txHash
@@ -230,17 +223,12 @@ Untuk proyek pemula SubQuery baru, Anda dapat mencoba kueri berikut untuk menget
}
```
-
-
-
### Publikasikan Proyek SubQuery Anda
SubQuery menyediakan layanan terkelola gratis saat Anda dapat menerapkan proyek baru Anda. Anda dapat menerapkannya ke [Proyek SubQuery](https://project.subquery.network) dan menanyakannya menggunakan [Explorer](https://explorer.subquery.network) kami.
[Baca panduan untuk memublikasikan proyek baru Anda ke Proyek SubQuery](../run_publish/publish.md)
-
-
## Langkah selanjutnya
Selamat, Anda sekarang memiliki proyek SubQuery yang berjalan secara lokal yang menerima permintaan GraphQL API untuk mentransfer data dari bLuna.
diff --git a/docs/id/quickstart/quickstart.md b/docs/id/quickstart/quickstart.md
index 0510cdd432c..c4cf603c691 100644
--- a/docs/id/quickstart/quickstart.md
+++ b/docs/id/quickstart/quickstart.md
@@ -26,7 +26,7 @@ Instal SubQuery CLI secara global di terminal Anda dengan menggunakan NPM:
npm install -g @subql/cli
```
-::::: bahaya Kami **JANGAN** mendorong penggunaan `yarn global` untuk menginstal `@subql/cli` karena manajemen ketergantungannya yang buruk. Hal ini dapat menyebabkan beberapa kesalahan. :::
+:::: bahaya Kami **JANGAN** mendorong penggunaan `yarn global` untuk menginstal `@subql/cli` karena manajemen ketergantungannya yang buruk. Hal ini dapat menyebabkan beberapa kesalahan. :::
Lihatlah semua perintah yang tersedia dan penggunaannya. Jalankan perintah yang diberikan di bawah ini di CLI:
@@ -42,7 +42,7 @@ Jalankan perintah berikut di dalam direktori yang ingin Anda buat proyek SubQuer
subql init
```
-:::: peringatan Penting
+::: peringatan Penting
**Untuk Pengguna Cosmos**
@@ -89,8 +89,8 @@ Setelah Anda menyelesaikan proses inisialisasi, Anda akan melihat folder dengan
Terakhir, jalankan perintah berikut untuk menginstal dependensi proyek baru dari dalam direktori proyek baru.
- ```shell cd PROJECT_NAME yarn install ```
- ```shell cd PROJECT_NAME npm install ```
+::: code-tabs @tab:active yarn `shell cd PROJECT_NAME yarn install `
+@tab npm `shell cd PROJECT_NAME npm install ` :::
You have now initialised your first SubQuery project with just a few simple steps. Sekarang mari kita sesuaikan proyek templat standar untuk blockchain tertentu yang diminati.
@@ -104,4 +104,4 @@ Ada 3 file penting yang perlu dimodifikasi. Ini adalah:
2. Manifes Proyek di `project.yaml`.
3. Fungsi-fungsi pemetaan dalam direktori `src/mappings/`.
-SubQuery supports various blockchain networks and provides a dedicated guide for each of them. Select your preferred blockchain under 2. Specific Chains and continue the quick start guide.
\ No newline at end of file
+SubQuery supports various blockchain networks and provides a dedicated guide for each of them. Select your preferred blockchain under 2. Specific Chains and continue the quick start guide.
diff --git a/docs/id/run_publish/aggregate.md b/docs/id/run_publish/aggregate.md
index 4404b672aed..e82f1cfaf2e 100644
--- a/docs/id/run_publish/aggregate.md
+++ b/docs/id/run_publish/aggregate.md
@@ -26,6 +26,6 @@ SubQuery menyediakan fungsi agregat berikut saat dalam mode tidak aman:
Implementasi fungsi agregat SubQuery didasarkan pada [pg-agregates](https://github.com/graphile/pg-aggregates), Anda dapat menemukan informasi lebih lanjut di sana.
-:::: peringatan Penting Harap dicatat bahwa anda harus mengaktifkan flag `--unsafe` pada layanan kueri untuk menggunakan fungsi-fungsi ini. [Baca untuk selengkapnya](./references.md#unsafe-query-service).
+::: peringatan Penting Harap dicatat bahwa anda harus mengaktifkan flag `--unsafe` pada layanan kueri untuk menggunakan fungsi-fungsi ini. [Baca untuk selengkapnya](./references.md#unsafe-query-service).
Juga, perhatikan bahwa perintah `--unsafe` akan mencegah proyek Anda dijalankan di Jaringan SubQuery, dan Anda harus menghubungi dukungan jika Anda ingin perintah ini dijalankan dengan proyek Anda di [ layanan terkelola SubQuery](https://project.subquery.network). :::
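As a hedged, hypothetical illustration (not taken from this page): once the query service runs with `--unsafe`, an aggregate such as `sum` can be requested on a numeric field. The local endpoint, the entity name `transfers` and the field `amount` below are assumptions; the exact field layout follows pg-aggregates and depends on your own schema. The sketch uses the built-in `fetch` of Node 18+.

```ts
// Sketch only: POST an aggregate query to a local query service started with --unsafe.
const endpoint = "http://localhost:3000"; // assumed local query service URL
const query = /* GraphQL */ `
  {
    transfers {
      aggregates {
        sum {
          amount
        }
      }
    }
  }
`;

async function sumTransferAmounts(): Promise<void> {
  const res = await fetch(endpoint, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ query }),
  });
  const { data, errors } = await res.json();
  if (errors) throw new Error(JSON.stringify(errors));
  console.log("sum of amount:", data.transfers.aggregates.sum.amount);
}

sumTransferAmounts().catch(console.error);
```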
diff --git a/docs/id/run_publish/connect.md b/docs/id/run_publish/connect.md
index 1d32b06c3ab..ce011d5f798 100644
--- a/docs/id/run_publish/connect.md
+++ b/docs/id/run_publish/connect.md
@@ -2,10 +2,10 @@
Setelah penyebaran Anda berhasil diselesaikan dan node kami telah mengindeks data Anda dari rantai, Anda akan dapat terhubung ke proyek Anda melalui titik akhir Query yang ditampilkan.
-![Proyek sedang diterapkan dan disinkronkan](/assets/img/projects-deploy-sync.png)
+![Proyek sedang diterapkan dan disinkronkan](/assets/img/projects_deploy_sync.png)
Atau, Anda dapat mengklik tiga titik di samping judul proyek Anda, dan melihatnya di SubQuery Explorer. Di sana Anda dapat menggunakan taman bermain di browser untuk memulai.
-![Proyek di SubQuery Explorer](/assets/img/projects-explorer.png)
+![Proyek di SubQuery Explorer](/assets/img/projects_explorer.png)
-::::: info Catatan Pelajari lebih lanjut tentang [GraphQL Query language.](./graphql.md) :::
+:::: tip Catatan Pelajari lebih lanjut tentang [GraphQL Query language.](./graphql.md) :::
diff --git a/docs/id/run_publish/publish.md b/docs/id/run_publish/publish.md
index c9c1c6190e9..7a3775495da 100644
--- a/docs/id/run_publish/publish.md
+++ b/docs/id/run_publish/publish.md
@@ -19,7 +19,7 @@ Anda bisa meningkatkan untuk memanfaatkan layanan berbayar berikut ini:
Saat menerapkan ke Layanan Terkelola SubQuery, Anda harus terlebih dahulu meng-host basis kode Anda di [IPFS](https://ipfs.io/). Hosting a project in IPFS makes it available for everyone and reduces your reliance on centralised services like GitHub.
-::::peringatan Alur Penyebaran GitHub sudah tidak digunakan lagi untuk IPFS
+::: peringatan Alur Penyebaran GitHub sudah tidak digunakan lagi untuk IPFS
Jika proyek Anda masih disebarkan melalui GitHub, baca panduan migrasi untuk penyebaran IPFS [di sini](./ipfs.md) :::
diff --git a/docs/id/run_publish/query.md b/docs/id/run_publish/query.md
index 1604d9f0a11..eec93d166c1 100644
--- a/docs/id/run_publish/query.md
+++ b/docs/id/run_publish/query.md
@@ -12,4 +12,4 @@ Anda juga akan melihat bahwa SubQuery Explorer menyediakan tempat bermain untuk
Pada bagian kanan atas taman bermain, Anda akan menemukan tombol _Docs_ yang akan membuka gambar dokumentasi. Dokumentasi ini dibuat secara otomatis dan membantu Anda menemukan entitas dan metode apa yang dapat Anda kueri.
-::::: info Catatan Pelajari lebih lanjut tentang [GraphQL Query language.](./graphql.md) :::
+:::: tip Catatan Pelajari lebih lanjut tentang [GraphQL Query language.](./graphql.md) :::
diff --git a/docs/id/run_publish/references.md b/docs/id/run_publish/references.md
index 6d537b33a3c..faf1a7dfc52 100644
--- a/docs/id/run_publish/references.md
+++ b/docs/id/run_publish/references.md
@@ -21,11 +21,11 @@ PERINTAH
Perintah ini menggunakan webpack untuk menghasilkan bundel proyek subquery.
-| Pilihan | Deskripsi |
-| ------------------ | ---------------------------------------------------------------------------------------------------------- |
-| -l, --location | folder lokal proyek subquery (jika belum ada di folder) |
-| -o, --output | tentukan folder keluaran build mis. membangun-folder |
-| --mode=(production | prod | development | dev) | [ default: production ] |
+| Pilihan                                            | Deskripsi                                                |
+| -------------------------------------------------- | -------------------------------------------------------- |
+| -l, --location                                     | folder lokal proyek subquery (jika belum ada di folder)  |
+| -o, --output                                       | tentukan folder keluaran build mis. membangun-folder     |
+| --mode=(production \| prod \| development \| dev)  | [ default: production ]                                  |
- Dengan `subql build` Anda dapat menentukan titik masuk tambahan di bidang ekspor meskipun itu akan selalu dibangun `index.ts` secara otomatis.
@@ -106,7 +106,7 @@ Ini menampilkan versi saat ini.
### reindex
-:::warning In order to use this command, you require `@subql/node:v1.10.0`/`@subql/node-:v1.10.0` or above. :::
+:::warning In order to use this command, you require `@subql/node:v1.10.0`/`@subql/node-YOURNETWORK:v1.10.0` or above. :::
When using reindex command, historical must be enabled for the targeted project (`--disable-historical=false`). After starting the project, it would print out a log stating if historical is enabled or not.
@@ -122,7 +122,7 @@ If the `targetHeight` is less than the declared starting height, it will execute
subql-node -f /example/subql-project reindex --targetHeight=30
```
-::: info Note
+::: tip Note
Once the command is executed and the state has been rolled back to the specified height, the application will exit. You can then start up the indexer to proceed again from this height.
:::
@@ -134,7 +134,7 @@ This command forces the project schemas and tables to be regenerated. It is help
`-f`, `--subquery` flag must be passed in, to set path of the targeted project.
-::: info Note Similar to `reindex` command, the application would exit upon completion. :::
+::: tip Note Similar to `reindex` command, the application would exit upon completion. :::
```shell
subql-node -f /example/subql-project force-clean
@@ -346,7 +346,7 @@ Ini akan memindahkan pengambilan dan pemrosesan blok ke dalam worker. Secara def
Saat ini, ini masih dalam tahap percobaan awal, tetapi kami berencana untuk mengaktifkannya secara default. :::
-:::: info Catatan
+::: tip Catatan
Fitur ini tersedia untuk Substrate dan Cosmos, dan akan segera diintegrasikan untuk Avalanche.
:::
diff --git a/docs/id/run_publish/run.md b/docs/id/run_publish/run.md
index 8676a8c74fc..b68788ffbbb 100644
--- a/docs/id/run_publish/run.md
+++ b/docs/id/run_publish/run.md
@@ -4,7 +4,7 @@ Panduan ini bekerja melalui cara menjalankan node SubQuery lokal pada infrastruk
## Gunakan Docker
-Solusi alternatif adalah dengan menjalankan Docker Container, yang ditentukan oleh file `docker-compose.yml`. Untuk proyek baru yang baru saja diinisialisasi, Anda tidak perlu mengubah apa pun di sini.
+Solusi alternatif adalah dengan menjalankan **Docker Container**, yang ditentukan oleh file `docker-compose.yml`. Untuk proyek baru yang baru saja diinisialisasi, Anda tidak perlu mengubah apa pun di sini.
Di bawah direktori proyek jalankan perintah berikut:
@@ -32,90 +32,80 @@ CREATE EXTENSION IF NOT EXISTS btree_gist;
### Instalasi
-
-
+::: code-tabs
+@tab Substrate/Polkadot
```shell
# NPM
npm install -g @subql/node
```
-
-
+@tab Terra
```shell
# NPM
npm install -g @subql/node-terra
```
-
-
+@tab Avalanche
```shell
# NPM
npm install -g @subql/node-avalanche
```
-
-
+@tab Cosmos
```shell
# NPM
npm install -g @subql/node-cosmos
```
-
-
+@tab Algorand
```shell
# NPM
npm install -g @subql/node-algorand
```
-
-
+:::
::: danger Please note that we **DO NOT** encourage the use of `yarn global` due to its poor dependency management, which may lead to errors down the line. :::
Setelah terinstal, Anda dapat memulai node dengan perintah berikut:
-
-
+::: code-tabs
+@tab Substrate/Polkadot
```shell
subql-node
```
-
-
+@tab Terra
```shell
subql-node-terra
```
-
-
+@tab Avalanche
```shell
subql-node-avalanche
```
-
-
+@tab Cosmos
```shell
subql-node-cosmos
```
-
-
+@tab Algorand
```shell
subql-node-algorand
```
-
-
+:::
### Key Commands
@@ -123,43 +113,38 @@ The following commands will assist you to complete the configuration of a SubQue
#### Arahkan ke jalur proyek lokal
-
-
+::: code-tabs
+@tab Substrate/Polkadot
```shell
subql-node -f your-project-path
```
-
-
+@tab Terra
```shell
subql-node-terra -f your-project-path
```
-
-
+@tab Avalanche
```shell
subql-node-avalanche -f your-project-path
```
-
-
+@tab Cosmos
```shell
subql-node-cosmos -f your-project-path
```
-
-
+@tab Algorand
```shell
subql-node-algorand -f your-project-path
```
-
-
+:::
#### Connect to database
@@ -176,43 +161,38 @@ Bergantung pada konfigurasi database Postgres Anda (misalnya kata sandi database
#### Tentukan file konfigurasi
-
-
+::: code-tabs
+@tab Substrate/Polkadot
```shell
subql-node -c your-project-config.yml
```
-
-
+@tab Terra
```shell
subql-node-terra -c your-project-config.yml
```
-
-
+@tab Avalanche
```shell
subql-node-avalanche -c your-project-config.yml
```
-
-
+@tab Cosmos
```shell
subql-node-cosmos -c your-project-config.yml
```
-
-
+@tab Algorand
```shell
subql-node-algorand -c your-project-config.yml
```
-
-
+:::
This will point the query node to a manifest file which can be in YAML or JSON format.
@@ -230,43 +210,38 @@ Saat pengindeks pertama kali mengindeks rantai, mengambil blok tunggal akan seca
#### Berjalan dalam mode lokal
-
-
+::: code-tabs
+@tab Substrate/Polkadot
```shell
subql-node -f your-project-path --local
```
-
-
+@tab Terra
```shell
subql-node-terra -f your-project-path --local
```
-
-
+@tab Avalanche
```shell
subql-node-avalanche -f your-project-path --local
```
-
-
+@tab Cosmos
```shell
subql-node-cosmos -f your-project-path --local
```
-
-
+@tab Algorand
```shell
subql-node-algorand -f your-project-path --local
```
-
-
+:::
For debugging purposes, users can run the node in local mode. Beralih ke mode lokal akan membuat tabel Postgres dalam skema default `public`.
diff --git a/docs/id/run_publish/subscription.md b/docs/id/run_publish/subscription.md
index c732db9c928..138769ab050 100644
--- a/docs/id/run_publish/subscription.md
+++ b/docs/id/run_publish/subscription.md
@@ -6,7 +6,7 @@ SubQuery sekarang juga mendukung Graphql Subscriptions. Seperti kueri, langganan
Langganan sangat berguna ketika Anda ingin aplikasi klien Anda mengubah data atau menampilkan beberapa data baru segera setelah perubahan itu terjadi atau data baru tersedia. Langganan memungkinkan Anda untuk _berlangganan_ ke proyek SubQuery Anda untuk perubahan.
-:::: info Catatan Baca lebih lanjut tentang [Langganan](https://www.apollographql.com/docs/react/data/subscriptions/). :::
+::: tip Catatan Baca lebih lanjut tentang [Langganan](https://www.apollographql.com/docs/react/data/subscriptions/). :::
## Cara Berlangganan Entitas
@@ -65,8 +65,8 @@ subscription {
Perhatikan bahwa filter `mutation` dapat berupa salah satu dari `INSERT`, `UPDATE` atau `DELETE`.
-:::: peringatan Penting Harap dicatat bahwa Anda harus mengaktifkan flag `--subscription` pada node dan layanan kueri untuk menggunakan fungsi-fungsi ini. :::
+::: peringatan Penting Harap dicatat bahwa Anda harus mengaktifkan flag `--subscription` pada node dan layanan kueri untuk menggunakan fungsi-fungsi ini. :::
-::::: peringatan Penting
+:::: peringatan Penting
Fitur subkripsi berfungsi pada Layanan Terkelola SubQuery ketika Anda langsung memanggil titik akhir GraphQL yang terdaftar. Ini tidak akan berfungsi dalam taman bermain GraphQL dalam browser.
:::
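As a hedged, hypothetical sketch (not taken from this page) of subscribing from a TypeScript client using the `graphql-ws` package: the endpoint, the `transfer` entity and the selected fields below are assumptions and must match your own schema; as noted above, both the node and the query service need the `--subscription` flag.

```ts
import { createClient } from "graphql-ws";

// Sketch only: subscribe to changes of a hypothetical `transfer` entity.
const client = createClient({ url: "ws://localhost:3000" }); // assumed local query service

const query = /* GraphQL */ `
  subscription {
    transfer(mutation: [INSERT, UPDATE]) {
      id
      mutation_type
      _entity
    }
  }
`;

client.subscribe(
  { query },
  {
    next: (data) => console.log("entity changed:", data),
    error: (err) => console.error("subscription error:", err),
    complete: () => console.log("subscription closed"),
  }
);
```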
diff --git a/docs/id/run_publish/upgrade.md b/docs/id/run_publish/upgrade.md
index 894cd3b2dd9..59d9e60259f 100644
--- a/docs/id/run_publish/upgrade.md
+++ b/docs/id/run_publish/upgrade.md
@@ -52,7 +52,7 @@ Dengan diperkenalkannya fitur penyebaran untuk CLI, kami telah menambahkan **Def
- Langkah 3: Setelah proyek Anda dibuat, navigasikan ke halaman Tindakan GitHub dari proyek Anda, dan pilih alur kerja `CLI deploy`.
- Langkah 4: Anda akan melihat bidang input di mana Anda dapat memasukkan kode unik proyek Anda yang dibuat di SubQuery Projects. Anda bisa mendapatkan kode dari URL di [Proyek SubQuery](https://project.subquery.network). Kode ini didasarkan pada nama proyek Anda, di mana spasi diganti dengan tanda hubung `-`. Misalnya `nama proyek saya` menjadi `nama-proyek-saya`.
-:::: tips Tip
+::: tips Tip
Setelah alur kerja selesai, Anda seharusnya dapat melihat proyek Anda diterapkan ke Managed Service kami.
:::
@@ -77,10 +77,10 @@ Jika Anda hanya ingin meng-upgrade ke indexer terbaru ([`@subql/node`](https://w
Setelah penerapan Anda berhasil diselesaikan dan node kami telah mengindeks data Anda dari chain, Anda akan dapat terhubung ke proyek Anda melalui titik akhir Kueri GraphQL yang ditampilkan.
-![Proyek sedang diterapkan dan disinkronkan](/assets/img/projects-deploy-sync.png)
+![Proyek sedang diterapkan dan disinkronkan](/assets/img/projects_deploy_sync.png)
Atau, Anda dapat mengklik tiga titik di samping judul proyek Anda, dan melihatnya di SubQuery Explorer. Di sana Anda dapat menggunakan taman bermain di browser untuk memulai - [baca lebih lanjut tentang cara menggunakan Explorer kami di sini](../run_publish/query.md).
-![Proyek di SubQuery Explorer](/assets/img/projects-explorer.png)
+![Proyek di SubQuery Explorer](/assets/img/projects_explorer.png)
-::::: info Catatan Pelajari lebih lanjut tentang [GraphQL Query language.](./graphql.md) :::
+:::: tip Catatan Pelajari lebih lanjut tentang [GraphQL Query language.](./graphql.md) :::
diff --git a/docs/id/subquery_network/introduction.md b/docs/id/subquery_network/introduction.md
index e347a578b6e..65eed51f966 100644
--- a/docs/id/subquery_network/introduction.md
+++ b/docs/id/subquery_network/introduction.md
@@ -18,22 +18,22 @@ Ada peran untuk semua orang di jaringan, mulai dari pengembang yang sangat tekni
Konsumen akan meminta Jaringan SubQuery untuk data spesifik untuk dApps atau alat mereka, dan membayar sejumlah SQT yang diiklankan untuk setiap permintaan.
-::: info Note Learn more about [Consumers](./consumers.md). :::
+::: tip Note Learn more about [Consumers](./consumers.md). :::
### Pengindeks
Pengindeks akan menjalankan dan memelihara proyek SubQuery berkualitas tinggi di infrastruktur mereka sendiri, menjalankan pengindeks dan layanan kueri, dan akan diberi imbalan dalam SQT untuk permintaan yang mereka layani.
-::: info Note Learn more about [Indexers](./indexers.md). :::
+::: tip Note Learn more about [Indexers](./indexers.md). :::
### Delegator
Delegator akan berpartisipasi dalam Jaringan dengan mendukung Pengindeks favorit mereka untuk mendapatkan hadiah berdasarkan pekerjaan yang dilakukan pengindeks tersebut.
-::: info Note Learn more about [Delegators](./delegators.md). :::
+::: tip Note Learn more about [Delegators](./delegators.md). :::
### Arsitek
Arsitek adalah pembangun proyek SubQuery yang dijalankan Jaringan. Mereka menulis dan menerbitkan proyek SubQuery untuk Jaringan untuk diindeks dan dijalankan.
-::: info Note Learn more about [how to build your first SubQuery project](../build/introduction.md). :::
+::: tip Note Learn more about [how to build your first SubQuery project](../build/introduction.md). :::
diff --git a/docs/it/README.md b/docs/it/README.md
index 35751300ad8..51e4ceecb51 100644
--- a/docs/it/README.md
+++ b/docs/it/README.md
@@ -4,7 +4,7 @@
Build Faster dApps with SubQuery Academy
-
Explore and implement your own efficient custom open-source API between your decentralised data and tools to query data faster and save you time.
+
Explore and implement your own efficient custom open-source API between your decentralised data and tools to query data faster and save you time.
SubQuery now supports Polkadot, Avalanche, Cosmos, and Algorand.
@@ -12,7 +12,7 @@
Get a Kick-Start With Our Quick Start Guide
-
Build your first SubQuery project in less than 10 mins with simple guided steps.
+
Build your first SubQuery project in less than 10 mins with simple guided steps.
Start querying data for your dApps on your most loved blockchain network using our starter projects. Explore and modify important files, and understand how SubQuery works.
@@ -135,8 +135,7 @@
-
-
+
diff --git a/docs/it/academy/tutorials_examples/block-height.md b/docs/it/academy/tutorials_examples/block-height.md
index 7086169ed3a..76ae1f61026 100644
--- a/docs/it/academy/tutorials_examples/block-height.md
+++ b/docs/it/academy/tutorials_examples/block-height.md
@@ -42,7 +42,7 @@ Lo svantaggio più ovvio sarà che non sarete in grado di interrogare i dati sul
## Come capire l'altezza attuale della blockchain?
-Se si utilizza la rete Polkadot, è possibile visitare < 0 > https ://polkascan.io/0 >, selezionare la rete e quindi visualizzare la figura «Blocco finalizzato».
+Se si utilizza la rete Polkadot, è possibile visitare https://polkascan.io/, selezionare la rete e quindi visualizzare la figura «Blocco finalizzato».
## Devo fare una ricostruzione o un codegen?
diff --git a/docs/it/academy/tutorials_examples/delete-projects.md b/docs/it/academy/tutorials_examples/delete-projects.md
index e6157e71350..0746f4fba95 100644
--- a/docs/it/academy/tutorials_examples/delete-projects.md
+++ b/docs/it/academy/tutorials_examples/delete-projects.md
@@ -4,7 +4,7 @@
È importante mantenere i progetti in SubQuery Project aggiornati e pertinenti. Per i progetti caricati in SubQuery Project a fini di test, è consigliabile eliminarli successivamente per risparmiare risorse e costi.
-L'esecuzione di un nodo indicizzatore è un'altra opzione al di fuori dell'utilizzo di Docker o di un progetto ospitato in < 0 > Progetti SubQuery 0 >. Richiede più tempo e sforzo, ma migliorerà la comprensione di come SubQuery funziona sotto le copertine.
+L'esecuzione di un nodo indicizzatore è un'altra opzione al di fuori dell'utilizzo di Docker o di un progetto ospitato in Progetti SubQuery. Richiede più tempo e sforzo, ma migliorerà la comprensione di come SubQuery funziona sotto le copertine.
## Eliminazione di un progetto dallo slot di gestione temporanea
@@ -16,4 +16,4 @@ Per eliminare un progetto dallo slot di gestione temporanea, fare clic sui 3 pun
Per eliminare un progetto dallo slot di produzione, è necessario eliminare l'intero progetto. Passare all'angolo in alto a destra e fare clic sui 3 punti che sono le impostazioni per l'intero progetto. Quindi selezionare «Elimina progetto».
-![Eliminazione di un progetto dallo slot di gestione temporanea](/assets/img/delete_production.png)
\ No newline at end of file
+![Eliminazione di un progetto dallo slot di gestione temporanea](/assets/img/delete_production.png)
diff --git a/docs/it/academy/tutorials_examples/dictionary.md b/docs/it/academy/tutorials_examples/dictionary.md
index 5f7dfddcc9c..2660ec56733 100644
--- a/docs/it/academy/tutorials_examples/dictionary.md
+++ b/docs/it/academy/tutorials_examples/dictionary.md
@@ -1,10 +1,10 @@
# Funzionamento di un dizionario SubQuery
-L'idea di un progetto di dizionario generico è quella di indicizzare tutti i dati di una blockchain e registrare gli eventi, gli estrinsechi e i relativi tipi (modulo e metodo) in un database in ordine di altezza del blocco. Un altro progetto può quindi interrogare questo punto finale < 0 > network.dictionary 0 > anziché quello predefinito < 0 > network.endpoint 0 > definito nel file manifesto.
+L'idea di un progetto di dizionario generico è quella di indicizzare tutti i dati di una blockchain e registrare gli eventi, gli estrinsechi e i relativi tipi (modulo e metodo) in un database in ordine di altezza del blocco. Un altro progetto può quindi interrogare questo punto finale `network.dictionary` anziché quello predefinito `network.endpoint` definito nel file manifesto.
-L'endpoint < 0 > network.dictionary 0 > è un parametro facoltativo che, se presente, rileverà e utilizzerà automaticamente. < 0 > network.endpoint 0 > è obbligatorio e non verrà compilato se non è presente.
+L'endpoint `network.dictionary` è un parametro facoltativo che, se presente, verrà rilevato e utilizzato automaticamente. `network.endpoint` è obbligatorio e il progetto non verrà compilato se non è presente.
-Prendendo come esempio il dizionario < 0 > SubQuery 0 > del progetto, il file < 1 > schema 1 > definisce le entità 3; estrinseco, eventi, specVersion. Queste 3 entità contengono 6, 4 e 2 campi rispettivamente. Quando questo progetto è in esecuzione, questi campi sono riflessi nelle tabelle del database.
+Prendendo come esempio il progetto dizionario di SubQuery, il file `schema` definisce 3 entità: extrinsics, events, specVersion. Queste 3 entità contengono rispettivamente 6, 4 e 2 campi. Quando questo progetto è in esecuzione, questi campi sono riflessi nelle tabelle del database.
![tabella extrinsics](/assets/img/extrinsics_table.png) ![tabella eventi](/assets/img/events_table.png) ![tabella specversion](/assets/img/specversion_table.png)
@@ -12,7 +12,7 @@ I dati della blockchain vengono quindi memorizzati in queste tabelle e indicizza
## Come incorporare un dizionario nel progetto?
-Aggiungere < 0 > dizionario: https ://api.subquery.network/sq/subquery/dictionary-polkadot 0 > alla sezione di rete del manifesto. Eg:
+Aggiungere `dictionary: https://api.subquery.network/sq/subquery/dictionary-polkadot` alla sezione `network` del manifesto. Ad esempio:
```shell
network:
@@ -22,7 +22,7 @@ network:
## Cosa succede quando un dizionario NON è usato?
-Quando un dizionario NON viene utilizzato, un indicizzatore preleva tutti i dati di blocco tramite l'api polkadot in base al flag < 0 > batch-size 0 > che è 100 per impostazione predefinita e lo inserisce in un buffer per l'elaborazione. In seguito, l'indicizzatore prende tutti questi blocchi dal buffer e, durante l'elaborazione dei dati del blocco, verifica se l'evento e l'estrinseco in questi blocchi corrispondono al filtro definito dall'utente.
+Quando un dizionario NON viene utilizzato, un indicizzatore preleva tutti i dati di blocco tramite l'api polkadot in base al flag `batch-size` che è 100 per impostazione predefinita e lo inserisce in un buffer per l'elaborazione. In seguito, l'indicizzatore prende tutti questi blocchi dal buffer e, durante l'elaborazione dei dati del blocco, verifica se l'evento e l'estrinseco in questi blocchi corrispondono al filtro definito dall'utente.
## Cosa succede quando si utilizza un dizionario?
@@ -38,6 +38,6 @@ Ciò significa che l'utilizzo di un dizionario può ridurre la quantità di dati
## Quando un dizionario NON è utile?
-Quando < 0 > gestori di blocchi 0 > vengono utilizzati per raccogliere dati da una catena, ogni blocco deve essere elaborato. Pertanto, l'utilizzo di un dizionario in questo caso non offre alcun vantaggio e l'indicizzatore passa automaticamente all'approccio predefinito non dizionario.
+Quando i gestori di blocchi (block handler) vengono utilizzati per raccogliere dati da una catena, ogni blocco deve essere elaborato. Pertanto, l'utilizzo di un dizionario in questo caso non offre alcun vantaggio e l'indicizzatore passa automaticamente all'approccio predefinito senza dizionario.
-Inoltre, quando si tratta di eventi o di eventi estrinseci che si verificano o esistono in ogni blocco come < 0 > timestamp.set 0 >, l'utilizzo di un dizionario non offrirà alcun vantaggio aggiuntivo.
+Inoltre, quando si tratta di eventi o di eventi estrinseci che si verificano o esistono in ogni blocco come `timestamp.set`, l'utilizzo di un dizionario non offrirà alcun vantaggio aggiuntivo.
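
To make the mechanism above concrete, the sketch below queries the dictionary's GraphQL endpoint directly for the heights of blocks that contain a given event, which is essentially what the indexer does for you when `network.dictionary` is set. It is illustrative only (not the indexer's actual implementation); the entity and field names (`events`, `module`, `event`, `blockHeight`) are assumptions based on the dictionary schema described above, and a Node 18+ runtime is assumed for the global `fetch`.

```ts
// Minimal sketch: ask the Polkadot dictionary which blocks contain
// balances.Deposit events, so only those blocks would need to be fetched.
// Entity/field names are assumptions based on the dictionary schema above.
const DICTIONARY = "https://api.subquery.network/sq/subquery/dictionary-polkadot";

async function relevantBlockHeights(): Promise<string[]> {
  const query = `{
    events(
      filter: { module: { equalTo: "balances" }, event: { equalTo: "Deposit" } }
      orderBy: BLOCK_HEIGHT_ASC
      first: 100
    ) {
      nodes { blockHeight }
    }
  }`;

  const res = await fetch(DICTIONARY, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ query }),
  });
  const { data } = await res.json();
  return data.events.nodes.map((n: { blockHeight: string }) => n.blockHeight);
}

relevantBlockHeights().then((heights) => console.log(heights.length, "candidate blocks"));
```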
diff --git a/docs/it/academy/tutorials_examples/run-indexer.md b/docs/it/academy/tutorials_examples/run-indexer.md
index eeb77f0d92a..81af02487eb 100644
--- a/docs/it/academy/tutorials_examples/run-indexer.md
+++ b/docs/it/academy/tutorials_examples/run-indexer.md
@@ -8,7 +8,7 @@
## Introduction
-L'esecuzione di un nodo indicizzatore è un'altra opzione al di fuori dell'utilizzo di Docker o di un progetto ospitato in < 0 > Progetti SubQuery 0 >. Richiede più tempo e sforzo, ma migliorerà la comprensione di come SubQuery funziona sotto le copertine.
+L'esecuzione di un nodo indicizzatore è un'altra opzione al di fuori dell'utilizzo di Docker o di un progetto ospitato in `Progetti SubQuery`. Richiede più tempo e sforzo, ma migliorerà la comprensione di come SubQuery funziona sotto le copertine.
## Postgres
diff --git a/docs/it/build/install.md b/docs/it/build/install.md
index ddfa2eb6c32..22ea46b5302 100644
--- a/docs/it/build/install.md
+++ b/docs/it/build/install.md
@@ -8,28 +8,30 @@ Lo strumento [@subql/cli](https://github.com/subquery/subql/tree/docs-new-sectio
Installate SubQuery CLI globalmente sul vostro terminale usando Yarn o NPM:
- ```bash npm install -g @subql/cli ```
- ```shell yarn global add @subql/cli ```
+::: code-tabs @tab npm `bash npm install -g @subql/cli `
+@tab:active yarn `shell yarn global add @subql/cli ` :::
You can then run help to see available commands and usage provide by CLI:
```shell
subql help
```
+
## Installare @subql/node
Un nodo SubQuery è un'implementazione che estrae i dati della blockchain basati sul substrato secondo il progetto SubQuery e li salva in un database Postgres.
Installate il nodo SubQuery globalmente sul vostro terminale usando Yarn o NPM:
- ```bash npm install -g @subql/node ```
- ```shell yarn global add @subql/node ```
+::: code-tabs @tab npm `bash npm install -g @subql/node `
+@tab:active yarn `shell yarn global add @subql/node ` :::
Once installed, you can can start a node with:
```shell
subql-node
```
+
> Nota: se state usando Docker o ospitate il vostro progetto in SubQuery Projects, potete saltare questo passo. Questo perché il nodo SubQuery è già fornito nel contenitore Docker e nell'infrastruttura di hosting.
## Installare @subql/query
@@ -38,7 +40,7 @@ La libreria di query SubQuery fornisce un servizio che ti permette di interrogar
Installate la query SubQuery globalmente sul vostro terminale usando Yarn o NPM:
- ```bash npm install -g @subql/query ```
- ```shell yarn global add @subql/query ```
+::: code-tabs @tab npm `bash npm install -g @subql/query `
+@tab:active yarn `shell yarn global add @subql/query ` :::
-> Nota: se state usando Docker o ospitate il vostro progetto in SubQuery Projects, potete anche saltare questo passo. Questo perché il nodo SubQuery è già fornito nel contenitore Docker e nell'infrastruttura di hosting.
\ No newline at end of file
+> Nota: se state usando Docker o ospitate il vostro progetto in SubQuery Projects, potete anche saltare questo passo. Questo perché il nodo SubQuery è già fornito nel contenitore Docker e nell'infrastruttura di hosting.
diff --git a/docs/it/build/introduction.md b/docs/it/build/introduction.md
index 442fc83b5a7..1d16d7d9abe 100644
--- a/docs/it/build/introduction.md
+++ b/docs/it/build/introduction.md
@@ -51,8 +51,8 @@ Per eseguire il tuo progetto SubQuery su un SubQuery Node ospitato localmente, d
Eseguite il comando di compilazione dalla directory principale del progetto.
- `shell yarn build `
- `bash npm run-script build `
+::: code-tabs @tab:active yarn `shell yarn build `
+@tab npm `bash npm run-script build ` :::
### Alternative build options
diff --git a/docs/it/build/manifest.md b/docs/it/build/manifest.md
index e744ddddef4..8737af44f7d 100644
--- a/docs/it/build/manifest.md
+++ b/docs/it/build/manifest.md
@@ -4,7 +4,7 @@ Il file Manifest `project.yaml` può essere visto come un punto di ingresso del
Il manifesto può essere in formato YAML o JSON. In questo documento useremo YAML in tutti gli esempi. Di seguito è riportato un esempio standard di un `project.yaml` di base.
- ` yml specVersion: 0.2.0 name: example-project # Provide the project name version: 1.0.0 # Project version description: '' # Description of your project repository: 'https://github.com/subquery/subql-starter' # Git repository address of your project schema: file: ./schema.graphql # The location of your GraphQL schema file network: genesisHash: '0x91b171bb158e2d3848fa23a9f1c25182fb8e20313b2c1eb49219da7a70ce90c3' # Genesis hash of the network endpoint: 'wss://polkadot.api.onfinality.io/public-ws' # Optionally provide the HTTP endpoint of a full chain dictionary to speed up processing dictionary: 'https://api.subquery.network/sq/subquery/dictionary-polkadot' dataSources: - kind: substrate/Runtime startBlock: 1 # This changes your indexing start block, set this higher to skip initial blocks with less data mapping: file: "./dist/index.js" handlers: - handler: handleBlock kind: substrate/BlockHandler - handler: handleEvent kind: substrate/EventHandler filter: #Filter is optional module: balances method: Deposit - handler: handleCall kind: substrate/CallHandler ```` ` yml specVersion: "0.0.1" description: '' # Description of your project repository: 'https://github.com/subquery/subql-starter' # Git repository address of your project schema: ./schema.graphql # The location of your GraphQL schema file network: endpoint: 'wss://polkadot.api.onfinality.io/public-ws' # Optionally provide the HTTP endpoint of a full chain dictionary to speed up processing dictionary: 'https://api.subquery.network/sq/subquery/dictionary-polkadot' dataSources: - name: main kind: substrate/Runtime startBlock: 1 # This changes your indexing start block, set this higher to skip initial blocks with less data mapping: handlers: - handler: handleBlock kind: substrate/BlockHandler - handler: handleEvent kind: substrate/EventHandler filter: #Filter is optional but suggested to speed up event processing module: balances method: Deposit - handler: handleCall kind: substrate/CallHandler ````
+::: code-tabs @tab v0.2.0 ` yml specVersion: 0.2.0 name: example-project # Provide the project name version: 1.0.0 # Project version description: '' # Description of your project repository: 'https://github.com/subquery/subql-starter' # Git repository address of your project schema: file: ./schema.graphql # The location of your GraphQL schema file network: genesisHash: '0x91b171bb158e2d3848fa23a9f1c25182fb8e20313b2c1eb49219da7a70ce90c3' # Genesis hash of the network endpoint: 'wss://polkadot.api.onfinality.io/public-ws' # Optionally provide the HTTP endpoint of a full chain dictionary to speed up processing dictionary: 'https://api.subquery.network/sq/subquery/dictionary-polkadot' dataSources: - kind: substrate/Runtime startBlock: 1 # This changes your indexing start block, set this higher to skip initial blocks with less data mapping: file: "./dist/index.js" handlers: - handler: handleBlock kind: substrate/BlockHandler - handler: handleEvent kind: substrate/EventHandler filter: #Filter is optional module: balances method: Deposit - handler: handleCall kind: substrate/CallHandler ```` @tab v0.0.1 ` yml specVersion: "0.0.1" description: '' # Description of your project repository: 'https://github.com/subquery/subql-starter' # Git repository address of your project schema: ./schema.graphql # The location of your GraphQL schema file network: endpoint: 'wss://polkadot.api.onfinality.io/public-ws' # Optionally provide the HTTP endpoint of a full chain dictionary to speed up processing dictionary: 'https://api.subquery.network/sq/subquery/dictionary-polkadot' dataSources: - name: main kind: substrate/Runtime startBlock: 1 # This changes your indexing start block, set this higher to skip initial blocks with less data mapping: handlers: - handler: handleBlock kind: substrate/BlockHandler - handler: handleEvent kind: substrate/EventHandler filter: #Filter is optional but suggested to speed up event processing module: balances method: Deposit - handler: handleCall kind: substrate/CallHandler ```` :::
## Migrating from v0.0.1 to v0.2.0
@@ -81,9 +81,9 @@ Defines the data that will be filtered and extracted and the location of the map
### Mapping Spec
-| Field | v0.0.1 | v0.2.0 | Description |
-| ---------------------- | ------------------------------------------------------------------------ | --------------------------------------------------------------------------------------- | ----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
-| **file** | String | 𐄂 | Path to the mapping entry |
+| Field | v0.0.1 | v0.2.0 | Description |
+| ---------------------- | ------------------------------------------------------------------------ | --------------------------------------------------------------------------------------- | -------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
+| **file** | String | 𐄂 | Path to the mapping entry |
| **handlers & filters** | [Default handlers and filters](./manifest/#mapping-handlers-and-filters) | Default handlers and filters, [Custom handlers and filters](#custom-data-sources) | Elenca tutte le [mapping functions](./mapping/polkadot.md) e i relativi tipi di gestori, con filtri di mappatura aggiuntivi. For custom runtimes mapping handlers please view [Custom data sources](#custom-data-sources) |
## Data Sources and Mapping
@@ -104,8 +104,8 @@ The following table explains filters supported by different handlers.
**Your SubQuery project will be much more efficient when you only use event and call handlers with appropriate mapping filters**
-| Handler | Supported filter |
-| ------------------------------------------ | ---------------------------- |
+| Handler | Supported filter |
+| --------------------------------------------------- | ---------------------------- |
| [BlockHandler](./mapping/polkadot.md#block-handler) | `specVersion` |
| [EventHandler](./mapping/polkadot.md#event-handler) | `module`,`method` |
| [CallHandler](./mapping/polkadot.md#call-handler) | `module`,`method` ,`success` |
@@ -153,8 +153,8 @@ We support the additional types used by substrate runtime modules, `typesAlias`,
In the v0.2.0 example below, the `network.chaintypes` are pointing to a file that has all the custom types included, This is a standard chainspec file that declares the specific types supported by this blockchain in either `.json`, `.yaml` or `.js` format.
- `yml network: genesisHash: '0x91b171bb158e2d3848fa23a9f1c25182fb8e20313b2c1eb49219da7a70ce90c3' endpoint: 'ws://host.kittychain.io/public-ws' chaintypes: file: ./types.json # The relative filepath to where custom types are stored ...`
- `yml ... network: endpoint: "ws://host.kittychain.io/public-ws" types: { "KittyIndex": "u32", "Kitty": "[u8; 16]" } # typesChain: { chain: { Type5: 'example' } } # typesSpec: { spec: { Type6: 'example' } } dataSources: - name: runtime kind: substrate/Runtime startBlock: 1 filter: #Optional specName: kitty-chain mapping: handlers: - handler: handleKittyBred kind: substrate/CallHandler filter: module: kitties method: breed success: true`
+::: code-tabs @tab v0.2.0 `yml network: genesisHash: '0x91b171bb158e2d3848fa23a9f1c25182fb8e20313b2c1eb49219da7a70ce90c3' endpoint: 'ws://host.kittychain.io/public-ws' chaintypes: file: ./types.json # The relative filepath to where custom types are stored ...`
+@tab v0.0.1 `yml ... network: endpoint: "ws://host.kittychain.io/public-ws" types: { "KittyIndex": "u32", "Kitty": "[u8; 16]" } # typesChain: { chain: { Type5: 'example' } } # typesSpec: { spec: { Type6: 'example' } } dataSources: - name: runtime kind: substrate/Runtime startBlock: 1 filter: #Optional specName: kitty-chain mapping: handlers: - handler: handleKittyBred kind: substrate/CallHandler filter: module: kitties method: breed success: true` :::
To use typescript for your chain types file include it in the `src` folder (e.g. `./src/types.ts`), run `yarn build` and then point to the generated js file located in the `dist` folder.
@@ -171,7 +171,7 @@ Things to note about using the chain types file with extension `.ts` or `.js`:
Here is an example of a `.ts` chain types file:
- `ts import { typesBundleDeprecated } from "moonbeam-types-bundle" export default { typesBundle: typesBundleDeprecated }; `
+::: code-tabs @tab types.ts `ts import { typesBundleDeprecated } from "moonbeam-types-bundle" export default { typesBundle: typesBundleDeprecated }; ` :::
## Custom Data Sources
@@ -197,6 +197,6 @@ Users can add a `filter` on `dataSources` to decide which data source to run on
Below is an example that shows different data sources for both the Polkadot and Kusama networks.
- `yaml --- network: endpoint: 'wss://polkadot.api.onfinality.io/public-ws' #Create a template to avoid redundancy definitions: mapping: &mymapping handlers: - handler: handleBlock kind: substrate/BlockHandler dataSources: - name: polkadotRuntime kind: substrate/Runtime filter: #Optional specName: polkadot startBlock: 1000 mapping: *mymapping #use template here - name: kusamaRuntime kind: substrate/Runtime filter: specName: kusama startBlock: 12000 mapping: *mymapping # can reuse or change `
+::: code-tabs @tab v0.0.1 `yaml --- network: endpoint: 'wss://polkadot.api.onfinality.io/public-ws' #Create a template to avoid redundancy definitions: mapping: &mymapping handlers: - handler: handleBlock kind: substrate/BlockHandler dataSources: - name: polkadotRuntime kind: substrate/Runtime filter: #Optional specName: polkadot startBlock: 1000 mapping: *mymapping #use template here - name: kusamaRuntime kind: substrate/Runtime filter: specName: kusama startBlock: 12000 mapping: *mymapping # can reuse or change `
-
+:::
diff --git a/docs/it/build/mapping.md b/docs/it/build/mapping.md
index 2dc8590b25b..3f65df618ab 100644
--- a/docs/it/build/mapping.md
+++ b/docs/it/build/mapping.md
@@ -67,9 +67,9 @@ Il nostro obiettivo è coprire tutte le origini dati per gli utenti per i gestor
Queste sono le interfacce che attualmente supportiamo:
-- [api.query.<module>.<method>()](https://polkadot.js.org/docs/api/start/api.query) will query the current block.
-- [api.query.<module>.<method>.multi()](https://polkadot.js.org/docs/api/start/api.query.multi/#multi-queries-same-type) will make multiple queries of the same type at the current block.
-- [api.queryMulti()](https://polkadot.js.org/docs/api/start/api.query.multi/#multi-queries-distinct-types) will make multiple queries of different types at the current block.
+- [api.query.<module>.<method>()](https://polkadot.js.org/docs/api/start/api.query) will query the **current** block.
+- [api.query.<module>.<method>.multi()](https://polkadot.js.org/docs/api/start/api.query.multi/#multi-queries-same-type) will make multiple queries of the **same** type at the current block.
+- [api.queryMulti()](https://polkadot.js.org/docs/api/start/api.query.multi/#multi-queries-distinct-types) will make multiple queries of **different** types at the current block.
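
To make the supported interfaces listed above concrete, here is a minimal block-handler sketch (not taken from a real project) that combines a single storage read with an `api.queryMulti()` call. It assumes the `api` and `logger` globals that SubQuery injects into mapping handlers; the storage items (`balances.totalIssuance`, `staking.validatorCount`, `timestamp.now`) are simply examples of queries with different result types.

```ts
import { SubstrateBlock } from "@subql/types";

export async function handleBlock(block: SubstrateBlock): Promise<void> {
  // Single query: reads storage as of the block currently being processed.
  const issuance = await api.query.balances.totalIssuance();

  // Multiple queries of different types, resolved together at the same block.
  const [validatorCount, now] = await api.queryMulti([
    api.query.staking.validatorCount,
    api.query.timestamp.now,
  ]);

  logger.info(
    `#${block.block.header.number.toString()}: issuance=${issuance.toString()}, ` +
      `validators=${validatorCount.toString()}, time=${now.toString()}`
  );
}
```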
Queste sono le interfacce che attualmente **NON** supportiamo:
diff --git a/docs/it/build/substrate-evm.md b/docs/it/build/substrate-evm.md
index c7897ced956..f83402c1538 100644
--- a/docs/it/build/substrate-evm.md
+++ b/docs/it/build/substrate-evm.md
@@ -74,7 +74,7 @@ Works in the same way as [substrate/EventHandler](../create/mapping/#event-handl
| ------ | ------------ | --------------------------------------------------------------- | ------------------------------------------------------------------------------------------------------------------------------------------------ |
| topics | String array | Transfer(address indexed from,address indexed to,uint256 value) | The topics filter follows the Ethereum JSON-RPC log filters, more documentation can be found [here](https://docs.ethers.io/v5/concepts/events/). |
-Note on topics:
+**Note on topics:**
There are a couple of improvements from basic log filters:
- Topics don't need to be 0 padded
@@ -97,17 +97,17 @@ dataSources:
- kind: substrate/Moonbeam
startBlock: 752073
processor:
- file: './node_modules/@subql/contract-processors/dist/moonbeam.js'
+ file: "./node_modules/@subql/contract-processors/dist/moonbeam.js"
options:
# Must be a key of assets
abi: erc20
# Contract address (or recipient if transfer) to filter, if `null` should be for contract creation
- address: '0x6bd193ee6d2104f14f94e2ca6efefae561a4334b'
+ address: "0x6bd193ee6d2104f14f94e2ca6efefae561a4334b"
assets:
erc20:
- file: './erc20.abi.json'
+ file: "./erc20.abi.json"
mapping:
- file: './dist/index.js'
+ file: "./dist/index.js"
handlers:
- handler: handleMoonriverEvent
kind: substrate/MoonbeamEvent
@@ -122,7 +122,7 @@ dataSources:
# function: '0x7ff36ab500000000000000000000000000000000000000000000000000000000'
# function: approve(address,uint256)
function: approve(address to,uint256 value)
- from: '0x6bd193ee6d2104f14f94e2ca6efefae561a4334b'
+ from: "0x6bd193ee6d2104f14f94e2ca6efefae561a4334b"
```
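
As a side note on the human-readable filters above, both the `topics` entries and the `function` filter correspond to hashed values in raw Ethereum JSON-RPC filters. The short sketch below (assuming ethers v5 is installed) shows that correspondence; it is purely illustrative and not part of the processor configuration.

```ts
import { utils } from "ethers";

// topic0 of a log filter is the keccak256 hash of the canonical event signature
// (parameter names and `indexed` keywords are stripped before hashing).
const transferTopic = utils.id("Transfer(address,address,uint256)");
console.log(transferTopic);

// A raw `function` filter uses the 4-byte selector: the first 4 bytes of the
// hashed canonical function signature.
const approveSelector = utils.id("approve(address,uint256)").slice(0, 10);
console.log(approveSelector); // 0x095ea7b3
```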
## Known Limitations
diff --git a/docs/it/faqs/faqs.md b/docs/it/faqs/faqs.md
index eab94389f0e..89ba9691a63 100644
--- a/docs/it/faqs/faqs.md
+++ b/docs/it/faqs/faqs.md
@@ -16,7 +16,7 @@ SubQuery also provides free, production grade hosting of projects for developers
**The SubQuery Network**
-The SubQuery Network allows developers to completely decentralise their infrastructure stack. It is the most open, performant, reliable, and scalable data service for dApps. The SubQuery Network indexes and services data to the global community in an incentivised and verifiable way. After publishing your project to the SubQuery Network, anyone can index and host it - providing data to users around the world faster and reliably.
+The SubQuery Network allows developers to completely decentralise their infrastructure stack. It is the most open, performant, reliable, and scalable data service for dApps. The SubQuery Network indexes and services data to the global community in an incentivised and verifiable way. After publishing your project to the SubQuery Network, anyone can index and host it - providing data to users around the world faster and reliably.
More information [here](/subquery_network/introduction.md).
@@ -26,7 +26,7 @@ The best way to get started with SubQuery is to try out our [Hello World tutoria
## Come posso contribuire o fornire feedback a SubQuery?
-Amiamo i contributi e il feedback della comunità. To contribute the code, fork the repository of your interest and make your changes. Quindi inviate una PR o una richiesta di estrazione. Don't forget to test as well. Also check out our contributions guidelines.
+Amiamo i contributi e il feedback della comunità. To contribute the code, fork the repository of your interest and make your changes. Quindi inviate una PR o una richiesta di estrazione. Don't forget to test as well. Also check out our [contributions guidelines](../miscellaneous/contributing.html).
To give feedback, contact us at hello@subquery.network or jump onto our [discord channel](https://discord.com/invite/78zg8aBSMG).
@@ -76,7 +76,6 @@ subql-node -f . --force-clean --subquery-name=
Note that it is recommended to use `--force-clean` when changing the `startBlock` within the project manifest (`project.yaml`) in order to begin reindexing from the configured block. If `startBlock` is changed without a `--force-clean` of the project, then the indexer will continue indexing with the previously configured `startBlock`.
-
## How can I optimise my project to speed it up?
Performance is a crucial factor in each project. Fortunately, there are several things you could do to improve it. Here is the list of some suggestions:
@@ -89,13 +88,13 @@ Performance is a crucial factor in each project. Fortunately, there are several
- Set the start block to when the contract was initialised.
- Always use a [dictionary](../tutorials_examples/dictionary.html#how-does-a-subquery-dictionary-work) (we can help create one for your new network).
- Optimise your schema design, keep it as simple as possible.
- - Try to reduce unnecessary fields and columns.
- - Create indexes as needed.
+ - Try to reduce unnecessary fields and columns.
+ - Create indexes as needed.
- Use parallel/batch processing as often as possible.
- - Use `api.queryMulti()` to optimise Polkadot API calls inside mapping functions and query them in parallel. This is a faster way than a loop.
- - Use `Promise.all()`. In case of multiple async functions, it is better to execute them and resolve in parallel.
- - If you want to create a lot of entities within a single handler, you can use `store.bulkCreate(entityName: string, entities: Entity[])`. You can create them in parallel, no need to do this one by one.
+ - Use `api.queryMulti()` to optimise Polkadot API calls inside mapping functions and query them in parallel. This is a faster way than a loop.
+ - Use `Promise.all()`. In case of multiple async functions, it is better to execute them and resolve in parallel.
+  - If you want to create a lot of entities within a single handler, you can use `store.bulkCreate(entityName: string, entities: Entity[])`. You can create them in parallel, no need to do this one by one (see the sketch after this list).
- Making API calls to query state can be slow. You could try to minimise calls where possible and to use `extrinsic/transaction/event` data.
- Use `worker threads` to move block fetching and block processing into its own worker thread. It could speed up indexing by up to 4 times (depending on the particular project). You can easily enable it using the `--workers=<number>` flag. Note that the number of available CPU cores strictly limits the usage of worker threads. For now, it is only available for Substrate and Cosmos and will soon be integrated for Avalanche.
- Note that `JSON.stringify` doesn’t support native `BigInts`. Our logging library will do this internally if you attempt to log an object. We are looking at a workaround for this.
-- Use a convenient `modulo` filter to run a handler only once to a specific block. This filter allows handling any given number of blocks, which is extremely useful for grouping and calculating data at a set interval. For instance, if modulo is set to 50, the block handler will run on every 50 blocks. It provides even more control over indexing data to developers and can be implemented like so below in your project manifest.
\ No newline at end of file
+- Use a convenient `modulo` filter to run a handler only once to a specific block. This filter allows handling any given number of blocks, which is extremely useful for grouping and calculating data at a set interval. For instance, if modulo is set to 50, the block handler will run on every 50 blocks. It provides even more control over indexing data to developers and can be implemented like so below in your project manifest.
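
As a sketch of the parallel/batch advice above (illustrative only, not code from the starter project), the handler below collects every `balances.transfer` extrinsic in a block and persists the resulting entities with a single `store.bulkCreate()` call instead of awaiting each `save()` one by one. It assumes a `Transfer` entity shaped like the one generated from the starter schema used in the quick start guides.

```ts
import { SubstrateBlock } from "@subql/types";
import { Balance } from "@polkadot/types/interfaces";
import { Transfer } from "../types"; // assumes the starter project's generated Transfer entity

export async function handleBlock(block: SubstrateBlock): Promise<void> {
  const height = block.block.header.number.toBigInt();

  // Build all entities for this block in memory first...
  const transfers = block.block.extrinsics
    .filter((ex) => ex.method.section === "balances" && ex.method.method === "transfer")
    .map((ex, idx) => {
      const transfer = new Transfer(`${height}-${idx}`);
      transfer.blockNumber = height;
      transfer.from = ex.signer.toString();
      transfer.to = ex.method.args[0].toString(); // dest, left unparsed in this sketch
      transfer.amount = (ex.method.args[1] as Balance).toBigInt();
      return transfer;
    });

  // ...then write them in one batched call rather than one save() per entity.
  if (transfers.length > 0) {
    await store.bulkCreate("Transfer", transfers);
  }
}
```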
diff --git a/docs/it/miscellaneous/contributing.md b/docs/it/miscellaneous/contributing.md
index 8f148e18c7a..c39fc0fdd5a 100644
--- a/docs/it/miscellaneous/contributing.md
+++ b/docs/it/miscellaneous/contributing.md
@@ -2,7 +2,7 @@
Benvenuto e un grande grazie per aver considerato di contribuire a questo progetto SubQuery! Insieme possiamo aprire la strada verso un futuro più decentralizzato.
-::: info Note This documentation is actively maintained by the SubQuery team. We welcome your contributions. You can do so by forking our GitHub project and making changes to all the documentation markdown files under the `docs` directory. :::
+::: tip Note This documentation is actively maintained by the SubQuery team. We welcome your contributions. You can do so by forking our GitHub project and making changes to all the documentation markdown files under the `docs` directory. :::
What follows is a set of guidelines (not rules) for contributing to SubQuery. Following these guidelines will help us make the contribution process easy and effective for everyone involved. It also communicates that you agree to respect the time of the developers managing and developing this project. In return, we will reciprocate that respect by addressing your issue, considering changes, collaborating on improvements, and helping you finalise your pull requests.
@@ -14,8 +14,8 @@ We take our open source community projects and responsibility seriously and hold
Contributions to our repositories are made through Issues and Pull Requests (PRs). A few general guidelines that cover both:
-* Cerca le questioni e i PR esistenti prima di crearne uno tuo.
-* Lavoriamo duramente per assicurarci che i problemi siano gestiti tempestivamente ma, a seconda dell'impatto, potrebbe essere necessario un po' di tempo per indagare sulla causa principale. Un'amichevole menzione @ nel thread dei commenti al presentatore o a un collaboratore può aiutare ad attirare l'attenzione se il tuo problema è bloccante.
+- Cerca le questioni e i PR esistenti prima di crearne uno tuo.
+- Lavoriamo duramente per assicurarci che i problemi siano gestiti tempestivamente ma, a seconda dell'impatto, potrebbe essere necessario un po' di tempo per indagare sulla causa principale. Un'amichevole menzione @ nel thread dei commenti al presentatore o a un collaboratore può aiutare ad attirare l'attenzione se il tuo problema è bloccante.
## Come contribuire
@@ -23,32 +23,32 @@ Contributions to our repositories are made through Issues and Pull Requests (PRs
Bugs are tracked as GitHub issues. When logging an issue, explain the problem and include additional details to help maintainers reproduce the problem:
-* Usate un titolo chiaro e descrittivo per la questione per identificare il problema.
-* Descrivete i passi esatti per riprodurre il problema.
-* Descrivi il comportamento che hai osservato dopo aver seguito i passi.
-* Spiega quale comportamento ti aspettavi di vedere invece e perché.
-* Includi degli screenshot se possibile.
+- Usate un titolo chiaro e descrittivo per la questione per identificare il problema.
+- Descrivete i passi esatti per riprodurre il problema.
+- Descrivi il comportamento che hai osservato dopo aver seguito i passi.
+- Spiega quale comportamento ti aspettavi di vedere invece e perché.
+- Includi degli screenshot se possibile.
### Invio di richieste di pull
In general, we follow the "fork-and-pull" Git workflow:
-* Fork the repository to your own Github account.
-* Clone the project to your machine.
-* Create a branch locally with a succinct but descriptive name.
-* Commit changes to the branch.
-* Following any formatting and testing guidelines specific to this repo.
-* Push changes to your fork.
-* Open a PR in our repository.
+- Fork the repository to your own Github account.
+- Clone the project to your machine.
+- Create a branch locally with a succinct but descriptive name.
+- Commit changes to the branch.
+- Following any formatting and testing guidelines specific to this repo.
+- Push changes to your fork.
+- Open a PR in our repository.
## Convenzioni di codifica
### Messaggi di commit di Git
-* Use the present tense ("Add feature" not "Added feature").
-* Use the imperative mood ("Move cursor to..." not "Moves cursor to...").
-* Limit the first line to 72 characters or less.
+- Use the present tense ("Add feature" not "Added feature").
+- Use the imperative mood ("Move cursor to..." not "Moves cursor to...").
+- Limit the first line to 72 characters or less.
### JavaScript Styleguide
-* All JavaScript code is linted with Prettier and ESLint.
+- All JavaScript code is linted with Prettier and ESLint.
diff --git a/docs/it/quickstart/helloworld-localhost.md b/docs/it/quickstart/helloworld-localhost.md
index 9d3b421f191..b369894390b 100644
--- a/docs/it/quickstart/helloworld-localhost.md
+++ b/docs/it/quickstart/helloworld-localhost.md
@@ -88,8 +88,8 @@ cd subqlHelloWorld
Ora fate un'installazione di yarn o node per installare le varie dipendenze.
- ```shell yarn install ```
- ```bash npm install ```
+::: code-tabs @tab:active yarn `shell yarn install `
+@tab npm `bash npm install ` :::
An example of `yarn install`
@@ -109,8 +109,8 @@ success Saved lockfile.
Ora esegui `yarn codegen` per generare Typescript dallo schema GraphQL.
- ```shell yarn codegen ```
- ```bash npm run-script codegen ```
+::: code-tabs @tab:active yarn `shell yarn codegen `
+@tab npm `bash npm run-script codegen ` :::
An example of `yarn codegen`
@@ -133,8 +133,8 @@ $ ./node_modules/.bin/subql codegen
Il prossimo passo è costruire il codice con `yarn build`.
- ```shell yarn build ```
- ```bash npm run-script build ```
+::: code-tabs @tab:active yarn `shell yarn build `
+@tab npm `bash npm run-script build ` :::
An example of `yarn build`
diff --git a/docs/it/quickstart/quickstart-avalanche.md b/docs/it/quickstart/quickstart-avalanche.md
index ebee0b8c298..dea3d8927fc 100644
--- a/docs/it/quickstart/quickstart-avalanche.md
+++ b/docs/it/quickstart/quickstart-avalanche.md
@@ -59,8 +59,8 @@ After the initialisation process is complete, you should see a folder with your
Last, under the project directory, run following command to install the new project's dependencies.
- ```shell cd PROJECT_NAME yarn install ```
- ```shell cd PROJECT_NAME npm install ```
+::: code-tabs @tab:active yarn `shell cd PROJECT_NAME yarn install `
+@tab npm `shell cd PROJECT_NAME npm install ` :::
## Making Changes to your Project
@@ -92,8 +92,8 @@ type PangolinApproval @entity {
**Important: When you make any changes to the schema file, please ensure that you regenerate your types directory. Do this now.**
- ```shell yarn codegen ```
- ```shell npm run-script codegen ```
+::: code-tabs @tab:active yarn `shell yarn codegen `
+@tab npm `shell npm run-script codegen ` :::
You'll find the generated models in the `/src/types/models` directory. For more information about the `schema.graphql` file, check out our documentation under [Build/GraphQL Schema](../build/graphql.md)
@@ -169,7 +169,7 @@ For more information about mapping functions, check out our documentation under
In order to run your new SubQuery Project we first need to build our work. Run the build command from the project's root directory.
- ```shell yarn build ``` ```shell npm run-script build ```
+::: code-tabs @tab:active yarn `shell yarn build ` @tab npm `shell npm run-script build ` :::
**Important: Whenever you make changes to your mapping functions, you'll need to rebuild your project**
@@ -183,7 +183,7 @@ All configuration that controls how a SubQuery node is run is defined in this `d
Under the project directory run following command:
- ```shell yarn start:docker ``` ```shell npm run-script start:docker ```
+::: code-tabs @tab:active yarn `shell yarn start:docker ` @tab npm `shell npm run-script start:docker ` :::
It may take some time to download the required packages ([`@subql/node`](https://www.npmjs.com/package/@subql/node), [`@subql/query`](https://www.npmjs.com/package/@subql/query), and Postgres) for the first time but soon you'll see a running SubQuery node. Be patient here.
diff --git a/docs/it/quickstart/quickstart-cosmos.md b/docs/it/quickstart/quickstart-cosmos.md
index 32366fbd189..d41507a9961 100644
--- a/docs/it/quickstart/quickstart-cosmos.md
+++ b/docs/it/quickstart/quickstart-cosmos.md
@@ -44,8 +44,8 @@ After the initialisation process is complete, you should see a folder with your
Last, under the project directory, run following command to install the new project's dependencies.
- ```shell cd PROJECT_NAME yarn install ```
- ```shell cd PROJECT_NAME npm install ```
+::: code-tabs @tab:active yarn `shell cd PROJECT_NAME yarn install `
+@tab npm `shell cd PROJECT_NAME npm install ` :::
## Making Changes to your Project
@@ -75,8 +75,8 @@ type Vote @entity {
**Important: When you make any changes to the schema file, please ensure that you regenerate your types directory. Do this now.**
- ```shell yarn codegen ```
- ```shell npm run-script codegen ```
+::: code-tabs @tab:active yarn `shell yarn codegen `
+@tab npm `shell npm run-script codegen ` :::
You'll find the generated models in the `/src/types/models` directory. For more information about the `schema.graphql` file, check out our documentation under [Build/GraphQL Schema](../build/graphql.md)
@@ -145,7 +145,7 @@ For more information about mapping functions, check out our documentation under
In order to run your new SubQuery Project we first need to build our work. Run the build command from the project's root directory.
- ```shell yarn build ``` ```shell npm run-script build ```
+::: code-tabs @tab:active yarn `shell yarn build ` @tab npm `shell npm run-script build ` :::
**Important: Whenever you make changes to your mapping functions, you'll need to rebuild your project**
@@ -159,7 +159,7 @@ All configuration that controls how a SubQuery node is run is defined in this `d
Under the project directory run following command:
- ```shell yarn start:docker ``` ```shell npm run-script start:docker ```
+::: code-tabs @tab:active yarn `shell yarn start:docker ` @tab npm `shell npm run-script start:docker ` :::
It may take some time to download the required packages ([`@subql/node`](https://www.npmjs.com/package/@subql/node), [`@subql/query`](https://www.npmjs.com/package/@subql/query), and Postgres) for the first time but soon you'll see a running SubQuery node. Be patient here.
@@ -173,10 +173,9 @@ For a new SubQuery starter project, you can try the following query to get a tas
```graphql
query {
- votes(
+ votes(
first: 5
- orderBy: BLOCK_HEIGHT_DESC
- # filter: {proposalID: {equalTo: "4"}}
+ orderBy: BLOCK_HEIGHT_DESC # filter: {proposalID: {equalTo: "4"}}
) {
nodes {
id
diff --git a/docs/it/quickstart/quickstart-polkadot.md b/docs/it/quickstart/quickstart-polkadot.md
index 6c68a29a9cb..66cd1cd66be 100644
--- a/docs/it/quickstart/quickstart-polkadot.md
+++ b/docs/it/quickstart/quickstart-polkadot.md
@@ -43,10 +43,10 @@ subql init
You'll be asked certain questions as the SubQuery project is initialised:
- Project name: A project name for your SubQuery project
-- Network family: The layer-1 blockchain network family that this SubQuery project will be developed to index. Use the arrow keys to select from the available options. For this guide, we will use *"Substrate"*
-- Network: The specific network that this SubQuery project will be developed to index. Use the arrow keys to select from the available options. For this guide, we will use *"Polkadot"*
-- Template project: Select a SubQuery template project that will provide a starting point to begin development. We suggest selecting the *"subql-starter"* project.
-- RPC endpoint: Provide an HTTPS URL to a running RPC endpoint that will be used by default for this project. You can quickly access public endpoints for different Polkadot networks, create your own private dedicated node using [OnFinality](https://app.onfinality.io) or just use the default Polkadot endpoint. This RPC node must be an archive node (have the full chain state). For this guide, we will use the default value *"https://polkadot.api.onfinality.io"*
+- Network family: The layer-1 blockchain network family that this SubQuery project will be developed to index. Use the arrow keys to select from the available options. For this guide, we will use _"Substrate"_
+- Network: The specific network that this SubQuery project will be developed to index. Use the arrow keys to select from the available options. For this guide, we will use _"Polkadot"_
+- Template project: Select a SubQuery template project that will provide a starting point to begin development. We suggest selecting the _"subql-starter"_ project.
+- RPC endpoint: Provide an HTTPS URL to a running RPC endpoint that will be used by default for this project. You can quickly access public endpoints for different Polkadot networks, create your own private dedicated node using [OnFinality](https://app.onfinality.io) or just use the default Polkadot endpoint. This RPC node must be an archive node (have the full chain state). For this guide, we will use the default value _"https://polkadot.api.onfinality.io"_
- Git repository: Provide a Git URL to a repo that this SubQuery project will be hosted in (when hosted in SubQuery Explorer) or accept the provided default.
- Authors: Enter the owner of this SubQuery project here (e.g. your name!) or accept the provided default.
- Description: Provide a short paragraph about your project that describes what data it contains and what users can do with it or accept the provided default.
@@ -57,8 +57,8 @@ After the initialisation process is complete, you should see that a folder with
Last, under the project directory, run the following command to install the new project's dependencies.
- ```shell cd PROJECT_NAME yarn install ```
- ```shell cd PROJECT_NAME npm install ```
+::: code-tabs @tab:active yarn `shell cd PROJECT_NAME yarn install `
+@tab npm `shell cd PROJECT_NAME npm install ` :::
## Making Changes to your Project
@@ -88,8 +88,8 @@ type Transfer @entity {
**Important: When you make any changes to the schema file, please ensure that you regenerate your types directory.**
- ```shell yarn codegen ```
- ```shell npm run-script codegen ```
+::: code-tabs @tab:active yarn `shell yarn codegen `
+@tab npm `shell npm run-script codegen ` :::
You'll find the generated models in the `/src/types/models` directory. For more information about the `schema.graphql` file, check out our documentation under [Build/GraphQL Schema](../build/graphql.md)
@@ -133,22 +133,22 @@ import { Transfer } from "../types";
import { Balance } from "@polkadot/types/interfaces";
export async function handleEvent(event: SubstrateEvent): Promise<void> {
- // Get data from the event
- // The balances.transfer event has the following payload \[from, to, value\]
- // logger.info(JSON.stringify(event));
- const from = event.event.data[0];
- const to = event.event.data[1];
- const amount = event.event.data[2];
-
- // Create the new transfer entity
- const transfer = new Transfer(
- `${event.block.block.header.number.toNumber()}-${event.idx}`,
- );
- transfer.blockNumber = event.block.block.header.number.toBigInt();
- transfer.from = from.toString();
- transfer.to = to.toString();
- transfer.amount = (amount as Balance).toBigInt();
- await transfer.save();
+ // Get data from the event
+ // The balances.transfer event has the following payload \[from, to, value\]
+ // logger.info(JSON.stringify(event));
+ const from = event.event.data[0];
+ const to = event.event.data[1];
+ const amount = event.event.data[2];
+
+ // Create the new transfer entity
+ const transfer = new Transfer(
+ `${event.block.block.header.number.toNumber()}-${event.idx}`
+ );
+ transfer.blockNumber = event.block.block.header.number.toBigInt();
+ transfer.from = from.toString();
+ transfer.to = to.toString();
+ transfer.amount = (amount as Balance).toBigInt();
+ await transfer.save();
}
```
@@ -160,7 +160,7 @@ For more information about mapping functions, check out our documentation under
In order to run your new SubQuery Project we first need to build our work. Run the build command from the project's root directory.
- ```shell yarn build ``` ```shell npm run-script build ```
+::: code-tabs @tab:active yarn `shell yarn build ` @tab npm `shell npm run-script build ` :::
**Important: Whenever you make changes to your mapping functions, you will need to rebuild your project**
@@ -174,7 +174,7 @@ All configuration that controls how a SubQuery node is run is defined in the `do
Under the project directory, run the following command:
- ```shell yarn start:docker ``` ```shell npm run-script start:docker ```
+::: code-tabs @tab:active yarn `shell yarn start:docker ` @tab npm `shell npm run-script start:docker ` :::
It may take some time to download the required packages ([`@subql/node`](https://www.npmjs.com/package/@subql/node), [`@subql/query`](https://www.npmjs.com/package/@subql/query), and Postgres) for the first time but soon you should see a running SubQuery node in the terminal screen.
@@ -189,10 +189,7 @@ For a new SubQuery starter project, try the following query to understand how it
```graphql
{
query {
- transfers(
- first: 10,
- orderBy: AMOUNT_DESC
- ) {
+ transfers(first: 10, orderBy: AMOUNT_DESC) {
nodes {
id
amount
diff --git a/docs/it/quickstart/quickstart-terra.md b/docs/it/quickstart/quickstart-terra.md
index 4baaf4b9b4e..47c85151a72 100644
--- a/docs/it/quickstart/quickstart-terra.md
+++ b/docs/it/quickstart/quickstart-terra.md
@@ -45,11 +45,11 @@ subql init
You'll be asked certain questions as the SubQuery project is initialised:
- Project Name: A name for your SubQuery project
-- Network Family: The layer-1 blockchain network family that this SubQuery project will be developed to index, use the arrow keys on your keyboard to select from the options, for this guide we will use *"Terra"*
-- Network: The specific network that this SubQuery project will be developed to index, use the arrow keys on your keyboard to select from the options, for this guide we will use *"Terra"*
-- Template: Select a SubQuery project template that will provide a starting point to begin development, we suggest selecting the *"Starter project"*
+- Network Family: The layer-1 blockchain network family that this SubQuery project will be developed to index, use the arrow keys on your keyboard to select from the options, for this guide we will use _"Terra"_
+- Network: The specific network that this SubQuery project will be developed to index, use the arrow keys on your keyboard to select from the options, for this guide we will use _"Terra"_
+- Template: Select a SubQuery project template that will provide a starting point to begin development, we suggest selecting the _"Starter project"_
- Git repository (Optional): Provide a Git URL to a repo that this SubQuery project will be hosted in (when hosted in SubQuery Explorer)
-- RPC endpoint (Required): Provide a HTTPS URL to a running RPC endpoint that will be used by default for this project. This RPC node must be an archive node (have the full chain state). For this guide we will use the default value *"https://terra-columbus-5.beta.api.onfinality.io"*
+- RPC endpoint (Required): Provide a HTTPS URL to a running RPC endpoint that will be used by default for this project. This RPC node must be an archive node (have the full chain state). For this guide we will use the default value _"https://terra-columbus-5.beta.api.onfinality.io"_
- Authors (Required): Enter the owner of this SubQuery project here (e.g. your name!)
- Description (Optional): You can provide a short paragraph about your project that describe what data it contains and what users can do with it
- Version (Required): Enter a custom version number or use the default (`1.0.0`)
@@ -59,8 +59,8 @@ After the initialisation process is complete, you should see a folder with your
Last, under the project directory, run following command to install the new project's dependencies.
- ```shell cd PROJECT_NAME yarn install ```
- ```shell cd PROJECT_NAME npm install ```
+::: code-tabs @tab:active yarn `shell cd PROJECT_NAME yarn install `
+@tab npm `shell cd PROJECT_NAME npm install ` :::
## Making Changes to your Project
@@ -91,8 +91,8 @@ type Transfer @entity {
**Important: When you make any changes to the schema file, please ensure that you regenerate your types directory. Do this now.**
- ```shell yarn codegen ```
- ```shell npm run-script codegen ```
+::: code-tabs @tab:active yarn `shell yarn codegen `
+@tab npm `shell npm run-script codegen ` :::
You'll find the generated models in the `/src/types/models` directory. For more information about the `schema.graphql` file, check out our documentation under [Build/GraphQL Schema](../build/graphql.md)
@@ -143,30 +143,30 @@ import { MsgExecuteContract } from "@terra-money/terra.js";
export async function handleEvent(
event: TerraEvent
): Promise<void> {
- // Print debugging data from the event
- // logger.info(JSON.stringify(event));
-
- // Create the new transfer entity with a unique ID
- const transfer = new Transfer(
- `${event.tx.tx.txhash}-${event.msg.idx}-${event.idx}`
- );
- transfer.blockHeight = BigInt(event.block.block.block.header.height);
- transfer.txHash = event.tx.tx.txhash;
- for (const attr of event.event.attributes) {
- switch (attr.key) {
- case "sender":
- transfer.sender = attr.value;
- break;
- case "recipient":
- transfer.recipient = attr.value;
- break;
- case "amount":
- transfer.amount = attr.value;
- break;
- default:
- }
+ // Print debugging data from the event
+ // logger.info(JSON.stringify(event));
+
+ // Create the new transfer entity with a unique ID
+ const transfer = new Transfer(
+ `${event.tx.tx.txhash}-${event.msg.idx}-${event.idx}`
+ );
+ transfer.blockHeight = BigInt(event.block.block.block.header.height);
+ transfer.txHash = event.tx.tx.txhash;
+ for (const attr of event.event.attributes) {
+ switch (attr.key) {
+ case "sender":
+ transfer.sender = attr.value;
+ break;
+ case "recipient":
+ transfer.recipient = attr.value;
+ break;
+ case "amount":
+ transfer.amount = attr.value;
+ break;
+ default:
}
- await transfer.save();
+ }
+ await transfer.save();
}
```
@@ -178,7 +178,7 @@ For more information about mapping functions, check out our documentation under
In order to run your new SubQuery Project we first need to build our work. Run the build command from the project's root directory.
- ```shell yarn build ``` ```shell npm run-script build ```
+::: code-tabs @tab:active yarn `shell yarn build ` @tab npm `shell npm run-script build ` :::
**Important: Whenever you make changes to your mapping functions, you'll need to rebuild your project**
@@ -192,7 +192,7 @@ All configuration that controls how a SubQuery node is run is defined in this `d
Under the project directory run following command:
- ```shell yarn start:docker ``` ```shell npm run-script start:docker ```
+::: code-tabs @tab:active yarn `shell yarn start:docker ` @tab npm `shell npm run-script start:docker ` :::
It may take some time to download the required packages ([`@subql/node`](https://www.npmjs.com/package/@subql/node), [`@subql/query`](https://www.npmjs.com/package/@subql/query), and Postgres) for the first time but soon you'll see a running SubQuery node. Be patient here.
@@ -207,10 +207,7 @@ For a new SubQuery starter project, you can try the following query to get a tas
```graphql
{
query {
- transfers(
- first: 10,
- orderBy: ID_DESC
- ) {
+ transfers(first: 10, orderBy: ID_DESC) {
nodes {
id
txHash
diff --git a/docs/it/quickstart/quickstart.md b/docs/it/quickstart/quickstart.md
index 7b73707289a..d80fced0f30 100644
--- a/docs/it/quickstart/quickstart.md
+++ b/docs/it/quickstart/quickstart.md
@@ -89,8 +89,8 @@ After you complete the initialisation process, you will see a folder with your p
Finally, run the following command to install the new project’s dependencies from within the new project's directory.
- ```shell cd PROJECT_NAME yarn install ```
- ```shell cd PROJECT_NAME npm install ```
+::: code-tabs @tab:active yarn `shell cd PROJECT_NAME yarn install `
+@tab npm `shell cd PROJECT_NAME npm install ` :::
You have now initialised your first SubQuery project with just a few simple steps. Let’s now customise the standard template project for a specific blockchain of interest.
@@ -104,4 +104,4 @@ There are 3 important files that need to be modified. These are:
2. The Project Manifest in `project.yaml`.
3. The Mapping functions in `src/mappings/` directory.
-SubQuery supports various blockchain networks and provides a dedicated guide for each of them. Select your preferred blockchain under 2. Specific Chains and continue the quick start guide.
\ No newline at end of file
+SubQuery supports various blockchain networks and provides a dedicated guide for each of them. Select your preferred blockchain under 2. Specific Chains and continue the quick start guide.
diff --git a/docs/it/run_publish/connect.md b/docs/it/run_publish/connect.md
index 2db154b8305..3e5ecf629b6 100644
--- a/docs/it/run_publish/connect.md
+++ b/docs/it/run_publish/connect.md
@@ -2,10 +2,10 @@
Once your deployment has successfully completed and our nodes have indexed your data from the chain, you'll be able to connect to your project via the displayed Query endpoint.
-![Progetto distribuito e sincronizzato](/assets/img/projects-deploy-sync.png)
+![Progetto distribuito e sincronizzato](/assets/img/projects_deploy_sync.png)
In alternativa, puoi cliccare sui tre puntini accanto al titolo del tuo progetto e visualizzarlo su SubQuery Explorer. There you can use the in browser playground to get started.
-![Projects in SubQuery Explorer](/assets/img/projects-explorer.png)
+![Projects in SubQuery Explorer](/assets/img/projects_explorer.png)
-::: info Note Learn more about the [GraphQL Query language.](./graphql.md) :::
+::: tip Note Learn more about the [GraphQL Query language.](./graphql.md) :::
diff --git a/docs/it/run_publish/query.md b/docs/it/run_publish/query.md
index 8ddeca7197c..95a5d54a0a9 100644
--- a/docs/it/run_publish/query.md
+++ b/docs/it/run_publish/query.md
@@ -12,4 +12,4 @@ Noterai anche che il SubQuery Explorer fornisce un parco giochi per scoprire i d
On the top right of the playground, you'll find a _Docs_ button that will open a documentation draw. Questa documentazione è generata automaticamente e vi aiuta a trovare quali entità e metodi potete interrogare.
-::: info Note Learn more about the [GraphQL Query language.](./graphql.md) :::
+::: tip Note Learn more about the [GraphQL Query language.](./graphql.md) :::
diff --git a/docs/it/run_publish/references.md b/docs/it/run_publish/references.md
index d8875888374..9330af125b2 100644
--- a/docs/it/run_publish/references.md
+++ b/docs/it/run_publish/references.md
@@ -21,10 +21,10 @@ COMMANDS
This command uses webpack to generate a bundle of a SubQuery project.
-| Options | Descrizione |
-| ------------------ | ---------------------------------------------------------------------------------------------------------- |
-| -l, --location | local folder of subquery project (if not in folder already) |
-| -o, --output | specify output folder of build e.g. build-folder |
+| Options                                            | Descrizione                                                  |
+| -------------------------------------------------- | ------------------------------------------------------------ |
+| -l, --location                                      | local folder of subquery project (if not in folder already)  |
+| -o, --output                                        | specify output folder of build e.g. build-folder              |
| --mode=(production \| prod \| development \| dev)    | [ default: production ]                                       |
- With `subql build` you can specify additional entry points in exports field although it will always build `index.ts` automatically.
@@ -106,7 +106,7 @@ This displays the current version.
### reindex
-:::warning In order to use this command, you require `@subql/node:v1.10.0`/`@subql/node-:v1.10.0` or above. :::
+:::warning In order to use this command, you require `@subql/node:v1.10.0`/`@subql/node-YOURNETWORK:v1.10.0` or above. :::
When using the reindex command, the historical feature must be enabled for the targeted project (`--disable-historical=false`). After starting the project, it will print a log stating whether historical indexing is enabled.
@@ -122,7 +122,7 @@ If the `targetHeight` is less than the declared starting height, it will execute
subql-node -f /example/subql-project reindex --targetHeight=30
```
-::: info Note
+::: tip Note
Once the command is executed and the state has been rolled back to the specified height, the application will exit. You can then start up the indexer to proceed again from this height.
:::
@@ -134,7 +134,7 @@ This command forces the project schemas and tables to be regenerated. It is help
`-f`, `--subquery` flag must be passed in, to set path of the targeted project.
-::: info Note Similar to `reindex` command, the application would exit upon completion. :::
+::: tip Note Similar to `reindex` command, the application would exit upon completion. :::
```shell
subql-node -f /example/subql-project force-clean
@@ -346,7 +346,7 @@ This will move block fetching and processing into a worker. By default, this fea
It is at an early experimental stage at the moment, but we plan to enable it by default. :::
-::: info Note
+::: tip Note
This feature is available for Substrate and Cosmos, and soon will be integrated for Avalanche.
:::
diff --git a/docs/it/run_publish/run.md b/docs/it/run_publish/run.md
index b98c2543378..562e49b153f 100644
--- a/docs/it/run_publish/run.md
+++ b/docs/it/run_publish/run.md
@@ -4,7 +4,7 @@ This guide works through how to run a local SubQuery node on your infrastructure
## Using Docker
-An alternative solution is to run a Docker Container, defined by the `docker-compose.yml` file. For a new project that has been just initialised you won't need to change anything here.
+An alternative solution is to run a **Docker Container**, defined by the `docker-compose.yml` file. For a new project that has just been initialised, you won't need to change anything here.
Under the project directory run the following command:
@@ -12,7 +12,7 @@ Under the project directory run the following command:
docker-compose pull && docker-compose up
```
-::: info Note It may take some time to download the required packages ([`@subql/node`](https://www.npmjs.com/package/@subql/node), [`@subql/query`](https://www.npmjs.com/package/@subql/query), and Postgres) for the first time but soon you'll see a running SubQuery node. :::
+::: tip Note It may take some time to download the required packages ([`@subql/node`](https://www.npmjs.com/package/@subql/node), [`@subql/query`](https://www.npmjs.com/package/@subql/query), and Postgres) for the first time but soon you'll see a running SubQuery node. :::
## Running an Indexer (subql/node)
@@ -32,90 +32,80 @@ CREATE EXTENSION IF NOT EXISTS btree_gist;
### Installation
-
-
+::: code-tabs
+@tab Substrate/Polkadot
```shell
# NPM
npm install -g @subql/node
```
-
-
+@tab Terra
```shell
# NPM
npm install -g @subql/node-terra
```
-
-
+@tab Avalanche
```shell
# NPM
npm install -g @subql/node-avalanche
```
-
-
+@tab Cosmos
```shell
# NPM
npm install -g @subql/node-cosmos
```
-
-
+@tab Algorand
```shell
# NPM
npm install -g @subql/node-algorand
```
-
-
+:::
::: danger Please note that we **DO NOT** encourage the use of `yarn global` due to its poor dependency management, which may lead to errors down the line. :::
Once installed, you can start a node with the following command:
-
-
+::: code-tabs
+@tab Substrate/Polkadot
```shell
subql-node
```
-
-
+@tab Terra
```shell
subql-node-terra
```
-
-
+@tab Avalanche
```shell
subql-node-avalanche
```
-
-
+@tab Cosmos
```shell
subql-node-cosmos
```
-
-
+@tab Algorand
```shell
subql-node-algorand
```
-
-
+:::
### Key Commands
@@ -123,43 +113,38 @@ The following commands will assist you to complete the configuration of a SubQue
#### Point to local project path
-
-
+::: code-tabs
+@tab Substrate/Polkadot
```shell
subql-node -f your-project-path
```
-
-
+@tab Terra
```shell
subql-node-terra -f your-project-path
```
-
-
+@tab Avalanche
```shell
subql-node-avalanche -f your-project-path
```
-
-
+@tab Cosmos
```shell
subql-node-cosmos -f your-project-path
```
-
-
+@tab Algorand
```shell
subql-node-algorand -f your-project-path
```
-
-
+:::
#### Connect to database
@@ -176,43 +161,38 @@ Depending on the configuration of your Postgres database (e.g. a different datab
#### Specify a configuration file
-
-
+::: code-tabs
+@tab Substrate/Polkadot
```shell
subql-node -c your-project-config.yml
```
-
-
+@tab Terra
```shell
subql-node-terra -c your-project-config.yml
```
-
-
+@tab Avalanche
```shell
subql-node-avalanche -c your-project-config.yml
```
-
-
+@tab Cosmos
```shell
subql-node-cosmos -c your-project-config.yml
```
-
-
+@tab Algorand
```shell
subql-node-algorand -c your-project-config.yml
```
-
-
+:::
This will point the query node to a manifest file which can be in YAML or JSON format.
@@ -230,43 +210,38 @@ When the indexer first indexes the chain, fetching single blocks will significan
#### Run in local mode
-
-
+::: code-tabs
+@tab Substrate/Polkadot
```shell
subql-node -f your-project-path --local
```
-
-
+@tab Terra
```shell
subql-node-terra -f your-project-path --local
```
-
-
+@tab Avalanche
```shell
subql-node-avalanche -f your-project-path --local
```
-
-
+@tab Cosmos
```shell
subql-node-cosmos -f your-project-path --local
```
-
-
+@tab Algorand
```shell
subql-node-algorand -f your-project-path --local
```
-
-
+:::
For debugging purposes, users can run the node in local mode. Switching to local mode will create Postgres tables in the default schema `public`.
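If you want to check what local mode created, one quick way is to list the tables in the `public` schema from `psql`. This is a generic Postgres query, not a SubQuery command; the table names you see depend on your project's entities.

```sql
-- list the tables the indexer created in the default schema
SELECT table_name
FROM information_schema.tables
WHERE table_schema = 'public';
```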
diff --git a/docs/it/run_publish/subscription.md b/docs/it/run_publish/subscription.md
index 9bbc2661f96..29cf660d492 100644
--- a/docs/it/run_publish/subscription.md
+++ b/docs/it/run_publish/subscription.md
@@ -6,7 +6,7 @@ SubQuery now also supports Graphql Subscriptions. Like queries, subscriptions en
Subscriptions are very useful when you want your client application to change data or show some new data as soon as that change occurs or the new data is available. Subscriptions allow you to _subscribe_ to your SubQuery project for changes.
-::: info Note Read more about [Subscriptions](https://www.apollographql.com/docs/react/data/subscriptions/). :::
+::: tip Note Read more about [Subscriptions](https://www.apollographql.com/docs/react/data/subscriptions/). :::
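As a rough illustration, a subscription is written like a regular query but issued with the `subscription` keyword. The entity name `transfers` and its fields below are assumptions borrowed from the starter project; check the generated schema in your playground's Docs tab for the names your project actually exposes.

```graphql
subscription {
  transfers {
    id
    amount
  }
}
```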
## How to Subscribe to an Entity
diff --git a/docs/it/run_publish/upgrade.md b/docs/it/run_publish/upgrade.md
index 8b905941746..dc785fab231 100644
--- a/docs/it/run_publish/upgrade.md
+++ b/docs/it/run_publish/upgrade.md
@@ -77,10 +77,10 @@ If you just want to upgrade to the latest indexer ([`@subql/node`](https://www.n
Una volta che il tuo deployment è stato completato con successo e i nostri nodi hanno indicizzato i tuoi dati dalla catena, sarai in grado di connetterti al tuo progetto tramite l'endpoint GraphQL Query visualizzato.
-![Progetto distribuito e sincronizzato](/assets/img/projects-deploy-sync.png)
+![Progetto distribuito e sincronizzato](/assets/img/projects_deploy_sync.png)
In alternativa, puoi cliccare sui tre puntini accanto al titolo del tuo progetto e visualizzarlo su SubQuery Explorer. There you can use the in browser playground to get started - [read more about how to use our Explorer here](../run_publish/query.md).
-![Projects in SubQuery Explorer](/assets/img/projects-explorer.png)
+![Projects in SubQuery Explorer](/assets/img/projects_explorer.png)
-::: info Note Learn more about the [GraphQL Query language.](./graphql.md) :::
+::: tip Note Learn more about the [GraphQL Query language.](./graphql.md) :::
diff --git a/docs/it/subquery_network/introduction.md b/docs/it/subquery_network/introduction.md
index 295d4a0e0d8..6394db677b8 100644
--- a/docs/it/subquery_network/introduction.md
+++ b/docs/it/subquery_network/introduction.md
@@ -18,22 +18,22 @@ There’s a role for everyone in the network, from highly technical developers t
Consumers will ask the SubQuery Network for specific data for their dApps or tools, and pay an advertised amount of SQT for each request.
-::: info Note Learn more about [Consumers](./consumers.md). :::
+::: tip Note Learn more about [Consumers](./consumers.md). :::
### Indexers
Indexers will run and maintain high quality SubQuery projects in their own infrastructure, running both the indexer and query service, and will be rewarded in SQT for the requests that they serve.
-::: info Note Learn more about [Indexers](./indexers.md). :::
+::: tip Note Learn more about [Indexers](./indexers.md). :::
### Delegators
Delegators will participate in the Network by supporting their favourite Indexers to earn rewards based on the work those indexers do.
-::: info Note Learn more about [Delegators](./delegators.md). :::
+::: tip Note Learn more about [Delegators](./delegators.md). :::
### Architects
Architects are the builders of the SubQuery projects that the Network runs on. They author and publish SubQuery projects for the Network to index and run.
-::: info Note Learn more about [how to build your first SubQuery project](../build/introduction.md). :::
+::: tip Note Learn more about [how to build your first SubQuery project](../build/introduction.md). :::
diff --git a/docs/ja/README.md b/docs/ja/README.md
index 81325593827..0c3df69330f 100644
--- a/docs/ja/README.md
+++ b/docs/ja/README.md
@@ -4,7 +4,7 @@
Build Faster dApps with SubQuery Academy
-
Explore and implement your own efficient custom open-source API between your decentralised data and tools to query data faster and save you time.
+
Explore and implement your own efficient custom open-source API between your decentralised data and tools to query data faster and save you time.
SubQuery now supports Polkadot, Avalanche, Cosmos, and Algorand.
@@ -12,7 +12,7 @@
Get a Kick-Start With Our Quick Start Guide
-
Build your first SubQuery project in less than 10 mins with simple guided steps.
+
Build your first SubQuery project in less than 10 mins with simple guided steps.
Start querying data for your dApps on your most loved blockchain network using our starter projects. Explore and modify important files, and understand how SubQuery works.
@@ -134,8 +134,7 @@
-
-
+
diff --git a/docs/ja/academy/tutorials_examples/run-indexer.md b/docs/ja/academy/tutorials_examples/run-indexer.md
index 79ee8b7e9b2..9fedd61bc6b 100644
--- a/docs/ja/academy/tutorials_examples/run-indexer.md
+++ b/docs/ja/academy/tutorials_examples/run-indexer.md
@@ -8,13 +8,13 @@
## はじめに
-Docker を使用するか、[SubQuery Projects](https://project.subquery.network/) でプロジェクトをホストしてもらう以外に、インデクサノードを実行することもできます。 より多くの時間と労力を必要としますが、SubQueryがどのように機能するのか、その理解を深めることができます。
+Docker を使用するか、[SubQuery Projects](https://project.subquery.network/) でプロジェクトをホストしてもらう以外に、インデクサノードを実行することもできます。 より多くの時間と労力を必要としますが、SubQuery がどのように機能するのか、その理解を深めることができます。
## Postgres
-インフラストラクチャ上でインデクサノードを実行するには、Postgres データベースのセットアップが必要です。 Postgresは [ここ](https://www.postgresql.org/download/)からインストールできます 。バージョンが12以上であることを確認してください。
+インフラストラクチャ上でインデクサノードを実行するには、Postgres データベースのセットアップが必要です。 Postgres は [ここ](https://www.postgresql.org/download/)からインストールできます 。バージョンが 12 以上であることを確認してください。
-## subql/nodeをインストールする
+## subql/node をインストールする
次に、SubQuery ノードを実行するには、次のコマンドを実行します。
@@ -31,7 +31,7 @@ npm install -g @subql/node
0.19.1
```
-## DBコンフィグの設定
+## DB コンフィグの設定
次に、以下の環境変数を設定する必要があります。
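A sketch of the environment variables in question follows. The names (`DB_USER`, `DB_PASS`, `DB_DATABASE`, `DB_HOST`, `DB_PORT`) are the ones conventionally used by `@subql/node`, but treat them as assumptions and check the references page for your version; the values are placeholders for your own Postgres setup.

```shell
export DB_USER=postgres
export DB_PASS=postgres
export DB_DATABASE=postgres
export DB_HOST=localhost
export DB_PORT=5432
```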
@@ -55,8 +55,8 @@ subql-node -f .
もしプロジェクトがない場合は、`git clone https://github.com/subquery/subql-helloworld` を実行してください。 インデクサノードが起動し、ブロックのインデックス作成を開始するのが見えるはずです。
-## Postgresの検査
+## Postgres の検査
-Postgresに移動すると、作成された2つのテーブルが表示されます。 `public.subquery` と `subquery_1.starter_entities`です。
+Postgres に移動すると、作成された 2 つのテーブルが表示されます。 `public.subquery` と `subquery_1.starter_entities`です。
-`public ubqueries ` には、インデクサが「現在の状態を理解する」ために開始時にチェックする行が 1 つだけ含まれており、どこから継続するかが分かります。 `starter_entities` テーブルには、インデックスが含まれます。 データを表示するには、`select (*) from subquery_1.starter_entities` を実行してください。
+`public subqueries` には、インデクサが「現在の状態を理解する」ために開始時にチェックする行が 1 つだけ含まれており、どこから継続するかが分かります。 `starter_entities` テーブルには、インデックスが含まれます。 データを表示するには、`select (*) from subquery_1.starter_entities` を実行してください。
diff --git a/docs/ja/build/install.md b/docs/ja/build/install.md
index ab3d1917648..d339ee62d29 100644
--- a/docs/ja/build/install.md
+++ b/docs/ja/build/install.md
@@ -1,4 +1,4 @@
-# SubQueryのインストール
+# SubQuery のインストール
SubQuery プロジェクトの作成には、さまざまなコンポーネントが必要です。 [@subql/cli](https://github.com/subquery/subql/tree/docs-new-section/packages/cli) ツールは、SubQuery プロジェクトの作成に使用されます。 インデクサを実行するには、 [@subql/node](https://github.com/subquery/subql/tree/docs-new-section/packages/node) コンポーネントが必要です。 クエリを生成するには、 [@subql/query](https://github.com/subquery/subql/tree/docs-new-section/packages/query) ライブラリが必要です。
@@ -8,37 +8,39 @@ SubQuery プロジェクトの作成には、さまざまなコンポーネン
Yarn または NPM を使用して、端末に SubQuery CLI をインストールします。
- ```bash npm install -g @subql/cli ```
- ```shell yarn global add @subql/cli ```
+::: code-tabs @tab npm `bash npm install -g @subql/cli `
+@tab:active yarn `shell yarn global add @subql/cli ` :::
-helpを実行すると、CLIで利用可能なコマンドや使い方が表示されます。
+help を実行すると、CLI で利用可能なコマンドや使い方が表示されます。
```shell
subql help
```
+
## @subql/node をインストールする
-SubQueryノードは、SubQueryプロジェクトごとにSubstrateベースのブロックチェーンデータを抽出し、Postgresデータベースに保存します。
+SubQuery ノードは、SubQuery プロジェクトごとに Substrate ベースのブロックチェーンデータを抽出し、Postgres データベースに保存します。
-SubQueryノードをYarnやNPMを使って端末にインストールします。
+SubQuery ノードを Yarn や NPM を使って端末にインストールします。
- ```bash npm install -g @subql/node ```
- ```shell yarn global add @subql/node ```
+::: code-tabs @tab npm `bash npm install -g @subql/node `
+@tab:active yarn `shell yarn global add @subql/node ` :::
インストールすると、次のようにノードを起動することができます。
```shell
subql-node
```
-> 注意: Dockerを使用している場合、またはSubQuery Projectsでプロジェクトをホスティングしている場合は、この手順をスキップできます。 これは、SubQueryノードがDockerコンテナとホスティングインフラストラクチャにすでに提供されているためです。
-## @subql/queryをインストールする
+> 注意: Docker を使用している場合、または SubQuery Projects でプロジェクトをホスティングしている場合は、この手順をスキップできます。 これは、SubQuery ノードが Docker コンテナとホスティングインフラストラクチャにすでに提供されているためです。
+
+## @subql/query をインストールする
-SubQueryクエリライブラリは、ブラウザを介した「プレイグラウンド」環境でプロジェクトにクエリを発行するサービスを提供します。
+SubQuery クエリライブラリは、ブラウザを介した「プレイグラウンド」環境でプロジェクトにクエリを発行するサービスを提供します。
Yarn または NPM を使用して、端末に SubQuery クエリ をインストールします。
- ```bash npm install -g @subql/query ```
- ```shell yarn global add @subql/query ```
+::: code-tabs @tab npm `bash npm install -g @subql/query `
+@tab:active yarn `shell yarn global add @subql/query ` :::
-> 注意: Dockerを使用している場合、またはSubQuery Projectsでプロジェクトをホスティングしている場合は、この手順をスキップできます。 これは、SubQueryノードがDockerコンテナとホスティングインフラストラクチャにすでに提供されているためです。
\ No newline at end of file
+> 注意: Docker を使用している場合、または SubQuery Projects でプロジェクトをホスティングしている場合は、この手順をスキップできます。 これは、SubQuery ノードが Docker コンテナとホスティングインフラストラクチャにすでに提供されているためです。
diff --git a/docs/ja/build/introduction.md b/docs/ja/build/introduction.md
index bcb61350ff5..0a212fef28e 100644
--- a/docs/ja/build/introduction.md
+++ b/docs/ja/build/introduction.md
@@ -51,8 +51,8 @@ yarn codegen
プロジェクトのルートディレクトリから build コマンドを実行します。
- `shell yarn build `
- `bash npm run-script build `
+::: code-tabs @tab:active yarn `shell yarn build `
+@tab npm `bash npm run-script build ` :::
### 代替のビルド オプション
@@ -88,7 +88,7 @@ exports フィールドで指定されているかどうかに関わらず、 `i
## ロギング
-`console.log` メソッドは **サポートされなくなりました**。 代わりに、 `logger` モジュールが型に組み込まれています。 つまり、さまざまなロガーレベルを受け入れるロガーをサポートすることができます。
+`console.log` メソッドは **サポートされなくなりました**。 代わりに、 `logger` モジュールが型に組み込まれています。 つまり、さまざまなロガーレベルを受け入れるロガーをサポートすることができます。
```typescript
logger.info("Info level message");
diff --git a/docs/ja/build/manifest.md b/docs/ja/build/manifest.md
index 09fb25b5f35..1b83d8c0727 100644
--- a/docs/ja/build/manifest.md
+++ b/docs/ja/build/manifest.md
@@ -4,7 +4,7 @@
マニフェストは YAML または JSON 形式で使用できます。 このドキュメントでは、すべての例で YAML を使用します。 以下は、基本的な `project.yaml` の例です。
- ` yml specVersion: 0.2.0 name: example-project # Provide the project name version: 1.0.0 # Project version description: '' # Description of your project repository: 'https://github.com/subquery/subql-starter' # Git repository address of your project schema: file: ./schema.graphql # The location of your GraphQL schema file network: genesisHash: '0x91b171bb158e2d3848fa23a9f1c25182fb8e20313b2c1eb49219da7a70ce90c3' # Genesis hash of the network endpoint: 'wss://polkadot.api.onfinality.io/public-ws' # Optionally provide the HTTP endpoint of a full chain dictionary to speed up processing dictionary: 'https://api.subquery.network/sq/subquery/dictionary-polkadot' dataSources: - kind: substrate/Runtime startBlock: 1 # This changes your indexing start block, set this higher to skip initial blocks with less data mapping: file: "./dist/index.js" handlers: - handler: handleBlock kind: substrate/BlockHandler - handler: handleEvent kind: substrate/EventHandler filter: #Filter is optional module: balances method: Deposit - handler: handleCall kind: substrate/CallHandler ```` ` yml specVersion: "0.0.1" description: '' # Description of your project repository: 'https://github.com/subquery/subql-starter' # Git repository address of your project schema: ./schema.graphql # The location of your GraphQL schema file network: endpoint: 'wss://polkadot.api.onfinality.io/public-ws' # Optionally provide the HTTP endpoint of a full chain dictionary to speed up processing dictionary: 'https://api.subquery.network/sq/subquery/dictionary-polkadot' dataSources: - name: main kind: substrate/Runtime startBlock: 1 # This changes your indexing start block, set this higher to skip initial blocks with less data mapping: handlers: - handler: handleBlock kind: substrate/BlockHandler - handler: handleEvent kind: substrate/EventHandler filter: #Filter is optional but suggested to speed up event processing module: balances method: Deposit - handler: handleCall kind: substrate/CallHandler ````
+::: code-tabs @tab v0.2.0 ` yml specVersion: 0.2.0 name: example-project # Provide the project name version: 1.0.0 # Project version description: '' # Description of your project repository: 'https://github.com/subquery/subql-starter' # Git repository address of your project schema: file: ./schema.graphql # The location of your GraphQL schema file network: genesisHash: '0x91b171bb158e2d3848fa23a9f1c25182fb8e20313b2c1eb49219da7a70ce90c3' # Genesis hash of the network endpoint: 'wss://polkadot.api.onfinality.io/public-ws' # Optionally provide the HTTP endpoint of a full chain dictionary to speed up processing dictionary: 'https://api.subquery.network/sq/subquery/dictionary-polkadot' dataSources: - kind: substrate/Runtime startBlock: 1 # This changes your indexing start block, set this higher to skip initial blocks with less data mapping: file: "./dist/index.js" handlers: - handler: handleBlock kind: substrate/BlockHandler - handler: handleEvent kind: substrate/EventHandler filter: #Filter is optional module: balances method: Deposit - handler: handleCall kind: substrate/CallHandler ```` @tab v0.0.1 ` yml specVersion: "0.0.1" description: '' # Description of your project repository: 'https://github.com/subquery/subql-starter' # Git repository address of your project schema: ./schema.graphql # The location of your GraphQL schema file network: endpoint: 'wss://polkadot.api.onfinality.io/public-ws' # Optionally provide the HTTP endpoint of a full chain dictionary to speed up processing dictionary: 'https://api.subquery.network/sq/subquery/dictionary-polkadot' dataSources: - name: main kind: substrate/Runtime startBlock: 1 # This changes your indexing start block, set this higher to skip initial blocks with less data mapping: handlers: - handler: handleBlock kind: substrate/BlockHandler - handler: handleEvent kind: substrate/EventHandler filter: #Filter is optional but suggested to speed up event processing module: balances method: Deposit - handler: handleCall kind: substrate/CallHandler ```` :::
## v0.0.1 から v0.2.0 への移行
@@ -81,9 +81,9 @@
### Mapping の仕様
-| フィールド | v0.0.1 | v0.2.0 | 説明 |
-| ---------------------- | -------------------------------------------------------------------------- | ----------------------------------------------------------------------------------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ |
-| **file** | String | 𐄂 | マッピングエントリへのパス |
+| フィールド | v0.0.1 | v0.2.0 | 説明 |
+| ---------------------- | -------------------------------------------------------------------------- | ----------------------------------------------------------------------------------------- | --------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
+| **file** | String | 𐄂 | マッピングエントリへのパス |
| **handlers & filters** | [デフォルトのハンドラとフィルタ](./manifest/#mapping-handlers-and-filters) | デフォルトのハンドラとフィルタ、 [カスタムハンドラとフィルタ](#custom-data-sources) | 追加のマッピングフィルタを使用して、すべての [マッピング関数](./mapping/polkadot.md) とそれに対応するハンドラータイプをリストします。
カスタムランタイムマッピングハンドラについては、 [カスタムデータソース](#custom-data-sources) を参照してください。 |
## データソースとマッピング
@@ -104,8 +104,8 @@ dataSources:
**SubQuery プロジェクトは、イベントと適切なマッピングフィルタを使用するだけで、より効率的になります。**
-| ハンドラ | サポートされるフィルタ |
-| ------------------------------------------ | ---------------------------- |
+| ハンドラ | サポートされるフィルタ |
+| --------------------------------------------------- | ---------------------------- |
| [BlockHandler](./mapping/polkadot.md#block-handler) | `specVersion` |
| [EventHandler](./mapping/polkadot.md#event-handler) | `module`,`method` |
| [CallHandler](./mapping/polkadot.md#call-handler) | `module`,`method` ,`success` |
@@ -153,8 +153,8 @@ substrate ランタイムモジュールで使用される追加の型をサポ
以下の v0.2.0 の例では、`network.chaintypes`は、すべてのカスタムタイプが含まれているファイルを指しています。 これは、このブロックチェーンがサポートする特定のタイプを`.json`または`.yaml`形式で宣言する標準的なチェーンスペックファイルです。
- `yml network: genesisHash: '0x91b171bb158e2d3848fa23a9f1c25182fb8e20313b2c1eb49219da7a70ce90c3' endpoint: 'ws://host.kittychain.io/public-ws' chaintypes: file: ./types.json # The relative filepath to where custom types are stored ...`
- `yml ... network: endpoint: "ws://host.kittychain.io/public-ws" types: { "KittyIndex": "u32", "Kitty": "[u8; 16]" } # typesChain: { chain: { Type5: 'example' } } # typesSpec: { spec: { Type6: 'example' } } dataSources: - name: runtime kind: substrate/Runtime startBlock: 1 filter: #Optional specName: kitty-chain mapping: handlers: - handler: handleKittyBred kind: substrate/CallHandler filter: module: kitties method: breed success: true`
+::: code-tabs @tab v0.2.0 `yml network: genesisHash: '0x91b171bb158e2d3848fa23a9f1c25182fb8e20313b2c1eb49219da7a70ce90c3' endpoint: 'ws://host.kittychain.io/public-ws' chaintypes: file: ./types.json # The relative filepath to where custom types are stored ...`
+@tab v0.0.1 `yml ... network: endpoint: "ws://host.kittychain.io/public-ws" types: { "KittyIndex": "u32", "Kitty": "[u8; 16]" } # typesChain: { chain: { Type5: 'example' } } # typesSpec: { spec: { Type6: 'example' } } dataSources: - name: runtime kind: substrate/Runtime startBlock: 1 filter: #Optional specName: kitty-chain mapping: handlers: - handler: handleKittyBred kind: substrate/CallHandler filter: module: kitties method: breed success: true` :::
To use typescript for your chain types file include it in the `src` folder (e.g. `./src/types.ts`), run `yarn build` and then point to the generated js file located in the `dist` folder.
@@ -171,7 +171,7 @@ network:
以下は、 `.ts` チェーンタイプファイルの例です:
- `ts import { typesBundleDeprecated } from "moonbeam-types-bundle" export default { typesBundle: typesBundleDeprecated }; `
+::: code-tabs @tab types.ts `ts import { typesBundleDeprecated } from "moonbeam-types-bundle" export default { typesBundle: typesBundleDeprecated }; ` :::
## カスタムデータソース
@@ -197,6 +197,6 @@ network:
以下は Polkadot と Kusama ネットワークの異なるデータソースを示す例です。
- `yaml --- network: endpoint: 'wss://polkadot.api.onfinality.io/public-ws' #Create a template to avoid redundancy definitions: mapping: &mymapping handlers: - handler: handleBlock kind: substrate/BlockHandler dataSources: - name: polkadotRuntime kind: substrate/Runtime filter: #Optional specName: polkadot startBlock: 1000 mapping: *mymapping #use template here - name: kusamaRuntime kind: substrate/Runtime filter: specName: kusama startBlock: 12000 mapping: *mymapping # can reuse or change `
+::: code-tabs @tab v0.0.1 `yaml --- network: endpoint: 'wss://polkadot.api.onfinality.io/public-ws' #Create a template to avoid redundancy definitions: mapping: &mymapping handlers: - handler: handleBlock kind: substrate/BlockHandler dataSources: - name: polkadotRuntime kind: substrate/Runtime filter: #Optional specName: polkadot startBlock: 1000 mapping: *mymapping #use template here - name: kusamaRuntime kind: substrate/Runtime filter: specName: kusama startBlock: 12000 mapping: *mymapping # can reuse or change `
-
+:::
diff --git a/docs/ja/build/mapping.md b/docs/ja/build/mapping.md
index 96e5a10115f..cc41cb57750 100644
--- a/docs/ja/build/mapping.md
+++ b/docs/ja/build/mapping.md
@@ -67,9 +67,9 @@ export async function handleCall(extrinsic: SubstrateExtrinsic): Promise {
これらは現在サポートされているインターフェースです:
-- [api.query.<module>.<method>()](https://polkadot.js.org/docs/api/start/api.query) は 現在の ブロックを問い合わせます。
-- [api.query.<module>.<method>.multi()](https://polkadot.js.org/docs/api/start/api.query.multi/#multi-queries-same-type) は、現在のブロックで 同じ 型の複数のクエリを実行します。
-- [api.queryMulti()](https://polkadot.js.org/docs/api/start/api.query.multi/#multi-queries-distinct-types) は、現在のブロックで 異なる 型の複数のクエリを実行します。
+- [api.query.<module>.<method>()](https://polkadot.js.org/docs/api/start/api.query) は **現在の** ブロックを問い合わせます。
+- [api.query.<module>.<method>.multi()](https://polkadot.js.org/docs/api/start/api.query.multi/#multi-queries-same-type) は、現在のブロックで **同じ** 型の複数のクエリを実行します。
+- [api.queryMulti()](https://polkadot.js.org/docs/api/start/api.query.multi/#multi-queries-distinct-types) は、現在のブロックで **異なる** 型の複数のクエリを実行します。
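A minimal sketch of what these calls look like inside a mapping function, assuming the injected `api` and `logger` globals described elsewhere in these docs; the handler name and the address are placeholders, not part of any real project.

```typescript
import { SubstrateBlock } from "@subql/types";

export async function handleBlock(block: SubstrateBlock): Promise<void> {
  // Single state query against the current block
  const account = await api.query.system.account("EXAMPLE_ADDRESS");
  logger.info(`account at block ${block.block.header.number}: ${account.toString()}`);

  // Multiple queries of different types against the same block, resolved together
  const [now] = await api.queryMulti([
    api.query.timestamp.now,
    [api.query.system.account, "EXAMPLE_ADDRESS"],
  ]);
  logger.info(`timestamp: ${now.toString()}`);
}
```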
これらは現在サポート **されていない** インターフェイスです:
diff --git a/docs/ja/build/substrate-evm.md b/docs/ja/build/substrate-evm.md
index d3ab24b4430..ccb37b5d247 100644
--- a/docs/ja/build/substrate-evm.md
+++ b/docs/ja/build/substrate-evm.md
@@ -1,10 +1,10 @@
-# SubstrateのEVMサポート
+# Substrate の EVM サポート
-MoonbeamとMoonriverのEVM用にカスタムデータソースプロセッサを提供しています。 これは、Moonbeamのネットワーク上のEVMとSubstrateのアクティビティを1つのSubQueryプロジェクトでフィルタリングし、インデックスを作成するシンプルな方法です。
+Moonbeam と Moonriver の EVM 用にカスタムデータソースプロセッサを提供しています。 これは、Moonbeam のネットワーク上の EVM と Substrate のアクティビティを 1 つの SubQuery プロジェクトでフィルタリングし、インデックスを作成するシンプルな方法です。
サポートされているネットワーク:
-| ネットワーク名 | Websocket エンドポイント | ディクショナリエンドポイント |
+| ネットワーク名 | Websocket エンドポイント | ディクショナリエンドポイント |
| -------------- | -------------------------------------------------- | -------------------------------------------------------------------- |
| Moonbeam | `wss://moonbeam.api.onfinality.io/public-ws` | `https://api.subquery.network/sq/subquery/moonbeam-dictionary` |
| Moonriver | `wss://moonriver.api.onfinality.io/public-ws` | `https://api.subquery.network/sq/subquery/moonriver-dictionary` |
@@ -20,34 +20,34 @@ MoonbeamとMoonriverのEVM用にカスタムデータソースプロセッサを
## データソース仕様
-| フィールド | 型 | 必須 | 説明 |
-| ----------------- | -------------------------------------------------------------- | --- | --------------------- |
-| processor.file | `'./node_modules/@subql/contract-processors/dist/moonbeam.js'` | Yes | データプロセッサコードへのファイル参照 |
-| processor.options | [ProcessorOptions](#processor-options) | No | Moonbeamプロセッサ固有のオプション |
-| assets | `{ [key: String]: { file: String }}` | No | 外部ファイルのオブジェクト |
+| フィールド | 型 | 必須 | 説明 |
+| ----------------- | -------------------------------------------------------------- | ---- | -------------------------------------- |
+| processor.file | `'./node_modules/@subql/contract-processors/dist/moonbeam.js'` | Yes | データプロセッサコードへのファイル参照 |
+| processor.options | [ProcessorOptions](#processor-options) | No | Moonbeam プロセッサ固有のオプション |
+| assets | `{ [key: String]: { file: String }}` | No | 外部ファイルのオブジェクト |
### プロセッサオプション
-| フィールド | 型 | 必須 | 説明 |
-| ------- | ---------------- | -- | ----------------------------------------------------------- |
-| abi | String | No | 引数を解析するためにプロセッサが使用する ABI です。 `assets` のキーでなければなりません |
-| address | String or `null` | No | イベントの発信元または発信先となるコントラクトアドレス。 `null` はコントラクトの作成呼び出しをキャプチャします |
+| フィールド | 型 | 必須 | 説明 |
+| ---------- | ---------------- | ---- | -------------------------------------------------------------------------------------------------------------- |
+| abi | String | No | 引数を解析するためにプロセッサが使用する ABI です。 `assets` のキーでなければなりません |
+| address | String or `null` | No | イベントの発信元または発信先となるコントラクトアドレス。 `null` はコントラクトの作成呼び出しをキャプチャします |
## MoonbeamCall
ハンドラの引数が異なり、若干のフィルタリング変更以外は、[substrate/CallHandler](../create/mapping/#call-handler)と同じように動作する。
-| フィールド | 型 | 必須 | 説明 |
-| ------ | ---------------------------- | --- | -------------------- |
-| kind | 'substrate/MoonbeamCall' | Yes | 呼び出しハンドラであることを指定します。 |
-| filter | [Call Filter](#call-filters) | No | 実行するデータソースをフィルタする |
+| フィールド | 型 | 必須 | 説明 |
+| ---------- | ---------------------------- | ---- | ---------------------------------------- |
+| kind | 'substrate/MoonbeamCall' | Yes | 呼び出しハンドラであることを指定します。 |
+| filter | [Call Filter](#call-filters) | No | 実行するデータソースをフィルタする |
### Call Filters
-| フィールド | 型 | 例 | 説明 |
-| -------- | ------ | --------------------------------------------- | ------------------------------------------------------------------------------------------------------------------------------------------ |
-| function | String | 0x095ea7b3, approve(address to,uint256 value) | [Function Signature](https://docs.ethers.io/v5/api/utils/abi/fragments/#FunctionFragment) 文字列、またはコントラクトで呼び出された関数をフィルタする関数 `sighash` のいずれか。 |
-| from | String | 0x6bd193ee6d2104f14f94e2ca6efefae561a4334b | トランザクションを送信したイーサリアムアドレス |
+| フィールド | 型 | 例 | 説明 |
+| ---------- | ------ | --------------------------------------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
+| function | String | 0x095ea7b3, approve(address to,uint256 value) | [Function Signature](https://docs.ethers.io/v5/api/utils/abi/fragments/#FunctionFragment) 文字列、またはコントラクトで呼び出された関数をフィルタする関数 `sighash` のいずれか。 |
+| from | String | 0x6bd193ee6d2104f14f94e2ca6efefae561a4334b | トランザクションを送信したイーサリアムアドレス |
### ハンドラ
@@ -63,22 +63,22 @@ MoonbeamとMoonriverのEVM用にカスタムデータソースプロセッサを
ハンドラの引数が異なり、若干のフィルタリング変更以外は、[substrate/EventHandler](../create/mapping/#event-handler)と同じように動作する。
-| フィールド | 型 | 必須 | 説明 |
-| ------ | ------------------------------ | --- | -------------------- |
-| kind | 'substrate/MoonbeamEvent' | Yes | 呼び出しハンドラであることを指定します。 |
-| filter | [Event Filter](#event-filters) | No | 実行するデータソースをフィルタする |
+| フィールド | 型 | 必須 | 説明 |
+| ---------- | ------------------------------ | ---- | ---------------------------------------- |
+| kind | 'substrate/MoonbeamEvent' | Yes | 呼び出しハンドラであることを指定します。 |
+| filter | [Event Filter](#event-filters) | No | 実行するデータソースをフィルタする |
### イベントフィルタ
-| フィールド | 型 | 例 | 説明 |
-| ------ | ------------ | --------------------------------------------------------------- | --------------------------------------------------------------------------------------------------- |
-| topics | String array | Transfer(address indexed from,address indexed to,uint256 value) | topicsは、Ethereum JSON-PRCログフィルタに従います。詳細なドキュメントは[こちら](https://docs.ethers.io/v5/concepts/events/)です。 |
+| フィールド | 型 | 例 | 説明 |
+| ---------- | ------------ | --------------------------------------------------------------- | ----------------------------------------------------------------------------------------------------------------------------------- |
+| topics | String array | Transfer(address indexed from,address indexed to,uint256 value) | topics は、Ethereum JSON-PRC ログフィルタに従います。詳細なドキュメントは[こちら](https://docs.ethers.io/v5/concepts/events/)です。 |
-topicsに関する注意:
+**topics に関する注意:**
基本的なログフィルタにはいくつかの改善点があります:
-- topicsを 0 埋めする必要はありません。
-- [Event Fragment](https://docs.ethers.io/v5/api/utils/abi/fragments/#EventFragment) を提供し、そのIDに自動的に変換できます
+- topics を 0 埋めする必要はありません。
+- [Event Fragment](https://docs.ethers.io/v5/api/utils/abi/fragments/#EventFragment) を提供し、その ID に自動的に変換できます
### ハンドラ
@@ -97,17 +97,17 @@ dataSources:
- kind: substrate/Moonbeam
startBlock: 752073
processor:
- file: './node_modules/@subql/contract-processors/dist/moonbeam.js'
+ file: "./node_modules/@subql/contract-processors/dist/moonbeam.js"
options:
# Must be a key of assets
abi: erc20
# Contract address (or recipient if transfer) to filter, if `null` should be for contract creation
- address: '0x6bd193ee6d2104f14f94e2ca6efefae561a4334b'
+ address: "0x6bd193ee6d2104f14f94e2ca6efefae561a4334b"
assets:
erc20:
- file: './erc20.abi.json'
+ file: "./erc20.abi.json"
mapping:
- file: './dist/index.js'
+ file: "./dist/index.js"
handlers:
- handler: handleMoonriverEvent
kind: substrate/MoonbeamEvent
@@ -122,11 +122,11 @@ dataSources:
# function: '0x7ff36ab500000000000000000000000000000000000000000000000000000000'
# function: approve(address,uint256)
function: approve(address to,uint256 value)
- from: '0x6bd193ee6d2104f14f94e2ca6efefae561a4334b'
+ from: "0x6bd193ee6d2104f14f94e2ca6efefae561a4334b"
```
## 既知の制限事項
-- ハンドラ内のEVM状態を問い合わせる方法は現在ありません。
+- ハンドラ内の EVM 状態を問い合わせる方法は現在ありません。
- 呼び出しハンドラで戻り値を取得する方法はありません。
- `blockHash` プロパティは現在未定義のままです。代わりに `blockNumber` プロパティを使用できます。
diff --git a/docs/ja/faqs/faqs.md b/docs/ja/faqs/faqs.md
index c29b3b9e5ec..bcc3606a131 100644
--- a/docs/ja/faqs/faqs.md
+++ b/docs/ja/faqs/faqs.md
@@ -1,6 +1,6 @@
# Frequently Asked Questions
-## SubQueryとは?
+## SubQuery とは?
SubQuery is an open source blockchain data indexer for developers that provides fast, flexible, reliable, and decentralised APIs to power leading multi-chain apps.
@@ -16,21 +16,21 @@ SubQuery also provides free, production grade hosting of projects for developers
**The SubQuery Network**
-The SubQuery Network allows developers to completely decentralise their infrastructure stack. It is the most open, performant, reliable, and scalable data service for dApps. The SubQuery Network indexes and services data to the global community in an incentivised and verifiable way. After publishing your project to the SubQuery Network, anyone can index and host it - providing data to users around the world faster and reliably.
+The SubQuery Network allows developers to completely decentralise their infrastructure stack. It is the most open, performant, reliable, and scalable data service for dApps. The SubQuery Network indexes and services data to the global community in an incentivised and verifiable way. After publishing your project to the SubQuery Network, anyone can index and host it - providing data to users around the world faster and more reliably.
More information [here](/subquery_network/introduction.md).
-## SubQueryを始めるための最良の方法は何ですか?
+## SubQuery を始めるための最良の方法は何ですか?
The best way to get started with SubQuery is to try out our [Hello World tutorial](/assets/pdf/Hello_World_Lab.pdf). This is a simple 5-minute walk-through exercise. Download the starter template, build the project, use Docker to run a node on your localhost, and run a simple query.
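In practice that walk-through boils down to a handful of commands, roughly as sketched below (assuming yarn and Docker are installed; the repository is the Hello World starter referenced elsewhere in these docs).

```shell
git clone https://github.com/subquery/subql-helloworld
cd subql-helloworld
yarn install    # install dependencies
yarn codegen    # generate types from the GraphQL schema
yarn build      # compile the project
docker-compose pull && docker-compose up   # run the node, query service, and Postgres
# then open http://localhost:3000 and run a query in the playground
```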
-## SubQueryにどのように貢献したりフィードバックを与えたりできますか?
+## SubQuery にどのように貢献したりフィードバックを与えたりできますか?
-私たちはコミュニティからの貢献とフィードバックが大好きです。 To contribute the code, fork the repository of your interest and make your changes. 次にPRまたはPullリクエストを送信します。 Don't forget to test as well. Also check out our contributions guidelines.
+私たちはコミュニティからの貢献とフィードバックが大好きです。 To contribute the code, fork the repository of your interest and make your changes. 次に PR または Pull リクエストを送信します。 Don't forget to test as well. Also check out our [contributions guidelines](../miscellaneous/contributing.html).
To give feedback, contact us at hello@subquery.network or jump onto our [discord channel](https://discord.com/invite/78zg8aBSMG).
-## 自分のプロジェクトをSubQuery Projectsで公開するにはどのくらいの費用がかかりますか?
+## 自分のプロジェクトを SubQuery Projects で公開するにはどのくらいの費用がかかりますか?
This service is being provided to the community with a generous free tier! You can host your first two SubQuery projects for absolutely free!
@@ -76,7 +76,6 @@ subql-node -f . --force-clean --subquery-name=
Note that it is recommended to use `--force-clean` when changing the `startBlock` within the project manifest (`project.yaml`) in order to begin reindexing from the configured block. If `startBlock` is changed without a `--force-clean` of the project, then the indexer will continue indexing with the previously configured `startBlock`.
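To make the relationship concrete, this is roughly where `startBlock` lives in the manifest; the snippet is a sketch based on the starter manifest shown elsewhere in these docs, with a placeholder block height.

```yml
dataSources:
  - kind: substrate/Runtime
    startBlock: 1000000 # after changing this, restart the node with --force-clean to reindex from here
```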
-
## How can I optimise my project to speed it up?
Performance is a crucial factor in each project. Fortunately, there are several things you could do to improve it. Here is a list of suggestions:
@@ -89,13 +88,13 @@ Performance is a crucial factor in each project. Fortunately, there are several
- Set the start block to when the contract was initialised.
- Always use a [dictionary](../tutorials_examples/dictionary.html#how-does-a-subquery-dictionary-work) (we can help create one for your new network).
- Optimise your schema design, keep it as simple as possible.
- - Try to reduce unnecessary fields and columns.
- - Create indexes as needed.
+ - Try to reduce unnecessary fields and columns.
+ - Create indexes as needed.
- Use parallel/batch processing as often as possible.
- - Use `api.queryMulti()` to optimise Polkadot API calls inside mapping functions and query them in parallel. This is a faster way than a loop.
- - Use `Promise.all()`. In case of multiple async functions, it is better to execute them and resolve in parallel.
- - If you want to create a lot of entities within a single handler, you can use `store.bulkCreate(entityName: string, entities: Entity[])`. You can create them in parallel, no need to do this one by one.
+ - Use `api.queryMulti()` to optimise Polkadot API calls inside mapping functions and query them in parallel. This is a faster way than a loop.
+ - Use `Promise.all()`. In case of multiple async functions, it is better to execute them and resolve in parallel.
+ - If you want to create a lot of entities within a single handler, you can use `store.bulkCreate(entityName: string, entities: Entity[])`. You can create them in parallel, no need to do this one by one.
- Making API calls to query state can be slow. You could try to minimise calls where possible and to use `extrinsic/transaction/event` data.
- Use `worker threads` to move block fetching and block processing into its own worker thread. It could speed up indexing by up to 4 times (depending on the particular project). You can easily enable it using the `-workers=` flag. Note that the number of available CPU cores strictly limits the usage of worker threads. For now, it is only available for Substrate and Cosmos and will soon be integrated for Avalanche.
- Note that `JSON.stringify` doesn’t support native `BigInts`. Our logging library will do this internally if you attempt to log an object. We are looking at a workaround for this.
-- Use a convenient `modulo` filter to run a handler only once to a specific block. This filter allows handling any given number of blocks, which is extremely useful for grouping and calculating data at a set interval. For instance, if modulo is set to 50, the block handler will run on every 50 blocks. It provides even more control over indexing data to developers and can be implemented like so below in your project manifest.
\ No newline at end of file
+- Use the convenient `modulo` filter to run a handler only on specific blocks. This filter lets you handle blocks at any given interval, which is extremely useful for grouping and calculating data at a set interval. For instance, if modulo is set to 50, the block handler will run on every 50th block. It gives developers even more control over indexing data and can be implemented as shown below in your project manifest.
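The manifest snippet referenced above is not reproduced on this page, so the following is a sketch of what such a filter might look like; the handler structure follows the starter manifest shown elsewhere in these docs, and the `modulo` key should be verified against the manifest reference for your network.

```yml
dataSources:
  - kind: substrate/Runtime
    startBlock: 1
    mapping:
      file: ./dist/index.js
      handlers:
        - handler: handleBlock
          kind: substrate/BlockHandler
          filter:
            modulo: 50 # run this handler on every 50th block
```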
diff --git a/docs/ja/miscellaneous/contributing.md b/docs/ja/miscellaneous/contributing.md
index d631948c0ad..04769d6b292 100644
--- a/docs/ja/miscellaneous/contributing.md
+++ b/docs/ja/miscellaneous/contributing.md
@@ -1,8 +1,8 @@
-# SubQueryへの貢献
+# SubQuery への貢献
-このSubQueryプロジェクトへの貢献をご検討いただき、誠にありがとうございます! 私たちは共に、より分散化された未来への道を切り開くことができます。
+この SubQuery プロジェクトへの貢献をご検討いただき、誠にありがとうございます! 私たちは共に、より分散化された未来への道を切り開くことができます。
-::: info Note This documentation is actively maintained by the SubQuery team. We welcome your contributions. You can do so by forking our GitHub project and making changes to all the documentation markdown files under the `docs` directory. :::
+::: tip Note This documentation is actively maintained by the SubQuery team. We welcome your contributions. You can do so by forking our GitHub project and making changes to all the documentation markdown files under the `docs` directory. :::
What follows is a set of guidelines (not rules) for contributing to SubQuery. Following these guidelines will help us make the contribution process easy and effective for everyone involved. It also communicates that you agree to respect the time of the developers managing and developing this project. In return, we will reciprocate that respect by addressing your issue, considering changes, collaborating on improvements, and helping you finalise your pull requests.
@@ -14,8 +14,8 @@ We take our open source community projects and responsibility seriously and hold
Contributions to our repositories are made through Issues and Pull Requests (PRs). A few general guidelines that cover both:
-* 自分で作成する前に、既存のIssuesやPRを検索してください。
-* 迅速な対応に努めていますが、影響によっては原因究明に時間がかかる場合もあります。 あなたの問題がブロックされている場合、コメントスレッドで投稿者や投稿者に友好的に@で言及することで、注意を引くことができます。
+- 自分で作成する前に、既存の Issues や PR を検索してください。
+- 迅速な対応に努めていますが、影響によっては原因究明に時間がかかる場合もあります。 あなたの問題がブロックされている場合、コメントスレッドで投稿者や投稿者に友好的に@で言及することで、注意を引くことができます。
## 貢献方法
@@ -23,32 +23,32 @@ Contributions to our repositories are made through Issues and Pull Requests (PRs
Bugs are tracked as GitHub issues. When logging an issue, explain the problem and include additional details to help maintainers reproduce the problem:
-* 問題を特定するために、明確で説明的なタイトルを使用します。
-* 問題を再現するための正確なステップを記述します。
-* 手順に従って観察した動作を説明します。
-* どの動作を期待するか、またその理由を説明します。
-* 可能であればスクリーンショットを含めます。
+- 問題を特定するために、明確で説明的なタイトルを使用します。
+- 問題を再現するための正確なステップを記述します。
+- 手順に従って観察した動作を説明します。
+- どの動作を期待するか、またその理由を説明します。
+- 可能であればスクリーンショットを含めます。
### プルリクエストを送る
In general, we follow the "fork-and-pull" Git workflow:
-* Fork the repository to your own Github account.
-* Clone the project to your machine.
-* Create a branch locally with a succinct but descriptive name.
-* Commit changes to the branch.
-* Following any formatting and testing guidelines specific to this repo.
-* Push changes to your fork.
-* Open a PR in our repository.
+- Fork the repository to your own GitHub account.
+- Clone the project to your machine.
+- Create a branch locally with a succinct but descriptive name.
+- Commit changes to the branch.
+- Follow any formatting and testing guidelines specific to this repo.
+- Push changes to your fork.
+- Open a PR in our repository (a command-line sketch of this workflow follows below).
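A minimal command-line sketch of the fork-and-pull steps listed above; the repository URL and branch name are placeholders.

```shell
# after forking the repository on GitHub
git clone https://github.com/YOUR_USERNAME/subql.git
cd subql
git checkout -b my-descriptive-branch
# ...make and test your changes...
git commit -am "Add feature"
git push origin my-descriptive-branch
# then open a Pull Request against the upstream repository
```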
## コーディング規約
### コミットメッセージ
-* Use the present tense ("Add feature" not "Added feature").
-* Use the imperative mood ("Move cursor to..." not "Moves cursor to...").
-* Limit the first line to 72 characters or less.
+- Use the present tense ("Add feature" not "Added feature").
+- Use the imperative mood ("Move cursor to..." not "Moves cursor to...").
+- Limit the first line to 72 characters or less.
### JavaScript Styleguide
-* All JavaScript code is linted with Prettier and ESLint.
+- All JavaScript code is linted with Prettier and ESLint.
diff --git a/docs/ja/quickstart/helloworld-localhost.md b/docs/ja/quickstart/helloworld-localhost.md
index 9bf4d22dd0d..ea08d9494d9 100644
--- a/docs/ja/quickstart/helloworld-localhost.md
+++ b/docs/ja/quickstart/helloworld-localhost.md
@@ -1,6 +1,6 @@
# Hello World (localhost & Docker)
-SubQuery Hello World のクイックスタートへようこそ。 クイックスタートでは、いくつかの簡単な手順でデフォルトのスタータープロジェクトをDockerで実行する方法を説明します。
+SubQuery Hello World のクイックスタートへようこそ。 クイックスタートでは、いくつかの簡単な手順でデフォルトのスタータープロジェクトを Docker で実行する方法を説明します。
## 学習のねらい
@@ -8,12 +8,12 @@ SubQuery Hello World のクイックスタートへようこそ。 クイック
- 必要な前提条件を理解すること
- 基本的な一般的なコマンドを理解すること
-- localhost:3000に移動して、プレイグラウンドを表示できるようになること
-- Polkadotメインネットのブロックの高さを取得するための簡単なクエリを実行すること
+- localhost:3000 に移動して、プレイグラウンドを表示できるようになること
+- Polkadot メインネットのブロックの高さを取得するための簡単なクエリを実行すること
## 対象者
-このガイドは、開発経験があり、SubQueryについてもっと学ぶことに興味がある新規開発者を対象としています。
+このガイドは、開発経験があり、SubQuery についてもっと学ぶことに興味がある新規開発者を対象としています。
## ビデオガイド
@@ -86,10 +86,10 @@ cd subqlHelloWorld
## 2. 依存するモジュールをインストールする
-ここで様々な依存関係をインストールするために、yarnまたはnodeのインストールを実行します。
+ここで様々な依存関係をインストールするために、yarn または node のインストールを実行します。
- ```shell yarn install ```
- ```bash npm install ```
+::: code-tabs @tab:active yarn `shell yarn install `
+@tab npm `bash npm install ` :::
例 `yarn install`
@@ -107,10 +107,10 @@ success Saved lockfile.
## 3. コードを生成する
-ここで`yarn codegen`を実行して、GraphQLスキーマからTypescriptを生成します。
+ここで`yarn codegen`を実行して、GraphQL スキーマから Typescript を生成します。
- ```shell yarn codegen ```
- ```bash npm run-script codegen ```
+::: code-tabs @tab:active yarn `shell yarn codegen `
+@tab npm `bash npm run-script codegen ` :::
例 `yarn codegen`
@@ -127,14 +127,14 @@ $ ./node_modules/.bin/subql codegen
✨ Done in 1.02s.
```
-**警告** スキーマファイルに変更があった場合、`yarn codegen` を再実行し、typesディレクトリを再生成することを忘れないようにしてください。
+**警告** スキーマファイルに変更があった場合、`yarn codegen` を再実行し、types ディレクトリを再生成することを忘れないようにしてください。
## 4. コードをビルドする
次のステップは、 `yarn build` でコードをビルドすることです。
- ```shell yarn build ```
- ```bash npm run-script build ```
+::: code-tabs @tab:active yarn `shell yarn build `
+@tab npm `bash npm run-script build ` :::
例 `yarn build`
@@ -147,7 +147,7 @@ $ tsc -b
## 5. Docker を実行する
-Dockerを使用すると、必要なインフラをすべてDockerイメージ内で提供できるため、この例を非常に迅速に実行することができます。 `docker-compose pull && docker-compose up` を実行する
+Docker を使用すると、必要なインフラをすべて Docker イメージ内で提供できるため、この例を非常に迅速に実行することができます。 `docker-compose pull && docker-compose up` を実行する
これですべてがキックされ、最終的にブロックがフェッチされます。
@@ -192,4 +192,4 @@ http://localhost:3000/ にアクセスし、以下のクエリを画面左側に
## 概要
-このクイックスタートでは、Docker環境内でプロジェクトを立ち上げて実行する基本的な手順を示した後、localhost:3000にナビゲートして、メインネットPolkadotネットワークのブロック番号を返すクエリーを実行しました。
+このクイックスタートでは、Docker 環境内でプロジェクトを立ち上げて実行する基本的な手順を示した後、localhost:3000 にナビゲートして、メインネット Polkadot ネットワークのブロック番号を返すクエリーを実行しました。
diff --git a/docs/ja/quickstart/quickstart-avalanche.md b/docs/ja/quickstart/quickstart-avalanche.md
index f2fbce2a895..c58c403992b 100644
--- a/docs/ja/quickstart/quickstart-avalanche.md
+++ b/docs/ja/quickstart/quickstart-avalanche.md
@@ -6,7 +6,7 @@ In this Quick start guide, we're going to start with a simple Avalanche starter
このガイドの最後には、SubQuery ノード上で動作する SubQuery プロジェクトを作成し、GraphQL エンドポイントからデータを照会できるようになります。
-まだの方は、SubQueryで使われている[用語](../#terminology)に慣れることをお勧めします。
+まだの方は、SubQuery で使われている[用語](../#terminology)に慣れることをお勧めします。
**The goal of this quick start guide is to index all Pangolin token _Approve_ logs; it should only take 10-15 minutes.**
@@ -59,8 +59,8 @@ After the initialisation process is complete, you should see a folder with your
Last, under the project directory, run the following command to install the new project's dependencies.
- ```shell cd PROJECT_NAME yarn install ```
- ```shell cd PROJECT_NAME npm install ```
+::: code-tabs @tab:active yarn `shell cd PROJECT_NAME yarn install `
+@tab npm `shell cd PROJECT_NAME npm install ` :::
## Making Changes to your Project
@@ -92,8 +92,8 @@ type PangolinApproval @entity {
**Important: When you make any changes to the schema file, please ensure that you regenerate your types directory. Do this now.**
- ```shell yarn codegen ```
- ```shell npm run-script codegen ```
+::: code-tabs @tab:active yarn `shell yarn codegen `
+@tab npm `shell npm run-script codegen ` :::
You'll find the generated models in the `/src/types/models` directory. For more information about the `schema.graphql` file, check out our documentation under [Build/GraphQL Schema](../build/graphql.md)
@@ -169,7 +169,7 @@ For more information about mapping functions, check out our documentation under
In order to run your new SubQuery project, we first need to build our work. Run the build command from the project's root directory.
- ```shell yarn build ``` ```shell npm run-script build ```
+::: code-tabs @tab:active yarn `shell yarn build ` @tab npm `shell npm run-script build ` :::
**Important: Whenever you make changes to your mapping functions, you'll need to rebuild your project**
@@ -183,7 +183,7 @@ All configuration that controls how a SubQuery node is run is defined in this `d
Under the project directory, run the following command:
- ```shell yarn start:docker ``` ```shell npm run-script start:docker ```
+::: code-tabs @tab:active yarn `shell yarn start:docker ` @tab npm `shell npm run-script start:docker ` :::
It may take some time to download the required packages ([`@subql/node`](https://www.npmjs.com/package/@subql/node), [`@subql/query`](https://www.npmjs.com/package/@subql/query), and Postgres) for the first time but soon you'll see a running SubQuery node. Be patient here.
diff --git a/docs/ja/quickstart/quickstart-cosmos.md b/docs/ja/quickstart/quickstart-cosmos.md
index 3ce94c5ae70..2af98831a18 100644
--- a/docs/ja/quickstart/quickstart-cosmos.md
+++ b/docs/ja/quickstart/quickstart-cosmos.md
@@ -6,7 +6,7 @@ In this Quick start guide, we're going to start with a simple Cosmos starter pro
このガイドの最後には、SubQuery ノード上で動作する SubQuery プロジェクトを作成し、GraphQL エンドポイントからデータを照会できるようになります。
-まだの方は、SubQueryで使われている[用語](../#terminology)に慣れることをお勧めします。
+まだの方は、SubQuery で使われている[用語](../#terminology)に慣れることをお勧めします。
**The goal of this quick start guide is to adapt the standard starter project to begin indexing all votes on the [Terra Developer Fund](https://daodao.zone/multisig/juno1lgnstas4ruflg0eta394y8epq67s4rzhg5anssz3rc5zwvjmmvcql6qps2) (which also contributed to SubQuery) from Cosmos; it should only take 10-15 minutes.**
@@ -44,8 +44,8 @@ After the initialisation process is complete, you should see a folder with your
Last, under the project directory, run the following command to install the new project's dependencies.
- ```shell cd PROJECT_NAME yarn install ```
- ```shell cd PROJECT_NAME npm install ```
+::: code-tabs @tab:active yarn `shell cd PROJECT_NAME yarn install `
+@tab npm `shell cd PROJECT_NAME npm install ` :::
## Making Changes to your Project
@@ -75,8 +75,8 @@ type Vote @entity {
**Important: When you make any changes to the schema file, please ensure that you regenerate your types directory. Do this now.**
- ```shell yarn codegen ```
- ```shell npm run-script codegen ```
+::: code-tabs @tab:active yarn `shell yarn codegen `
+@tab npm `shell npm run-script codegen ` :::
You'll find the generated models in the `/src/types/models` directory. For more information about the `schema.graphql` file, check out our documentation under [Build/GraphQL Schema](../build/graphql.md)
@@ -145,7 +145,7 @@ For more information about mapping functions, check out our documentation under
In order to run your new SubQuery project, we first need to build our work. Run the build command from the project's root directory.
- ```shell yarn build ``` ```shell npm run-script build ```
+::: code-tabs @tab:active yarn `shell yarn build ` @tab npm `shell npm run-script build ` :::
**Important: Whenever you make changes to your mapping functions, you'll need to rebuild your project**
@@ -159,7 +159,7 @@ All configuration that controls how a SubQuery node is run is defined in this `d
Under the project directory, run the following command:
- ```shell yarn start:docker ``` ```shell npm run-script start:docker ```
+::: code-tabs @tab:active yarn `shell yarn start:docker ` @tab npm `shell npm run-script start:docker ` :::
It may take some time to download the required packages ([`@subql/node`](https://www.npmjs.com/package/@subql/node), [`@subql/query`](https://www.npmjs.com/package/@subql/query), and Postgres) for the first time but soon you'll see a running SubQuery node. Be patient here.
@@ -173,10 +173,9 @@ For a new SubQuery starter project, you can try the following query to get a tas
```graphql
query {
- votes(
+ votes(
first: 5
- orderBy: BLOCK_HEIGHT_DESC
- # filter: {proposalID: {equalTo: "4"}}
+ orderBy: BLOCK_HEIGHT_DESC # filter: {proposalID: {equalTo: "4"}}
) {
nodes {
id
diff --git a/docs/ja/quickstart/quickstart-polkadot.md b/docs/ja/quickstart/quickstart-polkadot.md
index 4742b385a2b..baf1ba6497b 100644
--- a/docs/ja/quickstart/quickstart-polkadot.md
+++ b/docs/ja/quickstart/quickstart-polkadot.md
@@ -4,7 +4,7 @@ In this quick start guide, we're going to start with a simple Substrate/Polkadot
このガイドの最後には、SubQuery ノード上で動作する SubQuery プロジェクトを作成し、GraphQL エンドポイントからデータを照会できるようになります。
-まだの方は、SubQueryで使われている[用語](../#terminology)に慣れることをお勧めします。
+まだの方は、SubQuery で使われている[用語](../#terminology)に慣れることをお勧めします。
**The goal of this quick start guide is to adapt the standard starter project to begin indexing all transfers from Polkadot; it should only take 10-15 minutes.**
@@ -43,10 +43,10 @@ subql init
You'll be asked certain questions as the SubQuery project is initialised:
- Project name: A project name for your SubQuery project
-- Network family: The layer-1 blockchain network family that this SubQuery project will be developed to index. Use the arrow keys to select from the available options. For this guide, we will use *"Substrate"*
-- Network: The specific network that this SubQuery project will be developed to index. Use the arrow keys to select from the available options. For this guide, we will use *"Polkadot"*
-- Template project: Select a SubQuery template project that will provide a starting point to begin development. We suggest selecting the *"subql-starter"* project.
-- RPC endpoint: Provide an HTTPS URL to a running RPC endpoint that will be used by default for this project. You can quickly access public endpoints for different Polkadot networks, create your own private dedicated node using [OnFinality](https://app.onfinality.io) or just use the default Polkadot endpoint. This RPC node must be an archive node (have the full chain state). For this guide, we will use the default value *"https://polkadot.api.onfinality.io"*
+- Network family: The layer-1 blockchain network family that this SubQuery project will be developed to index. Use the arrow keys to select from the available options. For this guide, we will use _"Substrate"_
+- Network: The specific network that this SubQuery project will be developed to index. Use the arrow keys to select from the available options. For this guide, we will use _"Polkadot"_
+- Template project: Select a SubQuery template project that will provide a starting point to begin development. We suggest selecting the _"subql-starter"_ project.
+- RPC endpoint: Provide an HTTPS URL to a running RPC endpoint that will be used by default for this project. You can quickly access public endpoints for different Polkadot networks, create your own private dedicated node using [OnFinality](https://app.onfinality.io) or just use the default Polkadot endpoint. This RPC node must be an archive node (have the full chain state). For this guide, we will use the default value _"https://polkadot.api.onfinality.io"_
- Git repository: Provide a Git URL to a repo that this SubQuery project will be hosted in (when hosted in SubQuery Explorer) or accept the provided default.
- Authors: Enter the owner of this SubQuery project here (e.g. your name!) or accept the provided default.
- Description: Provide a short paragraph about your project that describes what data it contains and what users can do with it or accept the provided default.
@@ -57,8 +57,8 @@ After the initialisation process is complete, you should see that a folder with
Last, under the project directory, run the following command to install the new project's dependencies.
- ```shell cd PROJECT_NAME yarn install ```
- ```shell cd PROJECT_NAME npm install ```
+::: code-tabs @tab:active yarn `shell cd PROJECT_NAME yarn install `
+@tab npm `shell cd PROJECT_NAME npm install ` :::
## Making Changes to your Project
@@ -88,8 +88,8 @@ type Transfer @entity {
**Important: When you make any changes to the schema file, please ensure that you regenerate your types directory.**
- ```shell yarn codegen ```
- ```shell npm run-script codegen ```
+::: code-tabs @tab:active yarn `shell yarn codegen `
+@tab npm `shell npm run-script codegen ` :::
You'll find the generated models in the `/src/types/models` directory. For more information about the `schema.graphql` file, check out our documentation under [Build/GraphQL Schema](../build/graphql.md)
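For orientation, the sketch below shows roughly what one of these generated model classes looks like. It is illustrative only: the real file is produced by codegen from your `schema.graphql` and should never be edited by hand, and the field list assumes the `Transfer` entity used in this guide.

```ts
import { Entity } from "@subql/types";

// Rough shape of a generated model (sketch only; codegen produces the real class).
export class Transfer implements Entity {
  constructor(id: string) {
    this.id = id;
  }

  public id: string;
  public blockNumber?: bigint;
  public from?: string;
  public to?: string;
  public amount?: bigint;

  // Persists this entity through the injected store.
  async save(): Promise<void> {
    /* generated */
  }

  // Loads a previously saved entity by its id.
  static async get(id: string): Promise<Transfer | undefined> {
    /* generated */
    return undefined;
  }
}
```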
@@ -133,22 +133,22 @@ import { Transfer } from "../types";
import { Balance } from "@polkadot/types/interfaces";
export async function handleEvent(event: SubstrateEvent): Promise<void> {
- // Get data from the event
- // The balances.transfer event has the following payload \[from, to, value\]
- // logger.info(JSON.stringify(event));
- const from = event.event.data[0];
- const to = event.event.data[1];
- const amount = event.event.data[2];
-
- // Create the new transfer entity
- const transfer = new Transfer(
- `${event.block.block.header.number.toNumber()}-${event.idx}`,
- );
- transfer.blockNumber = event.block.block.header.number.toBigInt();
- transfer.from = from.toString();
- transfer.to = to.toString();
- transfer.amount = (amount as Balance).toBigInt();
- await transfer.save();
+ // Get data from the event
+ // The balances.transfer event has the following payload \[from, to, value\]
+ // logger.info(JSON.stringify(event));
+ const from = event.event.data[0];
+ const to = event.event.data[1];
+ const amount = event.event.data[2];
+
+ // Create the new transfer entity
+ const transfer = new Transfer(
+ `${event.block.block.header.number.toNumber()}-${event.idx}`
+ );
+ transfer.blockNumber = event.block.block.header.number.toBigInt();
+ transfer.from = from.toString();
+ transfer.to = to.toString();
+ transfer.amount = (amount as Balance).toBigInt();
+ await transfer.save();
}
```
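As a small stylistic aside (not part of the guide itself), the three indexed reads at the top of the handler can also be written with array destructuring:

```ts
// Equivalent to reading event.event.data[0], [1] and [2] individually (sketch only).
const {
  event: {
    data: [from, to, amount],
  },
} = event;
```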
@@ -160,7 +160,7 @@ For more information about mapping functions, check out our documentation under
In order to run your new SubQuery Project, we first need to build our work. Run the build command from the project's root directory.
- ```shell yarn build ``` ```shell npm run-script build ```
+::: code-tabs @tab:active yarn `shell yarn build ` @tab npm `shell npm run-script build ` :::
**Important: Whenever you make changes to your mapping functions, you will need to rebuild your project**
@@ -174,7 +174,7 @@ All configuration that controls how a SubQuery node is run is defined in the `do
Under the project directory, run the following command:
- ```shell yarn start:docker ``` ```shell npm run-script start:docker ```
+::: code-tabs @tab:active yarn `shell yarn start:docker ` @tab npm `shell npm run-script start:docker ` :::
It may take some time to download the required packages ([`@subql/node`](https://www.npmjs.com/package/@subql/node), [`@subql/query`](https://www.npmjs.com/package/@subql/query), and Postgres) for the first time but soon you should see a running SubQuery node in the terminal screen.
@@ -189,10 +189,7 @@ For a new SubQuery starter project, try the following query to understand how it
```graphql
{
query {
- transfers(
- first: 10,
- orderBy: AMOUNT_DESC
- ) {
+ transfers(first: 10, orderBy: AMOUNT_DESC) {
nodes {
id
amount
diff --git a/docs/ja/quickstart/quickstart-terra.md b/docs/ja/quickstart/quickstart-terra.md
index 0a0bbde9829..3c17d841a58 100644
--- a/docs/ja/quickstart/quickstart-terra.md
+++ b/docs/ja/quickstart/quickstart-terra.md
@@ -6,7 +6,7 @@ In this Quick start guide, we're going to start with a simple Terra starter proj
このガイドの最後には、SubQuery ノード上で動作する SubQuery プロジェクトを作成し、GraphQL エンドポイントからデータを照会できるようになります。
-まだの方は、SubQueryで使われている[用語](../#terminology)に慣れることをお勧めします。
+まだの方は、SubQuery で使われている[用語](../#terminology)に慣れることをお勧めします。
**The goal of this quick start guide is to adapt the standard starter project to begin indexing all transfers from Terra; it should only take 10-15 minutes**
@@ -45,11 +45,11 @@ subql init
You'll be asked certain questions as the SubQuery project is initialised:
- Project Name: A name for your SubQuery project
-- Network Family: The layer-1 blockchain network family that this SubQuery project will be developed to index, use the arrow keys on your keyboard to select from the options, for this guide we will use *"Terra"*
-- Network: The specific network that this SubQuery project will be developed to index, use the arrow keys on your keyboard to select from the options, for this guide we will use *"Terra"*
-- Template: Select a SubQuery project template that will provide a starting point to begin development, we suggest selecting the *"Starter project"*
+- Network Family: The layer-1 blockchain network family that this SubQuery project will be developed to index, use the arrow keys on your keyboard to select from the options, for this guide we will use _"Terra"_
+- Network: The specific network that this SubQuery project will be developed to index, use the arrow keys on your keyboard to select from the options, for this guide we will use _"Terra"_
+- Template: Select a SubQuery project template that will provide a starting point to begin development, we suggest selecting the _"Starter project"_
- Git repository (Optional): Provide a Git URL to a repo that this SubQuery project will be hosted in (when hosted in SubQuery Explorer)
-- RPC endpoint (Required): Provide a HTTPS URL to a running RPC endpoint that will be used by default for this project. This RPC node must be an archive node (have the full chain state). For this guide we will use the default value *"https://terra-columbus-5.beta.api.onfinality.io"*
+- RPC endpoint (Required): Provide a HTTPS URL to a running RPC endpoint that will be used by default for this project. This RPC node must be an archive node (have the full chain state). For this guide we will use the default value _"https://terra-columbus-5.beta.api.onfinality.io"_
- Authors (Required): Enter the owner of this SubQuery project here (e.g. your name!)
- Description (Optional): You can provide a short paragraph about your project that describes what data it contains and what users can do with it
- Version (Required): Enter a custom version number or use the default (`1.0.0`)
@@ -59,8 +59,8 @@ After the initialisation process is complete, you should see a folder with your
Last, under the project directory, run the following command to install the new project's dependencies.
- ```shell cd PROJECT_NAME yarn install ```
- ```shell cd PROJECT_NAME npm install ```
+::: code-tabs @tab:active yarn `shell cd PROJECT_NAME yarn install `
+@tab npm `shell cd PROJECT_NAME npm install ` :::
## Making Changes to your Project
@@ -91,8 +91,8 @@ type Transfer @entity {
**Important: When you make any changes to the schema file, please ensure that you regenerate your types directory. Do this now.**
- ```shell yarn codegen ```
- ```shell npm run-script codegen ```
+::: code-tabs @tab:active yarn `shell yarn codegen `
+@tab npm `shell npm run-script codegen ` :::
You'll find the generated models in the `/src/types/models` directory. For more information about the `schema.graphql` file, check out our documentation under [Build/GraphQL Schema](../build/graphql.md)
@@ -143,30 +143,30 @@ import { MsgExecuteContract } from "@terra-money/terra.js";
export async function handleEvent(
  event: TerraEvent<MsgExecuteContract>
): Promise<void> {
- // Print debugging data from the event
- // logger.info(JSON.stringify(event));
-
- // Create the new transfer entity with a unique ID
- const transfer = new Transfer(
- `${event.tx.tx.txhash}-${event.msg.idx}-${event.idx}`
- );
- transfer.blockHeight = BigInt(event.block.block.block.header.height);
- transfer.txHash = event.tx.tx.txhash;
- for (const attr of event.event.attributes) {
- switch (attr.key) {
- case "sender":
- transfer.sender = attr.value;
- break;
- case "recipient":
- transfer.recipient = attr.value;
- break;
- case "amount":
- transfer.amount = attr.value;
- break;
- default:
- }
+ // Print debugging data from the event
+ // logger.info(JSON.stringify(event));
+
+ // Create the new transfer entity with a unique ID
+ const transfer = new Transfer(
+ `${event.tx.tx.txhash}-${event.msg.idx}-${event.idx}`
+ );
+ transfer.blockHeight = BigInt(event.block.block.block.header.height);
+ transfer.txHash = event.tx.tx.txhash;
+ for (const attr of event.event.attributes) {
+ switch (attr.key) {
+ case "sender":
+ transfer.sender = attr.value;
+ break;
+ case "recipient":
+ transfer.recipient = attr.value;
+ break;
+ case "amount":
+ transfer.amount = attr.value;
+ break;
+ default:
}
- await transfer.save();
+ }
+ await transfer.save();
}
```
@@ -178,7 +178,7 @@ For more information about mapping functions, check out our documentation under
In order to run your new SubQuery Project, we first need to build our work. Run the build command from the project's root directory.
- ```shell yarn build ``` ```shell npm run-script build ```
+::: code-tabs @tab:active yarn `shell yarn build ` @tab npm `shell npm run-script build ` :::
**Important: Whenever you make changes to your mapping functions, you'll need to rebuild your project**
@@ -192,7 +192,7 @@ All configuration that controls how a SubQuery node is run is defined in this `d
Under the project directory, run the following command:
- ```shell yarn start:docker ``` ```shell npm run-script start:docker ```
+::: code-tabs @tab:active yarn `shell yarn start:docker ` @tab npm `shell npm run-script start:docker ` :::
It may take some time to download the required packages ([`@subql/node`](https://www.npmjs.com/package/@subql/node), [`@subql/query`](https://www.npmjs.com/package/@subql/query), and Postgres) for the first time but soon you'll see a running SubQuery node. Be patient here.
@@ -207,10 +207,7 @@ For a new SubQuery starter project, you can try the following query to get a tas
```graphql
{
query {
- transfers(
- first: 10,
- orderBy: ID_DESC
- ) {
+ transfers(first: 10, orderBy: ID_DESC) {
nodes {
id
txHash
diff --git a/docs/ja/quickstart/quickstart.md b/docs/ja/quickstart/quickstart.md
index 68d38a77381..510abb384b2 100644
--- a/docs/ja/quickstart/quickstart.md
+++ b/docs/ja/quickstart/quickstart.md
@@ -89,8 +89,8 @@ After you complete the initialisation process, you will see a folder with your p
Finally, run the following command to install the new project’s dependencies from within the new project's directory.
- ```shell cd PROJECT_NAME yarn install ```
- ```shell cd PROJECT_NAME npm install ```
+::: code-tabs @tab:active yarn `shell cd PROJECT_NAME yarn install `
+@tab npm `shell cd PROJECT_NAME npm install ` :::
You have now initialised your first SubQuery project with just a few simple steps. Let’s now customise the standard template project for a specific blockchain of interest.
@@ -104,4 +104,4 @@ There are 3 important files that need to be modified. These are:
2. The Project Manifest in `project.yaml`.
3. The Mapping functions in `src/mappings/` directory.
-SubQuery supports various blockchain networks and provides a dedicated guide for each of them. Select your preferred blockchain under 2. Specific Chains and continue the quick start guide.
\ No newline at end of file
+SubQuery supports various blockchain networks and provides a dedicated guide for each of them. Select your preferred blockchain under 2. Specific Chains and continue the quick start guide.
diff --git a/docs/ja/quickstart/understanding-helloworld.md b/docs/ja/quickstart/understanding-helloworld.md
index 26189b8ef19..02f97197576 100644
--- a/docs/ja/quickstart/understanding-helloworld.md
+++ b/docs/ja/quickstart/understanding-helloworld.md
@@ -1,6 +1,6 @@
# Hello World の説明
-[Hello World クイックスタート ガイド](helloworld-localhost.md)では、いくつかの簡単なコマンドを実行し、非常に迅速にプロジェクトを立ち上げて実行する方法を説明しました。 これにより、すべての前提条件が整っていることを確認し、ローカルのプレイグラウンドを使用して、SubQueryから最初のデータを取得するための簡単なクエリを作成することができたのです。 ここでは、これらのコマンドの意味を詳しく説明します。
+[Hello World クイックスタート ガイド](helloworld-localhost.md)では、いくつかの簡単なコマンドを実行し、非常に迅速にプロジェクトを立ち上げて実行する方法を説明しました。 これにより、すべての前提条件が整っていることを確認し、ローカルのプレイグラウンドを使用して、SubQuery から最初のデータを取得するための簡単なクエリを作成することができたのです。 ここでは、これらのコマンドの意味を詳しく説明します。
## subql init
@@ -14,21 +14,21 @@
![key subql files](/assets/img/main_subql_files.png)
-これらのファイルは私たちが行う全てのコアファイルです。 そのため、これらのファイルについては、別の記事で詳しく説明します。 今のところ、スキーマにはユーザーがSubQuery APIにリクエストできるデータの記述があり、プロジェクトのyamlファイルには「設定」タイプのパラメータ、そしてもちろんmappingHandlersにはデータを変換する関数が含まれるtypescriptがあることだけは知っておいてください。
+これらのファイルは私たちが行う全てのコアファイルです。 そのため、これらのファイルについては、別の記事で詳しく説明します。 今のところ、スキーマにはユーザーが SubQuery API にリクエストできるデータの記述があり、プロジェクトの yaml ファイルには「設定」タイプのパラメータ、そしてもちろん mappingHandlers にはデータを変換する関数が含まれる typescript があることだけは知っておいてください。
## yarn install
次に実行するのは`yarn install`です。 `npm install` も使用可能です。
-> 歴史を簡単に説明します。 Node Package Manager(npm)は、2010年にリリースされ、JavaScript開発者の間で絶大な人気を誇るパッケージマネージャです。 Node.jsをシステムにインストールする際に、自動的にインストールされるデフォルトのパッケージです。 Yarnは当時、npmで作業する際のパフォーマンスやセキュリティの欠点を解消する目的で、2016年にFacebookがリリースしたものです。
+> 歴史を簡単に説明します。 Node Package Manager(npm)は、2010 年にリリースされ、JavaScript 開発者の間で絶大な人気を誇るパッケージマネージャです。 Node.js をシステムにインストールする際に、自動的にインストールされるデフォルトのパッケージです。 Yarn は当時、npm で作業する際のパフォーマンスやセキュリティの欠点を解消する目的で、2016 年に Facebook がリリースしたものです。
-yarnが行うのは、`package.json`ファイルを見て、他の様々な依存関係をダウンロードすることです。 `package.json`ファイルを見ると、あまり依存関係がないように見えますが、コマンドを実行すると、18,983個のファイルが追加されていることに気づきます。 これは、それぞれの依存関係がまた依存関係を持つことになるからです。
+yarn が行うのは、`package.json`ファイルを見て、他の様々な依存関係をダウンロードすることです。 `package.json`ファイルを見ると、あまり依存関係がないように見えますが、コマンドを実行すると、18,983 個のファイルが追加されていることに気づきます。 これは、それぞれの依存関係がまた依存関係を持つことになるからです。
![key subql files](/assets/img/dependencies.png)
## yarn codegen
-次に `yarn codegen` または `npm run-script codegen` を実行します。 これはGraphQLスキーマ(`schema.graphql`内)を取得し、関連するtypescriptモデルファイルを生成します(したがって出力ファイルの拡張子は.tsになります)。 これらの生成されたファイルは決して変更せず、ソースの `schema.graphql` ファイルのみを変更します。
+次に `yarn codegen` または `npm run-script codegen` を実行します。 これは GraphQL スキーマ(`schema.graphql`内)を取得し、関連する typescript モデルファイルを生成します(したがって出力ファイルの拡張子は.ts になります)。 これらの生成されたファイルは決して変更せず、ソースの `schema.graphql` ファイルのみを変更します。
![key subql files](/assets/img/typescript.png)
@@ -40,7 +40,7 @@ yarnが行うのは、`package.json`ファイルを見て、他の様々な依
## docker-compose
-最後は、Dockerコマンド`docker-compose pull && docker-compose up`(別々に実行することも可能)を組み合わせて実行します。 `pull` コマンドは、Docker Hub から必要なすべてのイメージを取得し、 `up` コマンドはコンテナを起動します。
+最後は、Docker コマンド`docker-compose pull && docker-compose up`(別々に実行することも可能)を組み合わせて実行します。 `pull` コマンドは、Docker Hub から必要なすべてのイメージを取得し、 `up` コマンドはコンテナを起動します。
```shell
> docker-compose pull
@@ -49,16 +49,16 @@ Pulling subquery-node ... done
Pulling graphql-engine ... done
```
-コンテナが開始されると、ターミナルはノードの状態とGraphQLエンジンの状態を示す多くのテキストを出力します。 以下のように表示されます:
+コンテナが開始されると、ターミナルはノードの状態と GraphQL エンジンの状態を示す多くのテキストを出力します。 以下のように表示されます:
```
subquery-node_1 | 2021-06-06T02:04:25.490Z INFO fetch block [1, 100]
```
-SubQueryノードが同期を開始したことがわかります。
+SubQuery ノードが同期を開始したことがわかります。
## 概要
-さて、カバーの中で何が起きているのかがわかったところで、問題はここから先です。 あなたに自信があれば、プロジェクトの作成方法の学習に飛び込んで、3 つの重要なファイルについて詳しく学ぶことができます。 マニフェストファイル、GraphQL スキーマ、およびマッピングファイル。
+さて、カバーの中で何が起きているのかがわかったところで、問題はここから先です。 あなたに自信があれば、[プロジェクトの作成方法](../quickstart/quickstart.md)の学習に飛び込んで、3 つの重要なファイルについて詳しく学ぶことができます。 マニフェストファイル、GraphQL スキーマ、およびマッピングファイル。
-それ以外の場合は、SubQueryがホストするインフラストラクチャで、この Hello World の例をどのように実行するかを見ていきます。 スタートブロックを変更し、すぐに利用可能なオープンソースプロジェクトを実行することで、SubQueryプロジェクトを実行することにします。
+それ以外の場合は、SubQuery がホストするインフラストラクチャで、この Hello World の例をどのように実行するかを見ていきます。 スタートブロックを変更し、すぐに利用可能なオープンソースプロジェクトを実行することで、SubQuery プロジェクトを実行することにします。
diff --git a/docs/ja/run_publish/connect.md b/docs/ja/run_publish/connect.md
index f06aad4590f..f1a2f127d81 100644
--- a/docs/ja/run_publish/connect.md
+++ b/docs/ja/run_publish/connect.md
@@ -2,10 +2,10 @@
Once your deployment has successfully completed and our nodes have indexed your data from the chain, you'll be able to connect to your project via the displayed Query endpoint.
-![プロジェクトを展開および同期する](/assets/img/projects-deploy-sync.png)
+![プロジェクトを展開および同期する](/assets/img/projects_deploy_sync.png)
-または、プロジェクトのタイトルの横にある3つの点をクリックして、SubQuery Explorer で表示することもできます。 There you can use the in browser playground to get started.
+または、プロジェクトのタイトルの横にある 3 つの点をクリックして、SubQuery Explorer で表示することもできます。 There you can use the in browser playground to get started.
-![Projects in SubQuery Explorer](/assets/img/projects-explorer.png)
+![Projects in SubQuery Explorer](/assets/img/projects_explorer.png)
-::: info Note Learn more about the [GraphQL Query language.](./graphql.md) :::
+::: tip Note Learn more about the [GraphQL Query language.](./graphql.md) :::
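Once the Query endpoint is displayed, any HTTP client can POST GraphQL queries to it. Below is a minimal sketch of doing this from a dApp; the endpoint URL is a placeholder and the `transfers` entity is assumed from the starter projects.

```ts
// Minimal sketch: POST a GraphQL query to the displayed Query endpoint.
const QUERY_ENDPOINT = "https://api.subquery.network/sq/ORG/PROJECT"; // placeholder

async function fetchTransfers(): Promise<unknown> {
  const response = await fetch(QUERY_ENDPOINT, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      query: "{ transfers(first: 5, orderBy: AMOUNT_DESC) { nodes { id amount } } }",
    }),
  });
  const { data } = await response.json();
  return data;
}
```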
diff --git a/docs/ja/run_publish/query.md b/docs/ja/run_publish/query.md
index a0a9559008c..ffaf5e3fc56 100644
--- a/docs/ja/run_publish/query.md
+++ b/docs/ja/run_publish/query.md
@@ -4,12 +4,12 @@
![SubQuery Explorer](https://static.subquery.network/media/explorer/explorer-header.png)
-SubQuery Explorerは簡単に開始できます。 私たちはこれらのSubQueryノードをオンラインでホスティングしており、誰もが無料でそれぞれクエリを実行することができます。 これらのマネージドノードは、SubQueryチームによってパフォーマンスレベルで監視および実行され、これにより本番アプリケーションの使用と依存が可能になります。
+SubQuery Explorer は簡単に開始できます。 私たちはこれらの SubQuery ノードをオンラインでホスティングしており、誰もが無料でそれぞれクエリを実行することができます。 これらのマネージドノードは、SubQuery チームによってパフォーマンスレベルで監視および実行され、これにより本番アプリケーションの使用と依存が可能になります。
![SubQuery Project](https://static.subquery.network/media/explorer/explorer-project.png)
-SubQuery Explorer は、サンプルクエリを使用して利用可能なデータを検出するためのプレイグラウンドを提供していることにも注意してください。コードを実装することなくブラウザで直接クエリをテストすることができます。 さらに、開発者が世界中のPolkadotデータを照会し、分析するための旅をよりよくサポートするために、ドキュメントにいくつかの小さな改良を加えました。
+SubQuery Explorer は、サンプルクエリを使用して利用可能なデータを検出するためのプレイグラウンドを提供していることにも注意してください。コードを実装することなくブラウザで直接クエリをテストすることができます。 さらに、開発者が世界中の Polkadot データを照会し、分析するための旅をよりよくサポートするために、ドキュメントにいくつかの小さな改良を加えました。
On the top right of the playground, you'll find a _Docs_ button that will open a documentation drawer. このドキュメントは自動的に生成され、クエリできるエンティティやメソッドを見つけるのに役立ちます。
-::: info Note Learn more about the [GraphQL Query language.](./graphql.md) :::
+::: tip Note Learn more about the [GraphQL Query language.](./graphql.md) :::
diff --git a/docs/ja/run_publish/references.md b/docs/ja/run_publish/references.md
index 1c52897aec5..9d7d3fce7e4 100644
--- a/docs/ja/run_publish/references.md
+++ b/docs/ja/run_publish/references.md
@@ -21,10 +21,10 @@ COMMANDS
This command uses webpack to generate a bundle of a subquery project.
-| Options | 説明 |
-| ------------------ | ---------------------------------------------------------------------------------------------------------- |
-| -l, --location | local folder of subquery project (if not in folder already) |
-| -o, --output | specify output folder of build e.g. build-folder |
+| Options                                            | 説明                                                         |
+| --------------------------------------------------- | ------------------------------------------------------------ |
+| -l, --location                                       | local folder of subquery project (if not in folder already)  |
+| -o, --output                                         | specify output folder of build e.g. build-folder              |
| --mode=(production \| prod \| development \| dev)   | [ default: production ]                                       |
- With `subql build` you can specify additional entry points in exports field although it will always build `index.ts` automatically.
@@ -106,7 +106,7 @@ This displays the current version.
### reindex
-:::warning In order to use this command, you require `@subql/node:v1.10.0`/`@subql/node-:v1.10.0` or above. :::
+:::warning In order to use this command, you require `@subql/node:v1.10.0`/`@subql/node-YOURNETWORK:v1.10.0` or above. :::
When using the reindex command, historical must be enabled for the targeted project (`--disable-historical=false`). After starting the project, it will print out a log stating whether historical is enabled or not.
@@ -122,7 +122,7 @@ If the `targetHeight` is less than the declared starting height, it will execute
subql-node -f /example/subql-project reindex --targetHeight=30
```
-::: info Note
+::: tip Note
Once the command is executed and the state has been rolled back to the specified height, the application will exit. You can then start up the indexer to proceed again from this height.
:::
@@ -134,7 +134,7 @@ This command forces the project schemas and tables to be regenerated. It is help
`-f`, `--subquery` flag must be passed in, to set path of the targeted project.
-::: info Note Similar to `reindex` command, the application would exit upon completion. :::
+::: tip Note Similar to `reindex` command, the application would exit upon completion. :::
```shell
subql-node -f /example/subql-project force-clean
@@ -346,7 +346,7 @@ This will move block fetching and processing into a worker. By default, this fea
It is at an early experimental stage at the moment, but we plan to enable it by default. :::
-::: info Note
+::: tip Note
This feature is available for Substrate and Cosmos, and soon will be integrated for Avalanche.
:::
diff --git a/docs/ja/run_publish/run.md b/docs/ja/run_publish/run.md
index b98c2543378..562e49b153f 100644
--- a/docs/ja/run_publish/run.md
+++ b/docs/ja/run_publish/run.md
@@ -4,7 +4,7 @@ This guide works through how to run a local SubQuery node on your infrastructure
## Using Docker
-An alternative solution is to run a Docker Container, defined by the `docker-compose.yml` file. For a new project that has been just initialised you won't need to change anything here.
+An alternative solution is to run a **Docker Container**, defined by the `docker-compose.yml` file. For a new project that has been just initialised you won't need to change anything here.
Under the project directory run the following command:
@@ -12,7 +12,7 @@ Under the project directory run the following command:
docker-compose pull && docker-compose up
```
-::: info Note It may take some time to download the required packages ([`@subql/node`](https://www.npmjs.com/package/@subql/node), [`@subql/query`](https://www.npmjs.com/package/@subql/query), and Postgres) for the first time but soon you'll see a running SubQuery node. :::
+::: tip Note It may take some time to download the required packages ([`@subql/node`](https://www.npmjs.com/package/@subql/node), [`@subql/query`](https://www.npmjs.com/package/@subql/query), and Postgres) for the first time but soon you'll see a running SubQuery node. :::
## Running an Indexer (subql/node)
@@ -32,90 +32,80 @@ CREATE EXTENSION IF NOT EXISTS btree_gist;
### Installation
-
-
+::: code-tabs
+@tab Substrate/Polkadot
```shell
# NPM
npm install -g @subql/node
```
-
-
+@tab Terra
```shell
# NPM
npm install -g @subql/node-terra
```
-
-
+@tab Avalanche
```shell
# NPM
npm install -g @subql/node-avalanche
```
-
-
+@tab Cosmos
```shell
# NPM
npm install -g @subql/node-cosmos
```
-
-
+@tab Algorand
```shell
# NPM
npm install -g @subql/node-algorand
```
-
-
+:::
::: danger Please note that we **DO NOT** encourage the use of `yarn global` due to its poor dependency management which may lead to errors down the line. :::
Once installed, you can start a node with the following command:
-
-
+::: code-tabs
+@tab Substrate/Polkadot
```shell
subql-node
```
-
-
+@tab Terra
```shell
subql-node-terra
```
-
-
+@tab Avalanche
```shell
subql-node-avalanche
```
-
-
+@tab Cosmos
```shell
subql-node-cosmos
```
-
-
+@tab Algorand
```shell
subql-node-algorand
```
-
-
+:::
### Key Commands
@@ -123,43 +113,38 @@ The following commands will assist you to complete the configuration of a SubQue
#### Point to local project path
-
-
+::: code-tabs
+@tab Substrate/Polkadot
```shell
subql-node -f your-project-path
```
-
-
+@tab Terra
```shell
subql-node-terra -f your-project-path
```
-
-
+@tab Avalanche
```shell
subql-node-avalanche -f your-project-path
```
-
-
+@tab Cosmos
```shell
subql-node-cosmos -f your-project-path
```
-
-
+@tab Algorand
```shell
subql-node-algorand -f your-project-path
```
-
-
+:::
#### Connect to database
@@ -176,43 +161,38 @@ Depending on the configuration of your Postgres database (e.g. a different datab
#### Specify a configuration file
-
-
+::: code-tabs
+@tab Substrate/Polkadot
```shell
subql-node -c your-project-config.yml
```
-
-
+@tab Terra
```shell
subql-node-terra -c your-project-config.yml
```
-
-
+@tab Avalanche
```shell
subql-node-avalanche -c your-project-config.yml
```
-
-
+@tab Cosmos
```shell
subql-node-cosmos -c your-project-config.yml
```
-
-
+@tab Algorand
```shell
subql-node-algorand -c your-project-config.yml
```
-
-
+:::
This will point the query node to a manifest file which can be in YAML or JSON format.
@@ -230,43 +210,38 @@ When the indexer first indexes the chain, fetching single blocks will significan
#### Run in local mode
-
-
+::: code-tabs
+@tab Substrate/Polkadot
```shell
subql-node -f your-project-path --local
```
-
-
+@tab Terra
```shell
subql-node-terra -f your-project-path --local
```
-
-
+@tab Avalanche
```shell
subql-node-avalanche -f your-project-path --local
```
-
-
+@tab Cosmos
```shell
subql-node-cosmos -f your-project-path --local
```
-
-
+@tab Algorand
```shell
subql-node-algorand -f your-project-path --local
```
-
-
+:::
For debugging purposes, users can run the node in local mode. Switching to local mode will create Postgres tables in the default schema `public`.
diff --git a/docs/ja/run_publish/subscription.md b/docs/ja/run_publish/subscription.md
index 9bbc2661f96..29cf660d492 100644
--- a/docs/ja/run_publish/subscription.md
+++ b/docs/ja/run_publish/subscription.md
@@ -6,7 +6,7 @@ SubQuery now also supports Graphql Subscriptions. Like queries, subscriptions en
Subscriptions are very useful when you want your client application to change data or show some new data as soon as that change occurs or the new data is available. Subscriptions allow you to _subscribe_ to your SubQuery project for changes.
-::: info Note Read more about [Subscriptions](https://www.apollographql.com/docs/react/data/subscriptions/). :::
+::: tip Note Read more about [Subscriptions](https://www.apollographql.com/docs/react/data/subscriptions/). :::
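As a concrete illustration, here is a minimal browser-side sketch of listening for changes with the `graphql-ws` client. The endpoint URL is a placeholder, the `transfers` entity is assumed from the starter projects, the payload fields are assumptions, and it presumes the query service speaks the graphql-ws protocol.

```ts
import { createClient } from "graphql-ws";

// Placeholder endpoint; substitute your project's websocket query endpoint.
const client = createClient({ url: "wss://api.subquery.network/sq/ORG/PROJECT" });

// Receive a message every time a Transfer entity changes.
const dispose = client.subscribe(
  { query: "subscription { transfers { id mutation_type } }" },
  {
    next: ({ data }) => console.log("change received", data),
    error: (err) => console.error("subscription error", err),
    complete: () => console.log("subscription closed"),
  }
);

// Call dispose() to stop listening.
```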
## How to Subscribe to an Entity
diff --git a/docs/ja/run_publish/upgrade.md b/docs/ja/run_publish/upgrade.md
index c3372468844..92da18a5d1e 100644
--- a/docs/ja/run_publish/upgrade.md
+++ b/docs/ja/run_publish/upgrade.md
@@ -77,10 +77,10 @@ If you just want to upgrade to the latest indexer ([`@subql/node`](https://www.n
デプロイが正常に完了し、ノードがチェーンからデータのインデックスを作成したら、表示された GraphQL クエリエンドポイントからプロジェクトに接続することができるようになります。
-![プロジェクトを展開および同期する](/assets/img/projects-deploy-sync.png)
+![プロジェクトを展開および同期する](/assets/img/projects_deploy_sync.png)
または、プロジェクトのタイトルの横にある 3 つの点をクリックして、SubQuery Explorer で表示することもできます。 There you can use the in browser playground to get started - [read more about how to use our Explorer here](../run_publish/query.md).
-![Projects in SubQuery Explorer](/assets/img/projects-explorer.png)
+![Projects in SubQuery Explorer](/assets/img/projects_explorer.png)
-::: info Note Learn more about the [GraphQL Query language.](./graphql.md) :::
+::: tip Note Learn more about the [GraphQL Query language.](./graphql.md) :::
diff --git a/docs/ja/subquery_network/introduction.md b/docs/ja/subquery_network/introduction.md
index 295d4a0e0d8..6394db677b8 100644
--- a/docs/ja/subquery_network/introduction.md
+++ b/docs/ja/subquery_network/introduction.md
@@ -18,22 +18,22 @@ There’s a role for everyone in the network, from highly technical developers t
Consumers will ask the SubQuery Network for specific data for their dApps or tools, and pay an advertised amount of SQT for each request.
-::: info Note Learn more about [Consumers](./consumers.md). :::
+::: tip Note Learn more about [Consumers](./consumers.md). :::
### Indexers
Indexers will run and maintain high quality SubQuery projects in their own infrastructure, running both the indexer and query service, and will be rewarded in SQT for the requests that they serve.
-::: info Note Learn more about [Indexers](./indexers.md). :::
+::: tip Note Learn more about [Indexers](./indexers.md). :::
### Delegators
Delegators will participate in the Network by supporting their favourite Indexers to earn rewards based on the work those indexers do.
-::: info Note Learn more about [Delegators](./delegators.md). :::
+::: tip Note Learn more about [Delegators](./delegators.md). :::
### Architects
Architects are the builders of the SubQuery projects that the Network runs on. They author and publish SubQuery projects for the Network to index and run.
-::: info Note Learn more about [how to build your first SubQuery project](../build/introduction.md). :::
+::: tip Note Learn more about [how to build your first SubQuery project](../build/introduction.md). :::
diff --git a/docs/ko/README.md b/docs/ko/README.md
index 0265018eaed..88bf227f0d7 100644
--- a/docs/ko/README.md
+++ b/docs/ko/README.md
@@ -4,7 +4,7 @@
Build Faster dApps with SubQuery Academy
-
Explore and implement your own efficient custom open-source API between your decentralised data and tools to query data faster and save you time.
+
Explore and implement your own efficient custom open-source API between your decentralised data and tools to query data faster and save you time.
SubQuery now supports Polkadot, Avalanche, Cosmos, and Algorand.
@@ -12,7 +12,7 @@
Get a Kick-Start With Our Quick Start Guide
-
Build your first SubQuery project in less than 10 mins with simple guided steps.
+
Build your first SubQuery project in less than 10 mins with simple guided steps.
Start querying data for your dApps on your most loved blockchain network using our starter projects. Explore and modify important files, and understand how SubQuery works.
@@ -134,8 +134,7 @@
-
-
+
diff --git a/docs/ko/build/install.md b/docs/ko/build/install.md
index eb9fe331a9f..cefedff8a6c 100644
--- a/docs/ko/build/install.md
+++ b/docs/ko/build/install.md
@@ -8,28 +8,30 @@
Yarn 또는 NPM을 사용하여 서브쿼리 CLI를 단말기에 글로벌 설치:
- ```bash npm install -g @subql/cli ```
- ```shell yarn global add @subql/cli ```
+::: code-tabs @tab npm `bash npm install -g @subql/cli `
+@tab:active yarn `shell yarn global add @subql/cli ` :::
You can then run help to see available commands and usage provided by the CLI:
```shell
subql help
```
+
## @subql/node 설치
서브쿼리 노드는 서브쿼리 프로젝트별 Substrate 기반 블록체인 데이터을 추출하고, Postgres 데이터베이스에 저장합니다.
Yarn 또는 NPM을 사용하여 단말기에 서브쿼리 노드를 글로벌 설치:
- ```bash npm install -g @subql/node ```
- ```shell yarn global add @subql/node ```
+::: code-tabs @tab npm `bash npm install -g @subql/node `
+@tab:active yarn `shell yarn global add @subql/node ` :::
Once installed, you can start a node with:
```shell
subql-node
```
+
> 주의: Docker를 사용하거나 서브쿼리 프로젝트에서 프로젝트를 호스팅하는 경우라면, 이 단계를 건너뛸 수 있습니다. 이는 서브쿼리 노드가 이미 Docker 컨테이너 및 호스팅 인프라에 제공되고 있기 때문입니다.
## @subql/query 설치
@@ -38,7 +40,7 @@ subql-node
Yarn 또는 NPM을 사용하여 서브쿼리 쿼리를 단말기에 글로벌 설치:
- ```bash npm install -g @subql/query ```
- ```shell yarn global add @subql/query ```
+::: code-tabs @tab npm `bash npm install -g @subql/query `
+@tab:active yarn `shell yarn global add @subql/query ` :::
-> 주의: Docker를 사용하거나 서브쿼리 프로젝트에서 프로젝트를 호스팅하는 경우라면, 이 단계를 건너뛸 수 있습니다. 이는 서브쿼리 노드가 이미 Docker 컨테이너 및 호스팅 인프라에 제공되고 있기 때문입니다.
\ No newline at end of file
+> 주의: Docker를 사용하거나 서브쿼리 프로젝트에서 프로젝트를 호스팅하는 경우라면, 이 단계를 건너뛸 수 있습니다. 이는 서브쿼리 노드가 이미 Docker 컨테이너 및 호스팅 인프라에 제공되고 있기 때문입니다.
diff --git a/docs/ko/build/introduction.md b/docs/ko/build/introduction.md
index 588ef9e0c54..b8387003bed 100644
--- a/docs/ko/build/introduction.md
+++ b/docs/ko/build/introduction.md
@@ -51,8 +51,8 @@ yarn codegen
프로젝트의 루트 디렉터리에서 빌드 명령을 실행합니다.
- `shell yarn build `
- `bash npm run-script build `
+::: code-tabs @tab:active yarn `shell yarn build `
+@tab npm `bash npm run-script build ` :::
### 대체 빌드 옵션
diff --git a/docs/ko/build/manifest.md b/docs/ko/build/manifest.md
index 06e36aa4693..91c201ead5c 100644
--- a/docs/ko/build/manifest.md
+++ b/docs/ko/build/manifest.md
@@ -4,7 +4,7 @@ Manifest `project.yaml` 파일은 프로젝트의 시작점으로 볼 수 있으
매니페스트는 YAML 또는 JSON 형식일 수 있습니다. 이 문서의 모든 예제는 YAML을 기준으로 합니다. 다음은 기본 `project.yaml`의 표준 예시입니다.
- ` yml specVersion: 0.2.0 name: example-project # Provide the project name version: 1.0.0 # Project version description: '' # Description of your project repository: 'https://github.com/subquery/subql-starter' # Git repository address of your project schema: file: ./schema.graphql # The location of your GraphQL schema file network: genesisHash: '0x91b171bb158e2d3848fa23a9f1c25182fb8e20313b2c1eb49219da7a70ce90c3' # Genesis hash of the network endpoint: 'wss://polkadot.api.onfinality.io/public-ws' # Optionally provide the HTTP endpoint of a full chain dictionary to speed up processing dictionary: 'https://api.subquery.network/sq/subquery/dictionary-polkadot' dataSources: - kind: substrate/Runtime startBlock: 1 # This changes your indexing start block, set this higher to skip initial blocks with less data mapping: file: "./dist/index.js" handlers: - handler: handleBlock kind: substrate/BlockHandler - handler: handleEvent kind: substrate/EventHandler filter: #Filter is optional module: balances method: Deposit - handler: handleCall kind: substrate/CallHandler ```` ` yml specVersion: "0.0.1" description: '' # Description of your project repository: 'https://github.com/subquery/subql-starter' # Git repository address of your project schema: ./schema.graphql # The location of your GraphQL schema file network: endpoint: 'wss://polkadot.api.onfinality.io/public-ws' # Optionally provide the HTTP endpoint of a full chain dictionary to speed up processing dictionary: 'https://api.subquery.network/sq/subquery/dictionary-polkadot' dataSources: - name: main kind: substrate/Runtime startBlock: 1 # This changes your indexing start block, set this higher to skip initial blocks with less data mapping: handlers: - handler: handleBlock kind: substrate/BlockHandler - handler: handleEvent kind: substrate/EventHandler filter: #Filter is optional but suggested to speed up event processing module: balances method: Deposit - handler: handleCall kind: substrate/CallHandler ````
+::: code-tabs @tab v0.2.0 ` yml specVersion: 0.2.0 name: example-project # Provide the project name version: 1.0.0 # Project version description: '' # Description of your project repository: 'https://github.com/subquery/subql-starter' # Git repository address of your project schema: file: ./schema.graphql # The location of your GraphQL schema file network: genesisHash: '0x91b171bb158e2d3848fa23a9f1c25182fb8e20313b2c1eb49219da7a70ce90c3' # Genesis hash of the network endpoint: 'wss://polkadot.api.onfinality.io/public-ws' # Optionally provide the HTTP endpoint of a full chain dictionary to speed up processing dictionary: 'https://api.subquery.network/sq/subquery/dictionary-polkadot' dataSources: - kind: substrate/Runtime startBlock: 1 # This changes your indexing start block, set this higher to skip initial blocks with less data mapping: file: "./dist/index.js" handlers: - handler: handleBlock kind: substrate/BlockHandler - handler: handleEvent kind: substrate/EventHandler filter: #Filter is optional module: balances method: Deposit - handler: handleCall kind: substrate/CallHandler ```` @tab v0.0.1 ` yml specVersion: "0.0.1" description: '' # Description of your project repository: 'https://github.com/subquery/subql-starter' # Git repository address of your project schema: ./schema.graphql # The location of your GraphQL schema file network: endpoint: 'wss://polkadot.api.onfinality.io/public-ws' # Optionally provide the HTTP endpoint of a full chain dictionary to speed up processing dictionary: 'https://api.subquery.network/sq/subquery/dictionary-polkadot' dataSources: - name: main kind: substrate/Runtime startBlock: 1 # This changes your indexing start block, set this higher to skip initial blocks with less data mapping: handlers: - handler: handleBlock kind: substrate/BlockHandler - handler: handleEvent kind: substrate/EventHandler filter: #Filter is optional but suggested to speed up event processing module: balances method: Deposit - handler: handleCall kind: substrate/CallHandler ```` :::
## v0.0.1에서 v0.2.0으로 업그레이드
@@ -30,13 +30,13 @@ USAGE $ subql init [PROJECTNAME]
ARGUMENTS PROJECTNAME Give the starter project name
-| 옵션 | 설명 |
-| ----------------------- | ----------------------------------------------------- |
-| -f, --force | |
-| -l, --location=location | 프로젝트를 생성할 로컬 폴더 |
-| --install-dependencies | 종속성들의 설치 |
+| 옵션 | 설명 |
+| ----------------------- | --------------------------------------------------------------------- | ----------------------------- |
+| -f, --force | |
+| -l, --location=location | 프로젝트를 생성할 로컬 폴더 |
+| --install-dependencies | 종속성들의 설치 |
| --npm | yarn 대신 NPM을 강제로 사용, `install-dependencies` 플래그에서만 작동 |
-| --specVersion=0.0.1 | 0.2.0 [기본값: 0.2.0] | 프로젝트에서 사용할 사양 버전 |
+| --specVersion=0.0.1 | 0.2.0 [기본값: 0.2.0] | 프로젝트에서 사용할 사양 버전 |
## 개요
@@ -71,19 +71,19 @@ ARGUMENTS PROJECTNAME Give the starter project name
### DataSource 사양
필터링 및 추출할 데이터와 적용할 데이터 변환에 대한 매핑 함수 처리기의 위치를 정의합니다.
-| 필드 | v0.0.1 | v0.2.0 | 설명 |
+| 필드 | v0.0.1 | v0.2.0 | 설명 |
| -------------- | --------------------------------------------------------- | -------------------------------------------------------------------------------- | ------------------------------------------------------------------------------------------------------- |
-| **name** | String | 𐄂 | 데이터 출처 명명 |
-| **kind** | [substrate/Runtime](./manifest/#data-sources-and-mapping) | substrate/Runtime, [substrate/CustomDataSource](./manifest/#custom-data-sources) | 블록, 이벤트 및 외부(호출)와 같은 기본 기판 런타임의 데이터 유형을 지원합니다. v0.2.0부터 스마트 계약과 같은 사용자 정의 런타임의 데이터를 지원합니다. |
-| **startBlock** | Integer | Integer | 이것은 인덱싱 시작 블록을 변경하고 더 적은 데이터로 초기 블록을 건너뛰려면 이 값을 높게 설정합니다. |
-| **mapping** | Mapping Spec | Mapping Spec | |
-| **filter** | [network-filters](./manifest/#network-filters) | 𐄂 | 네트워크 끝점 사양 이름으로 실행할 데이터 원본 필터링 |
+| **name** | String | 𐄂 | 데이터 출처 명명 |
+| **kind** | [substrate/Runtime](./manifest/#data-sources-and-mapping) | substrate/Runtime, [substrate/CustomDataSource](./manifest/#custom-data-sources) | 블록, 이벤트 및 외부(호출)와 같은 기본 기판 런타임의 데이터 유형을 지원합니다. v0.2.0부터 스마트 계약과 같은 사용자 정의 런타임의 데이터를 지원합니다. |
+| **startBlock** | Integer | Integer | 이것은 인덱싱 시작 블록을 변경하고 더 적은 데이터로 초기 블록을 건너뛰려면 이 값을 높게 설정합니다. |
+| **mapping** | Mapping Spec | Mapping Spec | |
+| **filter** | [network-filters](./manifest/#network-filters) | 𐄂 | 네트워크 끝점 사양 이름으로 실행할 데이터 원본 필터링 |
### Mapping Spec
-| 필드 | v0.0.1 | v0.2.0 | 설명 |
-| ---------------------- | -------------------------------------------------------------- | ----------------------------------------------------------------------------- | ----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
-| **file** | String | 𐄂 | Entry 맵핑을 위한 path |
+| 필드 | v0.0.1 | v0.2.0 | 설명 |
+| ---------------------- | -------------------------------------------------------------- | ----------------------------------------------------------------------------- | -------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
+| **file** | String | 𐄂 | Entry 맵핑을 위한 path |
| **handlers & filters** | [기본 핸들러와 필터](./manifest/#mapping-handlers-and-filters) | 기본 핸들러와 필터, [Custom handlers and filters](#custom-data-sources) | [mapping functions](./mapping/polkadot.md) 과 그에 상응하는 핸들러 유형을 추가적인 맵핑 필터와 함께 나열하세요.
커스텀 런타임 맵핑을 위해서는 [Custom data sources](#custom-data-sources)을 참조하세요. |
## Data Source와 맵핑
@@ -104,8 +104,8 @@ dataSources:
**적절한 매핑 필터가 있는 이벤트 및 호출 핸들러만 사용할 때 SubQuery 프로젝트가 훨씬 더 효율적입니다.**
-| 핸들러 | 지원되는 필터 |
-| ------------------------------------------- | ---------------------------- |
+| 핸들러 | 지원되는 필터 |
+| ---------------------------------------------------- | ---------------------------- |
| [블록핸들러](./mapping/polkadot.md#block-handler) | `specVersion` |
| [이벤트 핸들러](./mapping/polkadot.md#event-handler) | `module`,`method` |
| [콜핸들러](./mapping/polkadot.md#call-handler) | `module`,`method` ,`success` |
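For reference, the three handler kinds in the table above correspond to mapping functions with the following signatures, sketched here with the types exported by `@subql/types`:

```ts
import { SubstrateBlock, SubstrateEvent, SubstrateExtrinsic } from "@subql/types";

// Runs for every block matched by a substrate/BlockHandler.
export async function handleBlock(block: SubstrateBlock): Promise<void> {
  // ...
}

// Runs for events matched by a substrate/EventHandler filter
// (e.g. module: balances, method: Deposit).
export async function handleEvent(event: SubstrateEvent): Promise<void> {
  // ...
}

// Runs for extrinsics matched by a substrate/CallHandler filter.
export async function handleCall(extrinsic: SubstrateExtrinsic): Promise<void> {
  // ...
}
```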
@@ -153,10 +153,10 @@ filter:
아래 v0.2.0 예제에서 `network.chaintypes`는 모든 사용자 정의 유형이 포함된 파일을 가리키고 있습니다. 이것은 이 블록체인이 지원하는 특정 유형을 `.json`, `.yaml` 또는 `.js`로 선언하는 표준 chainspec 파일입니다.
- `yml network: genesisHash: '0x91b171bb158e2d3848fa23a9f1c25182fb8e20313b2c1eb49219da7a70ce90c3' endpoint: 'ws://host.kittychain.io/public-ws' chaintypes: file: ./types.json # The relative filepath to where custom types are stored ...`
- `yml ... network: endpoint: "ws://host.kittychain.io/public-ws" types: { "KittyIndex": "u32", "Kitty": "[u8; 16]" } # typesChain: { chain: { Type5: 'example' } } # typesSpec: { spec: { Type6: 'example' } } dataSources: - name: runtime kind: substrate/Runtime startBlock: 1 filter: #Optional specName: kitty-chain mapping: handlers: - handler: handleKittyBred kind: substrate/CallHandler filter: module: kitties method: breed success: true`
+::: code-tabs @tab v0.2.0 `yml network: genesisHash: '0x91b171bb158e2d3848fa23a9f1c25182fb8e20313b2c1eb49219da7a70ce90c3' endpoint: 'ws://host.kittychain.io/public-ws' chaintypes: file: ./types.json # The relative filepath to where custom types are stored ...`
+@tab v0.0.1 `yml ... network: endpoint: "ws://host.kittychain.io/public-ws" types: { "KittyIndex": "u32", "Kitty": "[u8; 16]" } # typesChain: { chain: { Type5: 'example' } } # typesSpec: { spec: { Type6: 'example' } } dataSources: - name: runtime kind: substrate/Runtime startBlock: 1 filter: #Optional specName: kitty-chain mapping: handlers: - handler: handleKittyBred kind: substrate/CallHandler filter: module: kitties method: breed success: true` :::
-체인 유형 파일에 typescript를 사용하려면 `src` 폴더(예: `./src/types.ts`)에 파일을 포함하고 `yarn build 4> 그런 다음 dist` 폴더에 있는 생성된 js 파일을 가리킵니다.
+체인 유형 파일에 typescript를 사용하려면 `src` 폴더(예: `./src/types.ts`)에 파일을 포함하고 `yarn build` 그런 다음 `dist` 폴더에 있는 생성된 js 파일을 가리킵니다.
```yml
network:
@@ -171,7 +171,7 @@ network:
다음은 `.ts` 체인 유형 파일의 예입니다.
- `ts import { typesBundleDeprecated } from "moonbeam-types-bundle" export default { typesBundle: typesBundleDeprecated }; `
+::: code-tabs @tab types.ts `ts import { typesBundleDeprecated } from "moonbeam-types-bundle" export default { typesBundle: typesBundleDeprecated }; ` :::
## 사용자 정의 데이터 소스
@@ -183,7 +183,7 @@ network:
다음은 지원되는 사용자 지정 데이터 소스 목록입니다.
-| 종류 | 지원 Handlers | 필터 | 소개 |
+| 종류 | 지원 Handlers | 필터 | 소개 |
| ----------------------------------------------------- | -------------------------------------------------------------------------------------------------------- | ------------------------------- | -------------------------------------------------------------------------------- |
| [substrate/Moonbeam](./moonbeam/#data-source-example) | [substrate/MoonbeamEvent](./moonbeam/#moonbeamevent), [substrate/MoonbeamCall](./moonbeam/#moonbeamcall) | See filters under each handlers | Provides easy interaction with EVM transactions and events on Moonbeams networks |
@@ -197,6 +197,6 @@ network:
다음은 Polkadot 및 Kusama 네트워크 모두에 대해 서로 다른 데이터 소스를 보여주는 예입니다.
- `yaml --- network: endpoint: 'wss://polkadot.api.onfinality.io/public-ws' #Create a template to avoid redundancy definitions: mapping: &mymapping handlers: - handler: handleBlock kind: substrate/BlockHandler dataSources: - name: polkadotRuntime kind: substrate/Runtime filter: #Optional specName: polkadot startBlock: 1000 mapping: *mymapping #use template here - name: kusamaRuntime kind: substrate/Runtime filter: specName: kusama startBlock: 12000 mapping: *mymapping # can reuse or change `
+::: code-tabs @tab v0.0.1 `yaml --- network: endpoint: 'wss://polkadot.api.onfinality.io/public-ws' #Create a template to avoid redundancy definitions: mapping: &mymapping handlers: - handler: handleBlock kind: substrate/BlockHandler dataSources: - name: polkadotRuntime kind: substrate/Runtime filter: #Optional specName: polkadot startBlock: 1000 mapping: *mymapping #use template here - name: kusamaRuntime kind: substrate/Runtime filter: specName: kusama startBlock: 12000 mapping: *mymapping # can reuse or change `
-
+:::
diff --git a/docs/ko/build/mapping.md b/docs/ko/build/mapping.md
index be45457b0f0..00fe3e9c33f 100644
--- a/docs/ko/build/mapping.md
+++ b/docs/ko/build/mapping.md
@@ -67,9 +67,9 @@ export async function handleCall(extrinsic: SubstrateExtrinsic): Promise {
현재 지원되는 인터페이스는 다음과 같습니다:
-- [api.query.<module>.<method>()](https://polkadot.js.org/docs/api/start/api.query) 은 현재의 블록을 쿼리합니다.
-- [api.query.<module>.<method>.multi()](https://polkadot.js.org/docs/api/start/api.query.multi/#multi-queries-same-type)는 현재의 블록과 같은 타입의 여러 Query를 생성합니다.
-- [api.queryMulti()](https://polkadot.js.org/docs/api/start/api.query.multi/#multi-queries-distinct-types)은 현재 블럭과 다른 타입의 여러 Query를 생성합니다.
+- [api.query.<module>.<method>()](https://polkadot.js.org/docs/api/start/api.query) 은 **현재의** 블록을 쿼리합니다.
+- [api.query.<module>.<method>.multi()](https://polkadot.js.org/docs/api/start/api.query.multi/#multi-queries-same-type)는 현재의 블록과 **같은** 타입의 여러 Query를 생성합니다.
+- [api.queryMulti()](https://polkadot.js.org/docs/api/start/api.query.multi/#multi-queries-distinct-types)은 현재 블럭과 **다른** 타입의 여러 Query를 생성합니다.
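A minimal sketch of the supported calls above, as they might appear inside a mapping handler where the polkadot.js `api` instance and `logger` are injected; the handler name and account addresses are placeholders:

```ts
export async function handleBlockStateExample(): Promise<void> {
  const ALICE = "ADDRESS_1"; // placeholder
  const BOB = "ADDRESS_2"; // placeholder

  // Query state at the current block.
  const aliceAccount = await api.query.system.account(ALICE);

  // Batch several queries against the current block in one round trip.
  const [a, b] = await api.queryMulti([
    [api.query.system.account, ALICE],
    [api.query.system.account, BOB],
  ]);

  logger.info(`fetched: ${aliceAccount}, ${a}, ${b}`);
}
```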
현재 **지원하고 있지 않은** 인터페이스는 다음과 같습니다:
@@ -166,7 +166,7 @@ echo state_getMetadata | websocat 'ws://127.0.0.1:9944' --jsonrpc
[types setup](https://polkadot.js.org/docs/api/examples/promise/typegen#metadata-setup)에 따라 다음을 생성합니다:
-- < 0 > srcapi-interfaces definitions.ts < 0 >: 모든 서브폴더 정의를 내보냅니다.
+- `src/api-interfaces/definitions.ts`: 모든 서브폴더 정의를 내보냅니다.
```ts
export { default as kitties } from "./kitties/definitions";
diff --git a/docs/ko/build/substrate-evm.md b/docs/ko/build/substrate-evm.md
index 52cf4d38a6d..db689001518 100644
--- a/docs/ko/build/substrate-evm.md
+++ b/docs/ko/build/substrate-evm.md
@@ -4,13 +4,13 @@
지원되는 네트워크:
-| 네트워크 이름 | 웹소켓 엔드포인트 | 딕셔너리 엔드포인트 |
-| ------- | ----------------------------------------------------------------- | -------------------------------------------------------------------- |
-| 문빔 | `wss://moonbeam.api.onfinality.io/public-ws` | `https://api.subquery.network/sq/subquery/moonbeam-dictionary` |
-| 문리버 | `wss://moonriver.api.onfinality.io/public-ws
-Contextrequest` | `https://api.subquery.network/sq/subquery/moonriver-dictionary` |
-| 문베이스 알파 | `wss://moonbeam-alpha.api.onfinality.io/public-ws
-Contextrequest` | `https://api.subquery.network/sq/subquery/moonbase-alpha-dictionary` |
+| 네트워크 이름   | 웹소켓 엔드포인트                                    | 딕셔너리 엔드포인트                                                     |
+| --------------- | ---------------------------------------------------- | ----------------------------------------------------------------------- |
+| 문빔            | `wss://moonbeam.api.onfinality.io/public-ws`         | `https://api.subquery.network/sq/subquery/moonbeam-dictionary`          |
+| 문리버          | `wss://moonriver.api.onfinality.io/public-ws`        | `https://api.subquery.network/sq/subquery/moonriver-dictionary`         |
+| 문베이스 알파   | `wss://moonbeam-alpha.api.onfinality.io/public-ws`   | `https://api.subquery.network/sq/subquery/moonbase-alpha-dictionary`    |
**[기본 Moonriver EVM 프로젝트](https://github.com/subquery/tutorials-moonriver-evm-starter)와 더불어 이벤트 및 콜 핸들러를 참조하세요.** 또한 본 프로젝트는 [서브쿼리 익스플로러](https://explorer.subquery.network/subquery/subquery/moonriver-evm-starter-project)를 통해 실시간 호스팅됩니다.
@@ -22,31 +22,31 @@ Contextrequest` | `https://api.subquery.network/sq/subquery/moonbase-alpha-dicti
## 데이터 소스 사양
-| 필드 | 타입 | 요구사항 | 설명 |
-| ----------------- | -------------------------------------------------------------- | ---- | ------------------------------------------ |
-| processor.file | `'./node_modules/@subql/contract-processors/dist/moonbeam.js'` | 네 | 데이터 프로세서 코드에 대한 파일 참조 |
-| processor.options | [ProcessorOptions](#processor-options) | 아니오 | Options specific to the Moonbeam Processor |
-| assets | `{ [key: String]: { file: String }}` | 아니오 | An object of external asset files |
+| 필드 | 타입 | 요구사항 | 설명 |
+| ----------------- | -------------------------------------------------------------- | -------- | ------------------------------------------ |
+| processor.file | `'./node_modules/@subql/contract-processors/dist/moonbeam.js'` | 네 | 데이터 프로세서 코드에 대한 파일 참조 |
+| processor.options | [ProcessorOptions](#processor-options) | 아니오 | Options specific to the Moonbeam Processor |
+| assets | `{ [key: String]: { file: String }}` | 아니오 | An object of external asset files |
### 프로세서 옵션
-| 필드 | 타입 | 요구사항 | 설명 |
-| ---- | ---------------- | ---- | ----------------------------------------------------------------- |
-| abi | String | 아니오 | ABI는 프로세서가 인자 파싱을 위해 사용. 반드시 `assets`의 키여야 합니다 |
-| 어드레스 | String 또는 `null` | 아니오 | 이벤트 또는 콜이 만들어진 거래 주소. `null` will capture contract creation calls |
+| 필드 | 타입 | 요구사항 | 설명 |
+| -------- | ------------------ | -------- | -------------------------------------------------------------------------------- |
+| abi | String | 아니오 | ABI는 프로세서가 인자 파싱을 위해 사용. 반드시 `assets`의 키여야 합니다 |
+| 어드레스 | String 또는 `null` | 아니오 | 이벤트 또는 콜이 만들어진 거래 주소. `null` will capture contract creation calls |
## MoonbeamCall
Works in the same way as [substrate/CallHandler](../create/mapping/#call-handler) except with a different handler argument and minor filtering changes.
-| 필드 | 타입 | 요구사항 | 설명 |
-| ------ | ---------------------------- | ---- | ------------------------------------------- |
-| kind | 'substrate/MoonbeamCall' | 네 | Specifies that this is an Call type handler |
-| filter | [Call Filter](#call-filters) | 아니오 | Filter the data source to execute |
+| 필드 | 타입 | 요구사항 | 설명 |
+| ------ | ---------------------------- | -------- | ------------------------------------------- |
+| kind | 'substrate/MoonbeamCall' | 네 | Specifies that this is an Call type handler |
+| filter | [Call Filter](#call-filters) | 아니오 | Filter the data source to execute |
### Call Filters
-| 필드 | 타입 | 예시 | 설명 |
+| 필드 | 타입 | 예시 | 설명 |
| -------- | ------ | --------------------------------------------- | -------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| function | String | 0x095ea7b3, approve(address to,uint256 value) | Either [Function Signature](https://docs.ethers.io/v5/api/utils/abi/fragments/#FunctionFragment) strings or the function `sighash` to filter the function called on the contract |
| from | String | 0x6bd193ee6d2104f14f94e2ca6efefae561a4334b | An Ethereum address that sent the transaction |
@@ -65,18 +65,18 @@ Works in the same way as [substrate/CallHandler](../create/mapping/#call-handler
Works in the same way as [substrate/EventHandler](../create/mapping/#event-handler) except with a different handler argument and minor filtering changes.
-| 필드 | 타입 | 요구사항 | 설명 |
-| ------ | ------------------------------ | ---- | -------------------------------------------- |
-| kind | 'substrate/MoonbeamEvent' | 네 | Specifies that this is an Event type handler |
-| filter | [Event Filter](#event-filters) | 아니오 | Filter the data source to execute |
+| 필드 | 타입 | 요구사항 | 설명 |
+| ------ | ------------------------------ | -------- | -------------------------------------------- |
+| kind | 'substrate/MoonbeamEvent' | 네 | Specifies that this is an Event type handler |
+| filter | [Event Filter](#event-filters) | 아니오 | Filter the data source to execute |
### 이벤트 필터
-| 필드 | 타입 | 예시 | 설명 |
+| 필드 | 타입 | 예시 | 설명 |
| ------ | ------------ | --------------------------------------------------------------- | ------------------------------------------------------------------------------------------------------------------------------------------------ |
| topics | String array | Transfer(address indexed from,address indexed to,uint256 value) | The topics filter follows the Ethereum JSON-PRC log filters, more documentation can be found [here](https://docs.ethers.io/v5/concepts/events/). |
-Note on topics:
+**Note on topics:**
There are a couple of improvements from basic log filters:
- Topics don't need to be 0 padded
@@ -99,17 +99,17 @@ dataSources:
- kind: substrate/Moonbeam
startBlock: 752073
processor:
- file: './node_modules/@subql/contract-processors/dist/moonbeam.js'
+ file: "./node_modules/@subql/contract-processors/dist/moonbeam.js"
options:
# Must be a key of assets
abi: erc20
# Contract address (or recipient if transfer) to filter, if `null` should be for contract creation
- address: '0x6bd193ee6d2104f14f94e2ca6efefae561a4334b'
+ address: "0x6bd193ee6d2104f14f94e2ca6efefae561a4334b"
assets:
erc20:
- file: './erc20.abi.json'
+ file: "./erc20.abi.json"
mapping:
- file: './dist/index.js'
+ file: "./dist/index.js"
handlers:
- handler: handleMoonriverEvent
kind: substrate/MoonbeamEvent
@@ -124,7 +124,7 @@ dataSources:
# function: '0x7ff36ab500000000000000000000000000000000000000000000000000000000'
# function: approve(address,uint256)
function: approve(address to,uint256 value)
- from: '0x6bd193ee6d2104f14f94e2ca6efefae561a4334b'
+ from: "0x6bd193ee6d2104f14f94e2ca6efefae561a4334b"
```
## 알려진 제약사항
diff --git a/docs/ko/faqs/faqs.md b/docs/ko/faqs/faqs.md
index 4eddd2a9c33..5204fd6a6eb 100644
--- a/docs/ko/faqs/faqs.md
+++ b/docs/ko/faqs/faqs.md
@@ -16,7 +16,7 @@ SubQuery also provides free, production grade hosting of projects for developers
**The SubQuery Network**
-The SubQuery Network allows developers to completely decentralise their infrastructure stack. It is the most open, performant, reliable, and scalable data service for dApps. The SubQuery Network indexes and services data to the global community in an incentivised and verifiable way. After publishing your project to the SubQuery Network, anyone can index and host it - providing data to users around the world faster and reliably.
+The SubQuery Network allows developers to completely decentralise their infrastructure stack. It is the most open, performant, reliable, and scalable data service for dApps. The SubQuery Network indexes and services data to the global community in an incentivised and verifiable way. After publishing your project to the SubQuery Network, anyone can index and host it - providing data to users around the world faster and reliably.
More information [here](/subquery_network/introduction.md).
@@ -26,7 +26,7 @@ The best way to get started with SubQuery is to try out our [Hello World tutoria
## SubQuery에 기여하거나 피드백을 하려면 어떻게 해야하나요?
-우리는 언제나 커뮤니티의 기여와 피드백을 환영합니다. To contribute the code, fork the repository of your interest and make your changes. 그런 다음 PR 또는 풀 리퀘스트를 통해 제출해주세요. Don't forget to test as well. Also check out our contributions guidelines.
+우리는 언제나 커뮤니티의 기여와 피드백을 환영합니다. To contribute the code, fork the repository of your interest and make your changes. 그런 다음 PR 또는 풀 리퀘스트를 통해 제출해주세요. Don't forget to test as well. Also check out our [contributions guidelines](../miscellaneous/contributing.html).
To give feedback, contact us at hello@subquery.network or jump onto our [discord channel](https://discord.com/invite/78zg8aBSMG).
@@ -76,7 +76,6 @@ subql-node -f . --force-clean --subquery-name=
Note that it is recommended to use `--force-clean` when changing the `startBlock` within the project manifest (`project.yaml`) in order to begin reindexing from the configured block. If `startBlock` is changed without a `--force-clean` of the project, then the indexer will continue indexing with the previously configured `startBlock`.
-
## How can I optimise my project to speed it up?
Performance is a crucial factor in each project. Fortunately, there are several things you could do to improve it. Here is the list of some suggestions:
@@ -89,13 +88,13 @@ Performance is a crucial factor in each project. Fortunately, there are several
- Set the start block to when the contract was initialised.
- Always use a [dictionary](../tutorials_examples/dictionary.html#how-does-a-subquery-dictionary-work) (we can help create one for your new network).
- Optimise your schema design, keep it as simple as possible.
- - Try to reduce unnecessary fields and columns.
- - Create indexes as needed.
+ - Try to reduce unnecessary fields and columns.
+ - Create indexes as needed.
- Use parallel/batch processing as often as possible.
- - Use `api.queryMulti()` to optimise Polkadot API calls inside mapping functions and query them in parallel. This is a faster way than a loop.
- - Use `Promise.all()`. In case of multiple async functions, it is better to execute them and resolve in parallel.
- - If you want to create a lot of entities within a single handler, you can use `store.bulkCreate(entityName: string, entities: Entity[])`. You can create them in parallel, no need to do this one by one.
+ - Use `api.queryMulti()` to optimise Polkadot API calls inside mapping functions and query them in parallel. This is a faster way than a loop.
+ - Use `Promise.all()`. In case of multiple async functions, it is better to execute them and resolve in parallel.
+ - If you want to create a lot of entities within a single handler, you can use `store.bulkCreate(entityName: string, entities: Entity[])`. You can create them in parallel, no need to do this one by one.
- Making API calls to query state can be slow. You could try to minimise calls where possible and to use `extrinsic/transaction/event` data.
- Use `worker threads` to move block fetching and block processing into its own worker thread. It could speed up indexing by up to 4 times (depending on the particular project). You can easily enable it using the `--workers=<number>` flag. Note that the number of available CPU cores strictly limits the usage of worker threads. For now, it is only available for Substrate and Cosmos and will soon be integrated for Avalanche.
- Note that `JSON.stringify` doesn’t support native `BigInts`. Our logging library will do this internally if you attempt to log an object. We are looking at a workaround for this.
-- Use a convenient `modulo` filter to run a handler only once to a specific block. This filter allows handling any given number of blocks, which is extremely useful for grouping and calculating data at a set interval. For instance, if modulo is set to 50, the block handler will run on every 50 blocks. It provides even more control over indexing data to developers and can be implemented like so below in your project manifest.
\ No newline at end of file
+- Use a convenient `modulo` filter to run a handler only once to a specific block. This filter allows handling any given number of blocks, which is extremely useful for grouping and calculating data at a set interval. For instance, if modulo is set to 50, the block handler will run on every 50 blocks. It provides even more control over indexing data to developers and can be implemented like so below in your project manifest.
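A minimal sketch of such a block handler entry in the manifest (the handler name is a placeholder; the `kind` shown assumes a Substrate project like the other examples in this FAQ):

```yml
handlers:
  - handler: handleBlock # placeholder mapping function
    kind: substrate/BlockHandler
    filter:
      modulo: 50 # run this handler only on every 50th block
```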
diff --git a/docs/ko/miscellaneous/contributing.md b/docs/ko/miscellaneous/contributing.md
index a45707e71d6..2d63908f320 100644
--- a/docs/ko/miscellaneous/contributing.md
+++ b/docs/ko/miscellaneous/contributing.md
@@ -2,7 +2,7 @@
SubQuery 프로젝트에 도움을 주는 여러분을 환영하고 깊은 감사의 말씀을 드립니다. 우리는 함께 보다 탈중화 미래를 위한 길을 마련할 수 있습니다.
-::: info Note This documentation is actively maintained by the SubQuery team. We welcome your contributions. You can do so by forking our GitHub project and making changes to all the documentation markdown files under the `docs` directory. :::
+::: tip Note This documentation is actively maintained by the SubQuery team. We welcome your contributions. You can do so by forking our GitHub project and making changes to all the documentation markdown files under the `docs` directory. :::
What follows is a set of guidelines (not rules) for contributing to SubQuery. Following these guidelines will help us make the contribution process easy and effective for everyone involved. It also communicates that you agree to respect the time of the developers managing and developing this project. In return, we will reciprocate that respect by addressing your issue, considering changes, collaborating on improvements, and helping you finalise your pull requests.
@@ -14,8 +14,8 @@ We take our open source community projects and responsibility seriously and hold
Contributions to our repositories are made through Issues and Pull Requests (PRs). A few general guidelines that cover both:
-* 본인의 것을 만들기 전에 기존의 Issues 및 PRs를 검색하십시오.
-* 우리는 문제가 신속하게 처리될 수 있도록 열심히 노력하고 있지만, 영향 정도에 따라 근본 원인을 파악하는데 좀 더 시간이 걸릴 수 있습니다. 당신의 문제가 막힌 경우, 댓글로 친절하게 @로 제출자 혹은 기여자를 언급하면 보다 쉽게 관심을 끌 수 있습니다.
+- 본인의 것을 만들기 전에 기존의 Issues 및 PRs를 검색하십시오.
+- 우리는 문제가 신속하게 처리될 수 있도록 열심히 노력하고 있지만, 영향 정도에 따라 근본 원인을 파악하는데 좀 더 시간이 걸릴 수 있습니다. 당신의 문제가 막힌 경우, 댓글로 친절하게 @로 제출자 혹은 기여자를 언급하면 보다 쉽게 관심을 끌 수 있습니다.
## 기여 방법
@@ -23,32 +23,32 @@ Contributions to our repositories are made through Issues and Pull Requests (PRs
Bugs are tracked as GitHub issues. When logging an issue, explain the problem and include additional details to help maintainers reproduce the problem:
-* 문제를 식별하기 위해 문제에 대해 명확하고 자세한 제목을 사용합니다.
-* 문제를 재현 확인하기 위한 정확한 단계를 설명합니다.
-* 단계들을 수행한 후에 당신이 보았던 동작을 설명하십시오.
-* 당신이 기대 예상했던 작동과 그 이유를 설명하세요.
-* 가능하다면, 스크린샷을 첨부하세요.
+- 문제를 식별하기 위해 문제에 대해 명확하고 자세한 제목을 사용합니다.
+- 문제를 재현 확인하기 위한 정확한 단계를 설명합니다.
+- 단계들을 수행한 후에 당신이 보았던 동작을 설명하십시오.
+- 당신이 기대 예상했던 작동과 그 이유를 설명하세요.
+- 가능하다면, 스크린샷을 첨부하세요.
### Pull Requests 제출
In general, we follow the "fork-and-pull" Git workflow:
-* Fork the repository to your own Github account.
-* Clone the project to your machine.
-* Create a branch locally with a succinct but descriptive name.
-* Commit changes to the branch.
-* Following any formatting and testing guidelines specific to this repo.
-* Push changes to your fork.
-* Open a PR in our repository.
+- Fork the repository to your own Github account.
+- Clone the project to your machine.
+- Create a branch locally with a succinct but descriptive name.
+- Commit changes to the branch.
+- Following any formatting and testing guidelines specific to this repo.
+- Push changes to your fork.
+- Open a PR in our repository.
## 코딩 규칙
### Git 커밋 메시지
-* Use the present tense ("Add feature" not "Added feature").
-* Use the imperative mood ("Move cursor to..." not "Moves cursor to...").
-* Limit the first line to 72 characters or less.
+- Use the present tense ("Add feature" not "Added feature").
+- Use the imperative mood ("Move cursor to..." not "Moves cursor to...").
+- Limit the first line to 72 characters or less.
### 자바스크립트 스타일 지침
-* All JavaScript code is linted with Prettier and ESLint.
+- All JavaScript code is linted with Prettier and ESLint.
diff --git a/docs/ko/quickstart/helloworld-localhost.md b/docs/ko/quickstart/helloworld-localhost.md
index 93885736b64..7328b32ec68 100644
--- a/docs/ko/quickstart/helloworld-localhost.md
+++ b/docs/ko/quickstart/helloworld-localhost.md
@@ -88,13 +88,13 @@ cd subqlHelloWorld
이제 다양한 종속성을 설치하기 위해 원사 또는 노드 설치를 수행합니다.
- `shell yarn install `
- `bash npm install `
+::: code-tabs @tab:active yarn `shell yarn install `
+@tab npm `bash npm install ` :::
An example of `yarn install`
```shell
-# Yarn yarn install # NPM npm install
+# Yarn yarn install # NPM npm install
> yarn codegen
yarn run v1.22.10
@@ -148,19 +148,19 @@ $ ./node_modules/.bin/subql codegen
# NPM npm run-script build
```
-````shell
+```shell
> yarn build
yarn run v1.22.10
$ tsc -b
-✨ Done in 5.68s. ```
-
+✨ Done in 5.68s.
+```
```shell
> yarn build
yarn run v1.22.10
$ tsc -b
✨ Done in 5.68s.
-````
+```
## 5. Docker 실행
diff --git a/docs/ko/quickstart/quickstart-avalanche.md b/docs/ko/quickstart/quickstart-avalanche.md
index 7ced458bfb3..62a6e88223a 100644
--- a/docs/ko/quickstart/quickstart-avalanche.md
+++ b/docs/ko/quickstart/quickstart-avalanche.md
@@ -59,8 +59,8 @@ After the initialisation process is complete, you should see a folder with your
Last, under the project directory, run the following command to install the new project's dependencies.
- ```shell cd PROJECT_NAME yarn install ```
- ```shell cd PROJECT_NAME npm install ```
+::: code-tabs @tab:active yarn `shell cd PROJECT_NAME yarn install `
+@tab npm `shell cd PROJECT_NAME npm install ` :::
## Making Changes to your Project
@@ -92,8 +92,8 @@ type PangolinApproval @entity {
**Important: When you make any changes to the schema file, please ensure that you regenerate your types directory. Do this now.**
- ```shell yarn codegen ```
- ```shell npm run-script codegen ```
+::: code-tabs @tab:active yarn `shell yarn codegen `
+@tab npm `shell npm run-script codegen ` :::
You'll find the generated models in the `/src/types/models` directory. For more information about the `schema.graphql` file, check out our documentation under [Build/GraphQL Schema](../build/graphql.md)
@@ -169,7 +169,7 @@ For more information about mapping functions, check out our documentation under
In order to run your new SubQuery Project we first need to build our work. Run the build command from the project's root directory.
- ```shell yarn build ``` ```shell npm run-script build ```
+::: code-tabs @tab:active yarn `shell yarn build ` @tab npm `shell npm run-script build ` :::
**Important: Whenever you make changes to your mapping functions, you'll need to rebuild your project**
@@ -183,7 +183,7 @@ All configuration that controls how a SubQuery node is run is defined in this `d
Under the project directory, run the following command:
- ```shell yarn start:docker ``` ```shell npm run-script start:docker ```
+::: code-tabs @tab:active yarn `shell yarn start:docker ` @tab npm `shell npm run-script start:docker ` :::
It may take some time to download the required packages ([`@subql/node`](https://www.npmjs.com/package/@subql/node), [`@subql/query`](https://www.npmjs.com/package/@subql/query), and Postgres) for the first time but soon you'll see a running SubQuery node. Be patient here.
diff --git a/docs/ko/quickstart/quickstart-cosmos.md b/docs/ko/quickstart/quickstart-cosmos.md
index 947196968cc..5bda04ae26a 100644
--- a/docs/ko/quickstart/quickstart-cosmos.md
+++ b/docs/ko/quickstart/quickstart-cosmos.md
@@ -44,8 +44,8 @@ After the initialisation process is complete, you should see a folder with your
Last, under the project directory, run the following command to install the new project's dependencies.
- ```shell cd PROJECT_NAME yarn install ```
- ```shell cd PROJECT_NAME npm install ```
+::: code-tabs @tab:active yarn `shell cd PROJECT_NAME yarn install `
+@tab npm `shell cd PROJECT_NAME npm install ` :::
## Making Changes to your Project
@@ -75,8 +75,8 @@ type Vote @entity {
**Important: When you make any changes to the schema file, please ensure that you regenerate your types directory. Do this now.**
- ```shell yarn codegen ```
- ```shell npm run-script codegen ```
+::: code-tabs @tab:active yarn `shell yarn codegen `
+@tab npm `shell npm run-script codegen ` :::
You'll find the generated models in the `/src/types/models` directory. For more information about the `schema.graphql` file, check out our documentation under [Build/GraphQL Schema](../build/graphql.md)
@@ -145,7 +145,7 @@ For more information about mapping functions, check out our documentation under
In order to run your new SubQuery Project we first need to build our work. Run the build command from the project's root directory.
- ```shell yarn build ``` ```shell npm run-script build ```
+::: code-tabs @tab:active yarn `shell yarn build ` @tab npm `shell npm run-script build ` :::
**Important: Whenever you make changes to your mapping functions, you'll need to rebuild your project**
@@ -159,7 +159,7 @@ All configuration that controls how a SubQuery node is run is defined in this `d
Under the project directory, run the following command:
- ```shell yarn start:docker ``` ```shell npm run-script start:docker ```
+::: code-tabs @tab:active yarn `shell yarn start:docker ` @tab npm `shell npm run-script start:docker ` :::
It may take some time to download the required packages ([`@subql/node`](https://www.npmjs.com/package/@subql/node), [`@subql/query`](https://www.npmjs.com/package/@subql/query), and Postgres) for the first time but soon you'll see a running SubQuery node. Be patient here.
@@ -173,10 +173,9 @@ For a new SubQuery starter project, you can try the following query to get a tas
```graphql
query {
- votes(
+ votes(
first: 5
- orderBy: BLOCK_HEIGHT_DESC
- # filter: {proposalID: {equalTo: "4"}}
+ orderBy: BLOCK_HEIGHT_DESC # filter: {proposalID: {equalTo: "4"}}
) {
nodes {
id
diff --git a/docs/ko/quickstart/quickstart-polkadot.md b/docs/ko/quickstart/quickstart-polkadot.md
index 6e484c48f67..38f856093ac 100644
--- a/docs/ko/quickstart/quickstart-polkadot.md
+++ b/docs/ko/quickstart/quickstart-polkadot.md
@@ -43,10 +43,10 @@ subql 초기화
You'll be asked certain questions as the SubQuery project is initialised:
- Project name: A project name for your SubQuery project
-- Network family: The layer-1 blockchain network family that this SubQuery project will be developed to index. Use the arrow keys to select from the available options. For this guide, we will use *"Substrate"*
-- Network: The specific network that this SubQuery project will be developed to index. Use the arrow keys to select from the available options. For this guide, we will use *"Polkadot"*
-- Template project: Select a SubQuery template project that will provide a starting point to begin development. We suggest selecting the *"subql-starter"* project.
-- RPC endpoint: Provide an HTTPS URL to a running RPC endpoint that will be used by default for this project. You can quickly access public endpoints for different Polkadot networks, create your own private dedicated node using [OnFinality](https://app.onfinality.io) or just use the default Polkadot endpoint. This RPC node must be an archive node (have the full chain state). For this guide, we will use the default value *"https://polkadot.api.onfinality.io"*
+- Network family: The layer-1 blockchain network family that this SubQuery project will be developed to index. Use the arrow keys to select from the available options. For this guide, we will use _"Substrate"_
+- Network: The specific network that this SubQuery project will be developed to index. Use the arrow keys to select from the available options. For this guide, we will use _"Polkadot"_
+- Template project: Select a SubQuery template project that will provide a starting point to begin development. We suggest selecting the _"subql-starter"_ project.
+- RPC endpoint: Provide an HTTPS URL to a running RPC endpoint that will be used by default for this project. You can quickly access public endpoints for different Polkadot networks, create your own private dedicated node using [OnFinality](https://app.onfinality.io) or just use the default Polkadot endpoint. This RPC node must be an archive node (have the full chain state). For this guide, we will use the default value _"https://polkadot.api.onfinality.io"_
- Git repository: Provide a Git URL to a repo that this SubQuery project will be hosted in (when hosted in SubQuery Explorer) or accept the provided default.
- Authors: Enter the owner of this SubQuery project here (e.g. your name!) or accept the provided default.
- Description: Provide a short paragraph about your project that describes what data it contains and what users can do with it or accept the provided default.
@@ -57,8 +57,8 @@ After the initialisation process is complete, you should see that a folder with
Last, under the project directory, run the following command to install the new project's dependencies.
- ```shell cd PROJECT_NAME yarn install ```
- ```shell cd PROJECT_NAME npm install ```
+::: code-tabs @tab:active yarn `shell cd PROJECT_NAME yarn install `
+@tab npm `shell cd PROJECT_NAME npm install ` :::
## Making Changes to your Project
@@ -88,8 +88,8 @@ type Transfer @entity {
**Important: When you make any changes to the schema file, please ensure that you regenerate your types directory.**
- ```shell yarn codegen ```
- ```shell npm run-script codegen ```
+::: code-tabs @tab:active yarn `shell yarn codegen `
+@tab npm `shell npm run-script codegen ` :::
You'll find the generated models in the `/src/types/models` directory. For more information about the `schema.graphql` file, check out our documentation under [Build/GraphQL Schema](../build/graphql.md)
@@ -133,22 +133,22 @@ import { Transfer } from "../types";
import { Balance } from "@polkadot/types/interfaces";
export async function handleEvent(event: SubstrateEvent): Promise<void> {
- // Get data from the event
- // The balances.transfer event has the following payload \[from, to, value\]
- // logger.info(JSON.stringify(event));
- const from = event.event.data[0];
- const to = event.event.data[1];
- const amount = event.event.data[2];
-
- // Create the new transfer entity
- const transfer = new Transfer(
- `${event.block.block.header.number.toNumber()}-${event.idx}`,
- );
- transfer.blockNumber = event.block.block.header.number.toBigInt();
- transfer.from = from.toString();
- transfer.to = to.toString();
- transfer.amount = (amount as Balance).toBigInt();
- await transfer.save();
+ // Get data from the event
+ // The balances.transfer event has the following payload \[from, to, value\]
+ // logger.info(JSON.stringify(event));
+ const from = event.event.data[0];
+ const to = event.event.data[1];
+ const amount = event.event.data[2];
+
+ // Create the new transfer entity
+ const transfer = new Transfer(
+ `${event.block.block.header.number.toNumber()}-${event.idx}`
+ );
+ transfer.blockNumber = event.block.block.header.number.toBigInt();
+ transfer.from = from.toString();
+ transfer.to = to.toString();
+ transfer.amount = (amount as Balance).toBigInt();
+ await transfer.save();
}
```
@@ -160,7 +160,7 @@ For more information about mapping functions, check out our documentation under
In order to run your new SubQuery Project we first need to build our work. Run the build command from the project's root directory.
- ```shell yarn build ``` ```shell npm run-script build ```
+::: code-tabs @tab:active yarn `shell yarn build ` @tab npm `shell npm run-script build ` :::
**Important: Whenever you make changes to your mapping functions, you will need to rebuild your project**
@@ -174,7 +174,7 @@ All configuration that controls how a SubQuery node is run is defined in the `do
Under the project directory, run the following command:
- ```shell yarn start:docker ``` ```shell npm run-script start:docker ```
+::: code-tabs @tab:active yarn `shell yarn start:docker ` @tab npm `shell npm run-script start:docker ` :::
It may take some time to download the required packages ([`@subql/node`](https://www.npmjs.com/package/@subql/node), [`@subql/query`](https://www.npmjs.com/package/@subql/query), and Postgres) for the first time but soon you should see a running SubQuery node in the terminal screen.
@@ -189,10 +189,7 @@ For a new SubQuery starter project, try the following query to understand how it
```graphql
{
query {
- transfers(
- first: 10,
- orderBy: AMOUNT_DESC
- ) {
+ transfers(first: 10, orderBy: AMOUNT_DESC) {
nodes {
id
amount
diff --git a/docs/ko/quickstart/quickstart-terra.md b/docs/ko/quickstart/quickstart-terra.md
index 195ee24bd81..2bef39b3e92 100644
--- a/docs/ko/quickstart/quickstart-terra.md
+++ b/docs/ko/quickstart/quickstart-terra.md
@@ -45,11 +45,11 @@ subql 초기화
You'll be asked certain questions as the SubQuery project is initialised:
- Project Name: A name for your SubQuery project
-- Network Family: The layer-1 blockchain network family that this SubQuery project will be developed to index, use the arrow keys on your keyboard to select from the options, for this guide we will use *"Terra"*
-- Network: The specific network that this SubQuery project will be developed to index, use the arrow keys on your keyboard to select from the options, for this guide we will use *"Terra"*
-- Template: Select a SubQuery project template that will provide a starting point to begin development, we suggest selecting the *"Starter project"*
+- Network Family: The layer-1 blockchain network family that this SubQuery project will be developed to index, use the arrow keys on your keyboard to select from the options, for this guide we will use _"Terra"_
+- Network: The specific network that this SubQuery project will be developed to index, use the arrow keys on your keyboard to select from the options, for this guide we will use _"Terra"_
+- Template: Select a SubQuery project template that will provide a starting point to begin development, we suggest selecting the _"Starter project"_
- Git repository (Optional): Provide a Git URL to a repo that this SubQuery project will be hosted in (when hosted in SubQuery Explorer)
-- RPC endpoint (Required): Provide a HTTPS URL to a running RPC endpoint that will be used by default for this project. This RPC node must be an archive node (have the full chain state). For this guide we will use the default value *"https://terra-columbus-5.beta.api.onfinality.io"*
+- RPC endpoint (Required): Provide a HTTPS URL to a running RPC endpoint that will be used by default for this project. This RPC node must be an archive node (have the full chain state). For this guide we will use the default value _"https://terra-columbus-5.beta.api.onfinality.io"_
- Authors (Required): Enter the owner of this SubQuery project here (e.g. your name!)
- Description (Optional): You can provide a short paragraph about your project that describes what data it contains and what users can do with it
- Version (Required): Enter a custom version number or use the default (`1.0.0`)
@@ -59,8 +59,8 @@ After the initialisation process is complete, you should see a folder with your
Last, under the project directory, run the following command to install the new project's dependencies.
- ```shell cd PROJECT_NAME yarn install ```
- ```shell cd PROJECT_NAME npm install ```
+::: code-tabs @tab:active yarn `shell cd PROJECT_NAME yarn install `
+@tab npm `shell cd PROJECT_NAME npm install ` :::
## Making Changes to your Project
@@ -91,8 +91,8 @@ type Transfer @entity {
**Important: When you make any changes to the schema file, please ensure that you regenerate your types directory. Do this now.**
- ```shell yarn codegen ```
- ```shell npm run-script codegen ```
+::: code-tabs @tab:active yarn `shell yarn codegen `
+@tab npm `shell npm run-script codegen ` :::
You'll find the generated models in the `/src/types/models` directory. For more information about the `schema.graphql` file, check out our documentation under [Build/GraphQL Schema](../build/graphql.md)
@@ -143,30 +143,30 @@ import { MsgExecuteContract } from "@terra-money/terra.js";
export async function handleEvent(
event: TerraEvent<MsgExecuteContract>
): Promise<void> {
- // Print debugging data from the event
- // logger.info(JSON.stringify(event));
-
- // Create the new transfer entity with a unique ID
- const transfer = new Transfer(
- `${event.tx.tx.txhash}-${event.msg.idx}-${event.idx}`
- );
- transfer.blockHeight = BigInt(event.block.block.block.header.height);
- transfer.txHash = event.tx.tx.txhash;
- for (const attr of event.event.attributes) {
- switch (attr.key) {
- case "sender":
- transfer.sender = attr.value;
- break;
- case "recipient":
- transfer.recipient = attr.value;
- break;
- case "amount":
- transfer.amount = attr.value;
- break;
- default:
- }
+ // Print debugging data from the event
+ // logger.info(JSON.stringify(event));
+
+ // Create the new transfer entity with a unique ID
+ const transfer = new Transfer(
+ `${event.tx.tx.txhash}-${event.msg.idx}-${event.idx}`
+ );
+ transfer.blockHeight = BigInt(event.block.block.block.header.height);
+ transfer.txHash = event.tx.tx.txhash;
+ for (const attr of event.event.attributes) {
+ switch (attr.key) {
+ case "sender":
+ transfer.sender = attr.value;
+ break;
+ case "recipient":
+ transfer.recipient = attr.value;
+ break;
+ case "amount":
+ transfer.amount = attr.value;
+ break;
+ default:
}
- await transfer.save();
+ }
+ await transfer.save();
}
```
@@ -178,7 +178,7 @@ For more information about mapping functions, check out our documentation under
In order to run your new SubQuery Project we first need to build our work. Run the build command from the project's root directory.
- ```shell yarn build ``` ```shell npm run-script build ```
+::: code-tabs @tab:active yarn `shell yarn build ` @tab npm `shell npm run-script build ` :::
**Important: Whenever you make changes to your mapping functions, you'll need to rebuild your project**
@@ -192,7 +192,7 @@ All configuration that controls how a SubQuery node is run is defined in this `d
Under the project directory, run the following command:
- ```shell yarn start:docker ``` ```shell npm run-script start:docker ```
+::: code-tabs @tab:active yarn `shell yarn start:docker ` @tab npm `shell npm run-script start:docker ` :::
It may take some time to download the required packages ([`@subql/node`](https://www.npmjs.com/package/@subql/node), [`@subql/query`](https://www.npmjs.com/package/@subql/query), and Postgres) for the first time but soon you'll see a running SubQuery node. Be patient here.
@@ -207,10 +207,7 @@ For a new SubQuery starter project, you can try the following query to get a tas
```graphql
{
query {
- transfers(
- first: 10,
- orderBy: ID_DESC
- ) {
+ transfers(first: 10, orderBy: ID_DESC) {
nodes {
id
txHash
diff --git a/docs/ko/quickstart/quickstart.md b/docs/ko/quickstart/quickstart.md
index 8f75618992d..7459ecb8b9b 100644
--- a/docs/ko/quickstart/quickstart.md
+++ b/docs/ko/quickstart/quickstart.md
@@ -89,8 +89,8 @@ After you complete the initialisation process, you will see a folder with your p
Finally, run the following command to install the new project’s dependencies from within the new project's directory.
- ```shell cd PROJECT_NAME yarn install ```
- ```shell cd PROJECT_NAME npm install ```
+::: code-tabs @tab:active yarn `shell cd PROJECT_NAME yarn install `
+@tab npm `shell cd PROJECT_NAME npm install ` :::
You have now initialised your first SubQuery project with just a few simple steps. Let’s now customise the standard template project for a specific blockchain of interest.
@@ -104,4 +104,4 @@ There are 3 important files that need to be modified. These are:
2. The Project Manifest in `project.yaml`.
3. The Mapping functions in `src/mappings/` directory.
-SubQuery supports various blockchain networks and provides a dedicated guide for each of them. Select your preferred blockchain under 2. Specific Chains and continue the quick start guide.
\ No newline at end of file
+SubQuery supports various blockchain networks and provides a dedicated guide for each of them. Select your preferred blockchain under 2. Specific Chains and continue the quick start guide.
diff --git a/docs/ko/run_publish/connect.md b/docs/ko/run_publish/connect.md
index 605fc02dba1..e21a29377f2 100644
--- a/docs/ko/run_publish/connect.md
+++ b/docs/ko/run_publish/connect.md
@@ -2,10 +2,10 @@
Once your deployment has successfully completed and our nodes have indexed your data from the chain, you'll be able to connect to your project via the displayed Query endpoint.
-![배포 및 동기화된 프로젝트](/assets/img/projects-deploy-sync.png)
+![배포 및 동기화된 프로젝트](/assets/img/projects_deploy_sync.png)
프로젝트 제목 옆에 있는 3개의 점을 클릭하여 SubQuery 탐색기로 표시할 수도 있습니다. There you can use the in browser playground to get started.
-![Projects in SubQuery Explorer](/assets/img/projects-explorer.png)
+![Projects in SubQuery Explorer](/assets/img/projects_explorer.png)
-::: info Note Learn more about the [GraphQL Query language.](./graphql.md) :::
+::: tip Note Learn more about the [GraphQL Query language.](./graphql.md) :::
diff --git a/docs/ko/run_publish/query.md b/docs/ko/run_publish/query.md
index 0e427715fa4..e1ac6575f51 100644
--- a/docs/ko/run_publish/query.md
+++ b/docs/ko/run_publish/query.md
@@ -12,4 +12,4 @@ SubQuery Explorer를 사용하면 쉽게 시작할 수 있습니다. 저희는
On the top right of the playground, you'll find a _Docs_ button that will open a documentation drawer. 이 문서는 자동으로 생성되어 조회할 수 있는 Entity와 Method를 찾는데 도움이 됩니다.
-::: info Note Learn more about the [GraphQL Query language.](./graphql.md) :::
+::: tip Note Learn more about the [GraphQL Query language.](./graphql.md) :::
diff --git a/docs/ko/run_publish/references.md b/docs/ko/run_publish/references.md
index 66cd155a16d..8da7819dfcd 100644
--- a/docs/ko/run_publish/references.md
+++ b/docs/ko/run_publish/references.md
@@ -21,10 +21,10 @@ COMMANDS
This command uses webpack to generate a bundle of a subquery project.
-| Options | 설명 |
-| ------------------ | ---------------------------------------------------------------------------------------------------------- |
-| -l, --location | local folder of subquery project (if not in folder already) |
-| -o, --output | specify output folder of build e.g. build-folder |
+| Options | 설명 |
+| ------------------ | ----------------------------------------------------------- | ----------- | ---- | ----------------------- |
+| -l, --location | local folder of subquery project (if not in folder already) |
+| -o, --output | specify output folder of build e.g. build-folder |
| --mode=(production | prod | development | dev) | [ default: production ] |
- With `subql build` you can specify additional entry points in exports field although it will always build `index.ts` automatically.
@@ -106,7 +106,7 @@ This displays the current version.
### reindex
-:::warning In order to use this command, you require `@subql/node:v1.10.0`/`@subql/node-:v1.10.0` or above. :::
+:::warning In order to use this command, you require `@subql/node:v1.10.0`/`@subql/node-YOURNETWORK:v1.10.0` or above. :::
When using reindex command, historical must be enabled for the targeted project (`--disable-historical=false`). After starting the project, it would print out a log stating if historical is enabled or not.
@@ -122,7 +122,7 @@ If the `targetHeight` is less than the declared starting height, it will execute
subql-node -f /example/subql-project reindex --targetHeight=30
```
-::: info Note
+::: tip Note
Once the command is executed and the state has been rolled back the the specified height, the application will exit. You can then start up the indexer to proceed again from this height.
:::
@@ -134,7 +134,7 @@ This command forces the project schemas and tables to be regenerated. It is help
`-f`, `--subquery` flag must be passed in, to set path of the targeted project.
-::: info Note Similar to `reindex` command, the application would exit upon completion. :::
+::: tip Note Similar to `reindex` command, the application would exit upon completion. :::
```shell
subql-node -f /example/subql-project force-clean
@@ -346,7 +346,7 @@ This will move block fetching and processing into a worker. By default, this fea
It is at an early experimental stage at the moment, but we plan to enable it by default. :::
-::: info Note
+::: tip Note
This feature is available for Substrate and Cosmos, and soon will be integrated for Avalanche.
:::
diff --git a/docs/ko/run_publish/run.md b/docs/ko/run_publish/run.md
index b98c2543378..562e49b153f 100644
--- a/docs/ko/run_publish/run.md
+++ b/docs/ko/run_publish/run.md
@@ -4,7 +4,7 @@ This guide works through how to run a local SubQuery node on your infrastructure
## Using Docker
-An alternative solution is to run a Docker Container, defined by the `docker-compose.yml` file. For a new project that has been just initialised you won't need to change anything here.
+An alternative solution is to run a **Docker Container**, defined by the `docker-compose.yml` file. For a new project that has just been initialised, you won't need to change anything here.
Under the project directory run the following command:
@@ -12,7 +12,7 @@ Under the project directory run the following command:
docker-compose pull && docker-compose up
```
-::: info Note It may take some time to download the required packages ([`@subql/node`](https://www.npmjs.com/package/@subql/node), [`@subql/query`](https://www.npmjs.com/package/@subql/query), and Postgres) for the first time but soon you'll see a running SubQuery node. :::
+::: tip Note It may take some time to download the required packages ([`@subql/node`](https://www.npmjs.com/package/@subql/node), [`@subql/query`](https://www.npmjs.com/package/@subql/query), and Postgres) for the first time but soon you'll see a running SubQuery node. :::
## Running an Indexer (subql/node)
@@ -32,90 +32,80 @@ CREATE EXTENSION IF NOT EXISTS btree_gist;
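If you are preparing the database by hand, a hedged one-liner for this step might look like the following (assuming `psql` is available and using placeholder connection details):

```shell
# Placeholder connection details – run this against the database the indexer will use
psql -h localhost -U postgres -d postgres -c "CREATE EXTENSION IF NOT EXISTS btree_gist;"
```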
### Installation
-
-
+::: code-tabs
+@tab Substrate/Polkadot
```shell
# NPM
npm install -g @subql/node
```
-
-
+@tab Terra
```shell
# NPM
npm install -g @subql/node-terra
```
-
-
+@tab Avalanche
```shell
# NPM
npm install -g @subql/node-avalanche
```
-
-
+@tab Cosmos
```shell
# NPM
npm install -g @subql/node-cosmos
```
-
-
+@tab Algorand
```shell
# NPM
npm install -g @subql/node-algorand
```
-
-
+:::
::: danger Please note that we **DO NOT** encourage the use of `yarn global` due to its poor dependency management which may lead to errors down the line. :::
Once installed, you can start a node with the following command:
-
-
+::: code-tabs
+@tab Substrate/Polkadot
```shell
subql-node
```
-
-
+@tab Terra
```shell
subql-node-terra
```
-
-
+@tab Avalanche
```shell
subql-node-avalanche
```
-
-
+@tab Cosmos
```shell
subql-node-cosmos
```
-
-
+@tab Algorand
```shell
subql-node-algorand
```
-
-
+:::
### Key Commands
@@ -123,43 +113,38 @@ The following commands will assist you to complete the configuration of a SubQue
#### Point to local project path
-
-
+::: code-tabs
+@tab Substrate/Polkadot
```shell
subql-node -f your-project-path
```
-
-
+@tab Terra
```shell
subql-node-terra -f your-project-path
```
-
-
+@tab Avalanche
```shell
subql-node-avalanche -f your-project-path
```
-
-
+@tab Cosmos
```shell
subql-node-cosmos -f your-project-path
```
-
-
+@tab Algorand
```shell
subql-node-algorand -f your-project-path
```
-
-
+:::
#### Connect to database
@@ -176,43 +161,38 @@ Depending on the configuration of your Postgres database (e.g. a different datab
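As a hedged illustration, the connection details are usually supplied through environment variables before starting the node (the values below are placeholders for a local Postgres instance):

```shell
# Placeholder credentials – point these at your own Postgres instance
export DB_USER=postgres
export DB_PASS=postgres
export DB_DATABASE=postgres
export DB_HOST=localhost
export DB_PORT=5432
subql-node -f your-project-path
```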
#### Specify a configuration file
-
-
+::: code-tabs
+@tab Substrate/Polkadot
```shell
subql-node -c your-project-config.yml
```
-
-
+@tab Terra
```shell
subql-node-terra -c your-project-config.yml
```
-
-
+@tab Avalanche
```shell
subql-node-avalanche -c your-project-config.yml
```
-
-
+@tab Cosmos
```shell
subql-node-cosmos -c your-project-config.yml
```
-
-
+@tab Algorand
```shell
subql-node-algorand -c your-project-config.yml
```
-
-
+:::
This will point the query node to a manifest file which can be in YAML or JSON format.
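A minimal sketch of such a configuration file, assuming the keys mirror the CLI flags used elsewhere on this page (check the references page for the exact key names):

```yml
# Hypothetical config.yml – key names are assumed to mirror the CLI flags
subquery: ./your-project-path # same as the -f/--subquery flag
network-endpoint: wss://polkadot.api.onfinality.io/public-ws
```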
@@ -230,43 +210,38 @@ When the indexer first indexes the chain, fetching single blocks will significan
#### Run in local mode
-
-
+::: code-tabs
+@tab Substrate/Polkadot
```shell
subql-node -f your-project-path --local
```
-
-
+@tab Terra
```shell
subql-node-terra -f your-project-path --local
```
-
-
+@tab Avalanche
```shell
subql-node-avalanche -f your-project-path --local
```
-
-
+@tab Cosmos
```shell
subql-node-cosmos -f your-project-path --local
```
-
-
+@tab Algorand
```shell
subql-node-algorand -f your-project-path --local
```
-
-
+:::
For debugging purposes, users can run the node in local mode. Switching to local mode will create Postgres tables in the default schema `public`.
diff --git a/docs/ko/run_publish/subscription.md b/docs/ko/run_publish/subscription.md
index 9bbc2661f96..29cf660d492 100644
--- a/docs/ko/run_publish/subscription.md
+++ b/docs/ko/run_publish/subscription.md
@@ -6,7 +6,7 @@ SubQuery now also supports Graphql Subscriptions. Like queries, subscriptions en
Subscriptions are very useful when you want your client application to change data or show some new data as soon as that change occurs or the new data is available. Subscriptions allow you to _subscribe_ to your SubQuery project for changes.
-::: info Note Read more about [Subscriptions](https://www.apollographql.com/docs/react/data/subscriptions/). :::
+::: tip Note Read more about [Subscriptions](https://www.apollographql.com/docs/react/data/subscriptions/). :::
## How to Subscribe to an Entity
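As a rough sketch, subscribing to changes on a `Transfer` entity (as used in the quick-start examples) would look something like this; the payload fields shown are assumed to be the standard ones exposed by the subscription service:

```graphql
subscription {
  transfer {
    id
    mutation_type
    _entity
  }
}
```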
diff --git a/docs/ko/run_publish/upgrade.md b/docs/ko/run_publish/upgrade.md
index b1391baa06a..883f2628ad3 100644
--- a/docs/ko/run_publish/upgrade.md
+++ b/docs/ko/run_publish/upgrade.md
@@ -77,10 +77,10 @@ If you just want to upgrade to the latest indexer ([`@subql/node`](https://www.n
배포가 성공적으로 완료되고 노드가 체인에서 데이터를 인덱스화하면 표출된 GraphQL 쿼리 엔드포인트를 통해 프로젝트에 접속할 수 있습니다.
-![배포 및 동기화된 프로젝트](/assets/img/projects-deploy-sync.png)
+![배포 및 동기화된 프로젝트](/assets/img/projects_deploy_sync.png)
프로젝트 제목 옆에 있는 3개의 점을 클릭하여 SubQuery 탐색기로 표시할 수도 있습니다. There you can use the in browser playground to get started - [read more about how to use our Explorer here](../run_publish/query.md).
-![Projects in SubQuery Explorer](/assets/img/projects-explorer.png)
+![Projects in SubQuery Explorer](/assets/img/projects_explorer.png)
-::: info Note Learn more about the [GraphQL Query language.](./graphql.md) :::
+::: tip Note Learn more about the [GraphQL Query language.](./graphql.md) :::
diff --git a/docs/ko/subquery_network/introduction.md b/docs/ko/subquery_network/introduction.md
index 295d4a0e0d8..6394db677b8 100644
--- a/docs/ko/subquery_network/introduction.md
+++ b/docs/ko/subquery_network/introduction.md
@@ -18,22 +18,22 @@ There’s a role for everyone in the network, from highly technical developers t
Consumers will ask the SubQuery Network for specific data for their dApps or tools, and pay an advertised amount of SQT for each request.
-::: info Note Learn more about [Consumers](./consumers.md). :::
+::: tip Note Learn more about [Consumers](./consumers.md). :::
### Indexers
Indexers will run and maintain high quality SubQuery projects in their own infrastructure, running both the indexer and query service, and will be rewarded in SQT for the requests that they serve.
-::: info Note Learn more about [Indexers](./indexers.md). :::
+::: tip Note Learn more about [Indexers](./indexers.md). :::
### Delegators
Delegators will participate in the Network by supporting their favourite Indexers to earn rewards based on the work those indexers do.
-::: info Note Learn more about [Delegators](./delegators.md). :::
+::: tip Note Learn more about [Delegators](./delegators.md). :::
### Architects
Architects are the builders of the SubQuery projects that the Network runs on. They author and publish SubQuery projects for the Network to index and run.
-::: info Note Learn more about [how to build your first SubQuery project](../build/introduction.md). :::
+::: tip Note Learn more about [how to build your first SubQuery project](../build/introduction.md). :::
diff --git a/docs/miscellaneous/contributing.md b/docs/miscellaneous/contributing.md
index 12b1bdfefed..4ab8192656a 100644
--- a/docs/miscellaneous/contributing.md
+++ b/docs/miscellaneous/contributing.md
@@ -2,7 +2,7 @@
Welcome and a big thank you for considering contributing to this SubQuery project! Together we can pave the way to a more decentralised future.
-::: info Note
+::: tip Note
This documentation is actively maintained by the SubQuery team. We welcome your contributions. You can do so by forking our GitHub project and making changes to all the documentation markdown files under the `docs` directory.
:::
@@ -16,8 +16,8 @@ We take our open source community projects and responsibility seriously and hold
Contributions to our repositories are made through Issues and Pull Requests (PRs). A few general guidelines that cover both:
-* Search for existing Issues and PRs before creating your own.
-* We work hard to makes sure issues are handled in promptly but, depending on the impact, it could take a while to investigate the root cause. A friendly @ mention in the comment thread to the submitter or a contributor can help draw attention if your issue is blocking.
+- Search for existing Issues and PRs before creating your own.
+- We work hard to make sure issues are handled promptly but, depending on the impact, it could take a while to investigate the root cause. A friendly @ mention in the comment thread to the submitter or a contributor can help draw attention if your issue is blocking.
## How to Contribute
@@ -25,32 +25,32 @@ Contributions to our repositories are made through Issues and Pull Requests (PRs
Bugs are tracked as GitHub issues. When logging an issue, explain the problem and include additional details to help maintainers reproduce the problem:
-* Use a clear and descriptive title for the issue to identify the problem.
-* Describe the exact steps to reproduce the problem.
-* Describe the behavior you observed after following the steps.
-* Explain which behavior you expected to see instead and why.
-* Include screenshots if possible.
+- Use a clear and descriptive title for the issue to identify the problem.
+- Describe the exact steps to reproduce the problem.
+- Describe the behavior you observed after following the steps.
+- Explain which behavior you expected to see instead and why.
+- Include screenshots if possible.
### Submitting Pull Requests
In general, we follow the "fork-and-pull" Git workflow:
-* Fork the repository to your own Github account.
-* Clone the project to your machine.
-* Create a branch locally with a succinct but descriptive name.
-* Commit changes to the branch.
-* Following any formatting and testing guidelines specific to this repo.
-* Push changes to your fork.
-* Open a PR in our repository.
+- Fork the repository to your own Github account.
+- Clone the project to your machine.
+- Create a branch locally with a succinct but descriptive name.
+- Commit changes to the branch.
+- Following any formatting and testing guidelines specific to this repo.
+- Push changes to your fork.
+- Open a PR in our repository.
## Coding Conventions
### Git Commit Messages
-* Use the present tense ("Add feature" not "Added feature").
-* Use the imperative mood ("Move cursor to..." not "Moves cursor to...").
-* Limit the first line to 72 characters or less.
+- Use the present tense ("Add feature" not "Added feature").
+- Use the imperative mood ("Move cursor to..." not "Moves cursor to...").
+- Limit the first line to 72 characters or less.
### JavaScript Styleguide
-* All JavaScript code is linted with Prettier and ESLint.
+- All JavaScript code is linted with Prettier and ESLint.
diff --git a/docs/quickstart/quickstart.md b/docs/quickstart/quickstart.md
index bc1e58556d4..fc468dee86c 100644
--- a/docs/quickstart/quickstart.md
+++ b/docs/quickstart/quickstart.md
@@ -92,25 +92,22 @@ After you complete the initialisation process, you will see a folder with your p
Finally, run the following command to install the new project’s dependencies from within the new project's directory.
-
-
+::: code-tabs
+@tab:active yarn
```shell
cd PROJECT_NAME
yarn install
```
-
-
-
+@tab npm
```shell
cd PROJECT_NAME
npm install
```
-
-
+:::
You have now initialised your first SubQuery project with just a few simple steps. Let’s now customise the standard template project for a specific blockchain of interest.
@@ -124,4 +121,4 @@ There are 3 important files that need to be modified. These are:
2. The Project Manifest in `project.yaml`.
3. The Mapping functions in `src/mappings/` directory.
-SubQuery supports various blockchain networks and provides a dedicated guide for each of them. Select your preferred blockchain under 2. Specific Chains and continue the quick start guide.
\ No newline at end of file
+SubQuery supports various blockchain networks and provides a dedicated guide for each of them. Select your preferred blockchain under 2. Specific Chains and continue the quick start guide.
diff --git a/docs/quickstart/quickstart_chains/algorand.md b/docs/quickstart/quickstart_chains/algorand.md
index 77bd5604603..fba84f3ec22 100644
--- a/docs/quickstart/quickstart_chains/algorand.md
+++ b/docs/quickstart/quickstart_chains/algorand.md
@@ -12,7 +12,7 @@ Now, let's move forward and update these configurations.
Previously, in the [1. Create a New Project](../quickstart.md) section, you must have noted [3 key files](../quickstart.md#_3-make-changes-to-your-project). Let's begin updating them one by one.
-::: info Note
+::: tip Note
The final code of this project can be found [here](https://github.com/jamesbayly/algorand-planet-watch).
:::
@@ -36,23 +36,20 @@ type Transaction @entity {
When you make any changes to the schema file, please ensure that you regenerate your types directory.
:::
-
-
+::: code-tabs
+@tab:active yarn
```shell
yarn codegen
```
-
-
-
+@tab npm
```shell
npm run-script codegen
```
-
-
+:::
You will find the generated models in the `/src/types/models` directory.
@@ -133,22 +130,20 @@ Check out our [Mappings](../../build/mapping/algorand.md) documentation to get m
Next, build your work to run your new SubQuery project. Run the build command from the project's root directory as given here:
-
-
+::: code-tabs
+@tab:active yarn
```shell
yarn build
```
-
-
+@tab npm
```shell
npm run-script build
```
-
-
+:::
::: warning Important
Whenever you make changes to your mapping functions, you must rebuild your project.
@@ -166,24 +161,22 @@ However, visit the [Running SubQuery Locally](../../run_publish/run.md) to get m
Run the following command under the project directory:
-
-
+::: code-tabs
+@tab:active yarn
```shell
yarn start:docker
```
-
-
+@tab npm
```shell
npm run-script start:docker
```
-
-
+:::
-::: info Note
+::: tip Note
It may take a few minutes to download the required images and start the various nodes and Postgres databases.
:::
@@ -391,7 +384,7 @@ You will see the result similar to below:
}
```
-::: info Note
+::: tip Note
The final code of this project can be found [here](https://github.com/jamesbayly/algorand-planet-watch).
:::
diff --git a/docs/quickstart/quickstart_chains/avalanche.md b/docs/quickstart/quickstart_chains/avalanche.md
index 34cd957dea7..025e40f1362 100644
--- a/docs/quickstart/quickstart_chains/avalanche.md
+++ b/docs/quickstart/quickstart_chains/avalanche.md
@@ -11,7 +11,7 @@ Before we begin, make sure that you have initialised your project using the prov
Now, let's move forward and update these configurations.
Previously, in the [1. Create a New Project](../quickstart.md) section, you must have noted [3 key files](../quickstart.md#_3-make-changes-to-your-project). Let's begin updating them one by one.
-::: info Note
+::: tip Note
The final code of this project can be found [here](https://github.com/jamesbayly/pangolin-rewards-tutorial).
:::
@@ -36,23 +36,20 @@ type PangolinRewards @entity {
When you make any changes to the schema file, please ensure that you regenerate your types directory.
:::
-
-
+::: code-tabs
+@tab:active yarn
```shell
yarn codegen
```
-
-
-
+@tab npm
```shell
npm run-script codegen
```
-
-
+:::
You will find the generated models in the `/src/types/models` directory.
@@ -144,22 +141,20 @@ Check out our [Mappings](../../build/mapping/avalanche.md) documentation to get
Next, build your work to run your new SubQuery project. Run the build command from the project's root directory as given here:
-
-
+::: code-tabs
+@tab:active yarn
```shell
yarn build
```
-
-
+@tab npm
```shell
npm run-script build
```
-
-
+:::
::: warning Important
Whenever you make changes to your mapping functions, you must rebuild your project.
@@ -177,24 +172,22 @@ However, visit the [Running SubQuery Locally](../../run_publish/run.md) to get m
Run the following command under the project directory:
-
-
+::: code-tabs
+@tab:active yarn
```shell
yarn start:docker
```
-
-
+@tab npm
```shell
npm run-script start:docker
```
-
-
+:::
-::: info Note
+::: tip Note
It may take a few minutes to download the required images and start the various nodes and Postgres databases.
:::
@@ -246,7 +239,7 @@ You will see the result similar to below:
}
```
-::: info Note
+::: tip Note
The final code of this project can be found [here](https://github.com/jamesbayly/pangolin-rewards-tutorial).
:::
diff --git a/docs/quickstart/quickstart_chains/cosmos-cronos.md b/docs/quickstart/quickstart_chains/cosmos-cronos.md
index 17ab7836385..8d287ec13a9 100644
--- a/docs/quickstart/quickstart_chains/cosmos-cronos.md
+++ b/docs/quickstart/quickstart_chains/cosmos-cronos.md
@@ -14,7 +14,7 @@ Now, let's move ahead in the process and update these configurations.
Previously, in the [1. Create a New Project](../quickstart.md) section, you must have noted [3 key files](../quickstart.md#_3-make-changes-to-your-project). Let's begin updating them one by one.
-::: info Note
+::: tip Note
The final code of this project can be found [here](https://github.com/deverka/cronos_crow_token_transfers).
:::
@@ -37,23 +37,20 @@ type Transfer @entity {
When you make any changes to the schema file, do not forget to regenerate your types directory.
:::
-
-
+::: code-tabs
+@tab:active yarn
```shell
yarn codegen
```
-
-
-
+@tab npm
```shell
npm run-script codegen
```
-
-
+:::
You will find the generated models in the `/src/types/models` directory.
@@ -76,8 +73,8 @@ Note that the manifest file has already been set up correctly and doesn’t requ
There are two versions of this file depending on your choice to index data via the ETH or Cosmos RPC
:::
-
-
+::: code-tabs
+@tab ETH
```yml
dataSources:
@@ -102,9 +99,7 @@ dataSources:
- null
```
-
-
-
+@tab Cosmos RPC
```yml
dataSources:
@@ -131,12 +126,11 @@ dataSources:
- null
```
-
-
+:::
The above code defines that you will be running a `handleTransfer` mapping function whenever there is an event emitted with the `transfer` method. Check out our [Manifest File](../../build/manifest/cosmos.md) documentation to get more information about the Project Manifest (`project.yaml`) file.
-::: info Note
+::: tip Note
Please note that Cro Crow token requires a specific ABI interface. You need to:
- Get the [Cro Crow contract ABI](https://cronoscan.com/address/0xe4ab77ed89528d90e6bcf0e1ac99c58da24e79d5#code).
@@ -159,8 +153,8 @@ There are two versions of this file depending on your choice to index data via t
Update your mapping files to match the following (**note the additional imports**):
-
-
+::: code-tabs
+@tab ETH
```ts
import { Transfer } from "../types";
@@ -189,9 +183,7 @@ export async function handleTransfer(
}
```
-
-
-
+@tab Cosmos RPC
```ts
import { Transfer } from "../types";
@@ -220,8 +212,7 @@ export async function handleTransfer(
}
```
-
-
+:::
Let’s understand how the above code works. Here, the function receives an `EthereumLog` or `EthermintEvmEvent` which includes data on the payload. We extract this data and then create a new `Transfer` entity defined earlier in the `schema.graphql` file. After that we use the `.save()` function to save the new entity (SubQuery will automatically save this to the database). Check out our [Mappings](../../build/mapping/cosmos.md) documentation and get information on the mapping functions in detail.
@@ -229,22 +220,20 @@ Let’s understand how the above code works. Here, the function receives an `Eth
Next, build your work to run your new SubQuery project. Run the build command from the project's root directory as given here:
-
-
+::: code-tabs
+@tab:active yarn
```shell
yarn build
```
-
-
+@tab npm
```shell
npm run-script build
```
-
-
+:::
::: warning Important
Whenever you make changes to your mapping functions, you must rebuild your project.
@@ -262,24 +251,22 @@ However, visit the [Running SubQuery Locally](../../run_publish/run.md) to get m
Run the following command under the project directory:
-
-
+::: code-tabs
+@tab:active yarn
```shell
yarn start:docker
```
-
-
+@tab npm
```shell
npm run-script start:docker
```
-
-
+:::
-::: info Note
+::: tip Note
It may take a few minutes to download the required images and start the various nodes and Postgres databases.
:::
@@ -341,7 +328,7 @@ You will see the result similar to below:
}
```
-::: info Note
+::: tip Note
The final code of this project can be found [here](https://github.com/deverka/cronos_crow_token_transfers).
:::
diff --git a/docs/quickstart/quickstart_chains/cosmos-thorchain.md b/docs/quickstart/quickstart_chains/cosmos-thorchain.md
index 93b552775c1..c44aeed748b 100644
--- a/docs/quickstart/quickstart_chains/cosmos-thorchain.md
+++ b/docs/quickstart/quickstart_chains/cosmos-thorchain.md
@@ -14,7 +14,7 @@ Now, let's move ahead in the process and update these configurations.
Previously, in the [1. Create a New Project](../quickstart.md) section, you must have noted [3 key files](../quickstart.md#_3-make-changes-to-your-project). Let's begin updating them one by one.
-::: info Note
+::: tip
The final code of this project can be found [here](https://github.com/subquery/cosmos-subql-starter/tree/main/Thorchain/thorchain-starter).
:::
@@ -53,23 +53,20 @@ type Coin @entity {
When you make any changes to the schema file, do not forget to regenerate your types directory.
:::
-
-
+::: code-tabs
+@tab:active yarn
```shell
yarn codegen
```
-
-
-
+@tab npm
```shell
npm run-script codegen
```
-
-
+:::
You will find the generated models in the `/src/types/models` directory.
@@ -175,22 +172,20 @@ Let’s understand how the above code works. Here, the function receives an `Cos
Next, build your work to run your new SubQuery project. Run the build command from the project's root directory as given here:
-
-
+::: code-tabs
+@tab:active yarn
```shell
yarn build
```
-
-
+@tab npm
```shell
npm run-script build
```
-
-
+:::
::: warning Important
Whenever you make changes to your mapping functions, you must rebuild your project.
@@ -208,24 +203,22 @@ However, visit the [Running SubQuery Locally](../../run_publish/run.md) to get m
Run the following command under the project directory:
-
-
+::: code-tabs
+@tab:active yarn
```shell
yarn start:docker
```
-
-
+@tab npm
```shell
npm run-script start:docker
```
-
-
+:::
-::: info Note
+::: tip
It may take a few minutes to download the required images and start the various nodes and Postgres databases.
:::
@@ -284,7 +277,7 @@ query {
You will see the result in JSON
-::: info Note
+::: tip
The final code of this project can be found [here](https://github.com/subquery/cosmos-subql-starter/tree/main/Thorchain/thorchain-starter).
:::
diff --git a/docs/quickstart/quickstart_chains/cosmos.md b/docs/quickstart/quickstart_chains/cosmos.md
index 4defb3144fa..a0228dd9e32 100644
--- a/docs/quickstart/quickstart_chains/cosmos.md
+++ b/docs/quickstart/quickstart_chains/cosmos.md
@@ -18,7 +18,7 @@ Note that we are using Juno as the example here, but SubQuery supports all the f
- [Cronos](https://github.com/subquery/cosmos-subql-starter/tree/main/Cronos)
- and more, view the full list in [the cosmos-subql-starter repository](https://github.com/subquery/cosmos-subql-starter).
-::: info Note
+::: tip Note
SubQuery can support more Cosmos zones than listed above.
It requires importing `protobufs definitions` for specific chain types.
See [Custom Cosmos Chains](../../build/manifest/cosmos.md#custom-chains) for more information.
@@ -28,7 +28,7 @@ Now, let's move ahead in the process and update these configurations.
Previously, in the [1. Create a New Project](../quickstart.md) section, you must have noted [3 key files](../quickstart.md#_3-make-changes-to-your-project). Let's begin updating them one by one.
-::: info Note
+::: tip Note
The final code of this project can be found [here](https://github.com/jamesbayly/juno-terra-developer-fund-votes).
:::
@@ -52,23 +52,20 @@ type Vote @entity {
When you make any changes to the schema file, do not forget to regenerate your types directory.
:::
-
-
+::: code-tabs
+@tab:active yarn
```shell
yarn codegen
```
-
-
-
+@tab npm
```shell
npm run-script codegen
```
-
-
+:::
You will find the generated models in the `/src/types/models` directory.
@@ -150,22 +147,20 @@ Check out our [Mappings](../../build/mapping/cosmos.md) documentation and get in
Next, build your work to run your new SubQuery project. Run the build command from the project's root directory as given here:
-
-
+::: code-tabs
+@tab:active yarn
```shell
yarn build
```
-
-
+@tab npm
```shell
npm run-script build
```
-
-
+:::
::: warning Important
Whenever you make changes to your mapping functions, you must rebuild your project.
@@ -183,24 +178,22 @@ However, visit the [Running SubQuery Locally](../../run_publish/run.md) to get m
Run the following command under the project directory:
-
-
+::: code-tabs
+@tab:active yarn
```shell
yarn start:docker
```
-
-
+@tab npm
```shell
npm run-script start:docker
```
-
-
+:::
-::: info Note
+::: tip Note
It may take a few minutes to download the required images and start the various nodes and Postgres databases.
:::
@@ -260,7 +253,7 @@ You will see the result similar to below:
}
```
-::: info Note
+::: tip Note
The final code of this project can be found [here](https://github.com/jamesbayly/juno-terra-developer-fund-votes).
:::
diff --git a/docs/quickstart/quickstart_chains/flare.md b/docs/quickstart/quickstart_chains/flare.md
index ed275a73d65..66611398bf4 100644
--- a/docs/quickstart/quickstart_chains/flare.md
+++ b/docs/quickstart/quickstart_chains/flare.md
@@ -12,7 +12,7 @@ Now, let's move forward and update these configurations.
Previously, in the [1. Create a New Project](../quickstart.md) section, you must have noted [3 key files](../quickstart.md#_3-make-changes-to-your-project). Let's begin updating them one by one.
-::: info Note
+::: tip Note
The final code of this project can be found [here](https://github.com/jamesbayly/subql-flare-ftso-rewards).
:::
@@ -45,23 +45,20 @@ Since we have a [many-to-many relationship](../../build/graphql.md#many-to-many-
When you make any changes to the schema file, please ensure that you regenerate your types directory.
:::
-
-
+::: code-tabs
+@tab:active yarn
```shell
yarn codegen
```
-
-
-
+@tab npm
```shell
npm run-script codegen
```
-
-
+:::
You will find the generated models in the `/src/types/models` directory.
@@ -185,22 +182,20 @@ Check out our [Mappings](../../build/mapping/flare.md) documentation to get more
Next, build your work to run your new SubQuery project. Run the build command from the project's root directory as given here:
-
-
+::: code-tabs
+@tab:active yarn
```shell
yarn build
```
-
-
+@tab npm
```shell
npm run-script build
```
-
-
+:::
::: warning Important
Whenever you make changes to your mapping functions, you must rebuild your project.
@@ -218,24 +213,22 @@ However, visit the [Running SubQuery Locally](../../run_publish/run.md) to get m
Run the following command under the project directory:
-
-
+::: code-tabs
+@tab:active yarn
```shell
yarn start:docker
```
-
-
+@tab npm
```shell
npm run-script start:docker
```
-
-
+:::
-::: info Note
+::: tip Note
It may take a few minutes to download the required images and start the various nodes and Postgres databases.
:::
@@ -308,7 +301,7 @@ You will see the result similar to below:
}
```
-::: info Note
+::: tip Note
The final code of this project can be found [here](https://github.com/jamesbayly/subql-flare-ftso-rewards).
:::
diff --git a/docs/quickstart/quickstart_chains/polkadot-humanode.md b/docs/quickstart/quickstart_chains/polkadot-humanode.md
index 9574fc8fccd..8af16f4677c 100644
--- a/docs/quickstart/quickstart_chains/polkadot-humanode.md
+++ b/docs/quickstart/quickstart_chains/polkadot-humanode.md
@@ -1,4 +1,4 @@
-# Humanode Quick Start
+# Polkadot Quick Start (Humanode)
## Goals
@@ -38,20 +38,21 @@ type ImOnlineSomeOffline @entity {
While making any changes to the schema file, make sure to regenerate your types directory
:::
-
-
- ```shell
- yarn codegen
- ```
-
-
-
- ```shell
- npm run-script codegen
- ```
-
-
-
+::: code-tabs
+@tab:active yarn
+
+```shell
+yarn codegen
+```
+
+@tab npm
+
+```shell
+npm run-script codegen
+```
+
+:::
+
You will find the generated models in the `/src/types/models` directory.
Check out the [GraphQL Schema](../../build/graphql.md) documentation to get in-depth information on `schema.graphql` file.
@@ -87,10 +88,10 @@ dataSources:
filter:
module: imOnline
method: SomeOffline
- ```
-
+```
+
This indicates that you will be running the `handleBioauthNewAuthenticationEvent` and `handleImonlineSomeOfflineEvent` mapping functions whenever there are events emitted from the `bioauth` and `imOnline` modules with the `NewAuthentication` and `SomeOffline` methods, respectively.
-
+
Check out our [documentation](../../build/manifest/polkadot.md) to get more information about the Project Manifest (`project.yaml`) file.
Next, let’s proceed ahead with the Mapping Function’s configuration.
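For orientation, a minimal hedged sketch of what one of these handlers could look like is shown below; the published project's implementation may assign different fields.

```ts
// Hedged sketch: the entity fields are assumptions; only the handler wiring
// mirrors what the manifest above declares.
import { SubstrateEvent } from "@subql/types";
import { ImOnlineSomeOffline } from "../types";

export async function handleImonlineSomeOfflineEvent(
  event: SubstrateEvent
): Promise<void> {
  // Key the record by block number and event index so it stays unique.
  const record = new ImOnlineSomeOffline(
    `${event.block.block.header.number.toString()}-${event.idx}`
  );
  record.blockNumber = BigInt(event.block.block.header.number.toString()); // assumed field
  await record.save();
}
```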
@@ -148,24 +149,23 @@ export async function handleImonlineSomeOfflineEvent(
```
## 4. Building Your Project
+
Next, build your work to run your new SubQuery project. Run the build command from the project's root directory as given here:
-
-
+::: code-tabs
+@tab:active yarn
```shell
yarn build
```
-
-
+@tab npm
```shell
npm run-script build
```
-
-
+:::
::: warning Important
Make sure to rebuild your project when you change your mapping functions.
@@ -183,22 +183,20 @@ However, visit Running [SubQuery Locally](../../run_publish/run.html) to get mor
Run the following command under the project directory:
-
-
+::: code-tabs
+@tab:active yarn
```shell
yarn start:docker
```
-
-
+@tab npm
```shell
npm run-script start:docker
```
-
-
+:::
::: warning Note
It may take a few minutes to download the required images and start various nodes and Postgres databases.
@@ -222,7 +220,7 @@ query {
}
}
imOnlineSomeOfflineByNodeId (nodeId : 5) {
- id
+ id
}
_metadata {
lastProcessedHeight
diff --git a/docs/quickstart/quickstart_chains/polkadot.md b/docs/quickstart/quickstart_chains/polkadot.md
index 0e53c579ccb..3040a6e2151 100644
--- a/docs/quickstart/quickstart_chains/polkadot.md
+++ b/docs/quickstart/quickstart_chains/polkadot.md
@@ -32,19 +32,20 @@ type Transfer @entity {
When you make any changes to the schema file, please ensure that you regenerate your types directory.
:::
-
-
- ```shell
- yarn codegen
- ```
-
-
-
- ```shell
- npm run-script codegen
- ```
-
-
+::: code-tabs
+@tab:active yarn
+
+```shell
+yarn codegen
+```
+
+@tab npm
+
+```shell
+npm run-script codegen
+```
+
+:::
You will find the generated models in the `/src/types/models` directory.
@@ -129,22 +130,20 @@ Check out our [Mappings](../../build/mapping/polkadot.md) documentation to get d
Next, build your work to run your new SubQuery project. Run the build command from the project's root directory as given here:
-
-
+::: code-tabs
+@tab:active yarn
```shell
yarn build
```
-
-
+@tab npm
```shell
npm run-script build
```
-
-
+:::
::: warning Important
Whenever you make changes to your mapping functions, make sure to rebuild your project.
@@ -162,24 +161,22 @@ However, visit the [Running SubQuery Locally](../../run_publish/run.md) to get m
Run the following command under the project directory:
-
-
+::: code-tabs
+@tab:active yarn
```shell
yarn start:docker
```
-
-
+@tab npm
```shell
npm run-script start:docker
```
-
-
+:::
-::: info Note
+::: tip
It may take a few minutes to download the required images and start the various nodes and Postgres databases.
:::
diff --git a/docs/quickstart/quickstart_chains/terra.md b/docs/quickstart/quickstart_chains/terra.md
index 006789b3672..ee093cb8948 100644
--- a/docs/quickstart/quickstart_chains/terra.md
+++ b/docs/quickstart/quickstart_chains/terra.md
@@ -33,23 +33,20 @@ type Transfer @entity {
When you make any changes to the schema file, do make sure to regenerate your types directory.
:::
-
-
+::: code-tabs
+@tab:active yarn
```shell
yarn codegen
```
-
-
-
+@tab npm
```shell
npm run-script codegen
```
-
-
+:::
You will find the generated models in the `/src/types/models` directory.
@@ -150,22 +147,20 @@ Check out our [Mappings](../../build/mapping/terra.md) documentation to get deta
Next, build your work to run your new SubQuery project. Run the build command from the project's root directory as given here:
-
-
+::: code-tabs
+@tab:active yarn
```shell
yarn build
```
-
-
+@tab npm
```shell
npm run-script build
```
-
-
+:::
::: warning Important
Whenever you make changes to your mapping functions, you must rebuild your project.
@@ -183,24 +178,22 @@ However, visit the [Running SubQuery Locally](../../run_publish/run.md) to get m
Run the following command under the project directory:
-
-
+::: code-tabs
+@tab:active yarn
```shell
yarn start:docker
```
-
-
+@tab npm
```shell
npm run-script start:docker
```
-
-
+:::
-::: info Note
+::: tip Note
It may take a few minutes to download the required images and start the various nodes and Postgres databases.
:::
diff --git a/docs/ru/README.md b/docs/ru/README.md
index f16c7bfa3cf..04ee9e01fe4 100644
--- a/docs/ru/README.md
+++ b/docs/ru/README.md
@@ -4,7 +4,7 @@
Build Faster dApps with SubQuery Academy
-
Explore and implement your own efficient custom open-source API between your decentralised data and tools to query data faster and save you time.
+
Explore and implement your own efficient custom open-source API between your decentralised data and tools to query data faster and save you time.
SubQuery now supports Polkadot, Avalanche, Cosmos, and Algorand.
@@ -12,7 +12,7 @@
Get a Kick-Start With Our Quick Start Guide
-
Build your first SubQuery project in less than 10 mins with simple guided steps.
+
Build your first SubQuery project in less than 10 mins with simple guided steps.
Start querying data for your dApps on your most loved blockchain network using our starter projects. Explore and modify important files, and understand how SubQuery works.
@@ -134,8 +134,7 @@
-
-
+
diff --git a/docs/ru/build/install.md b/docs/ru/build/install.md
index 8b98ba64c33..a27797f52fe 100644
--- a/docs/ru/build/install.md
+++ b/docs/ru/build/install.md
@@ -8,28 +8,30 @@
Установите SubQuery CLI на терминал, используя Yarn или NPM:
- ```bash npm install -g @subql/cli ```
- ```shell yarn global add @subql/cli ```
+::: code-tabs @tab npm `bash npm install -g @subql/cli `
+@tab:active yarn `shell yarn global add @subql/cli ` :::
You can then run help to see available commands and usage provided by the CLI:
```shell
subql help
```
+
## Установите @subql/node
Узел SubQuery - это реализация, которая извлекает субстратегически данные блокчейна в рамках проекта SubQuery и сохраняет их в базу данных Postgres.
Установите ноду SubQuery на терминал, используя Yarn или NPM:
- ```bash npm install -g @subql/node ```
- ```shell yarn global add @subql/node ```
+::: code-tabs @tab npm `bash npm install -g @subql/node `
+@tab:active yarn `shell yarn global add @subql/node ` :::
Once installed, you can start a node with:
```shell
subql-node
```
+
> Примечание: Если вы используете Docker или хостинг вашего проекта в проектах SubQuery вы можете пропустить этот шаг. Это происходит потому, что узел SubQuery уже находится в контейнере Docker и в инфраструктуре хостинга.
## Установите @subql/query
@@ -38,7 +40,7 @@ subql-node
Установите запрос SubQuery на терминал с помощью Yarn или NPM:
- ```bash npm install -g @subql/query ```
- ```shell yarn global add @subql/query ```
+::: code-tabs @tab npm `bash npm install -g @subql/query `
+@tab:active yarn `shell yarn global add @subql/query ` :::
-> Примечание: Если вы используете Docker или хостинг вашего проекта в проектах SubQuery вы можете пропустить этот шаг. Это происходит потому, что узел SubQuery уже находится в контейнере Docker и в инфраструктуре хостинга.
\ No newline at end of file
+> Примечание: Если вы используете Docker или хостинг вашего проекта в проектах SubQuery вы можете пропустить этот шаг. Это происходит потому, что узел SubQuery уже находится в контейнере Docker и в инфраструктуре хостинга.
diff --git a/docs/ru/build/introduction.md b/docs/ru/build/introduction.md
index 8b87fd3d3e3..85b9bed8d65 100644
--- a/docs/ru/build/introduction.md
+++ b/docs/ru/build/introduction.md
@@ -51,8 +51,8 @@ yarn codegen
Запустите команду сборки из корневого каталога проекта.
- `shell yarn build `
- `bash npm run-script build `
+::: code-tabs @tab:active yarn `shell yarn build `
+@tab npm `bash npm run-script build ` :::
### Альтернативные варианты сборки
diff --git a/docs/ru/build/manifest.md b/docs/ru/build/manifest.md
index a802869d94c..c5e60280916 100644
--- a/docs/ru/build/manifest.md
+++ b/docs/ru/build/manifest.md
@@ -4,7 +4,7 @@
Манифест может быть в формате YAML или JSON. В этом документе мы будем использовать YAML во всех примерах. Ниже приведен стандартный пример базового файла Манифест `project.yaml`.
- ``` yml specVersion: 0.2.0 name: example-project # Provide the project name version: 1.0.0 # Project version description: '' # Description of your project repository: 'https://github.com/subquery/subql-starter' # Git repository address of your project schema: file: ./schema.graphql # The location of your GraphQL schema file network: genesisHash: '0x91b171bb158e2d3848fa23a9f1c25182fb8e20313b2c1eb49219da7a70ce90c3' # Genesis hash of the network endpoint: 'wss://polkadot.api.onfinality.io/public-ws' # Optionally provide the HTTP endpoint of a full chain dictionary to speed up processing dictionary: 'https://api.subquery.network/sq/subquery/dictionary-polkadot' dataSources: - kind: substrate/Runtime startBlock: 1 # This changes your indexing start block, set this higher to skip initial blocks with less data mapping: file: "./dist/index.js" handlers: - handler: handleBlock kind: substrate/BlockHandler - handler: handleEvent kind: substrate/EventHandler filter: #Filter is optional module: balances method: Deposit - handler: handleCall kind: substrate/CallHandler ```` ``` yml specVersion: "0.0.1" description: '' # Description of your project repository: 'https://github.com/subquery/subql-starter' # Git repository address of your project schema: ./schema.graphql # The location of your GraphQL schema file network: endpoint: 'wss://polkadot.api.onfinality.io/public-ws' # Optionally provide the HTTP endpoint of a full chain dictionary to speed up processing dictionary: 'https://api.subquery.network/sq/subquery/dictionary-polkadot' dataSources: - name: main kind: substrate/Runtime startBlock: 1 # This changes your indexing start block, set this higher to skip initial blocks with less data mapping: handlers: - handler: handleBlock kind: substrate/BlockHandler - handler: handleEvent kind: substrate/EventHandler filter: #Filter is optional but suggested to speed up event processing module: balances method: Deposit - handler: handleCall kind: substrate/CallHandler ````
+::: code-tabs @tab v0.2.0 ` yml specVersion: 0.2.0 name: example-project # Provide the project name version: 1.0.0 # Project version description: '' # Description of your project repository: 'https://github.com/subquery/subql-starter' # Git repository address of your project schema: file: ./schema.graphql # The location of your GraphQL schema file network: genesisHash: '0x91b171bb158e2d3848fa23a9f1c25182fb8e20313b2c1eb49219da7a70ce90c3' # Genesis hash of the network endpoint: 'wss://polkadot.api.onfinality.io/public-ws' # Optionally provide the HTTP endpoint of a full chain dictionary to speed up processing dictionary: 'https://api.subquery.network/sq/subquery/dictionary-polkadot' dataSources: - kind: substrate/Runtime startBlock: 1 # This changes your indexing start block, set this higher to skip initial blocks with less data mapping: file: "./dist/index.js" handlers: - handler: handleBlock kind: substrate/BlockHandler - handler: handleEvent kind: substrate/EventHandler filter: #Filter is optional module: balances method: Deposit - handler: handleCall kind: substrate/CallHandler ```` @tab v0.0.1 ` yml specVersion: "0.0.1" description: '' # Description of your project repository: 'https://github.com/subquery/subql-starter' # Git repository address of your project schema: ./schema.graphql # The location of your GraphQL schema file network: endpoint: 'wss://polkadot.api.onfinality.io/public-ws' # Optionally provide the HTTP endpoint of a full chain dictionary to speed up processing dictionary: 'https://api.subquery.network/sq/subquery/dictionary-polkadot' dataSources: - name: main kind: substrate/Runtime startBlock: 1 # This changes your indexing start block, set this higher to skip initial blocks with less data mapping: handlers: - handler: handleBlock kind: substrate/BlockHandler - handler: handleEvent kind: substrate/EventHandler filter: #Filter is optional but suggested to speed up event processing module: balances method: Deposit - handler: handleCall kind: substrate/CallHandler ```` :::
## Migrating from v0.0.1 to v0.2.0
@@ -28,15 +28,15 @@ subql migrate функцию можно запустить в существую
Использование $ subql init [PROJECTNAME]
-Аргументы PROJECTNAME Задайте имя стартовому проекту
+Аргументы PROJECTNAME Задайте имя стартовому проекту
| Параметры | Описание |
-| ----------------------- | --------------------------------------------------------------------------------------------- |
+| ----------------------- | --------------------------------------------------------------------------------------------- | ---------------------------------------------------------- |
| f, --force | |
| -l, --location=location | локальная папка для создания проекта |
| --install-dependencies | также устанавливает зависимости |
| --npm | Принудительное использование NPM вместо yarn, работает только с флагом `install-dependencies` |
-| --specVersion=0.0.1 | 0.2.0 [default: 0.2.0] | Версия спецификации, которая будет использоваться проектом |
+| --specVersion=0.0.1 | 0.2.0 [default: 0.2.0] | Версия спецификации, которая будет использоваться проектом |
## Обзор
@@ -71,19 +71,19 @@ subql migrate функцию можно запустить в существую
### Спецификация источника данных
Определяет данные, которые будут отфильтрованы и извлечены, а также расположение обработчика mapping функции для применяемого преобразования данных.
-| Поле | v0.0.1 | v0.2.0 | Описание |
+| Поле | v0.0.1 | v0.2.0 | Описание |
| -------------- | --------------------------------------------------------- | -------------------------------------------------------------------------------- | -------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
-| **имя** | String | 𐄂 | Имя источника данных |
-| **вид** | [substrate/Runtime](./manifest/#data-sources-and-mapping) | substrate/Runtime, [substrate/CustomDataSource](./manifest/#custom-data-sources) | Мы поддерживаем типы данных из среды выполнения substrate по умолчанию, такие как block, event и extrinsic(call). Начиная с версии 0.2.0 мы поддерживаем данные из пользовательской среды выполнения, например смарт-контракта. |
-| **startBlock** | Integer | Integer | Это изменяет ваш начальный блок индексации, установите его выше, чтобы пропустить начальные блоки с меньшим количеством данных |
-| **mapping** | Mapping Spec | Mapping Spec | |
-| **фильтр** | [network-filters](./manifest/#network-filters) | 𐄂 | Отфильтровать источник данных для выполнения по имени спецификации конечной точки сети |
+| **имя** | String | 𐄂 | Имя источника данных |
+| **вид** | [substrate/Runtime](./manifest/#data-sources-and-mapping) | substrate/Runtime, [substrate/CustomDataSource](./manifest/#custom-data-sources) | Мы поддерживаем типы данных из среды выполнения substrate по умолчанию, такие как block, event и extrinsic(call). Начиная с версии 0.2.0 мы поддерживаем данные из пользовательской среды выполнения, например смарт-контракта. |
+| **startBlock** | Integer | Integer | Это изменяет ваш начальный блок индексации, установите его выше, чтобы пропустить начальные блоки с меньшим количеством данных |
+| **mapping** | Mapping Spec | Mapping Spec | |
+| **фильтр** | [network-filters](./manifest/#network-filters) | 𐄂 | Отфильтровать источник данных для выполнения по имени спецификации конечной точки сети |
### Mapping Spec
-| Поле | v0.0.1 | v0.2.0 | Описание |
-| ------------------------- | ------------------------------------------------------------------------------ | --------------------------------------------------------------------- | -------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
-| **файла** | String | 𐄂 | Путь к записи сопоставления |
+| Поле | v0.0.1 | v0.2.0 | Описание |
+| ------------------------- | ------------------------------------------------------------------------------ | --------------------------------------------------------------------- | ----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
+| **файла** | String | 𐄂 | Путь к записи сопоставления |
| **обработчики и фильтры** | [Обработчики и фильтры по умолчанию](./manifest/#mapping-handlers-and-filters) | Базовые обработчики и фильтры, Пользовательские обработчики и фильтры | Перечислите все [функции сопоставления](./mapping/polkadot.md) и их соответствующие типы обработчиков, с дополнительными фильтрами сопоставления. Для получения информации о пользовательских обработчиках отображения времени выполнения, пожалуйста, просмотрите раздел Пользовательские источники данных |
## Источники данных и Mapping
@@ -104,8 +104,8 @@ dataSources:
**Ваш проект SubQuery будет намного эффективнее, если вы будете использовать только обработчики событий и вызовов с соответствующими фильтрами сопоставления**
-| Обработчик | Поддерживаемый фильтр |
-| ------------------------------------------------ | --------------------- |
+| Обработчик | Поддерживаемый фильтр |
+| --------------------------------------------------------- | --------------------- |
| [Обработчик блоков](./mapping/polkadot.md#block-handler) | `спецификация версии` |
| [Обработчик событий](./mapping/polkadot.md#event-handler) | модуль,метод |
| [Обработчик вызовов](./mapping/polkadot.md#call-handler) | модуль,метод ,успех |
@@ -116,15 +116,15 @@ dataSources:
```yaml
# Пример фильтра из обработчика вызовов
-filter:
- module: balances
- method: Deposit
- success: true
+filter:
+ module: balances
+ method: Deposit
+ success: true
```
- Фильтры модулей и методов поддерживаются на любой цепи субстрата.
- Фильтр success принимает логическое значение и используется для фильтрации по статусу успеха.
-- Фильтр по спецификации определяет диапазон версии спецификации для блока субстрата. В следующих примерах описано, как выставить диапазоны версий.
+- Фильтр по спецификации определяет диапазон версии спецификации для блока субстрата. В следующих примерах описано, как выставить диапазоны версий.
```yaml
filter:
@@ -153,8 +153,8 @@ filter:
В приведенном ниже примере v0.2.0 `network.chaintypes` указывает на файл, в который включены все пользовательские типы, это стандартный файл chainspec, в котором объявляются конкретные типы, поддерживаемые этой цепочкой блоков, в формате `.json`, `.yaml` или `.js`.
- ``` yml network: genesisHash: '0x91b171bb158e2d3848fa23a9f1c25182fb8e20313b2c1eb49219da7a70ce90c3' endpoint: 'ws://host.kittychain.io/public-ws' chaintypes: file: ./types.json # The relative filepath to where custom types are stored ... ```
- ``` yml ... network: endpoint: "ws://host.kittychain.io/public-ws" types: { "KittyIndex": "u32", "Kitty": "[u8; 16]" } # typesChain: { chain: { Type5: 'example' } } # typesSpec: { spec: { Type6: 'example' } } dataSources: - name: runtime kind: substrate/Runtime startBlock: 1 filter: #Optional specName: kitty-chain mapping: handlers: - handler: handleKittyBred kind: substrate/CallHandler filter: module: kitties method: breed success: true ```
+::: code-tabs @tab v0.2.0 `yml network: genesisHash: '0x91b171bb158e2d3848fa23a9f1c25182fb8e20313b2c1eb49219da7a70ce90c3' endpoint: 'ws://host.kittychain.io/public-ws' chaintypes: file: ./types.json # The relative filepath to where custom types are stored ...`
+@tab v0.0.1 `yml ... network: endpoint: "ws://host.kittychain.io/public-ws" types: { "KittyIndex": "u32", "Kitty": "[u8; 16]" } # typesChain: { chain: { Type5: 'example' } } # typesSpec: { spec: { Type6: 'example' } } dataSources: - name: runtime kind: substrate/Runtime startBlock: 1 filter: #Optional specName: kitty-chain mapping: handlers: - handler: handleKittyBred kind: substrate/CallHandler filter: module: kitties method: breed success: true` :::
Чтобы использовать typescript для вашего файла типов цепочек, включите его в папку `src` (например, `./src/types.ts`), запустите `yarn build` и затем укажите созданный файл js, расположенный в папке `dist`.
@@ -162,7 +162,6 @@ filter:
network:
chaintypes:
file: ./dist/types.js # Будет сгенерирован после yarn run build
-...
```
На что следует обратить внимание при использовании файла типов цепочек с расширением `.ts` или `.js`:
@@ -172,9 +171,9 @@ network:
Вот пример файла типов цепочек `.ts`:
- ```ts
+::: code-tabs @tab types.ts `ts
import { typesBundleDeprecated } from "moonbeam-types-bundle"
-export default { typesBundle: typesBundleDeprecated }; ```
+export default { typesBundle: typesBundleDeprecated }; ` :::
## Пользовательские источники данных
@@ -200,6 +199,6 @@ export default { typesBundle: typesBundleDeprecated }; ``` ```yaml --- network: endpoint: 'wss://polkadot.api.onfinality.io/public-ws' #Create a template to avoid redundancy definitions: mapping: &mymapping handlers: - handler: handleBlock kind: substrate/BlockHandler dataSources: - name: polkadotRuntime kind: substrate/Runtime filter: #Optional specName: polkadot startBlock: 1000 mapping: *mymapping #use template here - name: kusamaRuntime kind: substrate/Runtime filter: specName: kusama startBlock: 12000 mapping: *mymapping # can reuse or change ```
+::: code-tabs @tab v0.0.1 `yaml --- network: endpoint: 'wss://polkadot.api.onfinality.io/public-ws' #Create a template to avoid redundancy definitions: mapping: &mymapping handlers: - handler: handleBlock kind: substrate/BlockHandler dataSources: - name: polkadotRuntime kind: substrate/Runtime filter: #Optional specName: polkadot startBlock: 1000 mapping: *mymapping #use template here - name: kusamaRuntime kind: substrate/Runtime filter: specName: kusama startBlock: 12000 mapping: *mymapping # can reuse or change `
-
+:::
diff --git a/docs/ru/build/mapping.md b/docs/ru/build/mapping.md
index 18f6754fdcf..3e7c978085a 100644
--- a/docs/ru/build/mapping.md
+++ b/docs/ru/build/mapping.md
@@ -6,20 +6,20 @@
- Эти сопоставления также экспортированы в `src/index.ts`
- Файлы сопоставлений являются ссылками в `project.yaml` под обработчиками сопоставлений.
-Существует три класса функций сопоставления; [Block handlers](#block-handler), [Event Handlers](#event-handler), и [Call Handlers](#call-handler).
+Существует три класса функций сопоставления; [Block handlers](#block-handler), [Event Handlers](#event-handler), и [Call Handlers](#call-handler).
## Обработчик блоков
Вы можете использовать обработчики блоков для сбора информации каждый раз, когда новый блок присоединяется к цепочке Substrate, например, номер блока. Для этого определенный обработчик блоков будет вызываться один раз для каждого блока.
```ts
-import {SubstrateBlock} from "@subql/types";
+import { SubstrateBlock } from "@subql/types";
export async function handleBlock(block: SubstrateBlock): Promise {
- // Create a new StarterEntity with the block hash as it's ID
- const record = new starterEntity(block.block.header.hash.toString());
- record.field1 = block.block.header.number.toNumber();
- await record.save();
+ // Create a new StarterEntity with the block hash as it's ID
+ const record = new starterEntity(block.block.header.hash.toString());
+ record.field1 = block.block.header.number.toNumber();
+ await record.save();
}
```
@@ -51,26 +51,30 @@ export async function handleEvent(event: SubstrateEvent): Promise {
```ts
export async function handleCall(extrinsic: SubstrateExtrinsic): Promise {
- const record = new starterEntity(extrinsic.block.block.header.hash.toString());
- record.field4 = extrinsic.block.timestamp;
- await record.save();
+ const record = new starterEntity(
+ extrinsic.block.block.header.hash.toString()
+ );
+ record.field4 = extrinsic.block.timestamp;
+ await record.save();
}
-
```
[SubstrateExtrinsic](https://github.com/OnFinality-io/subql/blob/a5ab06526dcffe5912206973583669c7f5b9fdc9/packages/types/src/interfaces.ts#L21) расширяет [GenericExtrinsic](https://github.com/polkadot-js/api/blob/a9c9fb5769dec7ada8612d6068cf69de04aa15ed/packages/types/src/extrinsic/Extrinsic.ts#L170). Ему присваивается `id` (блок, к которому принадлежит данное внешнее свойство) и предоставляется внешнее свойство, расширяющее события этого блока. Кроме того, он регистрирует успешный статус этой надбавки.
## Состояния запроса
+
Наша цель - охватить все источники данных для пользователей для обработчиков отображения (больше, чем просто три вышеуказанных типа событий интерфейса). Поэтому мы раскрыли некоторые интерфейсы @polkadot/api для расширения возможностей.
Это те интерфейсы, которые мы поддерживаем в настоящее время:
-- [api.query.<module>.<method>()](https://polkadot.js.org/docs/api/start/api.query) будет запрашивать current блок.
-- [api.query.<module>.<method>.multi()](https://polkadot.js.org/docs/api/start/api.query.multi/#multi-queries-same-type) сделает несколько запросов типа same в текущем блоке.
-- [api.queryMulti()](https://polkadot.js.org/docs/api/start/api.query.multi/#multi-queries-distinct-types) сделает несколько запросов разных типов в текущем блоке.
+
+- [api.query.<module>.<method>()](https://polkadot.js.org/docs/api/start/api.query) будет запрашивать **current** блок.
+- [api.query.<module>.<method>.multi()](https://polkadot.js.org/docs/api/start/api.query.multi/#multi-queries-same-type) сделает несколько запросов типа **same** в текущем блоке.
+- [api.queryMulti()](https://polkadot.js.org/docs/api/start/api.query.multi/#multi-queries-distinct-types) сделает несколько запросов **разных типов** в текущем блоке.
Это интерфейсы, которые мы **НЕ** поддерживаем сейчас:
-- ~~api.tx.*~~
-- ~~api.derive.*~~
+
+- ~~api.tx.\*~~
+- ~~api.derive.\*~~
- ~~api.query.<module>.<method>.at~~
- ~~api.query.<module>.<method>.entriesAt~~
- ~~api.query.<module>.<method>.entriesPaged~~
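As a brief, hedged illustration of the supported query interfaces listed above, a mapping function might use the globally injected `api` like this (the `staking.currentEra` storage item is only an example and assumes a Polkadot-style runtime):

```ts
// Sketch only: assumes a runtime that exposes staking.currentEra.
import { SubstrateEvent } from "@subql/types";

export async function handleEvent(event: SubstrateEvent): Promise<void> {
  // `api` and `logger` are injected by the SubQuery node; `api` is pinned to
  // the block currently being indexed, so this reads state at that height.
  const currentEra = await api.query.staking.currentEra();

  logger.info(
    `Era at block ${event.block.block.header.number.toString()}: ${currentEra.toString()}`
  );
}
```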
@@ -98,6 +102,7 @@ const b1 = ожидание api. pc.chain.getBlock(blockhash);
// Он будет использовать текущий блок по умолчанию так:
const b2 = await api.rpc.chain.getBlock();
```
+
- Для [Custom Substrate Chains](#custom-substrate-chains) RPC звонки смотрите [usage](#usage).
## Модули и Библиотеки
@@ -148,6 +153,7 @@ SubQuery можно использовать в любой цепочке на
```shell
curl -H "Content-Type: application/json" -d '{"id":"1", "jsonrpc":"2.0", "method": "state_getMetadata", "params":[]}' http://localhost:9933
```
+
или из его **веб-сокета** с помощью [`websocat`](https://github.com/vi/websocat):
```shell
@@ -161,9 +167,11 @@ echo state_getMetadata | websocat 'ws://127.0.0.1:9944' --jsonrpc
Далее скопируйте и вставьте вывод в файл JSON. В нашем [kitty example](https://github.com/subquery/subql-examples/tree/main/kitty), мы создали `api-interface/kitty.json`.
#### Определения типов
+
Мы предполагаем, что пользователь знает конкретные типы и поддержку RPC из цепочки, и она определена в [Манифесте](./manifest.md).
Следуя [установке типов](https://polkadot.js.org/docs/api/examples/promise/typegen#metadata-setup), мы создаем :
+
- `src/api-interfaces/definitions.ts` - экспортирует все определения подпапок
```ts
@@ -171,6 +179,7 @@ echo state_getMetadata | websocat 'ws://127.0.0.1:9944' --jsonrpc
```
- `src/api-interfaces/kitties/definitions.ts` - определения типа для модуля котят
+
```ts
export default {
// пользовательские типы
@@ -248,6 +257,7 @@ yarn generate:meta
### Использование
Теперь в функции отображения мы можем показать, как метаданные и типы фактически украшают API. Конечная точка RPC будет поддерживать модули и методы, которые мы объявили выше. А чтобы использовать пользовательский вызов rpc, смотрите раздел [ Пользовательские вызовы rpc цепочки](#custom-chain-rpc-calls).
+
```typescript
export async function kittyApiHandler(): Promise {
//return the KittyIndex type
@@ -265,6 +275,7 @@ export async function kittyApiHandler(): Promise {
### Пользовательские вызовы в цепочке rpc
Для поддержки индивидуальных вызовов цепочки RPC мы должны вручную ввести определения RPC для `typesBundle`, что позволяет настроить каждый конкретный вызов. Вы можете определить `typesBundle` в `project.yml`. И помните, что поддерживаются только звонки типа `isHistoric`.
+
```yaml
...
types: {
diff --git a/docs/ru/build/substrate-evm.md b/docs/ru/build/substrate-evm.md
index 9ef9152cc16..c488c4ad61a 100644
--- a/docs/ru/build/substrate-evm.md
+++ b/docs/ru/build/substrate-evm.md
@@ -51,7 +51,7 @@
### Обработчики
-В отличие от обычного обработчика, вы не получите `SubstrateExtrinsic` в качестве параметра, вместо этого вы получите `MoonbeamCall`, основанный на типе Ethers [TransactionResponse](https://docs.ethers.io/v5/api/providers/types/#providers-TransactionResponse).
+В отличие от обычного обработчика, вы не получите `SubstrateExtrinsic` в качестве параметра, вместо этого вы получите `MoonbeamCall`, основанный на типе Ethers [TransactionResponse](https://docs.ethers.io/v5/api/providers/types/#providers-TransactionResponse).
Отличия от типа `TransactionResponse` :
@@ -74,7 +74,7 @@
| ---- | ------------- | ------------------------------------------------------------------------- | ----------------------------------------------------------------------------------------------------------- |
| темы | Массив строк. | Передача (адрес проиндексирован, адрес проиндексирован, значение uint256) | Фильтр тем соответствует фильтрам журнала Ethereum JSON-PRC, дополнительную документацию можно найти здесь. |
- Примечание по темам:
+**Примечание по темам:**
Есть несколько улучшений по сравнению с базовыми фильтрами журналов:
- Темы не должны быть заполнены на 0
@@ -97,17 +97,17 @@ dataSources:
- kind: substrate/Moonbeam
startBlock: 752073
processor:
- file: './node_modules/@subql/contract-processors/dist/moonbeam.js'
+ file: "./node_modules/@subql/contract-processors/dist/moonbeam.js"
options:
# Must be a key of assets
abi: erc20
# Contract address (or recipient if transfer) to filter, if `null` should be for contract creation
- address: '0x6bd193ee6d2104f14f94e2ca6efefae561a4334b'
+ address: "0x6bd193ee6d2104f14f94e2ca6efefae561a4334b"
assets:
erc20:
- file: './erc20.abi.json'
+ file: "./erc20.abi.json"
mapping:
- file: './dist/index.js'
+ file: "./dist/index.js"
handlers:
- handler: handleMoonriverEvent
kind: substrate/MoonbeamEvent
@@ -122,7 +122,7 @@ dataSources:
# function: '0x7ff36ab500000000000000000000000000000000000000000000000000000000'
# function: approve(address,uint256)
function: approve(address to,uint256 value)
- from: '0x6bd193ee6d2104f14f94e2ca6efefae561a4334b'
+ from: "0x6bd193ee6d2104f14f94e2ca6efefae561a4334b"
```
## Известные ограничения
diff --git a/docs/ru/faqs/faqs.md b/docs/ru/faqs/faqs.md
index 3e2feb0bc38..ce62c56f588 100644
--- a/docs/ru/faqs/faqs.md
+++ b/docs/ru/faqs/faqs.md
@@ -4,7 +4,7 @@
SubQuery , это индексатор данных блокчейна с открытым исходным кодом для разработчиков, который предоставляет быстрые, гибкие, надежные и децентрализованные API для управления ведущими многоканальными приложениями.
-Наша цель - это сэкономить время и деньги разработчиков, устранив необходимость разработки собственного решения для индексации. Сейчас они могут полностью сосредоточиться на разработке своих приложений. SubQuery - помогает разработчикам создавать децентрализованные продукты будущего.
+Наша цель - это сэкономить время и деньги разработчиков, устранив необходимость разработки собственного решения для индексации. Сейчас они могут полностью сосредоточиться на разработке своих приложений. SubQuery - помогает разработчикам создавать децентрализованные продукты будущего.
+Перерахуйте всі [mapping functions](./mapping/polkadot.md) та відповідні типи обробників за допомогою додаткових фільтрів відображення. Для користувацьких обробників відображення часу виконання перегляньте [Спеціальні джерела даних](#custom-data-sources)
## Джерела даних та картографування
@@ -101,8 +101,8 @@ dataSources:
**Ваш проект SubQuery буде набагато ефективнішим, якщо ви використовуєте лише обробники подій і викликів із відповідними фільтрами зіставлення**
-| Обробник | Підтримуваний фільтр |
-| ------------------------------------------------- | ------------------------- |
+| Обробник | Підтримуваний фільтр |
+| ---------------------------------------------------------- | ------------------------- |
| [Обробник блокування](./mapping/polkadot.md#block-handler) | `версія специфікації` |
| [Подіяльний обробник](./mapping/polkadot.md#event-handler) | `модуль`,`метод` |
| [Обробник дзвінків](./mapping/polkadot.md#call-handler) | `модуль`,`метод` ,`успіx` |
@@ -148,11 +148,11 @@ filter:
Ми підтримуємо додаткові типи, які використовуються модулями середовища виконання, `typesAlias`, `typesBundle`, `typesChain` і `typesSpec` також підтримуються .
-У наведеному нижче прикладі версії 0.2.0 `network.chaintypes` вказує на файл, який містить усі користувацькі типи. Це стандартний файл специфікації ланцюга, який оголошує конкретні типи, які підтримує цей блокчейн у < 0>.json, `.yaml` або `.js`.
+У наведеному нижче прикладі версії 0.2.0 `network.chaintypes` вказує на файл, який містить усі користувацькі типи. Це стандартний файл специфікації ланцюга, який оголошує конкретні типи, які підтримує цей блокчейн у `.json`, `.yaml` або `.js`.
- `yml network: genesisHash: '0x91b171bb158e2d3848fa23a9f1c25182fb8e20313b2c1eb49219da7a70ce90c3' endpoint: 'ws://host.kittychain.io/public-ws' chaintypes: file: ./types.json # The relative filepath to where custom types are stored ...`
+::: code-tabs @tab v0.2.0 `yml network: genesisHash: '0x91b171bb158e2d3848fa23a9f1c25182fb8e20313b2c1eb49219da7a70ce90c3' endpoint: 'ws://host.kittychain.io/public-ws' chaintypes: file: ./types.json # The relative filepath to where custom types are stored ...`
- `yml ... network: endpoint: "ws://host.kittychain.io/public-ws" types: { "KittyIndex": "u32", "Kitty": "[u8; 16]" } # typesChain: { chain: { Type5: 'example' } } # typesSpec: { spec: { Type6: 'example' } } dataSources: - name: runtime kind: substrate/Runtime startBlock: 1 filter: #Optional specName: kitty-chain mapping: handlers: - handler: handleKittyBred kind: substrate/CallHandler filter: module: kitties method: breed success: true`
+@tab v0.0.1 `yml ... network: endpoint: "ws://host.kittychain.io/public-ws" types: { "KittyIndex": "u32", "Kitty": "[u8; 16]" } # typesChain: { chain: { Type5: 'example' } } # typesSpec: { spec: { Type6: 'example' } } dataSources: - name: runtime kind: substrate/Runtime startBlock: 1 filter: #Optional specName: kitty-chain mapping: handlers: - handler: handleKittyBred kind: substrate/CallHandler filter: module: kitties method: breed success: true` :::
To use typescript for your chain types file include it in the `src` folder (e.g. `./src/types.ts`), run `yarn build` and then point to the generated js file located in the `dist` folder.
@@ -169,7 +169,7 @@ network:
Ось приклад файлу типів ланцюга `.ts`:
- `тс імпортувати {typeBundleDeprecated } з "moonbeam-types-bundle" експорт за замовчуванням {typeBundle: typesBundleDeprecated }; `
+::: code-tabs @tab types.ts `ts import { typesBundleDeprecated } from "moonbeam-types-bundle" export default { typesBundle: typesBundleDeprecated }; ` :::
## Спеціальні джерела даних
@@ -195,6 +195,6 @@ network:
Нижче наведено приклад, який показує різні джерела даних для мереж Polkadot і Kusama.
- `yaml --- network: endpoint: 'wss://polkadot.api.onfinality.io/public-ws' #Create a template to avoid redundancy definitions: mapping: &mymapping handlers: - handler: handleBlock kind: substrate/BlockHandler dataSources: - name: polkadotRuntime kind: substrate/Runtime filter: #Optional specName: polkadot startBlock: 1000 mapping: *mymapping #use template here - name: kusamaRuntime kind: substrate/Runtime filter: specName: kusama startBlock: 12000 mapping: *mymapping # can reuse or change `
+::: code-tabs @tab v0.0.1 `yaml --- network: endpoint: 'wss://polkadot.api.onfinality.io/public-ws' #Create a template to avoid redundancy definitions: mapping: &mymapping handlers: - handler: handleBlock kind: substrate/BlockHandler dataSources: - name: polkadotRuntime kind: substrate/Runtime filter: #Optional specName: polkadot startBlock: 1000 mapping: *mymapping #use template here - name: kusamaRuntime kind: substrate/Runtime filter: specName: kusama startBlock: 12000 mapping: *mymapping # can reuse or change `
-
+:::
diff --git a/docs/uk/build/mapping.md b/docs/uk/build/mapping.md
index ef83d7eff54..f6a950d5ff9 100644
--- a/docs/uk/build/mapping.md
+++ b/docs/uk/build/mapping.md
@@ -67,9 +67,9 @@ export async function handleCall(extrinsic: SubstrateExtrinsic): Promise {
Це інтерфейси, які ми зараз підтримуємо:
-- [api.query.<module>.<method>()](https://polkadot.js.org/docs/api/start/api.query) буде запитувати поточний блок.
-- [api.query.<module>.<method>.multi()](https://polkadot.js.org/docs/api/start/api.query.multi/#multi-queries-same-type) зробить декілька запитів того ж типу в поточному блоці.
-- [api.query.<module>.<method>.multi()](https://polkadot.js.org/docs/api/start/api.query.multi/#multi-queries-same-type) зробить декілька запитів того ж типу в поточному блоці.
+- [api.query.<module>.<method>()](https://polkadot.js.org/docs/api/start/api.query) буде запитувати **поточний** блок.
+- [api.query.<module>.<method>.multi()](https://polkadot.js.org/docs/api/start/api.query.multi/#multi-queries-same-type) зробить декілька запитів **того ж** типу в поточному блоці.
+- [api.queryMulti()](https://polkadot.js.org/docs/api/start/api.query.multi/#multi-queries-distinct-types) зробить декілька запитів **різних типів** в поточному блоці.
Це інтерфейси, які ми робимо **НЕ** підтримуємо:
diff --git a/docs/uk/build/substrate-evm.md b/docs/uk/build/substrate-evm.md
index 87947b03d2c..b63fc9cadc9 100644
--- a/docs/uk/build/substrate-evm.md
+++ b/docs/uk/build/substrate-evm.md
@@ -30,7 +30,7 @@
| поле | Тип | Обов’язково | Описання |
| ------- | ---------------- | ----------- | ------------------------------------------------------------------------------------------------------------ |
-| abi | String | Ні | Процесор ABI використовується для аналізу аргументів. ПОВИНЕН бути ключем ` assets ` |
+| abi | String | Ні | Процесор ABI використовується для аналізу аргументів. ПОВИНЕН бути ключем `assets` |
| Address | String or `null` | Ні | Адреса контракту, з якої здійснюється подія або дзвінок. `null` буде захоплювати виклики створення контракту |
## Виклик Moonbeam
@@ -44,20 +44,20 @@
### Фільтр викликів
-| поле | Тип | Приклади | Описання |
-| ------- | ------ | --------------------------------------------------- | -------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
-| Функція | String | 0x095ea7b3, затвердити (адресу на,uint256 значення) | Або [ Function Signature ](https://docs.ethers.io/v5/api/utils/abi/fragments/#FunctionFragment), або функція ` sighash ` для фільтрації функції, що викликається договором |
-| від | String | 0x6bd193ee6d2104f14f94e2ca6efefae561a4334b | Ethereum адреса, яка надіслала транзакцію |
+| поле | Тип | Приклади | Описання |
+| ------- | ------ | --------------------------------------------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------ |
+| Функція | String | 0x095ea7b3, затвердити (адресу на,uint256 значення) | Або [ Function Signature ](https://docs.ethers.io/v5/api/utils/abi/fragments/#FunctionFragment), або функція `sighash` для фільтрації функції, що викликається договором |
+| від | String | 0x6bd193ee6d2104f14f94e2ca6efefae561a4334b | Ethereum адреса, яка надіслала транзакцію |
### Обробники
-На відміну від звичайного обробника, ви не отримаєте параметр ` SubstrateExtrinsic `, натомість ви отримаєте тип ` MoonbeamCall `, який базується на Ethers [ TransactionResponse ](https://docs.ethers.io/v5/api/providers/types/#providers-TransactionResponse).
+На відміну від звичайного обробника, ви не отримаєте параметр `SubstrateExtrinsic`, натомість ви отримаєте тип `MoonbeamCall`, який базується на Ethers [ TransactionResponse ](https://docs.ethers.io/v5/api/providers/types/#providers-TransactionResponse).
Зміни від `типу TransactionResponse`:
-- Він не має ` wait ` та ` confirmations ` властивостей
-- Властивість ` success ` додається щоб знати, якщо транзакція була успішною
-- ` args ` додається, якщо поле ` abi ` надано і аргументи можна успішно проаналізувати
+- Він не має `wait` та `confirmations` властивостей
+- Властивість `success` додається щоб знати, якщо транзакція була успішною
+- `args` додається, якщо поле `abi` надано і аргументи можна успішно проаналізувати
## Подія Moonbeam
@@ -74,7 +74,8 @@
| ---- | ------------- | -------------------------------------------------------------------- | -------------------------------------------------------------------------------------------------------------------------------------------------- |
| теми | Масиви рядків | Transfer(адреса, індексована адреса, індексована в,uint256 значення) | Фільтр тем слідкує за фільтрами журналів Ethereum JSON-PRC, більше документації можна знайти [ here ](https://docs.ethers.io/v5/concepts/events/). |
- Примітка до тем:
+**Примітка до тем:**
+
Є кілька вдосконалень від основних фільтрів журналів:
- Теми не потрібно 0 підкладати
@@ -82,11 +83,11 @@
### Обробники
-На відміну від звичайного обробника, ви не отримаєте параметр ` SubstrateEvent `, натомість ви отримаєте ` MoonbeamEvent `, який базується на Ethers [ Log ](https://docs.ethers.io/v5/api/providers/types/#providers-Log).
+На відміну від звичайного обробника, ви не отримаєте параметр `SubstrateEvent`, натомість ви отримаєте `MoonbeamEvent`, який базується на Ethers [ Log ](https://docs.ethers.io/v5/api/providers/types/#providers-Log).
-Зміни з ` Log `:
+Зміни з `Log`:
-- ` args ` додається, якщо поле ` abi ` надано і аргументи можна успішно проаналізувати
+- `args` додається, якщо поле `abi` надано і аргументи можна успішно проаналізувати
## Приклад джерела даних
@@ -97,17 +98,17 @@ dataSources:
- kind: substrate/Moonbeam
startBlock: 752073
processor:
- file: './node_modules/@subql/contract-processors/dist/moonbeam.js'
+ file: "./node_modules/@subql/contract-processors/dist/moonbeam.js"
options:
# Must be a key of assets
abi: erc20
# Contract address (or recipient if transfer) to filter, if `null` should be for contract creation
- address: '0x6bd193ee6d2104f14f94e2ca6efefae561a4334b'
+ address: "0x6bd193ee6d2104f14f94e2ca6efefae561a4334b"
assets:
erc20:
- file: './erc20.abi.json'
+ file: "./erc20.abi.json"
mapping:
- file: './dist/index.js'
+ file: "./dist/index.js"
handlers:
- handler: handleMoonriverEvent
kind: substrate/MoonbeamEvent
@@ -122,11 +123,11 @@ dataSources:
# function: '0x7ff36ab500000000000000000000000000000000000000000000000000000000'
# function: approve(address,uint256)
function: approve(address to,uint256 value)
- from: '0x6bd193ee6d2104f14f94e2ca6efefae561a4334b'
+ from: "0x6bd193ee6d2104f14f94e2ca6efefae561a4334b"
```
## Відомі обмеження
- Наразі немає можливості запитувати стан EVM у обробнику
- Немає можливості отримати квитанції про транзакції з обробниками дзвінків
-- Властивості ` blockHash ` в даний час залишаються невизначеними, натомість можна використовувати властивість ` blockNumber `
+- Властивості `blockHash` в даний час залишаються невизначеними, натомість можна використовувати властивість `blockNumber`
diff --git a/docs/uk/faqs/faqs.md b/docs/uk/faqs/faqs.md
index 36cc664224e..67afd645347 100644
--- a/docs/uk/faqs/faqs.md
+++ b/docs/uk/faqs/faqs.md
@@ -16,7 +16,7 @@ SubQuery також надає безплатний хостинг проєкт
**Про SubQuery Network**
-SubQuery Network дозволяє розробникам повністю децентралізувати свій стек інфраструктури. Це відкритий, продуктивний, надійний і масштабований сервіс передачі даних для додатків. SubQuery Network індексує і надає дані світовій спільноті стимульованим і піддається перевірці способом. Після публікації вашого проєкт в SubQuery Network будь-який охочий може проіндексувати й розмістити його, надаючи дані користувачам по всьому світу швидше й надійніше.
+SubQuery Network дозволяє розробникам повністю децентралізувати свій стек інфраструктури. Це відкритий, продуктивний, надійний і масштабований сервіс передачі даних для додатків. SubQuery Network індексує і надає дані світовій спільноті стимульованим і піддається перевірці способом. Після публікації вашого проєкт в SubQuery Network будь-який охочий може проіндексувати й розмістити його, надаючи дані користувачам по всьому світу швидше й надійніше.
Більш детальна інформація [тут](/subquery_network/introduction.md).
@@ -26,7 +26,7 @@ SubQuery Network дозволяє розробникам повністю дец
## Як я можу внести або надати відгуки на SubQuery?
-Ми любимо внески та відгуки громади. Щоб внести свій внесок в код, розгалузите цікавить Вас репозиторій і внесіть свої зміни. Сформуйте PR або Pull Request. Не забудьте також протестувати. Also check out our contributions guidelines.
+Ми любимо внески та відгуки громади. Щоб внести свій внесок в код, розгалузите цікавить Вас репозиторій і внесіть свої зміни. Сформуйте PR або Pull Request. Не забудьте також протестувати. Also check out our [contributions guidelines](../miscellaneous/contributing.html).
Щоб залишити відгук, зв'яжіться з нами за адресою hello@subquery.network або перейдіть на наш канал [discord](https://discord.com/invite/78zg8aBSMG).
@@ -76,7 +76,6 @@ subql-node -f . --force-clean --subquery-name=
Зверніть увагу, що рекомендується використовувати `--force-clean` при зміні `startBlock` в рамках маніфесту проєкту (`project.yaml`) для того, щоб почати пере індексацію з налаштованого блоку. Якщо `startBlock` змінюється без `--force-clean` проєкту, то індекси продовжать індексування за допомогою раніше налаштованих `startBlock`.
-
## Як я можу оптимізувати свій проєкт, щоб прискорити його?
Продуктивність є вирішальним фактором у кожному проєкті. На щастя, є кілька речей, які ви могли б зробити, щоб її покращити. Ось список деяких пропозицій:
@@ -89,13 +88,13 @@ subql-node -f . --force-clean --subquery-name=
- Set the start block to when the contract was initialised.
- Always use a [dictionary](../tutorials_examples/dictionary.html#how-does-a-subquery-dictionary-work) (we can help create one for your new network).
- Optimise your schema design, keep it as simple as possible.
- - Try to reduce unnecessary fields and columns.
- - Create indexes as needed.
+ - Try to reduce unnecessary fields and columns.
+ - Create indexes as needed.
- Use parallel/batch processing as often as possible.
- - Use `api.queryMulti()` to optimise Polkadot API calls inside mapping functions and query them in parallel. This is a faster way than a loop.
- - Use `Promise.all()`. In case of multiple async functions, it is better to execute them and resolve in parallel.
- - If you want to create a lot of entities within a single handler, you can use `store.bulkCreate(entityName: string, entities: Entity[])`. You can create them in parallel, no need to do this one by one.
+ - Use `api.queryMulti()` to optimise Polkadot API calls inside mapping functions and query them in parallel. This is a faster way than a loop.
+ - Use `Promise.all()`. In case of multiple async functions, it is better to execute them and resolve in parallel.
+  - If you want to create a lot of entities within a single handler, you can use `store.bulkCreate(entityName: string, entities: Entity[])`. You can create them in parallel, no need to do this one by one (a hedged sketch follows this list).
- Making API calls to query state can be slow. You could try to minimise calls where possible and to use `extrinsic/transaction/event` data.
- Use `worker threads` to move block fetching and block processing into its own worker thread. It could speed up indexing by up to 4 times (depending on the particular project). You can easily enable it using the `-workers=` flag. Зауважте, що кількість доступних ядер ЦП суворо обмежує використання робочих потоків. For now, it is only available for Substrate and Cosmos and will soon be integrated for Avalanche.
- Note that `JSON.stringify` doesn’t support native `BigInts`. Our logging library will do this internally if you attempt to log an object. We are looking at a workaround for this.
-- Use a convenient `modulo` filter to run a handler only once to a specific block. This filter allows handling any given number of blocks, which is extremely useful for grouping and calculating data at a set interval. For instance, if modulo is set to 50, the block handler will run on every 50 blocks. It provides even more control over indexing data to developers and can be implemented like so below in your project manifest.
\ No newline at end of file
+- Use a convenient `modulo` filter to run a handler only once to a specific block. This filter allows handling any given number of blocks, which is extremely useful for grouping and calculating data at a set interval. For instance, if modulo is set to 50, the block handler will run on every 50 blocks. It provides even more control over indexing data to developers and can be implemented like so below in your project manifest.
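To make the parallel-processing tips above concrete, here is a hedged sketch of a helper that a handler could call, combining `Promise.all()` with `store.bulkCreate()`. The `Transfer` entity name and its fields are placeholders, not part of any existing starter project.

```ts
// Hedged sketch: "Transfer" and its fields are placeholder names; the point
// is building records in parallel and writing them with one bulk call.
import { SubstrateEvent } from "@subql/types";

export async function saveTransfers(events: SubstrateEvent[]): Promise<void> {
  const records = await Promise.all(
    events.map(async (event, i) => {
      // Any per-event async work (for example an api.query call) can run here
      // in parallel instead of inside a sequential loop.
      const [from, to, amount] = event.event.data;
      return {
        id: `${event.block.block.header.number.toString()}-${i}`,
        from: from.toString(),
        to: to.toString(),
        amount: amount.toString(),
      };
    })
  );

  // One bulk write instead of saving each entity individually.
  await store.bulkCreate("Transfer", records);
}
```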
diff --git a/docs/uk/quickstart/helloworld-localhost.md b/docs/uk/quickstart/helloworld-localhost.md
index 942ae3679e2..0323961c95f 100644
--- a/docs/uk/quickstart/helloworld-localhost.md
+++ b/docs/uk/quickstart/helloworld-localhost.md
@@ -59,7 +59,7 @@ My docker version is: Docker version 20.10.5, build 55c4c88
## 1. Ініціалізація проекту
-Перший крок при запуску з SubQuery - це запуск команди ` subql init `. Давайте ініціалізуємо стартовий проект із назвою ` subqlHelloWorld `. Зауважте, що обов'язковий лише автор. Все інше залишається порожнім внизу.
+Перший крок при запуску з SubQuery - це запуск команди `subql init`. Давайте ініціалізуємо стартовий проект із назвою `subqlHelloWorld`. Зауважте, що обов'язковий лише автор. Все інше залишається порожнім внизу.
```shell
> subql init subqlHelloWorld
@@ -88,8 +88,8 @@ cd subqlHelloWorld
Тепер зробіть встановлення пряжі або вузла, щоб встановити різні залежності.
- ```shell yarn install ```
- ```bash npm install ```
+::: code-tabs @tab:active yarn `shell yarn install `
+@tab npm `bash npm install ` :::
An example of `yarn install`
@@ -107,10 +107,10 @@ success Saved lockfile.
## 3. Створити код
-Тепер запустіть ` yarn codegen `, щоб генерувати Typescript зі схеми GraphQL.
+Тепер запустіть `yarn codegen`, щоб генерувати Typescript зі схеми GraphQL.
- ```shell yarn codegen ```
- ```bash npm run-script codegen ```
+::: code-tabs @tab:active yarn `shell yarn codegen `
+@tab npm `bash npm run-script codegen ` :::
An example of `yarn codegen`
@@ -127,14 +127,14 @@ $ ./node_modules/.bin/subql codegen
✨ Done in 1.02s.
```
-** Попередження ** Коли в файл схеми вносяться зміни, будь ласка, не забудьте повторно запустити ` yarn codegen ` для відновлення каталогу типів.
+**Попередження** Коли в файл схеми вносяться зміни, будь ласка, не забудьте повторно запустити `yarn codegen` для відновлення каталогу типів.
## 4. Створіть код
-Наступний крок - побудувати код із ` yarn build `.
+Наступний крок - побудувати код із `yarn build`.
- ```shell yarn build ```
- ```bash npm run-script build ```
+::: code-tabs @tab:active yarn `shell yarn build `
+@tab npm `bash npm run-script build ` :::
An example of `yarn build`
diff --git a/docs/uk/quickstart/quickstart-avalanche.md b/docs/uk/quickstart/quickstart-avalanche.md
index 15af55a872b..48d5c3668d1 100644
--- a/docs/uk/quickstart/quickstart-avalanche.md
+++ b/docs/uk/quickstart/quickstart-avalanche.md
@@ -59,8 +59,8 @@ subql init
Нарешті, у каталозі проекту виконайте наступну команду, щоб встановити залежності нового проекту.
- ``` shell компакт-диск PROJECT_NAME установка yarn ```
- ``` shell компакт-диск PROJECT_NAME npm встановити ```
+::: code-tabs @tab:active yarn `shell cd PROJECT_NAME yarn install`
+@tab npm `shell cd PROJECT_NAME npm install` :::
## Внесення змін до проекту
@@ -68,7 +68,7 @@ subql init
1. Схема GraphQL в `schema.graphql`
2. Маніфест проекту в `project.yaml`
-3. Картографування функціонує в каталозі ` src / mappings / `
+3. Картографування функціонує в каталозі `src/mappings/`
Метою цього короткого посібника є адаптація стандартного стартового проекту для індексації всіх журналів транзакцій Pangolin `Approve`.
@@ -92,8 +92,8 @@ type PangolinApproval @entity {
**Важливо: коли ви вносите будь-які зміни до файлу схеми, переконайтеся, що ви повторно створили каталог типів. Зробіть це зараз.**
- ``` shell кодоген пряжі ```
- ``` shell npm run-script codegen ```
+::: code-tabs @tab:active yarn `shell yarn codegen`
+@tab npm `shell npm run-script codegen` :::
Згенеровані моделі можна знайти в каталозі `/src/types/models`. Щоб отримати додаткові відомості про файл `schema.graphql`, перегляньте нашу документацію в розділі [Build/GraphQL Schema](../build/graphql.md)
@@ -136,7 +136,7 @@ dataSources:
Перейдіть до функції відображення за замовчуванням у каталозі `src/mappings`. Ви побачите три експортовані функції: `handleBlock`, `handleLog` і `handleTransaction`. Ви можете видалити як функції `handleBlock`, так і `handleTransaction`, ми маємо справу лише з функцією `handleLog`.
-Функція `handleLog` отримує дані про події щоразу, коли подія відповідає фільтрам, які ми вказали раніше в нашому `project.yaml`. Ми збираємося оновити його, щоб обробити всі журнали транзакцій ` approval ` та зберегти їх у сутності GraphQL, які ми створили раніше.
+Функція `handleLog` отримує дані про події щоразу, коли подія відповідає фільтрам, які ми вказали раніше в нашому `project.yaml`. Ми збираємося оновити його, щоб обробити всі журнали транзакцій `approval` та зберегти їх у сутності GraphQL, які ми створили раніше.
Ви можете оновити функцію `handleLog` до наступного (зверніть увагу на додатковий імпорт):
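As a rough orientation only, a handler of that shape might look like the sketch below; the exact properties available on `AvalancheLog` and the fields of the generated `PangolinApproval` entity are assumptions and should be checked against your generated types.

```ts
import { PangolinApproval } from "../types"; // generated from the schema above
import { AvalancheLog } from "@subql/types-avalanche";

export async function handleLog(event: AvalancheLog): Promise<void> {
  // Key the entity by block hash and log index (assumed convention).
  const approval = new PangolinApproval(`${event.blockHash}-${event.logIndex}`);
  approval.transactionHash = event.transactionHash; // assumed entity field
  approval.blockNumber = BigInt(event.blockNumber); // assumed entity field
  await approval.save();
}
```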
@@ -169,7 +169,7 @@ export async function handleLog(event: AvalancheLog): Promise {
Щоб запустити ваш новий проект SubQuery, нам спочатку потрібно створити нашу роботу. Запустіть команду збірки з кореневого каталогу проекту.
- ``` shell побудова yarn ``` ``` shell npm run-script build ```
+::: code-tabs @tab:active yarn `shell yarn build` @tab npm `shell npm run-script build` :::
**Важливо: щоразу, коли ви вносите зміни у свої функції відображення, вам потрібно буде перебудувати свій проект**
@@ -183,13 +183,9 @@ export async function handleLog(event: AvalancheLog): Promise {
У каталозі проекту виконайте таку команду:
- ``` shell початок yarn:docker ``` ``` shell npm run-script start:docker ```
-
-Завантаження необхідних пакетів може зайняти деякий час ([`@subql/node`](https://www.npmjs.com/package/@subql/node),
-
-`@subql/query`7 > і Postgres) вперше, але незабаром ви побачите запущений вузол SubQuery. Будьте терплячі тут.
-
+::: code-tabs @tab:active yarn `shell yarn start:docker` @tab npm `shell npm run-script start:docker` :::
+Завантаження необхідних пакетів може зайняти деякий час ([`@subql/node`](https://www.npmjs.com/package/@subql/node), `@subql/query` і Postgres) вперше, але незабаром ви побачите запущений вузол SubQuery. Будьте терплячі тут.
### Запитуйте свій проект
@@ -199,8 +195,6 @@ export async function handleLog(event: AvalancheLog): Promise {
Для нового початкового проекту SubQuery ви можете спробувати такий запит, щоб зрозуміти, як він працює, або [дізнатися більше про мову Query GraphQL](../run_publish/graphql.md).
-
-
```graphql
query {
pangolinApprovals(first: 5) {
@@ -217,17 +211,12 @@ query {
}
```
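If you would rather query the running project from code than from the playground, a plain HTTP POST to the GraphQL endpoint works as well. The sketch below assumes the default local endpoint exposed by the Docker setup (`http://localhost:3000`); adjust the URL if your `docker-compose.yml` maps a different port.

```ts
// Minimal GraphQL-over-HTTP query (Node 18+ ships a global fetch).
const query = `{
  pangolinApprovals(first: 5) {
    nodes { id }
  }
}`;

async function main(): Promise<void> {
  const res = await fetch("http://localhost:3000", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ query }),
  });
  console.log(JSON.stringify(await res.json(), null, 2));
}

main();
```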
-
-
-
### Опублікуйте проект SubQuery
SubQuery надає безкоштовну керовану службу, коли ви можете розгорнути свій новий проект. Ви можете розгорнути його в [SubQuery Projects](https://project.subquery.network) і зробити запит за допомогою нашого [ Explorer](https://explorer.subquery.network).
[Прочитайте посібник, щоб опублікувати свій новий проект у SubQuery Projects](../run_publish/publish.md), **Зверніть увагу, що ви повинні розгорнути через IPFS**.
-
-
## Наступні кроки
Вітаємо, тепер у вас є локально запущений проект SubQuery, який приймає запити GraphQL API для передачі даних з bLuna.
diff --git a/docs/uk/quickstart/quickstart-cosmos.md b/docs/uk/quickstart/quickstart-cosmos.md
index 204dc8bc653..783ceaac015 100644
--- a/docs/uk/quickstart/quickstart-cosmos.md
+++ b/docs/uk/quickstart/quickstart-cosmos.md
@@ -44,8 +44,8 @@ Cosmos is not yet supported in SubQuery's CLI (`subql`), to start with Juno clon
Нарешті, у каталозі проекту виконайте наступну команду, щоб встановити залежності нового проекту.
- ``` оболонка компакт-диск PROJECT_NAME установка yarn ```
- ``` оболонка компакт-диск PROJECT_NAME npm встановити ```
+::: code-tabs @tab:active yarn `shell cd PROJECT_NAME yarn install`
+@tab npm `shell cd PROJECT_NAME npm install` :::
## Внесення змін до проекту
@@ -53,7 +53,7 @@ Cosmos is not yet supported in SubQuery's CLI (`subql`), to start with Juno clon
1. Схема GraphQL в `schema.graphql`
2. Маніфест проекту в `project.yaml`
-3. Картографування функціонує в каталозі ` src / mappings / `
+3. Функції відображення (mapping) у каталозі `src/mappings/`
Метою цього короткого посібника є адаптація стандартного стартового проекту, щоб почати індексацію всіх переказів із смарт-контракту bLuna.
@@ -75,8 +75,8 @@ type Vote @entity {
**Важливо: коли ви вносите будь-які зміни до файлу схеми, переконайтеся, що ви повторно створили каталог типів. Зробіть це зараз.**
- ``` оболонка кодоген yarn ```
- ``` оболонка npm run-script codegen ```
+::: code-tabs @tab:active yarn `shell yarn codegen`
+@tab npm `shell npm run-script codegen` :::
Згенеровані моделі можна знайти в каталозі `/src/types/models`. Щоб отримати додаткові відомості про файл `schema.graphql`, перегляньте нашу документацію в розділі [Build/GraphQL Schema](../build/graphql.md)
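Once the types for the `Vote` entity have been generated, a governance-vote message handler could look roughly like the sketch below. Treat the details as assumptions: the key format, the `decodedMsg` property names, and the entity fields all depend on the message type and the schema you defined.

```ts
import { Vote } from "../types"; // generated from the Vote entity above
import { CosmosMessage } from "@subql/types-cosmos";

export async function handleMessage(msg: CosmosMessage): Promise<void> {
  // Key the entity by transaction hash and message index (assumed convention).
  const vote = new Vote(`${msg.tx.hash}-${msg.idx}`);
  vote.blockHeight = BigInt(msg.block.block.header.height); // assumed entity field
  // The decoded payload carries the voter address; the property name is an assumption.
  vote.voter = msg.msg.decodedMsg["voter"];
  await vote.save();
}
```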
@@ -145,7 +145,7 @@ What this is doing is receiving a CosmosMessage which includes message data on t
Щоб запустити ваш новий проект SubQuery, нам спочатку потрібно створити нашу роботу. Запустіть команду збірки з кореневого каталогу проекту.
- ``` оболонка побудова yarn ``` ``` оболонка npm run-script build ```
+::: code-tabs @tab:active yarn `shell yarn build` @tab npm `shell npm run-script build` :::
**Важливо: щоразу, коли ви вносите зміни у свої функції відображення, вам потрібно буде перебудувати свій проект**
@@ -159,13 +159,9 @@ What this is doing is receiving a CosmosMessage which includes message data on t
У каталозі проекту виконайте таку команду:
- ``` оболонка початок yarn ``` ``` оболонка npm run-script start:docker ```
-
-Завантаження необхідних пакетів може зайняти деякий час ([`@subql/node`](https://www.npmjs.com/package/@subql/node),
-
-`@subql/query`7 > і Postgres) вперше, але незабаром ви побачите запущений вузол SubQuery. Будьте терплячі тут.
-
+::: code-tabs @tab:active yarn `shell yarn start:docker` @tab npm `shell npm run-script start:docker` :::
+Завантаження необхідних пакетів може зайняти деякий час ([`@subql/node`](https://www.npmjs.com/package/@subql/node), `@subql/query` і Postgres) вперше, але незабаром ви побачите запущений вузол SubQuery. Будьте терплячі тут.
### Запитуйте свій проект
@@ -175,14 +171,11 @@ What this is doing is receiving a CosmosMessage which includes message data on t
Для нового початкового проекту SubQuery ви можете спробувати такий запит, щоб зрозуміти, як він працює, або [дізнатися більше про мову Query GraphQL](../run_publish/graphql.md).
-
-
```graphql
query {
- votes(
+ votes(
first: 5
- orderBy: BLOCK_HEIGHT_DESC
- # filter: {proposalID: {equalTo: "4"}}
+ orderBy: BLOCK_HEIGHT_DESC # filter: {proposalID: {equalTo: "4"}}
) {
nodes {
id
@@ -194,19 +187,14 @@ query {
}
```
-
You can see the final code of this project here at [https://github.com/jamesbayly/juno-terra-developer-fund-votes](https://github.com/jamesbayly/juno-terra-developer-fund-votes)
-
-
### Опублікуйте проект SubQuery
SubQuery надає безкоштовну керовану службу, коли ви можете розгорнути свій новий проект. Ви можете розгорнути його в [SubQuery Projects](https://project.subquery.network) і зробити запит за допомогою нашого [ Explorer](https://explorer.subquery.network).
[Прочитайте посібник, щоб опублікувати свій новий проект у SubQuery Projects](../run_publish/publish.md)
-
-
## Наступні кроки
Вітаємо, тепер у вас є локально запущений проект SubQuery, який приймає запити GraphQL API для передачі даних з bLuna.
diff --git a/docs/uk/quickstart/quickstart-polkadot.md b/docs/uk/quickstart/quickstart-polkadot.md
index d99eb073b59..fc2abfa313c 100644
--- a/docs/uk/quickstart/quickstart-polkadot.md
+++ b/docs/uk/quickstart/quickstart-polkadot.md
@@ -43,10 +43,10 @@ subql init
Під час ініціалізації проекту SubQuery вам зададуть певні запитання:
- Назва проєкту: Найменування для вашого проєкту SubQuery
-- Сімейство мереж: Сімейство блокчейну рівня 1, для індексації якого буде розроблено цей проєкт SubQuery. Використовуйте клавіші зі стрілками, щоб вибрати з доступних параметрів. Для цього посібника ми будемо використовувати *"Substrate"*
-- Network: конкретна мережа, для індексації якої буде розроблено цей проєкт SubQuery. Використовуйте клавіші зі стрілками, щоб вибрати з доступних параметрів. Для цього посібника ми будемо використовувати *"Polkadot"*
-- Шаблонний проєкт: виберіть проєкт шаблону SubQuery, який стане відправною точкою для початку розробки. Ми пропонуємо вибрати проєкт *"subql-starter"*.
-- Кінцева точка RPC: надайте URL-адресу HTTPS для запущеної кінцевої точки RPC, яка буде використовуватися за замовчуванням для цього проєкту. Ви можете швидко отримати доступ до загальнодоступних кінцевих точок для різних мереж Polkadot, створити власний приватний виділений вузол за допомогою [OnFinality](https://app.onfinality.io) або просто використовувати кінцеву точку Polkadot за замовчуванням. Цей вузол RPC повинен бути вузлом архіву (мати стан повного ланцюга). Для цього посібника ми будемо використовувати значення за замовчуванням *"https://polkadot.api.onfinality.io"*
+- Сімейство мереж: Сімейство блокчейну рівня 1, для індексації якого буде розроблено цей проєкт SubQuery. Використовуйте клавіші зі стрілками, щоб вибрати з доступних параметрів. Для цього посібника ми будемо використовувати _"Substrate"_
+- Network: конкретна мережа, для індексації якої буде розроблено цей проєкт SubQuery. Використовуйте клавіші зі стрілками, щоб вибрати з доступних параметрів. Для цього посібника ми будемо використовувати _"Polkadot"_
+- Шаблонний проєкт: виберіть проєкт шаблону SubQuery, який стане відправною точкою для початку розробки. Ми пропонуємо вибрати проєкт _"subql-starter"_.
+- Кінцева точка RPC: надайте URL-адресу HTTPS для запущеної кінцевої точки RPC, яка буде використовуватися за замовчуванням для цього проєкту. Ви можете швидко отримати доступ до загальнодоступних кінцевих точок для різних мереж Polkadot, створити власний приватний виділений вузол за допомогою [OnFinality](https://app.onfinality.io) або просто використовувати кінцеву точку Polkadot за замовчуванням. Цей вузол RPC повинен бути вузлом архіву (мати стан повного ланцюга). Для цього посібника ми будемо використовувати значення за замовчуванням _"https://polkadot.api.onfinality.io"_
- Репозиторій Git: надайте URL-адресу Git до репозиторію, в якому буде розміщено цей проєкт SubQuery (якщо він розміщений у SubQuery Explorer) або прийміть надане за замовчуванням.
- Автори: Введіть тут власника цього проєкту SubQuery (наприклад, ваше ім’я!) або прийміть надане за замовчуванням.
- Опис: надайте короткий абзац про ваш проєкт, який описує дані, які він містить, і що користувачі можуть робити з ними або прийняти надане за замовчуванням.
@@ -57,8 +57,8 @@ subql init
Нарешті, у каталозі проєкту виконайте таку команду, щоб встановити залежності нового проєкту.
- ``` оболонка компакт-диск PROJECT_NAME установка yarn ```
- ``` оболонка компакт-диск PROJECT_NAME npm встановити ```
+::: code-tabs @tab:active yarn `shell cd PROJECT_NAME yarn install`
+@tab npm `shell cd PROJECT_NAME npm install` :::
## Внесення змін до проекту
@@ -66,7 +66,7 @@ subql init
1. The GraphQL Schema in `schema.graphql`
2. Маніфест проекту в `project.yaml`
-3. Картографування функціонує в каталозі ` src / mappings / `
+3. Функції відображення (mapping) у каталозі `src/mappings/`
Метою цього короткого посібника є адаптація стандартного стартового проєкту, щоб почати індексацію всіх переказів із Polkadot.
@@ -88,8 +88,8 @@ type Transfer @entity {
**Важливо: коли ви вносите будь-які зміни до файлу схеми, переконайтеся, що ви повторно створили каталог типів.**
- ```shell yarn codegen ```
- ```shell npm run-script codegen ```
+::: code-tabs @tab:active yarn `shell yarn codegen `
+@tab npm `shell npm run-script codegen ` :::
You'll find the generated models in the `/src/types/models` directory. Щоб отримати додаткові відомості про файл `schema.graphql`, перегляньте нашу документацію в розділі [Build/GraphQL Schema](../build/graphql.md)
@@ -133,22 +133,22 @@ import { Transfer } from "../types";
import { Balance } from "@polkadot/types/interfaces";
export async function handleEvent(event: SubstrateEvent): Promise {
- // Get data from the event
- // The balances.transfer event has the following payload \[from, to, value\]
- // logger.info(JSON.stringify(event));
- const from = event.event.data[0];
- const to = event.event.data[1];
- const amount = event.event.data[2];
-
- // Create the new transfer entity
- const transfer = new Transfer(
- `${event.block.block.header.number.toNumber()}-${event.idx}`,
- );
- transfer.blockNumber = event.block.block.header.number.toBigInt();
- transfer.from = from.toString();
- transfer.to = to.toString();
- transfer.amount = (amount as Balance).toBigInt();
- await transfer.save();
+ // Get data from the event
+ // The balances.transfer event has the following payload \[from, to, value\]
+ // logger.info(JSON.stringify(event));
+ const from = event.event.data[0];
+ const to = event.event.data[1];
+ const amount = event.event.data[2];
+
+ // Create the new transfer entity
+ const transfer = new Transfer(
+ `${event.block.block.header.number.toNumber()}-${event.idx}`
+ );
+ transfer.blockNumber = event.block.block.header.number.toBigInt();
+ transfer.from = from.toString();
+ transfer.to = to.toString();
+ transfer.amount = (amount as Balance).toBigInt();
+ await transfer.save();
}
```
@@ -160,7 +160,7 @@ export async function handleEvent(event: SubstrateEvent): Promise {
Щоб запустити ваш новий проєкт SubQuery, нам спочатку потрібно побудувати нашу роботу. Запустіть команду збірки з кореневого каталогу проекту.
- ``` оболонка побудова ``` ``` оболонка npm run-script build ```
+::: code-tabs @tab:active yarn `shell yarn build` @tab npm `shell npm run-script build` :::
**Важливо: щоразу, коли ви вносите зміни у свої функції відображення, вам потрібно буде перебудувати свій проєкт**
@@ -174,7 +174,7 @@ export async function handleEvent(event: SubstrateEvent): Promise {
У каталозі проєкту, виконайте таку команду:
- ```shell yarn start:docker ``` ```shell npm run-script start:docker ```
+::: code-tabs @tab:active yarn `shell yarn start:docker ` @tab npm `shell npm run-script start:docker ` :::
It may take some time to download the required packages ([`@subql/node`](https://www.npmjs.com/package/@subql/node), [`@subql/query`](https://www.npmjs.com/package/@subql/query), and Postgres) for the first time but soon you should see a running SubQuery node in the terminal screen.
@@ -189,10 +189,7 @@ It may take some time to download the required packages ([`@subql/node`](https:/
```graphql
{
query {
- transfers(
- first: 10,
- orderBy: AMOUNT_DESC
- ) {
+ transfers(first: 10, orderBy: AMOUNT_DESC) {
nodes {
id
amount
diff --git a/docs/uk/quickstart/quickstart-terra.md b/docs/uk/quickstart/quickstart-terra.md
index b7734817483..a7f7136e368 100644
--- a/docs/uk/quickstart/quickstart-terra.md
+++ b/docs/uk/quickstart/quickstart-terra.md
@@ -45,11 +45,11 @@ subql init
Під час ініціалізації проекту SubQuery вам зададуть певні запитання:
- Project Name: A name for your SubQuery project
-- Сімейство мереж. Сімейство мереж блокчейн рівня 1, для індексації якого буде розроблено цей проект SubQuery, використовуйте клавіші зі стрілками на клавіатурі, щоб вибрати один із варіантів, для цього посібника ми будемо використовувати *"Terra"*
-- Мережа: конкретна мережа, для індексації якої буде розроблено цей проект SubQuery. Використовуйте клавіші зі стрілками на клавіатурі, щоб вибрати один із параметрів, для цього посібника ми будемо використовувати *"Terra"*
-- Шаблон: виберіть шаблон проекту SubQuery, який буде відправною точкою для початку розробки, ми пропонуємо вибрати *"Початковий проект"*
+- Сімейство мереж. Сімейство мереж блокчейн рівня 1, для індексації якого буде розроблено цей проект SubQuery, використовуйте клавіші зі стрілками на клавіатурі, щоб вибрати один із варіантів, для цього посібника ми будемо використовувати _"Terra"_
+- Мережа: конкретна мережа, для індексації якої буде розроблено цей проект SubQuery. Використовуйте клавіші зі стрілками на клавіатурі, щоб вибрати один із параметрів, для цього посібника ми будемо використовувати _"Terra"_
+- Шаблон: виберіть шаблон проекту SubQuery, який буде відправною точкою для початку розробки, ми пропонуємо вибрати _"Початковий проект"_
- Репозиторій Git (необов’язково): надайте URL-адресу Git до репозиторію, в якому буде розміщено цей проект SubQuery (якщо він розміщено в SubQuery Explorer)
-- Кінцева точка RPC (обов’язково): надайте URL-адресу HTTPS для запущеної кінцевої точки RPC, яка буде використовуватися за замовчуванням для цього проекту. Цей вузол RPC повинен бути вузлом архіву (мати стан повного ланцюга). Для цього посібника ми будемо використовувати значення за замовчуванням *"https://terra-columbus-5.beta.api.onfinality.io"*
+- Кінцева точка RPC (обов’язково): надайте URL-адресу HTTPS для запущеної кінцевої точки RPC, яка буде використовуватися за замовчуванням для цього проекту. Цей вузол RPC повинен бути вузлом архіву (мати стан повного ланцюга). Для цього посібника ми будемо використовувати значення за замовчуванням _"https://terra-columbus-5.beta.api.onfinality.io"_
- Автори (обов’язково): Введіть тут власника цього проекту SubQuery (наприклад, ваше ім’я!)
- Опис (необов’язково): ви можете надати короткий абзац про ваш проект, який описує, які дані він містить і що користувачі можуть з ними робити
- Версія (обов’язково): введіть користувацький номер версії або використовуйте стандартний (`1.0.0`)
@@ -59,8 +59,8 @@ subql init
Нарешті, у каталозі проекту виконайте наступну команду, щоб встановити залежності нового проекту.
- ``` оболонка компакт-диск PROJECT_NAME установка yarn ```
- ``` оболонка компакт-диск PROJECT_NAME npm встановити ```
+::: code-tabs @tab:active yarn `shell cd PROJECT_NAME yarn install`
+@tab npm `shell cd PROJECT_NAME npm install` :::
## Внесення змін до проекту
@@ -68,7 +68,7 @@ subql init
1. Схема GraphQL в `schema.graphql`
2. Маніфест проекту в `project.yaml`
-3. Картографування функціонує в каталозі ` src / mappings / `
+3. Функції відображення (mapping) у каталозі `src/mappings/`
Метою цього короткого посібника є адаптація стандартного стартового проекту, щоб почати індексацію всіх переказів із смарт-контракту bLuna.
@@ -91,8 +91,8 @@ type Transfer @entity {
**Важливо: коли ви вносите будь-які зміни до файлу схеми, переконайтеся, що ви повторно створили каталог типів. Зробіть це зараз.**
- ``` оболонка кодоген yarn ```
- ``` оболонка npm run-script codegen ```
+::: code-tabs @tab:active yarn `shell yarn codegen`
+@tab npm `shell npm run-script codegen` :::
Згенеровані моделі можна знайти в каталозі `/src/types/models`. Щоб отримати додаткові відомості про файл `schema.graphql`, перегляньте нашу документацію в розділі [Build/GraphQL Schema](../build/graphql.md)
@@ -100,22 +100,22 @@ type Transfer @entity {
Файл маніфесту проекту (`project.yaml`) можна розглядати як точку входу до вашого проекту, і він визначає більшість деталей про те, як SubQuery буде індексувати та перетворювати дані ланцюга.
-Ми не будемо робити багато змін у файлі маніфесту, оскільки він уже налаштований правильно, але нам потрібно змінити наші обробники. Пам’ятайте, що ми плануємо індексувати всі події передачі Terra, тому нам потрібно оновити розділ ` datasources `, щоб прочитати наступне.
+Ми не будемо робити багато змін у файлі маніфесту, оскільки він уже налаштований правильно, але нам потрібно змінити наші обробники. Пам’ятайте, що ми плануємо індексувати всі події передачі Terra, тому нам потрібно оновити розділ `datasources`, щоб прочитати наступне.
```yaml
mapping:
- файл: ./dist/index.js
- обробники:
- - обробник: handleEvent
- вид: terra/EventHandler
- # це спрацює для всіх подій, які відповідають наступній умові фільтра смарт-контракту
- фільтр:
- тип: передача
- messageFilter:
- тип: /terra.wasm.v1beta1.MsgExecuteContract
- значення:
- # Ми підписуємось на смарт-контракт bLuna (наприклад, передаємо лише події з цього контракту)
- договір: terra1j66jatn3k50hjtg2xemnjm8s7y8dws9xqa5y8w
+  file: ./dist/index.js
+  handlers:
+    - handler: handleEvent
+      kind: terra/EventHandler
+      # це спрацює для всіх подій, які відповідають наступній умові фільтра смарт-контракту
+      filter:
+        type: transfer
+        messageFilter:
+          type: /terra.wasm.v1beta1.MsgExecuteContract
+          values:
+            # Ми підписуємось на смарт-контракт bLuna (наприклад, передаємо лише події з цього контракту)
+            contract: terra1j66jatn3k50hjtg2xemnjm8s7y8dws9xqa5y8w
```
Це означає, що ми запускатимемо функцію відображення `handleEvent` кожного разу, коли буде подія `transfer` зі смарт-контракту bLuna.
@@ -174,7 +174,7 @@ export асинхронну функцію handleEvent(
Щоб запустити ваш новий проект SubQuery, нам спочатку потрібно створити нашу роботу. Запустіть команду збірки з кореневого каталогу проекту.
- ``` оболонка побудова yarn ``` ``` оболонка npm run-script build ```
+::: code-tabs @tab:active yarn `shell yarn build` @tab npm `shell npm run-script build` :::
**Важливо: щоразу, коли ви вносите зміни у свої функції відображення, вам потрібно буде перебудувати свій проект**
@@ -188,13 +188,9 @@ export асинхронну функцію handleEvent(
У каталозі проекту виконайте таку команду:
- ``` оболонка початок yarn ``` ``` оболонка npm run-script start:docker ```
-
-Завантаження необхідних пакетів може зайняти деякий час ([`@subql/node`](https://www.npmjs.com/package/@subql/node),
-
-`@subql/query`7 > і Postgres) вперше, але незабаром ви побачите запущений вузол SubQuery. Будьте терплячі тут.
-
+::: code-tabs @tab:active yarn `shell yarn start:docker` @tab npm `shell npm run-script start:docker` :::
+Завантаження необхідних пакетів може зайняти деякий час ([`@subql/node`](https://www.npmjs.com/package/@subql/node), `@subql/query` і Postgres) вперше, але незабаром ви побачите запущений вузол SubQuery. Будьте терплячі тут.
### Запитуйте свій проект
@@ -204,8 +200,6 @@ export асинхронну функцію handleEvent(
Для нового початкового проекту SubQuery ви можете спробувати такий запит, щоб зрозуміти, як він працює, або [дізнатися більше про мову Query GraphQL](../run_publish/graphql.md).
-
-
```graphql
{
  query {
@@ -226,17 +220,12 @@ export асинхронну функцію handleEvent(
}
```
-
-
-
### Опублікуйте проект SubQuery
SubQuery надає безкоштовну керовану службу, коли ви можете розгорнути свій новий проект. Ви можете розгорнути його в [SubQuery Projects](https://project.subquery.network) і зробити запит за допомогою нашого [ Explorer](https://explorer.subquery.network).
[Прочитайте посібник, щоб опублікувати свій новий проект у SubQuery Projects](../run_publish/publish.md)
-
-
## Наступні кроки
Вітаємо, тепер у вас є локально запущений проект SubQuery, який приймає запити GraphQL API для передачі даних з bLuna.
diff --git a/docs/uk/quickstart/quickstart.md b/docs/uk/quickstart/quickstart.md
index 403c8d3e570..e5f77db459c 100644
--- a/docs/uk/quickstart/quickstart.md
+++ b/docs/uk/quickstart/quickstart.md
@@ -89,10 +89,10 @@ HelloWorld is ready
Нарешті, виконайте таку команду, щоб встановити залежності нового проєкту з каталогу нового проєкту.
- ```shell cd PROJECT_NAME yarn install ```
- ```shell cd PROJECT_NAME npm install ```
+::: code-tabs @tab:active yarn `shell cd PROJECT_NAME yarn install `
+@tab npm `shell cd PROJECT_NAME npm install ` :::
-Тепер ви ініціалізували свій перший проєкт SubQuery всього за кілька простих кроків. Давайте тепер налаштуємо стандартний шаблон проєкту для конкретного цікавить блокчейну. Це допоможе вам краще зрозуміти команди
+Тепер ви ініціалізували свій перший проєкт SubQuery всього за кілька простих кроків. Давайте тепер налаштуємо стандартний шаблон проєкту для блокчейну, який вас цікавить. Це допоможе вам краще зрозуміти команди.
## 3. Внесіть зміни до свого проєкту
@@ -102,4 +102,4 @@ HelloWorld is ready
2. Маніфест проєкту в `project.yaml`.
3. Функції відбивання в `src/mappings/` каталогу.
-SubQuery supports various blockchain networks and provides a dedicated guide for each of them. Select your preferred blockchain under 2. Specific Chains and continue the quick start guide.
\ No newline at end of file
+SubQuery supports various blockchain networks and provides a dedicated guide for each of them. Select your preferred blockchain under 2. Specific Chains and continue the quick start guide.
diff --git a/docs/uk/run_publish/connect.md b/docs/uk/run_publish/connect.md
index 8ed3e3419e0..24f18776d5b 100644
--- a/docs/uk/run_publish/connect.md
+++ b/docs/uk/run_publish/connect.md
@@ -2,10 +2,10 @@
Як тільки ваше розгортання буде успішно завершено і наші вузли проіндексують ваші дані з ланцюжка, ви зможете підключитися до свого проєкту через надану кінцеву точку GraphQL-запитів.
-![Проєкт розгортається і синхронізується](/assets/img/projects-deploy-sync.png)
+![Проєкт розгортається і синхронізується](/assets/img/projects_deploy_sync.png)
Крім того, ви можете натиснути на три точки поруч із заголовком проєкту та переглянути його на SubQuery Explorer. Там ви можете використовувати ігровий майданчик в браузері, щоб почати роботу.
-![Проєкти в провіднику вкладених SubQuery](/assets/img/projects-explorer.png)
+![Проєкти в провіднику вкладених SubQuery](/assets/img/projects_explorer.png)
::: tip Note Дізнайтеся більше про [GraphQL Query language](./graphql.md). :::
diff --git a/docs/uk/run_publish/references.md b/docs/uk/run_publish/references.md
index 67404e63ea5..c4267f7798b 100644
--- a/docs/uk/run_publish/references.md
+++ b/docs/uk/run_publish/references.md
@@ -21,11 +21,11 @@ COMMANDS
Ця команда використовує webpack для створення пакету проєкту subquery.
-| Опція | Описання |
-| ------------------ | ---------------------------------------------------------------------------------------------------------- |
-| -l, --location | локальна тека проєкту subquery (якщо вона ще не знаходиться в теці) |
-| -o, --output | вкажіть вихідну теку збірки, наприклад, build-folder |
-| --mode=(production | prod | development | dev) | [ default: production ] |
+| Опція                                              | Описання                                                             |
+| -------------------------------------------------- | -------------------------------------------------------------------- |
+| -l, --location                                     | локальна тека проєкту subquery (якщо вона ще не знаходиться в теці)  |
+| -o, --output                                       | вкажіть вихідну теку збірки, наприклад, build-folder                 |
+| --mode=(production \| prod \| development \| dev)  | [ default: production ]                                              |
- За допомогою `subql build` ви можете вказати додаткові точки входу в поле exports, хоча він завжди буде будуватися `index.Ts` автоматично.
@@ -106,7 +106,7 @@ Options:
### reindex
-:::warning In order to use this command, you require `@subql/node:v1.10.0`/`@subql/node-:v1.10.0` or above. :::
+:::warning In order to use this command, you require `@subql/node:v1.10.0`/`@subql/node-YOURNETWORK:v1.10.0` or above. :::
When using reindex command, historical must be enabled for the targeted project (`--disable-historical=false`). After starting the project, it would print out a log stating if historical is enabled or not.
@@ -122,7 +122,7 @@ If the `targetHeight` is less than the declared starting height, it will execute
subql-node -f /example/subql-project reindex --targetHeight=30
```
-::: info Note
+::: tip Note
Once the command is executed and the state has been rolled back the the specified height, the application will exit. You can then start up the indexer to proceed again from this height.
:::
@@ -134,7 +134,7 @@ This command forces the project schemas and tables to be regenerated. It is help
`-f`, `--subquery` flag must be passed in, to set path of the targeted project.
-::: info Note Similar to `reindex` command, the application would exit upon completion. :::
+::: tip Note Similar to `reindex` command, the application would exit upon completion. :::
```shell
subql-node -f /example/subql-project force-clean
diff --git a/docs/uk/run_publish/run.md b/docs/uk/run_publish/run.md
index 2267da21eb0..875de8143a6 100644
--- a/docs/uk/run_publish/run.md
+++ b/docs/uk/run_publish/run.md
@@ -4,7 +4,7 @@
## Використовувати Docker
-Альтернативним рішенням є запуск Docker Container, визначеного файлом `docker-compose.yml`. Для нового проєкт, який був тільки що ініціалізований, вам не потрібно буде нічого змінювати.
+Альтернативним рішенням є запуск **Docker Container**, визначеного файлом `docker-compose.yml`. Для нового проєкт, який був тільки що ініціалізований, вам не потрібно буде нічого змінювати.
У каталозі проекту виконайте таку команду:
@@ -18,7 +18,7 @@ docker-compose pull && docker-compose up
Вимога:
-- База даних [Postgres](https://www.postgresql.org/) (версія 12 або вище). Поки [SubQuery](run.md#start-a-local-subquery-node) індексує блокчейн, витягнуті дані зберігаються в зовнішньому екземплярі бази даних.
+- База даних [Postgres](https://www.postgresql.org/) (версія 12 або вище). Поки [SubQuery](run.md#start-a-local-subquery-node) індексує блокчейн, витягнуті дані зберігаються в зовнішньому екземплярі бази даних.
Вузол SubQuery - це реалізація, яка витягує дані блокчейна на основі Substrate/Polkadot відповідно до проекту SubQuery і зберігає їх в базі даних Postgres.
@@ -32,90 +32,80 @@ CREATE EXTENSION IF NOT EXISTS btree_gist;
### Установка
-
-
+::: code-tabs
+@tab Substrate/Polkadot
```shell
# NPM
npm install -g @subql/node
```
-
-
+@tab Terra
```shell
# NPM
npm install -g @subql/node-terra
```
-
-
+@tab Avalanche
```shell
# NPM
npm install -g @subql/node-avalanche
```
-
-
+@tab Cosmos
```shell
# NPM
npm install -g @subql/node-cosmos
```
-
-
+@tab Algorand
```shell
# NPM
npm install -g @subql/node-algorand
```
-
-
+:::
::: danger Please note that we **DO NOT** encourage the use of `yarn global` due to its poor dependency management which may lead to errors down the line. :::
Після встановлення ви можете запустити вузол за допомогою наступної команди:
-
-
+::: code-tabs
+@tab Substrate/Polkadot
```shell
subql-node
```
-
-
+@tab Terra
```shell
subql-node-terra
```
-
-
+@tab Avalanche
```shell
subql-node-avalanche
```
-
-
+@tab Cosmos
```shell
subql-node-cosmos
```
-
-
+@tab Algorand
```shell
subql-node-algorand
```
-
-
+:::
### Key Commands
@@ -123,43 +113,38 @@ The following commands will assist you to complete the configuration of a SubQue
#### Вкажіть шлях до локального проекту
-
-
+::: code-tabs
+@tab Substrate/Polkadot
```shell
subql-node -f your-project-path
```
-
-
+@tab Terra
```shell
subql-node-terra -f your-project-path
```
-
-
+@tab Avalanche
```shell
subql-node-avalanche -f your-project-path
```
-
-
+@tab Cosmos
```shell
subql-node-cosmos -f your-project-path
```
-
-
+@tab Algorand
```shell
subql-node-algorand -f your-project-path
```
-
-
+:::
#### Connect to database
@@ -176,43 +161,38 @@ subql-node -f your-project-path
#### Вкажіть файл конфігурації
-
-
+::: code-tabs
+@tab Substrate/Polkadot
```shell
subql-node -c your-project-config.yml
```
-
-
+@tab Terra
```shell
subql-node-terra -c your-project-config.yml
```
-
-
+@tab Avalanche
```shell
subql-node-avalanche -c your-project-config.yml
```
-
-
+@tab Cosmos
```shell
subql-node-cosmos -c your-project-config.yml
```
-
-
+@tab Algorand
```shell
subql-node-algorand -c your-project-config.yml
```
-
-
+:::
This will point the query node to a manifest file which can be in YAML or JSON format.
@@ -230,43 +210,38 @@ Result:
#### Запуск в локальному режимі
-
-
+::: code-tabs
+@tab Substrate/Polkadot
```shell
subql-node -f your-project-path --local
```
-
-
+@tab Terra
```shell
subql-node-terra -f your-project-path --local
```
-
-
+@tab Avalanche
```shell
subql-node-avalanche -f your-project-path --local
```
-
-
+@tab Cosmos
```shell
subql-node-cosmos -f your-project-path --local
```
-
-
+@tab Algorand
```shell
subql-node-algorand -f your-project-path --local
```
-
-
+:::
For debugging purposes, users can run the node in local mode. Перемикання в локальний режим створить таблиці Postgres у схемі за замовчуванням `public`.
diff --git a/docs/uk/run_publish/sandbox.md b/docs/uk/run_publish/sandbox.md
index 47e9bb20b76..ca341acd300 100644
--- a/docs/uk/run_publish/sandbox.md
+++ b/docs/uk/run_publish/sandbox.md
@@ -10,10 +10,8 @@
- Не застрахований від багатьох відомих методів нападу.
-
## Обмеження
-- Щоб обмежити доступ до певних вбудованих модулів, лише ` Assert `, ` buffer `, ` crypto `, ` util ` та ` path < / 0> білі.
-
Ми підтримуємо сторонні модулі , написані в CommonJS та hybrid бібліотек, таких як @ polkadot / * `, які використовують ESM як за замовчуванням.
-
-- Будь-які модулі, що використовують ` HTTP ` та ` WebSocket `, заборонені.
+- Щоб обмежити доступ до певних вбудованих модулів, до білого списку внесені лише `assert`, `buffer`, `crypto`, `util` та `path`.
+- Ми підтримуємо [сторонні бібліотеки](../create/mapping/polkadot.md#third-party-libraries), написані в **CommonJS**, а також **hybrid**-бібліотеки, такі як `@polkadot/*`, які за замовчуванням використовують ESM.
+- Будь-які модулі, що використовують `HTTP` та `WebSocket`, заборонені.
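For illustration, the sketch below uses one of the whitelisted built-in modules inside ordinary mapping code; the helper itself is hypothetical.

```ts
import { createHash } from "crypto"; // whitelisted built-in module

// Derive a deterministic id from arbitrary event data (hypothetical helper).
export function deterministicId(payload: string): string {
  return createHash("sha256").update(payload).digest("hex");
}
```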
diff --git a/docs/uk/run_publish/upgrade.md b/docs/uk/run_publish/upgrade.md
index 8504548a31e..e431b153f64 100644
--- a/docs/uk/run_publish/upgrade.md
+++ b/docs/uk/run_publish/upgrade.md
@@ -77,10 +77,10 @@ If you just want to upgrade to the latest indexer ([`@subql/node`](https://www.n
Після того, як ваше розгортання успішно завершиться і наші вузли індексують ваші дані з ланцюга, ви зможете підключитися до вашого проекту через відображену кінцеву точку GraphQL Query.
-![Проєкт розгортається і синхронізується](/assets/img/projects-deploy-sync.png)
+![Проєкт розгортається і синхронізується](/assets/img/projects_deploy_sync.png)
Крім того, ви можете натиснути на три точки поруч із заголовком проєкту та переглянути його на SubQuery Explorer. There you can use the in browser playground to get started - [read more about how to use our Explorer here](../run_publish/query.md).
-![Проєкти в провіднику вкладених SubQuery](/assets/img/projects-explorer.png)
+![Проєкти в провіднику вкладених SubQuery](/assets/img/projects_explorer.png)
::: tip Note Дізнайтеся більше про [GraphQL Query language](./graphql.md). :::
diff --git a/docs/vi/README.md b/docs/vi/README.md
index c3963995365..a59612e4023 100644
--- a/docs/vi/README.md
+++ b/docs/vi/README.md
@@ -4,7 +4,7 @@
Xây dựng dApps nhanh hơn với Học viện SubQuery
-
Explore and implement your own efficient custom open-source API between your decentralised data and tools to query data faster and save you time.
+
Explore and implement your own efficient custom open-source API between your decentralised data and tools to query data faster and save you time.
SubQuery now supports Polkadot, Avalanche, Cosmos, and Algorand.
@@ -12,7 +12,7 @@
Get a Kick-Start With Our Quick Start Guide
-
Build your first SubQuery project in less than 10 mins with simple guided steps.
+
Build your first SubQuery project in less than 10 mins with simple guided steps.
Start querying data for your dApps on your most loved blockchain network using our starter projects. Explore and modify important files, and understand how SubQuery works.
@@ -134,8 +134,7 @@
-
-
+
diff --git a/docs/vi/build/install.md b/docs/vi/build/install.md
index c4799ef4f2f..795f34ebdc8 100644
--- a/docs/vi/build/install.md
+++ b/docs/vi/build/install.md
@@ -8,28 +8,30 @@ Có nhiều thành phần cần thiết khi bạn muốn tạo một dự án s
Cài đặt SubQuery CLI trên toàn cầu trên thiết bị đầu cuối (terminal) của bạn bằng cách sử dụng Yarn hoặc NPM:
- ```bash npm install -g @subql/cli ```
- ```shell yarn global add @subql/cli ```
+::: code-tabs @tab npm `bash npm install -g @subql/cli `
+@tab:active yarn `shell yarn global add @subql/cli ` :::
Sau đó, bạn có thể chạy help để xem các lệnh có sẵn và cách sử dụng do CLI cung cấp:
```shell
subql help
```
+
## Cài đặt @subql/node
Node SubQuery là một phương thức để trích xuất dữ liệu Blockchain trên nền tảng Substrate cho mỗi dự án sử dụng SubQuery và lưu nó vào cơ sở dữ liệu Postgres.
Cài đặt nút SubQuery trên toàn cầu trên thiết bị đầu cuối của bạn bằng cách sử dụng Yarn hoặc NPM:
- ```bash npm install -g @subql/node ```
- ```shell yarn global add @subql/node ```
+::: code-tabs @tab npm `bash npm install -g @subql/node `
+@tab:active yarn `shell yarn global add @subql/node ` :::
Sau khi cài đặt, bạn có thể bắt đầu một node với:
```shell
subql-node
```
+
> Lưu ý: Nếu bạn đang sử dụng Docker hoặc lưu trữ dự án của mình trên SubQuery Projects, bạn có thể bỏ qua bước này. Bởi vì SubQuery Node đã được cung cấp trong Docker Container và cơ sở hạ tầng lưu trữ.
## Cài đặt @subql/query
@@ -38,7 +40,7 @@ Thư viện truy vấn SubQuery cung cấp dịch vụ cho phép bạn truy vấ
Cài đặt truy vấn SubQuery trên toàn cầu trên thiết bị đầu cuối của bạn bằng cách sử dụng Yarn hoặc NPM:
- ```bash npm install -g @subql/query ```
- ```shell yarn global add @subql/query ```
+::: code-tabs @tab npm `bash npm install -g @subql/query `
+@tab:active yarn `shell yarn global add @subql/query ` :::
-> Lưu ý: Nếu bạn đang sử dụng Docker hoặc lưu trữ dự án của mình trên SubQuery Projects, bạn cũng có thể bỏ qua bước này. Bởi vì SubQuery Node đã được cung cấp trong Docker Container và cơ sở hạ tầng lưu trữ.
\ No newline at end of file
+> Lưu ý: Nếu bạn đang sử dụng Docker hoặc lưu trữ dự án của mình trên SubQuery Projects, bạn cũng có thể bỏ qua bước này. Bởi vì SubQuery Node đã được cung cấp trong Docker Container và cơ sở hạ tầng lưu trữ.
diff --git a/docs/vi/build/introduction.md b/docs/vi/build/introduction.md
index f8b0a5e210d..a4e9569404d 100644
--- a/docs/vi/build/introduction.md
+++ b/docs/vi/build/introduction.md
@@ -51,8 +51,8 @@ Thao tác này sẽ tạo một thư mục mới (hoặc cập nhật thư mục
Chạy lệnh xây dựng từ thư mục gốc của dự án.
- `shell yarn build `
- `bash npm run-script build `
+::: code-tabs @tab:active yarn `shell yarn build `
+@tab npm `bash npm run-script build ` :::
### Tùy chọn xây dựng thay thế
diff --git a/docs/vi/build/manifest.md b/docs/vi/build/manifest.md
index 0a858ed6f9c..a676ec08b0c 100644
--- a/docs/vi/build/manifest.md
+++ b/docs/vi/build/manifest.md
@@ -4,8 +4,8 @@ Tệp Manifest `project.yaml` có thể được xem như một điểm đầu v
Tệp kê khai có thể ở định dạng YAML hoặc JSON. Trong tài liệu này, chúng tôi sẽ sử dụng YAML trong tất cả các ví dụ. Dưới đây là ví dụ tiêu chuẩn về `project.yaml` cơ bản.
- ``` yml specVersion: 0.2.0 name: example-project #tên của dự án version: 1.0.0 #phiên bản của dự án description: '' #miêu tả dự án của bạn repository: 'https://github.com/subquery/subql-starter' # địa chỉ kho lưu trữ Git cho dự án của bạn schema: file: ./schema.graphql # Vị trí file GraphQL schema network: genesisHash: '0x91b171bb158e2d3848fa23a9f1c25182fb8e20313b2c1eb49219da7a70ce90c3' # Hàm băm gốc của mạng endpoint: 'wss://polkadot.api.onfinality.io/public-ws' # Phần bổ sung tùy chọn điểm cuối HTTP của full chain dictionary nhằm tăng tốc độ xử lý dictionary: 'https://api.subquery.network/sq/subquery/dictionary-polkadot' dataSources: - kind: substrate/Runtime startBlock: 1 # Thao tác này sẽ thay đổi block chỉ mục khởi đầu, đặt mức giá trị cao hơn để bỏ qua các block khởi đầu với ít dữ liệu hơn. mapping: file: "./dist/index.js" handlers: - handler: handleBlock kind: substrate/BlockHandler - handler: handleEvent kind: substrate/EventHandler filter: #Phần filter (lọc) này là tùy chọn, có hay không cũng được module: balances method: Deposit - handler: handleCall kind: substrate/CallHandler ````
- ``` yml specVersion: "0.0.1" description: '' # Miêu tả dự án của bạn repository: 'https://github.com/subquery/subql-starter' # Địa chỉ kho lưu trữ Git cho dự án của bạn schema: ./schema.graphql #Vị trí file GraphQL schema network: endpoint: 'wss://polkadot.api.onfinality.io/public-ws' # Tùy chọn này giúp cung cấp điểm cuối HTTP của full chain dictionary nhằm tăng tốc độ xử lý dictionary: 'https://api.subquery.network/sq/subquery/dictionary-polkadot' dataSources: - name: main kind: substrate/Runtime startBlock: 1 # Thao tác này sẽ thay đổi block chỉ mục khởi đầu, đặt mức giá trị cao hơn để bỏ qua các block khởi đầu với ít dữ liệu hơn. mapping: handlers: - handler: handleBlock kind: substrate/BlockHandler - handler: handleEvent kind: substrate/EventHandler filter: #Filter (lọc) là một bổ sung tùy chọn (có hay không cũng được), nhưng nên có để tăng tốc độ xử lý sự kiện module: balances method: Deposit - handler: handleCall kind: substrate/CallHandler ````
+::: code-tabs @tab v0.2.0 ` yml specVersion: 0.2.0 name: example-project #tên của dự án version: 1.0.0 #phiên bản của dự án description: '' #miêu tả dự án của bạn repository: 'https://github.com/subquery/subql-starter' # địa chỉ kho lưu trữ Git cho dự án của bạn schema: file: ./schema.graphql # Vị trí file GraphQL schema network: genesisHash: '0x91b171bb158e2d3848fa23a9f1c25182fb8e20313b2c1eb49219da7a70ce90c3' # Hàm băm gốc của mạng endpoint: 'wss://polkadot.api.onfinality.io/public-ws' # Phần bổ sung tùy chọn điểm cuối HTTP của full chain dictionary nhằm tăng tốc độ xử lý dictionary: 'https://api.subquery.network/sq/subquery/dictionary-polkadot' dataSources: - kind: substrate/Runtime startBlock: 1 # Thao tác này sẽ thay đổi block chỉ mục khởi đầu, đặt mức giá trị cao hơn để bỏ qua các block khởi đầu với ít dữ liệu hơn. mapping: file: "./dist/index.js" handlers: - handler: handleBlock kind: substrate/BlockHandler - handler: handleEvent kind: substrate/EventHandler filter: #Phần filter (lọc) này là tùy chọn, có hay không cũng được module: balances method: Deposit - handler: handleCall kind: substrate/CallHandler ````
+@tab v0.0.1 ` yml specVersion: "0.0.1" description: '' # Miêu tả dự án của bạn repository: 'https://github.com/subquery/subql-starter' # Địa chỉ kho lưu trữ Git cho dự án của bạn schema: ./schema.graphql #Vị trí file GraphQL schema network: endpoint: 'wss://polkadot.api.onfinality.io/public-ws' # Tùy chọn này giúp cung cấp điểm cuối HTTP của full chain dictionary nhằm tăng tốc độ xử lý dictionary: 'https://api.subquery.network/sq/subquery/dictionary-polkadot' dataSources: - name: main kind: substrate/Runtime startBlock: 1 # Thao tác này sẽ thay đổi block chỉ mục khởi đầu, đặt mức giá trị cao hơn để bỏ qua các block khởi đầu với ít dữ liệu hơn. mapping: handlers: - handler: handleBlock kind: substrate/BlockHandler - handler: handleEvent kind: substrate/EventHandler filter: #Filter (lọc) là một bổ sung tùy chọn (có hay không cũng được), nhưng nên có để tăng tốc độ xử lý sự kiện module: balances method: Deposit - handler: handleCall kind: substrate/CallHandler ```` :::
## Di chuyển từ v0.0.1 sang v0.2.0
@@ -29,15 +29,15 @@ Theo mặc định, CLI sẽ tạo các dự án SubQuery theo phiên bản v0.2
USAGE $ subql init [PROJECTNAME]
-ARGUMENTS PROJECTNAME Give the starter project name
+ARGUMENTS PROJECTNAME Give the starter project name
| Các Tùy chọn | Mô tả |
-| ----------------------- | ---------------------------------------------------------------------------- |
+| ----------------------- | ----------------------------------------------------------------------------- |
| -f, --force | |
| -l, --location=location | thư mục cục bộ để chứa dự án tạo ra |
| -install-dependencies | Cài đặt các phần phụ thuộc |
| --npm | Buộc sử dụng NPM thay vì yarn, chỉ hoạt động với `install-dependencies` flag |
-| --specVersion=0.0.1 | 0.2.0 [mặc định: 0.2.0] | Phiên bản đặc tả sẽ được sử dụng bởi dự án |
+| --specVersion=0.0.1 \| 0.2.0 [mặc định: 0.2.0] | Phiên bản đặc tả sẽ được sử dụng bởi dự án |
## Tổng quan
@@ -72,19 +72,19 @@ ARGUMENTS PROJECTNAME Give the starter project name
### Thông số kỹ thuật Data Source
Định nghĩa phần dữ liệu sẽ được lọc và trích xuất và vị trí của trình xử lý hàm ánh xạ để áp dụng chuyển đổi dữ liệu.
-| Trường | v0.0.1 | v0.2.0 | Mô tả |
+| Trường | v0.0.1 | v0.2.0 | Mô tả |
| -------------- | --------------------------------------------------------- | -------------------------------------------------------------------------------- | ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
-| **name** | String | String | Tên của nguồn dữ liệu |
-| **kind** | [substrate/Runtime](./manifest/#data-sources-and-mapping) | substrate/Runtime, [substrate/CustomDataSource](./manifest/#custom-data-sources) | Chúng tôi hỗ trợ các kiểu dữ liệu mặc định của Substrate runtime, chẳng hạn như khối, sự kiện và phần bổ sung (gọi). Từ v0.2.0, chúng tôi hỗ trợ dữ liệu runtime tùy chỉnh, chẳng hạn như smart contract. |
-| **startBlock** | Integer | Integer | Thao tác này sẽ thay đổi khối bắt đầu lập chỉ mục, đặt khối này cao hơn để bỏ qua khối ban đầu với ít dữ liệu hơn |
-| **mapping** | Thông số kỹ thuật ánh xạ | Thông số kỹ thuật ánh xạ | |
-| **filter** | [network-filters](./manifest/#network-filters) | String | Lọc nguồn dữ liệu để thực thi theo tên thông số điểm cuối mạng |
+| **name** | String | String | Tên của nguồn dữ liệu |
+| **kind** | [substrate/Runtime](./manifest/#data-sources-and-mapping) | substrate/Runtime, [substrate/CustomDataSource](./manifest/#custom-data-sources) | Chúng tôi hỗ trợ các kiểu dữ liệu mặc định của Substrate runtime, chẳng hạn như khối, sự kiện và phần bổ sung (gọi). Từ v0.2.0, chúng tôi hỗ trợ dữ liệu runtime tùy chỉnh, chẳng hạn như smart contract. |
+| **startBlock** | Integer | Integer | Thao tác này sẽ thay đổi khối bắt đầu lập chỉ mục, đặt khối này cao hơn để bỏ qua khối ban đầu với ít dữ liệu hơn |
+| **mapping** | Thông số kỹ thuật ánh xạ | Thông số kỹ thuật ánh xạ | |
+| **filter** | [network-filters](./manifest/#network-filters) | String | Lọc nguồn dữ liệu để thực thi theo tên thông số điểm cuối mạng |
### Thông số kỹ thuật ánh xạ
-| Trường | v0.0.1 | v0.2.0 | Mô tả |
-| ---------------------- | -------------------------------------------- | --------------------------------------------------------------------------------------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ |
-| **file** | String | String | Đường dẫn đến mục nhập ánh xạ |
+| Trường | v0.0.1 | v0.2.0 | Mô tả |
+| ---------------------- | -------------------------------------------- | --------------------------------------------------------------------------------------- | --------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
+| **file** | String | String | Đường dẫn đến mục nhập ánh xạ |
| **handlers & filters** | [Default handlers and filters](#schema-spec) | Default handlers and filters, [Custom handlers and filters](#custom-data-sources) | Liệt kê tất cả [hàm ánh xạ](./mapping/polkadot.md) và các hàm xử lý tương ứng của chúng, với các bộ lọc ánh xạ bổ sung. Đối với hàm xử lý ánh xạ runtime tùy chỉnh, vui lòng xem [Nguồn dữ liệu tùy chỉnh](#custom-data-sources) |
## Nguồn dữ liệu và ánh xạ
@@ -105,8 +105,8 @@ Bảng sau giải thích các bộ lọc được hỗ trợ bởi các trình x
**Dự án SubQuery của bạn sẽ hiệu quả hơn nhiều khi bạn sử dụng trình xử lý sự kiện và cuộc gọi với các bộ lọc ánh xạ thích hợp**
-| Hàm xử lý | Bộ lọc được hỗ trợ |
-| ------------------------------------------ | ---------------------------- |
+| Hàm xử lý | Bộ lọc được hỗ trợ |
+| --------------------------------------------------- | ---------------------------- |
| [BlockHandler](./mapping/polkadot.md#block-handler) | `specVersion` |
| [EventHandler](./mapping/polkadot.md#event-handler) | `module`,`method` |
| [CallHandler](./mapping/polkadot.md#call-handler) | `module`,`method` ,`success` |
@@ -154,8 +154,8 @@ Chúng tôi hỗ trợ các kiểu bổ sung được sử dụng bởi các mod
Trong ví dụ v0.2.0 bên dưới, `network.chaintypes` đang trỏ đến một tệp có tất cả các loại tùy chỉnh được nhúng vào, Đây là tệp chainpec tiêu chuẩn khai báo các kiểu cụ thể được hỗ trợ bởi chuỗi khối này trong cả định dạng `.json`, `.yaml` hoặc `.js`.
- `yml network: genesisHash: '0x91b171bb158e2d3848fa23a9f1c25182fb8e20313b2c1eb49219da7a70ce90c3' endpoint: 'ws://host.kittychain.io/public-ws' chaintypes: file: ./types.json # The relative filepath to where custom types are stored ...`
- `yml ... network: endpoint: "ws://host.kittychain.io/public-ws" types: { "KittyIndex": "u32", "Kitty": "[u8; 16]" } # typesChain: { chain: { Type5: 'example' } } # typesSpec: { spec: { Type6: 'example' } } dataSources: - name: runtime kind: substrate/Runtime startBlock: 1 filter: #Optional specName: kitty-chain mapping: handlers: - handler: handleKittyBred kind: substrate/CallHandler filter: module: kitties method: breed success: true`
+::: code-tabs @tab v0.2.0 `yml network: genesisHash: '0x91b171bb158e2d3848fa23a9f1c25182fb8e20313b2c1eb49219da7a70ce90c3' endpoint: 'ws://host.kittychain.io/public-ws' chaintypes: file: ./types.json # The relative filepath to where custom types are stored ...`
+@tab v0.0.1 `yml ... network: endpoint: "ws://host.kittychain.io/public-ws" types: { "KittyIndex": "u32", "Kitty": "[u8; 16]" } # typesChain: { chain: { Type5: 'example' } } # typesSpec: { spec: { Type6: 'example' } } dataSources: - name: runtime kind: substrate/Runtime startBlock: 1 filter: #Optional specName: kitty-chain mapping: handlers: - handler: handleKittyBred kind: substrate/CallHandler filter: module: kitties method: breed success: true` :::
Để sử dụng typescript cho các loại chuỗi của bạn, hãy bao gồm tệp đó trong thư mục `src` (ví dụ: `./src/types.ts`), chạy `yarn build` và sau đó trỏ đến tệp js đã tạo nằm trong thư mục `dist`.
@@ -172,7 +172,7 @@ Những điều cần lưu ý khi sử dụng tệp loại chuỗi có phần m
Đây là ví dụ về tệp loại chuỗi `.ts`:
- `ts import { typesBundleDeprecated } from "moonbeam-types-bundle" export default { typesBundle: typesBundleDeprecated }; `
+::: code-tabs @tab types.ts `ts import { typesBundleDeprecated } from "moonbeam-types-bundle" export default { typesBundle: typesBundleDeprecated }; ` :::
## Nguồn dữ liệu tùy chỉnh
@@ -198,6 +198,6 @@ Người dùng có thể thêm `filter` trên `dataSources` để quyết địn
Dưới đây là một ví dụ hiển thị các nguồn dữ liệu khác nhau cho cả mạng Polkadot và Kusama.
- `yaml --- network: endpoint: 'wss://polkadot.api.onfinality.io/public-ws' #Create a template to avoid redundancy definitions: mapping: &mymapping handlers: - handler: handleBlock kind: substrate/BlockHandler dataSources: - name: polkadotRuntime kind: substrate/Runtime filter: #Optional specName: polkadot startBlock: 1000 mapping: *mymapping #use template here - name: kusamaRuntime kind: substrate/Runtime filter: specName: kusama startBlock: 12000 mapping: *mymapping # can reuse or change `
+::: code-tabs @tab v0.0.1 `yaml --- network: endpoint: 'wss://polkadot.api.onfinality.io/public-ws' #Create a template to avoid redundancy definitions: mapping: &mymapping handlers: - handler: handleBlock kind: substrate/BlockHandler dataSources: - name: polkadotRuntime kind: substrate/Runtime filter: #Optional specName: polkadot startBlock: 1000 mapping: *mymapping #use template here - name: kusamaRuntime kind: substrate/Runtime filter: specName: kusama startBlock: 12000 mapping: *mymapping # can reuse or change `
-
+:::
diff --git a/docs/vi/build/mapping.md b/docs/vi/build/mapping.md
index a01bea5b81c..a50a0359a23 100644
--- a/docs/vi/build/mapping.md
+++ b/docs/vi/build/mapping.md
@@ -67,9 +67,9 @@ Mục tiêu của chúng tôi là cung cấp tất cả các nguồn dữ liệu
Đây là những giao diện chúng tôi hiện đang hỗ trợ:
-- [api.query. <module>. <method>()](https://polkadot.js.org/docs/api/start/api.query) sẽ truy vấn khối hiện tại.
-- [api.query. <module>. <method>.multi ()](https://polkadot.js.org/docs/api/start/api.query.multi/#multi-queries-same-type) sẽ thực hiện nhiều truy vấn loại giống nhau tại khối hiện tại.
-- [api.queryMulti()](https://polkadot.js.org/docs/api/start/api.query.multi/#multi-queries-distinct-types) sẽ thực hiện nhiều truy vấn khác nhau tại khối hiện tại.
+- [api.query. <module>. <method>()](https://polkadot.js.org/docs/api/start/api.query) sẽ truy vấn khối **hiện tại**.
+- [api.query. <module>. <method>.multi ()](https://polkadot.js.org/docs/api/start/api.query.multi/#multi-queries-same-type) sẽ thực hiện nhiều truy vấn loại **giống nhau** tại khối hiện tại.
+- [api.queryMulti()](https://polkadot.js.org/docs/api/start/api.query.multi/#multi-queries-distinct-types) sẽ thực hiện nhiều truy vấn **khác nhau** tại khối hiện tại.
Đây là những giao diện mà hiện tại chúng tôi **KHÔNG** hỗ trợ:
diff --git a/docs/vi/build/substrate-evm.md b/docs/vi/build/substrate-evm.md
index c4e389eda9f..e8eb79d8c3b 100644
--- a/docs/vi/build/substrate-evm.md
+++ b/docs/vi/build/substrate-evm.md
@@ -14,7 +14,7 @@ Các mạng được hỗ trợ:
## Bắt đầu
-1. Thêm custom data source dưới dạng dependency ` fiber add @ subql / contract-processors `
+1. Thêm custom data source dưới dạng dependency `yarn add @subql/contract-processors`
2. Thêm một custom data source như được mô tả bên dưới
3. Thêm handlers cho custom data source vào code của bạn
@@ -28,10 +28,10 @@ Các mạng được hỗ trợ:
### Tuỳ chọn Processor
-| Trường | Kiểu dữ liệu | Bắt buộc | Mô tả |
-| ------- | ---------------- | -------- | -------------------------------------------------------------------------------------------------------------------- |
-| abi | String | Không | ABI được bộ xử lý sử dụng để phân tích cú pháp các đối số. Phải là key của `assets` |
-| address | String or `null` | Không | Địa chỉ hợp đồng, nơi mà bắt đầu sự kiện hoặc cuộc gọi được thực hiện tới. ` null ` sẽ bắt các lệnh gọi tạo hợp đồng |
+| Trường | Kiểu dữ liệu | Bắt buộc | Mô tả |
+| ------- | ---------------- | -------- | ------------------------------------------------------------------------------------------------------------------ |
+| abi | String | Không | ABI được bộ xử lý sử dụng để phân tích cú pháp các đối số. Phải là key của `assets` |
+| address | String or `null` | Không | Địa chỉ hợp đồng, nơi mà bắt đầu sự kiện hoặc cuộc gọi được thực hiện tới. `null` sẽ bắt các lệnh gọi tạo hợp đồng |
## MoonbeamCall
@@ -44,10 +44,10 @@ Hoạt động giống như [substrate/CallHandler](../create/mapping/#call-hand
### Call Filters
-| Trường | Kiểu dữ liệu | Các ví dụ | Mô tả |
-| -------- | ------------ | --------------------------------------------- | ---------------------------------------------------------------------------------------------------------------------------------------------------------------- |
-| function | String | 0x095ea7b3, approve(address to,uint256 value) | Hoặc chuỗi [Function Signature](https://docs.ethers.io/v5/api/utils/abi/fragments/#FunctionFragment) hoặc hàm ` sighash ` dùng để lọc hàm được gọi trên hợp đồng |
-| from | String | 0x6bd193ee6d2104f14f94e2ca6efefae561a4334b | Một địa chỉ Ethereum đã gửi giao dịch |
+| Trường | Kiểu dữ liệu | Các ví dụ | Mô tả |
+| -------- | ------------ | --------------------------------------------- | -------------------------------------------------------------------------------------------------------------------------------------------------------------- |
+| function | String | 0x095ea7b3, approve(address to,uint256 value) | Hoặc chuỗi [Function Signature](https://docs.ethers.io/v5/api/utils/abi/fragments/#FunctionFragment) hoặc hàm `sighash` dùng để lọc hàm được gọi trên hợp đồng |
+| from | String | 0x6bd193ee6d2104f14f94e2ca6efefae561a4334b | Một địa chỉ Ethereum đã gửi giao dịch |
### Handlers
@@ -74,7 +74,8 @@ Hoạt động giống như [substrate/EventHandler](../create/mapping/#event-ha
| ------ | ------------ | --------------------------------------------------------------- | --------------------------------------------------------------------------------------------------------------------------------------------- |
| topics | String array | Transfer(address indexed from,address indexed to,uint256 value) | Bộ lọc chủ đề tuân theo bộ lọc nhật ký Ethereum JSON-PRC, bạn có thể tìm thêm tài liệu [tại đây](https://docs.ethers.io/v5/concepts/events/). |
-Ghi chú về các topic:
+**Ghi chú về các topic:**
+
Có một vài cải tiến từ các bộ lọc nhật ký cơ bản:
- Các Topics không cần thêm 0
@@ -123,7 +124,7 @@ dataSources:
# function: approve(address,uint256)
function: approve(address to,uint256 value)
from: '0x6bd193ee6d2104f14f94e2ca6efefae561a4334b'
-
+
```
diff --git a/docs/vi/faqs/faqs.md b/docs/vi/faqs/faqs.md
index 1dc9bf2b726..efe79da0385 100644
--- a/docs/vi/faqs/faqs.md
+++ b/docs/vi/faqs/faqs.md
@@ -16,7 +16,7 @@ SubQuery also provides free, production grade hosting of projects for developers
**Mạng SubQuery**
-The SubQuery Network allows developers to completely decentralise their infrastructure stack. It is the most open, performant, reliable, and scalable data service for dApps. Mạng SubQuery lập chỉ mục và dữ liệu dịch vụ cho cộng đồng toàn cầu theo cách được khuyến khích và có thể xác minh. Sau khi xuất bản dự án của bạn lên Mạng SubQuery, bất kỳ ai cũng có thể lập chỉ mục và lưu trữ nó - cung cấp dữ liệu cho người dùng trên toàn thế giới nhanh hơn và đáng tin cậy hơn.
+The SubQuery Network allows developers to completely decentralise their infrastructure stack. It is the most open, performant, reliable, and scalable data service for dApps. Mạng SubQuery lập chỉ mục và dữ liệu dịch vụ cho cộng đồng toàn cầu theo cách được khuyến khích và có thể xác minh. Sau khi xuất bản dự án của bạn lên Mạng SubQuery, bất kỳ ai cũng có thể lập chỉ mục và lưu trữ nó - cung cấp dữ liệu cho người dùng trên toàn thế giới nhanh hơn và đáng tin cậy hơn.
More information [here](/subquery_network/introduction.md).
@@ -26,7 +26,7 @@ The best way to get started with SubQuery is to try out our [Hello World tutoria
## Làm cách nào để tôi có thể đóng góp hoặc đưa ra phản hồi cho SubQuery?
-Chúng tôi rất mong nhận được ý kiến đóng góp và phản hồi từ cộng đồng. To contribute the code, fork the repository of your interest and make your changes. Sau đó hãy sử dụng chức năng Pull Request hay gọi tắt là PR. Don't forget to test as well. Also check out our contributions guidelines.
+Chúng tôi rất mong nhận được ý kiến đóng góp và phản hồi từ cộng đồng. To contribute the code, fork the repository of your interest and make your changes. Sau đó hãy sử dụng chức năng Pull Request hay gọi tắt là PR. Don't forget to test as well. Also check out our [contributions guidelines](../miscellaneous/contributing.html).
To give feedback, contact us at hello@subquery.network or jump onto our [discord channel](https://discord.com/invite/78zg8aBSMG).
@@ -76,7 +76,6 @@ subql-node -f . --force-clean --subquery-name=
Note that it is recommended to use `--force-clean` when changing the `startBlock` within the project manifest (`project.yaml`) in order to begin reindexing from the configured block. If `startBlock` is changed without a `--force-clean` of the project, then the indexer will continue indexing with the previously configured `startBlock`.
-
## How can I optimise my project to speed it up?
Performance is a crucial factor in each project. Fortunately, there are several things you could do to improve it. Here is the list of some suggestions:
@@ -89,13 +88,13 @@ Performance is a crucial factor in each project. Fortunately, there are several
- Set the start block to when the contract was initialised.
- Always use a [dictionary](../tutorials_examples/dictionary.html#how-does-a-subquery-dictionary-work) (we can help create one for your new network).
- Optimise your schema design, keep it as simple as possible.
- - Try to reduce unnecessary fields and columns.
- - Create indexes as needed.
+ - Try to reduce unnecessary fields and columns.
+ - Create indexes as needed.
- Use parallel/batch processing as often as possible.
- - Use `api.queryMulti()` to optimise Polkadot API calls inside mapping functions and query them in parallel. This is a faster way than a loop.
- - Use `Promise.all()`. In case of multiple async functions, it is better to execute them and resolve in parallel.
- - If you want to create a lot of entities within a single handler, you can use `store.bulkCreate(entityName: string, entities: Entity[])`. You can create them in parallel, no need to do this one by one.
+ - Use `api.queryMulti()` to optimise Polkadot API calls inside mapping functions and query them in parallel. This is a faster way than a loop.
+ - Use `Promise.all()`. In case of multiple async functions, it is better to execute them and resolve in parallel.
+ - If you want to create a lot of entities within a single handler, you can use `store.bulkCreate(entityName: string, entities: Entity[])`. You can create them in parallel, no need to do this one by one.
- Making API calls to query state can be slow. You could try to minimise calls where possible and to use `extrinsic/transaction/event` data.
- Use `worker threads` to move block fetching and block processing into its own worker thread. It could speed up indexing by up to 4 times (depending on the particular project). You can easily enable it using the `-workers=` flag. Note that the number of available CPU cores strictly limits the usage of worker threads. For now, it is only available for Substrate and Cosmos and will soon be integrated for Avalanche.
- Note that `JSON.stringify` doesn’t support native `BigInts`. Our logging library will do this internally if you attempt to log an object. We are looking at a workaround for this.
-- Use a convenient `modulo` filter to run a handler only once to a specific block. This filter allows handling any given number of blocks, which is extremely useful for grouping and calculating data at a set interval. For instance, if modulo is set to 50, the block handler will run on every 50 blocks. It provides even more control over indexing data to developers and can be implemented like so below in your project manifest.
\ No newline at end of file
+- Use the convenient `modulo` filter to run a handler only on specific blocks. This filter lets you handle any given multiple of blocks, which is extremely useful for grouping and calculating data at a set interval. For instance, if modulo is set to 50, the block handler will run only on every 50th block. It gives developers even more control over indexing data and can be implemented in your project manifest as shown below.
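Separately, a minimal sketch of the batch-processing suggestions above (`Promise.all`, `store.bulkCreate`) is shown below. The `Transfer` entity, its fields, and the extrinsic filtering are illustrative assumptions only; `store` is the global injected into every mapping sandbox.

```ts
import { SubstrateBlock } from "@subql/types";
import { Transfer } from "../types"; // hypothetical entity generated from schema.graphql

export async function handleBlock(block: SubstrateBlock): Promise<void> {
  const height = block.block.header.number.toBigInt();

  // Build all entities in memory first...
  const transfers = block.block.extrinsics
    .filter((ex) => ex.method.section === "balances" && ex.method.method === "transfer")
    .map((ex, idx) => {
      const t = new Transfer(`${height}-${idx}`);
      t.blockNumber = height;
      t.from = ex.signer.toString();
      t.to = ex.method.args[0].toString();
      t.amount = BigInt(ex.method.args[1].toString());
      return t;
    });

  // ...then persist them in one round trip instead of one save() per entity
  await store.bulkCreate("Transfer", transfers);

  // Independent state lookups can likewise be resolved in parallel, e.g.
  // const [now, session] = await Promise.all([api.query.timestamp.now(), api.query.session.currentIndex()]);
}
```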
diff --git a/docs/vi/miscellaneous/contributing.md b/docs/vi/miscellaneous/contributing.md
index 6f5d453bf34..3ce9f50226f 100644
--- a/docs/vi/miscellaneous/contributing.md
+++ b/docs/vi/miscellaneous/contributing.md
@@ -2,7 +2,7 @@
Chào mừng và chân thành cảm ơn bạn đã cân nhắc đóng góp cho dự án SubQuery! Cùng nhau, chúng ta có thể mở đường cho một tương lai phi tập trung hơn.
-::: info Note This documentation is actively maintained by the SubQuery team. We welcome your contributions. You can do so by forking our GitHub project and making changes to all the documentation markdown files under the `docs` directory. :::
+::: tip Note This documentation is actively maintained by the SubQuery team. We welcome your contributions. You can do so by forking our GitHub project and making changes to all the documentation markdown files under the `docs` directory. :::
Sau đây là một tập hợp các nguyên tắc (không phải quy tắc) để đóng góp vào SubQuery. Following these guidelines will help us make the contribution process easy and effective for everyone involved. It also communicates that you agree to respect the time of the developers managing and developing this project. In return, we will reciprocate that respect by addressing your issue, considering changes, collaborating on improvements, and helping you finalise your pull requests.
@@ -14,8 +14,8 @@ We take our open source community projects and responsibility seriously and hold
Contributions to our repositories are made through Issues and Pull Requests (PRs). A few general guidelines that cover both:
-* Tìm kiếm các Vấn đề và PR hiện có trước khi tự làm.
-* Chúng tôi làm việc chăm chỉ để đảm bảo các vấn đề được xử lý kịp thời nhưng tùy thuộc vào mức độ ảnh hưởng, có thể mất một khoảng thời gian để điều tra nguyên nhân gốc rễ. Đề cập @ thân thiện trong chuỗi nhận xét cho người gửi hoặc người đóng góp có thể giúp thu hút sự chú ý nếu vấn đề của bạn đang bị chặn.
+- Tìm kiếm các Vấn đề và PR hiện có trước khi tự làm.
+- Chúng tôi làm việc chăm chỉ để đảm bảo các vấn đề được xử lý kịp thời nhưng tùy thuộc vào mức độ ảnh hưởng, có thể mất một khoảng thời gian để điều tra nguyên nhân gốc rễ. Đề cập @ thân thiện trong chuỗi nhận xét cho người gửi hoặc người đóng góp có thể giúp thu hút sự chú ý nếu vấn đề của bạn đang bị chặn.
## Cách đóng góp
@@ -23,32 +23,32 @@ Contributions to our repositories are made through Issues and Pull Requests (PRs
Bugs are tracked as GitHub issues. When logging an issue, explain the problem and include additional details to help maintainers reproduce the problem:
-* Sử dụng tiêu đề rõ ràng và mang tính mô tả cho vấn đề để xác định vấn đề.
-* Mô tả các bước chính xác để tái tạo vấn đề.
-* Mô tả trạng thái bạn đã quan sát được sau khi làm theo các bước.
-* Giải thích hành vi nào bạn muốn thấy và tại sao.
-* Bao gồm ảnh chụp màn hình nếu có thể.
+- Sử dụng tiêu đề rõ ràng và mang tính mô tả cho vấn đề để xác định vấn đề.
+- Mô tả các bước chính xác để tái tạo vấn đề.
+- Mô tả trạng thái bạn đã quan sát được sau khi làm theo các bước.
+- Giải thích hành vi nào bạn muốn thấy và tại sao.
+- Bao gồm ảnh chụp màn hình nếu có thể.
### Gửi Pull Requests
In general, we follow the "fork-and-pull" Git workflow:
-* Fork the repository to your own Github account.
-* Clone the project to your machine.
-* Create a branch locally with a succinct but descriptive name.
-* Commit changes to the branch.
-* Following any formatting and testing guidelines specific to this repo.
-* Push changes to your fork.
-* Open a PR in our repository.
+- Fork the repository to your own Github account.
+- Clone the project to your machine.
+- Create a branch locally with a succinct but descriptive name.
+- Commit changes to the branch.
+- Follow any formatting and testing guidelines specific to this repo.
+- Push changes to your fork.
+- Open a PR in our repository.
## Quy ước mã hóa
### Thông báo cam kết Git
-* Use the present tense ("Add feature" not "Added feature").
-* Use the imperative mood ("Move cursor to..." not "Moves cursor to...").
-* Limit the first line to 72 characters or less.
+- Use the present tense ("Add feature" not "Added feature").
+- Use the imperative mood ("Move cursor to..." not "Moves cursor to...").
+- Limit the first line to 72 characters or less.
### Hướng dẫn định kiểu JavaScript
-* All JavaScript code is linted with Prettier and ESLint.
+- All JavaScript code is linted with Prettier and ESLint.
diff --git a/docs/vi/quickstart/helloworld-localhost.md b/docs/vi/quickstart/helloworld-localhost.md
index 9bfaf0bafa0..b9af1d7ca33 100644
--- a/docs/vi/quickstart/helloworld-localhost.md
+++ b/docs/vi/quickstart/helloworld-localhost.md
@@ -88,8 +88,8 @@ cd subqlHelloWorld
Bây giờ cài đặt yarn hoặc node để cài các phụ thuộc khác nhau.
- ```shell yarn install ```
- ```bash npm install ```
+::: code-tabs
+@tab:active yarn
+```shell
+yarn install
+```
+@tab npm
+```bash
+npm install
+```
+:::
Ví dụ `yarn install`
@@ -109,8 +109,8 @@ success Saved lockfile.
Bây giờ, hãy chạy `yarn codegen` để tạo Typescript từ sơ đồ GraphQL.
- ```shell yarn codegen ```
- ```bash npm run-script codegen ```
+::: code-tabs
+@tab:active yarn
+```shell
+yarn codegen
+```
+@tab npm
+```bash
+npm run-script codegen
+```
+:::
Ví dụ `yarn codegen`
@@ -133,8 +133,8 @@ $ ./node_modules/.bin/subql codegen
Bước tiếp theo là xây dựng mã với `yarn build`.
- ```shell yarn build ```
- ```bash npm run-script build ```
+::: code-tabs
+@tab:active yarn
+```shell
+yarn build
+```
+@tab npm
+```bash
+npm run-script build
+```
+:::
Ví dụ `yarn build`
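As a rough illustration of what `yarn codegen` produces, the snippet below shows a mapping function that imports one of the generated model classes. `StarterEntity` and its `field1` property follow the subql-starter template and are assumptions here; substitute whatever entities your own `schema.graphql` defines.

```ts
import { SubstrateBlock } from "@subql/types";
import { StarterEntity } from "../types"; // generated by `yarn codegen`

export async function handleBlock(block: SubstrateBlock): Promise<void> {
  // The generated class takes the entity id as its constructor argument
  const record = new StarterEntity(block.block.header.hash.toString());

  // Typed fields mirror the schema, so mistakes are caught at build time
  record.field1 = block.block.header.number.toNumber();

  await record.save();
}
```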
diff --git a/docs/vi/quickstart/quickstart-avalanche.md b/docs/vi/quickstart/quickstart-avalanche.md
index c84363246ad..9e1ff6cb1bd 100644
--- a/docs/vi/quickstart/quickstart-avalanche.md
+++ b/docs/vi/quickstart/quickstart-avalanche.md
@@ -59,15 +59,15 @@ Sau khi quá trình khởi tạo hoàn tất, bạn sẽ thấy một thư mục
Cuối cùng, trong thư mục dự án, chạy lệnh sau để cài đặt các phụ thuộc của dự án mới.
- ```shell cd PROJECT_NAME yarn install ```
- ```shell cd PROJECT_NAME npm install ```
+::: code-tabs
+@tab:active yarn
+```shell
+cd PROJECT_NAME
+yarn install
+```
+@tab npm
+```shell
+cd PROJECT_NAME
+npm install
+```
+:::
## Thực hiện các thay đổi trên dự án của bạn
Trong gói khởi đầu mà bạn vừa khởi tạo, chúng tôi đã cung cấp cấu hình tiêu chuẩn cho dự án của bạn. Bạn sẽ làm việc chủ yếu trên các tệp sau:
1. Lược đồ GraphQL ở `schema.graphql`
-2. Tệp Kê khai dự án ở ` project.yaml `
+2. Tệp Kê khai dự án ở `project.yaml`
3. Các chức năng ánh xạ trong thư mục `src/mappings/`
Mục tiêu của hướng dẫn nhanh này là điều chỉnh dự án khởi động tiêu chuẩn để lập chỉ mục tất cả `Phê duyệt` nhật ký giao dịch của Pangolin.
@@ -92,8 +92,8 @@ type PangolinApproval @entity {
**Quan trọng: Khi bạn thực hiện bất kỳ thay đổi nào đối với tệp lược đồ, hãy đảm bảo rằng bạn tạo lại thư mục types của mình. Thực hiện ngay bây giờ.**
- ```shell yarn codegen ```
- ```shell npm run-script codegen ```
+::: code-tabs
+@tab:active yarn
+```shell
+yarn codegen
+```
+@tab npm
+```shell
+npm run-script codegen
+```
+:::
Bạn sẽ tìm thấy các model đã tạo trong thư mục `/src/types/models`. Để biết thêm thông tin về tệp `schema.graphql`, hãy xem tài liệu của chúng tôi trong [Lược đồ Build/GraphQL ](../build/graphql.md)
@@ -169,7 +169,7 @@ Hàm này đang nhận nhật ký của Avalanche bao gồm dữ liệu truyền
Để chạy Dự án SubQuery mới của bạn trước tiên chúng tôi cần xây dựng công việc của mình. Chạy lệnh xây dựng từ thư mục gốc của dự án.
- ```shell yarn build ``` ```shell npm run-script build ```
+::: code-tabs
+@tab:active yarn
+```shell
+yarn build
+```
+@tab npm
+```shell
+npm run-script build
+```
+:::
**Quan trọng: Bất cứ khi nào bạn thực hiện các thay đổi đối với các hàm ánh xạ của mình, bạn sẽ cần phải xây dựng lại dự án của mình**
@@ -183,7 +183,7 @@ Tất cấu hình điều khiển cách chạy nút SubQuery được xác đị
Trong thư mục dự án chạy lệnh sau:
- ```shell yarn start:docker ``` ```shell npm run-script start:docker ```
+::: code-tabs
+@tab:active yarn
+```shell
+yarn start:docker
+```
+@tab npm
+```shell
+npm run-script start:docker
+```
+:::
Có thể mất một chút thời gian để tải xuống các gói cần thiết ([`@subql/node`](https://www.npmjs.com/package/@subql/node), [`@subql/query`](https://www.npmjs.com/package/@subql/query), và Postgres) cho lần đầu tiên, nhưng bạn sẽ sớm thấy một node SubQuery đang chạy. Hãy kiên nhẫn ở bước này.
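For reference, a minimal sketch of the log handler this quick start builds might look like the following. The `AvalancheLog` type, its exact fields, and the properties on `PangolinApproval` are assumptions; confirm them against the types generated in your own project and your schema.

```ts
import { AvalancheLog } from "@subql/types-avalanche"; // assumed type name
import { PangolinApproval } from "../types";

export async function handleLog(event: AvalancheLog): Promise<void> {
  // Unique id from the block hash and the log index within that block
  const approval = new PangolinApproval(`${event.blockHash}-${event.logIndex}`);

  approval.transactionHash = event.transactionHash;
  approval.blockNumber = event.blockNumber;
  // topics[1] and topics[2] carry the indexed owner/spender of Approval(address,address,uint256)
  approval.owner = event.topics[1];
  approval.spender = event.topics[2];

  await approval.save();
}
```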
diff --git a/docs/vi/quickstart/quickstart-cosmos.md b/docs/vi/quickstart/quickstart-cosmos.md
index ea824db4f7c..1211cf8077d 100644
--- a/docs/vi/quickstart/quickstart-cosmos.md
+++ b/docs/vi/quickstart/quickstart-cosmos.md
@@ -44,15 +44,15 @@ Sau khi quá trình khởi tạo hoàn tất, bạn sẽ thấy một thư mục
Cuối cùng, trong thư mục dự án, chạy lệnh sau để cài đặt các phụ thuộc của dự án mới.
- ```shell cd PROJECT_NAME yarn install ```
- ```shell cd PROJECT_NAME npm install ```
+::: code-tabs
+@tab:active yarn
+```shell
+cd PROJECT_NAME
+yarn install
+```
+@tab npm
+```shell
+cd PROJECT_NAME
+npm install
+```
+:::
## Thực hiện các thay đổi trên dự án của bạn
Trong gói khởi đầu mà bạn vừa khởi tạo, chúng tôi đã cung cấp cấu hình tiêu chuẩn cho dự án của bạn. Bạn sẽ làm việc chủ yếu trên các tệp sau:
1. Lược đồ GraphQL ở `schema.graphql`
-2. Tệp Kê khai dự án ở ` project.yaml `
+2. Tệp Kê khai dự án ở `project.yaml`
3. Các chức năng ánh xạ trong thư mục `src/mappings/`
Mục tiêu của hướng dẫn bắt đầu nhanh này là điều chỉnh dự án khởi đầu tiêu chuẩn để bắt đầu lập chỉ mục tất cả các giao dịch từ hợp đồng thông minh bLuna.
@@ -75,8 +75,8 @@ type Vote @entity {
**Quan trọng: Khi bạn thực hiện bất kỳ thay đổi nào đối với tệp lược đồ, hãy đảm bảo rằng bạn tạo lại thư mục types của mình. Thực hiện ngay bây giờ.**
- ```shell yarn codegen ```
- ```shell npm run-script codegen ```
+::: code-tabs
+@tab:active yarn
+```shell
+yarn codegen
+```
+@tab npm
+```shell
+npm run-script codegen
+```
+:::
Bạn sẽ tìm thấy các model đã tạo trong thư mục `/src/types/models`. Để biết thêm thông tin về tệp `schema.graphql`, hãy xem tài liệu của chúng tôi trong [Lược đồ Build/GraphQL ](../build/graphql.md)
@@ -145,7 +145,7 @@ Hàm này đang nhận CosmosMessage bao gồm dữ liệu tin nhắn trên tr
Để chạy Dự án SubQuery mới của bạn trước tiên chúng tôi cần xây dựng công việc của mình. Chạy lệnh xây dựng từ thư mục gốc của dự án.
- ```shell yarn build ``` ```shell npm run-script build ```
+::: code-tabs
+@tab:active yarn
+```shell
+yarn build
+```
+@tab npm
+```shell
+npm run-script build
+```
+:::
**Quan trọng: Bất cứ khi nào bạn thực hiện các thay đổi đối với các hàm ánh xạ của mình, bạn sẽ cần phải xây dựng lại dự án của mình**
@@ -159,7 +159,7 @@ Tất cả cấu hình kiểm soát cách chạy node SubQuery được định
Trong thư mục dự án chạy lệnh sau:
- ```shell yarn start:docker ``` ```shell npm run-script start:docker ```
+::: code-tabs
+@tab:active yarn
+```shell
+yarn start:docker
+```
+@tab npm
+```shell
+npm run-script start:docker
+```
+:::
Có thể mất một chút thời gian để tải xuống các gói cần thiết ([`@subql/node`](https://www.npmjs.com/package/@subql/node), [`@subql/query`](https://www.npmjs.com/package/@subql/query), và Postgres) cho lần đầu tiên, nhưng bạn sẽ sớm thấy một node SubQuery đang chạy. Hãy kiên nhẫn ở bước này.
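For reference, a minimal sketch of the `handleMessage` mapping described earlier in this quick start is shown below. The decoded message shape (`voter`, `proposal_id`) and the exact fields on the `Vote` entity are assumptions; check them against your generated types and `schema.graphql`.

```ts
import { CosmosMessage } from "@subql/types-cosmos";
import { Vote } from "../types";

// Assumed shape of the decoded vote message
interface VoteMsg {
  voter: string;
  proposal_id: string;
}

export async function handleMessage(msg: CosmosMessage<VoteMsg>): Promise<void> {
  // One entity per message, keyed by transaction hash and message index
  const vote = new Vote(`${msg.tx.hash}-${msg.idx}`);

  vote.blockHeight = BigInt(msg.block.block.header.height);
  vote.voter = msg.msg.decodedMsg.voter;
  vote.proposalID = msg.msg.decodedMsg.proposal_id;

  await vote.save();
}
```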
@@ -173,10 +173,9 @@ Bạn sẽ thấy một sân chơi GraphQL đang hiển thị trong explorer và
```graphql
query {
- votes(
+ votes(
first: 5
- orderBy: BLOCK_HEIGHT_DESC
- # filter: {proposalID: {equalTo: "4"}}
+ orderBy: BLOCK_HEIGHT_DESC # filter: {proposalID: {equalTo: "4"}}
) {
nodes {
id
diff --git a/docs/vi/quickstart/quickstart-polkadot.md b/docs/vi/quickstart/quickstart-polkadot.md
index 5a55d2b1dee..65d890f7ffa 100644
--- a/docs/vi/quickstart/quickstart-polkadot.md
+++ b/docs/vi/quickstart/quickstart-polkadot.md
@@ -43,10 +43,10 @@ subql init
Bạn sẽ được hỏi một số câu hỏi khi dự án SubQuery được khởi tạo:
- Project name: Tên dự án SubQuery của bạn
-- Network family: Một mạng blockchain layer-1 mà dự án SubQuery này sẽ được phát triển để lập chỉ mục. Sử dụng các phím mũi tên để chọn từ các tùy chọn có sẵn. Đối với hướng dẫn này, chúng tôi sẽ sử dụng *"Substrate"*
-- Network: Network cụ thể mà dự án SubQuery này sẽ được phát triển để lập chỉ mục. Sử dụng các phím mũi tên để chọn từ các tùy chọn có sẵn. Đối với hướng dẫn này, chúng tôi sẽ sử dụng *"Polkadot"*
-- Template project: Chọn một dự án mẫu SubQuery sẽ cung cấp một điểm khởi đầu để bắt đầu phát triển. Chúng tôi khuyên bạn nên chọn dự án *"subql-starter"*.
-- RPC endpoint: Cung cấp HTTPS URL cho RPC endpoint đang chạy, sẽ được sử dụng mặc định cho dự án này. Bạn có thể nhanh chóng truy cập các điểm cuối công khai cho các mạng Polkadot khác nhau, tạo node chuyên dụng riêng của mình bằng cách sử dụng [OnFinality](https://app.onfinality.io) hoặc chỉ sử dụng điểm cuối Polkadot mặc định. Nút RPC này phải là một nút lưu trữ (có trạng thái chuỗi đầy đủ). Đối với hướng dẫn này, chúng tôi sẽ sử dụng giá trị mặc định *"https://polkadot.api.onfinality.io"*
+- Network family: Một mạng blockchain layer-1 mà dự án SubQuery này sẽ được phát triển để lập chỉ mục. Sử dụng các phím mũi tên để chọn từ các tùy chọn có sẵn. Đối với hướng dẫn này, chúng tôi sẽ sử dụng _"Substrate"_
+- Network: Network cụ thể mà dự án SubQuery này sẽ được phát triển để lập chỉ mục. Sử dụng các phím mũi tên để chọn từ các tùy chọn có sẵn. Đối với hướng dẫn này, chúng tôi sẽ sử dụng _"Polkadot"_
+- Template project: Chọn một dự án mẫu SubQuery sẽ cung cấp một điểm khởi đầu để bắt đầu phát triển. Chúng tôi khuyên bạn nên chọn dự án _"subql-starter"_.
+- RPC endpoint: Cung cấp HTTPS URL cho RPC endpoint đang chạy, sẽ được sử dụng mặc định cho dự án này. Bạn có thể nhanh chóng truy cập các điểm cuối công khai cho các mạng Polkadot khác nhau, tạo node chuyên dụng riêng của mình bằng cách sử dụng [OnFinality](https://app.onfinality.io) hoặc chỉ sử dụng điểm cuối Polkadot mặc định. Nút RPC này phải là một nút lưu trữ (có trạng thái chuỗi đầy đủ). Đối với hướng dẫn này, chúng tôi sẽ sử dụng giá trị mặc định _"https://polkadot.api.onfinality.io"_
- Git repository: Cung cấp Git URL cho repo mà dự án SubQuery này sẽ được lưu trữ (khi được lưu trữ trong SubQuery Explorer) hoặc chấp nhận giá trị mặc định được cung cấp.
- Authors: Nhập chủ sở hữu của dự án SubQuery này tại đây (ví dụ: tên của bạn!) Hoặc chấp nhận giá trị mặc định đã cung cấp.
- Description: Cung cấp một đoạn giới thiệu ngắn về dự án của bạn, mô tả dự án chứa dữ liệu gì và người dùng có thể làm gì với dự án đó hoặc chấp nhận giá trị mặc định đã cung cấp.
@@ -57,14 +57,14 @@ Sau khi quá trình khởi tạo hoàn tất, bạn sẽ thấy một thư mục
Cuối cùng, trong thư mục dự án, chạy lệnh sau để cài đặt các phụ thuộc của dự án mới.
- ```shell cd PROJECT_NAME yarn install ```
- ```shell cd PROJECT_NAME npm install ```
+::: code-tabs
+@tab:active yarn
+```shell
+cd PROJECT_NAME
+yarn install
+```
+@tab npm
+```shell
+cd PROJECT_NAME
+npm install
+```
+:::
## Thực hiện các thay đổi đối với dự án của bạn
Trong gói khởi động vừa được khởi tạo, một cấu hình tiêu chuẩn đã được cung cấp. Đó là:
-1. Lược đồ GraphQL trong ` schema.graphql `
+1. Lược đồ GraphQL trong `schema.graphql`
2. Tệp Kê khai dự án trong `project.yaml`
3. Các hàm ánh xạ trong thư mục `src/mappings/`
@@ -88,8 +88,8 @@ type Transfer @entity {
**Quan trọng: Khi bạn thực hiện bất kỳ thay đổi nào đối với tệp lược đồ, hãy đảm bảo rằng bạn tạo lại thư mục types của mình.**
- ```shell yarn codegen ```
- ```shell npm run-script codegen ```
+::: code-tabs
+@tab:active yarn
+```shell
+yarn codegen
+```
+@tab npm
+```shell
+npm run-script codegen
+```
+:::
Bạn sẽ tìm thấy các mô hình đã tạo trong thư mục `/src/types/models`. Để biết thêm thông tin về tệp `schema.graphql`, hãy xem tài liệu của chúng tôi trong [Lược đồ Build/GraphQL ](../build/graphql.md)
@@ -133,22 +133,22 @@ import { Transfer } from "../types";
import { Balance } from "@polkadot/types/interfaces";
export async function handleEvent(event: SubstrateEvent): Promise<void> {
- // Get data from the event
- // The balances.transfer event has the following payload \[from, to, value\]
- // logger.info(JSON.stringify(event));
- const from = event.event.data[0];
- const to = event.event.data[1];
- const amount = event.event.data[2];
-
- // Create the new transfer entity
- const transfer = new Transfer(
- `${event.block.block.header.number.toNumber()}-${event.idx}`,
- );
- transfer.blockNumber = event.block.block.header.number.toBigInt();
- transfer.from = from.toString();
- transfer.to = to.toString();
- transfer.amount = (amount as Balance).toBigInt();
- await transfer.save();
+ // Get data from the event
+ // The balances.transfer event has the following payload \[from, to, value\]
+ // logger.info(JSON.stringify(event));
+ const from = event.event.data[0];
+ const to = event.event.data[1];
+ const amount = event.event.data[2];
+
+ // Create the new transfer entity
+ const transfer = new Transfer(
+ `${event.block.block.header.number.toNumber()}-${event.idx}`
+ );
+ transfer.blockNumber = event.block.block.header.number.toBigInt();
+ transfer.from = from.toString();
+ transfer.to = to.toString();
+ transfer.amount = (amount as Balance).toBigInt();
+ await transfer.save();
}
```
@@ -160,7 +160,7 @@ Hàm này đang nhận SubstrateEvent bao gồm dữ liệu truyền tải trên
Để chạy Dự án SubQuery mới, trước tiên chúng ta cần xây dựng công việc của mình. Chạy lệnh xây dựng từ thư mục gốc của dự án.
- ```shell yarn build ``` ```shell npm run-script build ```
+::: code-tabs
+@tab:active yarn
+```shell
+yarn build
+```
+@tab npm
+```shell
+npm run-script build
+```
+:::
**Quan trọng: Bất cứ khi nào bạn thực hiện các thay đổi đối với các hàm ánh xạ của mình, bạn sẽ cần phải xây dựng lại dự án của mình**
@@ -174,7 +174,7 @@ Tất cả cấu hình kiểm soát cách chạy node SubQuery được định
Trong thư mục dự án, hãy chạy lệnh sau:
- ```shell yarn start:docker ``` ```shell npm run-script start:docker ```
+::: code-tabs
+@tab:active yarn
+```shell
+yarn start:docker
+```
+@tab npm
+```shell
+npm run-script start:docker
+```
+:::
Có thể mất một chút thời gian để tải xuống các gói cần thiết ([`@subql/node`](https://www.npmjs.com/package/@subql/node), [`@subql/query`](https://www.npmjs.com/package/@subql/query), và Postgres) cho lần đầu tiên, nhưng bạn sẽ sớm thấy một nút SubQuery đang chạy trong màn hình đầu cuối.
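Once the node and query service are running, the playground query shown in the next section (`transfers(first: 10, orderBy: AMOUNT_DESC)`) can also be sent programmatically. Below is a small sketch using the built-in `fetch` (Node 18+ or a browser); the endpoint `http://localhost:3000` is the port the starter `docker-compose.yml` typically maps for the query service and is an assumption here.

```ts
// Mirrors the playground query shown below
const QUERY = `
{
  query {
    transfers(first: 10, orderBy: AMOUNT_DESC) {
      nodes {
        id
        amount
      }
    }
  }
}
`;

async function fetchTransfers(): Promise<void> {
  const res = await fetch("http://localhost:3000/", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ query: QUERY }),
  });
  const { data } = await res.json();
  console.log(data.query.transfers.nodes);
}

fetchTransfers().catch(console.error);
```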
@@ -189,10 +189,7 @@ Bạn sẽ thấy một playground GraphQL trong trình duyệt và các lược
```graphql
{
query {
- transfers(
- first: 10,
- orderBy: AMOUNT_DESC
- ) {
+ transfers(first: 10, orderBy: AMOUNT_DESC) {
nodes {
id
amount
diff --git a/docs/vi/quickstart/quickstart-terra.md b/docs/vi/quickstart/quickstart-terra.md
index 5382ffa3fa4..c9f14f9c05f 100644
--- a/docs/vi/quickstart/quickstart-terra.md
+++ b/docs/vi/quickstart/quickstart-terra.md
@@ -45,11 +45,11 @@ subql init
Bạn sẽ được hỏi một số câu hỏi khi dự án SubQuery được khởi tạo:
- Project Name: Tên dự án SubQuery của bạn
-- Network Family: Mạng blockchain layer-1 mà dự án Subquery sẽ được phát triển để lập chỉ mục, dùng dấu mũi tên để di chuyển giữa các lựa chọn, trong bài hướng dẫn này, chúng ta sẽ sử dụng *"Terra"*
-- Network: Mạng cụ thể mà dự án SubQuery này sẽ được phát triển để lập chỉ mục, dùng phím mũi tên để di chuyển giữa các lựa chọn, trong bài hướng dẫn này chúng ta sẽ dùng *"Terra"*
-- Template: Chọn mẫu dự án SubQuery để bắt đầu phát triển, chúng tôi gợi ý bạn chọn *"Starter project"*
+- Network Family: Mạng blockchain layer-1 mà dự án Subquery sẽ được phát triển để lập chỉ mục, dùng dấu mũi tên để di chuyển giữa các lựa chọn, trong bài hướng dẫn này, chúng ta sẽ sử dụng _"Terra"_
+- Network: Mạng cụ thể mà dự án SubQuery này sẽ được phát triển để lập chỉ mục, dùng phím mũi tên để di chuyển giữa các lựa chọn, trong bài hướng dẫn này chúng ta sẽ dùng _"Terra"_
+- Template: Chọn mẫu dự án SubQuery để bắt đầu phát triển, chúng tôi gợi ý bạn chọn _"Starter project"_
- Git repository (Tùy chọn): Cung cấp URL Git cho kho lưu trữ dự án SubQuery này (khi được lưu trữ trong SubQuery Explorer)
-- RPC endpoint (Bắt buộc): Cung cấp URL HTTPS cho điểm cuối RPC đang chạy, sẽ được sử dụng mặc định cho dự án này. Nút RPC này phải là một nút lưu trữ (có trạng thái chuỗi đầy đủ). Đối với hướng dẫn này, chúng tôi sẽ sử dụng giá trị mặc định *"https://terra-columbus-5.beta.api.onfinality.io"*
+- RPC endpoint (Bắt buộc): Cung cấp URL HTTPS cho điểm cuối RPC đang chạy, sẽ được sử dụng mặc định cho dự án này. Nút RPC này phải là một nút lưu trữ (có trạng thái chuỗi đầy đủ). Đối với hướng dẫn này, chúng tôi sẽ sử dụng giá trị mặc định _"https://terra-columbus-5.beta.api.onfinality.io"_
- Authors (Bắt buộc): Nhập chủ sở hữu của dự án SubQuery này tại đây (ví dụ: tên bạn!)
- Description (Tùy chọn): Bạn có thể cung cấp một đoạn giới thiệu ngắn về dự án của mình, mô tả dự án chứa dữ liệu gì và người dùng có thể làm gì với dự án
- Version (Bắt buộc): Nhập số phiên bản tùy chỉnh hoặc sử dụng giá trị mặc định (`1.0.0`)
@@ -59,15 +59,15 @@ Sau khi quá trình khởi tạo hoàn tất, bạn sẽ thấy một thư mục
Cuối cùng, trong thư mục dự án, chạy lệnh sau để cài đặt các phụ thuộc của dự án mới.
- ```shell cd PROJECT_NAME yarn install ```
- ```shell cd PROJECT_NAME npm install ```
+::: code-tabs
+@tab:active yarn
+```shell
+cd PROJECT_NAME
+yarn install
+```
+@tab npm
+```shell
+cd PROJECT_NAME
+npm install
+```
+:::
## Thực hiện các thay đổi trên dự án của bạn
Trong gói khởi đầu mà bạn vừa khởi tạo, chúng tôi đã cung cấp cấu hình tiêu chuẩn cho dự án của bạn. Bạn sẽ làm việc chủ yếu trên các tệp sau:
1. Lược đồ GraphQL ở `schema.graphql`
-2. Tệp Kê khai dự án ở ` project.yaml `
+2. Tệp Kê khai dự án ở `project.yaml`
3. Các chức năng ánh xạ trong thư mục `src/mappings/`
Mục tiêu của hướng dẫn bắt đầu nhanh này là điều chỉnh dự án khởi đầu tiêu chuẩn để bắt đầu lập chỉ mục tất cả các giao dịch từ hợp đồng thông minh bLuna.
@@ -91,8 +91,8 @@ type Transfer @entity {
**Quan trọng: Khi bạn thực hiện bất kỳ thay đổi nào đối với tệp lược đồ, hãy đảm bảo rằng bạn tạo lại thư mục types của mình. Thực hiện ngay bây giờ.**
- ```shell yarn codegen ```
- ```shell npm run-script codegen ```
+::: code-tabs
+@tab:active yarn
+```shell
+yarn codegen
+```
+@tab npm
+```shell
+npm run-script codegen
+```
+:::
Bạn sẽ tìm thấy các model đã tạo trong thư mục `/src/types/models`. Để biết thêm thông tin về tệp `schema.graphql`, hãy xem tài liệu của chúng tôi trong [Lược đồ Build/GraphQL ](../build/graphql.md)
@@ -143,30 +143,30 @@ import { MsgExecuteContract } from "@terra-money/terra.js";
export async function handleEvent(
event: TerraEvent
): Promise<void> {
- // Print debugging data from the event
- // logger.info(JSON.stringify(event));
-
- // Create the new transfer entity with a unique ID
- const transfer = new Transfer(
- `${event.tx.tx.txhash}-${event.msg.idx}-${event.idx}`
- );
- transfer.blockHeight = BigInt(event.block.block.block.header.height);
- transfer.txHash = event.tx.tx.txhash;
- for (const attr of event.event.attributes) {
- switch (attr.key) {
- case "sender":
- transfer.sender = attr.value;
- break;
- case "recipient":
- transfer.recipient = attr.value;
- break;
- case "amount":
- transfer.amount = attr.value;
- break;
- default:
- }
+ // Print debugging data from the event
+ // logger.info(JSON.stringify(event));
+
+ // Create the new transfer entity with a unique ID
+ const transfer = new Transfer(
+ `${event.tx.tx.txhash}-${event.msg.idx}-${event.idx}`
+ );
+ transfer.blockHeight = BigInt(event.block.block.block.header.height);
+ transfer.txHash = event.tx.tx.txhash;
+ for (const attr of event.event.attributes) {
+ switch (attr.key) {
+ case "sender":
+ transfer.sender = attr.value;
+ break;
+ case "recipient":
+ transfer.recipient = attr.value;
+ break;
+ case "amount":
+ transfer.amount = attr.value;
+ break;
+ default:
}
- await transfer.save();
+ }
+ await transfer.save();
}
```
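One practical note on the handler above: the `amount` attribute arrives as a plain string. If your schema stores the numeric amount and the denom separately, a small helper like the hypothetical `parseCoinAmount` below can split it; the `"1000000uluna"` format is an assumption about the event data.

```ts
interface CoinAmount {
  amount: bigint;
  denom: string;
}

// Splits a coin string such as "1000000uluna" into its numeric part and denom.
// Returns null if the string does not match the assumed format.
function parseCoinAmount(raw: string): CoinAmount | null {
  const match = /^(\d+)([a-zA-Z\/]+)$/.exec(raw.trim());
  if (!match) return null;
  return { amount: BigInt(match[1]), denom: match[2] };
}

// Example: parseCoinAmount("1000000uluna") -> { amount: 1000000n, denom: "uluna" }
```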
@@ -178,7 +178,7 @@ Hàm này đang nhận SubstrateEvent bao gồm dữ liệu truyền tải trên
Để chạy Dự án SubQuery mới của bạn trước tiên chúng tôi cần xây dựng công việc của mình. Chạy lệnh xây dựng từ thư mục gốc của dự án.
- ```shell yarn build ``` ```shell npm run-script build ```
+::: code-tabs
+@tab:active yarn
+```shell
+yarn build
+```
+@tab npm
+```shell
+npm run-script build
+```
+:::
**Quan trọng: Bất cứ khi nào bạn thực hiện các thay đổi đối với các hàm ánh xạ của mình, bạn sẽ cần phải xây dựng lại dự án của mình**
@@ -192,7 +192,7 @@ Tất cả cấu hình kiểm soát cách chạy node SubQuery được định
Trong thư mục dự án chạy lệnh sau:
- ```shell yarn start:docker ``` ```shell npm run-script start:docker ```
+::: code-tabs
+@tab:active yarn
+```shell
+yarn start:docker
+```
+@tab npm
+```shell
+npm run-script start:docker
+```
+:::
Có thể mất một chút thời gian để tải xuống các gói cần thiết ([`@subql/node`](https://www.npmjs.com/package/@subql/node), [`@subql/query`](https://www.npmjs.com/package/@subql/query), và Postgres) cho lần đầu tiên, nhưng bạn sẽ sớm thấy một node SubQuery đang chạy. Hãy kiên nhẫn ở bước này.
@@ -207,10 +207,7 @@ Bạn sẽ thấy một sân chơi GraphQL đang hiển thị trong explorer và
```graphql
{
query {
- transfers(
- first: 10,
- orderBy: ID_DESC
- ) {
+ transfers(first: 10, orderBy: ID_DESC) {
nodes {
id
txHash
diff --git a/docs/vi/quickstart/quickstart.md b/docs/vi/quickstart/quickstart.md
index fded447ca36..a098fb34ed6 100644
--- a/docs/vi/quickstart/quickstart.md
+++ b/docs/vi/quickstart/quickstart.md
@@ -89,8 +89,8 @@ Sau khi quá trình khởi tạo hoàn tất, bạn sẽ thấy một thư mục
Cuối cùng, chạy lệnh sau để cài đặt các phụ thuộc từ bên trong thư mục của dự án mới.
- ```shell cd PROJECT_NAME yarn install ```
- ```shell cd PROJECT_NAME npm install ```
+::: code-tabs
+@tab:active yarn
+```shell
+cd PROJECT_NAME
+yarn install
+```
+@tab npm
+```shell
+cd PROJECT_NAME
+npm install
+```
+:::
Bây giờ bạn đã khởi tạo dự án SubQuery đầu tiên của mình chỉ với một vài bước đơn giản. Bây giờ chúng ta hãy tùy chỉnh dự án mẫu chuẩn cho một chuỗi khối cụ thể mà bạn quan tâm.
@@ -104,4 +104,4 @@ Có 3 tệp quan trọng cần được sửa đổi. Đó là:
2. Tệp Kê khai dự án trong `project.yaml`.
3. Các hàm ánh xạ trong thư mục `src/mappings/`.
-SubQuery hỗ trợ các mạng blockchain khác nhau và cung cấp hướng dẫn riêng cho từng mạng. Chọn blockchain ưa thích của bạn tối đa là 2. Chuỗi cụ thể và tiếp tục hướng dẫn bắt đầu nhanh.
\ No newline at end of file
+SubQuery hỗ trợ các mạng blockchain khác nhau và cung cấp hướng dẫn riêng cho từng mạng. Chọn blockchain ưa thích của bạn trong mục 2. Chuỗi cụ thể và tiếp tục hướng dẫn bắt đầu nhanh.
diff --git a/docs/vi/run_publish/connect.md b/docs/vi/run_publish/connect.md
index 33822d38bf1..70763e7b11f 100644
--- a/docs/vi/run_publish/connect.md
+++ b/docs/vi/run_publish/connect.md
@@ -2,10 +2,10 @@
Once your deployment has successfully completed and our nodes have indexed your data from the chain, you'll be able to connect to your project via the displayed Query endpoint.
-![Các dự án đang được triển khai và đồng bộ](/assets/img/projects-deploy-sync.png)
+![Các dự án đang được triển khai và đồng bộ](/assets/img/projects_deploy_sync.png)
Ngoài ra, bạn có thể nhấp vào ba dấu chấm bên cạnh tiêu đề dự án của mình và xem nó trên SubQuery Explorer. There you can use the in browser playground to get started.
-![Projects in SubQuery Explorer](/assets/img/projects-explorer.png)
+![Projects in SubQuery Explorer](/assets/img/projects_explorer.png)
-::: info Note Learn more about the [GraphQL Query language.](./graphql.md) :::
+::: tip Note Learn more about the [GraphQL Query language.](./graphql.md) :::
diff --git a/docs/vi/run_publish/query.md b/docs/vi/run_publish/query.md
index 663322b4171..a6f24c53b22 100644
--- a/docs/vi/run_publish/query.md
+++ b/docs/vi/run_publish/query.md
@@ -12,4 +12,4 @@ Bạn cũng sẽ lưu ý rằng SubQuery Explorer cung cấp một sân chơi đ
On the top right of the playground, you'll find a _Docs_ button that will open a documentation drawer. Tài liệu này được tạo tự động và giúp bạn tìm thấy những thực thể và phương pháp nào bạn có thể truy vấn.
-::: info Note Learn more about the [GraphQL Query language.](./graphql.md) :::
+::: tip Note Learn more about the [GraphQL Query language.](./graphql.md) :::
diff --git a/docs/vi/run_publish/references.md b/docs/vi/run_publish/references.md
index 71de81cf4e2..b7f86c7a6bd 100644
--- a/docs/vi/run_publish/references.md
+++ b/docs/vi/run_publish/references.md
@@ -21,10 +21,10 @@ COMMANDS
This command uses webpack to generate a bundle of a subquery project.
-| Options | Mô tả |
-| ------------------ | ---------------------------------------------------------------------------------------------------------- |
-| -l, --location | local folder of subquery project (if not in folder already) |
-| -o, --output | specify output folder of build e.g. build-folder |
+| Options                                            | Mô tả                                                        |
+| -------------------------------------------------- | ------------------------------------------------------------ |
+| -l, --location                                     | local folder of subquery project (if not in folder already)  |
+| -o, --output                                       | specify output folder of build e.g. build-folder              |
| --mode=(production \| prod \| development \| dev)  | [ default: production ]                                       |
- With `subql build` you can specify additional entry points in exports field although it will always build `index.ts` automatically.
@@ -106,7 +106,7 @@ This displays the current version.
### reindex
-:::warning In order to use this command, you require `@subql/node:v1.10.0`/`@subql/node-:v1.10.0` or above. :::
+:::warning In order to use this command, you require `@subql/node:v1.10.0`/`@subql/node-YOURNETWORK:v1.10.0` or above. :::
When using reindex command, historical must be enabled for the targeted project (`--disable-historical=false`). After starting the project, it would print out a log stating if historical is enabled or not.
@@ -122,7 +122,7 @@ If the `targetHeight` is less than the declared starting height, it will execute
subql-node -f /example/subql-project reindex --targetHeight=30
```
-::: info Note
+::: tip Note
Once the command is executed and the state has been rolled back to the specified height, the application will exit. You can then start up the indexer to proceed again from this height.
:::
@@ -134,7 +134,7 @@ This command forces the project schemas and tables to be regenerated. It is help
`-f`, `--subquery` flag must be passed in to set the path of the targeted project.
-::: info Note Similar to `reindex` command, the application would exit upon completion. :::
+::: tip Note Similar to `reindex` command, the application would exit upon completion. :::
```shell
subql-node -f /example/subql-project force-clean
@@ -346,7 +346,7 @@ This will move block fetching and processing into a worker. By default, this fea
It is at an early experimental stage at the moment, but we plan to enable it by default. :::
-::: info Note
+::: tip Note
This feature is available for Substrate and Cosmos, and soon will be integrated for Avalanche.
:::
diff --git a/docs/vi/run_publish/run.md b/docs/vi/run_publish/run.md
index 55bf75c1029..c4a78773653 100644
--- a/docs/vi/run_publish/run.md
+++ b/docs/vi/run_publish/run.md
@@ -4,7 +4,7 @@ This guide works through how to run a local SubQuery node on your infrastructure
## Using Docker
-An alternative solution is to run a Docker Container, defined by the `docker-compose.yml` file. For a new project that has been just initialised you won't need to change anything here.
+An alternative solution is to run a **Docker Container**, defined by the `docker-compose.yml` file. For a new project that has just been initialised, you won't need to change anything here.
Under the project directory run the following command:
@@ -12,7 +12,7 @@ Under the project directory run the following command:
docker-compose pull && docker-compose up
```
-::: info Note It may take some time to download the required packages ([`@subql/node`](https://www.npmjs.com/package/@subql/node), [`@subql/query`](https://www.npmjs.com/package/@subql/query), and Postgres) for the first time but soon you'll see a running SubQuery node. :::
+::: tip Note It may take some time to download the required packages ([`@subql/node`](https://www.npmjs.com/package/@subql/node), [`@subql/query`](https://www.npmjs.com/package/@subql/query), and Postgres) for the first time but soon you'll see a running SubQuery node. :::
## Running an Indexer (subql/node)
@@ -32,90 +32,80 @@ CREATE EXTENSION IF NOT EXISTS btree_gist;
### Installation
-
-
+::: code-tabs
+@tab Substrate/Polkadot
```shell
# NPM
npm install -g @subql/node
```
-
-
+@tab Terra
```shell
# NPM
npm install -g @subql/node-terra
```
-
-
+@tab Avalanche
```shell
# NPM
npm install -g @subql/node-avalanche
```
-
-
+@tab Cosmos
```shell
# NPM
npm install -g @subql/node-cosmos
```
-
-
+@tab Algorand
```shell
# NPM
npm install -g @subql/node-algorand
```
-
-
+:::
::: danger Please note that we **DO NOT** encourage the use of `yarn global` due to its poor dependency management which may lead to errors down the line. :::
Once installed, you can start a node with the following command:
-
-
+::: code-tabs
+@tab Substrate/Polkadot
```shell
subql-node
```
-
-
+@tab Terra
```shell
subql-node-terra
```
-
-
+@tab Avalanche
```shell
subql-node-avalanche
```
-
-
+@tab Cosmos
```shell
subql-node-cosmos
```
-
-
+@tab Algorand
```shell
subql-node-algorand
```
-
-
+:::
### Key Commands
@@ -123,43 +113,38 @@ The following commands will assist you to complete the configuration of a SubQue
#### Point to local project path
-
-
+::: code-tabs
+@tab Substrate/Polkadot
```shell
subql-node -f your-project-path
```
-
-
+@tab Terra
```shell
subql-node-terra -f your-project-path
```
-
-
+@tab Avalanche
```shell
subql-node-avalanche -f your-project-path
```
-
-
+@tab Cosmos
```shell
subql-node-cosmos -f your-project-path
```
-
-
+@tab Algorand
```shell
subql-node-algorand -f your-project-path
```
-
-
+:::
#### Connect to database
@@ -176,43 +161,38 @@ Depending on the configuration of your Postgres database (e.g. a different datab
#### Specify a configuration file
-
-
+::: code-tabs
+@tab Substrate/Polkadot
```shell
subql-node -c your-project-config.yml
```
-
-
+@tab Terra
```shell
subql-node-terra -c your-project-config.yml
```
-
-
+@tab Avalanche
```shell
subql-node-avalanche -c your-project-config.yml
```
-
-
+@tab Cosmos
```shell
subql-node-cosmos -c your-project-config.yml
```
-
-
+@tab Algorand
```shell
subql-node-algorand -c your-project-config.yml
```
-
-
+:::
This will point the query node to a manifest file which can be in YAML or JSON format.
@@ -230,43 +210,38 @@ When the indexer first indexes the chain, fetching single blocks will significan
#### Run in local mode
-
-
+::: code-tabs
+@tab Substrate/Polkadot
```shell
subql-node -f your-project-path --local
```
-
-
+@tab Terra
```shell
subql-node-terra -f your-project-path --local
```
-
-
+@tab Avalanche
```shell
subql-node-avalanche -f your-project-path --local
```
-
-
+@tab Cosmos
```shell
subql-node-cosmos -f your-project-path --local
```
-
-
+@tab Algorand
```shell
subql-node-algorand -f your-project-path --local
```
-
-
+:::
For debugging purposes, users can run the node in local mode. Switching to local mode will create Postgres tables in the default schema `public`.
diff --git a/docs/vi/run_publish/subscription.md b/docs/vi/run_publish/subscription.md
index eff07a3bbe0..9b79eb52877 100644
--- a/docs/vi/run_publish/subscription.md
+++ b/docs/vi/run_publish/subscription.md
@@ -6,7 +6,7 @@ SubQuery hiện đang hỗ trợ Các theo dõi Graphql. Giống như truy vấn
Các theo dõi rất hữu ích khi bạn muốn ứng dụng khách của mình thay đổi dữ liệu hoặc hiển thị một số dữ liệu mới ngay khi dữ liệu thay đổi hoặc dữ liệu mới có sẵn. Subscriptions allow you to _subscribe_ to your SubQuery project for changes.
-::: info Note Read more about [Subscriptions](https://www.apollographql.com/docs/react/data/subscriptions/). :::
+::: tip Note Read more about [Subscriptions](https://www.apollographql.com/docs/react/data/subscriptions/). :::
## Làm thế nào để theo dõi một thực thể
diff --git a/docs/vi/run_publish/upgrade.md b/docs/vi/run_publish/upgrade.md
index 7dae1005acb..77e2fdd8a2b 100644
--- a/docs/vi/run_publish/upgrade.md
+++ b/docs/vi/run_publish/upgrade.md
@@ -77,10 +77,10 @@ If you just want to upgrade to the latest indexer ([`@subql/node`](https://www.n
Sau khi việc triển khai đã thành công và các nút của chúng ta đã lập chỉ mục dữ liệu của bạn trên chuỗi, bạn sẽ có thể kết nối với dự án của mình thông qua hiển thị của điêm cuối truy vấn GraphQL.
-![Các dự án đang được triển khai và đồng bộ](/assets/img/projects-deploy-sync.png)
+![Các dự án đang được triển khai và đồng bộ](/assets/img/projects_deploy_sync.png)
Ngoài ra, bạn có thể nhấp vào ba dấu chấm bên cạnh tiêu đề dự án của mình và xem nó trên SubQuery Explorer. There you can use the in browser playground to get started - [read more about how to use our Explorer here](../run_publish/query.md).
-![Projects in SubQuery Explorer](/assets/img/projects-explorer.png)
+![Projects in SubQuery Explorer](/assets/img/projects_explorer.png)
-::: info Note Learn more about the [GraphQL Query language.](./graphql.md) :::
+::: tip Note Learn more about the [GraphQL Query language.](./graphql.md) :::
diff --git a/docs/zh/README.md b/docs/zh/README.md
index 7723dc2d1ce..3f43ab4dbe8 100644
--- a/docs/zh/README.md
+++ b/docs/zh/README.md
@@ -4,7 +4,7 @@