From d059920cff9534569a8123cf815810730015baed Mon Sep 17 00:00:00 2001 From: Yash Jagtap Date: Wed, 18 Dec 2024 21:31:32 +0530 Subject: [PATCH] cleanup --- docs/new/explanation/getting-started.md | 8 ++++---- .../cosmos/injective/block-stats.md | 8 ++++---- .../cosmos/injective/foundational.md | 8 ++++---- .../cosmos/injective/usdt-exchanges.md | 8 ++++---- .../develop-your-own-substreams.md | 4 ++-- .../exploring-ethereum/exploring-ethereum.md | 5 ++--- .../exploring-ethereum/map_block_meta_module.md | 2 +- .../solana/explore-solana/explore-solana.md | 1 - .../explore-solana/filter-instructions.md | 4 ++-- .../explore-solana/filter-transactions.md | 6 +++--- .../solana/token-tracker/token-tracker.md | 8 ++++---- .../solana/top-ledger/dex-trades.md | 17 ++++++++--------- .../solana/top-ledger/nft-trades.md | 4 ++-- docs/new/how-to-guides/eth-calls/eth-calls.md | 2 +- docs/new/how-to-guides/mantra.md | 4 ++-- .../sinks/community/other-sinks/kv.md | 2 +- .../sql/deployable-services/local-service.md | 4 ++-- docs/new/how-to-guides/sinks/sql/sql.md | 4 ++-- .../how-to-guides/sinks/stream/javascript.md | 16 ++++++++-------- .../how-to-guides/sinks/subgraph/graph-out.md | 2 +- .../how-to-guides/sinks/subgraph/subgraph.md | 2 +- .../references/cli/command-line-interface.md | 4 ++-- .../references/graph-node/local-development.md | 6 +++--- docs/new/references/manifests.md | 2 +- .../substreams-components/manifests.md | 2 +- .../substreams-components/modules/indexes.md | 2 +- .../substreams-components/modules/modules.md | 2 +- .../modules/setting-up-handlers.md | 2 +- .../substreams-components/modules/types.md | 2 +- .../tutorials/cosmos-compatible/injective.md | 12 ++++++------ docs/new/tutorials/cosmos-compatible/mantra.md | 12 ++++++------ docs/new/tutorials/evm.md | 13 ++++++------- docs/new/tutorials/intro-to-tutorials.md | 2 +- docs/new/tutorials/solana.md | 13 ++++++------- docs/new/tutorials/starknet.md | 13 ++++++------- 35 files changed, 100 insertions(+), 106 
deletions(-) diff --git a/docs/new/explanation/getting-started.md b/docs/new/explanation/getting-started.md index 0ffb869fe..6ae62a41b 100644 --- a/docs/new/explanation/getting-started.md +++ b/docs/new/explanation/getting-started.md @@ -1,6 +1,6 @@ # Getting Started with Substreams -Integrating Substreams can be quick and easy. This guide will help you get started with consuming ready-made Substreams packages or developing your own. Substreams are permissionaless. Grab a key [here](https://thegraph.market/), no personal information required, and start streaming on-chain data. +Integrating Substreams can be quick and easy. This guide will help you get started with consuming ready-made Substreams packages or developing your own. Substreams are permissionless. Grab a key [here](https://thegraph.market/), no personal information required, and start streaming on-chain data. # Build @@ -25,10 +25,10 @@ If you can't find a Substreams package that meets your specific needs, you can d - [Injective](../tutorials/cosmos-compatible/injective.md) - [Mantra](../tutorials/cosmos-compatible/mantra.md) -To build and optimize your Substreams from zero, use the minimal path within the [Dev Container](../references/devcontainer-ref.md) to setup your environment and follow the [How-To Guides](./how-to-guides/develop-your-own-substreams/develop-your-own-substreams.md). +To build and optimize your Substreams from zero, use the minimal path within the [Dev Container](../references/devcontainer-ref.md) to set up your environment and follow the [How-To Guides](../how-to-guides/develop-your-own-substreams/develop-your-own-substreams.md). ## Learn - **Substreams Reliability Guarantees**: With a simple reconnection policy, Substreams guarantees you'll _[Never Miss Data](../references/reliability-guarantees.md)_. - **Substreams Architecture**: For a deeper understanding of how Substreams works, explore the [architectural overview](architecture.md) of the data service.
-- **Supported Networks**: Check-out which endpoints are supported [here](./references/chains-and-endpoints.md). +- **Substreams Architecture**: For a deeper understanding of how Substreams works, explore the [architectural overview](../references/architecture.md) of the data service. +- **Supported Networks**: Check out which endpoints are supported [here](../references/chains-and-endpoints.md). diff --git a/docs/new/how-to-guides/develop-your-own-substreams/cosmos/injective/block-stats.md b/docs/new/how-to-guides/develop-your-own-substreams/cosmos/injective/block-stats.md index 0cff47b6a..05929a402 100644 --- a/docs/new/how-to-guides/develop-your-own-substreams/cosmos/injective/block-stats.md +++ b/docs/new/how-to-guides/develop-your-own-substreams/cosmos/injective/block-stats.md @@ -1,9 +1,9 @@ -The [BlockStats Susbtreams](https://github.com/streamingfast/substreams-cosmos-block-stats) is a very basic Substreams, extracting data from the Injective blockchain. +The [BlockStats Substreams](https://github.com/streamingfast/substreams-cosmos-block-stats) is a very basic Substreams, extracting data from the Injective blockchain. {% hint style="success" %} **Tip**: This tutorial teaches you how to build a Substreams from scratch. -Remember that you can auto-generate your Substreams module by usig the [code-generation tools](../../../getting-started/injective/injective-first-sql.md). +Remember that you can auto-generate your Substreams module by using the [code-generation tools](../../../getting-started/injective/injective-first-sql.md). {% endhint %} ## Before You Begin @@ -58,7 +58,7 @@ modules: type: proto:cosmos.v1.BlockStats # 6. ``` 1. The `network` field specifies which network is the Substreams going to be executed on. -2. Import the [Cosmos Block Protobuf](https://github.com/streamingfast/firehose-cosmos/blob/develop/cosmos/pb/sf/cosmos/type/v1/block.pb.go#L75), which gives you access to the blockchain data. +2.
Import the [Cosmos Block Protobuf](https://github.com/streamingfast/firehose-cosmos/blob/develop/cosmos/pb/sf/cosmos/type/v2/block.pb.go#L75), which gives you access to the blockchain data. 3. Import the user-defined Protobuf schemas (i.e. the outputs of your Substreams). 4. Define a module. `block_to_stats`, which will be mapped to the `block_to_stats` Rust function in the source code. 5. Define the inputs of the module. In this case, the `Block` Cosmos Protobuf. @@ -92,7 +92,7 @@ substreams gui substreams.yaml block_to_stats \ --start-block=64987400 --stop-block=+1000 ``` -Review the [GUI Reference](../../../references/gui.md) to get more information on how to use this utility. +Review the [GUI Reference](https://docs.substreams.dev/reference-material/installing-the-cli/command-line-interface#gui) to get more information on how to use this utility. ## Inspect the Code diff --git a/docs/new/how-to-guides/develop-your-own-substreams/cosmos/injective/foundational.md b/docs/new/how-to-guides/develop-your-own-substreams/cosmos/injective/foundational.md index fd73dc655..a879f0ab6 100644 --- a/docs/new/how-to-guides/develop-your-own-substreams/cosmos/injective/foundational.md +++ b/docs/new/how-to-guides/develop-your-own-substreams/cosmos/injective/foundational.md @@ -1,4 +1,4 @@ -The [Injective Foundational Substreams](https://github.com/streamingfast/substreams-foundational-modules/injective-common) contains Substreams modules, which retrieve fundammental data on the Injective blockchain. +The [Injective Foundational Substreams](https://github.com/streamingfast/substreams-foundational-modules/injective-common) contains Substreams modules, which retrieve fundamental data on the Injective blockchain. You can use the Injective Foundational Modules as the input for your Substreams or subgraph. 
@@ -59,15 +59,15 @@ modules: output: type: proto:sf.substreams.cosmos.v1.EventList doc: | - `filtered_events` reads from `all_events` and applies a filter on the event types, only outputing the events that match the filter. + `filtered_events` reads from `all_events` and applies a filter on the event types, only outputting the events that match the filter. The only operator that you should need to use this filter is the logical or `||`, because each event can only match one type. ``` 1. The `all_transactions` module provides access to all the transactions of the Injective blockchain. It receives a raw Injective block object as input (`sf.cosmos.type.v2.Block`), and outputs a list of transactions object (`sf.substreams.cosmos.v1.TransactionList`). 2. The `all_events` module provides access to all the events in the Injective blockchain. It receives a raw Injective block as input (`sf.cosmos.type.v2.Block`), and outputs a list of events object (`sf.substreams.cosmos.v1.EventList`). -3. The `index_events` module uses the `all_events` module to create a cache where events are sorted based on their `type` field. This cache helps in the performance of the module. You can read more about _index modules_ in the [correspoding documentation](../../../develop/indexes). -4. The `filtered_events` allows you to use the `index_events` module (i.e. using the cache of events), to filter only the event types you are interested in. +3. The `index_events` module uses the `all_events` module to create a cache where events are sorted based on their `type` field. This cache helps in the performance of the module. You can read more about _index modules_ in the [corresponding documentation](../../../../references/substreams-components/modules/indexes.md). +4. The `filtered_events` allows you to use the `index_events` module (i.e. using the cache of events), to filter only the event types you are interested in.
The string parameter passed as input is used to specify which events you want to consume. ## Use The Foundational Modules diff --git a/docs/new/how-to-guides/develop-your-own-substreams/cosmos/injective/usdt-exchanges.md b/docs/new/how-to-guides/develop-your-own-substreams/cosmos/injective/usdt-exchanges.md index 25fc875a4..c2d27fccd 100644 --- a/docs/new/how-to-guides/develop-your-own-substreams/cosmos/injective/usdt-exchanges.md +++ b/docs/new/how-to-guides/develop-your-own-substreams/cosmos/injective/usdt-exchanges.md @@ -3,7 +3,7 @@ The [USDT Exchanges Volume Subgraph](https://github.com/streamingfast/injective- {% hint style="success" %} **Tip**: This tutorial teaches you how to build a Substreams from scratch. -Remember that you can auto-generate your Substreams module by usig the [code-generation tools](../../../getting-started/injective/injective-first-sps.md). +Remember that you can auto-generate your Substreams module by using the [code-generation tools](../../../getting-started/injective/injective-first-sps.md). {% endhint %} The subgraph uses the [Substreams triggers](../../../consume/subgraph/triggers.md) to import data from the Injective foundational modules. @@ -136,15 +136,15 @@ export function handleEvents(bytes: Uint8Array): void { // 1. 3. Load the `USDTExchangeVolume` subgraph entity, which will store the historical volume. If it is the first trade, then the entity will not exist, and it must be created. 4. Iterate over the events and verify that the event type is `wasm` (`type == wasm`). This should be already filtered by the Substreams, but it is also nice to re-check it. -5. Iterate over the attributes of every event, finding out the neccesary information (contract address, action, ask amount, offer amount...). +5. Iterate over the attributes of every event, finding out the necessary information (contract address, action, ask amount, offer amount...). 6. 
Verify that the contract where the event is executed corresponds to the `INJ-USDT` pair in the Dojo DEX. 7. Update the entity. ## Deploy to a Local Graph Node -You can test your Substreams-powered Subgraph by deploying to a local Graph Node set-up. Take a look at the the [Graph Node Local Development tutorial](../../graph-node/local-development.md), which provides information on how to spin up a local environment for Graph Node. +You can test your Substreams-powered Subgraph by deploying to a local Graph Node set-up. Take a look at the [Graph Node Local Development tutorial](../../../../references/graph-node/local-development.md), which provides information on how to spin up a local environment for Graph Node. -First, clone the [Substreams Development Environment GitHub respository](https://github.com/streamingfast/substreams-dev-environment) and move to the `graph-node` folder. Execute the `start.sh` command with the Injective information (make sure you have Docker running in your computer). +First, clone the [Substreams Development Environment GitHub repository](https://github.com/streamingfast/substreams-dev-environment) and move to the `graph-node` folder. Execute the `start.sh` command with the Injective information (make sure you have Docker running on your computer). ```bash ./start.sh injective-mainnet https://mainnet.injective.streamingfast.io:443 diff --git a/docs/new/how-to-guides/develop-your-own-substreams/develop-your-own-substreams.md b/docs/new/how-to-guides/develop-your-own-substreams/develop-your-own-substreams.md index 5c240054a..7e9591011 100644 --- a/docs/new/how-to-guides/develop-your-own-substreams/develop-your-own-substreams.md +++ b/docs/new/how-to-guides/develop-your-own-substreams/develop-your-own-substreams.md @@ -1,11 +1,11 @@ These how-to guides walk through creating a Substreams that uses raw blockchain data to index your dapp. The application includes using Rust and Protobuf to extract, transform, and load the data.
{% hint style="warning" %} -**Important**_:_ These how-to guides are in-depth walkthroughs of builing highly performance indexers for your dapp. Less experienced users may want to reference the [Tutorials](../../tutorials/intro-to-tutorials.md) for a quick start. +**Important**_:_ These how-to guides are in-depth walkthroughs of building highly performant indexers for your dapp. Less experienced users may want to reference the [Tutorials](../../tutorials/intro-to-tutorials.md) for a quick start. {% endhint %} Choose your ecosystem to get started: -- [Solana](./solana/solana.md) - [EVM](./evm/exploring-ethereum/exploring-ethereum.md) +- [Solana](./solana/solana.md) - [Cosmos](./cosmos/injective/block-stats.md) diff --git a/docs/new/how-to-guides/develop-your-own-substreams/evm/exploring-ethereum/exploring-ethereum.md b/docs/new/how-to-guides/develop-your-own-substreams/evm/exploring-ethereum/exploring-ethereum.md index 8883d5c18..e6dbb3945 100644 --- a/docs/new/how-to-guides/develop-your-own-substreams/evm/exploring-ethereum/exploring-ethereum.md +++ b/docs/new/how-to-guides/develop-your-own-substreams/evm/exploring-ethereum/exploring-ethereum.md @@ -16,7 +16,7 @@ Before moving forward, make sure to reference the minimal path within the [Dev C The `https://github.com/streamingfast/substreams-explorers` GitHub repository contains all the Substreams Explorers currently available. You can simply clone the repository: ``` -$ git clone https://github.com/streamingfast/substreams-explorers +git clone https://github.com/streamingfast/substreams-explorers ``` ### Substreams Basics @@ -83,5 +83,4 @@ The [CLI reference](../../../../references/cli/command-line-interface.md) lets y ### Substreams Components Reference -The [Components Reference](../../../../references/substreams-components/) dives deeeper into navigating the `substreams.yaml`.
- +The [Components Reference](../../../../references/substreams-components/packages.md) dives deeper into navigating the `substreams.yaml`. diff --git a/docs/new/how-to-guides/develop-your-own-substreams/evm/exploring-ethereum/map_block_meta_module.md b/docs/new/how-to-guides/develop-your-own-substreams/evm/exploring-ethereum/map_block_meta_module.md index f70a344fe..50c007267 100644 --- a/docs/new/how-to-guides/develop-your-own-substreams/evm/exploring-ethereum/map_block_meta_module.md +++ b/docs/new/how-to-guides/develop-your-own-substreams/evm/exploring-ethereum/map_block_meta_module.md @@ -6,7 +6,7 @@ Let's run the Substreams first, and then go through the code. ### Running the Substreams -Running a Substreams usually requires three steps: generating the Rust Protobufs, building the WASM container, and using the Substream CLI to start the streaming. Make sure to run the following commands in the `substreams-explorer/ethereum-explorer` folder: +Running a Substreams usually requires three steps: generating the Rust Protobufs, building the WASM container, and using the Substreams CLI to start the streaming. Make sure to run the following commands in the `substreams-explorer/ethereum-explorer` folder: 1. **Generate the Protobuf objects:** The `.proto` files define a data model regardless of any programming language. However, in order to use this model in your Rust application, you must generate the corresponding Rust data structures. Note that running `make protogen` is only necessary when making updates to any file in the proto folder. 
diff --git a/docs/new/how-to-guides/develop-your-own-substreams/solana/explore-solana/explore-solana.md b/docs/new/how-to-guides/develop-your-own-substreams/solana/explore-solana/explore-solana.md index db21cb3a1..5e5b08574 100644 --- a/docs/new/how-to-guides/develop-your-own-substreams/solana/explore-solana/explore-solana.md +++ b/docs/new/how-to-guides/develop-your-own-substreams/solana/explore-solana/explore-solana.md @@ -27,4 +27,3 @@ Take a look at the _Develop Substreams_ section for more information on how to s ## The Solana Explorer The Solana explorer includes several modules showcasing what Solana data you can extract with Substreams (it's easy and fast!). In the following sections, you will find out about the different functions you can use to easily get started with Solana. - diff --git a/docs/new/how-to-guides/develop-your-own-substreams/solana/explore-solana/filter-instructions.md b/docs/new/how-to-guides/develop-your-own-substreams/solana/explore-solana/filter-instructions.md index d1f080c05..f42699a34 100644 --- a/docs/new/how-to-guides/develop-your-own-substreams/solana/explore-solana/filter-instructions.md +++ b/docs/new/how-to-guides/develop-your-own-substreams/solana/explore-solana/filter-instructions.md @@ -85,9 +85,9 @@ fn map_filter_instructions(params: String, blk: Block) -> Result Result Result { 2. Iterate over the transactions. 3. Get accounts of the transaction (the `resolved_accounts()` method contains also accounts stored in the [Address Lookup Tables](https://docs.solana.com/developing/lookup-tables)). 4. Iterate over the instructions within the transaction. -5. Keep only inner instructions beloging to the current top-level instruction. -Becuse the inner instructions are at the transaction level, you must filter filter which inner instruction belong to the current instruction by using the `index` property. +5. Keep only inner instructions belonging to the current top-level instruction. 
+Because the inner instructions are at the transaction level, you must filter which inner instructions belong to the current instruction by using the `index` property. 6. Get the program account. 7. Process trade instruction by calling the `get_trade_instruction(...)` function. @@ -186,7 +186,7 @@ fn get_trade_instruction( } ``` 1. Match the program account passed as a parameter. -2. Excuted if the program account is `CLMM9tUoggJu2wagPkkqs9eFG4BWhVBZWkP1qv3Sp7tR` (Crema Finance). +2. Executed if the program account is `CLMM9tUoggJu2wagPkkqs9eFG4BWhVBZWkP1qv3Sp7tR` (Crema Finance). 3. Call the decoding function of Crema Finance (`parse_trade_instruction(...)`). 4. Executed if the program account is `Dooar9JkhdZ7J3LHN3A7YCuoGRUggXhQaG4kijfLGU2j` (Dooar Exchange). 5. Executed if the program account is `Eo7WjKq67rjJQSZxS6z3YkapzY3eMj6Xy8X5EQVn5UaB` (Meteora). @@ -254,7 +254,7 @@ fn process_block(block: Block) -> Result { Until now, the code logic has taken care of the top-level instructions. However, it is also necessary to verify if any of the inner instructions contain relevant DEX information. -The logic for the inner instruction is analogus to the logic of top-level instructions: +The logic for the inner instruction is analogous to the logic of top-level instructions: - Iterate over the inner instructions. - Pass the data to the `get_trade_instruction(...)` function. - If the `get_trade_instruction(...)` function returns a `TradeInstruction` object, then a new `TradeData` object is created and added to the array.
@@ -335,4 +335,3 @@ fn process_block(block: Block) -> Result { }); } ``` - diff --git a/docs/new/how-to-guides/develop-your-own-substreams/solana/top-ledger/nft-trades.md b/docs/new/how-to-guides/develop-your-own-substreams/solana/top-ledger/nft-trades.md index 572085313..f7ff39a53 100644 --- a/docs/new/how-to-guides/develop-your-own-substreams/solana/top-ledger/nft-trades.md +++ b/docs/new/how-to-guides/develop-your-own-substreams/solana/top-ledger/nft-trades.md @@ -10,7 +10,7 @@ TopLedger is an active contributor to the Substreams community and has developed The NFT Trades Substreams requires medium to advanced Substreams knowledge. If this is the first time you are using Substreams, make sure you: -- Read the [Develop Substreams](../../../develop/develop.md) section, which will teach you the basics of the developing Substreams modules. +- Read the [Develop Substreams](../../../../tutorials/intro-to-tutorials.md) section, which will teach you the basics of developing Substreams modules. - Complete the [Explore Solana](../explore-solana/explore-solana.md) tutorial, which will assist you in understanding the main pieces of the Solana Substreams. Clone the [TopLedger Solana Programs](https://github.com/Topledger/solana-programs) project and navigate to the `nft-trades` folder, which contains the code of the Substreams.
@@ -29,7 +29,7 @@ modules: type: proto:sf.solana.nft.trades.v1.Output ``` -The `Output` object provided as the Substreams output is defined in the `proto/outout.proto` file: +The `Output` object provided as the Substreams output is defined in the `proto/output.proto` file: ```protobuf message Output { diff --git a/docs/new/how-to-guides/eth-calls/eth-calls.md b/docs/new/how-to-guides/eth-calls/eth-calls.md index 33cd0087f..d2840ebcf 100644 --- a/docs/new/how-to-guides/eth-calls/eth-calls.md +++ b/docs/new/how-to-guides/eth-calls/eth-calls.md @@ -34,7 +34,7 @@ Complete the information required by the previous command, such as name of the p In the `Contract address to track` step, write `0xdac17f958d2ee523a2206206994597c13d831ec7`, the address of the USDT smart contract. ```bash -Project name (lowercase, numbers, undescores): usdttracker +Project name (lowercase, numbers, underscores): usdttracker Protocol: Ethereum Ethereum chain: Mainnet Contract address to track (leave empty to use "Bored Ape Yacht Club"): 0xdac17f958d2ee523a2206206994597c13d831ec7 diff --git a/docs/new/how-to-guides/mantra.md b/docs/new/how-to-guides/mantra.md index 3f70ab19d..c39bcd957 100644 --- a/docs/new/how-to-guides/mantra.md +++ b/docs/new/how-to-guides/mantra.md @@ -20,13 +20,13 @@ In this guide, you'll learn how to initialize a MANTRA-based Substreams project. ## Step 2: Visualize the Data -1. Create your account [here](https://thegraph.market/) to generate an authentification token (JWT) and pass it as input to: +1. Create your account [here](https://thegraph.market/) to generate an authentication token (JWT) and pass it as input to: ```bash substreams auth ``` -2. Run the following command to visualize and itterate on your filtered data model: +2. 
Run the following command to visualize and iterate on your filtered data model: ```bash substreams gui diff --git a/docs/new/how-to-guides/sinks/community/other-sinks/kv.md b/docs/new/how-to-guides/sinks/community/other-sinks/kv.md index 99397313b..fbf804aa8 100644 --- a/docs/new/how-to-guides/sinks/community/other-sinks/kv.md +++ b/docs/new/how-to-guides/sinks/community/other-sinks/kv.md @@ -133,7 +133,7 @@ pub fn process_deltas(ops: &mut KvOperations, deltas: store::Deltas ops.push_delete(&delta.key, delta.ordinal), - x => panic!("unsupported opeation {:?}", x), + x => panic!("unsupported operation {:?}", x), } } } diff --git a/docs/new/how-to-guides/sinks/sql/deployable-services/local-service.md b/docs/new/how-to-guides/sinks/sql/deployable-services/local-service.md index 142c50fa4..c8b6fc16a 100644 --- a/docs/new/how-to-guides/sinks/sql/deployable-services/local-service.md +++ b/docs/new/how-to-guides/sinks/sql/deployable-services/local-service.md @@ -1,4 +1,4 @@ -In you want to manage your own infrastructure, you can use still the deployable services, but locally. This essetially means using the `substreams alpha service` command pointing to a local Docker installation. The following tutorial teaches you how to use the Substreams:SQL deployable service locally. +If you want to manage your own infrastructure, you can still use the deployable services, but locally. This essentially means using the `substreams alpha service` command pointing to a local Docker installation. The following tutorial teaches you how to use the Substreams:SQL deployable service locally.
## Tutorial @@ -23,7 +23,7 @@ substreams init Fill the requested information (name: `cryptopunks`, protocol: `ethereum`, chain: `mainnet`, contract: `b47e3cd837ddf8e4c57f05d70ab865de6e193bbb`) ``` -Project name (lowercase, numbers, undescores): cryptopunks +Project name (lowercase, numbers, underscores): cryptopunks Protocol: Ethereum Ethereum chain: Mainnet ✔ Contract address to track: b47e3cd837ddf8e4c57f05d70ab865de6e193bbb diff --git a/docs/new/how-to-guides/sinks/sql/sql.md b/docs/new/how-to-guides/sinks/sql/sql.md index 125640692..8e3e33505 100644 --- a/docs/new/how-to-guides/sinks/sql/sql.md +++ b/docs/new/how-to-guides/sinks/sql/sql.md @@ -9,7 +9,7 @@ Substreams offers two different ways of consuming data as SQL: - Using the SQL sink, which currently supports PostgresSQL and Clickhouse (recommended) ### - Substreams:SQL Deployable Service (beta) -Use the Substreams CLI to easily send the data of your Substreams to a database. It also has support for **dbt transformations**, so it's great for data analyts! +Use the Substreams CLI to easily send the data of your Substreams to a database. It also has support for **dbt transformations**, so it's great for data analysts! You can deploy a new service by using the `substreams alpha service` command. @@ -29,4 +29,4 @@ Previous to the implementation of the Deployable Services, the Postgres Sink was In order to the send the data to a SQL database, your Substreams must have a `db_out` module that emits [`DatabaseChanges`](https://docs.rs/substreams-database-change/latest/substreams_database_change/pb/database/struct.DatabaseChanges.html) objects. -The `DatabaseChanges` object is something that the Postgres sink can understand, thus acting as a conversion layet between the data model of your Substreams and the table structure of the database. 
+The `DatabaseChanges` object is something that the Postgres sink can understand, thus acting as a conversion layer between the data model of your Substreams and the table structure of the database. diff --git a/docs/new/how-to-guides/sinks/stream/javascript.md b/docs/new/how-to-guides/sinks/stream/javascript.md index 6a84160ea..9f90595cc 100644 --- a/docs/new/how-to-guides/sinks/stream/javascript.md +++ b/docs/new/how-to-guides/sinks/stream/javascript.md @@ -88,11 +88,11 @@ When you consume a Substreams package, a long-live gRPC connection is establishe {% tabs %} {% tab title="NodeJS" %} -The `index.js` file contains the `main()` function, which runs an infite loop and takes care of managing the disconnections. +The `index.js` file contains the `main()` function, which runs an infinite loop and takes care of managing the disconnections. ```js const TOKEN = process.env.SUBSTREAMS_API_TOKEN // Substreams token. By default it takes the SUBSTREAMS_API_TOKEN environment variable of your system -const ENDPOINT = "https://mainnet.eth.streamingfast.io" // Substreams endpont. In this case, Ethereum mainnet +const ENDPOINT = "https://mainnet.eth.streamingfast.io" // Substreams endpoint. In this case, Ethereum mainnet const SPKG = "https://spkg.io/streamingfast/ethereum-explorer-v0.1.2.spkg" // Substreams package. In this case, taken from the substreams.dev registry const MODULE = "map_block_meta" const START_BLOCK = '100000' @@ -117,8 +117,8 @@ const main = async () => { }, }); - // The infite loop handles disconnections. Every time an disconnection error is thrown, the loop will automatically reconnect - // and start consuming from the latest commited cursor. + // The infinite loop handles disconnections. Every time a disconnection error is thrown, the loop will automatically reconnect + // and start consuming from the latest committed cursor.
while (true) { try { await stream(pkg, registry, transport); } catch (e) { @@ -137,11 +137,11 @@ const main = async () => { {% endtab %} {% tab title="Web" %} -The `main.js` file contains the `main()` function, which runs an infite loop and takes care of managing the disconnections. +The `main.js` file contains the `main()` function, which runs an infinite loop and takes care of managing the disconnections. ```js const TOKEN = "" // Substreams token. Put here your Substreams API token. -const ENDPOINT = "https://mainnet.eth.streamingfast.io" // Substreams endpont. In this case, Ethereum mainnet +const ENDPOINT = "https://mainnet.eth.streamingfast.io" // Substreams endpoint. In this case, Ethereum mainnet const SPKG = "https://spkg.io/streamingfast/ethereum-explorer-v0.1.2.spkg" // Substreams package. In this case, taken from the substreams.dev registry const MODULE = "map_block_meta" const START_BLOCK = '100000' @@ -166,8 +166,8 @@ const main = async () => { }, }); - // The infite loop handles disconnections. Every time an disconnection error is thrown, the loop will automatically reconnect - // and start consuming from the latest commited cursor. + // The infinite loop handles disconnections. Every time a disconnection error is thrown, the loop will automatically reconnect + // and start consuming from the latest committed cursor. while (true) { try { await stream(pkg, registry, transport); diff --git a/docs/new/how-to-guides/sinks/subgraph/graph-out.md b/docs/new/how-to-guides/sinks/subgraph/graph-out.md index fe960d759..d4a50c3c3 100644 --- a/docs/new/how-to-guides/sinks/subgraph/graph-out.md +++ b/docs/new/how-to-guides/sinks/subgraph/graph-out.md @@ -1,5 +1,5 @@ -If you want to include the extractions logic in Substreams to benefit from the paralellization engine, you can use the **EntityChanges** model. +If you want to include the extraction logic in Substreams to benefit from the parallelization engine, you can use the **EntityChanges** model.
Essentially, this means that you will create a `graph_out` module in Substreams, which will emit an **EntityChanges** structure representing the subgraph entities. diff --git a/docs/new/how-to-guides/sinks/subgraph/subgraph.md b/docs/new/how-to-guides/sinks/subgraph/subgraph.md index 1f3686184..c9ac616e5 100644 --- a/docs/new/how-to-guides/sinks/subgraph/subgraph.md +++ b/docs/new/how-to-guides/sinks/subgraph/subgraph.md @@ -8,7 +8,7 @@ The subgraph will read the `EntityChanges` object and consume the data. ## What Option To Use It is really a matter of where you put your logic, in the subgraph or the Substreams. -- [Substreams Triggers](./triggers.md): Consume from any Substreams module by importing the Protobuf model through a subgraph handler and write all your transformations ussing AssemblyScript. This method creates the subgraph entities directly in the subgraph. +- [Substreams Triggers](./triggers.md): Consume from any Substreams module by importing the Protobuf model through a subgraph handler and write all your transformations using AssemblyScript. This method creates the subgraph entities directly in the subgraph. - [Substreams Graph-Out](./graph-out.md): By writing more of the logic into Substreams, you can consume the module's output directly into `graph-node`. You will create the subgraph entities in the Substreams and the subgraph will read them. Having more of your logic in Substreams benefits from a parallelized model and a cursor to [never miss data](../../../references/reliability-guarantees.md), whereas triggers will be linearly consumed in `graph-node`. 
diff --git a/docs/new/references/cli/command-line-interface.md b/docs/new/references/cli/command-line-interface.md index c735dfdfc..41ea823ff 100644 --- a/docs/new/references/cli/command-line-interface.md +++ b/docs/new/references/cli/command-line-interface.md @@ -23,7 +23,7 @@ If you choose to not use it, make sure that you are in a directory that contains ### **`init`** -The `init` command allows you to initialize a Substreams project for several blockchains. It is a conversational-like command: you will be asked several questions and a project with the specicied features will be created for you. +The `init` command allows you to initialize a Substreams project for several blockchains. It is a conversational command: you will be asked several questions and a project with the specified features will be created for you. The options included in the `init` command will evolve over time, but every blockchain should, at least, contain one option. @@ -149,7 +149,7 @@ There are the shortcuts that you can use to navigate the GUI. You can always get | Navigate Modules - Forward | `i` | | Navigate Modules - Backwards | `u` | | Search | `/` + *text* + `enter` | -| Commnads information | `?` | +| Commands information | `?` | ### `pack` **(DEPRECATED)** diff --git a/docs/new/references/graph-node/local-development.md b/docs/new/references/graph-node/local-development.md index 1459c6484..801faa93b 100644 --- a/docs/new/references/graph-node/local-development.md +++ b/docs/new/references/graph-node/local-development.md @@ -1,8 +1,8 @@ ## Local Development of Subgraphs with Graph Node -The Graph Node is the software that indexers run to index subgraphs. When developing a subgraph (or Substreams-powered subgrpah), it is very convenient to test the subgraph deployment locally. This can be achieved by running the Graph Node software and all its dependencies in a local Docker environment. +The Graph Node is the software that indexers run to index subgraphs.
When developing a subgraph (or Substreams-powered subgraph), it is very convenient to test the subgraph deployment locally. This can be achieved by running the Graph Node software and all its dependencies in a local Docker environment. -Clone the [Substreams Development Environment GitHub respository](https://github.com/streamingfast/substreams-dev-environment), which contains the necessary shell scripts to run a local Graph Node in your computer. +Clone the [Substreams Development Environment GitHub repository](https://github.com/streamingfast/substreams-dev-environment), which contains the necessary shell scripts to run a local Graph Node on your computer. ### Requirements @@ -12,7 +12,7 @@ This tutorial requires you to: ### Set up the Environment -In the [Substreams Development Environment GitHub respository](https://github.com/streamingfast/substreams-dev-environment), move to the `graph-node` folder. The entrypoint to set up the Graph Node local environment is the `start.sh` script, which spins up a Graph Node instance configured for a specific network (e.g. `injective-mainnet`), along with a local IPFS node and a local Postgres database. When using this script, you must pass two parameters: `NETWORK` and `SUBSTREAMS_ENDPOINT`. +In the [Substreams Development Environment GitHub repository](https://github.com/streamingfast/substreams-dev-environment), move to the `graph-node` folder. The entrypoint to set up the Graph Node local environment is the `start.sh` script, which spins up a Graph Node instance configured for a specific network (e.g. `injective-mainnet`), along with a local IPFS node and a local Postgres database. When using this script, you must pass two parameters: `NETWORK` and `SUBSTREAMS_ENDPOINT`.
```bash ./start.sh diff --git a/docs/new/references/manifests.md b/docs/new/references/manifests.md index e78ad0f5a..200075b85 100644 --- a/docs/new/references/manifests.md +++ b/docs/new/references/manifests.md @@ -195,7 +195,7 @@ network: ethereum ### image -The `image` field specifies the icon displayed for the Substreams package, which is used in the [Substreams Regsitry](https://substreams.dev). The path is relative to the folder where the manifest is. +The `image` field specifies the icon displayed for the Substreams package, which is used in the [Substreams Registry](https://substreams.dev). The path is relative to the folder where the manifest is. ```yaml image: ./ethereum-icon.png diff --git a/docs/new/references/substreams-components/manifests.md b/docs/new/references/substreams-components/manifests.md index e78ad0f5a..200075b85 100644 --- a/docs/new/references/substreams-components/manifests.md +++ b/docs/new/references/substreams-components/manifests.md @@ -195,7 +195,7 @@ network: ethereum ### image -The `image` field specifies the icon displayed for the Substreams package, which is used in the [Substreams Regsitry](https://substreams.dev). The path is relative to the folder where the manifest is. +The `image` field specifies the icon displayed for the Substreams package, which is used in the [Substreams Registry](https://substreams.dev). The path is relative to the folder where the manifest is. 
```yaml image: ./ethereum-icon.png diff --git a/docs/new/references/substreams-components/modules/indexes.md b/docs/new/references/substreams-components/modules/indexes.md index dcba6be3e..cd60def89 100644 --- a/docs/new/references/substreams-components/modules/indexes.md +++ b/docs/new/references/substreams-components/modules/indexes.md @@ -38,7 +38,7 @@ A possible flow to use an index module to index all the events in a block: Given this string of addresses, Substreams checks if the event address is contained on a given block before actually decoding the data of the block. You can use logical operators (`and` and `or`) to select what events to search. -This previous flow is just an example of a preferred way to use index modules, but it is totally up to you to decide the structure of your Substreams. For example, instead of having a separate module, `all_events`, which extracts all the events of the block, you can receieve the raw `Block` object diretly on the `index_events` module. +This previous flow is just an example of a preferred way to use index modules, but it is totally up to you to decide the structure of your Substreams. For example, instead of having a separate module, `all_events`, which extracts all the events of the block, you can receive the raw `Block` object directly in the `index_events` module. The definition of the `index_events` module looks like any other Substreams module, but it is a special _kind_, `kind: blockIndex` and outputs a special data model, `sf.substreams.index.v1.Keys`. The `Keys` object contains a list of labels that will be used to identify the content of that block.
diff --git a/docs/new/references/substreams-components/modules/modules.md b/docs/new/references/substreams-components/modules/modules.md index 37a23fc4a..583c46b70 100644 --- a/docs/new/references/substreams-components/modules/modules.md +++ b/docs/new/references/substreams-components/modules/modules.md @@ -4,7 +4,7 @@ description: Learn the basics about modules ## Modules -In Substreams, manifests and modules are concepts tighly related because they are fundamental to understand how Substreams works. +In Substreams, manifests and modules are tightly related concepts because they are fundamental to understanding how Substreams works. In simple terms, a Substreams module is a Rust function that receives an input and returns an output. For example, the following Rust function receives an Ethereum block and returns a custom object containing fields such as block number, hash or parent hash. diff --git a/docs/new/references/substreams-components/modules/setting-up-handlers.md b/docs/new/references/substreams-components/modules/setting-up-handlers.md index cbae21022..12b35a33b 100644 --- a/docs/new/references/substreams-components/modules/setting-up-handlers.md +++ b/docs/new/references/substreams-components/modules/setting-up-handlers.md @@ -20,7 +20,7 @@ Update the generated [`Cargo.toml`](https://github.com/streamingfast/substreams- [package] name = "substreams-template" version = "0.1.0" -description = "Substream template demo project" +description = "Substreams template demo project" edition = "2021" repository = "https://github.com/streamingfast/substreams-template" diff --git a/docs/new/references/substreams-components/modules/types.md b/docs/new/references/substreams-components/modules/types.md index 069acfae5..6d4610661 100644 --- a/docs/new/references/substreams-components/modules/types.md +++ b/docs/new/references/substreams-components/modules/types.md @@ -130,7 +130,7 @@ let store = StoreUSDPrice { The current implementation is as follows: - Start with value =
get_last() (1.65) -- Iterate ord 4, value = detla.OldValue (1.47) +- Iterate ord 4, value = delta.OldValue (1.47) - Iterate ord 3, value = delta.OldValue () - Iterate ord 2, value = delta.OldValue (1.54) - Iterate ord 1, ordinal == 1, return value (1.54) diff --git a/docs/new/tutorials/cosmos-compatible/injective.md b/docs/new/tutorials/cosmos-compatible/injective.md index 7eb59a45c..f30be5d75 100644 --- a/docs/new/tutorials/cosmos-compatible/injective.md +++ b/docs/new/tutorials/cosmos-compatible/injective.md @@ -19,9 +19,9 @@ Tip: Have the start block of your transaction or specific events ready. ## Step 2: Visualize the Data -1. Run `substreams auth` to create your [account](https://thegraph.market/) and generate an authentification token (JWT), then pass this token back as input. +1. Run `substreams auth` to create your [account](https://thegraph.market/) and generate an authentication token (JWT), then pass this token back as input. -2. Now you can freely use the `substreams gui` to visualize and itterate on your extracted data. +2. Now you can freely use the `substreams gui` to visualize and iterate on your extracted data. ## Step 2.5: (Optionally) Transform the Data @@ -29,17 +29,17 @@ Within the generated directories, modify your Substreams modules to include addi ## Step 3: Load the Data -To make your Substreams queriable (as opposed to [direct streaming](../how-to-guides/sinks/stream/stream.md)), you can automatically generate a Subgraph (known as a [Substreams-powered subgraph](https://thegraph.com/docs/en/sps/introduction/)) or SQL-DB sink. +To make your Substreams queryable (as opposed to [direct streaming](../how-to-guides/sinks/stream/stream.md)), you can automatically generate a Subgraph (known as a [Substreams-powered subgraph](https://thegraph.com/docs/en/sps/introduction/)) or SQL-DB sink. ### Subgraph -1. Run `substreams codegen subgraph` to intialize the sink, producing the neccessary files and function definitions. +1. 
Run `substreams codegen subgraph` to initialize the sink, producing the necessary files and function definitions. 2. Create your [subgraph mappings](../how-to-guides/sinks/subgraph/triggers.md) within the `mappings.ts` and associated entities within the `schema.graphql`. 3. Deploy ### SQL -1. Run `substreams codegen sql` and choose from either ClickHouse or Postgres to intialize the sink, producing the neccessary files. +1. Run `substreams codegen sql` and choose from either ClickHouse or Postgres to initialize the sink, producing the necessary files. 2. Run `substreams build` build the [Substreams:SQL](../how-to-guides/sinks/sql/sql-sink.md) sink. 3. Run `substreams-sink-sql` to sink the data into your selected SQL DB. @@ -61,4 +61,4 @@ The [CLI reference](../references/cli/command-line-interface.md) lets you explor ### Substreams Components Reference -The [Components Reference](../references/substreams-components/) dives deeeper into navigating the `substreams.yaml`. +The [Components Reference](../references/substreams-components/) dives deeper into navigating the `substreams.yaml`. diff --git a/docs/new/tutorials/cosmos-compatible/mantra.md b/docs/new/tutorials/cosmos-compatible/mantra.md index c3d2530af..7e89e4272 100644 --- a/docs/new/tutorials/cosmos-compatible/mantra.md +++ b/docs/new/tutorials/cosmos-compatible/mantra.md @@ -14,9 +14,9 @@ Tip: Have the start block of your transaction or specific events ready. ## Step 2: Visualize the Data -1. Run `substreams auth` to create your [account](https://thegraph.market/) and generate an authentification token (JWT), then pass this token back as input. +1. Run `substreams auth` to create your [account](https://thegraph.market/) and generate an authentication token (JWT), then pass this token back as input. -2. Now you can freely use the `substreams gui` to visualize and itterate on your extracted data. +2. Now you can freely use the `substreams gui` to visualize and iterate on your extracted data. 
## Step 2.5: (Optionally) Transform the Data @@ -24,17 +24,17 @@ Within the generated directories, modify your Substreams modules to include addi ## Step 3: Load the Data -To make your Substreams queriable (as opposed to [direct streaming](../how-to-guides/sinks/stream/stream.md)), you can automatically generate a Subgraph (known as a [Substreams-powered subgraph](https://thegraph.com/docs/en/sps/introduction/)) or SQL-DB sink. +To make your Substreams queryable (as opposed to [direct streaming](../how-to-guides/sinks/stream/stream.md)), you can automatically generate a Subgraph (known as a [Substreams-powered subgraph](https://thegraph.com/docs/en/sps/introduction/)) or SQL-DB sink. ### Subgraph -1. Run `substreams codegen subgraph` to intialize the sink, producing the neccessary files and function definitions. +1. Run `substreams codegen subgraph` to initialize the sink, producing the necessary files and function definitions. 2. Create your [subgraph mappings](../how-to-guides/sinks/subgraph/triggers.md) within the `mappings.ts` and associated entities within the `schema.graphql`. 3. Deploy ### SQL -1. Run `substreams codegen sql` and choose from either ClickHouse or Postgres to intialize the sink, producing the neccessary files. +1. Run `substreams codegen sql` and choose from either ClickHouse or Postgres to initialize the sink, producing the necessary files. 2. Run `substreams build` build the [Substreams:SQL](../how-to-guides/sinks/sql/sql-sink.md) sink. 3. Run `substreams-sink-sql` to sink the data into your selected SQL DB. @@ -56,4 +56,4 @@ The [CLI reference](../references/cli/command-line-interface.md) lets you explor ### Substreams Components Reference -The [Components Reference](../references/substreams-components/) dives deeeper into navigating the `substreams.yaml`. \ No newline at end of file +The [Components Reference](../references/substreams-components/) dives deeper into navigating the `substreams.yaml`. 
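Since these tutorials repeatedly point at navigating the `substreams.yaml`, a minimal manifest sketch may help orient the reader. This is an assumption-laden illustration only: the package name, module name, block type, and output type are placeholders, and the Components Reference remains the authoritative field list.

```yaml
specVersion: v0.1.0
package:
  name: my_substreams       # placeholder name
  version: v0.1.0
network: injective-mainnet  # or any supported network
image: ./icon.png           # icon shown on the Substreams Registry

binaries:
  default:
    type: wasm/rust-v1
    file: ./target/wasm32-unknown-unknown/release/substreams.wasm

modules:
  - name: map_events        # placeholder module name
    kind: map
    inputs:
      - source: sf.cosmos.type.v2.Block  # placeholder; use your chain's block type
    output:
      type: proto:example.Events         # placeholder Protobuf type
```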
diff --git a/docs/new/tutorials/evm.md b/docs/new/tutorials/evm.md index e1916ecb6..9d0e93d8b 100644 --- a/docs/new/tutorials/evm.md +++ b/docs/new/tutorials/evm.md @@ -10,9 +10,9 @@ In this guide, you'll learn how to initialize an EVM-based Substreams project wi ## Step 2: Visualize the Data -1. Run `substreams auth` to create your [account](https://thegraph.market/) and generate an authentification token (JWT), then pass this token back as input. +1. Run `substreams auth` to create your [account](https://thegraph.market/) and generate an authentication token (JWT), then pass this token back as input. -2. Now you can freely use the `substreams gui` to visualize and itterate on your extracted data. +2. Now you can freely use the `substreams gui` to visualize and iterate on your extracted data. ## Step 2.5: (Optionally) Transform the Data @@ -20,17 +20,17 @@ Within the generated directories, modify your Substreams modules to include addi ## Step 3: Load the Data -To make your Substreams queriable (as opposed to [direct streaming](../how-to-guides/sinks/stream/stream.md)), you can automatically generate a Subgraph (known as a [Substreams-powered subgraph](https://thegraph.com/docs/en/sps/introduction/)) or SQL-DB sink. +To make your Substreams queryable (as opposed to [direct streaming](../how-to-guides/sinks/stream/stream.md)), you can automatically generate a Subgraph (known as a [Substreams-powered subgraph](https://thegraph.com/docs/en/sps/introduction/)) or SQL-DB sink. ### Subgraph -1. Run `substreams codegen subgraph` to intialize the sink, producing the neccessary files and function definitions. +1. Run `substreams codegen subgraph` to initialize the sink, producing the necessary files and function definitions. 2. Create your [subgraph mappings](../how-to-guides/sinks/subgraph/triggers.md) within the `mappings.ts` and associated entities within the `schema.graphql`. 3. Deploy ### SQL -1. 
Run `substreams codegen sql` and choose from either ClickHouse or Postgres to intialize the sink, producing the neccessary files. +1. Run `substreams codegen sql` and choose from either ClickHouse or Postgres to initialize the sink, producing the necessary files. 2. Run `substreams build` build the [Substreams:SQL](../how-to-guides/sinks/sql/sql-sink.md) sink. 3. Run `substreams-sink-sql` to sink the data into your selected SQL DB. @@ -52,5 +52,4 @@ The [CLI reference](../references/cli/command-line-interface.md) lets you explor ### Substreams Components Reference -The [Components Reference](../references/substreams-components/) dives deeeper into navigating the `substreams.yaml`. - +The [Components Reference](../references/substreams-components/packages.md) dives deeper into navigating the `substreams.yaml`. diff --git a/docs/new/tutorials/intro-to-tutorials.md b/docs/new/tutorials/intro-to-tutorials.md index 32123ff34..8f3ad0383 100644 --- a/docs/new/tutorials/intro-to-tutorials.md +++ b/docs/new/tutorials/intro-to-tutorials.md @@ -4,7 +4,7 @@ Substreams data streams are available on the chains listed [here](../references/ If your blockchain is not supported, please ask in Discord. Then, consult the relevant ecosystem guide to get started using Substreams real-time data streams: -- [EVM](../tutorials/evm.md) +- [EVM](./evm.md) - [Solana](./solana.md) - [Starknet](./starknet.md) - [Injective](./cosmos-compatible/injective.md) diff --git a/docs/new/tutorials/solana.md b/docs/new/tutorials/solana.md index 0623a7913..95242fd5b 100644 --- a/docs/new/tutorials/solana.md +++ b/docs/new/tutorials/solana.md @@ -17,9 +17,9 @@ The modules within Solana Common exclude voting transactions, to benefit from a ## Step 2: Visualize the Data -1. Run `substreams auth` to create your [account](https://thegraph.market/) and generate an authentification token (JWT), then pass this token back as input. +1. 
Run `substreams auth` to create your [account](https://thegraph.market/) and generate an authentication token (JWT), then pass this token back as input. -2. Now you can freely use the `substreams gui` to visualize and itterate on your extracted data. +2. Now you can freely use the `substreams gui` to visualize and iterate on your extracted data. ## Step 2.5: (Optionally) Transform the Data @@ -27,17 +27,17 @@ Within the generated directories, modify your Substreams modules to include addi ## Step 3: Load the Data -To make your Substreams queriable (as opposed to [direct streaming](../how-to-guides/sinks/stream/stream.md)), you can automatically generate a Subgraph (known as a [Substreams-powered subgraph](https://thegraph.com/docs/en/sps/introduction/)) or SQL-DB sink. +To make your Substreams queryable (as opposed to [direct streaming](../how-to-guides/sinks/stream/stream.md)), you can automatically generate a Subgraph (known as a [Substreams-powered subgraph](https://thegraph.com/docs/en/sps/introduction/)) or SQL-DB sink. ### Subgraph -1. Run `substreams codegen subgraph` to intialize the sink, producing the neccessary files and function definitions. +1. Run `substreams codegen subgraph` to initialize the sink, producing the necessary files and function definitions. 2. Create your [subgraph mappings](../how-to-guides/sinks/subgraph/triggers.md) within the `mappings.ts` and associated entities within the `schema.graphql`. 3. Deploy ### SQL -1. Run `substreams codegen sql` and choose from either ClickHouse or Postgres to intialize the sink, producing the neccessary files. +1. Run `substreams codegen sql` and choose from either ClickHouse or Postgres to initialize the sink, producing the necessary files. 2. Run `substreams build` build the [Substreams:SQL](../how-to-guides/sinks/sql/sql-sink.md) sink. 3. Run `substreams-sink-sql` to sink the data into your selected SQL DB. 
@@ -59,5 +59,4 @@ The [CLI reference](../references/cli/command-line-interface.md) lets you explor ### Substreams Components Reference -The [Components Reference](../references/substreams-components/) dives deeeper into navigating the `substreams.yaml`. - +The [Components Reference](../references/substreams-components/packages.md) dives deeper into navigating the `substreams.yaml`. diff --git a/docs/new/tutorials/starknet.md b/docs/new/tutorials/starknet.md index 1bbe10c3c..826abf877 100644 --- a/docs/new/tutorials/starknet.md +++ b/docs/new/tutorials/starknet.md @@ -14,9 +14,9 @@ Note: Starknet ABIs are mutable within blocks, therefore the current ABI of your ## Step 2: Visualize the Data -1. Run `substreams auth` to create your [account](https://thegraph.market/) and generate an authentification token (JWT), then pass this token back as input. +1. Run `substreams auth` to create your [account](https://thegraph.market/) and generate an authentication token (JWT), then pass this token back as input. -2. Now you can freely use the `substreams gui` to visualize and itterate on your extracted data. +2. Now you can freely use the `substreams gui` to visualize and iterate on your extracted data. ## Step 2.5: (Optionally) Transform the Data @@ -24,17 +24,17 @@ Within the generated directories, modify your Substreams modules to include addi ## Step 3: Load the Data -To make your Substreams queriable (as opposed to [direct streaming](../how-to-guides/sinks/stream/stream.md)), you can automatically generate a Subgraph (known as a [Substreams-powered subgraph](https://thegraph.com/docs/en/sps/introduction/)) or SQL-DB sink. +To make your Substreams queryable (as opposed to [direct streaming](../how-to-guides/sinks/stream/stream.md)), you can automatically generate a Subgraph (known as a [Substreams-powered subgraph](https://thegraph.com/docs/en/sps/introduction/)) or SQL-DB sink. ### Subgraph -1. 
Run `substreams codegen subgraph` to intialize the sink, producing the neccessary files and function definitions. +1. Run `substreams codegen subgraph` to initialize the sink, producing the necessary files and function definitions. 2. Create your [subgraph mappings](../how-to-guides/sinks/subgraph/triggers.md) within the `mappings.ts` and associated entities within the `schema.graphql`. 3. Deploy ### SQL -1. Run `substreams codegen sql` and choose from either ClickHouse or Postgres to intialize the sink, producing the neccessary files. +1. Run `substreams codegen sql` and choose from either ClickHouse or Postgres to initialize the sink, producing the necessary files. 2. Run `substreams build` build the [Substreams:SQL](../how-to-guides/sinks/sql/sql-sink.md) sink. 3. Run `substreams-sink-sql` to sink the data into your selected SQL DB. @@ -56,5 +56,4 @@ The [CLI reference](../references/cli/command-line-interface.md) lets you explor ### Substreams Components Reference -The [Components Reference](../references/substreams-components/) dives deeeper into navigating the `substreams.yaml`. - +The [Components Reference](../references/substreams-components/) dives deeper into navigating the `substreams.yaml`.