This project aims to index order data from each of The Meta DAO's proposals into candlestick data, so we can show charts in the UI of how proposals perform over time.
The indexer is made of 3 components:
- the indexer service, which periodically polls an RPC for orders on proposals that haven't yet concluded, consolidates the order data into candles, and stores the candles in a database
- a Postgres database
- a Hasura instance which exposes a real-time, read-only GraphQL API over the Postgres data
Since this is just a generic means to cache on-chain data into Postgres and then expose a real-time GraphQL API over that data, it could be used for more than candlestick indexing, but we'll begin with that use case.
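For example, a UI could read candles from Hasura with a plain GraphQL POST. A minimal sketch follows; the endpoint, table, and column names are assumptions, not the actual schema this project exposes:

```ts
// Illustrative only: the endpoint, table, and column names are assumptions,
// not the actual Hasura schema exposed by this project.
const HASURA_URL = 'http://localhost:8080/v1/graphql';

async function fetchCandles(marketAcct: string) {
  const res = await fetch(HASURA_URL, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      query: `
        query CandlesByMarket($market: String!) {
          candles(
            where: { market_acct: { _eq: $market } }
            order_by: { timestamp: asc }
          ) {
            timestamp open high low close volume
          }
        }`,
      variables: { market: marketAcct },
    }),
  });
  const { data } = await res.json();
  return data.candles;
}
```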
Historical data on Solana is not available from a standard Solana RPC. Geyser was created to achieve lossless real-time data streaming into a db; however, it can only give you the current account states an RPC has stored plus any future state, allowing you to construct an accurate account history from the moment you enable the Geyser plugin, but not an account history from prior transactions. Not to mention this costs $2k per month from Triton and $1.1k per month from Helius. There are likely better ways to spend The Meta DAO's treasury, plus one of the selling points of an indexer in the first place is its potential to save on, rather than balloon, RPC costs. There is an ongoing collab between Triton, Firedancer, and Protocol Labs devs to store all historical data on IPFS (project Old Faithful), but this is still a work in progress, and just the index used to look up accounts on IPFS is already 50 terabytes!
Is there a simpler, cost effective way to get historical data?
The approach futarchy-indexer takes is to replay the transaction history to recreate each historical account state. Historical transactions are not pruned as aggressively as account state (where only the latest state is kept), so this works with standard Solana RPCs without needing to upgrade to more expensive infra tiers. The downside of this approach is a lot more complexity, since we have to actually parse each historical transaction and know how the Solana program executing it would have translated it into a mutation on account state. If that translation is very complex, this approach can be difficult to maintain. Thankfully, the states and transactions we're concerned with here (token balances, twap markets, proposal metadata, orderbooks, and swaps) aren't too complex.
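To make the replay idea concrete, here is a minimal sketch using `@solana/web3.js`. The `applyTx` reducer is hypothetical: the real parsing logic is program-specific and lives elsewhere in this repo.

```ts
import { Connection, PublicKey } from '@solana/web3.js';

// Hypothetical reducer: parses a transaction's instructions and applies the
// resulting mutations to our off-chain copy of account state.
type ApplyTx = (state: Map<string, unknown>, tx: unknown) => void;

async function replayHistory(rpcUrl: string, account: PublicKey, applyTx: ApplyTx) {
  const connection = new Connection(rpcUrl);
  const state = new Map<string, unknown>();

  // Page backwards through the account's full signature history.
  const signatures: string[] = [];
  let before: string | undefined;
  while (true) {
    const page = await connection.getSignaturesForAddress(account, { before, limit: 1000 });
    if (page.length === 0) break;
    signatures.push(...page.map((info) => info.signature));
    before = page[page.length - 1].signature;
  }

  // Replay oldest-first so each mutation sees the state the prior tx produced.
  for (const signature of signatures.reverse()) {
    const tx = await connection.getTransaction(signature, {
      maxSupportedTransactionVersion: 0,
    });
    if (tx) applyTx(state, tx);
  }
  return state;
}
```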
Futarchy Indexer operates on 2 core entities:
- transaction watchers
- indexers
A transaction watcher takes an account, then subscribes in real time to all signatures for that account. Its job is to ensure it both
- stores real-time transactions for the account using RPC webhook APIs
- has not skipped storing any transaction metadata, utilizing the `getSignaturesForAddress` API (see the backfill sketch after this list)
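A rough sketch of that backfill check, again with `@solana/web3.js`; `latestStoredSignature` and `storeSignature` are hypothetical stand-ins for the watcher's real postgres persistence:

```ts
import { Connection, PublicKey } from '@solana/web3.js';

// Hypothetical persistence helpers; the real watcher writes to postgres.
declare function latestStoredSignature(account: string): Promise<string | undefined>;
declare function storeSignature(account: string, signature: string, slot: number): Promise<void>;

// Fill any gap the webhook stream may have left: walk backwards from the
// chain tip until we reach the newest signature we have already stored.
async function backfill(connection: Connection, account: PublicKey) {
  const acct = account.toBase58();
  const until = await latestStoredSignature(acct);
  let before: string | undefined;
  while (true) {
    const page = await connection.getSignaturesForAddress(account, { before, until, limit: 1000 });
    if (page.length === 0) break;
    for (const info of page) {
      await storeSignature(acct, info.signature, info.slot);
    }
    before = page[page.length - 1].signature;
  }
}
```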
An indexer depends on one or more transaction watchers. Once it sees all its dependencies have backed up transactions to a certain slot, it can process all of those transactions up to that slot, parsing instruction data and updating the corresponding tables representing proposals, twaps, order books, and trading history.
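A sketch of that gating logic, with hypothetical helpers standing in for the real database queries:

```ts
// Hypothetical shapes and helpers; the real tables are defined in
// packages/database/lib/schema.ts.
interface Watcher {
  account: string;
  backedUpToSlot: bigint;
}

declare function getWatchers(accounts: string[]): Promise<Watcher[]>;
declare function getTransactionsUpTo(accounts: string[], slot: bigint): Promise<unknown[]>;
declare function parseAndUpdateTables(tx: unknown): Promise<void>;

// An indexer only advances to the lowest slot all of its watcher
// dependencies have backed up to, so it never processes a partial view.
async function runIndexerOnce(dependencyAccounts: string[]) {
  const watchers = await getWatchers(dependencyAccounts);
  if (watchers.length === 0) return;
  const safeSlot = watchers.reduce(
    (min, w) => (w.backedUpToSlot < min ? w.backedUpToSlot : min),
    watchers[0].backedUpToSlot,
  );
  for (const tx of await getTransactionsUpTo(dependencyAccounts, safeSlot)) {
    // Parse instruction data and update proposals, twaps, order books,
    // and trading history tables.
    await parseAndUpdateTables(tx);
  }
}
```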
Why do we want multiple indexers?
- This allows no-downtime upgrades to the indexing and transaction caching logic.
- If a bug is identified in prior indexer logic, we simply create a new indexer starting at slot 0 which will overwrite existing data until it catches up with the existing indexer, at which point we can remove the duplicate indexer.
- If a bug is identified in the transaction caching logic, we update the logic, then set the transaction watcher's slot back to 0 and start a new indexer at 0 which will overwrite existing data using the corrected transactions.
- As we upgrade the Meta DAO we'll need to watch different sets of accounts. For example, autocrat V0 and V0.1 have different programs and DAO accounts and should be represented by different watchers. Once we switch from OpenBook to an in-house AMM, we'll also need a new watcher. Multiple watchers / indexers running in parallel means we can index data for proposals based on old and new accounts simultaneously, and not lose the ability to index historical proposal data even as the DAO is upgraded.
Environment variables:
- `FUTARCHY_HELIUS_API_KEY`: used by indexer
- `FUTARCHY_PG_URL`: used by indexer
After cloning, run `pnpm install` in the project directory.
Docs on each top-level script are below.
Migrate the db to match the definition in `packages/database/lib/schema.ts`. Assumes you have set the `FUTARCHY_PG_URL` env var. Also regenerates the graphql client (TODO).
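As an illustration of the kind of definition that file holds, here is what a candle table might look like in a drizzle-orm style schema. The ORM choice and all names below are assumptions, not the actual contents of `schema.ts`:

```ts
import { pgTable, varchar, bigint, timestamp, numeric } from 'drizzle-orm/pg-core';

// Illustrative only: the real table definitions live in
// packages/database/lib/schema.ts; names and columns here are assumptions.
export const candles = pgTable('candles', {
  marketAcct: varchar('market_acct', { length: 44 }).notNull(),
  candleDuration: bigint('candle_duration', { mode: 'number' }).notNull(),
  timestamp: timestamp('timestamp').notNull(),
  open: numeric('open').notNull(),
  high: numeric('high').notNull(),
  low: numeric('low').notNull(),
  close: numeric('close').notNull(),
  volume: numeric('volume').notNull(),
});
```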
Run raw SQL against the database. Assumes you have set the `FUTARCHY_PG_URL` env var. You can add to the `COMMON_STATEMENTS` const in `packages/database/src/run-sql.ts` if you have a long SQL query you want to save for later reuse.
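For instance, a saved statement might look like this; the actual shape of `COMMON_STATEMENTS` and the table it queries are assumptions:

```ts
// Hypothetical shape: the actual const in packages/database/src/run-sql.ts
// and the table it queries may differ.
export const COMMON_STATEMENTS: Record<string, string> = {
  // Saved for reuse: run by name instead of retyping the SQL.
  watcherStatus: `
    SELECT acct, latest_tx_sig, checked_up_to_slot
    FROM transaction_watchers
    ORDER BY checked_up_to_slot ASC;
  `,
};
```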
TODO
Starts the service
Creates a new transaction watcher on a particular account
Resets an existing transaction watcher to a particular transaction/slot (or resets it back to 0)
Validates whether the cached txs for an account match the signature history returned by `getSignaturesForAddress`
TODO
Syncs the current Hasura GraphQL schema types to the client in futarchy-sdk using genql
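Assuming the generated client follows genql's usual `createClient` pattern, using it from futarchy-sdk might look roughly like this; the import path and field names are illustrative:

```ts
// Illustrative: assumes genql's standard generated createClient; the actual
// export path inside futarchy-sdk and the field names may differ.
import { createClient } from 'futarchy-sdk/generated';

const client = createClient({ url: 'http://localhost:8080/v1/graphql' });

async function latestCandles() {
  // genql compiles this typed selection object into a GraphQL query.
  const { candles } = await client.query({
    candles: {
      __args: { order_by: [{ timestamp: 'asc' }] },
      timestamp: true,
      open: true,
      close: true,
    },
  });
  return candles;
}
```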