feat(release): update 0b5fb2a #10

Merged: 2 commits, May 9, 2024
22 changes: 22 additions & 0 deletions .env.localstack
@@ -0,0 +1,22 @@
AWS_REGION=us-east-1
AWS_ENDPOINT=http://localstack:4566
AWS_ACCESS_KEY_ID=test
AWS_ACCESS_KEY_SECRET=test
SQS_PREPARE_BUNDLE_URL=http://sqs.us-east-1.localstack.localstack.cloud:4566/000000000000/prepare-bundle-queue
SQS_POST_BUNDLE_URL=http://sqs.us-east-1.localstack.localstack.cloud:4566/000000000000/post-bundle-queue
SQS_SEED_BUNDLE_URL=http://sqs.us-east-1.localstack.localstack.cloud:4566/000000000000/seed-bundle-queue
SQS_FINALIZE_UPLOAD_URL=http://sqs.us-east-1.localstack.localstack.cloud:4566/000000000000/finalize-multipart-queue
SQS_OPTICAL_URL=http://sqs.us-east-1.localstack.localstack.cloud:4566/000000000000/optical-post-queue
SQS_NEW_DATA_ITEM_URL=http://sqs.us-east-1.localstack.localstack.cloud:4566/000000000000/batch-insert-new-data-items-queue
SQS_UNBUNDLE_BDI_URL=http://sqs.us-east-1.localstack.localstack.cloud:4566/000000000000/bdi-unbundle-queue
PLAN_BUNDLE_ENABLED=true
VERIFY_BUNDLE_ENABLED=true
SKIP_BALANCE_CHECKS=true
OPTICAL_BRIDGING_ENABLED=false
TURBO_OPTICAL_KEY=$ARWEAVE_WALLET
NODE_ENV=local
DATA_ITEM_BUCKET=raw-data-items
DATA_ITEM_BUCKET_REGION=us-east-1
LOG_LEVEL=debug
S3_FORCE_PATH_STYLE=true
ARWEAVE_WALLET=
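The `ARWEAVE_WALLET` value above is intentionally left blank; it must be filled with a single-line JSON representation of an Arweave JWK before the stack can sign bundles. A minimal sketch of producing that value with `jq` (the keyfile name `wallet.json` and its contents are illustrative stand-ins, not part of this repo):

```shell
# Write a stand-in keyfile; a real Arweave JWK carries the full RSA
# key material (d, p, q, dp, dq, qi) in addition to these fields.
cat > wallet.json <<'EOF'
{
  "kty": "RSA",
  "e": "AQAB",
  "n": "example-modulus"
}
EOF

# jq -c compacts the JSON onto one line, suitable for a KEY=value env file.
ARWEAVE_WALLET="$(jq -c . wallet.json)"
echo "$ARWEAVE_WALLET"
```

Depending on how the env file is consumed, inner quotes may additionally need escaping; check how your shell or compose version parses the file.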
5 changes: 5 additions & 0 deletions .gitignore
@@ -24,6 +24,8 @@ node_modules
# Env
**/.env

!.env.localstack

#Yarn
#https://yarnpkg.com/getting-started/qa#which-files-should-be-gitignored
.pnp.*
@@ -36,3 +38,6 @@ node_modules

# 🕵️‍♂️
.wallet

# LocalStack default volume folder
/volume
2 changes: 1 addition & 1 deletion .mocharc.js
@@ -11,7 +11,7 @@ process.env.BLOCKLISTED_ADDRESSES ??= // cspell:disable
module.exports = {
extension: ["ts"],
require: ["ts-node/register/transpile-only", "tests/testSetup.ts"],
timeout: "10000", // 10 seconds
timeout: "20000", // 20 seconds
parallel: true,
recursive: true,
};
7 changes: 7 additions & 0 deletions .vscode/settings.json
@@ -17,6 +17,7 @@
"blocklisted",
"bundlr",
"dataitem",
"ethersproject",
"indep",
"indexdef",
"indexname",
@@ -70,5 +71,11 @@
"typescript.enablePromptUseWorkspaceTsdk": true,
"[dockerfile]": {
"editor.defaultFormatter": "ms-azuretools.vscode-docker"
},
"[shellscript]": {
"editor.defaultFormatter": "foxundermoon.shell-format"
},
"[dotenv]": {
"editor.defaultFormatter": "foxundermoon.shell-format"
}
}
4 changes: 2 additions & 2 deletions Dockerfile
@@ -8,8 +8,8 @@ WORKDIR /usr/src/app
COPY . .
RUN yarn && yarn build

# Clean out dependencies
RUN yarn workspaces focus --production
# Clear cache and install production dependencies
RUN rm -rf node_modules && yarn workspaces focus --production

FROM gcr.io/distroless/nodejs${NODE_VERSION_SHORT}-debian11
WORKDIR /usr/src/app
15 changes: 15 additions & 0 deletions Dockerfile.localstack
@@ -0,0 +1,15 @@
FROM localstack/localstack

COPY scripts/provision_localstack.sh /opt/code/provision_localstack.sh
COPY scripts/localstack_entrypoint.sh /docker-entrypoint-initaws.d/entrypoint.sh

RUN chmod +x /opt/code/provision_localstack.sh \
&& chmod +x /docker-entrypoint-initaws.d/entrypoint.sh

RUN aws configure --profile localstack set aws_access_key_id test && \
aws configure --profile localstack set aws_secret_access_key test && \
aws configure --profile localstack set region us-east-1

# A wrapper around the localstack image's entrypoint script
# that provisions the necessary 'AWS' resources for Turbo.
ENTRYPOINT ["/docker-entrypoint-initaws.d/entrypoint.sh"]
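`scripts/provision_localstack.sh` itself is not included in this diff. A hedged, dry-run sketch of the provisioning it presumably performs — creating the bucket and queues named in `.env.localstack` — might look like the following; the commands are echoed rather than executed, and a real script would run them via `awslocal` (or `aws --endpoint-url http://localhost:4566`):

```shell
#!/usr/bin/env bash
# Dry-run sketch: print provisioning commands for the resources referenced
# in .env.localstack. Resource names come from that file; the script
# structure itself is an assumption, not the repo's actual script.
set -eu

queues="prepare-bundle-queue post-bundle-queue seed-bundle-queue \
finalize-multipart-queue optical-post-queue \
batch-insert-new-data-items-queue bdi-unbundle-queue"

echo "awslocal s3 mb s3://raw-data-items"
for q in $queues; do
  echo "awslocal sqs create-queue --queue-name $q"
done
```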
30 changes: 27 additions & 3 deletions README.md
@@ -1,6 +1,19 @@
# Turbo Upload Service

Welcome to the Turbo Upload Service 👋
Turbo is a robust data bundling service that packages [ANS-104](https://github.com/ArweaveTeam/arweave-standards/blob/master/ans/ANS-104.md) "data items" for reliable delivery to [Arweave](https://arweave.org). It is architected to run at scale in AWS, but can also run in smaller-scale, Docker-enabled environments via integrations with [LocalStack](https://github.com/localstack/localstack). Additionally, local-development-oriented use cases are supported via integrations with [ArLocal](https://github.com/textury/arlocal).

Turbo is powered by two primary services:

- Upload Service: accepts incoming data uploads in single-request or multipart fashion.
- Fulfillment Service: facilitates asynchronous back-end operations for reliable data delivery to Arweave.

They are composed atop a common set of service dependencies, including but not limited to:

- a PostgreSQL database (containerized locally or running on RDS in AWS)
- an object store (S3)
- a collection of durable job queues (SQS) that facilitate various workloads relevant to Arweave ecosystem integrations

Data items accepted by the service can be signed with Arweave, Ethereum, or Solana private keys.

## Setting up the development environment

@@ -12,10 +25,21 @@ For a compatible development environment, we require the following packages inst
- `yarn`
- `husky`
- `docker`
- `aws`
- `localstack` (optional)

### Quick Start: Run all services in Docker

- Set the ARWEAVE_WALLET environment variable in [.env.localstack](.env.localstack) to an escaped, single-line JSON string representation of an Arweave JWK (necessary for bundle signing)
- Run `docker compose --env-file ./.env.localstack up upload-service`

Once all of its dependencies are healthy, the Upload Service will start on port 3000. Visit its `/api-docs` endpoint for more information on supported HTTP routes.

NOTE: Database and queue state persistence across service runs is the responsibility of the operator.
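Since a blank `ARWEAVE_WALLET` only surfaces later as a bundle-signing failure, a small pre-flight check before `docker compose up` can save a round trip. This helper is an illustrative assumption, not part of the repo:

```shell
# Fail fast if ARWEAVE_WALLET has no value in the given env file.
check_wallet() {
  grep -Eq '^ARWEAVE_WALLET=.+' "$1"
}

# Demonstration against a temp file standing in for .env.localstack:
envfile="$(mktemp)"
printf 'ARWEAVE_WALLET={"kty":"RSA","e":"AQAB"}\n' > "$envfile"
check_wallet "$envfile" && echo "wallet set"
```

In practice: `check_wallet .env.localstack || exit 1` before invoking `docker compose --env-file ./.env.localstack up upload-service`.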

### Running the Upload Service locally

With a compatible system, follow these steps to start the upload service:
With a compatible system, follow these steps to start the Upload Service on its own on your local system:

- `cp .env.sample .env` (and update values)
- `yarn`
@@ -104,7 +128,7 @@ Unit and integration tests can be run locally or via docker. For either, you can

### Integration Tests

- `yarn test:docker` - runs integration tests (and unit tests) in an isolated docker container (RECOMMENDED)
- `yarn test:integration:local` - runs the integration tests locally against postgres and arlocal docker containers
- `yarn test:integration:local -g "Router"` - runs targeted integration tests against postgres and arlocal docker containers
- `watch -n 30 'yarn test:integration:local -g "Router"'` - runs targeted integration tests on an interval (helpful when actively writing tests)
- `yarn test:docker` - runs integration tests (and unit tests) in an isolated docker container
118 changes: 114 additions & 4 deletions docker-compose.yml
@@ -13,20 +13,88 @@ services:
NODE_VERSION: ${NODE_VERSION:-18.17.0}
NODE_VERSION_SHORT: ${NODE_VERSION_SHORT:-18}
environment:
NODE_ENV: ${NODE_ENV:-test}
NODE_ENV: ${NODE_ENV:-local}
DB_HOST: upload-service-pg
DB_PORT: 5432
DB_PASSWORD: postgres
PAYMENT_SERVICE_BASE_URL: ${PAYMENT_SERVICE_BASE_URL:-payment.ardrive.dev}
MAX_DATA_ITEM_SIZE: ${MAX_DATA_ITEM_SIZE:-10737418240}
ALLOW_LISTED_ADDRESSES: ${ALLOW_LISTED_ADDRESSES:-}
MIGRATE_ON_STARTUP: true
AWS_ENDPOINT: ${AWS_ENDPOINT:-}
AWS_ACCESS_KEY_ID: ${AWS_ACCESS_KEY_ID:-}
AWS_SECRET_ACCESS_KEY: ${AWS_SECRET_ACCESS_KEY:-}
SQS_PREPARE_BUNDLE_URL: ${SQS_PREPARE_BUNDLE_URL:-}
SQS_FINALIZE_UPLOAD_URL: ${SQS_FINALIZE_UPLOAD_URL:-}
SQS_OPTICAL_URL: ${SQS_OPTICAL_URL:-}
SQS_NEW_DATA_ITEM_URL: ${SQS_NEW_DATA_ITEM_URL:-}
SQS_UNBUNDLE_BDI_URL: ${SQS_UNBUNDLE_BDI_URL:-}
OPTICAL_BRIDGING_ENABLED: ${OPTICAL_BRIDGING_ENABLED:-false}
SKIP_BALANCE_CHECKS: ${SKIP_BALANCE_CHECKS:-true}
DATA_ITEM_BUCKET: ${DATA_ITEM_BUCKET:-raw-data-items}
DATA_ITEM_BUCKET_REGION: ${DATA_ITEM_BUCKET_REGION:-us-east-1}
LOG_LEVEL: ${LOG_LEVEL:-info}
S3_FORCE_PATH_STYLE: ${S3_FORCE_PATH_STYLE:-}
AWS_REGION: ${AWS_REGION:-us-east-1}
ports:
- "${PORT:-3000}:${PORT:-3000}"
volumes:
- upload-service-data-items:/temp
depends_on:
- upload-service-pg
- fulfillment-service

fulfillment-service:
build:
context: .
dockerfile: Dockerfile.fulfillment
args:
NODE_VERSION: ${NODE_VERSION:-18.17.0}
NODE_VERSION_SHORT: ${NODE_VERSION_SHORT:-18}
environment:
NODE_ENV: ${NODE_ENV:-local}
DB_HOST: upload-service-pg
DB_PORT: 5432
DB_PASSWORD: postgres
PORT: ${FULFILLMENT_PORT:-4000}
AWS_ENDPOINT: ${AWS_ENDPOINT:-}
AWS_ACCESS_KEY_ID: ${AWS_ACCESS_KEY_ID:-}
AWS_SECRET_ACCESS_KEY: ${AWS_SECRET_ACCESS_KEY:-}
SQS_PREPARE_BUNDLE_URL: ${SQS_PREPARE_BUNDLE_URL:-}
SQS_POST_BUNDLE_URL: ${SQS_POST_BUNDLE_URL:-}
SQS_SEED_BUNDLE_URL: ${SQS_SEED_BUNDLE_URL:-}
SQS_FINALIZE_UPLOAD_URL: ${SQS_FINALIZE_UPLOAD_URL:-}
SQS_OPTICAL_URL: ${SQS_OPTICAL_URL:-}
SQS_NEW_DATA_ITEM_URL: ${SQS_NEW_DATA_ITEM_URL:-}
SQS_UNBUNDLE_BDI_URL: ${SQS_UNBUNDLE_BDI_URL:-}
PLAN_BUNDLE_ENABLED: ${PLAN_BUNDLE_ENABLED:-true}
VERIFY_BUNDLE_ENABLED: ${VERIFY_BUNDLE_ENABLED:-true}
OPTICAL_BRIDGING_ENABLED: ${OPTICAL_BRIDGING_ENABLED:-false}
SKIP_BALANCE_CHECKS: ${SKIP_BALANCE_CHECKS:-true}
DATA_ITEM_BUCKET: ${DATA_ITEM_BUCKET:-raw-data-items}
DATA_ITEM_BUCKET_REGION: ${DATA_ITEM_BUCKET_REGION:-us-east-1}
S3_FORCE_PATH_STYLE: ${S3_FORCE_PATH_STYLE:-}
AWS_REGION: ${AWS_REGION:-us-east-1}

depends_on:
localstack:
condition: service_healthy
upload-service-pg:
condition: service_started
migrator-service:
condition: service_started

migrator-service:
build:
context: .
dockerfile: Dockerfile.migration
args:
NODE_VERSION: ${NODE_VERSION:-18.17.0}
environment:
DB_HOST: upload-service-pg
DB_PORT: 5432
DB_PASSWORD: postgres
depends_on:
- upload-service-pg

upload-service-pg:
image: postgres:13.8
@@ -52,9 +120,51 @@ services:
DISABLE_LOGS: ${DISABLE_LOGS:-true}
NODE_ENV: ${NODE_ENV:-test}
ARWEAVE_GATEWAY: ${ARWEAVE_GATEWAY:-http://arlocal:1984}
AWS_ENDPOINT: ${AWS_ENDPOINT:-}
AWS_ACCESS_KEY_ID: ${AWS_ACCESS_KEY_ID:-}
AWS_SECRET_ACCESS_KEY: ${AWS_SECRET_ACCESS_KEY:-}
SQS_PREPARE_BUNDLE_URL: ${SQS_PREPARE_BUNDLE_URL:-}
SQS_POST_BUNDLE_URL: ${SQS_POST_BUNDLE_URL:-}
SQS_SEED_BUNDLE_URL: ${SQS_SEED_BUNDLE_URL:-}
SQS_FINALIZE_UPLOAD_URL: ${SQS_FINALIZE_UPLOAD_URL:-}
SQS_OPTICAL_URL: ${SQS_OPTICAL_URL:-}
SQS_NEW_DATA_ITEM_URL: ${SQS_NEW_DATA_ITEM_URL:-}
SQS_UNBUNDLE_BDI_URL: ${SQS_UNBUNDLE_BDI_URL:-}
DATA_ITEM_BUCKET: ${DATA_ITEM_BUCKET:-raw-data-items}
DATA_ITEM_BUCKET_REGION: ${DATA_ITEM_BUCKET_REGION:-us-east-1}
S3_FORCE_PATH_STYLE: ${S3_FORCE_PATH_STYLE:-}
depends_on:
- upload-service-pg
- arlocal
localstack:
condition: service_healthy
upload-service-pg:
condition: service_started
arlocal:
condition: service_started

localstack:
container_name: "${LOCALSTACK_DOCKER_NAME:-localstack}"
build:
context: .
dockerfile: Dockerfile.localstack
ports:
- "127.0.0.1:4566:4566" # LocalStack Gateway
#- "127.0.0.1:4510-4559:4510-4559" # external services port range
environment:
# LocalStack configuration: https://docs.localstack.cloud/references/configuration/
- SERVICES=${SERVICES:-s3,sqs,secretsmanager}
- DEBUG=${DEBUG:-1}
- NODE_ENV=${NODE_ENV:-local}
- ARWEAVE_WALLET=${ARWEAVE_WALLET:-}
- TURBO_OPTICAL_KEY=${TURBO_OPTICAL_KEY:-$ARWEAVE_WALLET}
volumes:
- "${LOCALSTACK_VOLUME_DIR:-./volume}:/var/lib/localstack"
- "/var/run/docker.sock:/var/run/docker.sock"
healthcheck:
test: ["CMD", "curl", "-f", "http://localhost:4566/_localstack/health"]
interval: 90s
timeout: 30s
retries: 1
start_period: 15s

volumes:
upload-service-data:
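The `localstack` healthcheck above probes `http://localhost:4566/_localstack/health` on a generous schedule (one retry, 90s interval). The same wait-until-healthy pattern can be sketched as a small shell helper; the function and its parameters are illustrative, with the probe command parameterized so the real `curl -f` probe can be swapped in:

```shell
# Poll a probe command until it succeeds or retries run out.
wait_healthy() {
  probe="$1"; retries="$2"; delay="$3"
  i=1
  while [ "$i" -le "$retries" ]; do
    if $probe; then
      echo "healthy after $i attempt(s)"
      return 0
    fi
    sleep "$delay"
    i=$((i + 1))
  done
  echo "unhealthy after $retries attempts" >&2
  return 1
}

# Stub demonstration; against a running stack, the call would be e.g.:
#   wait_healthy "curl -sf http://localhost:4566/_localstack/health" 5 10
wait_healthy true 3 0
```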