# Quick Start

You'll need a Kubernetes cluster to try it out, e.g. Docker Desktop.

Deploy into the argo-dataflow-system namespace:

```bash
kubectl apply -f https://raw.githubusercontent.com/argoproj-labs/argo-dataflow/main/config/quick-start.yaml
```

Change to the installation namespace:

```bash
kubectl config set-context --current --namespace=argo-dataflow-system
```

Wait for the deployments to be available (ctrl+c when available):

```bash
kubectl get deploy -w
```
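Alternatively, rather than watching and interrupting, you can block until the deployments report ready; a minimal sketch, assuming a 120-second timeout is long enough for your cluster:

```bash
# Block until every deployment in the current namespace is Available
kubectl wait deploy --all --for=condition=Available --timeout=120s
```

This is handy in scripts, where an interactive `ctrl+c` is not an option.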

If you want the user interface:

```bash
kubectl apply -f https://raw.githubusercontent.com/argoproj-labs/argo-dataflow/main/config/apps/argo-server.yaml
kubectl get deploy -w # ctrl+c when available
kubectl port-forward svc/argo-server 2746:2746
```

Open http://localhost:2746/pipelines/argo-dataflow-system.

Run one of the examples.
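For instance, applying an example manifest from the repository's `examples/` directory looks like the following; the filename here is illustrative, so check that directory for the actual examples:

```bash
# Apply an example pipeline manifest (filename is hypothetical)
kubectl apply -f https://raw.githubusercontent.com/argoproj-labs/argo-dataflow/main/examples/101-hello-pipeline.yaml
```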

## Kafka

If you want to experiment with Kafka, install it:

```bash
kubectl apply -f https://raw.githubusercontent.com/argoproj-labs/argo-dataflow/main/config/apps/kafka.yaml
```

Configure Dataflow to use that Kafka installation by default:

```bash
kubectl apply -f https://raw.githubusercontent.com/argoproj-labs/argo-dataflow/main/examples/dataflow-kafka-default-secret.yaml
```

Wait for the statefulsets to be available (ctrl+c when available):

```bash
kubectl get statefulset -w
```

If you want to connect to it from your desktop, e.g. as a consumer or producer, you can port-forward the Kafka broker:

```bash
kubectl port-forward svc/kafka-broker 9092:9092
```

You can use Kafka's console producer to send messages to the broker; see the [Kafka quickstart](https://kafka.apache.org/quickstart).
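With the port-forward above in place, a minimal sketch using the console scripts bundled with a local Kafka download (the topic name `my-topic` is a hypothetical example):

```bash
# Run from the root of a local Kafka distribution.
# Produce a message through the forwarded broker:
echo "hello dataflow" | bin/kafka-console-producer.sh --bootstrap-server localhost:9092 --topic my-topic

# Read it back:
bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic my-topic --from-beginning
```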