
Implement Export/Import dataflow #733

Open · tuan08 opened this issue Jun 1, 2015 · 1 comment

tuan08 (Contributor) commented Jun 1, 2015:

Create a shell command:

shell.sh dataflow export --dataflow-id <dataflow-id> --config-file <config-file>

The command will:

1. Read the dataflow data with registry.get("/neverwinterdp/scribengin/dataflow/<dataflow-id>").
2. Convert the data to a DataflowDescriptor object, using JSONSerializer.
3. Get the latest dataflow config from the registry and update the DataflowDescriptor with it. This is not ready yet, and I need to update some registry code first.
4. Write the DataflowDescriptor to the --config-file file, as sketched below.
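A minimal sketch of this export flow follows. The Registry interface and the DataflowDescriptor fields here are illustrative placeholders, not the actual Scribengin API, and Jackson stands in for the project's JSONSerializer:

```java
import java.nio.file.Paths;

import com.fasterxml.jackson.databind.ObjectMapper;

public class DataflowExportCommand {
  /** Placeholder for the real registry client (ZooKeeper-backed in Scribengin). */
  interface Registry {
    byte[] get(String path) throws Exception;
  }

  /** Minimal stand-in for the real DataflowDescriptor; the actual class has more fields. */
  public static class DataflowDescriptor {
    public String id;
    public String name;
  }

  public static void export(Registry registry, String dataflowId, String configFile) throws Exception {
    // 1. Read the raw dataflow data from the registry.
    byte[] data = registry.get("/neverwinterdp/scribengin/dataflow/" + dataflowId);

    // 2. Deserialize the data into a DataflowDescriptor (Jackson here stands in
    //    for the project's JSONSerializer).
    ObjectMapper json = new ObjectMapper();
    DataflowDescriptor descriptor = json.readValue(data, DataflowDescriptor.class);

    // 3. Merging the latest config from the registry is omitted; per the issue,
    //    that step depends on registry changes that are not ready yet.

    // 4. Write the descriptor to the --config-file target.
    json.writerWithDefaultPrettyPrinter().writeValue(Paths.get(configFile).toFile(), descriptor);
  }
}
```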
KingOfPoptart (Contributor) commented:

Use case 1:
Entire cluster shuts down cleanly (including Kafka, ZooKeeper, Scribengin, Hadoop, etc.)
Entire cluster starts back up
Everything should continue as normal

Use case 2:
Export configuration of currently running cluster
Stop the cluster
Start the cluster again with the exported configuration.

Use case 3:
Export configuration of currently running cluster
Stop the cluster
Start a completely new cluster with the exported configuration.
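A sketch of the import half for use cases 2 and 3: read the exported config file back into a descriptor and resubmit it. The DataflowClient interface is a hypothetical stand-in; Scribengin's real submission API may differ:

```java
import java.io.File;

import com.fasterxml.jackson.databind.ObjectMapper;

public class DataflowImportCommand {
  /** Minimal stand-in for the real DataflowDescriptor. */
  public static class DataflowDescriptor {
    public String id;
    public String name;
  }

  /** Hypothetical client for submitting a dataflow to a cluster; not the real API. */
  interface DataflowClient {
    void submit(DataflowDescriptor descriptor) throws Exception;
  }

  public static void importAndSubmit(DataflowClient client, String configFile) throws Exception {
    ObjectMapper json = new ObjectMapper();
    // Read the descriptor previously written by the export command.
    DataflowDescriptor descriptor = json.readValue(new File(configFile), DataflowDescriptor.class);
    // Submit it: to the same cluster after a restart (use case 2),
    // or to a completely new cluster (use case 3).
    client.submit(descriptor);
  }
}
```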
