```bash
fink_datatransfer.py -h
usage: fink_datatransfer.py [-h] [-topic TOPIC] [-limit LIMIT] [-outdir OUTDIR] [-partitionby PARTITIONBY]
                            [-batchsize BATCHSIZE] [--restart_from_beginning] [--verbose]

Kafka consumer to listen and archive Fink streams from the data transfer service

optional arguments:
  -h, --help            show this help message and exit
  -topic TOPIC          Topic name for the stream that contains the data.
  -limit LIMIT          If specified, download only `limit` alerts from the stream. Default is None, that is,
                        download all alerts.
  -outdir OUTDIR        Folder to store incoming alerts. It will be created if it does not exist.
  -partitionby PARTITIONBY
                        Partition data by `time` (year=YYYY/month=MM/day=DD), or `finkclass` (finkclass=CLASS),
                        or `tnsclass` (tnsclass=CLASS). Default is `time`.
  -batchsize BATCHSIZE  Maximum number of alerts within `maxtimeout` (see conf). Default is 1000 alerts.
  --restart_from_beginning
                        If specified, restart downloading from the 1st alert in the stream. Default is False.
  --verbose             If specified, print information about the consumer on screen.
```
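For example, a typical invocation to pull a full stream and partition it by Fink classification could look like this (`mytopic` is a placeholder; actual topic names are provided by the data transfer service):

```bash
fink_datatransfer.py -topic mytopic -outdir ./alerts -partitionby finkclass --verbose
```

With `-partitionby finkclass`, alerts land under `./alerts/finkclass=CLASS/`; the default `time` partitioning yields `./alerts/year=YYYY/month=MM/day=DD/` instead.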
Note that unlike the livestream service, the schema for these alerts is stored in a separate topic (topic_name_schema) that is polled simultaneously when retrieving alerts.
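A minimal sketch of that two-topic pattern, assuming the confluent-kafka and fastavro packages, a placeholder broker address, and a hypothetical topic `mytopic` (with its schema in `mytopic_schema`), might look like:

```python
import io
import json

from confluent_kafka import Consumer
import fastavro

conf = {
    "bootstrap.servers": "localhost:9092",  # placeholder broker address
    "group.id": "fink_example",
    "auto.offset.reset": "earliest",
}

# Poll the companion schema topic first: its messages carry the Avro
# schema needed to decode the alert payloads (assumed here to be JSON).
schema_consumer = Consumer({**conf, "group.id": "fink_example_schema"})
schema_consumer.subscribe(["mytopic_schema"])
schema_msg = schema_consumer.poll(timeout=10)
schema = fastavro.parse_schema(json.loads(schema_msg.value()))
schema_consumer.close()

# Then decode alerts from the data topic with that schema.
consumer = Consumer(conf)
consumer.subscribe(["mytopic"])
msg = consumer.poll(timeout=10)
if msg is not None and msg.error() is None:
    alert = fastavro.schemaless_reader(io.BytesIO(msg.value()), schema)
    print(alert.get("objectId"))
consumer.close()
```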
To do:
- new script
- documentation in the README
- documentation in the readthedocs (along with the datatransfer service)
PoC:
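A hypothetical sketch of such a proof of concept, extending the snippet above with the time partitioning described in the help (year=YYYY/month=MM/day=DD); this is not the actual fink_datatransfer.py implementation, and all names are placeholders:

```python
import io
import json
import os
from datetime import datetime, timezone

from confluent_kafka import Consumer
import fastavro

TOPIC = "mytopic"    # placeholder topic name
OUTDIR = "alerts"    # mirrors the -outdir option
BATCHSIZE = 1000     # mirrors the -batchsize default

conf = {
    "bootstrap.servers": "localhost:9092",  # placeholder broker
    "group.id": "fink_poc",
    "auto.offset.reset": "earliest",        # akin to --restart_from_beginning
}

# Fetch the Avro schema from the companion `<topic>_schema` topic.
schema_consumer = Consumer({**conf, "group.id": "fink_poc_schema"})
schema_consumer.subscribe([TOPIC + "_schema"])
schema_msg = schema_consumer.poll(timeout=10)
schema = fastavro.parse_schema(json.loads(schema_msg.value()))
schema_consumer.close()

consumer = Consumer(conf)
consumer.subscribe([TOPIC])

# Consume up to BATCHSIZE alerts, writing each one under
# year=YYYY/month=MM/day=DD, as the -partitionby help describes.
for _ in range(BATCHSIZE):
    msg = consumer.poll(timeout=10)
    if msg is None or msg.error() is not None:
        continue
    alert = fastavro.schemaless_reader(io.BytesIO(msg.value()), schema)
    now = datetime.now(timezone.utc)
    folder = os.path.join(
        OUTDIR,
        "year={:04d}".format(now.year),
        "month={:02d}".format(now.month),
        "day={:02d}".format(now.day),
    )
    os.makedirs(folder, exist_ok=True)
    # Re-serialize each alert as a single-record Avro file (illustrative only).
    fname = os.path.join(folder, "{}.avro".format(alert.get("objectId", "alert")))
    with open(fname, "wb") as f:
        fastavro.writer(f, schema, [alert])

consumer.close()
```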