diff --git a/.github/API_fink.png b/.github/API_fink.png
index e33cf28..a367901 100644
Binary files a/.github/API_fink.png and b/.github/API_fink.png differ
diff --git a/README.md b/README.md
index bf36f8f..39d0aae 100644
--- a/README.md
+++ b/README.md
@@ -4,43 +4,27 @@
 ![structure](.github/API_fink.png)
 
-This repository contains the code source of the Fink REST API used to access object data stored in tables in Apache HBase.
+The object API is a Flask application used to access object data stored in tables in Apache HBase. The application relies internally on two components: the Java Gateway and the Fink cutout API.
 
-## Requirements and installation
-
-You will need Python installed (>=3.11) with requirements listed in [requirements.txt](requirements.txt). You will also need [fink-cutout-api](https://github.com/astrolabsoftware/fink-cutout-api) fully installed (which implies Hadoop installed on the machine, and Java 11/17). For the full installation and deployment, refer as to the [procedure](install/README.md).
+The Java Gateway enables the Flask application to communicate with a JVM using [py4j](https://www.py4j.org/), where the Fink HBase client based on [Lomikel](https://github.com/hrivnac/Lomikel) is available. This client simplifies the interaction with HBase tables, where Fink aggregated alert data is stored.
 
-## Configuration
+The Fink cutout API is a Flask application to access cutouts from the Fink datalake. We only store cutout metadata in HBase, and this API retrieves the data from the raw parquet files stored on HDFS.
 
-First you need to configure the parameters in [config.yml](config.yml):
+## User documentation
 
-```yml
-# Host and port of the application
-HOST: localhost
-PORT: 32000
+TBD
 
-# URL of the fink_cutout_api
-CUTOUTAPIURL: http://localhost
-
-# HBase configuration
-HBASEIP: localhost
-ZOOPORT: 2183
+## Requirements and installation
 
-# Table schema (schema_{fink_broker}_{fink_science})
-SCHEMAVER: schema_3.1_5.21.14
+You will need Python installed (>=3.9) with requirements listed in [requirements.txt](requirements.txt). You will also need [fink-cutout-api](https://github.com/astrolabsoftware/fink-cutout-api) fully installed (which implies Hadoop installed on the machine, and Java 11/17). For the full installation and deployment, refer to the [procedure](install/README.md).
 
-# Maximum number of rows to
-# return in one call
-NLIMIT: 10000
-```
+## Deployment
 
-Make sure that the `SCHEMAVER` is the same you use for your tables in HBase.
+The input parameters can be found in [config.yml](config.yml). Make sure that the `SCHEMAVER` is the same as the one you use for your tables in HBase.
 
 TODO:
 
 - [ ] Find a way to automatically sync schema with tables.
 
-## Deployment
-
 ### Debug
 
 After starting [fink-cutout-api](https://github.com/astrolabsoftware/fink-cutout-api), you can simply test the API using:
@@ -51,7 +35,7 @@ python app.py
 ```
 
 ### Production
 
-The application is managed by `gunicorn` and `systemd` (see [install](install/README.md)), and you can simply manage it using:
+The application is simply managed by `gunicorn` and `systemd` (see [install](install/README.md)), and you can manage it using:
 
 ```bash
 # start the application