An API for various core FAIR Metrics Tests, written in Python, conforming to the specifications defined by the FAIRMetrics working group.
FAIR metrics tests are API operations that check whether a subject URL complies with requirements defined by a community; they usually verify that the resource available at the subject URL complies with the FAIR principles (Findable, Accessible, Interoperable, Reusable).
This API is deployed publicly at https://metrics.api.fair-enough.semanticscience.org
🗃️ It can be used with the following FAIR evaluation services:
This FAIR metrics tests API has been built with the `fair-test` Python library.
ℹ️ You are welcome to submit a pull request to propose adding your tests to our API in production: https://metrics.api.fair-enough.semanticscience.org
- Fork this repository
- Clone your forked repository
- Duplicate an existing test file in the `metrics` folder, and modify it to define your FAIR metrics tests (see the sketch after this list)
- Start your FAIR metrics tests API with `docker-compose up` to check that your FAIR metric test works as expected
- Send us a pull request to integrate your test into our API at https://metrics.api.fair-enough.semanticscience.org
- Once your test is published, register it in existing FAIR evaluation services.
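To give an idea of what a test file contains, here is a minimal sketch assuming the `fair-test` library's `FairTest` class; the metric path, principle, and subject URLs below are placeholders, not an actual metric from this repository:

```python
from fair_test import FairTest, FairTestEvaluation

class MetricTest(FairTest):
    # Placeholder metadata: adapt these values to your own metric
    metric_path = 'x1-my-metric'
    applies_to_principle = 'F1'
    title = 'Check a community requirement'
    description = 'Test if the subject URL complies with a community-defined requirement.'
    metric_version = '0.1.0'
    # Subject URLs mapped to their expected scores, used by the automated tests
    test_test = {
        'https://example.org/a-compliant-resource': 1,
        'https://example.org/a-failing-resource': 0,
    }

    def evaluate(self, eval: FairTestEvaluation):
        eval.info(f'Checking the subject URL {eval.subject}')
        # ... perform the actual checks on eval.subject here ...
        eval.success('The subject complies with the requirement')
        return eval.response()
```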
First, clone the repository:
```bash
git clone https://github.com/MaastrichtU-IDS/fair-enough-metrics
cd fair-enough-metrics
```
To deploy the API in development, with automatic reload when the code changes, run this command:
```bash
docker-compose up dev
```
Access the OpenAPI Swagger UI at http://localhost:8000
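You can then run a metric test against a subject URL from the command line; the `/tests/a1-metadata-protocol` path is an assumption based on the metric name used further below, and the DOI is just a placeholder subject (check the Swagger UI for the exact paths):

```bash
# POST a subject URL to a metric test endpoint (path assumed; see the Swagger UI)
curl -X POST http://localhost:8000/tests/a1-metadata-protocol \
  -H "Content-Type: application/json" \
  -d '{"subject": "https://doi.org/10.1594/PANGAEA.908011"}'
```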
If you make changes to the dependencies in `pyproject.toml`, you will need to rebuild the image to install the new requirements:
```bash
docker-compose up --build dev
```
Run the tests:
```bash
docker-compose run test
# You can pass args:
docker-compose run test pytest -s
```
Run in production (adapt the `docker-compose.yml` file to your deployment solution):
Define the Bing and Google Search API keys in the `secrets.env` file, which will not be committed to git:
```bash
APIKEY_BING_SEARCH=bingapikey
APIKEY_GOOGLE_SEARCH=googleapikey
```
To start the stack with production config:
```bash
docker-compose up -d prod
```
We use the nginx-proxy reverse proxy for Docker to route the services.
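As an illustration, here is a minimal sketch of how a service is typically routed by nginx-proxy; the service name and hostname below are assumptions, adapt them to your own `docker-compose.yml`:

```yaml
# Hypothetical excerpt of a docker-compose.yml service routed by nginx-proxy
services:
  prod:
    build: .
    environment:
      # nginx-proxy routes requests for this hostname to the container
      - VIRTUAL_HOST=metrics.api.fair-enough.semanticscience.org
      # used by the acme-companion to issue a Let's Encrypt certificate
      - LETSENCRYPT_HOST=metrics.api.fair-enough.semanticscience.org
```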
Create and activate a virtual environment if necessary:
```bash
python3 -m venv .venv
source .venv/bin/activate
```
Install dependencies from the source code:
```bash
pip install -e ".[test,dev]"
```
Start the API locally on http://localhost:8000:

```bash
uvicorn main:app --reload
```
The API will automatically reload on changes to the code 🔄
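For reference, `main:app` points to the application declared in `main.py`. Here is a minimal sketch of what it might look like, assuming the `fair-test` library's `FairTestAPI` class, which loads every metric test found in the `metrics` folder (the title and description are placeholders):

```python
from fair_test import FairTestAPI

# Create a FastAPI application serving all metric tests in the metrics folder
app = FairTestAPI(
    title='FAIR Metrics tests API',  # placeholder title
    metrics_folder_path='metrics',
    description='FAIR metrics tests API built with the fair-test library.',
)
```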
The tests are run automatically by a GitHub Actions workflow at every push to the `main` branch.
The subject URLs to test and their expected scores are retrieved from the `test_test` attribute of each metric test.
Add tests in the `./tests/test_metrics.py` file. You just need to add new entries to the JSON file to test different subjects' results against the metric tests:
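For instance, a hedged sketch of such an entry, assuming it simply maps a subject URL to its expected score as described above (the URL is a placeholder):

```python
# Hypothetical new entry: a subject URL mapped to its expected score (0 or 1)
test_test = {
    'https://example.org/my-dataset': 1,
}
```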
Run the tests locally (from the root folder):
```bash
pytest -s
```
Run the tests only for a specific metric test:
```bash
pytest -s --metric a1-metadata-protocol
```
You can easily use this repository to develop and publish new FAIR metrics tests.
- Fork this repository
- Change the API settings in `api/config.py`
- Use the existing test Python files in the `metrics` folder to start writing your FAIR metrics tests!
- Start your FAIR metrics tests API with `docker-compose`!
We use w3id.org to define persistent identifiers for our services: https://github.com/perma-id/w3id.org
Clone your fork of the w3id.org repository:
git clone [email protected]:vemonet/w3id.org.git
Add the main repository as a remote, to update your fork later:

```bash
git remote add fork https://github.com/perma-id/w3id.org.git
```
Update your fork to the latest version of the main repository:

```bash
git pull fork master
```
You just need to add 2 files; you can copy the `fair-enough` folder to get started quickly from our configuration:
- `README.md`: add a short description of what the persistent identifier will be used for, and who the maintainers are (providing their GitHub IDs, to make sure the namespace is not changed by unauthorized people in the future). For instance:

  ```markdown
  # My FAIR metrics tests (fair-metrics-tests)

  A namespace for FAIR evaluations metrics tests

  ## Maintainers

  - Vincent Emonet (@vemonet)
  ```
- `.htaccess`: define the redirections from w3id.org to your service. For instance:

  ```apache
  Header set Access-Control-Allow-Origin *
  Header set Access-Control-Allow-Headers DNT,X-Mx-ReqToken,Keep-Alive,User-Agent,X-Requested-With,If-Modified$
  Options +FollowSymLinks
  RewriteEngine on
  RewriteRule ^(.*)$ https://metrics.api.fair-enough.semanticscience.org/$1 [R=302,L]
  ```
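Once your pull request to w3id.org is merged, you can check that the redirection resolves; the exact namespace path depends on the folder you added (`fair-enough` in our case):

```bash
# Show the redirection headers returned by w3id.org (expects a 302 to your service)
curl -sI https://w3id.org/fair-enough
```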