☑️ FAIR Enough metrics for research


An API for various core FAIR metrics tests, written in Python, conforming to the specifications defined by the FAIRMetrics working group.

FAIR metrics tests are API operations that test whether a subject URL complies with certain requirements defined by a community. They usually check whether the resource available at the subject URL complies with the FAIR principles (Findable, Accessible, Interoperable, Reusable).

This API is deployed publicly at https://metrics.api.fair-enough.semanticscience.org
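
Each metric test is a simple POST operation taking the subject URL in a JSON body. As an illustration, here is a minimal sketch in Python calling the deployed API; the /tests/a1-metadata-protocol path and the example DOI are assumptions based on the fair-test documentation, check the Swagger UI for the exact endpoint paths:

import requests

# Hypothetical example: call the a1-metadata-protocol test on a sample DOI
# (check the Swagger UI for the exact endpoint paths exposed by the API).
res = requests.post(
    'https://metrics.api.fair-enough.semanticscience.org/tests/a1-metadata-protocol',
    json={'subject': 'https://doi.org/10.1594/PANGAEA.908011'},
    timeout=60,
)
res.raise_for_status()
print(res.json())  # the evaluation result, including the score for the subject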

🗃️ It can be used with FAIR evaluation services such as FAIR Enough: https://fair-enough.semanticscience.org

This FAIR metrics tests API has been built with the fair-test Python library: https://github.com/MaastrichtU-IDS/fair-test

➕ Contribute new FAIR Metrics Tests

ℹ️ You are welcome to submit a pull request proposing to add your tests to our production API: https://metrics.api.fair-enough.semanticscience.org

  1. Fork this repository
  2. Clone your forked repository
  3. Duplicate an existing test file in the metrics folder, and modify it to define your FAIR metrics tests (see the sketch after this list)!
  4. Start your FAIR metrics tests API with docker-compose up to test if your FAIR metric test works as expected
  5. Send us a pull request to integrate your test to our API at https://metrics.api.fair-enough.semanticscience.org
  6. Once your test is published, register it in existing FAIR evaluation services.
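
To give an idea of what a test file in the metrics folder looks like, here is a minimal sketch based on the fair-test library documentation; the metric path, ORCID, subject URLs, and the exact attribute and method names are illustrative and should be checked against the fair-test docs:

from fair_test import FairTest, FairTestEvaluation

class MetricTest(FairTest):
    metric_path = 'a1-my-check'  # hypothetical: becomes the URL path of the test
    applies_to_principle = 'A1'  # the FAIR principle this test evaluates
    title = 'Check my community requirement'
    description = 'Test if the resource available at the subject URL complies with my requirement.'
    author = 'https://orcid.org/0000-0000-0000-0000'  # placeholder ORCID
    metric_version = '0.1.0'
    # Subject URLs used by the automated tests, with the score each is expected to get:
    test_test = {
        'https://doi.org/10.1594/PANGAEA.908011': 1,
        'https://example.com': 0,
    }

    def evaluate(self, eval: FairTestEvaluation):
        eval.info(f'Checking {eval.subject}')
        g = eval.retrieve_metadata(eval.subject)  # fetch RDF metadata for the subject
        if len(g) > 0:
            eval.success(f'Found {len(g)} triples in the subject metadata')
        else:
            eval.failure('No metadata found for the subject')
        return eval.response()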

🧑‍💻 Deploy the API

First, clone the repository:

git clone https://github.com/MaastrichtU-IDS/fair-enough-metrics
cd fair-enough-metrics

🐳 Development with Docker (recommended)

To deploy the API in development, with automatic reload when the code changes, run this command:

docker-compose up dev

Access the OpenAPI Swagger UI at http://localhost:8000

If you make changes to the dependencies in pyproject.toml, you will need to rebuild the image to install the new requirements:

docker-compose up --build dev

Run the tests:

docker-compose run test
# You can pass args:
docker-compose run test pytest -s

Run in production (adapt the docker-compose.yml to your deployment setup):

Define the Bing and Google Search API keys in a secrets.env file, which will not be committed to git:

APIKEY_BING_SEARCH=bingapikey
APIKEY_GOOGLE_SEARCH=googleapikey

To start the stack with production config:

docker-compose up -d prod

We use nginx-proxy (https://github.com/nginx-proxy/nginx-proxy), a reverse proxy for Docker, to route the services.

🐍 Without Docker

📥️ Install dependencies

Create and activate a virtual environment if necessary:

python3 -m venv .venv
source .venv/bin/activate

Install dependencies from the source code:

pip install -e ".[test,dev]"

Deploy the API in development

Start the API locally at http://localhost:8000:

uvicorn main:app --reload

The API will automatically reload on changes to the code 🔄
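
For reference, the main:app entrypoint is the FastAPI application that the fair-test library generates from the metrics folder. A minimal sketch of such a main.py; the FairTestAPI parameter names are assumptions to verify against the fair-test documentation:

from fair_test import FairTestAPI

# Expose every metric test found in the metrics folder as an API endpoint
app = FairTestAPI(
    title='FAIR enough metrics tests API',
    metrics_folder_path='metrics',
    description='FAIR metrics tests API, built with the fair-test library.',
)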

✔️ Test the Metrics Tests API

The tests are run automatically by a GitHub Actions workflow at every push to the main branch.

The subject URLs to test, and the score each is expected to get, are retrieved from the test_test attribute of each metric test.

Add tests in the ./tests/test_metrics.py file: you just need to add new entries to test the results of different subjects against the metrics tests.

Run the tests locally (from the root folder):

pytest -s

Run the tests only for a specific metric test:

pytest -s --metric a1-metadata-protocol

➕ Create a new FAIR Metrics Tests service

You can easily use this repository to develop and publish new FAIR metrics tests.

  1. Fork this repository
  2. Change the API settings in api/config.py
  3. Use the existing test Python files in the metrics folder as a starting point to write your FAIR metrics tests!
  4. Start your FAIR metrics tests API with docker-compose!

Use persistent identifiers

We use w3id.org to define persistent identifiers for our services: https://github.com/perma-id/w3id.org

Clone your fork of the w3id.org repository:

git clone git@github.com:vemonet/w3id.org.git

Add the main repository as a remote so you can update your fork later:

git remote add fork https://github.com/perma-id/w3id.org.git

Update your fork to the latest version of the main repository:

git pull fork master

You just need to add two files; you can copy the fair-enough folder to get started quickly with our configuration:

  • README.md: add a short description of what the persistent identifier will be used for, and who the maintainers are (providing their GitHub IDs, to make sure the namespace is not changed by unauthorized people in the future). For instance:
# My FAIR metrics tests (fair-metrics-tests)

A namespace for FAIR evaluations metrics tests

## Maintainers

- Vincent Emonet (@vemonet)

  • .htaccess: define the redirections from w3id.org to your service. For instance:
Header set Access-Control-Allow-Origin *
Header set Access-Control-Allow-Headers DNT,X-Mx-ReqToken,Keep-Alive,User-Agent,X-Requested-With,If-Modified-Since
Options +FollowSymLinks
RewriteEngine on
RewriteRule ^(.*)$ https://metrics.api.fair-enough.semanticscience.org/$1 [R=302,L]
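
Once your pull request to w3id.org is merged, you can check that the redirection resolves as expected. A small sketch, assuming the fair-metrics-tests namespace from the example above:

import requests

# Do not follow redirects, so we can inspect the 302 issued by w3id.org
res = requests.get('https://w3id.org/fair-metrics-tests', allow_redirects=False, timeout=30)
print(res.status_code, res.headers.get('Location'))
# Expected: 302 https://metrics.api.fair-enough.semanticscience.org/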
