This is a monorepo that currently contains 3 main services (besides helper services):

- Inspirehep - the main INSPIRE service; it serves the website inspirehep.net / inspirebeta.net and calls the required services.
- Backoffice - a Django app with the goal of fully replacing inspire-next one day, with the help of the workflows service.
- Workflows - an Airflow service responsible for running the workflows.
Okay, now the question is: how do we develop on it?

By far the easiest way to get the project running on your machine is through Docker (instructions on how to run it locally are below, for the brave ones), given that you have enough memory.

Make will spin up the required services, depending on what you are working on:
- `make run` - prepares the whole inspire development environment with demo records
- `make run-inspirehep` - spins up the whole inspirehep development environment with demo records, but without the backoffice
- `make run-backoffice` - spins up the backoffice
- `make stop` - stops everything
Once up, the services should be available at the following routes:

- Inspirehep - http://localhost:8080
- Backoffice - http://localhost:8001
- Airflow / Workflows - http://localhost:8070
- Opensearch - http://localhost:9200
- Postgres db - localhost:5432
If you simply wish to log in to inspirehep, use:

`[email protected]:123456`

If you wish to log in to inspirehep/backoffice or the actual backoffice, use:

`[email protected]:admin`
But if you want to test with ORCID, you will need to set the `ORCID_CLIENT_ID` and `ORCID_CLIENT_SECRET`, so extra steps must be done.

If you wish to test ORCID on inspirehep:

- Go to `backend/inspirehep/orcid/config.py`
- They will correspond to `consumer_key` and `consumer_secret`

If you wish to test ORCID on backoffice:

- Go to `backoffice/.envs/local/.django`
- Add `ORCID_CLIENT_ID` and `ORCID_CLIENT_SECRET` there.

You can find these values in the password manager for the sandbox ORCID environment.

⚠️ Do not forget to remove them before committing! ⚠️
If you wish to run the tests for a given service, here's how to do it.

First, exec into the container, i.e.:

```
docker exec -it <container_name> /bin/bash
```

(or via Docker Desktop).

Then, depending on the service you are testing:
- backoffice-webserver: `pytest .`
- airflow-webserver: `pytest .`
- inspire-hep: ?
- backend: ?
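As a sketch, the two working cases above can be wrapped in a small shell helper (the function name is ours, not part of the repo; the container names come from the compose setup):

```shell
# Hypothetical helper: run a service's test suite inside its container.
# Pass the container name, e.g. backoffice-webserver or airflow-webserver.
run_service_tests() {
  docker exec -it "$1" pytest .
}
```

Usage: `run_service_tests backoffice-webserver`.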
There are two ways of setting environment variables on hep:

- `backend/inspirehep/config.py`
- `docker-compose.services.yml` - the `INVENIO_` prefix must be added. Variables here overwrite `config.py`.
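For example, a hypothetical override of a `config.py` setting from the compose file could look like this (the variable and value are illustrative, not taken from the repo; `hep-web` is the service name used elsewhere in this document):

```yaml
# docker-compose.services.yml (sketch): the INVENIO_ prefix maps the
# environment variable onto the corresponding config.py entry, so this
# overrides DEBUG from config.py.
services:
  hep-web:
    environment:
      - INVENIO_DEBUG=False
```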
For running the environment locally, you have the following prerequisites.

On Linux:

```
$ sudo apt-get install python3 build-essential python3-dev
```

On macOS:

```
$ brew install postgresql@14 libmagic openssl@3 openblas python
```
Please follow the instructions at https://github.com/nvm-sh/nvm#installing-and-updating.
We're using v20.0.0 (the first version we install is the default):

```
$ nvm install 20.0.0
$ nvm use 20.0.0
```
Please follow the instructions at https://classic.yarnpkg.com/en/docs/install/#debian-stable, or on macOS:

```
$ brew install yarn
```
Install poetry (https://python-poetry.org/docs/):

```
$ curl -sSL https://install.python-poetry.org | python3 -
```

Install pre-commit (https://pre-commit.com/):

```
$ curl https://pre-commit.com/install-local.py | python -
```

And run:

```
$ pre-commit install
```
Follow the guide at https://docs.docker.com/compose/install/.

On macOS, turn off the AirPlay Receiver under System Preferences -> Sharing -> AirPlay Receiver. Otherwise, you will run into problems with port 5000 already being in use.
Install Homebrew-file (https://homebrew-file.readthedocs.io/en/latest/installation.html):

```
$ brew install rcmdnk/file/brew-file
```

And run:

```
$ brew file install
```
```
$ cd backend
$ poetry install
```

```
$ cd ui
$ yarn install
```

```
$ cd record-editor
$ yarn install
```
First you need to start all the services (PostgreSQL, Redis, ElasticSearch, RabbitMQ):

```
$ docker-compose -f docker-compose.services.yml up es mq db cache
```

And initialize the database, ES, RabbitMQ, Redis, and S3:

```
$ cd backend
$ ./scripts/setup
```

Note that the S3 configuration requires the default region to be set to us-east-1. If you have another default set up in your AWS config (`~/.aws/config`), you need to update it!
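For reference, the relevant section of `~/.aws/config` would look like:

```ini
[default]
region = us-east-1
```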
Also, to enable fulltext indexing & highlighting, the following feature flags must be set to true:

```
FEATURE_FLAG_ENABLE_FULLTEXT = True
FEATURE_FLAG_ENABLE_FILES = True
```
You can visit the Backend at http://localhost:8000:

```
$ cd backend
$ ./scripts/server
```

You can visit the UI at http://localhost:3000:

```
$ cd ui
$ yarn start
```
In case you want to use docker and just run the record-editor locally, use the following steps:

- Add the following volume mount to the record-editor service in the docker-compose.yml:

  ```
  - ./record-editor/dist/:/usr/share/nginx/html
  ```

- Navigate into the record-editor folder and first run `yarn` and then `yarn start`.
- Open a second terminal and run `make run`.

The record editor should now be available and automatically update when changes are made to the codebase.
You can also connect the UI to another environment by changing the proxy in `ui/setupProxy.js`:

```
proxy({
  target: 'http://A_PROXY_SERVER',
  ...
});
```
The backend tests locally use `testmon` to only run tests that depend on code that has changed (after the first run) by default:

```
$ cd backend
$ poetry run ./run-tests.sh
```

If you pass the `--all` flag to the `run-tests.sh` script, all tests will be run (this is equivalent to the `--testmon-noselect` flag). All other flags passed to the script are forwarded to `py.test`, so you can do things like:

```
$ poetry run ./run-tests.sh --pdb -k test_failing
```

You'll need to run all tests or force test selection (e.g. with `-k`) in a few cases:
- an external dependency has changed, and you want to make sure that it doesn't break the tests (as `testmon` doesn't track external deps)
- you manually change a test fixture in a non-python file (as `testmon` only tracks python imports, not external data)
If you want to invoke `py.test` directly but still want to use `testmon`, you'll need to use the `--testmon --no-cov` flags:

```
$ poetry run py.test tests/integration/records --testmon --no-cov
```

If you want to disable `testmon` test selection but still perform collection (to update test dependencies), use `--testmon-noselect --no-cov` instead.

Note that `testmon` is only used locally to speed up tests; it is not used in the CI, to be completely sure all tests pass before merging a commit.
If you wish to modify the SNow integration tests, you have to set the following variables in the SNow config file:

- `SNOW_CLIENT_ID`
- `SNOW_CLIENT_SECRET`
- `SNOW_AUTH_URL`

The secrets can be found in the inspirehep QA or PROD sealed secrets. After setting the variables, run the tests so the cassettes get generated.

Don't forget to delete the secrets from the config file before you push!
```
$ cd ui
$ yarn test # runs everything (lint, bundlesize etc.), identical to CI
$ yarn test:unit # will open jest in watch mode
```

Note that `jest` automatically runs the tests affected by changed (unstaged) files.
Runs everything from scratch, identical to CI:

```
$ sh cypress-tests-chrome.sh
$ sh cypress-tests-firefox.sh
```

Opens the cypress runner GUI and runs the tests against the local dev server (localhost:8080):

```
$ cd e2e
$ yarn test:dev
$ yarn test:dev --env inspirehep_url=<any url that serves inspirehep ui>
```
Visual tests are run only in headless mode, so `yarn test:dev`, which uses the headed browser, will ignore them.

Running existing visual tests and updating/creating snapshots requires the `cypress-tests.sh` script.

For continuous runs (when the local DB is running and has the required records etc.), the script can be reduced to only the last part: `sh cypress-tests-run.sh`.

If required, tests can run against `localhost:3000` by simply modifying the `--host` option in `sh cypress-tests-run.sh`.

You may not always need to run the tests exactly like on the CI environment.

- To run a specific suite, just temporarily change the `test` script in `e2e/package.json` to `cypress run --spec cypress/integration/<spec.test.js>`.
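For example, the temporary `test` script in `e2e/package.json` could look like this (the spec path is a placeholder to be replaced with an actual test file):

```json
{
  "scripts": {
    "test": "cypress run --spec cypress/integration/<spec.test.js>"
  }
}
```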
First make sure that the backend is running:

```
$ cd backend
$ ./scripts/server
```

There is a command `inspirehep importer records` which accepts a url (`-u`), a directory of `JSON` files (`-d`), and `JSON` files (`-f`).

A selection of demo records can be found in the `data` directory, and they are structured based on the record type (i.e. `literature`). Examples:
```
# Local
$ poetry run inspirehep importer records -u https://inspirehep.net/api/literature/20 -u https://inspirehep.net/api/literature/1726642
# Docker
$ docker-compose exec hep-web inspirehep importer records -u https://inspirehep.net/api/literature/20 -u https://inspirehep.net/api/literature/1726642
# `--save` will also save the imported record to the data folder
$ <...> inspirehep importer records -u https://inspirehep.net/api/literature/20 --save
```

A valid `--token` or `backend/inspirehep/config.py:AUTHENTICATION_TOKEN` is required.
```
# Local
$ poetry run inspirehep importer records -d data/records/literature
# Docker
$ docker-compose exec hep-web inspirehep importer records -d data/records/literature
```

```
# Local
$ poetry run inspirehep importer records -f data/records/literature/374836.json -f data/records/authors/999108.json
# Docker
$ docker-compose exec hep-web inspirehep importer records -f data/records/literature/374836.json -f data/records/authors/999108.json
```

```
# Local
$ poetry run inspirehep importer demo-records
# Docker
$ docker-compose exec hep-web inspirehep importer demo-records
```