Suggestion: Run tests in parallel #87

Closed
fzimmermann89 opened this issue Oct 11, 2023 · 9 comments · Fixed by #89 or #322
Labels: CI (ci/workflow related), enhancement (New feature or request), priority: low

Comments

@fzimmermann89
Member

As the number of tests increases, it might be nice to run tests in parallel, both in VSCode and in our CI.

https://code.visualstudio.com/docs/python/testing#_run-tests-in-parallel

fzimmermann89 added the enhancement (New feature or request), priority: low, and CI (ci/workflow related) labels on Oct 11, 2023
@schuenke
Collaborator

Good idea. I would suggest using -n logical, no?

Use -n logical to use the number of logical CPU cores rather than physical ones. This currently requires the psutil package to be installed; if it is not, pytest-xdist will fall back to -n auto behavior.

With -n auto, pytest-xdist will use as many processes as your computer has CPU cores.

For more info: https://pytest-xdist.readthedocs.io/en/latest/distribution.html
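
For reference, a minimal sketch of what this could look like in our pyproject.toml (testpaths matches what pytest reports in our runs; -n logical assumes psutil is installed):

    [tool.pytest.ini_options]
    testpaths = ["tests"]
    # one worker per logical CPU core; requires psutil, otherwise
    # pytest-xdist falls back to the -n auto behavior
    addopts = "-n logical"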

@ckolbPTB
Collaborator

In my VSCode, -n logical leads to the strange effect that I cannot run all the tests from within VSCode. First, the number of tests displayed at the top seems to be random (6 on my Mac, 64 on the cluster); when I click "run all", I end up with something like "6 out of 6 completed".

All the tests are shown correctly in the list and I can run every test individually without any problems.

Anybody else have this behaviour?

@fzimmermann89
Member Author

We currently have
addopts = "-n auto"
set in the pyproject.toml. Did you change that to logical, or are you experiencing issues with the current configuration?

If the latter, can you check whether you also have issues with setting "-n 4" in the pyproject.toml instead (see the snippet below)?
Which versions of pytest-xdist and vscode are you using?
For me, pytest-xdist-3.3.1, Python 3.11.6, pytest-7.4.2, vscode 1.83 works without issues on Ubuntu.
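
For clarity, the variant with a fixed worker count would be a one-line change in the pyproject.toml:

    [tool.pytest.ini_options]
    # fixed number of workers instead of auto-detection, to rule out
    # CPU-count detection as the culprit
    addopts = "-n 4"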

@ckolbPTB
Collaborator

Sorry, I meant addopts = "-n auto" in the toml-file.

It also does not work if I use addopts = "-n 4" or addopts = "-n 1". It only works if I remove this line completely. If I run it from the command line in VSCode, everything works fine for any setting.

I have Python 3.11.4, pytest-7.4.0, xdist-3.3.1, vscode 1.83.

@fzimmermann89
Member Author

I get similar symptoms if I enable the pythonTestAdapter experiment via python.experiments.optInto in my settings.json.
Without this setting it works.
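
For anyone trying to reproduce this, a sketch of the relevant settings.json entry; the optOutFrom counterpart noted in the snippet's comments is an assumption about the extension's experiments settings:

    {
        // enabling the new test adapter experiment triggers the symptoms
        "python.experiments.optInto": ["pythonTestAdapter"]
        // opting out via "python.experiments.optOutFrom" should restore
        // the previous behaviour
    }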

My test output (vscode->test results->left side of the panel) reads

Running tests (pytest): /home/zimmer08/code/mrpro
Running test with arguments: --rootdir /home/zimmer08/code/mrpro --override-ini junit_family=xunit1 --junit-xml=/tmp/tmp-1055I5KqB6Hz17d5.xml
Current working directory: /home/zimmer08/code/mrpro
Workspace directory: /home/zimmer08/code/mrpro
Run completed, parsing output
./tests/data/test_csm_data.py::test_CsmData_is_frozen_dataclass[random_test_data0-random_kheader0-random_full_ismrmrd_header0-random_acquisition0] Passed
./tests/data/test_csm_data.py::test_CsmData_iterative_Walsh[random_kheader0-random_full_ismrmrd_header0-random_acquisition0] Passed
./tests/data/test_dcf_data.py::test_dcf_2d_rad_traj_voronoi[100-20-0-True] Passed
./tests/data/test_dcf_data.py::test_dcf_2d_rad_traj_voronoi[100-1-0-True] Passed
./tests/data/test_dcf_data.py::test_dcf_2d_rad_traj_voronoi[100-20-0.7853981633974483-True] Passed
./tests/data/test_dcf_data.py::test_dcf_2d_rad_traj_voronoi[100-1-0.7853981633974483-True] Passed
./tests/data/test_dcf_data.py::test_dcf_2d_rad_traj_voronoi[100-1-0-False] Passed
./tests/data/test_dcf_data.py::test_dcf_3d_cart_full_traj_voronoi[40-16-20] Passed
./tests/data/test_dcf_data.py::test_dcf_3d_cart_full_traj_voronoi[1-2-2] Passed
./tests/data/test_dcf_data.py::test_dcf_3d_cart_nonuniform_traj_voronoi[30-20-10-k2_steps0-k1_steps0-k0_steps0] Passed
./tests/data/test_dcf_data.py::test_dcf_3d_cart_traj_broadcast_voronoi[40-16-20] Passed
./tests/data/test_dcf_data.py::test_dcf_3d_cart_traj_broadcast_voronoi[1-2-2] Passed
./tests/data/test_dcf_data.py::test_dcf_rpe_traj_voronoi[10-6-20] Passed
./tests/data/test_dcf_data.py::test_dcf_rpe_traj_voronoi[10-1-20] Passed
./tests/data/test_dcf_data.py::test_dcf_rpe_traj_voronoi[10-6-1] Passed
./tests/data/test_dcf_data.py::test_dcf_spiral_traj_voronoi[10-2-1] Passed
./tests/data/test_dcf_data.py::test_dcf_spiral_traj_voronoi_singlespiral Passed
./tests/data/test_idata.py::test_IData_from_kheader_and_tensor[random_kheader0-random_test_data0-random_full_ismrmrd_header0-random_acquisition0] Passed
./tests/data/test_idata.py::test_IData_from_dcm_file Passed
./tests/data/test_kdata.py::test_KData_modify_header[b0-11.3] Passed
./tests/data/test_kdata.py::test_KData_modify_header[tr-value1] Passed
./tests/data/test_kdata.py::test_KData_from_file Passed
./tests/data/test_kdata.py::test_KData_raise_wrong_ktraj_shape Passed
./tests/data/test_kdata.py::test_KData_from_file_diff_nky_for_rep Passed
./tests/data/test_kdata.py::test_KData_kspace Passed
./tests/data/test_ktraj.py::test_ktraj_repeat_detection_exact[cartesian_grid0] Passed
./tests/data/test_ktraj.py::test_ktraj_repeat_detection_tol[cartesian_grid0] Passed
./tests/data/test_ktraj.py::test_ktraj_tensor_conversion[cartesian_grid0] Passed
./tests/data/test_ktraj.py::test_ktraj_raise_not_broadcastable Passed
./tests/data/test_ktraj.py::test_ktraj_raise_wrong_dim Passed
./tests/data/test_qdata.py::test_QData_from_iheader_and_tensor[random_kheader0-random_test_data0-random_full_ismrmrd_header0-random_acquisition0] Passed
./tests/data/test_qdata.py::test_QData_from_kheader_and_tensor[random_kheader0-random_test_data0-random_full_ismrmrd_header0-random_acquisition0] Passed
./tests/data/test_qdata.py::test_QData_from_dcm_file Passed
./tests/data/test_traj_calculators.py::test_KTrajectoryRpe_golden[random_kheader0-random_full_ismrmrd_header0-random_acquisition0] Passed
./tests/data/test_traj_calculators.py::test_KTrajectoryRpe_shift[random_kheader0-random_full_ismrmrd_header0-random_acquisition0] Passed
./tests/data/test_traj_calculators.py::test_KTrajectoryRpe_uniform[random_kheader0-random_full_ismrmrd_header0-random_acquisition0] Passed
./tests/data/test_traj_calculators.py::test_KTrajectorySunflowerGoldenRpe[random_kheader0-random_full_ismrmrd_header0-random_acquisition0] Passed
./tests/phantoms/test_coil_sensitivities.py::test_birdcage_sensitivities_shape Passed
./tests/phantoms/test_ellipse_phantom.py::test_image_space Passed
./tests/phantoms/test_ellipse_phantom.py::test_kspace_correct_shape Passed
./tests/phantoms/test_ellipse_phantom.py::test_kspace_raises_error Passed
./tests/phantoms/test_ellipse_phantom.py::test_kspace_image_match Passed

Total number of tests expected to run: 42
Total number of tests run: 42
Total number of tests passed: 42
Total number of tests failed: 0
Total number of tests failed with errors: 0
Total number of tests skipped: 0
Total number of tests with no result data: 0
Finished running tests!

In vscode->output->python test log I have:

============================ test session starts ==============================
platform linux -- Python 3.11.6, pytest-7.4.2, pluggy-1.3.0
rootdir: /home/zimmer08/code/mrpro
configfile: pyproject.toml
testpaths: tests
plugins: xdist-3.3.1, cov-4.1.0
created: 12/12 workers
12 workers [42 items]

..........................................                               [100%]
-------------- generated xml file: /tmp/tmp-10555XXB8Me68G04.xml ---------------
============================= 42 passed in 15.42s ==============================

@fzimmermann89
Member Author

If we don't find a root cause soon, we might want to

  • remove the -n auto setting from pyproject.toml
  • add it to the github workflow call

Then whoever wants the tests to run in parallel can still manually add "-n auto" to vscode's pytest arguments setting locally (a sketch of both pieces follows below).
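
A sketch of what that split could look like (the workflow file name and step layout are assumptions, not necessarily our actual CI):

    # hypothetical excerpt from .github/workflows/pytest.yml
    - name: Run tests in parallel
      run: pytest -n auto

and the local opt-in would be an entry in vscode's settings.json:

    // run pytest with xdist workers only for this developer/workspace
    { "python.testing.pytestArgs": ["-n", "auto"] }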

@fzimmermann89
Member Author

... and do debugging after the ismrm deadline ;)

fzimmermann89 added a commit that referenced this issue Oct 18, 2023
Addresses issues caused by #89, mentioned in #87 
by disabling parallel testing for now.

We still test in parallel in the CI.
We still install pytest-xdist.

Any developer not having issues with parallel testing can add "-n auto" 
in their vscode pytest settings as an additional argument to still use parallel test execution.

@ckolbPTB
Collaborator

In vscode->output->python test log I get the following, which is strange because everything else tells me that the tests ran successfully (well, 6 out of 6, I guess; there is no information about any errors):

============================= test session starts ==============================
platform darwin -- Python 3.11.4, pytest-7.4.0, pluggy-1.2.0
rootdir: /Users/kolbit01/Documents/PTB/Code/Bitbucket/mrpro
configfile: pyproject.toml
plugins: cov-4.1.0, xdist-3.3.1
created: 2/2 workers
2 workers [42 items]

INTERNALERROR> Traceback (most recent call last):
INTERNALERROR>   File "/Users/kolbit01/Documents/PTB/Data/CONDA/MRpro/lib/python3.11/site-packages/_pytest/main.py", line 270, in wrap_session
INTERNALERROR>     session.exitstatus = doit(config, session) or 0
INTERNALERROR>                          ^^^^^^^^^^^^^^^^^^^^^
INTERNALERROR>   File "/Users/kolbit01/Documents/PTB/Data/CONDA/MRpro/lib/python3.11/site-packages/_pytest/main.py", line 324, in _main
INTERNALERROR>     config.hook.pytest_runtestloop(session=session)
INTERNALERROR>   File "/Users/kolbit01/Documents/PTB/Data/CONDA/MRpro/lib/python3.11/site-packages/pluggy/_hooks.py", line 433, in __call__
INTERNALERROR>     return self._hookexec(self.name, self._hookimpls, kwargs, firstresult)
INTERNALERROR>            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
INTERNALERROR>   File "/Users/kolbit01/Documents/PTB/Data/CONDA/MRpro/lib/python3.11/site-packages/pluggy/_manager.py", line 112, in _hookexec
INTERNALERROR>     return self._inner_hookexec(hook_name, methods, kwargs, firstresult)
INTERNALERROR>            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
INTERNALERROR>   File "/Users/kolbit01/Documents/PTB/Data/CONDA/MRpro/lib/python3.11/site-packages/pluggy/_callers.py", line 155, in _multicall
INTERNALERROR>     return outcome.get_result()
INTERNALERROR>            ^^^^^^^^^^^^^^^^^^^^
INTERNALERROR>   File "/Users/kolbit01/Documents/PTB/Data/CONDA/MRpro/lib/python3.11/site-packages/pluggy/_result.py", line 108, in get_result
INTERNALERROR>     raise exc.with_traceback(exc.__traceback__)
INTERNALERROR>   File "/Users/kolbit01/Documents/PTB/Data/CONDA/MRpro/lib/python3.11/site-packages/pluggy/_callers.py", line 80, in _multicall
INTERNALERROR>     res = hook_impl.function(*args)
INTERNALERROR>           ^^^^^^^^^^^^^^^^^^^^^^^^^
INTERNALERROR>   File "/Users/kolbit01/Documents/PTB/Data/CONDA/MRpro/lib/python3.11/site-packages/xdist/dsession.py", line 122, in pytest_runtestloop
INTERNALERROR>     self.loop_once()
INTERNALERROR>   File "/Users/kolbit01/Documents/PTB/Data/CONDA/MRpro/lib/python3.11/site-packages/xdist/dsession.py", line 145, in loop_once
INTERNALERROR>     call(**kwargs)
INTERNALERROR>   File "/Users/kolbit01/Documents/PTB/Data/CONDA/MRpro/lib/python3.11/site-packages/xdist/dsession.py", line 283, in worker_testreport
INTERNALERROR>     self.config.hook.pytest_runtest_logreport(report=rep)
INTERNALERROR>   File "/Users/kolbit01/Documents/PTB/Data/CONDA/MRpro/lib/python3.11/site-packages/pluggy/_hooks.py", line 433, in __call__
INTERNALERROR>     return self._hookexec(self.name, self._hookimpls, kwargs, firstresult)
INTERNALERROR>            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
INTERNALERROR>   File "/Users/kolbit01/Documents/PTB/Data/CONDA/MRpro/lib/python3.11/site-packages/pluggy/_manager.py", line 112, in _hookexec
INTERNALERROR>     return self._inner_hookexec(hook_name, methods, kwargs, firstresult)
INTERNALERROR>            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
INTERNALERROR>   File "/Users/kolbit01/Documents/PTB/Data/CONDA/MRpro/lib/python3.11/site-packages/pluggy/_callers.py", line 116, in _multicall
INTERNALERROR>     raise exception.with_traceback(exception.__traceback__)
INTERNALERROR>   File "/Users/kolbit01/Documents/PTB/Data/CONDA/MRpro/lib/python3.11/site-packages/pluggy/_callers.py", line 80, in _multicall
INTERNALERROR>     res = hook_impl.function(*args)
INTERNALERROR>           ^^^^^^^^^^^^^^^^^^^^^^^^^
INTERNALERROR>   File "/Users/kolbit01/Documents/PTB/Data/CONDA/MRpro/lib/python3.11/site-packages/_pytest/terminal.py", line 574, in pytest_runtest_logreport
INTERNALERROR>     *self.config.hook.pytest_report_teststatus(report=rep, config=self.config)
INTERNALERROR>      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
INTERNALERROR>   File "/Users/kolbit01/Documents/PTB/Data/CONDA/MRpro/lib/python3.11/site-packages/pluggy/_hooks.py", line 433, in __call__
INTERNALERROR>     return self._hookexec(self.name, self._hookimpls, kwargs, firstresult)
INTERNALERROR>            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
INTERNALERROR>   File "/Users/kolbit01/Documents/PTB/Data/CONDA/MRpro/lib/python3.11/site-packages/pluggy/_manager.py", line 112, in _hookexec
INTERNALERROR>     return self._inner_hookexec(hook_name, methods, kwargs, firstresult)
INTERNALERROR>            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
INTERNALERROR>   File "/Users/kolbit01/Documents/PTB/Data/CONDA/MRpro/lib/python3.11/site-packages/pluggy/_callers.py", line 116, in _multicall
INTERNALERROR>     raise exception.with_traceback(exception.__traceback__)
INTERNALERROR>   File "/Users/kolbit01/Documents/PTB/Data/CONDA/MRpro/lib/python3.11/site-packages/pluggy/_callers.py", line 80, in _multicall
INTERNALERROR>     res = hook_impl.function(*args)
INTERNALERROR>           ^^^^^^^^^^^^^^^^^^^^^^^^^
INTERNALERROR>   File "/Users/kolbit01/.vscode/extensions/ms-python.python-2023.18.0/pythonFiles/vscode_pytest/__init__.py", line 203, in pytest_report_teststatus
INTERNALERROR>     node_path = map_id_to_path[report.nodeid]
INTERNALERROR>                 ~~~~~~~~~~~~~~^^^^^^^^^^^^^^^
INTERNALERROR> KeyError: 'tests/data/test_dcf_data.py::test_dcf_2d_rad_traj_voronoi[100-1-0.7853981633974483-True]'

============================ no tests ran in 11.72s ============================

In Test Results I get:

Finished running tests!

@fzimmermann89
Member Author

microsoft/vscode-python#22232 might be related.

If it is related, it could be fixed with the next release of the python vscode extension...
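
From the traceback above, the extension's vscode_pytest plugin seems to assume that every node ID reported back by an xdist worker is already in its map_id_to_path from discovery, which fails for the parametrized IDs here. Purely as a hypothetical illustration (not the actual upstream code or fix), a defensive lookup would avoid the INTERNALERROR:

    # hypothetical sketch, not the vscode_pytest source
    map_id_to_path: dict = {}  # filled during test discovery

    def pytest_report_teststatus(report, config):
        node_path = map_id_to_path.get(report.nodeid)
        if node_path is None:
            # node ID unknown to the adapter (e.g. reported by an xdist
            # worker but never seen during discovery); skip it instead
            # of raising the KeyError that aborts the whole session
            return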

fzimmermann89 added a commit that referenced this issue Nov 10, 2024
Addresses issues caused by #89, mentioned in #87 
by disabling parallel testing for now.

We still test in parallel in the CI.
We still install pytest-xdist.

Any developer not having issues with parallel testing can add "-n auto" 
in their vscode pytest settings as an additional argument to still use parallel test execution.