Merge branch 'main' into main
magland authored Nov 17, 2023
2 parents ca638ad + efc042e commit 8e031e0
Showing 41 changed files with 896 additions and 229 deletions.
2 changes: 2 additions & 0 deletions .github/workflows/full-test-with-codecov.yml
@@ -52,6 +52,8 @@ jobs:
- name: Shows installed packages by pip, git-annex and cached testing files
uses: ./.github/actions/show-test-environment
- name: run tests
env:
HDF5_PLUGIN_PATH: ${{ github.workspace }}/hdf5_plugin_path_maxwell
run: |
source ${{ github.workspace }}/test_env/bin/activate
pytest -m "not sorters_external" --cov=./ --cov-report xml:./coverage.xml -vv -ra --durations=0 | tee report_full.txt; test ${PIPESTATUS[0]} -eq 0 || exit 1
2 changes: 2 additions & 0 deletions .github/workflows/full-test.yml
@@ -132,6 +132,8 @@ jobs:
- name: Test core
run: ./.github/run_tests.sh core
- name: Test extractors
env:
HDF5_PLUGIN_PATH: ${{ github.workspace }}/hdf5_plugin_path_maxwell
if: ${{ steps.modules-changed.outputs.EXTRACTORS_CHANGED == 'true' || steps.modules-changed.outputs.CORE_CHANGED == 'true' }}
run: ./.github/run_tests.sh "extractors and not streaming_extractors"
- name: Test preprocessing
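Both workflow changes above set HDF5_PLUGIN_PATH so that the HDF5 library can locate the Maxwell compression plugin before the extractor tests open any Maxwell file. Below is a minimal sketch of reproducing the same setup locally; the plugin directory and file paths are placeholders, and the use of `read_maxwell` is an illustrative assumption rather than something introduced by this commit.

```python
import os

# HDF5 looks up third-party compression filters via HDF5_PLUGIN_PATH, so it must be
# set before the Maxwell file is opened. The CI jobs above point it at
# hdf5_plugin_path_maxwell inside the workspace; this local directory is a placeholder.
os.environ["HDF5_PLUGIN_PATH"] = "/path/to/hdf5_plugin_path_maxwell"

import spikeinterface.extractors as se

# With the plugin discoverable, reading a compressed Maxwell recording succeeds
# (file path is a placeholder).
recording = se.read_maxwell("/path/to/maxwell_recording.raw.h5")
print(recording)
```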
2 changes: 1 addition & 1 deletion .pre-commit-config.yaml
@@ -6,7 +6,7 @@ repos:
- id: end-of-file-fixer
- id: trailing-whitespace
- repo: https://github.com/psf/black
-    rev: 23.10.1
+    rev: 23.11.0
hooks:
- id: black
files: ^src/
2 changes: 1 addition & 1 deletion README.md
@@ -67,7 +67,7 @@ With SpikeInterface, users can:

## Documentation

-Detailed documentation of the latest PyPI release of SpikeInterface can be found [here](https://spikeinterface.readthedocs.io/en/0.99.0).
+Detailed documentation of the latest PyPI release of SpikeInterface can be found [here](https://spikeinterface.readthedocs.io/en/0.99.1).

Detailed documentation of the development version of SpikeInterface can be found [here](https://spikeinterface.readthedocs.io/en/latest).

2 changes: 2 additions & 0 deletions doc/modules/preprocessing.rst
@@ -74,6 +74,8 @@ dtype (unless specified otherwise):
Some scaling pre-processors, such as :code:`whiten()` or :code:`zscore()`, will force the output to :code:`float32`.

When converting from a :code:`float` to an :code:`int`, the value will first be rounded to the nearest integer.


Available preprocessing
-----------------------
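The line added to the preprocessing docs states the casting rule for integer outputs: values are rounded to the nearest integer before the cast. A minimal NumPy sketch of that rule, illustrating the documented behaviour rather than the library's internal code:

```python
import numpy as np

# Documented rule: when converting float samples to an integer dtype,
# round to the nearest integer first, then cast.
float_traces = np.array([1.2, 2.7, -1.5, 3.49], dtype="float32")

rounded_then_cast = np.round(float_traces).astype("int16")  # what the docs describe
plain_cast = float_traces.astype("int16")                   # a bare cast truncates toward zero

print(rounded_then_cast)  # [ 1  3 -2  3]
print(plain_cast)         # [ 1  2 -1  3]
```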
13 changes: 13 additions & 0 deletions doc/releases/0.99.1.rst
@@ -0,0 +1,13 @@
.. _release0.99.1:

SpikeInterface 0.99.1 release notes
-----------------------------------

14th November 2023

Minor release with some bug fixes.

* Fix crash when default start / end frame arguments on motion interpolation are used (#2176)
* Fix bug in `make_match_count_matrix()` when computing matching events (#2182, #2191, #2196)
* Fix maxwell tests by setting HDF5_PLUGIN_PATH env in action (#2161)
* Add read_npz_sorting to extractors module (#2183)
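The last item exposes `read_npz_sorting` in the extractors module. A minimal usage sketch, assuming it follows the single-file-path signature of the other `read_*` helpers (the path is a placeholder):

```python
import spikeinterface.extractors as se

# Load a sorting stored in SpikeInterface's .npz sorting format.
# The signature is assumed here; the file path is a placeholder.
sorting = se.read_npz_sorting("/path/to/sorting.npz")
print(sorting.get_unit_ids())
```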
7 changes: 7 additions & 0 deletions doc/whatisnew.rst
@@ -8,6 +8,7 @@ Release notes
.. toctree::
:maxdepth: 1

releases/0.99.1.rst
releases/0.99.0.rst
releases/0.98.2.rst
releases/0.98.1.rst
@@ -32,6 +33,12 @@ Release notes
releases/0.9.1.rst


Version 0.99.1
==============

* Minor release with some bug fixes


Version 0.99.0
==============

231 changes: 140 additions & 91 deletions src/spikeinterface/comparison/comparisontools.py

Large diffs are not rendered by default.

7 changes: 6 additions & 1 deletion src/spikeinterface/comparison/paircomparisons.py
Expand Up @@ -28,6 +28,7 @@ def __init__(
delta_time=0.4,
match_score=0.5,
chance_score=0.1,
ensure_symmetry=False,
n_jobs=1,
verbose=False,
):
@@ -55,6 +56,8 @@ def __init__(
self.unit1_ids = self.sorting1.get_unit_ids()
self.unit2_ids = self.sorting2.get_unit_ids()

self.ensure_symmetry = ensure_symmetry

self._do_agreement()
self._do_matching()

@@ -85,7 +88,7 @@ def _do_agreement(self):

# matrix of event match count for each pair
self.match_event_count = make_match_count_matrix(
-            self.sorting1, self.sorting2, self.delta_frames, n_jobs=self.n_jobs
+            self.sorting1, self.sorting2, self.delta_frames, ensure_symmetry=self.ensure_symmetry
)

# agreement matrix score for each pair
@@ -153,6 +156,7 @@ def __init__(
delta_time=delta_time,
match_score=match_score,
chance_score=chance_score,
ensure_symmetry=True,
n_jobs=n_jobs,
verbose=verbose,
)
@@ -285,6 +289,7 @@ def __init__(
delta_time=delta_time,
match_score=match_score,
chance_score=chance_score,
ensure_symmetry=False,
n_jobs=n_jobs,
verbose=verbose,
)
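The `ensure_symmetry` flag threaded through the constructors above is exercised by `test_make_match_count_matrix_ensure_symmetry` further down: with `ensure_symmetry=True`, the match-count matrix for (sorting1, sorting2) must equal the transpose of the one for (sorting2, sorting1). The sketch below illustrates why such a guarantee is needed, using a brute-force count on plain spike-time arrays; it is a conceptual illustration only, not the implementation in comparisontools.py.

```python
import numpy as np

def count_matched_spikes(train_a, train_b, delta_frames):
    """Count spikes of train_a with at least one spike of train_b within
    +/- delta_frames (each spike of train_a counted at most once)."""
    train_b = np.asarray(train_b)
    return sum(int(np.any(np.abs(train_b - t) <= delta_frames)) for t in train_a)

# Spike times borrowed from the symmetry test below (unit labels ignored).
train1 = np.array([100, 102, 105, 120, 1000])
train2 = np.array([101, 150, 1000])
delta_frames = 100

m12 = count_matched_spikes(train1, train2, delta_frames)
m21 = count_matched_spikes(train2, train1, delta_frames)

# Counting from the two sides gives different totals (5 vs 3), so a symmetric
# convention has to be enforced explicitly when the matrix is expected to
# satisfy M(sorting1, sorting2) == M(sorting2, sorting1).T
print(m12, m21)  # 5 3
```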
100 changes: 88 additions & 12 deletions src/spikeinterface/comparison/tests/test_comparisontools.py
@@ -135,6 +135,73 @@ def test_make_match_count_matrix_repeated_matching_but_no_double_counting():
assert_array_equal(result.to_numpy(), expected_result)


def test_make_match_count_matrix_repeated_matching_but_no_double_counting_2():
# A more challenging case that failed with the previous approach based on np.where and np.diff.
# The current implementation would also fail here, but clipping the match count by the number of spikes
# protects the result. This is a workaround, but acceptable for real corner cases (bursts in the ground truth).
frames_spike_train1 = [100, 105, 110]
frames_spike_train2 = [
100,
105,
]
unit_indices1 = [0, 0, 0]
unit_indices2 = [
0,
0,
]
delta_frames = 20  # long enough that all frames in both sortings are within each other's reach

sorting1, sorting2 = make_sorting(frames_spike_train1, unit_indices1, frames_spike_train2, unit_indices2)

# this is easy because it is sorting2-centric
result = make_match_count_matrix(sorting2, sorting1, delta_frames=delta_frames, ensure_symmetry=False)
expected_result = np.array([[2]])
assert_array_equal(result.to_numpy(), expected_result)

# this works only because we protect by clipping
result = make_match_count_matrix(sorting1, sorting2, delta_frames=delta_frames, ensure_symmetry=False)
expected_result = np.array([[2]])
assert_array_equal(result.to_numpy(), expected_result)


def test_make_match_count_matrix_ensure_symmetry():
frames_spike_train1 = [
100,
102,
105,
120,
1000,
]
unit_indices1 = [0, 2, 1, 0, 0]
frames_spike_train2 = [101, 150, 1000]
unit_indices2 = [0, 1, 0]
delta_frames = 100

sorting1, sorting2 = make_sorting(frames_spike_train1, unit_indices1, frames_spike_train2, unit_indices2)

result = make_match_count_matrix(sorting1, sorting2, delta_frames=delta_frames, ensure_symmetry=True)
result_T = make_match_count_matrix(sorting2, sorting1, delta_frames=delta_frames, ensure_symmetry=True)

assert_array_equal(result.T, result_T)


def test_make_match_count_matrix_test_proper_search_in_the_second_train():
"Search exhaustively in the second train, but only within the delta_frames window, do not terminate search early"
frames_spike_train1 = [500, 600, 800]
frames_spike_train2 = [0, 100, 200, 300, 500, 800]
unit_indices1 = [0, 0, 0]
unit_indices2 = [0, 0, 0, 0, 0, 0]
delta_frames = 20

sorting1, sorting2 = make_sorting(frames_spike_train1, unit_indices1, frames_spike_train2, unit_indices2)

result = make_match_count_matrix(sorting1, sorting2, delta_frames=delta_frames)

expected_result = np.array([[2]])

assert_array_equal(result.to_numpy(), expected_result)


def test_make_agreement_scores():
delta_frames = 10

@@ -150,15 +217,15 @@ def test_make_agreement_scores():
[0, 0, 5],
)

-    agreement_scores = make_agreement_scores(sorting1, sorting2, delta_frames, n_jobs=1)
+    agreement_scores = make_agreement_scores(sorting1, sorting2, delta_frames)
print(agreement_scores)

ok = np.array([[2 / 3, 0], [0, 1.0]], dtype="float64")

assert_array_equal(agreement_scores.values, ok)

-    # test if symetric
-    agreement_scores2 = make_agreement_scores(sorting2, sorting1, delta_frames, n_jobs=1)
+    # test if symmetric
+    agreement_scores2 = make_agreement_scores(sorting2, sorting1, delta_frames)
assert_array_equal(agreement_scores, agreement_scores2.T)


@@ -178,7 +245,7 @@ def test_make_possible_match():
[0, 0, 5],
)

-    agreement_scores = make_agreement_scores(sorting1, sorting2, delta_frames, n_jobs=1)
+    agreement_scores = make_agreement_scores(sorting1, sorting2, delta_frames)

possible_match_12, possible_match_21 = make_possible_match(agreement_scores, min_accuracy)

@@ -207,7 +274,7 @@ def test_make_best_match():
[0, 0, 5],
)

-    agreement_scores = make_agreement_scores(sorting1, sorting2, delta_frames, n_jobs=1)
+    agreement_scores = make_agreement_scores(sorting1, sorting2, delta_frames)

best_match_12, best_match_21 = make_best_match(agreement_scores, min_accuracy)

@@ -236,7 +303,7 @@ def test_make_hungarian_match():
[0, 0, 5],
)

-    agreement_scores = make_agreement_scores(sorting1, sorting2, delta_frames, n_jobs=1)
+    agreement_scores = make_agreement_scores(sorting1, sorting2, delta_frames)

hungarian_match_12, hungarian_match_21 = make_hungarian_match(agreement_scores, min_accuracy)

@@ -344,8 +411,8 @@ def test_do_confusion_matrix():

event_counts1 = do_count_event(sorting1)
event_counts2 = do_count_event(sorting2)
-    match_event_count = make_match_count_matrix(sorting1, sorting2, delta_frames, n_jobs=1)
-    agreement_scores = make_agreement_scores(sorting1, sorting2, delta_frames, n_jobs=1)
+    match_event_count = make_match_count_matrix(sorting1, sorting2, delta_frames)
+    agreement_scores = make_agreement_scores(sorting1, sorting2, delta_frames)
hungarian_match_12, hungarian_match_21 = make_hungarian_match(agreement_scores, min_accuracy)

confusion = do_confusion_matrix(event_counts1, event_counts2, hungarian_match_12, match_event_count)
@@ -363,8 +430,8 @@ def test_do_confusion_matrix():

event_counts1 = do_count_event(sorting1)
event_counts2 = do_count_event(sorting2)
-    match_event_count = make_match_count_matrix(sorting1, sorting2, delta_frames, n_jobs=1)
-    agreement_scores = make_agreement_scores(sorting1, sorting2, delta_frames, n_jobs=1)
+    match_event_count = make_match_count_matrix(sorting1, sorting2, delta_frames)
+    agreement_scores = make_agreement_scores(sorting1, sorting2, delta_frames)
hungarian_match_12, hungarian_match_21 = make_hungarian_match(agreement_scores, min_accuracy)

confusion = do_confusion_matrix(event_counts1, event_counts2, hungarian_match_12, match_event_count)
@@ -391,8 +458,8 @@ def test_do_count_score_and_perf():

event_counts1 = do_count_event(sorting1)
event_counts2 = do_count_event(sorting2)
-    match_event_count = make_match_count_matrix(sorting1, sorting2, delta_frames, n_jobs=1)
-    agreement_scores = make_agreement_scores(sorting1, sorting2, delta_frames, n_jobs=1)
+    match_event_count = make_match_count_matrix(sorting1, sorting2, delta_frames)
+    agreement_scores = make_agreement_scores(sorting1, sorting2, delta_frames)
hungarian_match_12, hungarian_match_21 = make_hungarian_match(agreement_scores, min_accuracy)

count_score = do_count_score(event_counts1, event_counts2, hungarian_match_12, match_event_count)
@@ -415,6 +482,15 @@

if __name__ == "__main__":
test_make_match_count_matrix()
test_make_match_count_matrix_sorting_with_itself_simple()
test_make_match_count_matrix_sorting_with_itself_longer()
test_make_match_count_matrix_with_mismatched_sortings()
test_make_match_count_matrix_no_double_matching()
test_make_match_count_matrix_repeated_matching_but_no_double_counting()
test_make_match_count_matrix_repeated_matching_but_no_double_counting_2()
test_make_match_count_matrix_test_proper_search_in_the_second_train()
test_make_match_count_matrix_ensure_symmetry()

test_make_agreement_scores()

test_make_possible_match()