[ENH] Speed up `read_raw_neuralynx()` on large datasets with many gaps #12371
Conversation
I don't think a changelog entry is needed for this so I'll label it as such, but feel free to add one if you want @KristijanArmeni (probably as `12371.newfeature.rst`, since this is an enhancement). I'll mark for merge-when-green, thanks in advance!
Head branch was pushed to by a user without write access
Thanks @larsoner! Added a changelog entry as suggested.
pip-pre failures are unrelated; I opened #12372 to track it.
@larsoner just pinging here that it looks like this was not auto-merged (it was automatically disabled after my changelog commit). But I see pip-pre failures are now sorted and main is passing, so I think this just needs an update from main and should be all green.
Thanks for the ping @KristijanArmeni, merged
(mne-tools#12371) Co-authored-by: Eric Larson <[email protected]>
Reference issue
Fixes #12370.
What does this implement/fix?
Drop `np.where` and compute onsets via `np.cumsum`.

Instead of using `np.where` to iterate over gap indices and search over the recording samples, in this PR I compute all segment start samples by cumulatively summing segment sample sizes (see this change: 5b40d2). This gives starting indices for all segments, from which I then select only those corresponding to gaps. Doing it this way yields substantial improvements over using `np.where`, as confirmed on a few local datasets with increasing numbers of gaps (see Figure in #12370).

Additional information
Any additional information you think is important.
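The cumsum idea described above can be sketched as follows. This is a minimal illustration, not the actual MNE-Python implementation; the function name and arguments are hypothetical:

```python
import numpy as np

def gap_onsets(segment_sizes, is_gap):
    """Return start-sample indices of gap segments.

    segment_sizes : per-segment sample counts (data and gap segments interleaved)
    is_gap        : boolean mask marking which segments are gaps
    """
    sizes = np.asarray(segment_sizes)
    # Start index of each segment = cumulative sum of all preceding segment
    # sizes; prepend 0 for the first segment and drop the final total.
    starts = np.concatenate(([0], np.cumsum(sizes)[:-1]))
    # Keep only the starts that correspond to gap segments.
    return starts[np.asarray(is_gap)]

# Example: segments of 100, 10 (gap), 200, 5 (gap) samples
print(gap_onsets([100, 10, 200, 5], [False, True, False, True]))  # [100 310]
```

This replaces a per-gap search over the recording samples with a single vectorized pass over the segment sizes, which is why the cost stops growing with the number of gaps.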