Activity Tracking
We track activity using optic flow estimation and discard all of the video.
Why?
- It gets the job done (general activity estimate).
- It's computationally inexpensive. As of 2020, DeepLabCut is still too computationally expensive to run in real time on a Raspberry Pi 4.
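
For reference, here is a minimal sketch of how a per-frame movement score can be derived from dense optic flow with OpenCV. The capture source, Farneback parameters, and output file name are illustrative; the actual implementation lives in `opt_flow.py` and may differ.

```python
# Minimal sketch: per-frame movement from dense optic flow (Farneback).
# Assumes OpenCV and a source readable by cv2.VideoCapture; parameters
# and file names are illustrative, not the repo's exact implementation.
import cv2
import numpy as np

cap = cv2.VideoCapture(0)          # 0 = first camera; a video file path also works
ok, frame = cap.read()
prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

movement = []                      # one activity value per frame
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Dense optic flow between consecutive frames
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        pyr_scale=0.5, levels=3, winsize=15,
                                        iterations=3, poly_n=5, poly_sigma=1.2,
                                        flags=0)
    # Summarise the flow field as the mean magnitude of displacement vectors
    mag, _ = cv2.cartToPolar(flow[..., 0], flow[..., 1])
    movement.append(float(np.mean(mag)))
    prev_gray = gray               # frames are never stored, only the score

cap.release()
np.save("movement.npy", np.array(movement))
```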
TODO: add gif showing the movement quantification in open field
An open field video is used as an example. In this case, movement was calculated in two ways:
- Ground truth: quantification of the mouse movement using a background subtraction technique (a sketch follows this list).
- Inferred movement: quantification of the mouse movement using optic flow (opt_flow.py, sketched above).
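
A minimal sketch of the background subtraction ground truth, assuming OpenCV's MOG2 subtractor and using the frame-to-frame displacement of the foreground centroid as the movement metric; the file name and exact metric are illustrative and may differ from the analysis used here.

```python
# Minimal sketch: "ground truth" movement via background subtraction.
# Assumes OpenCV's MOG2 subtractor and a video file named open_field.mp4
# (both illustrative assumptions).
import cv2
import numpy as np

cap = cv2.VideoCapture("open_field.mp4")
subtractor = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16)

prev_centroid = None
movement = []                      # frame-to-frame displacement of the mouse
while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = subtractor.apply(frame)                    # foreground = moving mouse
    mask = cv2.threshold(mask, 200, 255, cv2.THRESH_BINARY)[1]  # drop shadows
    ys, xs = np.nonzero(mask)
    if len(xs) == 0:
        movement.append(0.0)
        continue
    centroid = np.array([xs.mean(), ys.mean()])       # centre of the foreground blob
    if prev_centroid is not None:
        movement.append(float(np.linalg.norm(centroid - prev_centroid)))
    prev_centroid = centroid

cap.release()
np.save("ground_truth.npy", np.array(movement))
```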
The inferred movement shows a pattern similar to the ground truth.
Another way to explore this is to look at the cumulative movement during the trial.
Additionally, correlating the quantified movement at each timepoint yields a good linear relationship between the two quantification methods.
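
A minimal sketch of these two comparisons, assuming the movement traces were saved as the .npy files from the sketches above; scipy and matplotlib are used for the regression and plots, and all names are illustrative.

```python
# Minimal sketch: cumulative movement and per-timepoint correlation
# between the two movement traces (file names are assumptions).
import numpy as np
from scipy import stats
import matplotlib.pyplot as plt

ground_truth = np.load("ground_truth.npy")
inferred = np.load("movement.npy")
n = min(len(ground_truth), len(inferred))          # align trace lengths
ground_truth, inferred = ground_truth[:n], inferred[:n]

# Cumulative movement over the trial
plt.plot(np.cumsum(ground_truth), label="background subtraction")
plt.plot(np.cumsum(inferred), label="optic flow")
plt.xlabel("frame")
plt.ylabel("cumulative movement (a.u.)")
plt.legend()
plt.show()

# Linear relationship between the two quantifications
slope, intercept, r, p, se = stats.linregress(inferred, ground_truth)
print(f"r = {r:.2f}, slope = {slope:.2f}, p = {p:.3g}")
```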