This software package implements a low-level sensor data fusion algorithm: data extracted from tightly synchronized radar and vision sensors is combined by associating the measurements, and the fused measurements are then fed to a central tracking algorithm based on Kalman filter updates.
The first step is to temporally and spatially align the sensor-level object lists from all sensors to a common reference frame, which places every object list in a global coordinate system. Once this is accomplished, the object lists from the different sensors are associated with one another to determine which objects from different sensors correspond to the same real-world object. Combined with other perception modules such as lane detection, digital maps, and host vehicle localization, the fused result serves as input to application-specific situation assessment algorithms for driver assistance; examples of the resulting actions are controlling an actuator, triggering a warning, and changing a state.
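To make these steps concrete, here is a minimal Python sketch of the alignment, association, and update loop, using only numpy and scipy (both listed in the installation steps below). All function names, the greedy nearest-neighbour gating, and the measurement-space track representation are illustrative assumptions, not the package's actual API:

    import numpy as np
    from scipy.spatial.distance import mahalanobis

    def to_global_frame(obj_xy, sensor_pose):
        # Rotate/translate a sensor-level position into the common global
        # frame; sensor_pose = (x, y, yaw) of the sensor in that frame.
        x, y, yaw = sensor_pose
        c, s = np.cos(yaw), np.sin(yaw)
        return np.array([[c, -s], [s, c]]) @ obj_xy + np.array([x, y])

    def associate(track_means, track_covs, measurements, gate=0.75):
        # Greedy nearest-neighbour association with Mahalanobis gating.
        # track_means/track_covs are the tracks' predicted measurement
        # means and covariances; measurements farther than the gate from
        # every track are returned as clutter (false positives).
        pairs, clutter = [], []
        for z in measurements:
            if not track_means:
                clutter.append(z)
                continue
            d = [mahalanobis(z, m, np.linalg.inv(P))
                 for m, P in zip(track_means, track_covs)]
            j = int(np.argmin(d))
            if d[j] <= gate:
                pairs.append((j, z))
            else:
                clutter.append(z)
        return pairs, clutter

    def kalman_update(m, P, z, H, R):
        # Standard Kalman measurement update for one associated pair.
        S = H @ P @ H.T + R              # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)   # Kalman gain
        return m + K @ (z - H @ m), (np.eye(len(m)) - K @ H) @ P

The default gate of 0.75 here mirrors the --clutter_threshold parameter of the demo script described at the end of this document.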
Instructions to enable Python-MATLAB interoperability
1. Create a virtual environment
2. Install Python 3.6
3. Install Python modules:
- numpy v. 1.16.2
- scikit-learn v. 0.19.1
- scipy v. 1.1.0
- sklearn v. 0.0
4. Start MATLAB and set up the Python interpreter for MATLAB with: pyversion('<path_to_python.exe_in_your_virtual_env>')
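As a concrete walk-through of steps 1-4 (assuming Python 3.6 is already installed and on the PATH of a Windows machine; the environment name fusion-env and all paths are placeholders to adapt to your setup):

    python -m venv fusion-env
    fusion-env\Scripts\activate
    pip install numpy==1.16.2 scikit-learn==0.19.1 scipy==1.1.0 sklearn==0.0

Then, in MATLAB:

    pyversion('C:\path\to\fusion-env\Scripts\python.exe')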
After successfully completing the installation steps above, you are ready to run the demo.
To run the demo:
Run SF_Synthetic_Main.m
Arrows represent the velocity vectors of other actors in the scene.
In matlabDemo.py, you can tune the following hyperparameters:
--clutter_threshold (float, default: 0.75): if the Mahalanobis distance exceeds clutter_threshold, the object is flagged as a false positive (clutter).
--last_seen (float, default: 1.0): if a tracked object has not been seen for longer than last_seen, it is deleted from the fusion list.
--distance_to_ego (float, default: 200): distance to the ego vehicle, computed as the L1 norm of the tracked object's state vector.
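For example, the script can be invoked from the command line with explicit values for these flags (the values below are arbitrary, chosen only for illustration):

    python matlabDemo.py --clutter_threshold 0.5 --last_seen 0.5 --distance_to_ego 150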