MiRo is a small pet-like robot intended to be a companion. It is equipped with a matrix of tactile sensors on its head and back. The values of these tactile sensors are acquired via a ROS-based subscriber. The goal of this assignment is to process touch patterns using synchronised neural oscillators, each one associated with a specific tactile sensor. When nearby tactile sensors are excited with similar stimuli, the corresponding neural oscillators synchronise (i.e., they become coupled) and their values are used to interpret the touch pattern on MiRo.
- Created a simple ROS subscriber that receives data from the MiRo sensors and saves them to a file
- The node writes to the file the timestamp, the number of activated sensors and the cluster group of each sensor. Note that it writes a row only if at least one sensor is activated.
- Mapped the MiRo sensors by "flattening" its body. This is needed to understand which sensors are near each other and therefore which ones must be synchronised (if they are active). The map is created by measuring the distances between MiRo's sensors and writing them to the MiroMap.txt file
- Modified the MATLAB code for our case:
  - Simplified the "main" script SlotineExperiment.m
  - Modified readMap to read MiroMap.txt
  - Modified ReadInput and ActivateOscillators to work with the MiRo sensors
  - Created the findSyncronizations.m function to determine which sensors are coupled with each other, starting from the values of the oscillator curves (see the sketch after this list)
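A minimal sketch of the clustering idea behind findSyncronizations.m, assuming the oscillator outputs have already been sampled into a matrix `obs` (one row per oscillator, one column per observation) and that `tol` is the percentage-error threshold described later in this document; the real function signature and variable names may differ:

```matlab
function clusters = findSyncSketch(obs, active, tol)
% Illustrative sketch only, NOT the actual findSyncronizations.m.
% obs    : nOsc-by-nSamples matrix of sampled oscillator values for one step
% active : logical vector, true for the sensors activated in this step
% tol    : maximum mean relative difference to consider two oscillators coupled
    nOsc = size(obs, 1);
    clusters = zeros(nOsc, 1);            % 0 = not activated / not yet clustered
    nextCluster = 1;
    for i = 1:nOsc
        if ~active(i) || clusters(i) ~= 0
            continue;                      % skip inactive or already clustered oscillators
        end
        clusters(i) = nextCluster;
        for j = i+1:nOsc
            if ~active(j) || clusters(j) ~= 0
                continue;
            end
            % mean relative difference between the two sampled curves
            relErr = mean(abs(obs(i,:) - obs(j,:)) ./ max(abs(obs(i,:)), eps));
            if relErr < tol
                clusters(j) = nextCluster; % same synchronised cluster
            end
        end
        nextCluster = nextCluster + 1;
    end
end
```

The actual function works step by step over the activation sequence and accumulates the per-step cluster labels into the syncroMatrix shown in the heat map below.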
This figure shows the neural oscillator responses (only those activated at least once in the pattern). It is a zoom on steps 97-98-99 (the x axis is multiplied by tf, which is 100). Two clusters are visible: the blue curve is actually two curves overlying each other.
This figure shows a heat map of the syncroMatrix, which indicates for each step (each row) which sensors are not activated (gray) and which are activated (colored). Different colors correspond to different clusters. In this case we used method 4 (nine observations around three maxima), and the few errors are visible.

With findSyncronizations everything works well for simple activation sequences: errors with these patterns are always zero with every method. With the more complex pattern (touching both head and body) it finds the right clusters of touched sensors almost every time. Sometimes it finds false synchronisations (oscillators reported as coupled that are not really coupled, and coupled ones reported as not coupled), because we happen to sample unfortunate values of the curves. We tried taking 5 or 9 values of each curve for each step (a step is a row in the activation sequence), averaging them, in different parts of the step and with different methods (selectable as an input of findSyncronizations.m; the percentage error used to decide whether sensors are coupled is also settable):
- Five observations over the whole step time (dividing it into equal subTimes)
  - 11 errors with the headBody pattern (0.09%)
- Nine observations over the whole step time (dividing it into equal subTimes)
  - 10 errors with the headBody pattern (0.08%)
- Five observations only on the second half of the step time (to give the curves time to settle)
  - 12 errors with the headBody pattern (0.1%)
- Nine observations: we take the three maxima of the first activated curve and sample the curves at three nearby points for each maximum
  - This is currently the method that gives the fewest errors with the complex headBody_caress pattern (only 5, i.e. 0.04%, as shown in the heat map above), because the points where the curves differ most are near the maxima. However, compared with the other methods, it also produces errors where oscillators that should be coupled are reported as not coupled (a sketch of this sampling strategy follows the list)
- Nine observations at random times within the step
  - The number of errors is variable, but it is never less than 10 (>0.1%)
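Below is a minimal sketch of the sampling idea of method 4 (three points around each of the three largest peaks of the first activated curve, nine observations in total). The helper name, the signature and the peak-picking details are illustrative assumptions, not taken from the project code:

```matlab
function obs = samplePeaksSketch(curves, firstActive)
% Illustrative sketch of the "three maxima" sampling (method 4), not the project code.
% curves      : nOsc-by-nTime matrix of oscillator outputs within one step
% firstActive : index of the first activated oscillator, whose peaks are used
% obs         : nOsc-by-9 matrix of samples (three points around each of three peaks)
    ref = curves(firstActive, :);
    % local maxima of the reference curve (no toolbox dependency)
    isPeak = [false, (ref(2:end-1) > ref(1:end-2)) & (ref(2:end-1) > ref(3:end)), false];
    locs = find(isPeak);
    [~, order] = sort(ref(locs), 'descend');
    locs = locs(order(1:min(3, numel(order))));      % keep the three largest peaks
    nTime = size(curves, 2);
    idx = [];
    for k = 1:numel(locs)
        idx = [idx, max(locs(k)-1, 1), locs(k), min(locs(k)+1, nTime)]; %#ok<AGROW>
    end
    obs = curves(:, idx);   % these samples can then be averaged per curve or compared pairwise
end
```

Sampling near the peaks targets the points where the curves of different clusters differ the most, which is why this method tends to give the fewest errors on the complex pattern.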
- listenerMiro.cpp, the node of the sensor_acquisition package inside src/sensor_acquisition/src (ROS subscriber that stores the values from the MiRo sensors)
- MATLAB functions inside the matlab folder (SlotineExperiment.m is the main script)
- Inside the doc folder there are the MATLAB publish outputs of each function and the Doxygen documentation (open index.html) for listenerMiro.cpp
Touch patterns acquired with listenerMiro.cpp are already available inside matlab/activation_sequence. They are simple caresses on the body, on the head, and on both (the "complex" one). There is also the txt file with the coordinates of MiRo's sensors (given as input to readMap). In MATLAB, run SlotineExperiment.m (this is the main script). If you want, you can change a few things (a short usage sketch follows this list):
- To change the activation sequence, specify it as the argument of the ReadInput function
- As arguments of ActivateOscillators you can change ti and tf (the smaller their range, the more the errors may increase, because the curves have less time to settle in each step). You can also change step if you want to take only some rows of the chosen activation sequence
- Inside each function you can show more figures by uncommenting the final parts of the code
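The snippet below is a usage sketch of the parameters described above; the file names, argument lists and values shown are assumptions, so check the MATLAB publish outputs in the doc folder for the real signatures:

```matlab
% Illustrative usage only: argument lists and file names are assumptions.
map  = readMap('MiroMap.txt');               % sensor coordinates on the flattened body
seq  = ReadInput('headBody_caress');         % one of the recorded activation sequences
ti = 0;  tf = 100;  step = 1;                % a smaller tf - ti gives the curves less time to settle
curves = ActivateOscillators(seq, map, ti, tf, step);
method = 4;                                  % e.g. nine observations around three maxima
tol    = 0.05;                               % percentage error used to decide coupling
syncroMatrix = findSyncronizations(curves, method, tol);
```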
If you want to acquire new patterns with MiRo, you need to compile and run the listenerMiro node:
- Be sure to have ROS installed on your system and to have set up your PC correctly to communicate with MiRo (instructions can be found on the MiRo website)
- Inside the catkin_miro folder, run catkin_make
- Run roscore; then, in another terminal, run source ./devel/setup.bash
- Launch the node with "rosrun sensor_acquisition listener_Miro <name of the activation sequence you want>"
Note: we found that on some systems catkin_make fails because it cannot find the header file generated from platform_sensors_msg.msg. If this happens, copy the file platform_sensors_msg.h (located in the HeaderCatError folder) to devel/include/sensor_acquisition/ and run catkin_make again.
Davide Torielli & Fabio Fusaro, for the "Software Architectures for Robotics" course 2017/2018.
Starting from the code by Fulvio Mastrogiovanni (which aims at validating the experiment in Wang and Slotine's paper; last modified on July 19, 2010), as modified by Barbara Bruno & Jorhabib E. Gomez for the "Software Architectures for Robotics" course 2010/2011.