
Flylab Software Overview
---------------------------
Flylab is a platform for running experiments on flies in walking arenas.  An installation may combine any of the following:
* realtime multi-fly tracking
* closed-loop (visual servoing) interaction with mechanical fly-sized robots
* closed-loop laser-galvo steering to stimulate flies dynamically and remotely
* closed-loop interaction with LED display panels (q.v. M. Reiser)
* a fairly versatile architecture to conduct a wide range of experiments with minimal programming on the part of the researcher
* data-logging of fly & robot states
* a number of other related pieces

The main components of Flylab are the experiments system, the image processing & tracking system, robotic actuation (motorarm or fivebar), ledpanel control, and laser galvo control.

Your primary interaction with Flylab will be through the experiments system.  You describe the experiment you want to conduct by filling out a data structure in an experiment file.  When you run the associated .launch file, Flylab controls all the hardware and records a log file of what the flies did during each trial of your experiment.

When the experiment is running, you'll see a couple of windows, the most obvious of which is ROS Rviz, which shows a 3D view of the arena with the camera image(s) and the tracking results.  Another smaller window (FlylabGUI) has a few buttons that let you pause/continue/stop the experiment, take a background image, etc.



Contents:
---------------------------
Installation
Calibration
Running
Experiments
Hardware Independence
Units
Background Subtraction
Image Processing & Tracking
LED Panels
Robotic Actuators
Laser Galvos
Record Video for Later
Retrack Prerecorded .bag file
Extract images or video from a .bag file
Parameters
Camera Parameters
ROS Commands



Installation:
---------------------------
You need:
* Arena hardware (at a minimum, a camera and a place for flies to be viewed)
* ROS
* Flylab
* Misc other software bits

Refer to the file 'install.txt' for detailed instructions.



Calibration
---------------------------
Calibration is the process of getting the various coordinate systems to line up (camera, actuators, etc).  

Camera calibration has two parts: distortion (i.e. intrinsic parameters), and converting pixel units to millimeters (extrinsic parameters).

Laser galvo calibration involves converting voltages to millimeters, and there is a semi-automated process for doing that.
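
Conceptually, that mapping can be modeled as an affine transform fit from paired voltage/millimeter samples.  The sketch below illustrates the idea with numpy; it is not the Flylab calibration routine, and the sample values are made up.

    import numpy as np

    # Paired calibration samples: galvo voltages and measured spot
    # positions in arena millimeters (values illustrative).
    volts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
    mm    = np.array([[0.0, 0.0], [20.0, 1.0], [-1.0, 20.0], [19.0, 21.0]])

    # Append a constant column so the least-squares fit includes an offset.
    A = np.hstack([volts, np.ones((len(volts), 1))])
    M = np.linalg.lstsq(A, mm, rcond=None)[0]      # 3x2 affine matrix

    def volts_to_mm(v):
        """Map a (vx, vy) voltage pair to (x, y) in millimeters."""
        return np.append(v, 1.0) @ M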

Robotic actuator calibration aligns the robot position with the camera position.

Calibration instructions for the cameras are in the file Flylab/calibrate_camera.txt.
Calibration instructions for the galvos are in the file Flylab/calibrate_galvos.txt.



Running:
---------------------------
To run Flylab, you first need to ensure that roscore is running.  In a terminal window (note below that the $ is the command prompt, not typed by you):
$ roscore

Then in another terminal window use the roslaunch command to launch Flylab:
$ roslaunch experiments nameofexperiment.launch



Experiments
---------------------------
Here is an overview, but see the file Flylab/readme_experiments.txt for more detail.  

An experiment is a sequence of trials, and is defined by a set of parameters that you edit in an experiment file (for examples, see Flylab/experiments/nodes/*.py).  A given experiment can use or not use the available hardware (motors, led panels, galvos, etc), and the behavior of each piece of hardware can be specified in detail.  The parameters are quite flexible, and once you get familiar with them it should be straightforward to conduct nearly any experiment you desire.
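
The sketch below shows the general shape of such a parameter structure.  The container and most of the field names here are placeholders (save.bag and save.mov are real flags described later in this document), so copy a real file from Flylab/experiments/nodes/ rather than starting from this:

    # Placeholder sketch of an experiment's parameter structure.
    from types import SimpleNamespace as NS

    experimentparams = NS(
        experiment   = NS(description='example experiment'),
        trial        = NS(count=10),               # run ten trials
        save         = NS(bag=False, mov=False),   # per-trial recording flags
        triggerEntry = NS(distanceMax=5.0),        # hypothetical trigger: start a
    )                                              # trial when a fly is within 5 mm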

Each experiment consists of one or more trials, and the results of each trial are written to a log file (in .csv format).  The .csv file contains, at each timestep, the "arena state", i.e. the positions and velocities of all the objects (robots & flies) in the arena.

In the "experiments" directory there is a set of experiment files (*.py).  An experiment is started via its <experimentname>.launch file, which starts all of the other needed Flylab components.  

Internally an experiment runs as a state machine, which just means that it goes through a sequence of states like a flowchart.  Progress through the flowchart is partly determined by a triggering mechanism, whose trigger conditions you specify in the experiment parameters.

During operation, you may view the state machine graph of the experiment with the following ROS command:
$ rosrun smach_viewer smach_viewer.py

Experiment parameters are documented in the experiment .py files; please refer to those files, and to Flylab/readme_experiments.txt, for examples and more detail.



Hardware Independence
---------------------------
The Flylab code is independent of the specific arena hardware.  This is accomplished by using parameters that are specific to your hardware (i.e. RIG).  The parameters are set in a subdirectory of the Flylab/config directory, and each machine sets an environment variable 'RIG' that tells Flylab which set of parameters to use (this should have been done during Flylab install).  A given experiment can run on any rig (up to the rig's capabilities), and a given rig will do its best to run any experiment.
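
For example (typically set in your shell startup file; the rig name matches a subdirectory of Flylab/config):
$ export RIG=yourrigname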



Units
---------------------------
Angle units are radians, with positive going counterclockwise.
Distance units are millimeters.  

The axes shown in Rviz are X=red, Y=green, Z=blue.



Background Subtraction
---------------------------
Background subtraction is initialized from the ~/background.png image on disk, and the background is updated as a moving average with time constant rc_background (in seconds). 
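
In other words, each new camera frame is blended into the stored background with a weight set by the time constant.  A minimal sketch of one common discretization (not necessarily the exact one Flylab uses):

    import math

    # Exponential moving-average background update.  bg and frame are float
    # images; dt is the number of seconds elapsed since the previous frame.
    def update_background(bg, frame, dt, rc_background):
        alpha = 1.0 - math.exp(-dt / rc_background)   # blend weight for new frame
        return (1.0 - alpha) * bg + alpha * frame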



Image Processing & Tracking
---------------------------
When an image comes in from the camera, it goes through a processing sequence that returns an ArenaState, i.e. the list of objects and their positions and velocities.  The steps are listed below, with a conceptual sketch after the list.

1. Subtract the background image from the camera image.
2. Apply a threshold to make the image black & white.
3. Find the contours (i.e. edge pixels) around each object.
4. Match the visual objects with their corresponding tracked model object.
5. Decide which end of the fly is the head, i.e. orientation detection.  See param rcFilterFlip.
6. Publish the results as an ArenaState message.
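
Here is a conceptual OpenCV sketch of steps 1-3; it is not the Flylab tracker itself, and it assumes grayscale uint8 images and OpenCV 4.x signatures.

    import cv2

    def find_object_contours(image, background, threshold=20):
        diff = cv2.absdiff(image, background)                 # 1. subtract background
        _, bw = cv2.threshold(diff, threshold, 255,
                              cv2.THRESH_BINARY)              # 2. threshold to B&W
        contours, _ = cv2.findContours(bw, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)  # 3. find contours
        return contours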

When Flylab is running, you can see the ArenaState messages via the command:
$ rostopic echo ArenaState



LED Panels
---------------------------
The ROS node called ledpanels listens to the topic "ledpanels/command", and converts "MsgPanelsCommand" messages to corresponding serial commands for the panels controller.
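
For example, a minimal publisher might look like the following sketch.  The import path and the message fields shown are assumptions; check the ledpanels package for the real definitions.

    # Sketch only: import path and field names are assumptions.
    import rospy
    from ledpanels.msg import MsgPanelsCommand

    rospy.init_node('panels_example')
    pub = rospy.Publisher('ledpanels/command', MsgPanelsCommand, queue_size=1)
    rospy.sleep(0.5)              # give the subscriber time to connect
    msg = MsgPanelsCommand()
    msg.command = 'all_on'        # hypothetical command name
    pub.publish(msg)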



Robotic Actuators
---------------------------
Here is an architectural overview.  Please see the file Flylab/readme_experiments.txt for more detail on the operation of the robot.

There are two mechanisms currently supported, the Fivebar and the Motorarm.  The Fivebar mechanism has two degrees of freedom, and allows movement in both X and Y within its workspace.  The Motorarm is a single degree-of-freedom mechanism that moves in a circle.  Each mechanism performs visual servoing when provided with the visual position (via the "visual_state" message callback).

The ROS node for each supports the same interface:
Messages: 
    "experiment/command", with commands "continue", "pause_now", and "exit_now"

Services:
    "set_stage_state_ref"    Set the reference state for the mechanism.  
                             It will go there.
    "signal_input"           Set the reference state for the mechanism.  
                             It will go there.
    "get_stage_state"        Return the current state of the mechanism.
    "home_stage"             Go to the home position.
    "calibrate_stage"        Find the limit switches, then go to home position.

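For example, assuming the services live under the node's namespace and home_stage takes no arguments (verify with rosservice info):
$ rosservice list | grep stage    # discover the actual service names
$ rosservice call home_stage      # go to the home position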


Laser Galvos
---------------------------
Here is an overview.  Please see the following files for more detail on the operation of the galvos:
* Flylab/readme_experiments.txt 
* Flylab/calibrate_galvos.txt 
 
 
To run an experiment using the galvos:
 1. Turn on illumination.
 2. Turn on the ThorLabs "Laser Diode Controller".
 3. Turn on the Acopian "Galvos Power Supply".
 4. Turn on the "Enable IR Laser" switch.
 5. Verify that all the safety panels (and their switches) are closed.  
 6. The laser diode controller "OPEN" LED should be off.
 7. Adjust the laser power as desired.
 8. Run roscore.
 9. On the Windows galvobox machine, run galvodriver (with roscore already running).
10. roslaunch experiments <nameofexperiment>.launch


At present, the galvos use a National Instruments DAQ, which requires Windows.  Consequently, the galvodriver runs on a Windows machine (e.g. galvobox), while the rest of Flylab runs on Ubuntu.  See Flylab/install.txt for details.

Some notes on the software architecture:
The galvos software is split into three components: the GalvoDirector, the GalvoDriver, and the PatternGenerator.  Their functions are as follows.

GalvoDirector:  Listens for galvodirector/command messages, where a command is essentially a list of patterns to be drawn.  Using the PatternGenerator, it creates a point cloud template for each commanded pattern; then, every time an ArenaState message arrives, it transforms each template into the proper frame, combines the point clouds into one, and publishes the result to the GalvoDriver.

GalvoDriver:  Listens for galvodriver/pointcloud messages.  When a point cloud is received, it loops through the points continuously, sending them to the laser galvos (via the DAQ).

PatternGenerator:  Offers the service "GetPatternPoints" to convert a pattern request to a pointcloud.
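
The per-ArenaState transform in the GalvoDirector amounts to rotating and translating each template by the pose of its target.  Here is a minimal sketch of that step (not Flylab's actual code):

    import numpy as np

    # Transform template points (in the target's frame) into the arena frame.
    # points_xy: Nx2 array; (x, y, theta): the target's pose in the arena.
    def transform_template(points_xy, x, y, theta):
        c, s = np.cos(theta), np.sin(theta)
        R = np.array([[c, -s], [s, c]])            # rotation by the target's heading
        return points_xy @ R.T + np.array([x, y])  # rotate, then translate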



Record Video for Later
---------------------------
To save the camera images and related data from each trial for later use, simply set self.experimentparams.save.bag=True in your experiment file.  Each trial will create a time-stamped .bag file in the ~/FlylabData directory.

To also save a .mov file of each trial, set self.experimentparams.save.mov=True.
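
For example, in your experiment file:
    self.experimentparams.save.bag = True    # record a .bag of each trial
    self.experimentparams.save.mov = True    # also render a .mov of each trial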



Retrack Prerecorded .bag file
---------------------------
You may replay a .bag file as if it were happening live, and Flylab will retrack the objects and write a new .csv file.

If environment var RETRACK is set to 1, and BAGFILE is set to a file (i.e. "export RETRACK=1" and "export BAGFILE=/path/to/file.bag"), then Flylab uses the given .bag file as a source instead of the camera(s).  For example, to retrack a single .bag file:
$ export RETRACK=1                                      
$ export BAGFILE=/home/rancher/bagfiles/12345.bag       
$ roslaunch experiments <experiment>.launch                         

To sequentially retrack a batch of .bag files, use the Flylab/retracker program.  For example, to retrack multiple .bag files:
$ export BAGFILESPEC=/home/rancher/bagfiles/*.bag       
$ rosrun Flylab retracker



Extract images or video from a .bag file
---------------------------
You may process a .bag file and save the contained images to .png / .mov / .fmf files.  
Specify the .bag file and the image topic on the command line, and optionally the .mov or .fmf filename. 

$ rosrun save bagconverter.py <bagfile> <imagetopic> [filename.mov | filename.fmf]

Examples:
$ rosrun save bagconverter.py yourfile.bag /camera/image_rect /home/rancher/asdf.mov
$ rosrun save bagconverter.py yourfile.bag /camera/image_rect /home/rancher/asdf.fmf


Before you run the above command, it may help to see what's in the .bag via the following command:
$ rosbag info yourfile.bag


Parameters
---------------------------
Each subsystem of Flylab (tracking, ledpanels, galvos, etc) has its own set of parameters, located in a params_*.launch file in the appropriate config/RIG directory.  Please see the respective files for documentation on each parameter.



Camera Parameters
---------------------------
There are three main types of camera: ethernet, firewire, and USB.  Each type uses a different driver (with the exception of Firefly USB cameras, which use the firewire driver), and each driver exposes a different set of parameters.  Some description is given here, but please look at the existing params_camera.launch files in the Flylab/config/RIG/launch directories for examples.

Ethernet, using the camera_aravis driver:
    guid                  String identifying the camera.  Defaults to the 
                          first camera if not set.
    Acquire               True/false turns the camera on/off.
    ExposureAuto          Off/Once/Continuous
    GainAuto              Off/Once/Continuous
    ExposureTimeAbs       Time the shutter is open, in microseconds.
    Gain                  Pixel multiplier, 100 is 1.0
    AcquisitionMode       Single/Continuous
    AcquisitionFrameRate  Frames per second, driven internally to the camera.
    TriggerMode           Off/On
    TriggerSource         Line1/Line2/Software
    softwaretriggerrate   Frames per second, driven by the software driver.
    frame_id              String to use in the frame_id field of the 
                          ROS Image header.
    mtu                   Max transmission unit, bytes per network packet.

Firewire or USB, using the 1394 driver:
    video_mode            String specifying the camera mode, 
                          usually "format7_mode0"
    frame_rate            Frames per second.
    shutter               Time the shutter is open.  (units?)
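
For illustration, here is a hypothetical params_camera.launch fragment using the ethernet parameters listed above.  The node name and type are assumptions, and the values are placeholders; match them to your rig's actual file.

    <launch>
      <node pkg="camera_aravis" type="camnode" name="camera">
        <param name="ExposureTimeAbs"      value="5000"/>  <!-- 5 ms shutter -->
        <param name="Gain"                 value="100"/>   <!-- 100 = 1.0 -->
        <param name="AcquisitionFrameRate" value="30"/>
        <param name="frame_id"             value="camera"/>
        <param name="mtu"                  value="1500"/>
      </node>
    </launch>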
 


ROS Commands:
---------------------------
roslaunch experiments <experimentname>.launch            # Run an experiment.
rostopic echo camera/image_raw/header                    # Show image headers 
                                                         # to see if the camera 
                                                         # is sending images.
rosrun image_view image_view image:=camera/image_rect &  # Open a window to 
                                                         # show image_rect.
rosrun smach_viewer smach_viewer.py                      # Show the state 
                                                         # machine graph.
rqt_console                                              # Show ROS messages, 
                                                         # logs, etc.

