
RT-GENE: Real-Time Eye Gaze and Blink Estimation in Natural Environments


License + Attribution

This code is licensed under CC BY-NC-SA 4.0 (note that some of the libraries we use are distributed under different licenses; see below). Commercial usage is not permitted; please contact [email protected] or [email protected] regarding commercial licensing. If you use this dataset or the code in a scientific publication, please cite the following paper:

@inproceedings{FischerECCV2018,
author = {Tobias Fischer and Hyung Jin Chang and Yiannis Demiris},
title = {{RT-GENE: Real-Time Eye Gaze Estimation in Natural Environments}},
booktitle = {European Conference on Computer Vision},
year = {2018},
month = {September},
pages = {339--357}
}

RT-GENE was supported in part by the Samsung Global Research Outreach program, and in part by the EU Horizon 2020 Project PAL (643783-RIA).

If you use our blink estimation code, please also cite the following paper:

@inproceedings{CortaceroICCV2019W,
author = {Kevin Cortacero and Tobias Fischer and Yiannis Demiris},
title = {{RT-BENE: A Dataset and Baselines for Real-Time Blink Estimation in Natural Environments}},
booktitle = {Proceedings of the IEEE International Conference on Computer Vision Workshops},
year = {2019}
}

RT-BENE was supported by the EU Horizon 2020 Project PAL (643783-RIA) and a Royal Academy of Engineering Chair in Emerging Technologies to Yiannis Demiris.

More information can be found on the Personal Robotics Laboratory's website: https://www.imperial.ac.uk/personal-robotics/software/.

Requirements

Manual installation

  1. Download, install, and configure ROS (full installation; we recommend the Kinetic or Melodic distribution of ROS depending on your Ubuntu version): http://wiki.ros.org/kinetic/Installation or http://wiki.ros.org/melodic/Installation
  2. Install additional packages for ROS:
    • For Kinetic: `sudo apt-get install python-catkin-tools ros-kinetic-ros-numpy ros-kinetic-camera-info-manager-py ros-kinetic-uvc-camera libcamera-info-manager-dev`
    • For Melodic: `sudo apt-get install python-catkin-tools python-catkin-pkg ros-melodic-uvc-camera libcamera-info-manager-dev`
  3. Install the required Python packages (a quick import check is sketched after these steps):
    • For pip users (we recommend using virtualenv or a similar tool): `pip install tensorflow-gpu numpy scipy tqdm torch torchvision Pillow dlib opencv-python`
    • For conda users (optionally create a new environment first): `conda install -c conda-forge dlib tensorflow-gpu numpy scipy tqdm pillow rospkg opencv empy pytorch torchvision`
  4. Download and build RT-GENE:
    1. `cd $HOME/catkin_ws/src && git clone https://github.com/Tobias-Fischer/rt_gene.git`
    2. `cd $HOME/catkin_ws && catkin build`
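
Once the workspace is built, a minimal sketch like the following can verify that the Python packages from step 3 are importable in the current environment (note that some import names differ from the pip package names, e.g. Pillow imports as PIL and opencv-python as cv2):

```python
# check_deps.py -- sanity-check that the Python packages from step 3
# can be imported in the current environment.
import importlib

# Import names for the pip packages listed above (Pillow -> PIL,
# opencv-python -> cv2).
modules = ["tensorflow", "numpy", "scipy", "tqdm", "torch",
           "torchvision", "PIL", "dlib", "cv2"]

for name in modules:
    try:
        importlib.import_module(name)
        print("OK      {}".format(name))
    except ImportError as err:
        print("MISSING {}: {}".format(name, err))
```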

Optional ensemble model files

  • To use an ensemble scheme with four models trained on the MPII, UTMV, and RT-GENE datasets, you need to adjust the estimate_gaze.launch file (make sure you comply with the licenses of MPII and UTMV; these model files are licensed under CC BY-NC-SA 4.0). A sketch for checking which model files are present follows this list.
  • Open `$(rospack find rt_gene)/launch/estimate_gaze.launch`, comment out `<rosparam param="model_files">['model_nets/Model_allsubjects1.h5']</rosparam>`, and uncomment `<!--rosparam param="model_files">['model_nets/all_subjects_mpii_prl_utmv_0_02.h5', ..., ..., ...</rosparam-->`
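
As a quick sanity check before launching, the sketch below lists which of the referenced model files are already on disk. The workspace path follows the default catkin layout used above, and the file list is abbreviated (the full ensemble list in the launch file is longer); adjust both to match your setup:

```python
# check_models.py -- check which model files referenced in
# estimate_gaze.launch are present under model_nets. The workspace
# path and the (abbreviated) file list below are assumptions.
import os

model_nets = os.path.expanduser("~/catkin_ws/src/rt_gene/rt_gene/model_nets")
model_files = ["Model_allsubjects1.h5",
               "all_subjects_mpii_prl_utmv_0_02.h5"]  # ensemble list is longer

for name in model_files:
    path = os.path.join(model_nets, name)
    status = "found  " if os.path.isfile(path) else "missing"
    print("{} {}".format(status, path))
```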

Model download

Note that the required model files are downloaded automatically the first time the ROS node is started. An alternative mirror can be found here; files obtained from the mirror need to be moved into `$HOME/catkin_ws/src/rt_gene/rt_gene/model_nets`.
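
If you prefer to fetch a model file manually (e.g. from the mirror), a minimal sketch looks like the following; the URL below is a placeholder, substitute the actual mirror link:

```python
# fetch_model.py -- manually download a model file into model_nets.
# The URL below is a placeholder, not the real mirror address.
import os
try:
    from urllib.request import urlretrieve  # Python 3
except ImportError:
    from urllib import urlretrieve  # Python 2 (ROS Kinetic/Melodic)

MIRROR_URL = "https://example.org/rt_gene/Model_allsubjects1.h5"  # placeholder
dest_dir = os.path.expanduser("~/catkin_ws/src/rt_gene/rt_gene/model_nets")

if not os.path.isdir(dest_dir):
    os.makedirs(dest_dir)
urlretrieve(MIRROR_URL, os.path.join(dest_dir, "Model_allsubjects1.h5"))
print("saved to {}".format(dest_dir))
```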

Requirements for live gaze estimation (Kinect One)

Requirements for live gaze estimation (webcam / RGB only)

  • Calibrate your camera (http://wiki.ros.org/camera_calibration).
  • Save the resulting *.yaml file to `$(rospack find rt_gene)/webcam_configs/` (a sketch for inspecting the saved calibration follows this list).
  • Change the camera_info_url entry in the `$(rospack find rt_gene)/launch/start_webcam.launch` file.
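
To double-check the saved calibration before editing the launch file, a minimal sketch is shown below; the file name is an example, and the keys follow the standard camera_calibration YAML layout:

```python
# inspect_calibration.py -- print the intrinsics stored in a
# camera_calibration YAML file. The file name is an example.
import yaml

with open("webcam_configs/my_webcam.yaml") as f:  # adjust the path
    calib = yaml.safe_load(f)

print("camera:     {}".format(calib["camera_name"]))
print("resolution: {}x{}".format(calib["image_width"], calib["image_height"]))
print("K matrix:   {}".format(calib["camera_matrix"]["data"]))
```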

Instructions for estimating gaze

Estimate gaze live from Kinect One

  1. `roscore`
  2. `roslaunch rt_gene start_kinect.launch`
  3. `roslaunch rt_gene estimate_gaze.launch`

Estimate gaze live from Webcam / RGB only camera

  1. `roscore`
  2. `roslaunch rt_gene start_webcam.launch`
  3. `roslaunch rt_gene estimate_gaze.launch`

Estimate gaze from Video

  1. `roscore`
  2. `roslaunch rt_gene start_video.launch` (make sure to change the camera_info_url and video_file arguments)
  3. `roslaunch rt_gene estimate_gaze.launch`

Estimate gaze from ROSBag

  1. `roscore`
  2. `roslaunch rt_gene start_rosbag.launch rosbag_file:=/path/to/rosbag.bag` (this assumes a recording with the Kinect v2 and might need adjustments)
  3. `roslaunch rt_gene estimate_gaze.launch ros_frame:=kinect2_nonrotated_link`
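
In all of the above setups, the estimated gaze is published on ROS topics once estimate_gaze.launch is running. A minimal rospy subscriber is sketched below; the topic name and message type are assumptions for illustration, so check the actual names with `rostopic list` and `rostopic info` while the pipeline runs:

```python
# gaze_listener.py -- minimal sketch of a node consuming RT-GENE output.
# The topic name and message type are assumptions; verify them with
# `rostopic list` and `rostopic info <topic>` while the pipeline runs.
import rospy
from rt_gene.msg import MSG_GazeList  # assumed message type

def on_gaze(msg):
    rospy.loginfo("gaze estimate: %s", msg)

rospy.init_node("gaze_listener")
rospy.Subscriber("/subjects/gazes", MSG_GazeList, on_gaze)  # assumed topic
rospy.spin()
```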

Instructions for estimating blinks

Follow the instructions for estimating gaze above, and additionally run `roslaunch rt_gene estimate_blink.launch`. Note that the blink estimation relies on the extract_landmarks_node.py node; however, it can run independently of the estimate_gaze.py node.
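
The blink output can be consumed the same way as the gaze output above; again, the topic name and message type below are assumptions to verify with `rostopic list`:

```python
# blink_listener.py -- variation of the gaze sketch for blink output.
# Topic name and message type are assumptions; verify with rostopic.
import rospy
from rt_gene.msg import MSG_BlinkList  # assumed message type

rospy.init_node("blink_listener")
rospy.Subscriber("/subjects/blinks", MSG_BlinkList,
                 lambda msg: rospy.loginfo("blink estimate: %s", msg))
rospy.spin()
```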

List of libraries

Code included from other libraries

External libraries required via Python imports