Gaze Tracking


This is a Python (2 and 3) library that provides a webcam-based eye tracking system. It gives you the exact position of the pupils and the gaze direction, in real time.

Demo

🚀 Quick note: I'm looking for job opportunities as a software developer, for exciting projects in ambitious companies. Anywhere in the world. Send me an email!

Installation

Clone this project:

git clone https://github.com/antoinelame/GazeTracking.git

For Pip install

Install these dependencies (NumPy, OpenCV, Dlib):

pip install -r requirements.txt

The Dlib library has four primary prerequisites: Boost, Boost.Python, CMake and X11/XQuartz. If you don't have them, you can read this article to learn how to install them easily.

For Anaconda install

Install these dependencies (NumPy, OpenCV, Dlib):

conda env create --file environment.yml
# After creating the environment, activate it
conda activate GazeTracking

Verify Installation

Run the demo:

python example.py

Simple Demo

import cv2
from gaze_tracking import GazeTracking

gaze = GazeTracking()
webcam = cv2.VideoCapture(0)

while True:
    # Grab a frame from the webcam and send it to GazeTracking for analysis
    _, frame = webcam.read()
    gaze.refresh(frame)

    # Get a copy of the frame with the pupils highlighted
    new_frame = gaze.annotated_frame()
    text = ""

    if gaze.is_right():
        text = "Looking right"
    elif gaze.is_left():
        text = "Looking left"
    elif gaze.is_center():
        text = "Looking center"

    cv2.putText(new_frame, text, (60, 60), cv2.FONT_HERSHEY_DUPLEX, 2, (255, 0, 0), 2)
    cv2.imshow("Demo", new_frame)

    # Press Esc to quit
    if cv2.waitKey(1) == 27:
        break

webcam.release()
cv2.destroyAllWindows()

Documentation

In the following examples, gaze refers to an instance of the GazeTracking class.

Refresh the frame

gaze.refresh(frame)

Pass the frame to analyze (numpy.ndarray). If you want to work with a video stream, you need to put this instruction in a loop, like the example above.
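
For example, here is a minimal sketch that analyzes a single still image instead of a webcam stream (the file name face.jpg is just a placeholder):

import cv2
from gaze_tracking import GazeTracking

gaze = GazeTracking()
frame = cv2.imread("face.jpg")  # any BGR image loaded as a numpy.ndarray (placeholder path)
gaze.refresh(frame)             # run the detection on this single frame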

Position of the left pupil

gaze.pupil_left_coords()

Returns the coordinates (x,y) of the left pupil.

Position of the right pupil

gaze.pupil_right_coords()

Returns the coordinates (x,y) of the right pupil.
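
For example, both pupil positions can be printed after a call to gaze.refresh(frame). A minimal sketch, assuming the methods return None when no pupil could be located in the frame:

import cv2
from gaze_tracking import GazeTracking

gaze = GazeTracking()
frame = cv2.imread("face.jpg")  # placeholder image path
gaze.refresh(frame)

left_pupil = gaze.pupil_left_coords()
right_pupil = gaze.pupil_right_coords()

# Assumption: the coordinates may be None when no pupil was located in the frame
if left_pupil is not None and right_pupil is not None:
    print("Left pupil (x, y):", left_pupil)
    print("Right pupil (x, y):", right_pupil)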

Looking to the left

gaze.is_left()

Returns True if the user is looking to the left.

Looking to the right

gaze.is_right()

Returns True if the user is looking to the right.

Looking at the center

gaze.is_center()

Returns True if the user is looking at the center.

Horizontal direction of the gaze

ratio = gaze.horizontal_ratio()

Returns a number between 0.0 and 1.0 that indicates the horizontal direction of the gaze. The extreme right is 0.0, the center is 0.5 and the extreme left is 1.0.

Vertical direction of the gaze

ratio = gaze.vertical_ratio()

Returns a number between 0.0 and 1.0 that indicates the vertical direction of the gaze. The extreme top is 0.0, the center is 0.5 and the extreme bottom is 1.0.
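
As an illustration, the two ratios can be turned into a rough estimate of where the user is looking on the screen. This is only a sketch, not part of the library: the screen resolution is a placeholder, the linear mapping assumes no calibration, the ratios may be None when the pupils were not located (assumption), and the horizontal axis has to be mirrored because 0.0 corresponds to the extreme right.

import cv2
from gaze_tracking import GazeTracking

SCREEN_WIDTH, SCREEN_HEIGHT = 1920, 1080  # placeholder screen resolution

gaze = GazeTracking()
webcam = cv2.VideoCapture(0)

while True:
    _, frame = webcam.read()
    gaze.refresh(frame)

    h_ratio = gaze.horizontal_ratio()
    v_ratio = gaze.vertical_ratio()

    # Assumption: the ratios may be None when the pupils were not located
    if h_ratio is not None and v_ratio is not None:
        # Mirror the horizontal axis: 0.0 is the extreme right, 1.0 the extreme left
        screen_x = int((1.0 - h_ratio) * SCREEN_WIDTH)
        screen_y = int(v_ratio * SCREEN_HEIGHT)
        print("Estimated gaze point:", (screen_x, screen_y))

    cv2.imshow("Gaze point", gaze.annotated_frame())
    if cv2.waitKey(1) == 27:  # Esc to quit
        break

webcam.release()
cv2.destroyAllWindows()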

Blinking

gaze.is_blinking()

Returns True if the user's eyes are closed.
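
For example, a simple blink counter can be built on top of this flag by counting transitions from eyes open to eyes closed. This is only a sketch that mirrors the Simple Demo loop; the blink threshold itself is internal to the library.

import cv2
from gaze_tracking import GazeTracking

gaze = GazeTracking()
webcam = cv2.VideoCapture(0)

blink_count = 0
was_blinking = False

while True:
    _, frame = webcam.read()
    gaze.refresh(frame)

    # Count only the transition from eyes open to eyes closed
    blinking = gaze.is_blinking()
    if blinking and not was_blinking:
        blink_count += 1
    was_blinking = bool(blinking)

    cv2.imshow("Blink counter", gaze.annotated_frame())
    if cv2.waitKey(1) == 27:  # Esc to quit
        break

webcam.release()
cv2.destroyAllWindows()

print("Total blinks:", blink_count)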

Webcam frame

frame = gaze.annotated_frame()

Returns the main frame with pupils highlighted.

You want to help?

Your suggestions, bug reports and pull requests are welcome and appreciated. You can also star ⭐️ the project!

If the detection of your pupils is not completely optimal, you can send me a video sample of yourself looking in different directions. I will use it to improve the algorithm.

Licensing

This project is released by Antoine Lamé under the terms of the MIT Open Source License. View LICENSE for more information.
