Releases: alireza787b/Python-Gaze-Face-Tracker

Version 1.3: Enhanced Head Orientation and Gaze Visualization

28 Nov 17:28
271d79b
Version 1.3 of our Python Gaze and Face Tracker features significant improvements in head orientation accuracy and gaze direction visualization. This release also includes refined parameter definitions and updated documentation to improve user experience and customization.

What's New in 1.3

  • Improved Head Orientation and Positioning: The accuracy of head pose estimation has been significantly enhanced. This improvement allows for more precise tracking of the user's head movements, crucial for applications in HCI, gaming, and accessibility tools.

  • Enhanced Gaze Direction Visualization: We've introduced an advanced visualization of gaze direction. This feature provides a more intuitive understanding of the user's focus and attention direction, making it an invaluable tool for user interaction studies and interface design.

  • Refined Parameter Definitions: Parameters have been redefined and organized for better clarity and ease of customization. Users can now more easily adjust settings to suit their specific requirements.

  • Updated Documentation: The documentation has been thoroughly updated to reflect the new features and changes. It now includes a detailed explanation of interactive commands, making the tool more user-friendly and accessible.
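The gaze direction overlay described above boils down to projecting a pitch/yaw direction onto the 2D frame as a line from the iris center. The sketch below illustrates that projection; the sign conventions and the 80-pixel scale are illustrative assumptions, not the repository's exact values.

```python
import math

def gaze_endpoint(iris_center, pitch_deg, yaw_deg, length=80):
    """Project a gaze direction (pitch/yaw in degrees) onto the frame
    as the 2D endpoint of a line starting at the iris center.

    Assumptions: positive yaw looks right, positive pitch looks up,
    and `length` pixels is an arbitrary visualization scale.
    """
    x, y = iris_center
    dx = length * math.sin(math.radians(yaw_deg))
    dy = -length * math.sin(math.radians(pitch_deg))  # image y grows downward
    return (int(round(x + dx)), int(round(y + dy)))

# Looking 30 degrees to the right from the frame center:
end = gaze_endpoint((320, 240), pitch_deg=0.0, yaw_deg=30.0)
```

In the tracker itself, this endpoint would be drawn with something like `cv2.line(frame, iris_center, end, color, thickness)` on each frame.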

Interactive Commands

  • 'c' Key: Calibrate Head Pose - Recalibrates the head pose estimation to the current orientation.
  • 'r' Key: Start/Stop Recording - Toggles the recording of data.
  • 'q' Key: Quit Program - Exits the program.
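The three keys above can be handled with a small dispatch function driven by OpenCV's `cv2.waitKey`. This is a minimal sketch; the state-flag names are illustrative, not the repository's actual variables.

```python
def handle_key(key, state):
    """Dispatch the interactive keys: 'c' recalibrates head pose,
    'r' toggles recording, 'q' quits. `state` is a dict holding
    'recording' and 'calibrate_requested' flags (illustrative names).
    Returns False when the main loop should exit.
    """
    if key == ord('c'):
        state['calibrate_requested'] = True        # re-zero head pose
    elif key == ord('r'):
        state['recording'] = not state['recording']  # toggle data logging
    elif key == ord('q'):
        return False                               # quit program
    return True

state = {"recording": False, "calibrate_requested": False}
handle_key(ord('r'), state)  # toggles recording on
```

In the main loop this would be wired up as `key = cv2.waitKey(1) & 0xFF`, breaking out when `handle_key(key, state)` returns False.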

Acknowledgements

Special thanks to Asadullah-Dal17 for contributing to many parts of this code.

Get Started

To get started with Version 1.3, visit our GitHub repository and follow the installation and usage instructions, or watch the YouTube tutorial.


For any questions or issues, please open an issue on GitHub.

v1.2: Enhanced Head Pose Estimation and Facial Landmark Visualization

24 Nov 02:55
63b3dda

Python-Gaze-Face-Tracker v1.2 introduces significant enhancements to our advanced real-time eye-tracking and facial landmark detection system. This version adds head pose estimation and more visually informative tracking capabilities.

What's New in v1.2:

Real-Time Head Pose Estimation: Dive into the next level of interaction with our real-time head pose estimation. Now, you can accurately determine the roll, pitch, and yaw of the user's head with live feedback.
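Roll, pitch, and yaw are typically recovered by converting the head's rotation matrix (e.g. the result of `cv2.Rodrigues` applied to the rvec from `cv2.solvePnP`) into Euler angles. The sketch below uses one common axis convention; the repository's actual ordering may differ.

```python
import math

def rotation_to_euler(R):
    """Convert a 3x3 rotation matrix (nested lists) to roll, pitch,
    yaw in degrees. Uses the X-Y-Z (roll-pitch-yaw) convention; this
    is an illustrative sketch, not necessarily the project's exact code.
    """
    sy = math.sqrt(R[0][0] ** 2 + R[1][0] ** 2)
    if sy > 1e-6:  # normal case
        roll = math.atan2(R[2][1], R[2][2])
        pitch = math.atan2(-R[2][0], sy)
        yaw = math.atan2(R[1][0], R[0][0])
    else:          # gimbal lock: pitch near +/-90 degrees
        roll = math.atan2(-R[1][2], R[1][1])
        pitch = math.atan2(-R[2][0], sy)
        yaw = 0.0
    return tuple(math.degrees(a) for a in (roll, pitch, yaw))

# A pure 90-degree yaw (rotation about the vertical axis):
angles = rotation_to_euler([[0, -1, 0], [1, 0, 0], [0, 0, 1]])
```

Feeding the live rotation matrix through this conversion each frame yields the roll/pitch/yaw readout described above.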

Custom Real-Time Facial Landmark Visualization: Whether you are a developer or simply curious, you can easily explore the capabilities of MediaPipe using the mediapipe_landmarks_test.py script. This tool lets you visualize and track each of the MediaPipe facial landmark indices in real time, providing a hands-on way to identify the most relevant landmarks for your projects and observe them directly in the video feed.

Enhanced Documentation and Guides: I have improved the documentation, making it easier for you to get started and integrate the tracker into your projects.

Bug Fixes: I have fixed several bugs and made minor improvements.

Use Cases: The Python-Gaze-Face-Tracker v1.2 continues to be an excellent tool for applications in aviation, human-computer interaction (HCI), augmented reality (AR), and research. Whether you're developing interactive technologies or conducting advanced studies, this tool is designed to meet your needs.

Note: This release is intended for educational and research purposes and is particularly suited for applications in aviation, HCI, AR, and similar fields.

Get started with Python-Gaze-Face-Tracker v1.2 today and take your project to new heights!

Current commit ID: 63b3dda

v1.1 Enhanced with Blink Detection & Stability Improvements

15 Nov 04:03

Release 1.1: Blink Detection Integration and Stability Enhancements

New Features

  • Blink Detection Integration: Collaboratively developed with Asadullah Dal, this release introduces blink detection capabilities. The feature allows for the real-time counting and logging of blinks, enhancing the application's utility in various research and practical domains.
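Blink detection of this kind is commonly based on distances between eyelid landmarks. The sketch below uses the widely known eye aspect ratio (EAR), which may differ in detail from the implementation in this release; the threshold and landmark ordering are illustrative assumptions.

```python
import math

def eye_aspect_ratio(eye):
    """Eye aspect ratio from six (x, y) eye-contour points ordered
    p1..p6: corner, upper lid (x2), corner, lower lid (x2). The value
    drops sharply when the eye closes; a blink is counted when EAR
    stays below a threshold for a few consecutive frames.
    """
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    # (vertical opening 1 + vertical opening 2) / (2 * horizontal width)
    return (dist(eye[1], eye[5]) + dist(eye[2], eye[4])) / (2.0 * dist(eye[0], eye[3]))

BLINK_THRESHOLD = 0.21  # illustrative; real values depend on the camera and face
open_eye = [(0, 0), (2, 3), (4, 3), (6, 0), (4, -3), (2, -3)]
ear_open = eye_aspect_ratio(open_eye)
```

With live MediaPipe landmarks, the six points would be the corresponding eye-contour indices, and the blink counter increments on each below-threshold dip.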

Improvements

  • Stability and performance improvements in landmark detection.
  • Enhanced accuracy and responsiveness in real-time tracking.

Bug Fixes

  • Various bug fixes to improve data logging consistency and application stability.

Documentation

For detailed installation and usage instructions, please refer to the README.

Acknowledgements

Special acknowledgment to Asadullah Dal for his significant contributions to the blink detection feature. His expertise has greatly enriched the functionality of the Python-Gaze-Face-Tracker.


Feedback and contributions are always appreciated to further improve and develop Python-Gaze-Face-Tracker. Stay tuned for future updates and enhancements.

v1.0

14 Nov 09:35
5ad4a6d

Python-Gaze-Face-Tracker

Advanced Real-Time Eye and Facial Landmark Tracking System


Description

Python-Gaze-Face-Tracker is a Python application for advanced real-time eye tracking, facial landmark detection, and head tracking, built on OpenCV and MediaPipe. Specializing in uncalibrated gaze tracking, it is easy to use, visualizes iris positions, and offers robust logging of both eye and facial landmark data. Because it can also transmit iris and gaze information over UDP sockets, it suits a wide range of applications, including aviation, human-computer interaction (HCI), and augmented reality (AR). The combination of detailed eye movement analysis and head tracking makes it a comprehensive package for gaze tracking and facial feature analysis in interactive technology applications.


Features

  • Real-Time Eye Tracking: Tracks and visualizes iris and eye corner positions in real-time using webcam input.
  • Facial Landmark Detection: Detects and displays up to 468 facial landmarks.
  • Data Logging: Records tracking data to CSV files, including timestamps, eye positions, and optional facial landmark data. Note: Enabling logging of all 468 facial landmarks can result in large log files.
  • Socket Communication: Supports transmitting only iris tracking data via UDP sockets for integration with other systems or applications.
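The UDP transmission feature amounts to serializing the iris coordinates each frame and sending them as a datagram. The sketch below uses a JSON payload over the loopback interface; the wire format is an assumption for illustration and may differ from the project's actual encoding.

```python
import json
import socket

def send_iris_data(sock, addr, left_iris, right_iris):
    """Send iris center coordinates over UDP as a JSON datagram.

    The payload shape here is an illustrative assumption, not
    necessarily the project's actual wire format.
    """
    payload = json.dumps({"left": left_iris, "right": right_iris})
    sock.sendto(payload.encode("utf-8"), addr)

# Loopback demonstration: a receiver bound to an ephemeral port.
recv_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
recv_sock.bind(("127.0.0.1", 0))
send_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_iris_data(send_sock, recv_sock.getsockname(), [312.5, 240.1], [352.8, 241.0])
data, _ = recv_sock.recvfrom(1024)
message = json.loads(data)
recv_sock.close()
send_sock.close()
```

A downstream application would simply bind a UDP socket on the agreed port and decode each datagram as it arrives.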

Requirements

  • Python 3.x
  • OpenCV (opencv-python)
  • MediaPipe (mediapipe)
  • Other Python standard libraries: math, socket, argparse, time, csv, datetime, os

Note

The Python-Gaze-Face-Tracker is intended for educational and research purposes and is particularly suited for applications in aviation, HCI, AR, and similar fields.