Version 1.3: Enhanced Head Orientation and Gaze Visualization


Version 1.3 of our Python Gaze and Face Tracker features significant improvements in head orientation accuracy and gaze direction visualization. This release also includes refined parameter definitions and updated documentation to improve usability and customization.

What's New in 1.3

  • Improved Head Orientation and Positioning: Head pose estimation is now significantly more accurate, allowing more precise tracking of the user's head movements, which is crucial for applications in HCI, gaming, and accessibility tools (a minimal sketch of the general approach follows this list).

  • Enhanced Gaze Direction Visualization: We've introduced an advanced visualization of gaze direction. This feature provides a more intuitive understanding of the user's focus and attention direction, making it an invaluable tool for user interaction studies and interface design.

  • Refined Parameter Definitions: Parameters have been redefined and organized for better clarity and ease of customization. Users can now more easily adjust settings to suit their specific requirements.

  • Updated Documentation: The documentation has been thoroughly updated to reflect the new features and changes. It now includes a detailed explanation of interactive commands, making the tool more user-friendly and accessible.
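
As a rough illustration of the general technique behind the head orientation and gaze visualization improvements (assuming a MediaPipe Face Mesh + OpenCV pipeline, which is common for this kind of tracker but not necessarily the exact code in this repository), the sketch below estimates head orientation with cv2.solvePnP against a generic 3D face model and draws an arrow showing where the head is pointing. The landmark indices, model points, and camera intrinsics are illustrative assumptions, not values taken from this project.

```python
import cv2
import mediapipe as mp
import numpy as np

# Generic 3D face model points in mm (nose tip, chin, eye corners, mouth corners).
# These are standard approximations, not this repository's calibration values.
MODEL_POINTS = np.array([
    (0.0, 0.0, 0.0),           # nose tip
    (0.0, -330.0, -65.0),      # chin
    (-225.0, 170.0, -135.0),   # left eye outer corner
    (225.0, 170.0, -135.0),    # right eye outer corner
    (-150.0, -150.0, -125.0),  # left mouth corner
    (150.0, -150.0, -125.0),   # right mouth corner
], dtype=np.float64)

# MediaPipe Face Mesh indices roughly matching the model points above (assumed).
LANDMARK_IDS = [1, 152, 33, 263, 61, 291]

mp_face_mesh = mp.solutions.face_mesh

def draw_head_direction(frame, face_landmarks):
    h, w = frame.shape[:2]
    image_points = np.array(
        [(face_landmarks.landmark[i].x * w, face_landmarks.landmark[i].y * h)
         for i in LANDMARK_IDS], dtype=np.float64)

    # Approximate intrinsics: focal length ~ frame width, principal point at center.
    camera_matrix = np.array([[w, 0, w / 2],
                              [0, w, h / 2],
                              [0, 0, 1]], dtype=np.float64)
    dist_coeffs = np.zeros((4, 1))

    ok, rvec, tvec = cv2.solvePnP(MODEL_POINTS, image_points,
                                  camera_matrix, dist_coeffs,
                                  flags=cv2.SOLVEPNP_ITERATIVE)
    if not ok:
        return frame

    # Project a point 1000 mm in front of the nose tip to get a direction arrow.
    nose_end_2d, _ = cv2.projectPoints(np.array([(0.0, 0.0, 1000.0)]),
                                       rvec, tvec, camera_matrix, dist_coeffs)
    p1 = tuple(map(int, image_points[0]))
    p2 = tuple(map(int, nose_end_2d.ravel()))
    cv2.arrowedLine(frame, p1, p2, (0, 255, 0), 2)
    return frame

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)
    with mp_face_mesh.FaceMesh(refine_landmarks=True) as face_mesh:
        while cap.isOpened():
            ret, frame = cap.read()
            if not ret:
                break
            results = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if results.multi_face_landmarks:
                frame = draw_head_direction(frame, results.multi_face_landmarks[0])
            cv2.imshow("head pose sketch", frame)
            if cv2.waitKey(1) & 0xFF == ord('q'):
                break
    cap.release()
    cv2.destroyAllWindows()
```

Projecting a point some distance in front of the nose tip and drawing a line to it is the standard way to turn the solvePnP rotation into an on-screen direction arrow; the same idea extends to per-eye gaze arrows.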

Interactive Commands

  • 'c' Key: Calibrate Head Pose - Recalibrates the head pose estimation to the current orientation.
  • 'r' Key: Start/Stop Recording - Toggles the recording of data.
  • 'q' Key: Quit Program - Exits the program.
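
These keys are typically handled in the main capture loop via OpenCV's waitKey. The minimal sketch below shows one way such dispatch might look; the handler functions here are placeholders for illustration, not the repository's actual API.

```python
import cv2

# Placeholder handlers; the real calibration and recording logic lives in the tracker.
def calibrate_head_pose():
    print("Head pose recalibrated to current orientation")

def toggle_recording(recording):
    print("Recording stopped" if recording else "Recording started")
    return not recording

cap = cv2.VideoCapture(0)
recording = False
while cap.isOpened():
    ret, frame = cap.read()
    if not ret:
        break
    cv2.imshow("tracker", frame)

    key = cv2.waitKey(1) & 0xFF
    if key == ord('c'):      # 'c': recalibrate head pose
        calibrate_head_pose()
    elif key == ord('r'):    # 'r': start/stop recording
        recording = toggle_recording(recording)
    elif key == ord('q'):    # 'q': quit
        break

cap.release()
cv2.destroyAllWindows()
```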

Acknowledgements

Special thanks to Asadullah-Dal17 for contributing to many parts of this code.

Get Started

To get started with Version 1.3, visit our GitHub repository and follow the installation and usage instructions, or watch the YouTube tutorial.


For any questions or issues, please open an issue on GitHub.