# 🔬 Eye Tracking System with Research Features
A high-precision eye-tracking system designed for research, academic studies, and human-computer interaction applications.
- High-precision pupil tracking with sub-pixel accuracy
- Calibration system with 9-point calibration
- Real-time quality assessment and monitoring
- Multi-modal sensor fusion for enhanced accuracy
- Attention span analysis and processing speed assessment
- Data logging with multiple export formats
- Real-time analytics dashboard with charts
- Session management and annotation capabilities
- Quality monitoring and statistical analysis
- 9-point calibration system for high accuracy
- Real-time calibration quality assessment
- Adaptive calibration based on user performance
- Calibration validation and feedback
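The 3×3 target layout and a per-point quality check can be sketched as follows; `calibration_points` and `point_quality` are illustrative helpers, not the project's actual API:

```python
import itertools
import math

def calibration_points(width, height, margin=0.1):
    """Return the 9 calibration targets (pixel coordinates) on a 3x3 grid
    inset by `margin` of the screen size from each edge."""
    xs = [margin, 0.5, 1.0 - margin]
    ys = [margin, 0.5, 1.0 - margin]
    return [(int(x * width), int(y * height)) for y, x in itertools.product(ys, xs)]

def point_quality(target, gaze_samples, max_error=100.0):
    """Score one calibration point in [0, 1]: 1.0 means zero mean gaze error,
    0.0 means the mean error is at least `max_error` pixels."""
    errors = [math.hypot(gx - target[0], gy - target[1]) for gx, gy in gaze_samples]
    mean_error = sum(errors) / len(errors)
    return max(0.0, 1.0 - mean_error / max_error)
```

Validation can then re-show each target, collect a short burst of gaze samples, and flag any point whose score falls below the acceptance threshold for re-calibration.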
- Clone the repository:

```bash
git clone <repository-url>
cd face_eye_tracker
```

- Install dependencies:

```bash
pip install -r requirements.txt
```

- Download the MediaPipe model:

```bash
# Download face_landmarker.task from:
# https://storage.googleapis.com/mediapipe-models/face_landmarker/face_landmarker/float16/1/face_landmarker.task
# Place it in the face_eye_tracker directory
```

Launch the application with your preferred interface:

```bash
# Research UI
python run_eye_tracker.py --ui research
# Modern UI
python run_eye_tracker.py --ui modern
# Simple UI
python run_eye_tracker.py --ui simple
# Full Feature UI
python run_eye_tracker.py --ui comprehensive
# Headless mode
python run_eye_tracker.py --ui headless
```

- Start Calibration: Click "Start Calibration" in the research interface
- Follow Points: Look at each of the 9 calibration points as prompted
- Quality Assessment: Monitor calibration quality in real-time
- Complete: Finish calibration when all points are calibrated
- Start Session: Begin data collection with "Start Research Session"
- Monitor Metrics: Watch real-time research metrics and quality indicators
- Add Annotations: Mark significant events or conditions during the session
- Export Data: Export research data for analysis
The system supports multiple export formats:
- JSON: Complete session data with metadata
- CSV: Tabular data for statistical analysis
- Excel: Multi-sheet workbook with data
- Pickle: Python-compatible data format
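The JSON and CSV exports can be sketched with the standard library alone; the file names and field layout below are illustrative, not the logger's actual output schema:

```python
import csv
import json
import tempfile
from pathlib import Path

def export_session(samples, metadata, out_dir):
    """Write a session to JSON (samples plus metadata) and CSV (samples only).
    `samples` is a list of dicts that all share the same keys."""
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    (out / "session.json").write_text(
        json.dumps({"metadata": metadata, "samples": samples}, indent=2))
    with open(out / "session.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=list(samples[0]))
        writer.writeheader()
        writer.writerows(samples)
    return out

# Example: two gaze samples with timestamps and a minimal metadata header
demo = export_session(
    [{"t": 0.0, "x": 640, "y": 360}, {"t": 0.016, "x": 642, "y": 358}],
    {"session_id": "demo", "fps": 60},
    tempfile.mkdtemp())
```

The CSV file is the natural input for statistical tools, while the JSON file keeps the session metadata and samples together in one self-describing document.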
- Pupil Position: High-precision pupil center coordinates
- Gaze Point: Estimated gaze position on screen
- Pupil Diameter: Pupil size measurements
- Eye Velocity: Movement speed and patterns
- Fixation Duration: Time spent looking at specific areas
- Blink Pattern Analysis: Blink rate, duration, and patterns
- Eye Openness: Continuous monitoring of eye openness
- Head Pose: Head position and movement analysis
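Two of these metrics can be derived directly from a stream of timestamped gaze points. The sketch below uses a simplified dispersion-threshold (I-DT) fixation detector; the thresholds and function names are illustrative:

```python
import math

def eye_velocity(gaze, timestamps):
    """Per-sample gaze speed in pixels/second from consecutive gaze points."""
    v = []
    for (x0, y0), (x1, y1), t0, t1 in zip(gaze, gaze[1:], timestamps, timestamps[1:]):
        v.append(math.hypot(x1 - x0, y1 - y0) / (t1 - t0))
    return v

def fixations(gaze, timestamps, max_dispersion=25.0, min_duration=0.1):
    """Dispersion-threshold fixation detection: a fixation is a run of samples
    whose x- plus y-spread stays within `max_dispersion` pixels for at least
    `min_duration` seconds. Returns (start_time, duration) pairs."""
    out, start = [], 0
    for end in range(1, len(gaze) + 1):
        xs = [p[0] for p in gaze[start:end]]
        ys = [p[1] for p in gaze[start:end]]
        if (max(xs) - min(xs)) + (max(ys) - min(ys)) > max_dispersion:
            dur = timestamps[end - 2] - timestamps[start]
            if dur >= min_duration:
                out.append((timestamps[start], dur))
            start = end - 1
    dur = timestamps[-1] - timestamps[start]
    if dur >= min_duration:
        out.append((timestamps[start], dur))
    return out
```

A stationary gaze followed by a jump to a new screen region would yield two fixations separated by a single high-velocity saccade sample.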
- Tracking Quality: Overall system performance
- Calibration Quality: Calibration accuracy assessment
- Face Detection: Face detection confidence
- Pupil Tracking: Pupil detection reliability
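One simple way to fold these sub-scores into a single tracking-quality number is a weighted mean; the weights below are an illustrative assumption, not the system's actual tuning:

```python
def tracking_quality(face_conf, pupil_conf, calib_quality,
                     weights=(0.3, 0.4, 0.3)):
    """Overall quality in [0, 1] as a weighted mean of face-detection
    confidence, pupil-tracking reliability, and calibration quality."""
    wf, wp, wc = weights
    return round(wf * face_conf + wp * pupil_conf + wc * calib_quality, 3)
```

Because each input is already in [0, 1] and the weights sum to 1, the combined score stays in [0, 1] and can be compared directly against a minimum-quality threshold.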
- High-precision pupil tracking
- Real-time quality monitoring
- Research interface
- Calibration system
- Real-time analytics dashboard
- Session management tools
- Data collection
- Multiple export formats
- Real-time analysis
- Quality assessment
```
face_eye_tracker/
├── face-eye-tracker/
│   ├── utils/
│   │   ├── core/
│   │   │   ├── advanced_tracker.py   # Tracking engine
│   │   │   └── tracker.py            # Standard tracking engine
│   │   ├── research_data_logger.py   # Research data logging
│   │   └── data_logger.py            # Standard data logging
│   ├── ui/
│   │   ├── research_ui.py            # Research interface
│   │   ├── modern_ui.py              # Modern interface
│   │   ├── comprehensive_ui.py       # Full feature interface
│   │   ├── simple_ui.py              # Simple interface
│   │   └── headless_ui.py            # Headless interface
│   └── main.py                       # Main application
├── face_landmarker.task              # MediaPipe model
├── requirements.txt                  # Dependencies
└── run_eye_tracker.py                # Launcher script
```
- Optimized frame processing for minimal latency
- Efficient data structures for high-frequency updates
- Background analysis threads for non-blocking operation
- Smart buffering for smooth UI updates
- Real-time quality monitoring with automatic adjustments
- Adaptive thresholds based on environmental conditions
- Error handling and recovery mechanisms
- Performance metrics and optimization feedback
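The background-thread and buffering ideas above can be sketched with the standard library; `BackgroundAnalyzer` is a hypothetical helper, not a class from this repository:

```python
import queue
import threading

class BackgroundAnalyzer:
    """Run per-frame analysis on a worker thread so the capture loop never
    blocks. Frames go into a bounded queue; when the queue is full the oldest
    frame is dropped, since fresh data matters more than completeness."""

    def __init__(self, analyze, maxsize=8):
        self._q = queue.Queue(maxsize=maxsize)
        self._analyze = analyze
        self.results = []
        self._t = threading.Thread(target=self._run, daemon=True)
        self._t.start()

    def submit(self, frame):
        if self._q.full():
            try:
                self._q.get_nowait()  # drop the oldest buffered frame
            except queue.Empty:
                pass
        self._q.put(frame)

    def _run(self):
        while True:
            frame = self._q.get()
            if frame is None:  # sentinel: shut down cleanly
                break
            self.results.append(self._analyze(frame))

    def close(self):
        self._q.put(None)
        self._t.join()
```

The capture loop only pays the cost of a queue insert per frame, while the heavier analysis runs concurrently on the worker thread.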
- Resolution: 1280x720 (research mode), 640x480 (standard mode)
- Frame Rate: 60 FPS (research mode), 30 FPS (standard mode)
- Auto-focus: Enabled for stability
- Auto-exposure: Optimized for eye tracking
- Pupil Detection Confidence: 0.8 (high accuracy)
- Gaze Estimation Confidence: 0.7 (balanced accuracy)
- Quality Threshold: 0.7 (minimum acceptable quality)
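These settings could be gathered into a single configuration object; the dataclass below is a sketch that just mirrors the defaults listed above, not the project's actual config API:

```python
from dataclasses import dataclass

@dataclass
class TrackerConfig:
    # Camera settings (research-mode defaults from the values above)
    width: int = 1280
    height: int = 720
    fps: int = 60
    # Detection and quality thresholds
    pupil_confidence: float = 0.8
    gaze_confidence: float = 0.7
    quality_threshold: float = 0.7

    @classmethod
    def standard(cls):
        """Lower-resolution profile for standard (non-research) mode."""
        return cls(width=640, height=480, fps=30)
```

Keeping the thresholds in one place makes it easy to swap between the research and standard profiles without touching the tracking code.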
- Cognitive science studies
- Human-computer interaction research
- Attention and focus studies
- Driver monitoring systems
- Workplace safety assessment
- Educational technology research
- Healthcare monitoring
- Interface usability studies
- Attention pattern analysis
- User behavior research
- Extend AdvancedEyeTracker for new tracking capabilities
- Update ResearchEyeTrackerUI for new interface elements
- Enhance ResearchDataLogger for new data types
- Add new export formats as needed
- Modify tracking parameters in `advanced_tracker.py`
- Customize UI elements in `research_ui.py`
- Add new metrics in the data logging system
- Implement custom analysis algorithms
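An extension might subclass the tracker and enrich its per-frame metrics. The base class below is a minimal stand-in for the real `AdvancedEyeTracker` (reduced to one hypothetical hook), and `BlinkAwareTracker` is purely illustrative:

```python
class AdvancedEyeTracker:
    """Minimal stand-in for utils/core/advanced_tracker.py, reduced to the
    single per-frame hook this sketch overrides."""

    def analyze_frame(self, frame):
        return {"pupil_diameter": 3.1}

class BlinkAwareTracker(AdvancedEyeTracker):
    """Hypothetical extension: augments each frame's metric dict with a
    running blink count from a caller-supplied detector callable."""

    def __init__(self, is_blink):
        self._is_blink = is_blink
        self.blink_count = 0

    def analyze_frame(self, frame):
        metrics = super().analyze_frame(frame)
        if self._is_blink(frame):
            self.blink_count += 1
        metrics["blink_count"] = self.blink_count
        return metrics
```

Because the subclass only overrides one hook, the rest of the pipeline (logging, UI, export) can consume the extended metric dict unchanged.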
- Real-time analytics in the research interface
- Statistical summaries and trend analysis
- Quality assessment and validation
- Custom analysis capabilities
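A statistical summary with a simple moving-average trend can be sketched for any one exported metric; the window size and return format here are illustrative:

```python
import statistics

def summarize(values, window=5):
    """Summary statistics plus a moving-average trend for one metric.
    The trend at index i averages the last `window` values up to i."""
    trend = [statistics.fmean(values[max(0, i - window + 1): i + 1])
             for i in range(len(values))]
    return {
        "mean": statistics.fmean(values),
        "stdev": statistics.stdev(values) if len(values) > 1 else 0.0,
        "min": min(values),
        "max": max(values),
        "trend": trend,
    }
```

Running this over, say, a session's pupil-diameter column gives both the headline statistics and a smoothed series suitable for plotting.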
We welcome contributions to improve the system:
- Fork the repository
- Create a feature branch
- Implement improvements
- Add tests and documentation
- Submit a pull request
This project is licensed under the MIT License - see the LICENSE file for details.
- MediaPipe for face landmark detection
- OpenCV for computer vision capabilities
- Research community for feedback and improvements
For questions, issues, or feature requests:
- Create an issue on GitHub
- Check the documentation for common solutions
- Review the examples for usage patterns
## 🔬 Eye Tracking System: Accurate and Reliable
To run the research UI, execute the following command from the face_eye_tracker directory:
```bash
python3 face-eye-tracker/main.py --ui research
```

This will launch the research interface, which includes:
- Real-time data visualization
- Head pose (yaw and roll) display
The system now displays the head's yaw and roll angles in the research UI. These values are estimated from the face mesh and can be used to monitor the subject's head orientation during the session.
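A rough 2D approximation of roll and yaw from three face landmarks can be sketched as follows; this is a small-angle geometric proxy for illustration, not the face-mesh estimator the system actually uses:

```python
import math

def head_roll_yaw(left_eye, right_eye, nose_tip):
    """Rough head roll and yaw in degrees from three 2D landmarks.
    Roll: tilt of the line between the eyes.
    Yaw: horizontal nose offset from the eye midpoint, scaled by half the
    inter-eye distance (a crude small-angle proxy)."""
    dx = right_eye[0] - left_eye[0]
    dy = right_eye[1] - left_eye[1]
    roll = math.degrees(math.atan2(dy, dx))
    mid_x = (left_eye[0] + right_eye[0]) / 2
    eye_dist = math.hypot(dx, dy)
    offset = (nose_tip[0] - mid_x) / (eye_dist / 2)
    yaw = math.degrees(math.asin(max(-1.0, min(1.0, offset))))
    return roll, yaw
```

With the eyes level and the nose centered between them, both angles come out near zero; turning the head shifts the nose toward one eye and the yaw estimate grows accordingly.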
The accuracy of the eye tracking may be affected if the user is wearing glasses. The system may not perform optimally with all types of eyewear.