Research output: Contribution to journal › Article › peer-review
Modern mental fatigue detection methods rely on many evaluation parameters; for example, many researchers use subjective self-assessment or driving parameters to assess this condition. Developing a method for detecting the functional state of mental fatigue is therefore an important task. Although human operator support systems are becoming increasingly widespread, there is currently no open-source solution that monitors this state from eye movements in real time and with high accuracy. Such a method could prevent many potentially hazardous situations and accidents in critical industries (nuclear power stations, transport systems, and air traffic control). This paper describes a method for mental fatigue detection based on human eye movements. The research builds on a previously developed dataset of eye-tracking recordings captured from human operators performing different tasks throughout the day. Within the developed method, we propose a technique for determining the gaze characteristics most relevant to mental fatigue detection. The method applies the following machine learning techniques for human state classification: random forest, decision tree, and multilayer perceptron. The experimental results showed that the most relevant characteristics are the average velocity within the fixation area, the average and minimum curvature of the gaze trajectory, the minimum saccade length, the percentage of fixations shorter than 150 ms, and the proportion of time spent in fixations shorter than 150 ms. Eye movement data are processed by the proposed method in real time, with the highest accuracy (0.85) and F1-score (0.80) achieved by the random forest classifier.
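To make the feature-plus-classifier pipeline summarized above concrete, the sketch below computes fixation-based gaze features similar to those listed in the abstract and compares the three classifier types named in the paper. This is a minimal sketch, not the authors' implementation: the synthetic data, function names, and the exact feature definitions (e.g., how fixation velocity and trajectory curvature are computed) are assumptions made for illustration only.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, f1_score

def gaze_features(durations_ms, fixation_velocities, saccade_lengths, curvatures):
    """Feature vector loosely modelled on the characteristics named in the
    abstract; the paper's exact definitions may differ."""
    durations_ms = np.asarray(durations_ms, dtype=float)
    short = durations_ms < 150.0                         # fixations shorter than 150 ms
    return np.array([
        np.mean(fixation_velocities),                    # average velocity within fixations
        np.mean(curvatures),                             # average gaze-trajectory curvature
        np.min(curvatures),                              # minimum gaze-trajectory curvature
        np.min(saccade_lengths),                         # minimum saccade length
        short.mean(),                                    # share of fixations < 150 ms
        durations_ms[short].sum() / durations_ms.sum(),  # time share of fixations < 150 ms
    ])

# Synthetic stand-in for labelled recordings (0 = alert, 1 = fatigued); a real
# pipeline would compute the features from eye-tracker fixation/saccade events.
rng = np.random.default_rng(0)

def synthetic_recording(fatigued):
    n = 50  # fixations per recording window
    shift = 0.3 if fatigued else 0.0
    return gaze_features(
        durations_ms=rng.gamma(2.0, 120.0 - 60.0 * shift, n),
        fixation_velocities=rng.normal(1.0 + shift, 0.2, n),
        saccade_lengths=rng.gamma(2.0, 1.0, n),
        curvatures=rng.normal(0.5 + shift, 0.1, n),
    )

y = rng.integers(0, 2, size=400)
X = np.array([synthetic_recording(label) for label in y])
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

for name, clf in [
    ("random forest", RandomForestClassifier(random_state=0)),
    ("decision tree", DecisionTreeClassifier(random_state=0)),
    ("multilayer perceptron", MLPClassifier(max_iter=2000, random_state=0)),
]:
    clf.fit(X_tr, y_tr)
    pred = clf.predict(X_te)
    print(f"{name}: accuracy={accuracy_score(y_te, pred):.2f}, "
          f"F1={f1_score(y_te, pred):.2f}")
```

On this synthetic data the scores are meaningless; the comparison loop only illustrates how the three classifiers mentioned in the abstract could be evaluated with the same accuracy and F1 metrics reported in the paper.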
Original language | English |
---|---|
Article number | 6805 |
Number of pages | 12 |
Journal | Sensors |
Volume | 24 |
Issue number | 21 |
DOIs | |
State | Published - 23 Oct 2024 |