Unsupervised Human Fatigue Expression Discovery via Time Series Chain
Date
2024-04-18
Authors
Haroon, Adam
Carlyon, William
Cantu, Frida
Ackaah-Gyasi, Kofi Nketia
Zhang, Li
Reza, Md Alimoor
Abstract
Fatigue is a multifaceted human condition involving both psychological and physiological aspects. It is characterized by a diminished capacity to perform tasks effectively, potentially resulting in negative emotional states, errors in passive or active tasks, and even medical emergencies. There is growing interest in, and practical value to, continuous monitoring that identifies fatigue during extended work periods. Despite the importance of fatigue detection, building a model is very challenging in practice due to limited data and a diverse set of sensor modalities. In this paper, we propose an unsupervised pipeline for fatigue detection from video streams in a realistic, unconstrained environment. Specifically, we propose an effective fatigue expression discovery framework that first extracts key landmark points (e.g., shoulder joint, mouth) from streaming video data, then identifies evolving behavior patterns with the time series chain, an effective high-order time series primitive, to discover precursors of potential human fatigue. To demonstrate the effectiveness of our proposed framework, we show that it can detect signs of fatigue in video data captured in real-world fatigue scenarios.
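The chain-discovery step described above can be illustrated with a minimal, self-contained Python sketch. This is not the authors' implementation: brute-force left/right nearest-neighbor indices stand in for the left/right matrix profile typically used in practice, and a synthetic chirp signal is a hypothetical stand-in for an extracted landmark trajectory (e.g., a shoulder-joint coordinate over time). A time series chain links subsequences bidirectionally, so each hop j → IR[j] must also satisfy IL[IR[j]] == j.

```python
import numpy as np

def znorm(x):
    """Z-normalize a subsequence (constant subsequences are just centered)."""
    s = x.std()
    return (x - x.mean()) / s if s > 0 else x - x.mean()

def lr_nn_indices(T, m):
    """Brute-force left/right nearest-neighbor indices for all length-m
    subsequences of T (a stand-in for the left/right matrix profile)."""
    n = len(T) - m + 1
    subs = np.array([znorm(T[i:i + m]) for i in range(n)])
    excl = max(1, m // 2)  # trivial-match exclusion zone around each index
    IL = np.full(n, -1)
    IR = np.full(n, -1)
    for i in range(n):
        d = np.linalg.norm(subs - subs[i], axis=1)
        d[max(0, i - excl + 1):min(n, i + excl)] = np.inf
        if i > 0 and np.isfinite(d[:i]).any():
            IL[i] = int(np.argmin(d[:i]))
        if i < n - 1 and np.isfinite(d[i + 1:]).any():
            IR[i] = i + 1 + int(np.argmin(d[i + 1:]))
    return IL, IR

def unanchored_chain(IL, IR):
    """Longest chain of bidirectionally linked subsequences:
    each hop j -> IR[j] must satisfy IL[IR[j]] == j."""
    best = []
    for start in range(len(IL)):
        chain, j = [start], start
        while IR[j] != -1 and IL[IR[j]] == j:
            j = IR[j]
            chain.append(j)
        if len(chain) > len(best):
            best = chain
    return best

# Synthetic signal whose motif shape drifts over time (a chirp), standing in
# for one landmark coordinate extracted from video.
t = np.linspace(0, 10, 300)
T = np.sin(2 * np.pi * (0.5 * t + 0.05 * t ** 2))

IL, IR = lr_nn_indices(T, m=10)
chain = unanchored_chain(IL, IR)
print(chain)  # temporally ordered indices of the gradually evolving pattern
```

The chain's members appear in temporal order, and the gradual drift between consecutive members is what makes chains a natural primitive for precursor discovery: a long chain ending near the present flags a behavior pattern (e.g., a slowly widening mouth opening) that has been steadily evolving.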
Type
Presentation
Comments
Accepted to DS2-MH workshop at SIAM SDM 2024. Haroon Adam, Carlyon William, Cantu Frida, Ackaah-Gyasi Kofi Nketia, Zhang Li, Reza Md Alimoor. “Unsupervised Human Fatigue Expression Discovery via Time Series Chain.” Data Science for Smart Manufacturing and Healthcare Workshop 2024. https://dssmh.github.io/.
Rights Statement
Copyright 2024 The authors. This work is licensed under CC BY 4.0.