Multimodal Emotion Recognition for Seafarers: A Framework Integrating Improved D-S Theory and Calibration: A Case Study of a Real Navigation Experiment
Liu Yang, Junzhang Yang, [3 additional authors], Qing Liu
TLDR
A novel fusion framework for seafarer emotion recognition (ER) that compensates for the uncertainty of single models and improves recognition performance, offering a feasible path to ER for seafarers.
Abstract
Seafarers’ emotions influence work performance and can contribute to severe marine accidents. However, research on emotion recognition (ER) for seafarers remains insufficient: existing studies deploy only single models and disregard model uncertainty, which can lead to unreliable recognition. In this paper, a novel fusion framework for seafarer ER is proposed. First, feature-level fusion is performed on electroencephalogram (EEG) and navigation data collected in a real navigation environment. Second, calibration is employed to mitigate the uncertainty of the model outputs. Third, a weighted combination strategy for decision-level fusion is designed. Finally, a series of evaluations of the proposed model is conducted. The average recognition performance across the three emotional dimensions, measured by accuracy, precision, recall, and F1 score, reaches 85.14%, 84.43%, 86.27%, and 85.33%, respectively. The results demonstrate that physiological and navigation data can effectively identify seafarers’ emotional states. Moreover, the fusion model compensates for the uncertainty of single models and enhances ER performance, providing a feasible path to ER for seafarers. These findings can help shipping companies promptly identify seafarers’ emotional states, support early warnings in bridge systems, and inform policy-making on human factors to enhance maritime safety.
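The abstract does not detail the paper's improved D-S rule or its weight combination strategy, but the decision-level fusion it describes builds on classical Dempster-Shafer evidence combination. The sketch below shows the standard Dempster's rule only, under illustrative assumptions: two sources (`m_eeg`, `m_nav`, hypothetical names) each assign mass to emotion hypotheses, with residual mass on the full frame Θ representing model uncertainty such as calibration might quantify.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions via classical Dempster's rule.

    m1, m2: dicts mapping frozenset hypotheses to masses summing to 1.
    Returns the normalized combined mass function.
    """
    combined = {}
    conflict = 0.0  # total mass assigned to contradictory pairs
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb
    if conflict >= 1.0:
        raise ValueError("Total conflict: sources cannot be combined")
    norm = 1.0 - conflict  # renormalize by discarding conflicting mass
    return {h: m / norm for h, m in combined.items()}

# Illustrative example: two calibrated classifiers vote on one emotional
# dimension ("pos" vs "neg"), each keeping some mass on Θ (uncertainty).
POS, NEG = frozenset({"pos"}), frozenset({"neg"})
THETA = POS | NEG
m_eeg = {POS: 0.6, NEG: 0.2, THETA: 0.2}  # EEG-based model (assumed values)
m_nav = {POS: 0.5, NEG: 0.3, THETA: 0.2}  # navigation-data model (assumed)
fused = dempster_combine(m_eeg, m_nav)
# Agreement between sources sharpens the fused belief in "pos" while
# shrinking the uncertainty mass on Θ.
```

Note that the plain rule is known to behave poorly under high conflict between sources, which is precisely what weighted or "improved" D-S variants such as the paper's address.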
