Convolutional neural network based emotion classification using electrodermal activity signals and time-frequency features
Published in Expert Systems with Applications (Elsevier BV)
Volume: 159

In this work, an attempt has been made to classify emotional states using Electrodermal Activity (EDA) signals and features learned by a Convolutional Neural Network (CNN). The EDA signals are obtained from the publicly available DEAP database and are decomposed into tonic and phasic components. The phasic component is subjected to the short-time Fourier transform. Thirty-eight time-, frequency-, and time–frequency-domain features are extracted from the phasic signal. These extracted features are applied to a CNN to learn robust and prominent features. Five machine learning algorithms, namely linear discriminant analysis, multilayer perceptron, support vector machine, decision tree, and extreme learning machine, are used for the classification. The results show that the proposed approach is able to classify the emotional states along the arousal and valence dimensions. Classification using CNN-learned features is found to be better than classification using the conventional features. The trained end-to-end CNN model is found to be accurate (F-measure = 79.30% and 71.41% for the arousal and valence dimensions, respectively) in classifying various emotional states. The proposed method is found to be robust in handling the dynamic variation of EDA signals across different emotional states, and the results show that it outperforms most of the state-of-the-art methods. Thus, the proposed method could be beneficial in analyzing various emotional states in both normal and clinical conditions.
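The preprocessing pipeline described above (tonic/phasic decomposition followed by a short-time Fourier transform of the phasic component) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the paper does not specify its decomposition algorithm, so a simple median-filter tonic estimate is used here as a hypothetical stand-in for dedicated methods such as cvxEDA, and the 128 Hz sampling rate and synthetic signal are assumptions for the example.

```python
import numpy as np
from scipy.signal import medfilt, stft

fs = 128  # assumed sampling rate (DEAP signals are commonly downsampled to 128 Hz)
t = np.arange(0, 10, 1 / fs)
# Synthetic EDA-like signal: slow drift plus noise (placeholder for real data)
eda = 2.0 + 0.5 * np.sin(2 * np.pi * 0.05 * t) + 0.1 * np.random.randn(t.size)

# Crude tonic estimate via median filtering over a ~4 s window
# (a stand-in for dedicated decomposition methods, not the paper's method);
# the phasic component is the residual.
kernel = int(4 * fs) + 1  # medfilt requires an odd kernel length
tonic = medfilt(eda, kernel_size=kernel)
phasic = eda - tonic

# Short-time Fourier transform of the phasic component; the magnitude
# spectrogram is the kind of time-frequency representation features
# can be extracted from.
freqs, times, Z = stft(phasic, fs=fs, nperseg=fs)
spectrogram = np.abs(Z)
print(spectrogram.shape)  # (frequency bins, time frames)
```

From a representation like `spectrogram`, time-, frequency-, and time–frequency-domain statistics can then be computed and fed to a classifier.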

About the journal
Journal: Expert Systems with Applications
Publisher: Elsevier BV
Open Access: No