Abstract
A multivariate pattern-classification system was developed for the study of facial electromyographic (EMG) patterning in 12 female subjects during affect-laden imagery and posed facial expressions. A parameter-extraction procedure identified the dynamic EMG signal properties that afforded the maximal degree of self-reported emotion discrimination. Discriminant analyses on trialwise EMG vectors allowed assessment of the specific EMG-site conformations typifying rated emotions of happiness, sadness, anger, and fear. The discriminability among emotion-specific EMG conformations was correlated with subjective ratings of affective-imagery vividness and duration. Evidence was obtained suggesting that the EMG patterns encoded complex, "blended" reported affective states during the imagery. Classification analyses produced point-predictions of reported emotional states in 10 of the 12 subjects, providing the first computer pattern recognition of self-reported emotion from psychophysiological responses.
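The trialwise classification idea described above can be illustrated with a minimal sketch. This is not the paper's actual discriminant procedure; the four EMG-site feature vectors, class means, and the nearest-centroid rule are all invented stand-ins, used only to show how a trial vector can be assigned to one of four emotion categories.

```python
import numpy as np

# Illustrative stand-in: 4 hypothetical EMG recording sites, 4 rated emotions.
# All numbers are synthetic; this does not reproduce the published analysis.
rng = np.random.default_rng(0)
emotions = ["happiness", "sadness", "anger", "fear"]

# One hypothetical 4-site "EMG conformation" (class mean) per emotion.
means = {
    "happiness": np.array([2.0, 0.2, 0.3, 0.2]),
    "sadness":   np.array([0.3, 1.8, 0.4, 0.3]),
    "anger":     np.array([0.4, 0.3, 2.1, 0.4]),
    "fear":      np.array([0.2, 0.4, 0.3, 1.9]),
}

# Generate labeled training trials and compute per-class centroids.
X, y = [], []
for label, mu in means.items():
    X.append(mu + 0.3 * rng.standard_normal((20, 4)))
    y += [label] * 20
X = np.vstack(X)
centroids = {
    label: X[[i for i, lab in enumerate(y) if lab == label]].mean(axis=0)
    for label in emotions
}

def classify(trial):
    """Assign a trialwise EMG feature vector to the nearest class centroid."""
    return min(centroids, key=lambda lab: np.linalg.norm(trial - centroids[lab]))

print(classify(np.array([2.1, 0.1, 0.2, 0.3])))  # lies near the 'happiness' centroid
```

A full discriminant analysis would additionally weight the sites by within-class covariance (as in Fisher's linear discriminant); the nearest-centroid rule is the simplest special case.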
| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 622-637 |
| Number of pages | 16 |
| Journal | Psychophysiology |
| Volume | 21 |
| Issue number | 6 |
| DOIs | |
| State | Published - Nov 1984 |
| Externally published | Yes |
Keywords
- Electromyography
- Electrophysiological recording
- Emotion
- Emotional disorders
- Facial expression
- Facial muscles
- Facial nerve
- Factor analysis
- Multivariate statistical analysis
- Nonverbal communication
ASJC Scopus subject areas
- General Neuroscience
- Neuropsychology and Physiological Psychology
- Experimental and Cognitive Psychology
- Neurology
- Endocrine and Autonomic Systems
- Developmental Neuroscience
- Cognitive Neuroscience
- Biological Psychiatry