From a face to its category via a few information processing states in the brain
Section snippets
Subjects
Subjects were four paid observers (BB, MB, ML and RZ) from Glasgow University, UK.
Stimuli
Face stimuli were computed from 256 × 256 pixel grey-scale pictures of 10 actors (5 males and 5 females), each displaying two facial expressions (neutral and happy). All photographs were taken under standardised conditions of illumination, and hairstyle was normalised across faces to eliminate this feature. Stimuli were presented on a light grey background at the centre of a computer monitor; viewing position was stabilised with a fixed chin rest
Behavioural information
Accuracy and reaction times (RT) in the GENDER and EXNEX tasks, respectively, were as follows:
- MB: 92% correct (RT: μ = 861 ms, σ = 271 ms) and 90% (RT: μ = 900 ms, σ = 330 ms);
- RZ: 85% (RT: μ = 709 ms, σ = 192 ms) and 90% (RT: μ = 732 ms, σ = 237 ms);
- ML: 88% (RT: μ = 790 ms, σ = 210 ms) and 87% (RT: μ = 730 ms, σ = 185 ms);
- BB: 81% (RT: μ = 846 ms, σ = 168 ms) and 80% (RT: μ = 760 ms, σ = 152 ms).
The classification image analysis revealed that observers' accuracy depended mostly on the presence of the two eyes and part of the mouth in GENDER and on the exclusive
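As a minimal illustration of how the summary statistics above are derived, per-task accuracy and the mean and standard deviation of correct-trial RTs can be computed from trial-level records. The trial data below are made up for illustration; the original analysis pipeline is not part of this excerpt.

```python
from statistics import mean, pstdev

# Hypothetical trial records: (task, correct, rt_ms)
trials = [
    ("GENDER", True, 850), ("GENDER", True, 872), ("GENDER", False, 910),
    ("EXNEX", True, 895), ("EXNEX", True, 905), ("EXNEX", False, 930),
]

def summarise(trials, task):
    """Return (accuracy, RT mean, RT std) for one task's trials."""
    task_trials = [t for t in trials if t[0] == task]
    rts = [t[2] for t in task_trials if t[1]]  # RTs of correct trials only
    accuracy = sum(t[1] for t in task_trials) / len(task_trials)
    return accuracy, mean(rts), pstdev(rts)

acc, mu, sigma = summarise(trials, "GENDER")
print(f"GENDER: {acc:.0%} correct, RT mu = {mu:.0f} ms, sigma = {sigma:.0f} ms")
```

Whether RTs were summarised over correct trials only is an assumption here; the excerpt does not say.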
Discussion
Individual observers' FTF maps and the finite state automata suggested (1) distinct visual information processing states and (2) transitions between these states over the time course of two face categorisation tasks. States and their transitions were identified mostly in the low-frequency EEG, in the theta [4–8 Hz] and alpha [8–12 Hz] bands. The results suggest a timeline of distinct feature processing states in the brain on OTR and OTL electrodes, revealing for the first time the dynamics and
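The original time-frequency analysis is not reproduced in this excerpt, but the theta- and alpha-band oscillatory energy on which the states are defined can be sketched with a simple FFT-based band-power estimate. The signal below is synthetic and the sampling rate is an assumption; only the band edges [4–8 Hz] and [8–12 Hz] come from the text.

```python
import numpy as np

fs = 256                      # sampling rate in Hz (assumed)
t = np.arange(0, 2, 1 / fs)   # 2 s of signal
# Synthetic "EEG": a 6 Hz theta component plus a weaker 10 Hz alpha component
signal = np.sin(2 * np.pi * 6 * t) + 0.5 * np.sin(2 * np.pi * 10 * t)

spectrum = np.abs(np.fft.rfft(signal)) ** 2
freqs = np.fft.rfftfreq(len(signal), 1 / fs)

def band_power(lo, hi):
    """Total spectral power in the [lo, hi) Hz band."""
    mask = (freqs >= lo) & (freqs < hi)
    return spectrum[mask].sum()

theta = band_power(4, 8)    # theta band [4-8 Hz]
alpha = band_power(8, 12)   # alpha band [8-12 Hz]
print(theta > alpha)        # the 6 Hz component dominates
```

In the study proper, such band-limited energy would be computed per electrode and time window; here a single window stands in for the idea.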
Conclusions
We have related brain activity to the states of a cognitive information processing mechanism. In doing so, we have addressed three questions. First, to identify the brain activity supporting face processing, we examined a time-resolved brain measurement (Time × Frequency oscillatory energy) on face-sensitive electrodes. Second, to determine the information content processed in this oscillatory energy, we applied a novel classification image technique to produce the FTF maps and identified
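The classification image computation itself is not shown in this excerpt. As a rough sketch of the general reverse-correlation idea behind such techniques, a classification image can be formed by differencing the average sampling masks of correct and incorrect trials; regions with positive values are diagnostic for the task. Everything below (image size, sampling density, the "eye region" placement, the simulated observer) is made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
size = 32                                  # toy image resolution

# Hypothetical experiment: each trial reveals a random subset of pixels;
# the simulated "observer" responds correctly when the diagnostic region
# (top-left quadrant here, purely for illustration) is well sampled.
masks, correct = [], []
for _ in range(2000):
    mask = (rng.random((size, size)) < 0.2).astype(float)  # ~20% revealed
    masks.append(mask)
    correct.append(mask[:size // 2, :size // 2].mean() > 0.2)

masks = np.array(masks)
correct = np.array(correct)

# Classification image: mean mask on correct trials minus mean mask on
# incorrect trials; positive values mark regions driving correct responses.
ci = masks[correct].mean(axis=0) - masks[~correct].mean(axis=0)

# The diagnostic (top-left) quadrant stands out against a neutral one
print(ci[:size // 2, :size // 2].mean() > ci[size // 2:, size // 2:].mean())
```

The study's FTF maps extend this logic from behavioural responses to time-frequency brain measurements; the difference-of-means step sketched here is only the behavioural analogue.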
Acknowledgment
This research was supported by an Economic and Social Research Council grant (R000237901) awarded to P.G.S.