Abstract
Humans can recognize their own whole-body movements even when they are displayed as dynamic dot patterns. This sparse depiction of whole-body movements, coupled with our limited visual experience of watching ourselves move in the world, has long implicated non-visual mechanisms in self-action recognition. We aimed to identify the neural systems supporting this ability. Using general linear modeling and multivariate analyses of human brain imaging data from male and female participants, we first found that cortical areas linked to motor processes, including frontoparietal and primary somatomotor cortices, exhibit greater engagement and functional connectivity when recognizing self-generated versus other-generated actions. Next, we found that these regions encode self-identity based on motor familiarity, even after regressing out idiosyncratic visual cues using multiple regression representational similarity analysis. Finally, we found the reverse pattern for unfamiliar individuals: identity encoding localized to occipito-temporal visual regions. These findings suggest that self-awareness from actions emerges from the interplay of motor and visual processes.
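Conceptually, multiple regression representational similarity analysis models a neural representational dissimilarity matrix (RDM) as a weighted combination of model RDMs, so that each predictor's beta reflects its unique contribution after controlling for the others. Below is a minimal sketch of this logic in Python, assuming simulated RDMs and a hypothetical condition count; the actual stimuli, model construction, and preprocessing are described in the Methods.

```python
# Minimal sketch of multiple-regression RSA: estimate how strongly a neural
# RDM reflects a self-identity (motor familiarity) model while regressing
# out a visual-similarity model. All inputs are simulated placeholders.
import numpy as np
from scipy.stats import zscore

rng = np.random.default_rng(0)
n_conditions = 12  # hypothetical number of action clips

def vectorize(rdm):
    """Return the off-diagonal lower triangle of a symmetric RDM."""
    return rdm[np.tril_indices_from(rdm, k=-1)]

def symmetric(mat):
    """Force symmetry, as RDMs are symmetric by construction."""
    return (mat + mat.T) / 2

# Simulated model RDMs (in practice, derived from stimulus features)
visual_model = symmetric(rng.random((n_conditions, n_conditions)))
identity_model = symmetric(rng.random((n_conditions, n_conditions)))

# Simulated neural RDM (in practice, 1 - correlation of voxel patterns)
neural_rdm = symmetric(0.5 * identity_model + 0.3 * visual_model
                       + 0.2 * rng.random((n_conditions, n_conditions)))

# Stack z-scored model predictors plus an intercept, then solve by
# ordinary least squares; each beta is that model's unique contribution.
y = zscore(vectorize(neural_rdm))
X = np.column_stack([
    np.ones(y.size),
    zscore(vectorize(visual_model)),
    zscore(vectorize(identity_model)),
])
betas, *_ = np.linalg.lstsq(X, y, rcond=None)
print(f"visual beta:   {betas[1]:.3f}")
print(f"identity beta: {betas[2]:.3f}")
```

In this framing, a reliably positive identity beta in a region indicates self-identity information beyond what the visual-similarity predictor explains.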
Significance Statement: We report for the first time that self-recognition from visual observation of one's whole-body actions implicates brain regions associated with motor processes. Using functional neuroimaging data, we found greater activity and distinct representational patterns in brain areas and networks linked to motor processes when participants viewed their own actions relative to the actions of others. These findings reveal an important role for motor mechanisms in distinguishing the self from others.
Footnotes
We thank Sophia Baia and Kelly Xue for assistance with data collection and stimuli creation, and Elinor Yeo, Jolie Wu, Kelly Nola, Nicolas Jeong, Danya Elghebagy, David Lipkin, and Shahan McGahee for assistance with stimuli creation. We thank Jeff Chiang, Burcu Ürgen, and Giuseppe Marrazzo for helpful advice on the analyses. We thank Lisa Aziz-Zadeh and Sofronia Ringold for helpful feedback on an earlier draft of this manuscript. This project was supported by National Science Foundation grant BCS-2142269 to H.L., a UCLA faculty research grant to H.L., a Tiny Blue Dot Foundation grant to M.M.M., and an APA Dissertation Award to A.K. Preliminary versions of this project were presented at the Virtual Society for Neuroscience (2020), V-Vision Sciences Society (2020), Society for Neuroscience (2022), and the Association for the Scientific Study of Consciousness (2023).
All analysis scripts, behavioral data, and results from the imaging analyses can be downloaded from our GitHub repository: https://github.com/akilakada/self-fmri. Raw NIfTI data can be shared upon request to the corresponding author, subject to UCLA Institutional Review Board guidelines.