Symposium and Mini-Symposium

Perceptual-Cognitive Integration for Goal-Directed Action in Naturalistic Environments

Jolande Fooken, Bianca R. Baltaretu, Deborah A. Barany, Gabriel Diaz, Jennifer A. Semrau, Tarkeshwar Singh and J. Douglas Crawford
Journal of Neuroscience 8 November 2023, 43 (45) 7511-7522; https://doi.org/10.1523/JNEUROSCI.1373-23.2023
Jolande Fooken, Centre for Neuroscience, Queen's University, Kingston, Ontario K7L 3N6, Canada
Bianca R. Baltaretu, Department of Psychology, Justus Liebig University, 35394 Giessen, Germany
Deborah A. Barany, Department of Kinesiology, University of Georgia, and Augusta University/University of Georgia Medical Partnership, Athens, Georgia 30602
Gabriel Diaz, Center for Imaging Science, Rochester Institute of Technology, Rochester, New York 14623
Jennifer A. Semrau, Department of Kinesiology and Applied Physiology, University of Delaware, Newark, Delaware 19713
Tarkeshwar Singh, Department of Kinesiology, Pennsylvania State University, University Park, Pennsylvania 16802
J. Douglas Crawford, Centre for Integrative and Applied Neuroscience, York University, Toronto, Ontario M3J 1P3, Canada

Figures

Figure 1.

Measuring coordinated action with different experimental tools. A, Top, The neural control of eye-hand and body coordination can be probed with MRI-compatible tablets and eye trackers in the scanner or with portable setups (e.g., a portable EEG system). Middle, A robotic manipulandum allows experimental control of the visual and movement space (e.g., mechanical perturbations), and a head-fixed setup enables well-calibrated, high-precision eye tracking during real object manipulation. Bottom, Virtual reality setups and head-mounted eye-tracking glasses allow eye-hand coordination to be studied in naturalistic environments. B, Schematic of how different experimental tools vary along two axes: the experimenter's control over the visual and physical environment (x axis) and how well the observed behavior translates to the real world (y axis). Green boxes represent behavioral methods; gray boxes represent neuroimaging techniques.

Figure 2.

Schematic overview of the major brain regions and pathways involved in goal-directed eye-hand coordination. Pathways for vision, object and motion perception, eye movements, and top-down cognitive strategies are integrated with cortical, subcortical, and cerebellar networks for sensorimotor transformations to produce coordinated action. A, Left brain, lateral view. B, Right brain, medial view. LOC, Lateral occipital cortex; MT, middle temporal area; PPC, posterior parietal cortex; SPL 7, superior parietal lobule area 7; SPL 5, superior parietal lobule area 5; IPL, inferior parietal lobule; S1, primary somatosensory cortex; M1, primary motor cortex; PMC, premotor cortex; SMA, supplementary motor area; PMd, dorsal premotor cortex; FEF, frontal eye fields; PMv, ventral premotor cortex; dlPFC, dorsolateral prefrontal cortex; CBM, cerebellum; SC, superior colliculus; LGN, lateral geniculate nucleus; PPA, parahippocampal place area; LG, lingual gyrus; Cn, cuneus. Created with BioRender.com.

Figure 3.

Visual signals for action. A, Visual scenes follow a "grammar" defined by building blocks arranged in a hierarchical structure: phrases (top), global objects (middle), and associated local objects (bottom) (Võ, 2021). B, When reaching toward stationary objects, the target object is commonly fixated throughout the reach, which allows the integration of foveal vision of the object with peripheral vision of the hand. C, When intercepting moving objects, the eyes either track the moving object with smooth-pursuit eye movements (SPEMs) or fixate on the expected interception location to guide the hand toward the object.

Figure 4.

The role of optic flow in visually guided steering. A, Drivers are immersed in a simulated environment seen through a head-mounted display with integrated eye tracking. B, Example view inside the virtual environment as participants attempt to keep their head centered within a parameterized, procedurally generated roadway. The image is superimposed with a computational estimate of optic flow, indicated here by white arrows (Matthis et al., 2018); a minimal code sketch of such a dense flow estimate follows these legends. C, Assessing the effect of cortical blindness on the visual perception of heading will require computational models of visually guided steering that account for the blind field. For illustrative purposes, we have superimposed the results of a Humphrey visual field test on the participant's view at a hypothetical gaze location with approximate scaling (reprinted from Cavanaugh et al., 2015 with permission).

Figure 5.

Movement impairment in stroke survivors. A, Left, Kinesthetic Matching Task in which a robotic manipulandum (Kinarm Exoskeleton Lab) passively moved the more affected arm while stroke survivors mirror-matched the movement with the less affected arm. Middle, Performance of a stroke survivor who performed well with vision of the limb (top, left and right) and without vision of the limb (top, left); other participants performed worse than control participants both with and without vision (bottom, left and right), although vision significantly improved performance for some (right, top and bottom). Right, The distribution of participant performance shows that only 12% of stroke survivors (total N = 261) used vision to effectively correct proprioceptively referenced movements, suggesting that vision often fails to compensate for multisensory impairments (Semrau et al., 2018). B, Left, Saccades made by a healthy control (top) and a stroke survivor (bottom) during hand movement (green) and hand dwell on target (yellow) in the Trail Making task. Middle, Excessive saccades in stroke survivors are associated with deficits in working memory and top-down visual search. Right, When stroke survivors make saccades during reaching, those reaches tend to be slower than reaches made without saccades.
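
The white arrows in Figure 4B are a computational estimate of dense optic flow between successive head-centered video frames. The sketch below is a minimal illustration of how such a flow field can be computed and drawn, not the pipeline used by Matthis et al. (2018); the frame file names and parameter values are hypothetical placeholders. It uses OpenCV's Farneback dense optical flow and subsamples the field to a coarse grid of arrows.

import cv2
import numpy as np

# Two consecutive grayscale frames from a (hypothetical) head-centered video
prev = cv2.imread("frame_000.png", cv2.IMREAD_GRAYSCALE)
curr = cv2.imread("frame_001.png", cv2.IMREAD_GRAYSCALE)

# Dense per-pixel flow (dx, dy) via the Farneback algorithm
flow = cv2.calcOpticalFlowFarneback(
    prev, curr, None,           # previous frame, current frame, no initial flow estimate
    0.5, 3, 15, 3, 5, 1.2, 0)   # pyr_scale, levels, winsize, iterations, poly_n, poly_sigma, flags

# Subsample the flow field to a grid and draw one arrow per grid point
step = 20
overlay = cv2.cvtColor(curr, cv2.COLOR_GRAY2BGR)
for y in range(step // 2, prev.shape[0], step):
    for x in range(step // 2, prev.shape[1], step):
        dx, dy = flow[y, x]
        cv2.arrowedLine(overlay, (x, y), (int(x + dx), int(y + dy)),
                        (255, 255, 255), 1)
cv2.imwrite("flow_overlay.png", overlay)

A flow field estimated this way could then be related to gaze location and heading, in the spirit of the steering analyses described for Figure 4.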
