Articles, Behavioral/Cognitive

Toward High Performance, Weakly Invasive Brain Computer Interfaces Using Selective Visual Attention

David Rotermund, Udo A. Ernst, Sunita Mandon, Katja Taylor, Yulia Smiyukha, Andreas K. Kreiter and Klaus R. Pawelzik
Journal of Neuroscience 3 April 2013, 33 (14) 6001-6011; https://doi.org/10.1523/JNEUROSCI.4225-12.2013
Author affiliations: David Rotermund (1), Udo A. Ernst (1), Sunita Mandon (2), Katja Taylor (2), Yulia Smiyukha (2), Andreas K. Kreiter (2), and Klaus R. Pawelzik (1)
(1) Institute for Theoretical Physics and (2) Institute for Brain Research, University of Bremen, Bremen, Germany

Abstract

Brain–computer interfaces have been proposed as a solution for paralyzed persons to communicate and interact with their environment. However, the neural signals used for controlling such prostheses are often noisy and unreliable, resulting in low performance in real-world applications. Here we propose neural signatures of selective visual attention in epidural recordings as a fast, reliable, and high-performance control signal for brain prostheses. We recorded epidural field potentials with chronically implanted electrode arrays from two macaque monkeys engaged in a shape-tracking task. For single trials, we classified the direction of attention to one of two visual stimuli based on spectral amplitude, coherence, and phase difference in time windows fixed relative to stimulus onset. Classification performance reached up to 99.9%, and information about attentional states could be transferred at rates exceeding 580 bits/min. Good classification could already be achieved in time windows as short as 200 ms. Classification performance changed dynamically over the trial and was modulated by the task's varying demands on attention. For all three signal features, the information about the direction of attention was contained in the γ-band. The most informative feature was spectral amplitude. Together, these findings establish a novel paradigm for constructing brain prostheses, such as virtual spelling boards, promising a major gain in performance and robustness for human brain–computer interfaces.
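
The abstract describes the decoding pipeline only at a high level. As a rough illustration of how such a pipeline might look in practice, the sketch below extracts γ-band spectral amplitude from short, stimulus-locked windows of multichannel field potentials, trains a cross-validated linear classifier on single trials, and converts accuracy into an information transfer rate with the standard Wolpaw formula. The sampling rate, γ-band limits, array shapes, classifier choice, and synthetic data are illustrative assumptions rather than the authors' implementation; the coherence and phase-difference features and the paper's exact ITR calculation are not reproduced here.

"""
Minimal sketch (not the authors' code): single-trial decoding of the direction
of attention from epidural field potentials using gamma-band spectral
amplitude, plus a Wolpaw-style information-transfer-rate estimate.
Sampling rate, band limits, array shapes, and the classifier are assumptions.
"""
import numpy as np
from scipy.signal import welch
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

FS = 1000.0           # assumed sampling rate in Hz
GAMMA = (40.0, 90.0)  # assumed gamma-band limits in Hz


def gamma_amplitude_features(trials):
    """trials: array (n_trials, n_channels, n_samples), one analysis window
    per trial (e.g., 200 ms locked to stimulus onset).
    Returns (n_trials, n_channels) gamma-band spectral amplitudes."""
    n_trials, n_channels, n_samples = trials.shape
    feats = np.empty((n_trials, n_channels))
    for i in range(n_trials):
        # Welch power spectrum per channel within the window
        f, pxx = welch(trials[i], fs=FS, nperseg=min(256, n_samples), axis=-1)
        band = (f >= GAMMA[0]) & (f <= GAMMA[1])
        # spectral amplitude = square root of mean band power
        feats[i] = np.sqrt(pxx[:, band].mean(axis=-1))
    return feats


def wolpaw_itr_bits_per_min(accuracy, n_classes, window_s):
    """Standard Wolpaw information transfer rate (bits per minute).
    The paper's exact ITR computation may differ from this textbook formula."""
    p, n = accuracy, n_classes
    bits = np.log2(n)
    if 0.0 < p < 1.0:
        bits += p * np.log2(p) + (1.0 - p) * np.log2((1.0 - p) / (n - 1))
    return bits * 60.0 / window_s


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # synthetic stand-in data: 200 trials, 16 channels, 200 ms windows
    X_raw = rng.standard_normal((200, 16, int(0.2 * FS)))
    y = rng.integers(0, 2, size=200)   # attended stimulus: 0 or 1
    X = gamma_amplitude_features(X_raw)
    clf = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto")
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"cross-validated accuracy on synthetic data: {acc:.3f}")
    print(f"Wolpaw ITR for this toy setting: "
          f"{wolpaw_itr_bits_per_min(acc, 2, 0.2):.1f} bits/min")

Shrinkage LDA is used here only as a convenient regularized linear classifier for many correlated channel features and few trials; it is not necessarily the classifier used in the study.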


