Toward Understanding the ACC's Complex Relationship with Pain and Analgesia
Samuel Kissinger, Estefania O’Neil, Baolin Li, Kirk Johnson, Jeffrey Krajewski, and Akihiko Kato
(see article e2231232024)
Preclinical biomarkers for pain are lacking. The advent of large-scale neurophysiological recording techniques in animal models has provided a new strategy for unveiling the dynamic, multidimensional neuronal activity signatures underlying pain states. Because the anterior cingulate cortex (ACC) has a known role in the negative emotions associated with pain, Kissinger and colleagues directly examined its activity before inflammatory pain, during pain, and under analgesia in a mouse model, with the goal of evaluating ACC activity as a neural representation of pain and, ultimately, improving preclinical measurements of pain. They used in vivo calcium imaging to assess ACC excitatory neuron activity in control mice, mice undergoing long-term inflammatory pain (complete Freund's adjuvant), and mice treated with an analgesic (gabapentin or ibuprofen). They found that the activity of groups of neurons, or ensembles, increased slightly after long-term inflammatory pain. Surprisingly, treatment with gabapentin or ibuprofen, which are mechanistically distinct, produced shared neurophysiological features, including increased correlated ensemble activity, only in the inflammatory pain condition. The authors conclude that ACC excitatory neurons may encode relief from pain, irrespective of analgesic type, rather than the intensity of the painful state. Ultimately, the findings of this study reveal that excitatory neuron ensemble activity in the ACC has a complicated relationship with pain and analgesia and point to the need for further exploration.
Distinct Visual Cues Differentially Impact Components of Actions
Nina Lee, Lin Lawrence Guo, Adrian Nestor, and Matthias Niemeier
(see article e2100232024)
Grasping objects is a goal-directed behavior composed of discrete action components. To pick up an object, sighted people must visually analyze it and transform this visual information into motor commands. Research has shown that intending to grasp an object is associated with splitting attention across the planned points of touch (each finger on a ball, for example), and that visual processing is improved by action-specific information. In this issue, Lee and colleagues advance our understanding of this process by investigating whether and how features of an object form action-specific representations and are integrated in the brain. They used electroencephalography as human participants viewed objects with variable attributes and either grasped them or touched them with their knuckles. While visual representations of objects were similar whether participants grasped or “knuckled” them, only during grasping did shape representations reactivate early visual cortex at later stages of grasp planning (i.e., later intentions to grasp). In contrast, representations of material became action specific only immediately before objects were lifted. These findings, together with differences in object integration between the two sensorimotor conditions, revealed that distinct visual cues differentially contribute to the action components of grasping in a manner that does not occur with “knuckling.” Altogether, these findings contradict the view that goal-directed actions necessarily integrate task features into a stable neural representation and thus point to a need for further testing of the theory.
Footnotes
This Week in The Journal was written by Paige McKeon.