Neurophysiology of birdsong syntax perception

Final Report Summary - SYNTAX (Neurophysiology of birdsong syntax perception)

Language is a key capability distinguishing humans from other animal species, as no other known vocal communication system in the animal kingdom matches its complexity. Nevertheless, many important abilities that underlie the faculty of language are shared with other animals, and are interesting targets for comparative research from evolutionary and mechanistic perspectives. Birdsong has a phonological syntax, as its constituent acoustic elements are often organized according to particular syntactic constraints. This suggests birds use such syntax in natural communication, and makes them an attractive, but as yet virtually unexplored, model for research into the neural underpinnings of syntax perception.

The main aim of this project was to identify and characterize neurophysiological processes that are involved in the processing of syntactical structures in zebra finches (Taeniopygia guttata), a widely accepted animal model to study speech and language-related neural mechanisms. To this end, a state-of-the-art high-density neuroimaging paradigm was implemented, based on an array of intracerebral electrodes that measure action potential and local field activity from a regular matrix of 64 electrode sites simultaneously. Because of the large number of sites, acute recordings were performed under isoflurane anesthesia, which induces a sleep-like state in which primary and secondary cortical areas are known to exhibit stimulus-specific auditory responses to complex sequences, including natural song.

The use of a high-density, regularly spaced 8x8 electrode matrix allowed electrophysiological brain activity to be visualized as series of image plots and videos in which temporospatial changes in ongoing activity are emphasized. Surprisingly, we found that spontaneous activity both within and outside auditory areas propagates as traveling waves of action and local field potentials throughout most of the forebrain. These results show that traveling wave propagation (i) does not depend on the laminar organization of the mammalian neocortex (in which this phenomenon is also known), (ii) is unlikely to subserve functions unique to that pattern of neuronal organization, and (iii) may contribute to the evolution of complex cognition in birds, including speech-like capabilities.
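The basic idea of treating a 64-channel recording as a sequence of 8x8 image frames, and of quantifying wave propagation from those frames, can be sketched as follows. This is a minimal illustration, not the project's analysis pipeline: the channel-to-grid ordering, the synthetic data, and the peak-tracking velocity estimate are all assumptions made for the example.

```python
import numpy as np

def to_frames(recording, rows=8, cols=8):
    """Reshape a (time, channels) recording into (time, rows, cols) image
    frames, assuming channels are ordered row-major across the grid."""
    t, n = recording.shape
    assert n == rows * cols
    return recording.reshape(t, rows, cols)

def peak_trajectory(frames):
    """Return the (row, col) grid location of peak activity in each frame."""
    t = frames.shape[0]
    idx = frames.reshape(t, -1).argmax(axis=1)
    return np.stack(np.unravel_index(idx, frames.shape[1:]), axis=1)

def mean_velocity(trajectory):
    """Average frame-to-frame displacement of the activity peak
    (in grid units per frame)."""
    return np.diff(trajectory, axis=0).mean(axis=0)

# Synthetic "traveling wave": a Gaussian bump drifting one column per frame.
t_steps, rows, cols = 6, 8, 8
rr, cc = np.mgrid[0:rows, 0:cols]
rec = np.stack([np.exp(-((rr - 4) ** 2 + (cc - k) ** 2) / 2.0).ravel()
                for k in range(t_steps)])

frames = to_frames(rec)
v = mean_velocity(peak_trajectory(frames))
print(v)  # roughly [0, 1]: the bump travels one column per frame
```

In practice each frame would be rendered (e.g. as a heat map) and the frames concatenated into the videos described above; the peak-tracking step is just one simple way to turn the same frames into a propagation estimate.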

We then investigated such traveling wave activity in primary and secondary auditory areas under stimulation with artificial sound sequences based on natural song syllables. The aim was to investigate which types of underlying neurocomputational processes are involved in auditory sequence perception, with different hypotheses corresponding to the project objectives. These differed in the presumed complexity of the systems involved, ranging from change detection, to stimulus-specific adaptation, to, ultimately, rule learning. We found strong evidence for both memory-based stimulus-specific adaptation and rule learning. The latter is particularly exciting because sequence rule learning is at the heart of syntax capabilities in humans; so far this capability in non-human animals has been studied at the behavioral level only, and with results that are hotly debated. Using a novel stimulation paradigm, we carefully separated effects of lower-level memory-based processes from higher-level predictive processes, and identified neural response patterns that are consistent with neural processes involving prediction of future input based on short-term rule learning.
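The logic of dissociating adaptation from rule learning can be illustrated with a toy stimulus generator. This is a hypothetical sketch of one such design, not the project's actual paradigm: the alternation rule, the syllable labels, and the deviant probability are all invented for the example. The key property is that deviants violate a sequential rule while overall syllable frequencies stay roughly balanced, so a differential response to deviants cannot be explained by syllable rarity (adaptation) alone.

```python
import random

def build_sequence(rule_pair=("A", "B"), n_items=40, deviant_prob=0.1, seed=0):
    """Build a syllable sequence following an alternation rule (A B A B ...),
    with occasional rule-violating deviants (the wrong syllable at a given
    position). Both syllables remain frequent overall, so deviance is
    defined by sequential position, not by syllable probability."""
    rng = random.Random(seed)  # seeded for a reproducible sequence
    seq, labels = [], []
    for i in range(n_items):
        expected = rule_pair[i % 2]
        if rng.random() < deviant_prob:
            # Deviant: present the other syllable, breaking the alternation.
            seq.append(rule_pair[(i + 1) % 2])
            labels.append("deviant")
        else:
            seq.append(expected)
            labels.append("standard")
    return seq, labels

seq, labels = build_sequence()
```

A purely adaptation-based system responds according to how recently and how often each syllable occurred; a rule-learning system additionally responds to the positional violation itself, which is the contrast such a paradigm is built to expose.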

Taken together, the project has implemented a novel intracerebral neuroimaging paradigm in a songbird, and has yielded new insights into the traveling nature of neurophysiological processes that are also involved in (automatic) auditory sequence rule learning. This expands the use of songbirds from an already established model system for speech-like vocal imitation learning to a comparative neural model system for syntactic rule learning. Knowledge of how the brain infers the causes of sensory inputs and predicts future events may lead to a better understanding of the mechanisms that underlie mismatch negativity (MMN). This scalp-recorded EEG phenomenon in humans is widely studied, has been implicated in a plethora of cognitive functions, including linguistic ones, and has been linked to disorders such as dyslexia and psychiatric diseases.