Many synapses exhibit a high rate of synaptic transmission failures. I consider the hypothesis that synaptic failures can increase the efficiency of information transmission across the synapse. As a measure of synaptic transmission efficiency, I use the information transmitted per vesicle release about the presynaptic spike train, and I show that this measure can increase with the synaptic failure probability. I analytically calculate the Shannon mutual information transmitted across two model synapses with probabilistic transmission: one with a constant probability of vesicle release and one with release probabilities governed by the dynamics of synaptic depression. For inputs generated by a non-Poisson process with positive autocorrelations, both synapses can transmit more information per vesicle release than a synapse with perfect transmission, although the increases are greater for the depressing synapse than for a constant-release-probability synapse with the same average transmission probability. The enhanced performance of the depressing synapse over the constant-release-probability synapse primarily reflects a decrease in noise entropy rather than an increase in the total response entropy. This indicates a limitation of analysis methods, such as decorrelation, that consider only the total response entropy. My results suggest that synaptic transmission failures governed by appropriately tuned synaptic dynamics can increase the information-carrying efficiency of a synapse.
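The decomposition behind these results is the standard Shannon identity I(S;R) = H(R) - H(R|S): the total response entropy minus the noise entropy, here normalized by the expected number of vesicle releases. The sketch below is a minimal illustration of these quantities, not the model analyzed in the abstract: it assumes independent time bins, a presynaptic spike probability q per bin, and a constant release probability p given a spike, making each bin a Z-channel. The autocorrelated inputs and depression dynamics central to the stated results are deliberately omitted, and all names (binary_entropy, info_per_release, q, p) are hypothetical.

    import numpy as np

    def binary_entropy(x):
        """Entropy (bits) of a Bernoulli(x) variable, with 0*log(0) = 0."""
        x = np.clip(x, 1e-12, 1 - 1e-12)
        return -(x * np.log2(x) + (1 - x) * np.log2(1 - x))

    def info_per_release(q, p):
        """
        Per-bin Shannon quantities for a synapse that releases a vesicle
        with constant probability p whenever a presynaptic spike (probability
        q per bin) arrives, and never otherwise -- a Z-channel.

        Returns (total response entropy H(R), noise entropy H(R|S),
        mutual information I(S;R), information per vesicle release),
        all in bits, treating time bins as independent.
        """
        h_total = binary_entropy(q * p)   # H(R): entropy of the release train
        h_noise = q * binary_entropy(p)   # H(R|S): entropy contributed by failures
        mi = h_total - h_noise            # I(S;R) = H(R) - H(R|S)
        return h_total, h_noise, mi, mi / (q * p)  # releases occur with prob. q*p

    if __name__ == "__main__":
        q = 0.1  # presynaptic spike probability per bin
        for p in (1.0, 0.5, 0.25):
            h_tot, h_noise, mi, eff = info_per_release(q, p)
            print(f"p={p:.2f}: H(R)={h_tot:.4f}  H(R|S)={h_noise:.4f}  "
                  f"I={mi:.4f}  bits/release={eff:.4f}")

Running the script prints H(R), H(R|S), I(S;R), and bits per release for several values of p; it makes no claim about the autocorrelated or depressing-synapse cases, for which the abstract reports the analytic results.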
