Abstract
Adaptive behaviors require sensorimotor computations that convert information represented initially in sensory coordinates to commands for action in motor coordinates. Fundamental to these computations is the relationship between the region of the environment sensed by the animal (gaze) and the animal’s locomotor plan. Studies of visually guided animals have revealed an anticipatory relationship between gaze direction and the locomotor plan during target-directed locomotion. Here, we study an acoustically guided animal, an echolocating bat, and relate acoustic gaze (direction of the sonar beam) to flight planning as the bat searches for and intercepts insect prey. We show differences in the relationship between gaze and locomotion as the bat progresses through different phases of insect pursuit. We define acoustic gaze angle, θgaze, to be the angle between the sonar beam axis and the bat’s flight path. We show that there is a strong linear linkage between acoustic gaze angle at time t [θgaze(t)] and flight turn rate at time t + τ into the future [θ̇flight(t + τ)], which can be expressed by the formula θ̇flight(t + τ) = kθgaze(t). The gain, k, of this linkage depends on the bat’s behavioral state, which is indexed by its sonar pulse rate. For high pulse rates, associated with insect attacking behavior, k is twice as high compared with low pulse rates, associated with searching behavior. We suggest that this adjustable linkage between acoustic gaze and motor output in a flying echolocating bat simplifies the transformation of auditory information to flight motor commands.
Introduction
Humans and other animals use information from their environment to guide adaptive motor behaviors such as locomotion. Gaze, the region of the environment a subject explores with the senses, serves to direct locomotion, and much research has addressed how animals use vision during locomotion. Gaze direction restricts the spatial extent of visual information, and the pattern of gaze shifts in humans is related to task demands (Yarbus, 1961). There have been many studies of how gaze direction in visual animals is related to locomotor planning. Such studies have focused on target-directed motion, in which subjects have been required to use specific landmarks in the visual field to guide movement. These landmarks could be presented as a direct goal of locomotion (Land and Collett, 1974; Hollands et al., 2002), as a track on which to navigate (Land and Lee, 1994; Grasso et al., 1996; Land and Tatler, 2001), or as obstacles to navigate around (Grasso et al., 1998; Imai et al., 2001). Such studies have suggested that subjects make anticipatory gaze movements with their head and eyes during locomotion. The anticipatory relationship between gaze and locomotion has contributed to hypotheses of how visual information is used by the nervous system to guide movement (Grasso et al., 1998; Land, 1999; Wann and Swapp, 2000). Studies on chasing in the housefly (Land and Collett, 1974) and prey pursuit in tiger beetles (Gilbert, 1997) have expressed this anticipatory coupling in terms of a locomotor gain between visual direction to a target and locomotor output.
Current studies on gaze and locomotion have not addressed two issues. One gap in our knowledge is whether gaze and locomotion are similarly related both in the presence and in the absence of an explicit target. For example, when searching for a target, the animal may uncouple its gaze from its locomotor plan to scan the environment without changing its direction of motion. Past studies have, however, focused on how animals move once they have acquired a locomotor target and are moving in response to it. It is not known, therefore, whether the animal adjusts the sensory–locomotor gain in response to behavioral demands. Another gap in our knowledge is how gaze in other sensory modalities, such as audition, is related to locomotion. Echolocating bats emit brief, intermittent ultrasonic pulses. Each pulse forms a beam of sound that echoes off objects in its path. Bats compute the direction and distance to obstacles and prey from a spectrotemporal analysis of the returning echoes (for review, see Moss and Schnitzler, 1995). In contrast to vision, the information from echolocation arrives intermittently in time, yielding snapshots of information. Visual directional information is available explicitly, through the location of the image of an object on the retina. Auditory directional information, however, requires a complex mapping of binaural spectrotemporal information into spatial location. The ability to localize objects and navigate via echolocation is very well developed in bats, and the distinctive aspects of echolocation as a sensory system suggest that the study of auditory guided locomotion in bats offers a valuable complement to similar studies in visually guided animals. By comparing and contrasting actions guided by vision and audition, we can test hypotheses of sensorimotor integration for their generality and explore modality-specific specializations that animals may have evolved.
The sonar beam direction of each vocalization restricts the region of space from which the bat receives information. In analogy to visual gaze in humans, the sonar beam direction can be considered a component of acoustic gaze for echolocating bats. Additionally, in Eptesicus fuscus, the sonar beam axis is aligned with the head direction. In this paper, we studied the relationship between acoustic gaze and flight locomotor output in an echolocating bat during different stages of insect pursuit.
Materials and Methods
Animal model.
Insectivorous echolocating bats, such as the big brown bat E. fuscus, perform complex and rapid flight trajectories to catch airborne insects in darkness. They produce intermittent pulses of directed ultrasound and use the information contained in the returning echoes to detect, localize, and track flying insect prey, relying on hearing instead of vision to guide complex spatial behaviors (Griffin, 1958; Griffin et al., 1960). The sonar pulses produced by E. fuscus are frequency modulated and consist of multiple harmonics with the fundamental sweeping from ∼60 to 25 kHz during the approach stage of insect pursuit (Surlykke and Moss, 2000). Bats change the duration, bandwidth, and production rate of their sonar signals with behavioral state (Griffin, 1958). When cruising in open space, the pulse production rate (PPR) of E. fuscus may be as low as 4 Hz, and the call duration may be as long as 20 ms (Surlykke and Moss, 2000). As the bat detects and then approaches prey, the PPR rises, terminating in insect pursuit and capture (“terminal buzz”), when the PPR may be as high as 150–200 Hz and signals as short as 0.5 ms (Griffin, 1958). Figure 1 shows an example of a bat sonar pulse sequence recorded in a laboratory flight room. The bat is first flying around the room. It then detects and captures a tethered insect.
Train of pulses produced by E. fuscus catching an insect in a laboratory. Insect capture occurs at time 0 s. Initially, the bat produces pulses at a rate of ∼10–20 Hz. In the field, such calls can be as long as 20 ms, although in the laboratory, shorter calls are observed. This is commonly called the search stage. As the bat detects and then starts to pursue an insect, the pulses are produced more frequently. During the terminal buzz, bats produce calls at rates as high as 200 Hz with durations as short as 0.5 ms. Several sounds are followed by echoes, which are seen in the trace as a second signal of low amplitude after the initial pulse recording.
The sonar beam produced by E. fuscus is directional and aligned with its head (Hartley and Suthers, 1989). The directionality of the sonar beam restricts the spatial extent from which the bat’s sonar system can gather information. The sonar beam direction of a bat, in analogy to gaze in visual animals, can be considered a component of acoustic gaze (Ghose and Moss, 2003), because it defines the region of space from which the animal’s sensory system can acquire information. The sonar beam pattern of the echolocating bat enables us to measure the gaze direction of an animal, which relies on audition as its primary distal sense. The temporal patterning of the sounds produced by the bat also enables us to objectively demarcate different behavioral states during echolocating flight. Echolocating bats, therefore, provide an excellent animal model to study the link between gaze and locomotion, in different behavioral states, in an animal that is not guided by vision.
Behavioral methods.
We trained five bats of the species E. fuscus to fly individually in a large (7.3 × 6.4 × 2.0 m3) laboratory room (Fig. 2). The room walls and ceiling were lined with sound absorbent acoustic foam (Sonex One; Acoustical Solutions, Richmond, VA) to reduce reverberations. The room was illuminated by dim, long wavelength light (>650 nm; light from normal incandescent bulbs passed through a filter plate; Plexiglas G #2711; Atofina Chemicals, Philadelphia, PA) to which the bat is insensitive (Hope and Bhatnagar, 1979). Images from two high-speed video cameras (CCD-based cameras operating at 240 frames per second, synchronized to 1/2 frame accuracy; Kodak MotionCorder; Eastman Kodak, San Diego, CA) were used to reconstruct the three-dimensional flight path of the bat and the trajectory of the prey. Simultaneously, a U-shaped array of 16 microphones (Ghose and Moss, 2003) recorded horizontal cross sections of the sonar beam pattern emitted by the bat.
Laboratory flight room. The bats were trained to fly in a flight room 7.3 × 6.4 × 2 m3. The room walls and ceiling were covered with sound absorbent foam to reduce reverberations. Illumination was dim red lighting (wavelength, >650 nm) to exclude the bat’s use of vision. Two digital video cameras operating at 240 frames per second recorded the three-dimensional position of the bats and tethered insects during the experiments. An array of 16 microphones was used to record the sonar beam pattern of the bats as they flew in the room.
The bats were trained to catch insects (mealworms) suspended from a tether. The insects were tethered at the end of a 1-m-long monofilament line. Each insect was initially concealed in a trap-door mechanism, which was placed at random points on the ceiling. After release from the trap door, the insect was held stationary at the end of the tether. The duration of the drop and the jerking motion of the insect at the end of the drop were short compared with the time it took the bat to reorient its flight and capture the insect after detection. This paradigm allows us to study the bat’s flight behavior as it attacks a target without having to compensate for target movements (Wilson and Moss, 2004). Each experimental trial consisted of two parts. During the first part of the experiment, the insect was concealed in the trap door, and the bat was allowed to fly in the room. This allowed us to investigate the relationship between the bat’s acoustic gaze and its flight motor planning when no target was present. After a period of 1–30 s, the prey was released from the trap door. This led to the approach and attack stages, in which the bat localized, tracked, and intercepted the tethered insect. This allowed us to investigate the relationship between gaze and locomotion as the bat progressed through the behavioral states associated with different stages of foraging flight.
Computation of acoustic gaze direction.
The sonar beam is horizontally symmetrical about the midline of the bat’s head (Hartley and Suthers, 1989). Every time the bat produces a vocalization, the direction of its sonar beam axis can be computed from the reconstructed sonar beam pattern obtained from the microphone array (Ghose and Moss, 2003). The sound intensity incident at each microphone j is corrected for spherical loss and atmospheric absorption to yield the normalized, corrected intensity ICj. From Figure 3, we see that the axis of the sonar beam may be computed as the direction of H⃗, where H⃗ = Σj I⃗Cj. Here, I⃗Cj is a vector directed from the bat to microphone j with magnitude proportional to the corrected intensity ICj. The horizontal aspect of the sonar beam axis is also aligned with the head direction of the bat. The spatial extent of the sonar beam limits the region of space the bat can sample with one vocalization, and the bat centers its sonar beam axis on a target of interest (Ghose and Moss, 2003). We define the bat’s acoustic gaze as the region of space sampled by its beam pattern and use the sonar beam axis to infer the acoustic gaze direction.
Computation of head direction. I⃗j is a vector directed from the bat to microphone j with magnitude proportional to the corrected intensity Ic for that microphone. H⃗ is the resultant of the summation of vectors from all the microphones. The direction of H⃗ is the direction of the head.
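The vector-sum computation of the beam axis (H⃗ = Σj I⃗Cj) can be sketched in a few lines. This is an illustrative reconstruction of the method described above, not the authors' code; the microphone positions and intensities in the toy example are invented for demonstration.

```python
import numpy as np

def beam_axis(bat_pos, mic_pos, corrected_intensity):
    """Estimate the sonar beam axis as the intensity-weighted vector sum
    H = sum_j I_Cj * unit(mic_j - bat), per the vector-sum method in the
    text. Returns a unit vector (horizontal plane)."""
    d = mic_pos - bat_pos                              # bat-to-microphone vectors
    unit = d / np.linalg.norm(d, axis=1, keepdims=True)
    H = (corrected_intensity[:, None] * unit).sum(axis=0)
    return H / np.linalg.norm(H)

# Toy example: 3 microphones, strongest corrected intensity straight ahead (+x).
bat = np.array([0.0, 0.0])
mics = np.array([[1.0, 0.0], [0.7, 0.7], [0.7, -0.7]])
I_c = np.array([1.0, 0.5, 0.5])
axis = beam_axis(bat, mics, I_c)
```

With the symmetric off-axis intensities canceling, the estimate points along +x, i.e. toward the loudest microphone.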
Computing linkage between gaze and locomotion.
From Figure 4, the bat’s velocity direction, θflight, was computed as the direction of the tangent to its flight path. As a measure of the bat’s flight motor output, we computed the time derivative of this quantity, θ̇flight, which measures the rate of turn of the bat in flight. This was computed by numerically differentiating the changes in the angle of the tangent to the bat’s flight path for each video frame. The flight data were smoothed using cubic spline interpolation to remove artifacts introduced when the bat’s position was manually digitized from the stereo video data. The computation of θ̇flight depends only on the kinematics of the flight path and is not affected by vocalization timing. Because of the geometry of the microphone array, only the horizontal component of the gaze direction of the bat could be computed. The axis of the sonar beam in E. fuscus corresponds to the direction of the head. We computed gaze angle as the horizontal angle, θgaze, between the axis of the sonar beam (H⃗) and the bat’s flight direction each time the bat produced a sonar call. We studied the relationship between gaze angle at any instant t when the bat vocalized, θgaze(t), and the rate of turn of the bat at various times τ relative to that instant, θ̇flight(t + τ). We considered a range of values of τ, both positive (flight motor output lagging acoustic gaze) and negative (flight motor output leading acoustic gaze).
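The turn-rate computation can be sketched as follows. This is a minimal reconstruction using finite differences on a digitized path; the paper additionally applies cubic spline smoothing to the manually digitized positions, which is omitted here. The circular test path is invented for illustration.

```python
import numpy as np

def turn_rate(xy, fps=240.0):
    """Flight direction and turn rate from a digitized 2-D flight path.
    xy: (N, 2) positions sampled at fps frames per second.
    Returns theta_flight (rad) and its time derivative theta_dot (rad/s)."""
    v = np.gradient(xy, axis=0) * fps                 # velocity by finite differences
    theta = np.unwrap(np.arctan2(v[:, 1], v[:, 0]))   # tangent angle, unwrapped
    theta_dot = np.gradient(theta) * fps              # rate of turn
    return theta, theta_dot

# Check on a synthetic path: a circle traversed at 1 rad/s.
t = np.arange(0, 2, 1 / 240.0)
path = np.stack([np.cos(t), np.sin(t)], axis=1)
_, rate = turn_rate(path)
# interior samples of `rate` should be close to 1 rad/s
```

Unwrapping the tangent angle before differentiating avoids spurious turn-rate spikes when the heading crosses ±180°.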
Variables considered. The tangent to the bat’s flight path gives the velocity direction, θflight, measured with respect to a fixed world reference (dotted line). The rate of turn of the bat, θ̇flight, is computed as the time derivative of this quantity at each point on the flight path. The target direction (dashed line from bat to target), θtarget, is measured with respect to the common fixed world reference (dotted line). H⃗ is the axis of the sonar beam (aligned with head direction). This is computed each time, t, the bat emits a vocalization. The angle between H⃗ and the tangent at time t gives the acoustic gaze angle θgaze for each bat vocalization. In the analysis, θgaze(t) is correlated to θ̇flight(t + τ) for a range of τ. The value of θ̇flight depends on the kinematics of the flight path, and the value of θgaze depends on the bat’s gaze direction. Neither quantity is a priori dependent on vocalization timing.
Results
Flight behavior
An example of an insect capture trial is shown in Figure 5 (corresponding to movie S1, available at www.jneurosci.org as supplemental material). Figure 5A shows a top view of the reconstructed flight path of the bat as it intercepts a tethered insect released from a trap door. The straight black lines denote the direction of the sonar beam axis during each call, the gray line is the flight trajectory of the bat, and the thin curved black line is the trajectory of the target. In the example shown, the bat is initially flying in an empty room. Its sonar beam is directed to the reader’s left side, and it is also steering to the left (Fig. 5B, t1), producing sounds at a relatively low rate (Fig. 5G, 10 Hz). The prey is released from the trap door at point z and suspended from the tether when the bat is at point t2 (Fig. 5D, target height drops). After the prey is presented, the bat turns its sonar beam to lock onto the prey (150 ms after t2). The bat begins to increase the repetition rate of its sonar calls. During the attack stage, the bat redirects its flight to intercept its prey and the PPR rises to high values (>100 Hz). During the last 50 ms before capture, vocalizations were either absent or too faint to analyze reliably and are not shown. In a previous study of the sonar beam pattern in flying echolocating bats, we reported that the bat centers its sonar beam axis tightly onto selected prey during the attack stage. The accuracy of this lock-on is approximately ±3° (Ghose and Moss, 2003).
Flight path and sonar beam axis of an echolocating bat. A, Top view of a bat capturing an insect. The bat (gray line) flies from the top of the panel to the bottom. The straight black lines indicate the sonar beam axis direction of the bat each time it makes a sonar call. When the bat is near t2, the target is dropped from a trap door (point z). The tick marks are in meters. B, C, Schematic insets showing relative orientations of bat’s sonar beam axis, flight (body) direction, and emitted sonar beam pattern (coded in gray scale) at points t1 and t3. D, The heights of the bat (solid line) and target (dotted line) over time. The target is initially concealed in a trap-door mechanism. E, Linear speed of the bat over time. The bat brakes and rises slightly as it turns to intercept the insect. F, Bat’s sonar beam axis direction, flight direction, and bat-to-target direction over time for the trajectory shown. All angles are with respect to an external fixed reference. The bat locks the sonar beam on the target after time t2. Deg, Degrees. G, Pulse production rate over time. The pulse rate increases as the bat locks the sonar beam onto the target. During the last 50 ms before capture, vocalizations were either absent or too faint to analyze reliably and are not shown. For animations of this and other insect interceptions, see supplemental movies (available at www.jneurosci.org as supplemental material).
Division of stages of bat flight based on PPR
We demarcated the different behavioral states of the bat from its sonar PPR. Griffin first reported the dramatic increase in PPR by echolocating bats during the final attack of insect prey and termed it the buzz (Griffin, 1958; Griffin et al., 1960). Subsequent studies in echolocating bats have used changes in PPR to infer changes in the bat’s behavioral state (Kick and Simmons, 1984; Schnitzler et al., 1987; Kalko, 1995). Foraging flight has been divided into four stages, according to the bat’s vocal behavior: searching, approaching, tracking, and attacking (Kick and Simmons, 1984). The tracking and attack phases correspond to the terminal I and terminal II stages described by some authors (Kalko, 1995). During searching, the bat is producing pulses at a very low rate (5–10 Hz). After the bat detects the insect, the bat moves into the approach stage and the pulse rate rises (20–50 Hz). It then transitions to the tracking stage (50 Hz) (Kick and Simmons, 1984) and finally to the attacking stage (up to 200 Hz). These PPR values are estimates from field studies and vary with species of bat (for review, see Schnitzler and Kalko, 2001).
Here, we demarcate the different stages of insect capture behavior of E. fuscus under laboratory conditions using the PPR values we obtained in our experiment. Figure 6 is a histogram of the pulse production rates of the bats’ vocalizations taken during the experiment. In conjunction with observations of the bat’s insect pursuit behavior (Fig. 5A,E), we used the valley points of the distribution as the dividers for the different stages of foraging flight. PPR values <50 Hz were assigned to the search/approach stage, PPR values ranging from 50 to 100 Hz were assigned to the tracking stage, and PPR values >100 Hz were assigned to the attacking stage. This demarcation is based on the PPR values of the emitted vocalizations and is independent of measurements of the sonar beam direction and flight path.
Histogram of PPR of the sonar vocalizations produced by the bats. The PPR distribution is trimodal. The peak centered around 160 Hz corresponds to the buzz (Griffin, 1958) when the bat is attacking prey. There is a smaller peak around 75 Hz corresponding to the tracking stage (Kick and Simmons, 1984), followed by a large number of calls with pulse rates <50 Hz, corresponding to periods when the bat is flying in an empty room or has just detected its prey and is beginning to increase its PPR (search/approach). We chose the valley points of the distribution as the dividers for the different stages of foraging behavior. Data are from five bats, 1525 calls, over 38 trials.
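The stage demarcation above amounts to thresholding the PPR at the histogram valleys. A minimal sketch, using the 50 and 100 Hz boundaries stated in the text:

```python
def flight_stage(ppr_hz):
    """Assign a foraging stage from pulse production rate (Hz), using the
    valley points of the PPR histogram reported in the text: <50 Hz
    search/approach, 50-100 Hz tracking, >100 Hz attack."""
    if ppr_hz < 50:
        return "search/approach"
    elif ppr_hz <= 100:
        return "tracking"
    else:
        return "attack"

# e.g. flight_stage(10) -> "search/approach"; flight_stage(160) -> "attack"
```

Because the classification uses only vocalization timing, it is independent of the beam-direction and flight-path measurements, as the text notes.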
Relationship between acoustic gaze and flight behavior
The bat can direct its acoustic gaze (sonar beam axis) substantially off its flight path (Ghose and Moss, 2003). However, we noted that during all stages of flight, there was a strong linear relationship between the acoustic gaze angle at time t and the rate of change of flight direction at time t + τ, with the gaze leading the flight direction. The gain, k, of this linear relationship depended on the bat’s behavioral state.
We collected data from five bats and a total of 38 bat flights. We computed the correlation between acoustic gaze angle [θgaze(t), which is the angle between the axis of the sonar beam and the bat’s flight direction] and flight turn rate [θ̇flight(t + τ)] for τ values ranging from −200 to +200 ms, during the different stages of flight behavior. Figure 7A shows how the correlation between acoustic gaze angle (θgaze) and flight turn rate (θ̇flight) changes with the lag, τ, during search/approach flight; the correlation peaks at τmax = 148 ms (60 ms ≤ τmax ≤ 230 ms, 95% confidence interval (CI)). Figure 7B shows the scatter plot of the sonar beam axis to flight angle versus the flight turn rate for τmax = 148 ms. The gain, ksearch/approach (slope of the line), is 3.21 ± 0.32 s−1 (correlation coefficient r = 0.77 ± 0.01; n = 473 vocalizations). Figure 7, C and D, shows data from the tracking stage. In this stage, the maximum r occurred at τmax = 128 ms (60 ms ≤ τmax ≤ 170 ms) with ktrack = 4.24 ± 0.48 s−1 (r = 0.86 ± 0.01; n = 186 vocalizations). Figure 7, E and F, shows data from the attacking stage. In this stage, the maximum r occurred at τmax = 96 ms (60 ms ≤ τmax ≤ 140 ms) with kattack = 6.26 ± 0.40 s−1 (r = 0.84 ± 0.01; n = 709 vocalizations).
Acoustic gaze is adaptively coupled to flight motor output. A, C, E, The black line shows the correlation coefficient (r) between the acoustic gaze angle (θgaze) of the bat and the flight turn rate (θ̇flight), for different lag values (τ) for the three behavioral states. The gray lines adjacent to r show the 95% CI for r. The vertical dotted lines show the corresponding CI for τmax. There is no significant difference between the τmax values for the three states, but they are significantly greater than zero, indicating that the acoustic gaze anticipates locomotor planning. B, D, F, Scatter plot of θgaze and θ̇flight at τmax. The regression line is shown overlaid in gray. The gain (slope, k) increases as the bat progresses from the search/approach to tracking to attack stages of foraging flight. The offset (intercept, c) is negligible. Pairwise comparisons show that the slopes are significantly different from each other, with ksearch/approach < ktracking < kattack. This suggests that the bat’s behavioral state modulates the gain of the linkage. Data are from five bats and a total of 38 flights.
The value of τmax decreases as the bat progresses from the search/approach stage to the attack stage of flight. The overlap of the τmax confidence intervals between the behavioral states, however, indicates the differences across the three conditions are not statistically significant. τmax in all stages is significantly greater than zero, indicating that the acoustic gaze always leads the flight motor output. Pairwise comparisons between the gain (k) values for the three stages show that the gains are significantly different from each other: for search/approach and tracking, t = 4.63; for tracking and attack, t = 7.2; for search/approach and attack, t = 12.7 (p < 0.001; Bonferroni’s correction applied) with ksearch/approach < ktracking < kattack.
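The lag-correlation analysis above can be sketched as follows: for each candidate lag τ, interpolate the turn rate at call time t + τ and correlate it with the gaze angle at t. This is a reconstruction under stated assumptions (linear interpolation of the turn-rate trace; synthetic data invented for the check), not the authors' analysis code.

```python
import numpy as np

def best_lag(call_times, gaze_angle, frame_times, turn_rate,
             lags=np.arange(-0.2, 0.201, 0.01)):
    """For each lag tau, correlate theta_gaze(t) at call times with the
    flight turn rate interpolated at t + tau. Returns (tau_max, r_max, r_all)."""
    rs = []
    for tau in lags:
        td = np.interp(call_times + tau, frame_times, turn_rate)
        rs.append(np.corrcoef(gaze_angle, td)[0, 1])
    rs = np.array(rs)
    i = np.argmax(rs)
    return lags[i], rs[i], rs

# Synthetic check: turn rate = 4 * gaze delayed by 100 ms; tau_max should be ~0.1 s.
ft = np.arange(0, 5, 1 / 240.0)
gaze_full = np.sin(2 * np.pi * 0.7 * ft)
rate_full = 4.0 * np.interp(ft - 0.1, ft, gaze_full)
calls = np.arange(0.5, 4.5, 0.05)
g = np.interp(calls, ft, gaze_full)
tau_max, r_max, _ = best_lag(calls, g, ft, rate_full)
```

The recovered gain would then be the slope of the regression of turn rate on gaze angle at τmax, as in Figure 7, B, D, and F.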
These results can be summarized by the following general control law: θ̇flight(t + τ) = kθgaze(t). Here, θgaze(t) is the acoustic gaze angle, the angle between the sonar beam axis and the flight vector; θ̇flight(t + τ) is the rate at which the bat turns in flight; k is a state-dependent gain factor; and τ is the constant time by which the flight lags the gaze direction. The offsets obtained in the data are negligible and are not included in the control law.
The rate of turn for a given gaze angle increases as the pulse interval decreases (larger values of k are obtained during phases when the repetition rate is higher). Because the bat is receiving information from the environment at the rate at which pulses are produced, it might be possible that the bat is keeping the angle it turns per pulse proportional to the gaze angle and independent of the pulse rate. We do not, however, find evidence for this in our data.
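The consequence of the state-dependent gain can be illustrated with a toy simulation of the control law. Here the beam is assumed to be locked on a target at a fixed world bearing, so θgaze(t) = bearing − θflight(t); the bearing, time step, and convergence criterion are invented for illustration, while the gains (3.2 and 6.3 s−1) and lag (0.1 s) approximate the reported search/approach and attack values.

```python
import numpy as np

def align_time(k, tau=0.1, dt=0.001, T=5.0, bearing=np.deg2rad(40)):
    """Simulate theta_dot_flight(t + tau) = k * theta_gaze(t) with the beam
    locked on a target at a fixed bearing. Returns the time for the flight
    direction to first come within 1 degree of the target bearing."""
    n = int(T / dt)
    theta = np.zeros(n)                       # flight direction over time
    delay = int(tau / dt)
    for i in range(n - 1):
        j = max(0, i - delay)                 # gaze angle sampled tau earlier
        theta[i + 1] = theta[i] + dt * k * (bearing - theta[j])
    within = np.abs(bearing - theta) < np.deg2rad(1)
    return dt * np.argmax(within) if within.any() else np.inf

t_search = align_time(3.2)   # gain ~ search/approach stage
t_attack = align_time(6.3)   # gain ~ attack stage
# the higher attack-stage gain aligns the flight path with the target sooner
```

This matches the interpretation in the Discussion: a larger k couples flight more rigidly, and more quickly, to the target direction.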
Confidence intervals of τ (lag) and r (correlation) values
We used the Fisher transform (Howell, 1997) to compute 95% confidence intervals on the correlation coefficient r. To compute the confidence interval for τmax, we considered the range of τ values for which the experimentally obtained r value was not significantly different from the experimental peak r value (Fig. 7A,C,E, top gray line), as follows:

arctanh(r) − zα/2/√(n − 3) ≤ ρ′ ≤ arctanh(r) + zα/2/√(n − 3),

where ρ′ is the Fisher-transformed population correlation coefficient, zα/2 is 1.96 for 95% confidence limits, and n is the sample size. The confidence intervals on τmax indicate that the τmax values for the different behavioral states do not significantly differ. However, they are all significantly greater than zero.
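The standard Fisher-transform interval, mapped back to the r scale, can be computed directly. A minimal sketch, using the search/approach values (r = 0.77, n = 473) from the Results as an example:

```python
import math

def fisher_ci(r, n, z=1.96):
    """95% confidence interval on a correlation coefficient via the Fisher
    z-transform: the interval on arctanh(r) is +/- z / sqrt(n - 3), then
    mapped back through tanh (standard formula; see Howell, 1997)."""
    zr = math.atanh(r)                 # Fisher transform of the sample r
    half = z / math.sqrt(n - 3)        # half-width on the transformed scale
    return math.tanh(zr - half), math.tanh(zr + half)

lo, hi = fisher_ci(0.77, 473)   # search/approach stage values from the text
```

The back-transform through tanh makes the interval asymmetric about r, which is why it is computed on the transformed scale first.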
Discussion
Here, we report how the relationship between gaze direction and locomotor output changes as an animal progresses from searching behavior to target-directed behavior. We also show a control law linking gaze direction and locomotor output in an acoustically guided animal. We show that during foraging flight, the bat’s acoustic gaze (direction of the sonar beam axis) leads its flight motor output. This relationship may be expressed as a delayed linear law linking acoustic gaze angle with flight turn rate. The gain (slope) of this linkage changes with behavioral state of the bat, as inferred from the repetition rate of its sonar vocalizations.
To the best of our knowledge, this is the first time an adaptive control law linking gaze and locomotion has been described for any species. It requires less energy to maneuver a small part of the body, such as the head or eye, than to change the orientation or direction of motion of the whole animal. By controlling sensory gaze through a part of the body that is light and independently movable, animals are able to conserve energy when redirecting gaze (Oommen et al., 2004). When an animal is not executing target-directed locomotion, therefore, one may expect the gaze direction to be uncoupled from locomotion direction, because the animal may be scanning the environment. Our results show, however, that the linkage between gaze and locomotion in E. fuscus exists during all stages of flight, from search to attack. At low pulse rates, the bat is either searching for a target or has just detected a potential target. Dramatic increases in the bat’s pulse rate have been interpreted as the animal making a decision to pursue a detected target (Kick and Simmons, 1984). We show that the flight turn rate associated with a given gaze angle increases at higher repetition rate stages.
This is the first study to describe a law linking gaze to locomotion in an auditory guided animal. A previous study of spatial memory in bats (Phyllostomus discolor) navigating a very small (1 m diameter) octagonal arena illuminated with visible light suggested that the bat’s head direction leads its flight direction (Höller and Schmidt, 1996). These experiments were explicitly designed to test spatial memory so the animals were very familiar with the arena. The experimenters concluded that vision and spatial memory were more dominant than echolocation in guiding locomotion under the conditions they set up. Höller and Schmidt’s study did not reveal the relationship between acoustic gaze direction and flight control in bats pursuing prey or flying in an extended space. In our study, memory effects were minimized, because the bats were required to fly around and intercept a tethered target placed at random locations and dropped at random times in a large (7.3 × 6.4 × 2.0 m3), empty room. The use of vision in our study was limited by removing light sources visible to the bat (Hope and Bhatnagar, 1979). This paradigm allowed us to test flight guidance by echolocation under different behavioral conditions, which is essential to our conclusions.
There are comparable studies, in insects, showing a similar, but constant, delayed linear linkage between vision and target-directed locomotion. Studies of chasing behavior in flies (Land and Collett, 1974) and walking tiger beetles (Gilbert, 1997) have quantitatively shown a similar delayed linear relationship between visual target location and locomotor output. For the fly, the delay between sensory input and motor output is ∼30 ms (Land and Collett, 1974), and for tiger beetles, the delay is ∼40 ms (Gilbert, 1997). Because of this short latency, a hard-wired visual pursuit system has been proposed for flies that links the output of retinal neurons to flight control neurons with only one interneuron stage (Land and Collett, 1974). Neurons sensitive to the position of appropriately sized visual targets are hypothesized to drive flight motor neurons in direct proportion to retinal position, thereby creating a servo-system to control pursuit. The sensory–locomotor gain for a given fly was considered constant, perhaps a result of the hard-wired nature of the system. The concept of sensory–locomotor gain, as used in these previous studies, although in principle similar to that described in this paper, involves visual target direction and not gaze direction.
Other studies have linked visual gaze direction with locomotor control, although they have not suggested that the linkage is behaviorally adaptable. Field observations of flying birds have led to the hypothesis that a “flying bird follows its beak” (Groebbels, 1929). A study on stationary pigeons demonstrated that there are neck reflexes on wing and tail muscles that cause coordinated movements of wing and tail feathers with deflection of the pigeon’s head (Bilo and Bilo, 1983). It is, however, not known whether such coordinated movements also operate in a flying pigeon and result in a behaviorally adaptable linkage of flight path with visual gaze. Studies in humans have suggested an anticipatory relationship between direction of visual gaze and future locomotion direction for subjects moving along a fixed path (Land and Lee, 1994; Grasso et al., 1998; Hollands et al., 2002), although there is no evidence for behaviorally adaptable gain in this relationship.
The linkage between locomotion and gaze for an echolocating animal, as described in this paper, is similar in principle to that suggested for many visual animals. This finding can be used to consider the generality of theories of sensory–locomotion coordination based on studies of visual animals. For humans, the relationship between gaze direction and motion has been interpreted in the context of visual cues such as optic flow (Wann and Swapp, 2000; Warren et al., 2001; Wilkie and Wann, 2003; Fajen and Warren, 2004). One leading hypothesis is that a visually guided animal steers by centering the focus of expansion (FOE) of optic flow on a locomotor goal (Gibson, 1950, 1966). A theoretical paper by Wann and Swapp (2000) suggests that subjects direct gaze toward a locomotor goal to minimize errors in computing the FOE.
The echolocating bat receives information from the environment in the form of intermittent snapshots of auditory information. An acoustic analog of optic flow information has been proposed for bats that produce constant frequency (CF) sonar signals in combination with frequency-modulated (FM) sweeps (Müller and Schnitzler, 1999). Bats that produce long CF signals are very sensitive to Doppler shifts in the pure tone component of the echo (Schnitzler, 1968; Neuweiler et al., 1980) and may be able to extract flow information from it. FM signals are considered Doppler tolerant (Altes and Titlebaum, 1970), and therefore bats producing only FM signals, like E. fuscus, the species used in these experiments, receive echoes poorly suited to carry Doppler information.
We propose that the relationship between acoustic gaze and locomotion in the bats studied here is not attributable to any analog of optic flow for steering control. In E. fuscus, the sonar beam axis is aligned with the head, which is the auditory reference frame. Researchers who study sensorimotor transformations and those who build mobile robots with movable sensors have grappled with the issue of mapping sensory information into motor commands. Many of these mappings must be learned by the animal or machine (Salinas and Abbott, 1995; Pouget and Snyder, 2000; Cohen and Andersen, 2002). There are often multiple solutions to motor control problems such as locomotion (Bernstein, 1967).
We suggest that the flying echolocating bat constrains and simplifies the conversion of locomotor intention into locomotor action by linking its sensory reference frame to its locomotor output. We speculate that by reducing the gain of the coupling during low signal repetition rate behavioral stages, such as search/approach, the bat is compromising between a complete uncoupling of gaze direction and locomotor output (conserving energy) and maintaining the computational benefits of coupling gaze and locomotion. As the bat progresses toward capturing an insect, as indicated by an increase in PPR, it increases the gain in the linkage between gaze and locomotion, thereby coupling its flight behavior more rigidly to the target position. The bat, therefore, adapts to different behavioral requirements by adjusting one parameter: the gain of the system linking spatial auditory information to flight motor outputs.
Footnotes
- This work was supported by National Institutes of Health (NIH) P-30 Center Grant DC04664, National Science Foundation Grant IBN0111973, NIH Grants R01MH056366 and R01EB004750 to C.F.M., and by a University of Maryland, College Park Psychology Department Jack Bartlett fellowship to K.G. We thank T. Horiuchi for help with the array circuitry and R. Dooling, W. Hodos, P. S. Krishnaprasad, N. Ulanovsky, C. Chiu, and A. Perez for valuable comments on previous versions of this manuscript.
- Correspondence should be addressed to Kaushik Ghose, Department of Psychology, University of Maryland, College Park, MD 20742. Email: kaushik.ghose{at}gmail.com