Figure 14 - uploaded by Ole Næsbye Larsen
The cricket ear is located in the front legs and is a sound receiver with four acoustic inputs. In the case of the right ear, one input is the external surface of the eardrum (IT: ipsilateral tympanum). Sound also propagates through the acoustic trachea from the ipsilateral spiracle (IS) to the inner surface of the eardrum. Finally, sound from the contralateral spiracle (CS) and contralateral ear (CT) may reach the inner surface of the eardrum by propagating through the transverse trachea and the acoustic trachea. Note that the sound from the contralateral inputs has to pass a (double) central membrane (CM) in the transverse trachea. From Michelsen et al (1994) and Michelsen and Löhe (1995). 


Source publication
Article
Directional sound receivers are useful for locating sound sources, and they can also partly compensate for the signal degradations caused by noise and reverberations. Ears may become inherently directional if sound can reach both surfaces of the eardrum. Attempts to understand the physics of such pressure difference receiving ears have been hampere...

Contexts in source publication

Context 1
... and Konishi (1981) for measuring the sound transmission in the interaural canal in barn owls was not flawed by blocking. They used the responses of monaural neurons in the cochlear nucleus to measure and compare the effect of sound presented to the ipsi- or contralateral ear. At 3.5 kHz, the eardrum and the interaural canal attenuated sound by an average of 13 dB, whereas the attenuation was about 50 dB at 6–8 kHz, the frequency range used by barn owls for locating prey. Of course, this method did not provide phase values, but it was sufficient for establishing the possible role of interaural sound transmission in hearing at different frequencies. Several authors have considered the occurrence of 'nulls' (a minimum of eardrum vibration and/or neural response) as evidence for pressure difference reception. Such a minimum may result from almost identical sounds acting on the two surfaces of the eardrum, but it may also be caused by a local minimum of sound pressure due to diffraction and/or reflections of sound. For example, a 30 dB deep minimum in the pressure amplitude at 28 kHz has been observed at the ear contralateral to the sound source when a moth was mounted 1 to 2 mm above a piece of cork in a free sound field. The directional pattern looked like the (B3) pattern in figure 7, but it was caused entirely by the diffraction of sound by the cork plus the animal, and not by any pressure difference properties of the ears (Surlykke and Coro 2004). One difficulty with several earlier studies is that the authors did not describe their set-ups in sufficient detail to give confidence in the quality of the sound field. Moreover, only a few have measured the directional cues in a free sound field. In order to avoid reflections, the animal must be mounted with a few rods that are thinner than one tenth of a wavelength. 
In our experience, mounting rods that are too thick can have dramatic effects on the total diffraction around a preparation, affecting both the amplitude and phase of the sounds reaching the eardrums. A word of warning is in order with respect to the magnitude of a minimum. In an experimental set-up, one can locate a minimum by varying the frequency of the sound and the direction of the sound source, searching for an almost motionless eardrum. The occurrence of a minimum is a property of the system under study, but its magnitude is mainly a measure of the patience of the investigator. A common source of error in several early studies of sound transmission through ears was to neglect mismatches or changes of impedances. Probes (narrow tubes) mounted in front of microphones allow measurements in narrow spaces such as middle ear cavities, but only when the input impedance of the probe is much larger than the source impedance of the cavity. A study of the effect of the interaural canal in chickens carried out by Rosowski and Saunders (1980) illustrates yet another problem with the use of probe microphones. Their cochlear microphonic (CM) measurements showed attenuations of 10–15 dB at 1–5 kHz for sound travelling from the auditory meatus of one ear to the middle ear of the other. These values are close to those obtained by Larsen et al (2006), but Rosowski and Saunders found attenuations of 25–35 dB when using probe microphones. The most likely reason for this difference is that the probe microphones used were of the traditional type with resonances (modern probes are 'one way' and lack resonances). Instead of damping the resonances with resistive material, the authors measured the resonances and used the results to correct the measured values. 
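The probe impedance condition can be illustrated with a simple divider model. This is only a sketch: the lumped-impedance picture and the numerical values below are illustrative assumptions, not data from the studies cited.

```python
# Acoustic analogue of a voltage divider: a probe with input impedance Zp
# loading a source (e.g. a middle ear cavity) with source impedance Zs
# measures only the fraction Zp / (Zp + Zs) of the true pressure.

import math

def measurement_error_db(z_probe, z_source):
    """Error (in dB) introduced by the probe loading the source."""
    fraction = z_probe / (z_probe + z_source)
    return -20.0 * math.log10(fraction)

# When Zp >> Zs the error is negligible; as the ratio shrinks it grows fast.
for ratio in (100.0, 10.0, 1.0):
    err = measurement_error_db(ratio, 1.0)
    print(f"Zp/Zs = {ratio:5.1f}: error = {err:.2f} dB")
```

Even a 10:1 impedance ratio already leaves an error of roughly 0.8 dB, which is why the 'much larger' condition matters for quantitative work.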
Undamped resonances in open tubes depend very much on the impedance at the open end, which is likely to change with the surroundings (in fact, the authors report that small variations of position within the middle ear could produce changes of probe output of as much as 10 dB). A major source of error in studies of hearing in birds was discovered by Larsen et al (1997). They found that many birds fail to ventilate their middle ears during anaesthesia. A 'negative' intracranial air pressure then builds up, and the eardrums are gradually drawn inward towards the middle ears. The eardrums are thus stretched and likely to transmit less sound, especially at low frequencies. This impairs hearing and gives wrong values for the interaural transmission. In the study by Larsen et al (2006) the intracranial air pressure was kept constant by inserting a very thin injection needle into the interaural canal, thus creating conditions similar to those in awake, ventilating birds. It is not known how many earlier studies of bird hearing are flawed by a lack of ventilation. With regard to acoustics, crickets have one of the most complicated hearing organs known. The ears are located in the front legs, just below the 'knee'. Sound acts on the outer surface of the eardrum, and an acoustic trachea connects the inner surface of the eardrum to a spiracle at the lateral surface of the body (figure 14). In addition, a transverse trachea connects the acoustic tracheae at the two sides of the body. The cricket ear is thus an acoustic four-input device (Michelsen et al 1994). Like open organ pipes, the acoustic trachea has one-quarter and three-quarter wavelength resonances. Within the frequency range of 2–20 kHz, the phase delay of the sound reaching the eardrum from the ipsilateral spiracle and acoustic trachea increases by about 18° per kHz (corresponding to the length of the trachea and the reduced propagation velocity already mentioned). 
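These pipe relations can be put into numbers. The trachea length and the reduced in-tube sound speed below are assumed values, chosen only so that the phase slope comes out near the quoted 18° per kHz; they are not measurements from the paper.

```python
import math

L = 0.0131  # assumed acoustic trachea length (m), illustrative
v = 263.0   # assumed (reduced) propagation velocity in the trachea (m/s)

# Open-pipe resonances: f_n = (2n - 1) * v / (4 L), i.e. the quarter- and
# three-quarter-wavelength modes for n = 1, 2.
f_quarter = v / (4 * L)
f_three_quarter = 3 * v / (4 * L)

# Phase accumulated along the trachea: phi(f) = 360 * f * L / v degrees,
# so the slope with frequency is 360 * L / v degrees per Hz.
slope_deg_per_khz = 360.0 * L / v * 1000.0

print(f"quarter-wave resonance:       {f_quarter / 1000:.1f} kHz")
print(f"three-quarter-wave resonance: {f_three_quarter / 1000:.1f} kHz")
print(f"phase delay slope:            {slope_deg_per_khz:.1f} deg/kHz")
```

With these assumptions the slope lands at about 18° per kHz and the quarter-wave resonance near 5 kHz, close to the cricket's calling-song range.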
In contrast, the amplitude and phase of the sound transmitted from the contralateral spiracle (CS in figure 14) through the transverse trachea change dramatically with frequency between 3 and 10 kHz (figure 15). The phase delay is well above 100° per kHz, and the amplitude increases by a factor of 4 between 4 and 4.5 kHz, a behaviour similar to that of an 8-pole band-pass filter. The physical mechanism behind this filter is not known, but the central membrane (CM in figure 14) in the transverse trachea is important for its function (Michelsen and Löhe 1995). A hole of 10–25% of its area is sufficient to change the transmission properties to become similar to those observed in the acoustic trachea. This also changes the directional pattern (compare (A) and (B) in figure 16). Female crickets walk towards males singing the pure-tone calling song (about 4.5 kHz in the cricket Gryllus bimaculatus). The female finds the direction of the male by means of the cardioid directionality pattern of the eardrum (figure 16(A)). Within a narrow band of frequencies around 4.5 kHz, the forward gradient is about 10 dB (figure 16(C)). The reason for this very narrow tuning can be seen in figure 17, in which the sound proportional to the force acting on the eardrum (P) is calculated by adding three vectors (the amplitude of the fourth vector, the sound from the other ear, is so small that it was ignored). Obviously, the two largest vectors (IT and IS) more or less oppose each other at all directions of sound incidence, whereas the third and smallest vector (CS) may 'help' either IT (e.g., at 270°) or IS (e.g., at 60°). In order for the addition to result in the cardioid pattern shown, a proper phase relationship has to exist between CS and the other two vectors. The phase of CS changes dramatically with frequency (figure 15), and a proper phase relationship therefore exists only within a narrow frequency range. 
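The phasor bookkeeping can be sketched with an idealized two-input version of such a receiver (the full cricket case sums three vectors with measured amplitudes and phases). The input separation and the fixed internal delay below are illustrative assumptions, not the cricket's measured values.

```python
import cmath
import math

def eardrum_drive(theta_deg, f=4500.0, d=0.012, c=344.0):
    """|P(theta)|: phasor sum of the external and the delayed internal sound.

    theta_deg: direction of sound incidence; d: assumed acoustic path between
    the two inputs (m). The internal path is assumed to add a fixed delay of
    d/c, which produces a cardioid-like pattern for wavelengths >> d.
    """
    theta = math.radians(theta_deg)
    w = 2 * math.pi * f
    x = w * d * math.cos(theta) / (2 * c)   # direction-dependent phase
    phi = w * d / c                         # fixed internal delay, in radians
    p = cmath.exp(1j * x) - cmath.exp(-1j * (x + phi))
    return abs(p)

# Cardioid-like behaviour: maximal drive in front, a null at the rear.
front, side, back = (eardrum_drive(a) for a in (0.0, 90.0, 180.0))
print(f"front/side gradient: {20 * math.log10(front / side):.1f} dB")
print(f"rear response: {back:.2e}")  # essentially zero: the rear null
```

The null at 180° exists only because the direction-dependent phase and the internal delay cancel there; detune either one, as the cricket's transverse trachea does away from 4.5 kHz, and the cardioid collapses.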
In the model calculations of a simple two-input system, we showed that the forward gradient is tuned to an optimum of phase delay (figures 12 and 13), and that at 6 dB below the maximum the tuning curves have a bandwidth of about 100°. For an animal of the size of a cricket, this would correspond to a frequency range of about 6 kHz. The tuning of the forward gradient in the cricket has a bandwidth of only about 1 kHz (figure 16(C)), because the phase delay of the sound arriving at the ear via the transverse trachea changes by well above 100° per kHz. At the time of its discovery, the narrow tuning of the cricket directional hearing was thought to be unique (Michelsen 1998b), but we can now see that it is really a special case of a basic phenomenon. The question remains, however, why such narrowly tuned directional hearing has evolved in crickets. The answer may be found in the acoustic properties of the habitats of field crickets, which are discussed in section 11. In passing, two additional aspects should be mentioned. The acoustical data used in the analysis presented above were obtained from crickets in which the acoustic spiracles were forced open. The spiracular opening is hidden by a cuticular plate, which was fastened in an open position during the experiments. Tiny muscles are attached to the plate. One may speculate that listening crickets vary the position of the plate and thus the sound entry through the acoustic spiracles. Such efferent control would make the hearing organ of the cricket even more remarkable. The second aspect is the robot cricket. Robots mimicking aspects of living animals are built for a variety of purposes, mainly as biological models or for use in engineering. 
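The bandwidth argument above can be restated as a one-line calculation: a tuning width of about 100° of phase delay corresponds to a frequency bandwidth of (100°) divided by the phase slope. The slopes used below are the approximate figures quoted in the text.

```python
def bandwidth_khz(phase_tuning_deg=100.0, slope_deg_per_khz=18.0):
    """Frequency bandwidth (kHz) spanned by a given width in phase delay."""
    return phase_tuning_deg / slope_deg_per_khz

# Ipsilateral route (~18 deg/kHz): broad tuning, near the ~6 kHz quoted.
print(f"{bandwidth_khz(100, 18):.1f} kHz")   # ~5.6 kHz
# Transverse-trachea route (well above 100 deg/kHz): ~1 kHz or narrower.
print(f"{bandwidth_khz(100, 100):.1f} kHz")  # an upper bound of ~1 kHz
```

The steep phase slope of the transverse trachea is thus, by itself, enough to shrink the directional bandwidth from several kilohertz to about one.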
The robot cricket was built in order to increase our understanding of the processing of auditory information in the central nervous system that allows animals to identify signals against a noisy background, recognize signals from a particular species, and localize and move towards their source (Webb 2001). The work with the robot cricket has been very successful in realizing these goals and has provided valuable information about the minimum central nervous connections necessary (Lund et al 1997, Reeve and Webb 2002). The directional characteristics of the ears have been simulated by arrays of microphones, but the investigators have not been interested in the specific acoustics of crickets' hearing organs. The latest robot (Torben-Nielsen et ...
Context 2
... The latest robot (Torben-Nielsen et al 2005) is equipped with four microphones that are said to be 'mimicking the four sound entrances'. It remains to be seen whether the robot cricket will be useful in exploring the strategies for localization of sound sources in complex sound fields. Little is known about how well the mechanisms of directional hearing work in natural habitats. One exception was the demonstration by Rheinlaender and Römer (1986) that usable directional cues may be absent for some of the animals that can hear conspecific sound signals in dense vegetation. Their experiments were performed with a small neurophysiological set-up, which could be moved without ...
Context 3
... Their experiments were performed with a small neurophysiological set-up, which could be moved without interruption of the recordings of nerve impulses from the central nervous system. In one experiment, they recorded the activity of two interneurons, which signalled the responses to sound from the left and right ears of a bush cricket, respectively, to the brain. The bush cricket received 20 kHz sound pulses from a loudspeaker, which was 10 m from the animal and 1.5 m above the ...

Similar publications

Conference Paper
Hearing is an extraordinary sense: the audio signals perceived at the eardrums of both ears build and define the entire surrounding environment. Sound is captured from all directions, and the listener can identify the distance of the sound source, the amplitude, the arrival time at each ear and the variations of amplitude that differ with cha...

Citations

... These differences may pose a challenge for small animals, especially insects, since they are extremely small. However, many small animals demonstrate acute sound source localization ability [5][6][7]. The parasitic fly Ormia ochracea is such an insect. ...
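The scale of the problem can be checked with a one-line calculation: the largest possible interaural time difference (ITD) is roughly the ear separation divided by the speed of sound. The separations used below are order-of-magnitude assumptions (a ~20 cm human head, a ~0.5 mm spacing for Ormia), not values taken from the cited papers.

```python
# Rough sketch of why interaural time differences are so small for insects:
# the maximum ITD scales with ear separation.  Separations are assumed,
# order-of-magnitude figures.
C_AIR = 344.0  # m/s, speed of sound in air

def max_itd_us(ear_separation_m):
    """Largest possible ITD (sound along the interaural axis), microseconds."""
    return ear_separation_m / C_AIR * 1e6

human = max_itd_us(0.20)    # ~20 cm head width (assumed)
fly = max_itd_us(0.0005)    # ~0.5 mm between Ormia's ears (approximate)
print(round(human), round(fly))  # the fly's cue is hundreds of times smaller
```

A cue of a microsecond or two is far below what ordinary neural timing can resolve directly, which is why Ormia's mechanically coupled ears have attracted so much attention.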
... This is supported by a numerical simulation of sound propagation in the right EC, which yielded a velocity of 186.21 m/s (for model conditions and assumptions, see STAR Methods). This strongly suggests that E. handlirschi utilized a pressure-difference receiver (acoustic resistor) based on a delayed internal input [54]. ...
Article
An Eocene insect could hear conspecific ultrasounds and bat echolocation

Highlights:
- A 44-million-year-old amber fossil katydid reveals exquisite ear preservation
- Biophysics of wings reveals this species utilized ultrasounds for communication
- Modeling of auditory range demonstrates tuning to male sexual signal, as well as to bat cries
- Ultrasound discrimination in insects was established by the Eocene

Authors: Charlie Woodrow, Emine Celiker, Fernando Montealegre-Z
Correspondence: charlie.woodrow@ebc.uu.se (C.W.), fmontealegrez@lincoln.ac.uk (F.M.-Z.)

In brief: Woodrow et al. show auditory tuning to male acoustic signals and extended ultrasonic hearing for predator detection in the ear of an Eocene katydid. This remarkable fossil pushes back the evolution of complex auditory processing in insects and suggests that acoustic communication strategies in katydids diversified during the emergence of echolocating bats.

SUMMARY: Hearing has evolved independently many times in the animal kingdom and is prominent in various insects and vertebrates for conspecific communication and predator detection. Among insects, katydid (Orthoptera: Tettigoniidae) ears are unique, as they have evolved outer, middle, and inner ear components, analogous in their biophysical principles to the mammalian ear. The katydid ear consists of two paired tympana located in each foreleg. These tympana receive sound externally on the tympanum surface (usually via pinnae) or internally via an ear canal (EC). The EC functions to capture conspecific calls and low frequencies, while the pinnae passively amplify higher-frequency ultrasounds including bat echolocation. Together, these outer ear components provide enhanced hearing sensitivity across a dynamic range of over 100 kHz. However, despite a growing understanding of the biophysics and function of the katydid ear, its precise emergence and evolutionary history remains elusive. Here, using micro-computed tomography (µCT) scanning, we recovered geometries of the outer ear components and wings of an exceptionally well-preserved katydid fossilized in Baltic amber (~44 million years [Ma]). Using numerical and theoretical modeling of the wings, we show that this species was communicating at a peak frequency of 31.62 (± 2.27) kHz, and we demonstrate that the ear was biophysically tuned to this signal and to providing hearing at higher-frequency ultrasounds (>80 kHz), likely for enhanced predator detection. The results indicate that the evolution of the unique ear of the katydid, with its broadband ultrasonic sensitivity and analogous biophysical properties to the ears of mammals, emerged in the Eocene.
... However, the amplitude difference at the two ears, as well as the timing difference, may be very small even in mammals [3], not to mention tinier creatures such as insects. To solve this problem, the two ears often become mechanically or acoustically connected, thus creating a pressure-difference receiver, which amplifies tiny acoustic cues into larger interaural differences that can be detected by the nervous system [4][5][6]. ...
Article
Simple Summary: Mosquitoes possess one of the best-developed and most sensitive hearing systems among insects. Their auditory Johnston's organs, located at the antennae bases, include several thousand radially distributed sensory cells. Male mosquitoes use their hearing for acoustic courtship behavior, while the function of hearing in blood-sucking female mosquitoes is poorly studied. In addition to courtship behavior, hearing is presumed to be used for host detection, including the use of human voices as an attraction cue. Since mosquitoes spread dangerous diseases such as West Nile fever, understanding their hearing system is of crucial importance. We studied the auditory system of Culex pipiens female mosquitoes using behavioral and electrophysiological experiments and created a three-dimensional model of the mosquito auditory space. The in-flight position of the antennae was found to be optimal for binaural hearing focused primarily in front of, above and below a mosquito. By varying the antennae position a mosquito can adjust the directional properties of its hearing depending on behavioral context. According to our findings, the auditory system of female mosquitoes has enough resolution to estimate the direction to a sound source, while its frequency range enables detection of sounds produced by other flying mosquitoes and human hosts.

Abstract: The task of directional hearing faces most animals that possess ears. They approach this task in different ways, but a common trait is the use of binaural cues to find the direction to the source of sound. In insects, the task is further complicated by their small size and, hence, minute temporal and level differences between the two ears. A single symmetric flagellar particle velocity receiver, such as the antenna of a mosquito, should not be able to discriminate between the two opposite directions along the vector of the sound wave. Paired antennae of mosquitoes presume the usage of binaural hearing, but its mechanisms are expected to be significantly different from those typical for pressure receivers. However, the directionality of flagellar auditory organs has received little attention. Here, we measured the in-flight orientation of antennae in female Culex pipiens pipiens mosquitoes and obtained a detailed physiological mapping of the Johnston's organ directionality at the level of individual sensory units. By combining these data, we created a three-dimensional model of the mosquito's auditory space. The orientation of the antennae was found to be coordinated with the neuronal asymmetry of the Johnston's organs to maintain a uniformly shaped auditory space, symmetric relative to a flying mosquito. The overlap of the directional characteristics of the left and right sensory units was found to be optimal for binaural hearing focused primarily in front of, above and below a flying mosquito.
... In addition, the pinnae experiments do not consider the air in the tympanic cavities, but are static models of the sub-slit cavities, thus real pinnae resonances may differ through coupling of the tracheal branches beneath the tympanum. If both the pinnae and the tracheae are able to function as auditory inputs around the calling song frequency, we may infer that the pressure-difference receiver function of the ear provides strong directional cues, but these cues would then vary with leg position, making further neuronal processing challenging (Jonsson et al., 2016;Michelsen and Larsen, 2008). For higher frequencies, where the trachea is likely to provide a sound pressure loss (Celiker et al., 2020), the ears are more likely to function as simple pressure receivers. ...
... However, at low frequencies (below about 10 kHz for most animals) the amplitude difference at the two ears, as well as the timing difference, may be very small even in mammals (Köppl, 2009), not to mention tinier creatures like insects. To solve this problem, the two ears become mechanically or acoustically connected, thus creating a pressure-difference receiver, which amplifies tiny acoustic cues into larger interaural differences that can be effectively processed by the nervous system (Miles et al., 1995; Michelsen and Larsen, 2007; Römer, 2015). Only the fraction of sensory cells aligned with the vector of a given acoustic wave will generate a significant response to it. ...
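A toy frequency-domain model illustrates the amplification: for a plane wave, the external amplitudes at the two ears of a small animal are essentially identical, yet a difference receiver with an internal path responds asymmetrically. The frequency, input spacing, internal gain and delay below are assumptions chosen only for illustration, not parameters of any particular species.

```python
import numpy as np

# Minimal sketch of how a two-input pressure-difference receiver turns a
# tiny interaural phase difference into a larger interaural response
# difference.  All parameters are illustrative assumptions.

F = 5000.0           # Hz
C = 344.0            # m/s, free-field sound speed
SEP = 0.01           # m, assumed distance between the two sound inputs
G_INT = 0.9          # assumed internal transmission gain
T_INT = SEP / 255.0  # assumed internal delay (slower internal propagation)

def drive(azimuth):
    """|p_outer - g * p_inner(delayed)| at one ear for a plane wave."""
    d_outer = +SEP / 2 * np.sin(azimuth) / C  # arrival at this ear's input
    d_inner = -SEP / 2 * np.sin(azimuth) / C  # arrival at the far input
    p_outer = np.exp(-2j * np.pi * F * d_outer)
    p_inner = np.exp(-2j * np.pi * F * (d_inner + T_INT))
    return abs(p_outer - G_INT * p_inner)

az = np.radians(30.0)
left, right = drive(az), drive(-az)
# External amplitudes at the two ears are identical for a plane wave,
# yet the difference receiver responds asymmetrically:
ratio_db = 20 * np.log10(max(left, right) / min(left, right))
print(round(ratio_db, 1))
```

Even with these modest placeholder values, the left/right response ratio comes out at several dB, a cue large enough for binaural neural comparison where the raw external amplitude difference is essentially zero.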
Preprint
The task of directional hearing faces most animals that possess ears. They approach this task in different ways, but a common trait is the use of binaural cues to find the direction to the source of sound. In insects, the task is further complicated by their small size and, hence, minute temporal and level differences between the two ears. A way to overcome this problem is to receive the particle velocity component of sound rather than the pressure, as the former naturally involves directionality. However, even in this case, one ear is not enough for directional hearing: a single symmetric flagellar particle velocity receiver cannot discriminate between the two opposite directions along the vector of the sound wave. Insects that use flagellar auditory organs, and mosquitoes in particular, possess a pair of receivers, which presumes the usage of binaural hearing. Its mechanisms are expected to be significantly different from those typical for pressure receivers. However, the directionality of flagellar auditory organs has received little attention. Here we measured the in-flight orientation of female mosquito antennae and obtained a detailed physiological mapping of the Johnston's organ directionality at the level of individual sensory units. By combining these data, we provide a three-dimensional model of the mosquito's auditory space. The natural orientation of the antennae, together with the angular distribution of sensory units in each of the Johnston's organs, was found to be optimal for binaural hearing focused primarily in front of, above and below a flying mosquito.
... BB.1 (Sensing) contains articles dealing with the design and modeling of sensory organs, such as [43] presenting a model of the lateral line of fish to investigate their behavior when affected by external flow fields, [44] presenting novel optical design methods and characterizations in order to study various compound eye concepts fabricated by micro-optics technology and [45] presenting experimental methods and models in order to study the physics of pressure difference receiving ears. ...
Article
The number of published scientific articles is increasing dramatically and makes it difficult to keep track of research topics. This is particularly difficult in interdisciplinary research areas where different communities from different disciplines are working together. It would be useful to develop methods to automate the detection of research topics in a research domain. Here we propose a natural language processing (NLP) based method to automatically detect topics in defined corpora. We start by automatically generating a global state of the art of Living Machines conferences. Our NLP-based method classifies all published papers into different clusters corresponding to the research topic published in these conferences. We perform the same study on all papers published in the journals Bioinspiration & Biomimetics and Soft Robotics. In total this analysis concerns 2099 articles. Next, we analyze the intersection between the research themes published in the conferences and the corpora of these two journals. We also examine the evolution of the number of papers per research theme which determines the research trends. Together, these analyses provide a snapshot of the current state of the field, help to highlight open questions, and provide insights into the future.
... Such binaural auditory systems must satisfy three requirements to function: (1) the distance between the ears must be sufficient to produce recognisable differences in sound arrival time; (2) the ears must be separated by an anatomical structure which is large enough to attenuate sound between them; (3) the ears must be neurologically coupled in order to calculate time and amplitude differences (Brown, 1984;Christensen-Dalsgaard et al., 2021;Christensen-Dalsgaard and Manley, 2005;Lakes-Harlan and Scherberich, 2015;Lauer et al., 2018;Suga, 1989;Zaslavski, 1999). However, animals such as insects are too small to exploit diffractive effects of sound on their bodies to perceive minute differences in sound delays and intensities (Michelsen and Larsen, 2008). As a result, vastly different species have convergently evolved separate mechanisms of hearing to fulfil similar functions (Göpfert and Hennig, 2016;Köppl et al., 2014;Robert, 2005;Warren and Nowotny, 2021), including the detection of ultrasonic frequencies (Strauß et al., 2014). ...
... Thus, multiple pathways provide the interaural phase differences to reliably encode the angle of the sound source. The katydid ear therefore functions as a pressure-time difference receiver (Michelsen and Larsen, 2008; Robert, 2005; Veitch et al., 2021), unlike the mammalian ear, which functions as a single-input pressure receiver via the ear canal. ...
Article
Early predator detection is a key component of the predator-prey arms race and has driven the evolution of multiple animal hearing systems. Katydids (Insecta) have sophisticated ears, each consisting of paired tympana on each foreleg that receive sound both externally, through the air, and internally via a narrowing ear canal running through the leg from an acoustic spiracle on the thorax. These ears are pressure-time difference receivers capable of sensitive and accurate directional hearing across a wide frequency range. Many katydid species have cuticular pinnae which form cavities around the outer tympanal surfaces, but their function is unknown. We investigated pinnal function in the katydid Copiphora gorgonensis by combining experimental biophysics and numerical modelling using 3D ear geometries. We found that the pinnae in C. gorgonensis do not assist in directional hearing for conspecific call frequencies, but instead act as ultrasound detectors. Pinnae induced large sound pressure gains (20–30 dB) that enhanced sound detection at high ultrasonic frequencies (>60 kHz), matching the echolocation range of co-occurring insectivorous gleaning bats. These findings were supported by behavioural and neural audiograms and pinnal cavity resonances from live specimens, and comparisons with the pinnal mechanics of sympatric katydid species, which together suggest that katydid pinnae primarily evolved for the enhanced detection of predatory bats.
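The 20-30 dB pinnal sound pressure gains reported in this abstract can be translated into linear pressure ratios with the standard decibel relation for pressure (20·log10 of the ratio); this is a sketch of the arithmetic only, not of the authors' analysis.

```python
# Convert reported pinnal gains (dB re. sound pressure) to linear
# pressure amplification factors: dB = 20 * log10(p / p0).
def db_to_pressure_ratio(db):
    return 10 ** (db / 20)

print(db_to_pressure_ratio(20))             # 10-fold pressure amplification
print(round(db_to_pressure_ratio(30), 1))   # ~31.6-fold
```

In other words, the reported gains correspond to the tympanal surface receiving roughly ten to thirty times the incident sound pressure at high ultrasonic frequencies.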
... If the eardrum is backed by a closed cavity, as in Fig. 4a, such an ear would be a pressure receiver, generally being more sensitive than the naked sensillum, but non-directional since pressure is a scalar (Michelsen and Larsen 2008). Fig. 4 caption: Different types of sound receivers, in a more modern representation than that of Fig. 3: (a) sensory hair, (b) pressure-receiver ear, (c) pressure-gradient receiver with two membranes at a small distance and a sensory template with a sensory organ (top; cf. ...
... 15). The above plot is based on a figure in Beranek (1954); see also Michelsen and Larsen (2008). A careful comparison with Fig. 3 is worthwhile. Accordingly, the CNS needs to compute the direction of sound by other means, usually through binaural comparison. ...
... The simplest configuration could just be the two membranes of Fig. 4c. The template in the middle would be driven by the instantaneous pressure difference between sound components on its two sides; the left and right faces of the membranes will translate into template motion, resulting in a figure-eight directional pattern with low membrane amplitudes from frontal and caudal directions; see Michelsen and Larsen (2008) for a more formal treatment. ...
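The figure-eight pattern mentioned here follows from the cosine dependence of the pressure gradient on the angle between the sound direction and the receiver axis; a minimal sketch of that ideal directivity (the real membrane geometry is of course more complicated):

```python
import numpy as np

# Ideal figure-eight directivity of a pressure-gradient receiver: for a
# receiver small compared with the wavelength, the pressure difference
# across the template is proportional to cos(theta), theta being the angle
# between the sound direction and the receiver axis.
def gradient_response(theta_rad):
    return abs(np.cos(theta_rad))

# Maximum drive along the axis; a null broadside to it (the "frontal and
# caudal" directions in the text, for a transversely oriented template):
print(gradient_response(0.0))
print(round(gradient_response(np.pi / 2), 10))
```

The two lobes of the |cos θ| pattern are what give the receiver its characteristic figure-eight shape, with low amplitudes at the two null directions.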
... These results show that the traditional paradigm of success, dominated by numbers of personnel, tanks and artillery, is being displaced by automation and computer-integrated technologies, which aim to maximize the preservation of personnel [1][2][3]. However, further development and practical application require small-sized means of determining the coordinates of a sound anomaly (SA) [4][5] and an assessment of the maximum possible error [6][7][8][9]. This task is becoming especially important and relevant for devices installed on mobile platforms such as unmanned aerial vehicles (UAVs), ground unmanned vehicles (GUVs) or ground unmanned combat vehicles (GUCVs) [1][2][3]. ...
... In addition, the possible use of small-scale coordinate-determination devices to improve monitoring systems for civilian use [4][5] requires methods for estimating the maximum possible error and measures to reduce it. ...
... Reference [3] also addresses the problem of estimating the error and its impact on reliable data entry into the electronic map of the fire department. The development of acoustic means of detecting small-arms fire, their classification and the formulation of design requirements are presented in [4]. However, the issues of choosing a metrological scheme and determining the error were not raised or resolved there. ...
... Crickets have one of the most complicated hearing organs known (Michelsen and Larsen, 2008). The ears are located in the forelegs, just below the joint between the femur and tibia. ...
... This connection adds two more sound inputs to the ear from the contralateral ear and from the contralateral spiracle. The cricket ear is thus an acoustic four-input device (Michelsen and Larsen, 2008;Michelsen et al., 1994) and, while the sound inputs from the spiracles and tympana are similar in some respects to those found in tettigoniids, the interactions between the four inputs in crickets adds complexity for researchers too. ...
... The propagation velocity inside the acoustic tracheal tubes has been measured in Teleogryllus commodus by Larsen (1981) as 263 m s −1 and in Gryllus bimaculatus by Michelsen et al. (1994) as 264 m s −1 . These figures are in very close agreement and are well below 344 m s −1 , which is the velocity of sound in adiabatic propagation (Michelsen and Larsen, 2008). ...
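The practical consequence of the slower tracheal propagation can be estimated: over an assumed trachea length (~1.5 cm is a round illustrative figure, not a measured value) the internal path accrues extra delay relative to free-field propagation, which appears as a phase shift at the calling-song frequency (4.7 kHz is assumed here as a typical cricket value).

```python
# Extra internal delay from the slower tracheal sound velocity, and the
# corresponding phase shift at the calling-song frequency.  The trachea
# length and song frequency are assumed round figures.
C_FREE = 344.0     # m/s, adiabatic propagation in open air
C_TRACHEA = 264.0  # m/s, measured in Gryllus bimaculatus (Michelsen et al 1994)
LENGTH = 0.015     # m, assumed acoustic trachea length (~1.5 cm)
F_SONG = 4700.0    # Hz, assumed calling-song frequency

extra_delay = LENGTH / C_TRACHEA - LENGTH / C_FREE  # seconds
extra_phase_deg = 360.0 * F_SONG * extra_delay
print(round(extra_delay * 1e6, 1), round(extra_phase_deg, 1))
```

Even a delay of this order (tens of microseconds, i.e. some tens of degrees of phase at the song frequency) is significant for a pressure-difference receiver, since the directionality depends on the phase relation between the sounds acting on the two surfaces of the eardrum.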
Chapter
Acoustic communication is one of the most well-known behavioural traits of the Orthoptera. Orthopteran insects and the sounds they produce are both extremely diverse, and species-specific sounds are an extremely important tool in orthopteran taxonomy and systematics. For most species, acoustic signalling is the most important means of communication. It plays a vital role in mating, mate choice, intrasexual competition, interspecific interactions with predators and parasitoids, and the divergence of populations and species. The enormous diversity of the orthopterans has provided researchers with a wealth of model systems for studying anatomy, physiology, neurobiology, bioacoustics, communication, life-history traits, behaviour, evolutionary ecology, and speciation, all areas in which acoustic communication is important. We first reviewed orthopteran sound signalling nearly 20 years ago (Robinson and Hall, 2002), and there has been an enormous amount of further work since then. This second review will look mainly at research published since the first.