History and Evolution of the Tuning Fork

Review began 12/15/2023
Review ended 12/28/2023
Published 01/01/2024
© Copyright 2024
Eraniyan et al. This is an open access
article distributed under the terms of the
Creative Commons Attribution License CC-
BY 4.0., which permits unrestricted use,
distribution, and reproduction in any
medium, provided the original author and
source are credited.
Keerthi Eraniyan , Latha Ganti
1. Biomedical Sciences, University of Central Florida, Orlando, USA 2. Emergency Medicine & Neurology, University of
Central Florida College of Medicine, Orlando, USA 3. Medical Science, The Warren Alpert Medical School of Brown
University, Providence, USA
Corresponding author: Latha Ganti, latha_ganti@brown.edu
Abstract
This paper is a summary of the evolution of the tuning fork, a crucial part of the cranial nerve and auditory
examination. The tuning fork is a two-pronged fork that resonates at a specific pitch when struck against a
surface and has proven incredibly useful in diagnosing and detecting hearing disorders. The
tuning fork, an unassuming device in modern medicine, traces its origins back to an era when scientific
understanding and medical diagnostics were in their nascent stages. Since its inception, this unpretentious
instrument has played a pivotal role in the hands of healthcare practitioners, aiding in the diagnosis and
assessment of various medical conditions. This paper embarks on a captivating journey through time to
explore the origin, evolution, and significant milestones in the development of the tuning fork. From the
first suggestion of differentiating hearing disorders to present-day tuning forks, this paper maps the
different stages that the tuning fork has gone through and how its use has changed over time. Along the way,
we will discover how the tuning fork has harmonized with music, medicine, and various scientific pursuits,
enriching our understanding of sound and resonance while leaving an indelible mark on the course of
human history. Delving into the historical context of its creation, this review uncovers the ingenious minds
that birthed this innovative device and the pivotal moments that brought it to the forefront of human
endeavors.
Categories: Neurology, Medical Education
Keywords: medical instruments, history of neurology, medical education, neurologic examination, tuning fork
Introduction And Background
From its initial use as a musical tool to its incorporation into neurology, audiology, and beyond, the tuning
fork's journey in medicine is a testament to human ingenuity and our unyielding pursuit of knowledge. This
review aims to provide a thorough understanding of the tuning fork's historical significance in medicine,
elucidating its continued relevance as a diagnostic aid in the ever-evolving landscape of healthcare. The
tuning fork’s distinctive ability to produce a pure, consistent tone quickly garnered the appreciation of
musicians. However, its applications transcended the realm of music and promptly found their way to the
dynamic world of medicine. As medicine and technology advanced, so did the tuning fork's applications. The
vibration and sound emitted by the tuning fork are used to assess sound conduction through bone [1].
Employed frequently in the Weber and Rinne tests to diagnose hearing disorders, the tuning fork gradually
acquired its pivotal role in the auditory examination. The Weber test differentiates sensorineural hearing
loss from conductive hearing loss [2]. The Rinne test assesses for conductive hearing loss [3]. Sensorineural
hearing loss is caused by damage to the neural pathway to the auditory cortex, while conductive hearing loss
is caused by damage to the outer or middle ear. Using the same principle, the tuning fork is also used to
detect fractures, since fractured bone conducts sound less well [4-6].
Review
The transformative historical narrative of the tuning fork unfolded over nearly five centuries. The tuning fork’s
roots can be traced back to the sixteenth century, when medical practitioners were beginning to realize the
importance of observing bone conduction. It was 1546 when anatomist Giovanni Filippo Ingrassia first
described the stapes, the auditory ossicle that conducts sound waves [7]. Four years later, Italian physician
Gerolamo Cardano proposed a novel concept: that sound could be transmitted through the solid skull bone to
reach the inner ear. This phenomenon would later be known as bone conduction, though it had not yet been
formally named (Figure 1) [8].
Open Access Review Article. DOI: 10.7759/cureus.51465
How to cite this article: Eraniyan K, Ganti L (January 01, 2024) History and Evolution of the Tuning Fork. Cureus 16(1): e51465. DOI: 10.7759/cureus.51465
FIGURE 1: Infographic depicting the early history of the tuning fork
Image created by Keerthi Eraniyan
The first application of this phenomenon to differentiating hearing disorders came from Hieronymus
Capivacci in 1589. Before the invention of the tuning fork, medical researchers and practitioners had to
innovate tools to mimic the vibration and consistent pitch that the tuning fork emitted. Capivacci placed one
end of a rod against the vibrating string of a musical instrument called the zither and placed the other end
between a patient's teeth. If the patient was able to hear the tone, he diagnosed them as having a hearing
disorder resulting from damage to the tympanic membrane. If the patient was not able to hear the tone, the
hearing disorder was diagnosed as damage to the innermost part of the ear. It was not until 1603 that bone
conduction acquired a name, coming to be frequently referred to as “Ingrassia’s phenomenon.” Ingrassia had
discovered that a vibrating table fork could be heard when pressed against the teeth, but this observation
was only published in 1603, 23 years after his death; its publication is dubbed the birth of bone
conduction. This idea of using a
table fork to elicit pitch and vibrations was expanded on by G.C. Schellhammer, who ingeniously utilized a
simple table fork to distinguish between different types of hearing disorders based on patients' responses to
auditory vibrations. He held the stem of a vibrating table fork to a patient’s teeth: in one instance, the fork
was touching the teeth, and in the other, it was simply held in the mouth. Through this test, Schellhammer
discovered that sound conveyed through the teeth was transmitted through the cranial bones, not the
Eustachian tube [8-11].
It was 1711 when the tuning fork was invented. Musician John Shore was originally a renowned trumpeter,
who performed in the courts of King James II and played for George Frideric Handel. However, he split his lip
after a concert and was never able to play the trumpet again. A musician at heart, Shore switched to playing
the lute, a string instrument that required regular tuning. To play it well, he faced the challenge of tuning
it accurately, an essential step toward achieving harmonious melodies.
With his ingenious spirit, Shore set out to create a solution, resulting in the invention of the tuning fork.
This innovative device, with its ability to produce a consistent and precise pitch, allowed him to tune the
lute with ease. Shore’s original tuning fork gave a pitch at 440 Hz, which is a musical note of A4. It began as a
small steel fork with a stout stem and two flat, elongated prongs, which upon vibration produced a constant
musical pitch. As Shore used his tuning fork before each of his concerts, the tuning fork found its place as a
musical instrument employed in concert halls, churches, and chamber music ensembles throughout Europe.
As the tuning fork became more widely used, the need for standardization of pitch became apparent. In 1834,
German physicist Johann Heinrich Scheibler devised a method to precisely manufacture forks with different
frequencies and created a set of 54 tuning forks ranging from 220 Hz to 440 Hz in 4 Hz intervals. The
‘scientific’ pitch was 512 Hz or a musical note of C5. This frequency came to be the standard pitch used in
medicine [12].
The tuning fork reached its medical spotlight in 1834 when the Weber test was created. The Weber test was
created to distinguish between conductive and sensorineural hearing loss in a patient with impaired hearing.
That is, the patient heard worse in one ear than in the other. Weber placed a struck tuning fork at any
point in the midline of the skull. If the tone was heard in the worse-hearing ear, the patient was diagnosed
with conductive hearing loss; if the tone was heard in the opposite ear, with sensorineural hearing loss.
Despite its importance in diagnosing hearing disorders, the Weber test was not adopted by medical
practitioners until almost a decade later, when French military physician Jean Pierre Bonnafont promulgated the Weber test as
a diagnostic assessment [6-7].
While Weber’s test assessed hearing loss in both ears, Rinne’s test would come to assess hearing loss in one
ear in 1855. Rinne’s test differentiated between air conduction and bone conduction. It was performed by
holding a vibrating tuning fork to the mastoid bone behind the ear to conduct sound through the bone, and
then in front of the external ear to conduct sound through the air. In a patient with healthy ears, Rinne
realized that the tuning fork was heard better when merely placed next to the ear. In other words, one
without a hearing disorder would have better air sound wave conduction than bone sound wave conduction.
If bone conduction is better than air conduction, the patient has conductive hearing loss. If someone has
impaired hearing yet passes the Rinne test, an auditory nerve disorder is implied, because the entire
conduction apparatus appears to be intact. It was with the creation of the Rinne and Weber tests that
medical practitioners began to realize the unparalleled importance of the tuning fork [12-13].
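The combined interpretation of the two tests can be summarized as a simple decision table. The sketch below is a didactic simplification (the function and its labels are our own, not a clinical standard; real-world interpretation must also account for masking, bilateral loss, and mixed deficits):

```python
def interpret_tuning_fork_tests(weber_lateralizes_to, rinne_air_better):
    """Didactic sketch of combined Weber/Rinne interpretation for a
    patient who reports worse hearing in one ear.

    weber_lateralizes_to: "worse_ear", "better_ear", or "midline"
    rinne_air_better: Rinne result in the worse ear
        (True = air conduction heard better than bone conduction)
    """
    if weber_lateralizes_to == "worse_ear" and not rinne_air_better:
        # Sound lateralizes to the affected ear and bone beats air:
        # the conduction apparatus is at fault.
        return "conductive hearing loss in the worse ear"
    if weber_lateralizes_to == "better_ear" and rinne_air_better:
        # Sound lateralizes away and air conduction remains better:
        # the cochlea or auditory nerve is implicated.
        return "sensorineural hearing loss in the worse ear"
    if weber_lateralizes_to == "midline" and rinne_air_better:
        return "no lateralizing deficit detected"
    return "mixed or indeterminate picture; further testing needed"
```

The table captures the core logic that made the two tests complementary: Weber localizes the deficit, while Rinne characterizes its mechanism.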
In the late nineteenth century, the tuning fork was widely used in medicine. Certain technical improvements
were made to the tuning fork to improve its utility. In 1870, Politzer installed clamps on the prongs of the
fork to deaden overtones when it was struck. König discovered in 1878 that the clamps could be shifted to
vary the tone of the fork up to an octave. To ensure sufficient coupling with the skull when testing bone
conduction, Lucae fixed a metal knob to the shaft in 1886. In 1899, a small spring-driven hammer was fixed to
the shaft to activate the fork with reproducible strength, and a wedge-shaped figure was drawn on the clamp
so that the amplitude of vibration could be read optically. All of these improvements collectively shaped the tuning fork into what we
now know as the current tuning fork [8].
While the tuning fork itself evolved minimally over the twentieth century, it demonstrated its versatility
when yet another use for it was found. In the early twentieth century, tuning forks found new applications in
medicine when they were used to stimulate "vibration sense," the basic feeling of vibration against bone, on
bony prominences. This method was recognized as a crude test of the neural pathways used in proprioception,
the sense that enables us to perceive body part location, movement, and action, a concept developed by
Landry, Bell, Bastian, Ferrier, and others during the nineteenth century. It played a vital role in diagnosing
sensory and posterior column nerve disorders [7].
Conclusions
The tuning fork’s history has proven to be a captivating journey through time, revealing its profound impact
on various aspects of human knowledge and endeavor. From its humble origins as a musical instrument, the
tuning fork's evolution into a versatile diagnostic tool in medicine stands as a testament to human ingenuity
and adaptability. Over 400 years, the tuning fork has evolved from a simple metal rod pressed against a
vibrating string to a revolutionary medical tool. Through the centuries, the tuning fork’s applications have
been shaped and honed, from understanding sound transmission through the skull to using it to
differentiate hearing disorders. From the past to the present, the tuning fork's unwavering resonance serves
as a reminder of the profound impact that a simple yet ingenious invention can have on the course of human
history and the enhancement of human well-being.
Additional Information
Author Contributions
All authors have reviewed the final version to be published and agreed to be accountable for all aspects of the
work.
Concept and design: Latha Ganti, Keerthi Eraniyan
Acquisition, analysis, or interpretation of data: Latha Ganti, Keerthi Eraniyan
Critical review of the manuscript for important intellectual content: Latha Ganti, Keerthi Eraniyan
Supervision: Latha Ganti
Drafting of the manuscript: Keerthi Eraniyan
Disclosures
Conflicts of interest: In compliance with the ICMJE uniform disclosure form, all authors declare the
following: Payment/services info: All authors have declared that no financial support was received from
any organization for the submitted work. Financial relationships: All authors have declared that they have
no financial relationships at present or within the previous three years with any organizations that might
have an interest in the submitted work. Other relationships: All authors have declared that there are no
other relationships or activities that could appear to have influenced the submitted work.
References
1. Barnes WH: The tuning fork tests. J Natl Med Assoc. 1922, 14:95-8.
2. Sichel JY, Eliashar R, Dano I: Explaining the Weber test. Otolaryngol Head Neck Surg. 2000, 122:465-6. 10.1016/S0194-5998(00)70071-4
3. Chole RA, Cook GB: The Rinne test for conductive deafness. A critical reappraisal. Arch Otolaryngol Head Neck Surg. 1988, 114:399-403. 10.1001/archotol.1988.01860160043018
4. Misurya RK, Khare A, Mallick A, Sural A, Vishwakarma GK: Use of tuning fork in diagnostic auscultation of fractures. Injury. 1987, 18:63-4. 10.1016/0020-1383(87)90391-3
5. Moore MB: The use of a tuning fork and stethoscope to identify fractures. J Athl Train. 2009, 44:272-4. 10.4085/1062-6050-44.3.272
6. TS K: Tuning fork tests; a historical review. Ann Otol Rhinol Laryngol. 1946, 55:423-30. 10.1177/000348944605500215
7. Pearce JM: Early days of the tuning fork. J Neurol Neurosurg Psychiatry. 1998, 65:728, 733. 10.1136/jnnp.65.5.728
8. Kelley NH: Historical aspects of bone conduction. Laryngoscope. 1937, 47:102-9. 10.1288/00005537-193702000-00005
9. Placke L, Appelbaum EN, Patel AJ, Sweeney AD: Bone conduction implants for hearing rehabilitation in skull base tumor patients. J Neurol Surg B Skull Base. 2019, 80:139-48. 10.1055/s-0039-1677690
10. Feldmann H: History of the tuning fork. I: Invention of the tuning fork, its course in music and natural sciences. Pictures from the history of otorhinolaryngology, presented by instruments from the collection of the Ingolstadt German Medical History Museum [Article in German]. Laryngorhinootologie. 1997, 76:116-22. 10.1055/s-2007-997398
11. Bickerton RC, Barr GS: The origin of the tuning fork. J R Soc Med. 1987, 80:771-3. 10.1177/014107688708001215
12. Feldmann H: History of the tuning fork. II: Evolution of the classical experiments by Weber, Rinne and Schwabach [Article in German]. Laryngorhinootologie. 1997, 76:318-26. 10.1055/s-2007-997435
13. Feldmann H: History of the tuning fork. III: On the way to quantitative pure-tone measurement. Pictures from the history of otorhinolaryngology, represented by instruments from the collection of the Ingolstadt German Medical History Museum [Article in German]. Laryngorhinootologie. 1997, 76:428-34. 10.1055/s-2007-997457