Long-Term Evolution of the Electrical Stimulation Levels for Cochlear Implant Patients

ENT Service, San Cecilio University Hospital, Granada, Spain.
Clinical and Experimental Otorhinolaryngology (Impact Factor: 0.85). 12/2012; 5(4):194-200. DOI: 10.3342/ceo.2012.5.4.194
Source: PubMed


The stimulation levels programmed in cochlear implant systems evolve over time after the first switch-on of the processor. This study was designed to evaluate the changes in stimulation levels over time and their relationship with post-implantation physiological changes and with the hearing experience provided by continuous use of the cochlear implant.
Sixty-two patients, ranging in age from 4 to 68 years at the time of implantation, participated in this study. All subjects were implanted with the 12-channel COMBI 40+ cochlear implant at San Cecilio University Hospital, Granada, Spain. Hearing loss etiology and progression characteristics varied across subjects.
The analyzed programming maps show that the stimulation levels change rapidly during the first weeks after the first switch-on of the processor. The evolution then slows, and the programming parameters tend to stabilize at about 6 months after the first switch-on. The evolution of the stimulation levels implies a widening of the electrical dynamic range, from 15.4 to 20.7 dB, which improves intensity resolution. A significant increase in sensitivity to acoustic stimuli is also observed. In some patients, we also observed transitory changes in the electrode impedances associated with secretory otitis media, which cause substantial changes in the programming maps.
We have studied the long-term evolution of the stimulation levels in cochlear implant patients. Our results show the importance of systematic measurement of the electrode impedances before each revision of the programming map. This report also highlights that the evolution of the programming maps is an important factor to consider when determining an adequate fitting schedule for the cochlear implant processor.
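The widening of the electrical dynamic range reported above (15.4 to 20.7 dB) can be illustrated with the usual decibel definition for a current ratio, 20·log10(MCL/THR), where THR is the threshold level and MCL the maximum comfortable level on an electrode. The following sketch uses hypothetical current values chosen only to reproduce ratios of that size; the actual levels and units in the study's programming maps are not given here.

```python
import math

def dynamic_range_db(thr_ua: float, mcl_ua: float) -> float:
    """Electrical dynamic range in dB between the threshold (THR)
    and maximum comfortable (MCL) current levels on one electrode.
    Uses the standard current-ratio decibel definition."""
    return 20.0 * math.log10(mcl_ua / thr_ua)

# Hypothetical current levels (microamperes) for illustration:
# an MCL/THR ratio of about 5.9 corresponds to ~15.4 dB,
# while a ratio of about 10.8 corresponds to ~20.7 dB.
early_dr = dynamic_range_db(100.0, 589.0)    # shortly after switch-on
late_dr = dynamic_range_db(100.0, 1084.0)    # after levels stabilize
```

A wider dynamic range means the acoustic input is mapped onto a larger span of discriminable current steps, which is consistent with the improved intensity resolution the authors describe.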

Available from: Ángel de la Torre
    • "This is particularly likely with young children, who are often unable to accurately report on the quality of the auditory percept. It is also important to note that changes in auditory percept may alternatively be the result of intracochlear (or even central) physiological changes, without any change in the characteristics of the electrical stimulation through device malfunction [37] [41]. Because of the many uncertainties around ICS and electrode malfunction, CI manufacturers have developed ways of testing the various components of their devices in situ, most of which currently involve retrieving data on component function from the ICS through "back-telemetry"."
    ABSTRACT: To describe the principles and operation of a new telemetry-based function test for the Nucleus cochlear implant. The IIM test measures bipolar impedances between all electrode pairs and employs a normalization procedure based on common ground impedances in order to identify abnormal current paths among electrodes. Six European clinics collected IIM data from a total of 192 devices. Reproducibility was high between initial and repeat measurements. The normative analysis demonstrated narrow ranges among devices after normalization of impedance data. The IIM is able to identify abnormal current paths that are not evident from standard impedance telemetry and may otherwise only be found utilising average electrode voltage measurements (AEV). The IIM test was found to be straightforward to perform clinically and demonstrated reproducible data with narrow ranges in normally functioning devices. Because this test uses a very low stimulation level, the IIM test is well suited for children or multiply handicapped CI users who cannot reliably report on their auditory percepts. The new algorithms show potential to improve implant integrity testing capabilities if implemented in future clinical software.
    Biomedizinische Technik/Biomedical Engineering 01/2015; DOI:10.1515/bmt-2014-0058 · 1.46 Impact Factor
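The normalization idea described in this abstract can be sketched roughly as follows. The model used here (that in a normal device a bipolar impedance approximates the sum of the two electrodes' common-ground impedances, so the normalized ratio sits near 1) is an illustrative assumption, not the published IIM algorithm; the function and threshold names are hypothetical.

```python
def normalized_bipolar(z_bipolar, z_cg):
    """Normalize each bipolar impedance by the sum of the two
    electrodes' common-ground impedances (illustrative model).
    In a normally functioning device the ratio should be close
    to 1; large deviations suggest an abnormal current path
    between the pair.

    z_bipolar: dict mapping (i, j) electrode pairs to ohms
    z_cg: sequence of common-ground impedances per electrode, ohms
    """
    return {
        (i, j): z / (z_cg[i] + z_cg[j])
        for (i, j), z in z_bipolar.items()
    }

# Hypothetical 3-electrode example: pair (0, 2) shows a low ratio,
# consistent with an unintended current path (e.g. a partial short)
# that plain per-electrode impedance telemetry could miss.
z_cg = [5000.0, 5200.0, 4900.0]
z_bp = {(0, 1): 10100.0, (0, 2): 4000.0, (1, 2): 10050.0}
ratios = normalized_bipolar(z_bp, z_cg)
suspect = [pair for pair, r in ratios.items() if abs(r - 1.0) > 0.3]
```

Because the test relies only on low-level impedance measurements, no behavioral response from the patient is needed, which matches the abstract's point about young or multiply handicapped CI users.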
    • "Three centers have less than 1 annual session (Hannover and London-RNTNE every second year, Nijmegen every third year, and Mumbai on patient's request). It seems that these annual sessions are merely planned for verification and to reassure that the performance has not deteriorated, rather than for modifying the MAPs which remain rather stable after the initial months [15]. From that perspective it seems justified to increase the interval of one year. "
    ABSTRACT: The programming of CIs is essential for good performance. However, no Good Clinical Practice guidelines exist. This paper reports on the results of an inventory of the current practice worldwide. A questionnaire was distributed to 47 CI centers. They follow 47600 recipients in 17 countries and 5 continents. The results were discussed during a debate. Sixty-two percent of the results were verified through individual interviews during the following months. Most centers (72%) participated in a cross-sectional study logging 5 consecutive fitting sessions in 5 different recipients. Data indicate that general practice starts with a single switch-on session, followed by three monthly sessions, three quarterly sessions, and then annual sessions, all containing one hour of programming and testing. The main focus lies on setting maximum and, to a lesser extent, minimum current levels per electrode. These levels are often determined on a few electrodes and then extrapolated. They are mainly based on subjective loudness perception by the CI user and, to a lesser extent, on pure tone and speech audiometry. Objective measures play a small role as indication of the global MAP profile. Other MAP parameters are rarely modified. Measurable targets are only defined for pure tone audiometry. Huge variation exists between centers on all aspects of the fitting practice.
    The Scientific World Journal 02/2014; 2014:501738. DOI:10.1155/2014/501738 · 1.73 Impact Factor
    ABSTRACT: Background: This study compares 4- to 7-year-old cochlear implanted (CI) and specific language impaired (SLI) children in the production of finite verb morphology and mean length of utterance (MLU). It has been hypothesized that, due to reduced exposure to grammatical elements in the ambient language, both groups are delayed in their acquisition of morphosyntax. Method: Spontaneous language samples were analyzed for Dutch monolingual CI (N = 48) and SLI children (N = 38) on MLU, number of finite verbs, and number of errors in the target-like production of verbal agreement. CI and SLI children were compared on their linguistic profiles, including MLU and finite verb production, using the norms of typically developing (TD) children. Results: Statistical differences between CI and SLI children were found only for finite verb production at ages 5 and 6, in the direction of better outcomes for CI children. Both groups produced significant numbers of verbal agreement errors. Weak linguistic profiles were found for 75% of the SLI children and 35% of the CI children. Conclusion: CI and SLI children both show weak performance on the target-like production of verbal agreement. Nevertheless, CI children produce more finite verbs and have stronger linguistic profiles compared with SLI children.
    Lingua 01/2014; 139. DOI:10.1016/j.lingua.2013.11.010 · 0.71 Impact Factor