In patients with closed head injury, impaired sustained attention has been invoked to explain poor performance on complex tasks. However, this basic capacity has never been adequately investigated. We investigated sustained attention with an auditory vigilance task and found no evidence of an impairment. At an interval of about three months, 8 patients, tested within the first half year after closed head injury, and 8 healthy control subjects were twice given a low-event-rate vigilance task, yielding measures of signal detection and response latency. In addition, the amplitude of 0.10 Hz heart rate variability, a power spectral measure, was calculated as an index of sustained effort. The hypotheses that patients would show stronger time-on-task effects on performance and effort were not supported. Independent of sustained attention, patients differed from controls in response latencies and in the sensitivity with which they discriminated small differences in loudness, especially on the first occasion.
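The 0.10 Hz heart rate variability amplitude mentioned above is a band-limited power estimate taken from the spectrum of the interbeat-interval series. A minimal sketch of such a measure, assuming a synthetic, evenly resampled interbeat series and illustrative band edges (the study's exact resampling rate and band limits are not specified here):

```python
import numpy as np

def hrv_band_power(ibi_ms, fs=4.0, band=(0.07, 0.14)):
    """Estimate spectral power of an interbeat-interval (IBI) series in a
    band around 0.10 Hz. fs and band edges are illustrative assumptions."""
    x = np.asarray(ibi_ms, dtype=float)
    x = x - x.mean()                                  # remove the mean IBI level
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)       # frequency axis in Hz
    power = np.abs(np.fft.rfft(x)) ** 2 / len(x)      # periodogram
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return power[mask].sum()

# Synthetic IBI series: an 0.10 Hz oscillation (50 ms amplitude) plus noise,
# sampled at 4 Hz for 120 s -- purely for illustration.
rng = np.random.default_rng(0)
t = np.arange(0, 120, 1 / 4.0)
ibi = 800 + 50 * np.sin(2 * np.pi * 0.10 * t) + rng.normal(0, 5, t.size)

p_010 = hrv_band_power(ibi)                       # power near 0.10 Hz
p_far = hrv_band_power(ibi, band=(0.30, 0.37))    # control band away from signal
```

Because the simulated oscillation sits at 0.10 Hz, the power in the 0.07-0.14 Hz band dominates the power in the distant control band; a decline in this band power over time on task is the kind of effect the study tested for.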
Schizophrenia is considered a brain disease with a quite heterogeneous clinical presentation. Studies in schizophrenia have yielded a wide array of correlations between structural and functional brain changes and clinical and cognitive symptoms. Reductions of grey matter volume (GMV) have been described in prefrontal and temporal cortex, regions crucial for the development of positive and negative symptoms and for impaired working memory (WM). Associations between GMV reduction and positive and negative symptoms as well as WM impairment were assessed in schizophrenia patients (symptomatology in 34, WM in 26) and compared to healthy controls (36 total, WM in 26). GMV was determined by voxel-based morphometry and its relation to positive and negative symptoms as well as WM performance was assessed. In schizophrenia patients, reductions of GMV were evident in anterior cingulate cortex, ventrolateral prefrontal cortex (VLPFC), superior temporal cortex, and insula. GMV reductions in the superior temporal gyrus (STG) were associated with positive symptom severity as well as WM impairment. Furthermore, the absolute GMV of VLPFC was strongly related to negative symptoms, which in turn predicted WM performance as well as processing speed. The present results support the assumption of two distinct pathomechanisms responsible for impaired WM in schizophrenia: (1) GMV reductions in the VLPFC predict the severity of negative symptoms; increased negative symptoms in turn are associated with slowed processing speed and predict impaired WM. (2) GMV reductions in the temporal and mediofrontal cortex are involved in the development of positive symptoms and also impair WM performance.
There is an academic dispute regarding the role of the right hemisphere in language processing. Transcranial Magnetic Stimulation (TMS) was used to test the hypothesis that Wernicke's area processes dominant meanings ("teller") whereas its right homologue processes subordinate meanings ("river") of ambiguous words ("bank"; Jung-Beeman, 2005).
Participants were asked to make a semantic decision on ambiguous words that were followed either by unrelated words or by words associated with their dominant or subordinate meanings. A 10 Hz TMS train was applied on each trial over CP5 (left Wernicke), CP6 (right Wernicke) or Cz (vertex) scalp positions, and was synchronized with the word presentation.
Accuracy and d' analysis revealed a TMS LOCATION by MEANING interaction. TMS over Wernicke's area resulted in more accurate responses and higher sensitivity to dominant meaning blocks compared to stimulating the right Wernicke's area and the vertex. In contrast, TMS over the right Wernicke's area resulted in more accurate responses and higher sensitivity to subordinate meaning blocks, compared to stimulating the left Wernicke's area and the vertex.
The left and right Wernicke's areas function as processors of dominant and subordinate meanings of ambiguous words, respectively. While previous research methods have yielded inconclusive results, TMS proved to be a useful tool for demonstrating a causal role of the two brain regions in a double dissociation design with healthy subjects.
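The sensitivity measure d' used in the analyses above is derived from signal detection theory: the z-transformed hit rate minus the z-transformed false-alarm rate. A minimal sketch, assuming a log-linear correction for extreme rates (the study's exact correction, if any, is not specified):

```python
from statistics import NormalDist

def d_prime(hits, misses, false_alarms, correct_rejections):
    """d' = z(hit rate) - z(false-alarm rate).
    A log-linear correction (add 0.5 to counts, 1 to totals) keeps the
    z-transform finite when a rate would otherwise be exactly 0 or 1."""
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF
    return z(hit_rate) - z(fa_rate)

# An observer who discriminates related from unrelated words well
# (many hits, few false alarms) yields a clearly positive d'.
d = d_prime(hits=40, misses=10, false_alarms=5, correct_rejections=45)
```

An observer responding at chance (hit rate equal to false-alarm rate) yields d' = 0, which is why d' separates genuine sensitivity from mere response bias in designs like the one above.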
Although there are many opportunities to study memory in patients with Alzheimer's disease (AD) in the laboratory, there are few opportunities to study memory for real world events in these patients. The September 11, 2001 terrorist attacks provided one such opportunity. Patients with AD, patients with mild cognitive impairment (MCI), and healthy older adults were given a telephone questionnaire in the initial weeks after the event, again three to four months later, and finally one year afterwards to evaluate their memory for the September 11, 2001 terrorist attacks. We were particularly interested in using the attacks as an opportunity to examine the decline of episodic memory in patients with AD, patients with MCI, and older adult controls over a period of months. We found that compared to healthy older adults, patients with AD and MCI showed impaired memory at the initial time point, more rapid forgetting from the initial to the three-month time point, and very similar changes in memory from the three-month to the one-year time point. We speculated that these findings were consistent with patients with AD and MCI showing initial impaired encoding and a more rapid rate of forgetting compared with healthy older adults, but that once the memories had been consolidated, their decay rate became similar to that of healthy older adults. Lastly, although memory distortions were common among all groups, they were greatest in the patients with AD.
In animal studies, brain-derived neurotrophic factor (BDNF) is an important regulator of central nervous system development and synaptic plasticity. WAGR (Wilms tumour, Aniridia, Genitourinary anomalies, and mental Retardation) syndrome is caused by 11p13 deletions of variable size near the BDNF locus and can serve as a model for studying human BDNF haploinsufficiency (+/-). We hypothesized that BDNF+/- would be associated with more severe cognitive impairment in subjects with WAGR syndrome. Twenty-eight subjects with WAGR syndrome (6-28 years), 12 subjects with isolated aniridia due to PAX6 mutations/microdeletions (7-54 years), and 20 healthy controls (4-32 years) received neurocognitive assessments. Deletion boundaries for the subjects in the WAGR group were determined by high-resolution oligonucleotide array comparative genomic hybridization. Within the WAGR group, BDNF+/- subjects (n = 15), compared with BDNF intact (+/+) subjects (n = 13), had lower adaptive behaviour (p = .02), reduced cognitive functioning (p = .04), higher levels of reported historical (p = .02) and current (p = .02) social impairment, and a higher percentage meeting the cut-off score for autism (p = .047) on the Autism Diagnostic Interview-Revised. These differences remained nominally significant after adjusting for visual acuity. Using diagnostic measures and clinical judgement, 3 subjects (2 BDNF+/- and 1 BDNF+/+) in the WAGR group (10.7%) were classified with autism spectrum disorder. A comparison group of visually impaired subjects with isolated aniridia had cognitive functioning comparable to that of healthy controls. In summary, among subjects with WAGR syndrome, BDNF+/- subjects had a mean Vineland Adaptive Behaviour Composite score that was 14 points lower and a mean intelligence quotient (IQ) that was 20 points lower than those of BDNF+/+ subjects. Our findings support the hypothesis that BDNF plays an important role in human neurocognitive development.
We report on a patient, LM, with a Korsakoff's syndrome who showed the unusual tendency to consistently provide a confabulatory answer to episodic memory questions for which the predicted and most frequently observed response in normal subjects and in confabulators is "I don't know". LM's pattern of confabulation, which we refer to as confabulatory hypermnesia, cannot be traced back to any more basic and specific cognitive deficit and is not associated with any particularly unusual pattern of brain damage. Making reference to the Memory, Consciousness and Temporality Theory - MCTT (Dalla Barba, 2002), we propose that LM shows an expanded Temporal Consciousness - TC, which overflows the limits of time ("Do you remember what you did on March 13, 1985?") and of details ("Do you remember what you were wearing on the first day of summer in 1979?") that are usually respected in normal subjects and in confabulating patients.
We have shown that dichotically presented nonsense syllables, with strict acoustic and phonetic control of the stimuli, generate a right ear advantage that is essentially fixed by 5 years of age. What seems to change with age in our sample is total accuracy, the phonetic content and nature of the errors, and something that, for want of a better term, we might call the “capacity” of the hypothesized left hemisphere speech processor.
Preoperative measures of intelligence are compared with retests 8 and 14 years after frontal lobe surgery. All Ss were chronic psychotics at the time of the first tests. The long-term retests are reported for 33 nonoperated controls, for 16 patients with lower frontal lobe surgery (Orbital Topectomy), and for 18 patients with upper frontal lobe surgery (Superior Topectomy). Previous research showed that both groups of operated patients scored lower on intelligence tests shortly after surgery than they did before. The objectives of the long-term retests concerned: (1) the permanence of changes, and (2) the differential effects of lower and upper forebrain surgery. The lower forebrain patients obtain long-term scores remarkably comparable to those of the nonoperated controls. The Superior Topectomy patients show significant loss 8 years after surgery, and this loss persists 14 years after surgery. The loss is both permanent and appreciable, equivalent to some 10 points in I.Q. Verbal and numerical tests reflect the permanent loss more clearly than perceptual and construction tests. The loss associated with upper frontal lobe surgery involves sustained attention, problem solving, and other intellectual functions.
Hemisphere specialization for mental rotation was investigated utilizing Shepard's (1971) paradigm. In each of two experiments, the procedure involved presenting pairs of novel non-verbal stimuli at various angles of disparity. Subjects were instructed to construct a mental image of one stimulus, rotate this image, and judge whether or not the image was a congruent match with its mate. Both response time and accuracy were measured. In Experiment 1, the testing of right-handed normals revealed a significant left visual field advantage for accuracy (p less than .0001) and response time (p less than .05). In Experiment 2, a comparison of right parietal lesioned patients with both left parietal lesioned patients and matched normal controls likewise revealed significant right lesion effects for accuracy (p less than .0001) and response time (p less than .01). Right hemisphere specialization for mental rotation was documented for both normals and brain damaged subjects.
180 normal children, ages five through ten years, were given a set of nine naming tests, each involving 50 timed responses to five randomly recurring pictured objects, colors, letters or numbers. “Automatization” of naming, measured by speed, accuracy, and consistency on these tasks, did not parallel the developmental order of acquisition of the various categories of names; letters and numbers were named relatively faster than were colors and objects as early as age six years. Children's facility in “automatization” of naming different semantic categories is considered in terms of the contributions of overlearning, stimulus discriminability, “operativity”, word frequency and response competition; only the last two appear to be explanatory factors.
In his short paper of 1886, the neogrammarian linguist Delbrück sketches his views on normal language processing and their relevance for the interpretation of some of the symptoms of progressive anomic aphasia. In particular, he discusses proper name impairments, verb and abstract noun superiority, and the predominance of semantically related errors. Furthermore, he suggests that part of speech, morphology, and word order may be preserved in this condition. This historical document has fallen into oblivion, but the original ideas and their relevance for contemporary discussions merit a revival.
The early history of developmental language impairment in late 19th century Britain is considered through the critical examination of three papers appearing in 1891 by Hadden, Golding-Bird and Hale White, and Taylor. They represent innovative investigations of child language disorders whose themes and concerns are resonant today. The term 'idioglossia' was coined to identify this new impairment and reflected the belief by some that these children spoke an invented language. Rather than viewing these children as having some constitutional deficiency, these 19th century physicians were novel in insisting that children with language impairments merited extensive clinical investigation and treatment. Their case descriptions and the subsequent debates regarding classification and prognosis are reviewed. Further consideration is given to how these cases led to questioning the relation between language and speech and other aspects of child development and disorder. Reflection on the early sources of clinical categories provides a new perspective on our current formulations for variation in developmental language trajectories.
According to the 'Kennard Principle', there is a negative linear relation between age at brain injury and functional outcome. Other things being equal, the younger the lesioned organism, the better the outcome. But the 'Kennard Principle' is neither Kennard's nor a principle. In her work, Kennard sought to explain the factors that predicted functional outcome (age, to be sure, but also staging, laterality, location, and number of brain lesions, and outcome domain) and the neural mechanisms that altered the lesioned brain's functionality. This paper discusses Kennard's life and years at Yale (1931-1943); considers the genesis and scope of her work on early-onset brain lesions, which represents an empirical and theoretical foundation for current developmental neuropsychology; offers an historical explanation of why the 'Kennard Principle' emerged in the context of early 1970s work on brain plasticity; shows why uncritical belief in the 'Kennard Principle' continues to shape current research and practice; and reviews the continuing importance of her work.