Bulletin of the World Health Organization

Published by WHO Press

Print ISSN: 0042-9686


Using knowledge management to make health systems work

February 2003



Christopher Bailey
During the last quarter-century or so there has been a revolution in both health and information technology. For the globe as a whole we have seen tremendous strides made in life expectancy and disease control, together with an explosion of information technology and techniques. Humanity now has the potential to make all existing health knowledge available simultaneously to the entire population of the planet. By no means everyone has benefited from the overall trend of increased life expectancy, however, or from that of increased knowledge and its communicability. This gap goes beyond the notion of the "digital divide". It is a "knowledge divide", in which large sections of humanity are cut off not just from the information that could help them but from any learning system or community that fosters problem-solving. For instance, where people are dying of HIV/AIDS, tuberculosis (TB) or malaria despite the availability of technologies to control them, it is at least partly because the procedures for using those technologies effectively have not been worked out and learnt locally. Conversely, if scientists, administrators and technicians fail to stop the rise of these diseases in spite of big investments, detailed calculations and good intentions, it is at least partly because they do not know enough about how things actually work locally. Not only do health data tend to be more scarce in the places that have the more serious health problems, but there are fewer systems there for using even the data that are available to solve those problems. The result is failure to make the transition from information to action in the form, for example, of new treatment guidelines or government policy. The discipline of knowledge management (KM) aims to bridge this gap. 
Starting with the premise that local problems must have local solutions, effective KM in health can provide on an equitable basis the knowledge necessary for local innovation, and then produce new local knowledge that is in turn fed back and shared in a dynamic regenerative process. Although much of the content of KM is often perceived as information technology (IT), it goes in practice beyond the facilitating power of any single IT tool. Used in the way it should be, it harnesses experience through collaboration in direct, humanly interactive problem-solving. Two examples can give some idea of how this works. In the first, two nongovernmental organizations, Partners in Health and the Institute for Healthcare Improvement, have been working with the Peruvian Ministry of Health to improve TB treatment in that country. Last year they set up a pilot programme involving 41 clinics, using business software to link the clinics with a global team of TB experts, thereby forming a specific community of practice for solving problems, sharing innovations and gathering evidence. …

Not Available

February 1955



This report deals with the geographical distribution, prevalence, epidemiology, etiology, serological, clinical, and histopathological features, and treatment of mal del pinto, or pinta, in Mexico. Repository penicillin preparations (PAM and Panbiotic) have been found highly effective in the treatment of this endemic, non-venereal treponematosis.

Studies in oral leukoplakias: Prevalence of leukoplakia among 10 000 persons in Lucknow, India, with special reference to use of tobacco and betel nut

February 1967



Oral carcinoma has been shown to be correlated with the use of tobacco in various parts of India. In a large-scale dental survey conducted in Lucknow, Bombay and Bangalore, various precancerous conditions were investigated and studied for their possible relation to smoking and chewing habits. This paper reports the prevalence of oral leukoplakia among 10 000 dental-clinic patients in Lucknow and the correlation of the condition with the use of tobacco and betel nut in the study population. The results show that leukoplakia is far more prevalent among users of tobacco, betel nut or both than among non-users. A strikingly high frequency was found among smokers of the local cigarette, the bidi.

Ten-year health service use outcomes in a population-based cohort of 21 000 injured adults: The Manitoba Injury Outcome Study

November 2006



To quantify long-term health service use (HSU) following non-fatal injury in adults. A retrospective, population-based, matched cohort study identified an inception cohort (1988-91) of injured people aged 18-64 years who had been hospitalized with an injury diagnosis (ICD-9-CM 800-995; n = 21 032) and a matched non-injured comparison group (n = 21 032) from linked administrative data from Manitoba, Canada. HSU data (on hospitalizations, cumulative length of stay, physician claims and placements in extended care services) were obtained for the 12 months before and 10 years after the injury. Negative binomial and Poisson regressions were used to quantify associations between injury and long-term HSU. Statistically significant differences in the rates of HSU existed between the injured and non-injured cohorts for the pre-injury year and every year of the follow-up period. After controlling for pre-injury HSU, the attributable risk percentage indicated that 38.7% of all post-injury hospitalizations (n = 25 183), 68.9% of all years spent in hospital (n = 1031), 21.9% of physician claims (n = 269 318) and 77.1% of the care home placements (n = 189) in the injured cohort could be attributed to being injured. Many people who survive the initial period following injury face long periods of inpatient care (and frequent readmissions), high levels of contact with physicians and an increased risk of premature placement in institutional care. Population estimates of the burden of injury could be refined by including long-term non-fatal health consequences and controlling for the effect of pre-injury comorbidity.
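The headline figures above rest on the attributable risk percentage, which compares event rates in the injured and matched non-injured cohorts. A minimal sketch of that arithmetic, with invented rates rather than the study's data (the function name is ours):

```python
def attributable_risk_percent(rate_exposed: float, rate_unexposed: float) -> float:
    """Percentage of the exposed group's event rate attributable to exposure:
    ARP = (rate_exposed - rate_unexposed) / rate_exposed * 100."""
    return (rate_exposed - rate_unexposed) / rate_exposed * 100

# Hypothetical example: 50 hospitalizations per 1000 person-years in the
# injured cohort versus 30 per 1000 in the matched non-injured cohort.
arp = attributable_risk_percent(50 / 1000, 30 / 1000)
print(round(arp, 1))  # 40.0
```

In the study itself this comparison is made after controlling for pre-injury HSU with regression models, not from raw rates as here.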

Daily penicillin serum concentrations following injection of 1.2 mega-units of "all-purpose" penicillin

February 1965



In view of evidence suggesting that 1.2 mega-units of "all-purpose" penicillin (300 000 IU potassium penicillin G, 300 000 IU procaine penicillin G and 600 000 IU benzathine penicillin) did not maintain treponemicidal serum concentrations during the week following injection (which, if true, might necessitate a reappraisal of prophylactic and treatment schedules in wide use against syphilis), daily assays were performed to determine the penicillinaemia levels in ambulant adult males for one week following intramuscular injection with this dosage of two "all-purpose" products (168 assays in all, 24 each day). Statistical evaluation of the results showed that the mean daily serum concentrations were, in fact, treponemicidal during the whole week after injection. The means of groups of 24 assays fell within narrow daily ranges on each of the seven post-injection days, suggesting that the long-acting component (benzathine penicillin) gives reliable and predictable daily levels in a high proportion of cases. This is in contrast to those penicillins which rely for their long-acting property on the oily gel in which they are suspended. On the other hand, the extremes of penicillinaemia for any individual in a large group were shown to cover a very wide range, demonstrating that a particular patient's failure to respond to standard treatment or prophylaxis can be due to factors quite unrelated to the quality or specificity of the product or to the sensitivity of the organism causing disease.

Iron, folate, and vitamin B12 nutrition in pregnancy: a study of 1000 women from southern India

February 1973



As part of a WHO collaborative programme the prevalence of anaemia was studied and the serum concentrations of iron, folate, and vitamin B(12) were measured in 1 000 pregnant women from southern India. The results of the study show a high prevalence of anaemia, resulting from iron and folate deficiency with iron deficiency predominating. Interrelationships between these nutrients and their effect on pregnancy and the fetus were investigated. The results indicate that, in comparison with populations in developed countries, there was a high prevalence of iron and vitamin B(12) deficiency in the community, but the state of folate nutrition was similar to that found elsewhere.

Table 1: Reactogenicity after ingestion of CVD 103-HgR live oral cholera vaccine or placebo
Table 2: Seroconversion following immunization with a single dose of CVD 103-HgR live oral cholera vaccine
A single dose of live oral cholera vaccine CVD 103-HgR is safe and immunogenic in HIV-infected and HIV-noninfected adults in Mali

February 1998



Despite considerable experience with single-dose, live, oral cholera vaccine CVD 103-HgR in Asia, Europe, and the Americas, the vaccine had not been evaluated in sub-Saharan Africa or in individuals infected with human immunodeficiency virus (HIV). We therefore conducted a randomized, placebo-controlled, double-blind, cross-over clinical trial in 38 HIV-seropositive (without clinical acquired immunodeficiency syndrome (AIDS)) and 387 HIV-seronegative adults in Mali to assess its safety and immunogenicity. Adverse reactions (fever, diarrhoea and vomiting) were observed with similar frequency among vaccine and placebo recipients. The vaccine strain was not isolated from the coprocultures of any subject. The baseline geometric mean titre (GMT) of serum vibriocidal antibody was significantly lower in HIV-seropositives (1:23) than in HIV-seronegatives (1:65) (P = 0.002). Significant rises in vibriocidal antibody were observed in 71% of HIV-seronegatives and 58% of HIV-seropositives, and in 40% of HIV-seropositives with CD4+ counts below 500 per microliter. Following immunization, the peak vibriocidal GMT in HIV-seronegatives was 1:584 versus 1:124 in HIV-seropositives (P = 0.0006); in HIV-seropositives with CD4+ counts < 500 per microliter, the peak vibriocidal GMT was 1:40 (P = 0.03 versus other HIV-seropositives). CVD 103-HgR was safe in HIV-infected Malian adults, although serological responses were significantly attenuated among HIV-seropositives (particularly in those with CD4+ counts < 500 per microliter) relative to HIV-seronegatives. These results encourage further evaluations of this single-dose, oral cholera vaccine in high-risk populations such as refugees in sub-Saharan Africa.

Table 1. Estimated regional intake of fruit and vegetables
Lock K, Pomerleau J, Causer L, Altmann DR, McKee M. The global burden of disease attributable to low consumption of fruit and vegetables: implications for the global strategy on diet. Bull World Health Organ 83: 100-108

March 2005



We estimated the global burden of disease attributable to low consumption of fruit and vegetables, an increasingly recognized risk factor for cardiovascular disease and cancer, and compared its impact with that of other major risk factors for disease. The burden of disease attributable to suboptimal intake of fruit and vegetables was estimated using information on fruit and vegetable consumption in the population, and on its association with six health outcomes (ischaemic heart disease, stroke, stomach, oesophageal, colorectal and lung cancer). Data from both sources were stratified by sex, age and by 14 geographical regions. The total worldwide mortality currently attributable to inadequate consumption of fruit and vegetables is estimated to be up to 2.635 million deaths per year. Increasing individual fruit and vegetable consumption to up to 600 g per day (the baseline of choice) could reduce the total worldwide burden of disease by 1.8%, and reduce the burden of ischaemic heart disease and ischaemic stroke by 31% and 19% respectively. For stomach, oesophageal, lung and colorectal cancer, the potential reductions were 19%, 20%, 12% and 2%, respectively. This study shows the potentially large impact that increasing fruit and vegetable intake could have in reducing many noncommunicable diseases. It highlights the need for much greater emphasis on dietary risk factors in public health policy in order to tackle the rise in noncommunicable diseases worldwide, and suggests that the proposed intersectoral WHO/FAO fruit and vegetable promotion initiative is a crucial component in any global diet strategy.
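Burden estimates of this kind hinge on the population attributable fraction, which converts a risk factor's prevalence and relative risk into the share of disease burden attributable to it. A sketch of the standard (Levin) formula with illustrative numbers of our own; the paper's actual method stratifies by sex, age and 14 regions and uses intake as a continuous exposure:

```python
def population_attributable_fraction(p_exposed: float, relative_risk: float) -> float:
    """Levin's formula: PAF = p(RR - 1) / (p(RR - 1) + 1), where p is the
    prevalence of the risk factor (here, fruit/vegetable intake below the
    600 g/day baseline) and RR the relative risk in the exposed group."""
    excess = p_exposed * (relative_risk - 1)
    return excess / (excess + 1)

# Hypothetical example: 60% of a population below the intake baseline,
# with a relative risk of 1.3 for ischaemic heart disease.
paf = population_attributable_fraction(0.6, 1.3)
print(round(paf, 3))  # 0.153
```

Multiplying such a fraction by total deaths or DALYs for each outcome, then summing across outcomes and regions, yields aggregate figures like the 2.635 million deaths per year quoted above.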

Table 1. Percentage of groundwaters surveyed in 1998 by the British Geological Survey with arsenic levels over 50 µg/l
Smith, A.H., Lingas, E.O. & Rahman, M. Contamination of Drinking Water by Arsenic in Bangladesh: A Public Health Emergency. Bulletin of the World Health Organization 78: 1093-1103

February 2000



The contamination of groundwater by arsenic in Bangladesh is the largest poisoning of a population in history, with millions of people exposed. This paper describes the history of the discovery of arsenic in drinking-water in Bangladesh and recommends intervention strategies. Tube-wells were installed to provide "pure water" to prevent morbidity and mortality from gastrointestinal disease. The water from the millions of tube-wells that were installed was not tested for arsenic contamination. Studies in other countries where the population has had long-term exposure to arsenic in groundwater indicate that 1 in 10 people who drink water containing 500 micrograms of arsenic per litre may ultimately die from cancers caused by arsenic, including lung, bladder and skin cancers. The rapid allocation of funding and prompt expansion of current interventions to address this contamination should be facilitated. The fundamental intervention is the identification and provision of arsenic-free drinking water. Arsenic is rapidly excreted in urine, and for early or mild cases, no specific treatment is required. Community education and participation are essential to ensure that interventions are successful; these should be coupled with follow-up monitoring to confirm that exposure has ended. Taken together with the discovery of arsenic in groundwater in other countries, the experience in Bangladesh shows that groundwater sources throughout the world that are used for drinking-water should be tested for arsenic.

Epidemiological basis of tuberculosis eradication. 11. Mortality among tuberculosis cases and the general population of Greenland

February 1971



Greenland experienced, during the 1950s, a decline in mortality scarcely matched anywhere else in the world: from 24 per 1 000 in 1951 to 8 per 1 000 in 1960, a decline of more than 10% per year. Deaths from tuberculosis were especially reduced. Whereas more than one-third of all deaths in 1951 were considered to be due to this disease, practically no deaths are ascribed to it today. This rapid improvement in the health situation in Greenland, which coincides with a large-scale development programme, is documented in detail in the present paper. The study is based partly on official mortality statistics and partly on a 9-year follow-up study of mortality and of morbidity from tuberculosis in the total population of West Greenland registered in 1955. The existence of such data for a developing area is probably unique.

Tuberculin sensitivity and skin lesions in children after vaccination with 11 different BCG strains

February 1974



In previously published studies, a number of BCG strains used in several production laboratories were compared in animal models. Liquid vaccines from the different strains were prepared in one laboratory with a uniform technique, the aim being to obtain uniform in vitro properties. In the studies reported here, such vaccines were compared by means of vaccinating children in India and Denmark and then measuring their post-vaccination skin lesions and tuberculin sensitivity. One strain induced delayed hypersensitivity strikingly weaker than that induced by any of the others, although the vaccine was in no way inferior in terms of exhaustive in vitro tests. Differences among the other strains were slight, although sometimes statistically significant. The implications of such differences are discussed.

Onchocerciasis in Kenya 9, 11 and 18 years after elimination of the vector

February 1967



Elimination of the onchocerciasis vector Simulium neavei through larvicidal operations in focal areas of Kenya in 1946, 1953, and 1955 achieved complete interruption of transmission. Since no treatment was administered to the infected population, the areas provided an opportunity for studying the natural course of the infection in man in the absence of reinfection, with particular emphasis on its average duration and the effect of duration of exposure to the infection. In a follow-up survey conducted in 1964 in four focal areas, approximately 2000 people were examined parasitologically and clinically; slightly over half this group were also given a thorough ophthalmological examination. The results showed that, 11 years after interruption of transmission, live Onchocerca volvulus adults were present in nodules and microfilariae were present in the skin; after 18 years, however, microfilariae were no longer found in the skin. Assuming that in hyperendemic areas parasites are acquired until shortly before interruption of transmission, it can thus be postulated that O. volvulus worms lose their reproductive potentiality after 16 years or possibly earlier. A comparison of recent microfilarial rates with adjusted rates found in earlier surveys seems to indicate that the onchocercal infection, after interruption of transmission, follows a straight regression line, theoretically reaching zero after about 13-17 years.

Studies on trachoma. 11. Evaluation of laboratory diagnostic methods under field conditions

February 1968



The severity of trachoma in endemic areas has, in general, a tendency to decrease as a consequence of control measures and gradual improvements in sanitation and living conditions. The number of mild cases seen where the disease is prevalent is thus increasing, and it is becoming more difficult to establish a differential diagnosis in certain cases and to determine the degree of endemicity of the disease in a given area or community. In order to ascertain whether available laboratory methods could contribute useful data from this point of view, a clinical and laboratory study was carried out on the school population of the island of Djerba, off the south coast of Tunisia, during the school year 1963-64. The ophthalmological findings confirmed that, notwithstanding the large-scale treatment campaigns which had been in operation for 10 years, trachoma was then still highly endemic in the island, but relatively mild. The laboratory studies included microscopical examination of conjunctival scrapings for inclusion bodies, complement-fixation tests on serum specimens and, on a subsample of the populations studied, attempts to isolate the trachoma agent. The results indicated that the tests are more likely to be positive when the clinical signs are more pronounced. In individual cases, laboratory tests can at best confirm an already established clinical diagnosis and contribute little to the differential diagnosis of borderline cases. However, this study also indicated that the laboratory tests may provide useful quantitative indications of the endemicity of the disease in a community or in an area, from the point of view of the density of the agent and of the response to its presence. The techniques used must obviously be uniform enough to allow for a comparison with results obtained elsewhere or at different times.

Table 1: Constant and coefficients of best-fit polynomial equations for the mean and SD of MUAC-for-age
Table 2: MUAC-for-age reference data for boys aged 6-59 months
Table 3: MUAC-for-age reference data for girls aged 6-59 months
de Onis M, Yip R & Mei Z. The development of MUAC-for-age reference data recommended by a WHO Expert Committee. Bull World Health Organ 75: 11-18

January 1997



Low mid-upper-arm circumference (MUAC), determined on the basis of a fixed cut-off value, has commonly been used as a proxy for low weight-for-height (wasting). The use of a fixed cut-off value was based on the observation that MUAC showed small age- and sex-specific differences. However, in 1993, a WHO Expert Committee concluded that age independence is not reflected in the true pattern of mid-upper arm growth, recommended the use of MUAC-for-age, and presented age- and sex-specific MUAC reference data developed with observations obtained from a representative sample of children in the USA aged 6-59 months. In this article, we explain the methodology for the development of these data, present age- and sex-specific growth curves and tables, and discuss the applications and limitations of MUAC as a nutritional indicator. To develop the reference data, estimates were first obtained for the mean and standard deviation of MUAC for each month of age using 7-month segmental regression equations; a 5th-degree and a 3rd-degree polynomial in age were then used to describe the mean and standard deviation, respectively, of MUAC-for-age. These curves show important age-specific differences, and significant sex-specific differences for boys and girls < 24 months of age. Correct interpretation of MUAC with regard to nutritional status requires the use of MUAC-for-age reference data such as those presented here.
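The reference-data machinery described above, polynomial curves for the age-specific mean and SD followed by a standardized score for an observed child, can be sketched as below. The coefficients here are invented placeholders for illustration, not the values from the paper's Table 1:

```python
# Evaluate hypothetical polynomial curves for the age-specific mean (5th degree)
# and SD (3rd degree) of MUAC, then express an observed MUAC as a z-score.

def polyval(coeffs, x):
    """Evaluate a polynomial given coefficients from the constant term upward."""
    return sum(c * x**i for i, c in enumerate(coeffs))

# Invented placeholder coefficients (cm), constant term first.
MEAN_COEFFS = [14.2, 0.12, -0.006, 1.5e-4, -1.8e-6, 8e-9]  # mean: 5th degree
SD_COEFFS = [1.1, 0.01, -2e-4, 1.5e-6]                      # SD:   3rd degree

def muac_for_age_z(muac_cm: float, age_months: float) -> float:
    """z-score of an observed MUAC relative to the age-specific reference."""
    mean = polyval(MEAN_COEFFS, age_months)
    sd = polyval(SD_COEFFS, age_months)
    return (muac_cm - mean) / sd

z = muac_for_age_z(13.5, 24)  # a 24-month-old with MUAC 13.5 cm
print(round(z, 2))
```

Separate coefficient sets per sex would reproduce the paper's sex-specific curves; the point of the sketch is that a single pair of low-degree polynomials replaces a lookup table of monthly means and SDs.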
