Article

Transparent development of the WHO rapid advice guidelines

Keio University, Tōkyō, Japan
PLoS Medicine (Impact Factor: 14). 06/2007; 4(5):e119. DOI: 10.1371/journal.pmed.0040119
Source: PubMed

ABSTRACT: Emerging health problems require rapid advice. We describe the development and pilot testing of a systematic, transparent approach used by the World Health Organization (WHO) to develop rapid advice guidelines in response to requests from member states confronted with uncertainty about the pharmacological management of avian influenza A (H5N1) virus infection. We first searched for systematic reviews of randomized trials of treatment and prevention of seasonal influenza and for non-trial evidence on H5N1 infection, including case reports and animal and in vitro studies. A panel of clinical experts, clinicians with experience in treating patients with H5N1, influenza researchers, and methodologists was convened for a two-day meeting. Panel members reviewed the evidence prior to the meeting and agreed on the process. It took one month to put together a team to prepare the evidence profiles (i.e., summaries of the evidence on important clinical and policy questions), and it took the team only five weeks to prepare and revise the evidence profiles and to prepare draft guidelines prior to the panel meeting. A draft manuscript for publication was prepared within 10 days following the panel meeting. Strengths of the process include its transparency and the short time required to prepare these WHO guidelines. The process could be improved by shortening the time required to commission evidence profiles. Further development is needed to facilitate stakeholder involvement and to evaluate and ensure the guidelines' usefulness.

Full-text available from: Chris Del Mar, Jun 03, 2015
  • ABSTRACT: Clinical practice guidelines (CPGs) play an important role in healthcare. The guideline development process should be precise and rigorous so that the results are reproducible and unambiguous. To determine the quality of guidelines, the Appraisal of Guidelines for Research and Evaluation (AGREE) instrument was developed and introduced. The aim of the present study was to assess the methodological quality of clinical practice guidelines on glioma. Eight databases (including MEDLINE and Embase) were searched up to August 2013. The methodological quality of the guidelines was assessed by four authors independently using the AGREE II instrument. Fifteen relevant guidelines were included from 940 citations. The overall agreement among reviewers was moderate (intra-class correlation coefficient = 0.83; 95 % confidence interval [CI], 0.66-0.92). The mean scores were moderate for the domains "scope and purpose" (59.54) and "clarity of presentation" (65.46); however, there were low scores for the domains "stakeholder involvement" (43.80), "rigor of development" (39.01), "applicability" (31.89), and "editorial independence" (30.83). Only one third of the guidelines described systematic methods for searching, and nearly half (47 %) did not give a specific recommendation. Only four of the 15 described a procedure for updating the guideline, and just six can be considered evidence-based. The quality and transparency of the development process and the consistency of reporting of glioma guidelines need to be improved; the quality of reporting was disappointing, and many other methodological shortcomings were identified. In the future, glioma CPGs should be based on the best available evidence and rigorously developed and reported. Greater efforts are needed to provide high-quality guidelines that serve as a useful and reliable tool for clinical decision-making in this field.
    Neurosurgical Review 09/2014; 38(1). DOI:10.1007/s10143-014-0569-z · 1.86 Impact Factor
  • Source
    ABSTRACT: Decision-making based on reliable evidence is more likely to lead to effective and efficient treatments. Evidence-based dentistry was developed, similarly to evidence-based medicine, to help clinicians apply current and valid research findings to their own clinical practice. Interpreting and appraising the literature is fundamental and involves the development of evidence-based dentistry (EBD) skills. Systematic reviews (SRs) of randomized controlled trials (RCTs) are considered to be evidence of the highest level in evaluating the effectiveness of interventions. Furthermore, assessing the report of an RCT, as well as of an SR, can indicate how well the study was designed and conducted.
    01/2014; 15:58. DOI:10.1186/s40510-014-0058-5