Reporting and presenting information retrieval processes: the need for optimizing common practice in health technology assessment

Medical Review Board of German Statutory Health Insurances Lower Saxony (MDK Niedersachsen), Germany.
International Journal of Technology Assessment in Health Care (Impact Factor: 1.56). 10/2010; 26(4):450-7. DOI: 10.1017/S0266462310001066
Source: PubMed

ABSTRACT: Information retrieval (IR) in health technology assessment (HTA) calls for transparency and reproducibility, but common practice in documenting and presenting this process falls short of that demand.
Our objective is to promote good IR practice by presenting the conceptualization of retrieval, and its transcription, in a form readable to non-information specialists, and by reporting search strategies as they are effectively processed.
We performed a comprehensive database search (04/2010) to synthesize the current state of the art. We then developed graphical and tabular presentation methods, tested their feasibility on existing research questions, and derived recommendations.
No generally accepted standard for reporting IR in HTA exists. We therefore developed templates for presenting the retrieval conceptualization, database selection, and additional hand-searching, as well as for presenting the search histories of complex and lengthy search strategies. No single template fits all conceptualizations, but some can be applied to most processes. Database interface providers report queries as entered, not as they are actually processed. In PubMed, the often substantial difference between the entered and the processed query is shown under "Details." Quality control and evaluation of search strategies using a validated tool such as the PRESS checklist is suboptimal when only entry-query-based search histories are reported.
Moving toward an internationally accepted IR reporting standard requires advances in common reporting practice. Comprehensive, process-based reporting and presentation would make IR understandable to non-information specialists and facilitate quality control.
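The gap between the entered and the processed query described above can also be inspected programmatically: the NCBI E-utilities esearch service returns a QueryTranslation element containing the query as PubMed actually processed it, corresponding to the expansion shown under "Details" in the web interface. The sketch below illustrates this under stated assumptions; the sample XML and the helper processed_query are hypothetical, abbreviated constructs for illustration, not output copied from PubMed.

```python
import xml.etree.ElementTree as ET

# Hypothetical, abbreviated esearch response for the entered query
# "aspirin headache". A live request would target an endpoint such as:
# https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi?db=pubmed&term=aspirin+headache
SAMPLE_ESEARCH_XML = """<eSearchResult>
  <Count>1234</Count>
  <QueryTranslation>("aspirin"[MeSH Terms] OR "aspirin"[All Fields]) AND ("headache"[MeSH Terms] OR "headache"[All Fields])</QueryTranslation>
</eSearchResult>"""

def processed_query(esearch_xml: str) -> str:
    """Return the query as PubMed actually processed it (the 'Details' view)."""
    root = ET.fromstring(esearch_xml)
    node = root.find("QueryTranslation")
    # Fall back to an empty string if the response carries no translation.
    return node.text if node is not None and node.text else ""

if __name__ == "__main__":
    print(processed_query(SAMPLE_ESEARCH_XML))
```

Archiving this processed form alongside the entered query, as the authors recommend, would let reviewers apply tools such as the PRESS checklist to the search that was actually run.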


Available from: Christina Johanna Niederstadt, Jan 04, 2015
  • International Journal of Technology Assessment in Health Care 04/2011; 27(2):188-9; author reply 189-90. DOI:10.1017/S0266462311000080
  • ABSTRACT: Background: Rules and regulations form the framework of health technology assessments. Legal issues are associated either directly with the technology (e.g., patents and licenses) or with the patients and their basic rights (e.g., autonomy). Identifying the regulations of interest, as well as the relevant publications, in a systematic and transparent way requires a specific methodological approach. In the absence of adapted methods, our objective was to develop a methodological approach to the systematic retrieval of information on legal issues.
    Methods: (1) Conduct comprehensive literature searches to identify existing methodological approaches to systematic and transparent information retrieval on legal issues associated with health technologies. (2) Develop a specifically adapted proposal for a step-by-step information retrieval workflow. (3) Apply the proposed procedure to three examples, such as “ultrasound screening in pregnancy.”
    Results: No publications on adapted methods could be identified. We therefore developed a procedure following the workflow of information retrieval for effectiveness assessments. This workflow consists of eight steps: (0) pre-search: identification of the relevant rules, regulations, and patient-related issues; (1) translation of the search question; (2) concept building; (3) identification of synonyms; (4) selection of relevant information sources; (5) design of the search strategies; (6) execution and quality check; (7) saving the results and reporting.
    Conclusions: There are numerous publications on legal issues associated with health technologies. Specifically adapted procedures are qualified to identify them in a systematic and transparent manner with appropriate sensitivity and precision. Wider application seems reasonable in order to further test the method's practicality on more topics and to modify it if indicated.
    Zeitschrift für Evidenz, Fortbildung und Qualität im Gesundheitswesen 01/2012; 106(7):509–522. DOI:10.1016/j.zefq.2012.05.019
  • ABSTRACT: Aim: To report literature search strategies for the purpose of conducting knowledge-building and theory-generating qualitative systematic reviews.
    Background: Qualitative systematic reviews lie on a continuum from knowledge-building and theory-generating to aggregating and summarizing. Different types of literature searches are needed to optimally support these dissimilar reviews.
    Data sources: Articles published between 1989 and Autumn 2011. These documents were identified using a hermeneutic approach and multiple literature search strategies.
    Discussion: Redundancy is not the sole measure of validity when conducting knowledge-building and theory-generating systematic reviews. For these types of reviews, literature searches should be consistent with the goal of fully explicating concepts and the interrelationships among them. To accomplish this objective, a berry-picking approach is recommended, along with strategies for overcoming barriers to finding qualitative research reports.
    Implications: To enhance the integrity of knowledge-building and theory-generating systematic reviews, reviewers are urged to make literature search processes as transparent as possible, despite their complexity. This includes fully explaining and justifying which databases were used and how they were searched. It also means describing how literature tracking was conducted and how grey literature was searched. Finally, the decision to cease searching needs to be fully explained and justified.
    Conclusion: Predetermined linear search strategies are unlikely to generate search results adequate for conducting knowledge-building and theory-generating qualitative systematic reviews. Instead, it is recommended that iterative search strategies take shape as reviews evolve.
    Journal of Advanced Nursing 05/2012; 69(1). DOI:10.1111/j.1365-2648.2012.06037.x