Harley Z Ramelson

Harvard Medical School, Boston, Massachusetts, United States

Publications (7) · 9.59 Total impact

  •
    ABSTRACT: To assess the relationship between methods of documenting visit notes and note quality for primary care providers (PCPs) and specialists, and to determine the factors that contribute to higher-quality notes for two chronic diseases.
    Applied Clinical Informatics 05/2014; 5(2):480-490.
  •
    ABSTRACT: In a previous study, we reported on a successful clinical decision support (CDS) intervention designed to improve electronic problem list accuracy, but did not study variability of provider response to the intervention or provider attitudes towards it. The alert system accurately predicted missing problem list items based on health data captured in a patient's electronic medical record. To assess provider attitudes towards a rule-based CDS alert system, as well as heterogeneity of acceptance rates across providers, we conducted a by-provider analysis of alert logs from the previous study. In addition, we assessed provider opinions of the intervention via an email survey of providers who received the alerts (n = 140). Although the overall alert acceptance rate was 38.1%, individual provider acceptance rates varied widely, with an interquartile range (IQR) of 14.8%-54.4%, and many outliers accepting none or nearly all of the alerts they received. No demographic variables, including degree, gender, age, assigned clinic, medical school, or graduation year, predicted acceptance rates. Providers' self-reported acceptance rate and perceived alert frequency were only moderately correlated with actual acceptance rates and alert frequency. Acceptance of this CDS intervention among providers was highly variable, but this heterogeneity is not explained by measured demographic factors, suggesting that alert acceptance is a complex and individual phenomenon. Furthermore, providers' self-reports of their use of the CDS alerting system correlated only modestly with logged usage.
    Applied Clinical Informatics 01/2013; 4(1):144-52. · 0.39 Impact Factor
  •
    ABSTRACT: BACKGROUND: Clinical documentation, an essential process within electronic health records (EHRs), takes a significant amount of clinician time. How best to optimize documentation methods to deliver effective care remains unclear. OBJECTIVE: We evaluated whether EHR visit note documentation method was influenced by physician or practice characteristics, and the association of physician satisfaction with an EHR notes module. MEASUREMENTS: We surveyed primary care physicians (PCPs) and specialists, and used EHR and provider data to perform a multinomial logistic regression of visit notes from 2008. We measured physician documentation method use and satisfaction with an EHR notes module and determined the relationship between method and physician and practice characteristics. RESULTS: Of 1088 physicians, 85% used a single method to document the majority of their visits. PCPs predominantly documented using templates (60%) compared to 34% of specialists, while 38% of specialists predominantly dictated. Physicians affiliated with academic medical centers (OR 1.96, CI (1.23, 3.12)), based at a hospital (OR 1.57, 95% CI (1.04, 2.36)), and using the EHR for longer (OR 1.13, 95% CI (1.03, 1.25)) were more likely to dictate than use templates. Most of the 383 survey respondents were satisfied with the EHR notes module, regardless of their preferred documentation method. CONCLUSIONS: Physicians predominantly used a single method of visit note documentation and were satisfied with their approach, but the approaches they chose varied. Demographic characteristics were associated with preferred documentation method. Further research should focus on why variation exists, and on the quality of the documentation resulting from the different methods used.
    International Journal of Medical Informatics 04/2012 · 2.06 Impact Factor
  •
    ABSTRACT: Accurate clinical problem lists are critical for patient care, clinical decision support, population reporting, quality improvement, and research. However, problem lists are often incomplete or out of date. We sought to determine whether a clinical alerting system, which uses inference rules to notify providers of undocumented problems, improves problem list documentation. Inference rules for 17 conditions were constructed, and an electronic health record-based intervention was evaluated to improve problem documentation. We conducted a cluster randomized trial of 11 participating clinics affiliated with a large academic medical center, totaling 28 primary care clinical areas, with 14 receiving the intervention and 14 serving as controls. The intervention was a clinical alert directed to the provider that suggested adding a problem to the electronic problem list based on inference rules. The primary outcome measure was acceptance of the alert; the number of study problems added in each arm was assessed as a pre-specified secondary outcome. Data were collected during 6-month pre-intervention (11/2009-5/2010) and intervention (5/2010-11/2010) periods. In total, 17,043 alerts were presented, of which 41.1% were accepted. In the intervention arm, providers documented significantly more study problems (adjusted OR=3.4, p<0.001), with an absolute difference of 6277 additional problems. In the intervention group, 70.4% of all study problems were added via the problem list alerts. Significant increases in problem notation were observed for 13 of 17 conditions. Problem inference alerts significantly increase notation of important patient problems in primary care, which in turn has the potential to facilitate quality improvement. ClinicalTrials.gov: NCT01105923.
    Journal of the American Medical Informatics Association 01/2012; 19(4):555-61. · 3.57 Impact Factor
  •
    ABSTRACT: The Clinical Decision Support Consortium has completed two demonstration trials involving a web service for the execution of clinical decision support (CDS) rules in one or more electronic health record (EHR) systems. The initial trial ran in a local EHR at Partners HealthCare. A second EHR site, associated with Wishard Memorial Hospital, Indianapolis, IN, was added in the second trial. Data were gathered during each 6-month period and analyzed to assess performance, reliability, and response time in the form of means and standard deviations for all technical components of the service, including assembly and preparation of input data. The mean service call time for each period was just over 2 seconds. In this paper we report on the findings and analysis to date, while describing the areas for further analysis and optimization as we continue to expand our use of a service-oriented architecture (SOA) approach for CDS across multiple institutions.
    AMIA Annual Symposium Proceedings 01/2012; 2012:690-8.
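The per-component means and standard deviations the abstract above describes can be gathered with straightforward instrumentation around each service call. A minimal sketch in Python, where `call_cds_service` is a hypothetical stand-in for the actual CDS web-service invocation (the payload shape and sleep are illustrative assumptions, not the consortium's implementation):

```python
# Time each service invocation and summarize with mean and standard
# deviation, in the spirit of the performance analysis described above.
import time
from statistics import mean, stdev

def call_cds_service(payload):
    # Hypothetical placeholder: assemble input data and invoke the
    # remote CDS web service. Here it just simulates a short delay.
    time.sleep(0.01)
    return {"recommendations": []}

def timed_calls(payloads):
    """Return elapsed wall-clock seconds for each service call."""
    timings = []
    for p in payloads:
        start = time.perf_counter()
        call_cds_service(p)
        timings.append(time.perf_counter() - start)
    return timings

timings = timed_calls([{"patient_id": i} for i in range(5)])
print(f"mean={mean(timings):.3f}s sd={stdev(timings):.3f}s")
```

In a real deployment, each stage (data assembly, transport, rule execution) would be timed separately to produce the per-component statistics the study reports.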
  •
    ABSTRACT: Accurate knowledge of a patient's medical problems is critical for clinical decision making, quality measurement, research, billing, and clinical decision support. Common structured sources of problem information include the patient problem list and billing data; however, these sources are often inaccurate or incomplete. We sought to develop and validate methods of automatically inferring patient problems from clinical and billing data, and to provide a knowledge base for inferring problems. We identified 17 target conditions and designed and validated a set of rules for identifying patient problems based on medications, laboratory results, billing codes, and vital signs. A panel of physicians provided input on a preliminary set of rules. Based on this input, we tested candidate rules on a sample of 100,000 patient records to assess their performance compared to gold-standard manual chart review. The physician panel selected a final rule for each condition, which was validated on an independent sample of 100,000 records to assess its accuracy. Seventeen rules were developed for inferring patient problems. Analysis using a validation set of 100,000 randomly selected patients showed high sensitivity (range: 62.8-100.0%) and positive predictive value (range: 79.8-99.6%) for most rules. Overall, the inference rules performed better than using either the problem list or billing data alone. We developed and validated a set of rules for inferring patient problems. These rules have a variety of applications, including clinical decision support, care improvement, augmentation of the problem list, and identification of patients for research cohorts.
    Journal of the American Medical Informatics Association 05/2011; 18(6):859-67. · 3.57 Impact Factor
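To illustrate the style of rule the abstract above describes, here is a hypothetical sketch of a single inference rule combining medications, laboratory results, and billing codes. The condition, medication names, ICD-9 codes, threshold, and two-source evidence requirement are all illustrative assumptions, not the study's validated knowledge base:

```python
# Hypothetical problem-inference rule: flag a possible undocumented
# diabetes problem when multiple independent data sources agree.
def infer_diabetes(record):
    """Return True if the record suggests an undocumented diabetes problem."""
    diabetes_meds = {"metformin", "glipizide", "insulin glargine"}  # assumed list
    diabetes_icd9 = {"250.00", "250.02"}                            # assumed codes

    on_diabetes_med = any(m in diabetes_meds
                          for m in record.get("medications", []))
    high_hba1c = any(lab["name"] == "HbA1c" and lab["value"] >= 6.5
                     for lab in record.get("labs", []))
    billed_for_diabetes = any(c in diabetes_icd9
                              for c in record.get("billing_codes", []))

    # Require at least two independent lines of evidence before
    # suggesting the problem, to keep positive predictive value high.
    evidence = sum([on_diabetes_med, high_hba1c, billed_for_diabetes])
    return evidence >= 2

record = {
    "medications": ["metformin"],
    "labs": [{"name": "HbA1c", "value": 7.2}],
    "billing_codes": [],
}
print(infer_diabetes(record))  # medication + lab evidence -> True
```

In the study's framework, a rule like this would be tuned against gold-standard chart review and validated on an independent patient sample before being wired into the alerting system.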
  •
    ABSTRACT: Creating shareable decision support services is a complex task requiring effort from multiple interdisciplinary role players with a wide variety of experience and expertise. The CDS Consortium research project has developed such a service, defining a multi-layer representation of knowledge and building upon an architectural service design created at Partners HealthCare, and is demonstrating its use in both a local and an external institutional setting. The process was iterative, and we encountered unexpected requirements based on decisions made at various points. We report in this paper on the challenges we faced while pursuing this research: knowledge representation and modeling, data interchange and standards adoption, the process of getting agreement on content, the logistics of integrating into a system that already has multiple CDS interventions, legal issues around privacy and access, and inter-team communication and organization.
    AMIA Annual Symposium Proceedings 01/2010; 2010:602-6.