There is a growing trend toward patient-centered care as a means of improving patient satisfaction. The Centers for Medicare and Medicaid Services (CMS) has made this concept more significant with plans to link reimbursement to patient satisfaction measures such as the Hospital Consumer Assessment of Healthcare Providers and Systems (HCAHPS) survey.
To generate hypotheses about the reasons underlying diminished HCAHPS patient satisfaction survey ratings, with particular reference to hospitalists.
An observational study of adult medical, surgical, and critical care inpatients, conducted using a cognitive interview (CI) technique in a 180-bed community hospital.
A mixed qualitative and quantitative study using both standard responses and open-ended responses. The standard responses were compiled into raw numbers and percentages, and the qualitative responses were evaluated for common themes and other useful information.
Notable factors that may affect patient satisfaction include whether all of their questions were answered, incomplete discussion of medication side effects, and failure of physicians to listen and form personal connections with them.
CONCLUSIONS: Cognitive interview techniques can be used to provide additional detail regarding patient satisfaction beyond that provided by standard surveys.
ABSTRACT: To describe the developmental process for the CAHPS Hospital Survey.
A pilot was conducted in three states with 19,720 hospital discharges.
A rigorous, multi-step process was used to develop the CAHPS Hospital Survey. It included a public call for measures; multiple Federal Register notices soliciting public input; a review of the relevant literature; meetings with hospitals, consumers, and survey vendors; cognitive interviews with consumers; a large-scale pilot test in three states; consumer testing; and numerous small-scale field tests.
The current version of the CAHPS Hospital Survey has survey items in seven domains, two overall ratings of the hospital, and five items used for adjusting for the mix of patients across hospitals and for analytical purposes.
The CAHPS Hospital Survey is a core set of questions that can be administered as a stand-alone questionnaire or combined with a broader set of hospital specific items.
Health Services Research 01/2006; 40(6 Pt 2):1977-95. DOI:10.1111/j.1475-6773.2005.00477.x
ABSTRACT: This article describes the models and methods that cognitive psychologists and survey researchers use to evaluate and experimentally test cognitive issues in questionnaire design and subsequently improve self-report instruments. These models and methods assess the cognitive processes underlying how respondents comprehend and generate answers to self-report questions. Cognitive processing models are briefly described. Non-experimental methods (expert cognitive review, cognitive task analysis, focus groups, and cognitive interviews) are described, with examples of how these methods were effectively used to identify cognitive self-report issues. Experimental methods (cognitive laboratory experiments, field tests, and experiments embedded in field surveys) are also described. Examples are provided of: (a) how laboratory experiments were designed to test the capability and accuracy of respondents in performing the cognitive tasks required to answer self-report questions, (b) how a field experiment was conducted in which a cognitively designed questionnaire was effectively tested against the original questionnaire, and (c) how a cognitive experiment embedded in a field survey was conducted to test cognitive predictions.
Quality of Life Research 01/2003; 12(3).
ABSTRACT: To describe how cognitive testing results were used to inform the modification and selection of items for the Consumer Assessment of Health Providers and Systems (CAHPS) Hospital Survey pilot test instrument.
Cognitive interviews were conducted on 31 subjects in two rounds of testing: in December 2002-January 2003 and in February 2003. In both rounds, interviews were conducted in northern California, southern California, Massachusetts, and North Carolina.
A common protocol served as the basis for cognitive testing activities in each round. This protocol was modified to enable testing of the items as interviewer-administered and self-administered items and to allow members of each of three research teams to use their preferred cognitive research tools.
Each research team independently summarized, documented, and reported its findings. Item-specific and general issues were noted. After each round of testing, the results were reviewed and discussed by senior staff from each research team to inform the acceptance, modification, or elimination of candidate items.
Many candidate items required modification because respondents lacked the information required to answer them, failed to understand them consistently, or were asked to make distinctions that were too fine; or because the items were not measuring the constructs they were intended to measure or were based on erroneous assumptions about what respondents wanted or experienced during their hospitalization. Cognitive interviewing enabled the detection of these problems, and an understanding of each problem's etiology informed item revisions. However, for some constructs, the revisions proved inadequate. Accordingly, items could not be developed to provide acceptable measures of certain constructs, such as shared decision making, coordination of care, and delays in the admissions process.
Cognitive testing is the most direct way of finding out whether respondents understand questions consistently, have the information needed to answer the questions, and can use the response alternatives provided to describe their experiences or their opinions accurately. Many of the candidate questions failed to meet these standards. Cognitive testing only evaluates the way in which respondents understand and answer questions. Although it does not directly assess the validity of the answers, it is a reasonable premise that cognitive problems will seriously compromise validity and reliability.
Health Services Research 01/2006; 40(6 Pt 2):2037-56. DOI:10.1111/j.1475-6773.2005.00472.x