Online Ratings of Facial Plastic Surgeons: Worthwhile Additions to Conventional Patient
Experience Surveys
Cory D. Bovenzi, MD*
Department of Otolaryngology- Head & Neck Surgery
Thomas Jefferson University Hospital
Kirstin A. Manges, PhD, RN*
National Clinician Scholar- Perelman School of Medicine
University of Pennsylvania
Howard Krein, MD, PhD
Department of Otolaryngology- Head & Neck Surgery
Thomas Jefferson University Hospital
Ryan Heffelfinger, MD
Department of Otolaryngology- Head & Neck Surgery
Thomas Jefferson University Hospital
Corresponding Author:
Cory D. Bovenzi, MD
925 Chestnut Street, 6th Floor
Philadelphia, PA 19107
cory.bovenzi@jefferson.edu
215-955-6760
* These authors contributed equally to this work and are considered to be co-first authors.
Key Points
Question: What information is available on the physician rating websites used to evaluate facial plastic surgeons, and how does it compare with traditional measures?
Findings: This mixed methods study found that Healthgrades and Vitals are the most commonly used sites, that most ratings are positive, and that ratings are not influenced by surgeon demographics. Comments left on these sites follow themes that are not captured by the Consumer Assessment of Healthcare Providers and Systems surveys.
Meaning: Physician review websites provide helpful data in addition to traditional patient experience surveys and are an important aspect of healthcare, especially for facial plastic surgery practices.
Abstract
Background: Physician review websites are commonly used by patients, and patient satisfaction surveys are routinely used by the government to determine the quality of care delivered by facial plastic surgeons. However, in this population, physician review website trends and content are not well known, nor is the relationship between these data and those obtained from patient satisfaction surveys.
Methods: A retrospective mixed methods study was employed to quantitatively compare online ratings and comments of 100 randomly selected US facial plastic surgeons by surgeon characteristics and location using data from vitals.com, healthgrades.com, google.com and zocdoc.com. A qualitative approach was used to categorize themes present in 957 patient-generated comments and compare them with Consumer Assessment of Healthcare Providers and Systems (CAHPS) survey questions and themes.
Results: The physician review websites had favorable ratings of facial plastic surgeons, with 84.55% five-star reviews on Healthgrades and 78.40% on Vitals. These ratings were similar across surgeon age (p=0.44), gender (p=0.85), and geographic region (p=0.29). The most used sites were Healthgrades and Vitals regardless of geographic area or surgeon characteristics. Comments left on these sites represented a number of themes not present in CAHPS surveys; the most common additional theme was patients' evaluation of their outcome (55.28% of comments).
Conclusions: Facial plastic surgeons are, overall, rated positively online, and Healthgrades and Vitals appear to be the most important sites for them to focus on. The comments available on physician review websites provide important information to those seeking care that is not encompassed in CAHPS surveys.
Introduction
Increasingly, patients are turning to online physician rating websites (PRWs) prior to selecting their physician1. Approximately 35% of patients report seeing a physician because of positive online ratings, while a similar proportion (37%) report that negative reviews led them to seek care elsewhere2. PRWs are particularly important in facial plastic surgery3: because procedures are often elective, cosmetic, and paid for out of pocket, patients are incentivized to seek out "the best surgeon" for their particular needs. Similarly, PRW use is highest among younger, female, and educated patients1, characteristics that align with facial plastic surgery populations.
With increased patient use of PRWs, maintaining an online reputation has become an important component of practice management. Strategies employed include hiring private firms to manage PRWs, requiring patients to sign waivers prohibiting online reviews4, and suing patients for libel5. Despite these negative connotations, there is evidence that the majority of patients rate their physicians favorably6, especially plastic surgery patients7. Yet, even though PRWs are widely used by patients, the quality of the data they provide is unclear.
As the United States transitions towards a value-based healthcare system, there is an increased focus on measuring quality and patient experience. Each specialty within medicine is inherently different, and it is rare to find a comprehensive quality indicator. Thus, at the moment, surveys measuring patient experience are widely used as a proxy for assessing the quality of care delivered. The Agency for Healthcare Research and Quality (AHRQ) has developed Consumer Assessment of Healthcare Providers and
Systems (CAHPS) surveys specific to different patient interactions within the healthcare system8. These surveys are typically administered by a third party such as Press Ganey Associates, a private company9. In this way, patient experience can be measured and reported to the Centers for Medicare and Medicaid Services (CMS). CMS, in turn, uses Quality Payment Programs such as the Merit-Based Incentive Payment System (MIPS) and Alternative Payment Models (APMs), which use the CAHPS surveys to determine quality scores that are linked to reimbursement10. Eventually, CAHPS scores are expected to be published publicly on the CMS Physician Compare website11.
PRW reviews and CAHPS surveys are two forms of patient experience data that play critical roles in our current system: one for quick, practical use by patients, and the other for government incentive programs. The data extracted from each has been shown to differ for hospitals, with PRWs providing additional important quality domains not captured by CAHPS surveys12. However, to date, there has been no evaluation of how the surveys used by Press Ganey (Outpatient and Ambulatory Surgery CAHPS, OAS CAHPS; and Surgical Care CAHPS, S-CAHPS) compare with the data in the PRWs commonly used for facial plastic surgeons.
Methods
Design
This retrospective concurrent mixed methods study explores how PRWs are used to evaluate the performance of facial plastic surgeons13. Specifically, using a national sample of facial plastic surgeons (n=100), the numbers of reviews and comments were quantitatively compared across four commonly used PRWs. Then, the relationship between surgeon
characteristics and overall performance ratings for individual surgeons was determined for the two most frequently used PRWs. Last, a purposeful sample of narrative comments was qualitatively analyzed to describe patient experiences with facial plastic surgeons and to compare the topics with CAHPS survey questions. This study was deemed exempt by the local Institutional Review Board.
Sample
The American Academy of Facial Plastic and Reconstructive Surgery (AAFPRS) membership directory was used to identify a national sample of 100 facial plastic surgeons14. The sample was constructed by selecting every thirteenth individual from the AAFPRS directory. To be included, the physician needed to practice in the US, be board-certified by the American Board of Otolaryngology and/or the American Board of Facial Plastic and Reconstructive Surgery, and have at least one year of post-fellowship experience. The sample included data from 2015-2019 and focused on PRW profiles of individual surgeons rather than group practices or hospitals. For qualitative analysis, a purposeful sample13 of comments from Healthgrades was used. Healthgrades was selected because it is a frequently used PRW with an open-comment format and limited comment censorship15. To minimize bias, analysis was limited to 40 narrative comments per surgeon; therefore, if applicable, the 20 most recent lowest-rated and 20 most recent highest-rated comments were included.
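As an illustration only, the systematic selection step described above could be scripted along the following lines; the directory file, column names, and eligibility fields are hypothetical placeholders and are not part of the study's actual workflow.

```python
# Minimal sketch of the systematic sampling described above (every 13th
# directory entry, screened against the inclusion criteria). The file name,
# column names, and eligibility check are hypothetical placeholders.
import csv

def is_eligible(row):
    """Inclusion criteria paraphrased from the Methods section."""
    return (
        row["country"] == "US"
        and row["board_certified"] == "yes"          # ABOto and/or ABFPRS
        and int(row["years_post_fellowship"]) >= 1
    )

with open("aafprs_directory.csv", newline="") as f:  # hypothetical export
    directory = [row for row in csv.DictReader(f)]

# Take every 13th member, then keep only those meeting the criteria,
# capped at the target sample size of 100 surgeons.
systematic = directory[::13]
sample = [row for row in systematic if is_eligible(row)][:100]
print(f"Sampled {len(sample)} surgeons")
```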
Data Collection
In May 2019, an online search was conducted to obtain each surgeon's publicly available data from healthgrades.com, vitals.com, google.com, and zocdoc.com. Surgeon demographics and comments (up to 40 per surgeon) were collected from
Healthgrades.com, while the number of reviews/comments and overall star-rating scores were collected from all sites.
Variables
Overall Ratings. All platforms use a one- to five-star Likert scale and average reviewers' scores to determine the overall rating. However, the PRWs vary in what the overall rating reflects: Healthgrades15 and ZocDoc16 scores reflect likelihood to recommend, while Google17 and Vitals18 scores reflect the patient's overall experience with the provider.
Surgeon Characteristics. Healthgrades was used to identify surgeon demographics: age, gender, years of post-fellowship experience, and location. Healthgrades updates this information quarterly from public databases19. US Census Bureau definitions of geographic regions were used20.
Engagement. Two measures of patient engagement were determined for each PRW: the number of reviews and the number of comments per surgeon.
Analysis
Average physician rating scores were examined with histograms. Mood's median test and the Kruskal-Wallis test were used to compare the numbers of comments and reviews across geographic regions for all sites. Kruskal-Wallis tests were also employed to determine the relationship between surgeon characteristics and overall star ratings for individual surgeons on the most frequently used PRWs. All statistical analyses were conducted using Stata/IC statistical software (release 15, StataCorp LP).
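For readers who want to reproduce this kind of analysis outside Stata, the sketch below shows the equivalent SciPy calls for Mood's median test and the Kruskal-Wallis test. The review counts are invented for illustration and do not reproduce the study's data or results.

```python
# Illustrative re-implementation of the tests named above using SciPy.
# The review counts below are made up; they do not reproduce the paper's data.
from scipy.stats import kruskal, median_test

reviews_by_region = {            # hypothetical Healthgrades review counts
    "East":    [22, 48, 13, 95, 7, 31],
    "South":   [12, 5, 20, 8, 44, 3],
    "Midwest": [22, 17, 40, 9, 25],
    "West":    [9, 4, 15, 2, 28, 6],
}
groups = list(reviews_by_region.values())

# Mood's median test: do the regional medians differ?
stat, p, grand_median, table = median_test(*groups)
print(f"Mood's median test: chi2={stat:.2f}, p={p:.3f}, grand median={grand_median}")

# Kruskal-Wallis test on the same groups (also used for the pairwise regional
# follow-ups and for comparing star ratings across surgeon characteristics).
h, p_kw = kruskal(*groups)
print(f"Kruskal-Wallis: H={h:.2f}, p={p_kw:.3f}")
```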
Content analysis13 of narrative review comments was used to describe patient
experiences with facial plastic surgeons. Themes were identified both inductively from
reviewer comments, and deductively from the CAHPS questions. First, two researchers
(CB, KM) independently open-coded comments to identify author-generated inductive themes. Then, using a codebook, reviewer comments were independently re-coded to examine theme prevalence and to identify representative quotes. Coder interrater reliability was high (kappa=0.89). Additionally, word clouds of the comments from the highest- and lowest-rated reviews were used to illustrate linguistic patterns.
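As a minimal sketch of the reliability and visualization steps (assuming scikit-learn and the wordcloud package, which the authors do not specify), the example below computes Cohen's kappa for two coders' theme labels and renders a word cloud; the labels and comment text are toy examples, not the study data.

```python
# Sketch of the interrater-reliability and word-cloud steps; the labels and
# comments are toy examples only.
from sklearn.metrics import cohen_kappa_score
from wordcloud import WordCloud

# Theme codes assigned independently by two coders to the same five comments.
coder_1 = ["outcome", "bedside", "finances", "outcome", "follow_up"]
coder_2 = ["outcome", "bedside", "finances", "bedside", "follow_up"]
kappa = cohen_kappa_score(coder_1, coder_2)
print(f"Cohen's kappa = {kappa:.2f}")

# Word cloud of (hypothetical) one-star comments, mirroring Figure 2.
one_star_text = " ".join([
    "waited two hours and was told nothing",
    "staff was rude, never going back",
])
WordCloud(width=800, height=400, background_color="white") \
    .generate(one_star_text) \
    .to_file("one_star_wordcloud.png")
```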
Results
Sample
A nationally representative random sample of 100 facial plastic surgeons was drawn from approximately 1,300 eligible providers. The sample was predominantly male (n=87), with an average age of 51.3 years (standard deviation [SD]=9.6). Of the 100 physicians examined, 95 had at least one review posted on Healthgrades, compared with 90 on Vitals, 68 on Google, and 16 on ZocDoc. However, in general, online PRW ratings are most reliable when at least five ratings are available21. In our sample, 74 physicians had a reliable rating (≥5 reviews) on Healthgrades, compared with 75 on Vitals, 38 on Google, and 11 on ZocDoc. Table 1 provides sample demographic data.
Patient Engagement
Table 2 contains data regarding patient engagement across PRWs. The PRW with the
highest number of reviews was Vitals (n=4129 reviews), followed by Healthgrades
(n=3643), Google (n=1492), and ZocDoc (n=1292).
The median number of reviews varied by geographic region for Healthgrades (χ²=8.24, p=0.041): the median number of reviews was higher for surgeons in the East than in the South (χ²=8.86; p=0.003) or the West (χ²=9.71; p=0.002), and higher in the Midwest than in the West (χ²=5.47, p=0.02). No variation in the median number of reviews per surgeon across geographic regions was found for Vitals (χ²=3.43; p=0.34), Google (χ²=1.62; p=0.68), or ZocDoc (χ²=2.96; p=0.42). Appendix A provides geographic data.
Performance Ratings
All four PRWs showed similar, favorably skewed distributions of review ratings. Figure 1 illustrates the distribution of ratings for the top two PRWs: Healthgrades and Vitals. For Healthgrades, 11.14% of reviews were 1-star and 84.55% were 5-star (mean 4.48 stars). Similarly, for Vitals, 9.76% were 1-star and 78.40% were 5-star (mean 4.40 stars). There were disproportionately more 5-star and 1-star reviews on both sites compared with 2-, 3-, and 4-star ratings (χ²=284.11, p<.001).
For Healthgrades, individual ratings did not differ by surgeon characteristics: gender (χ²=0.03; p=0.85), geographic region (χ²=4.01; p=0.26), age (χ²=2.68; p=0.44), or years of experience (χ²=2.77; p=0.26). The results for Vitals were similar (data not shown). Appendix B provides additional surgeon demographic data.
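One way to illustrate the test of disproportionate extreme ratings is a chi-square goodness-of-fit on the Healthgrades star counts reported in Appendix Table B1. The uniform expected distribution below is our illustrative assumption, not the manuscript's specification, so the output will not match the reported χ²=284.11.

```python
# Chi-square goodness-of-fit on the Healthgrades star counts (Appendix Table B1).
# The uniform null is an illustrative choice; the manuscript does not specify
# its expected distribution, so this will not reproduce chi2=284.11.
from scipy.stats import chisquare

observed = [406, 52, 15, 90, 3080]   # 1- through 5-star review counts
chi2, p = chisquare(observed)        # default null: equal frequencies
print(f"chi2 = {chi2:.1f}, p = {p:.3g}")
```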
Narrative Review Themes
The qualitative analysis sample included 957 narrative review comments regarding 74
facial plastic surgeons on Healthgrades. The average number of reviewer comments
per surgeon was 12.6 (Median= 8; Min/Max: 1/40). The narrative reviews included 125
1-star reviews (13.06%), 16 2-star reviews (1.67%), three 3-star reviews (0.31%), 19 4-
star reviews (1.99%), and 794 5-star reviews (82.97%).
Figure 2 displays word clouds of the most commonly used words in comments with 1-
star (n=125) and 5-star (n=794) reviews. Examples of frequently used words for 1-star
reviews included “wait”, “told”, “rude”, and “never” (as in “never going back”). Examples
of common words unique to 5-star reviews included "results", "feel", "happy", "look",
and “caring”.
Thematic Representation of CAHPS Questions
The content of the OAS-CAHPS and S-CAHPS surveys was sorted into CAHPS themes: overall experience, recommendation, physician interactions, staff interactions, efficiency, check-in, pre-procedure, anesthesia instructions, discharge instructions, cleanliness, pain, bleeding, infection, and nausea/vomiting. Table 3 displays the prevalence and tone of each theme in reviewer comments; Appendix C lists each theme, its associated survey questions, and representative positive and negative reviewer quotes.
The most common theme was overall experience, which was present in 99.27% of comments (n=950); of these, 84.63% reflected a positive overall experience (n=804) (Table 3). Likelihood to recommend the physician was discussed in 31.45% of comments and was positive in 84.39% of those. The physician interactions theme encompassed the most
CAHPS questions of all the themes analyzed (Appendix Table C1) and, unsurprisingly, was the second most common theme patients commented on via PRWs (present in 58.90% of comments, n=564).
Themes that, when addressed, were most commonly portrayed in a positive manner
were overall experience, physician interactions, recommendation, staff interactions, pre-
procedure information, discharge instructions, cleanliness, and anesthesia instructions.
In contrast, efficiency, check-in, pain, bleeding, and infection were themes more likely to
be associated with negative reviews. Notably, comments on pain, cleanliness, bleeding,
infection, anesthesia instructions, and nausea/vomiting were present in less than 5% of
the comments (Table 3).
Additional Themes
Nine inductive author-generated themes were identified from the reviewer comments that may not be reflected in the CAHPS surveys: bedside manner, overall outcome, cosmetic outcome, functional outcome, answered questions, follow-up, knowledge, finances, and personal communication (Table 3).
The theme bedside manner describes the physician’s approach/attitude towards the
patient. Generally, this theme was expressed as the patient commenting on the
personality traits and communication skills of the physician. Bedside manner was
present in 58.62% of comments, and was similar in prevalence and content to the
physician interactions theme generated by CAHPS questions (58.90% of comments).
The theme of answered questions also seemed to be valued, as almost one-third of
reviewers commented on this theme (31.24%, n=299), typically in a positive manner
(89.63%, n=268). Many patients who commented on the physician's willingness to answer questions also commented on "willingness to listen", personalized care, and "taking enough time" to counsel them. Additionally, the surgeon's knowledge was mentioned in 14.63% of comments, and reviewers frequently described their facial plastic surgeon as "intelligent," "up to date", and "confident."
Overall outcome was present in 55.28% of comments (n=529) and was the second
most common additional theme and fourth most common theme overall. This theme
captured cosmetic, functional, and non-specified outcomes. Cosmetic outcome was
described in 30.62% of comments (n=293) and was more common than functional
outcomes (15.36%, n=147). Cosmetic outcome-related comments routinely described
patients having increased “confidence”, a “natural” appearance, and willingness to
recommend to others. Functional outcome comments were associated with descriptions
of “improved breathing” from functional rhinoplasty, as well as comments about general
otolaryngologic procedures performed by facial plastic surgeons such as endoscopic
sinus surgery and tympanostomy tube placement.
The follow-up theme described patients' perceptions of continuity of care with their facial plastic surgeon. A total of 176 reviewers (18.39%) commented on follow-up, predominantly focusing on how well they felt cared for during the surgical recovery period. Negative follow-up reviews tended to describe poor communication with the office and/or a lack of information about when and where to seek care if a problem arose. The related theme of personal communication was defined as the physician communicating with patients directly, outside of the office or hospital. Comments in this theme described facial
plastic surgeons providing a personal cell phone number or email address, or initiating contact themselves in the immediate postoperative period.
Finances was the sole predominantly negative theme among the additional themes. While uncommon (present in 5.22% of comments, n=50), these comments were overwhelmingly negative (86.00%, n=43) and often accompanied a negative outcome in which patients felt they had "wasted money". The few comments describing positive experiences mentioned how the "office helped work with their budget" or how their procedure was "worth the cost".
Discussion
This mixed methods study describes nationwide trends across physician review websites for facial plastic surgeons and compares the content of patient comments with themes found in traditional CAHPS patient experience surveys. The most widely used PRWs in facial plastic surgery are Healthgrades and Vitals, which provide similar ratings regardless of surgeon age, experience, gender, or location. These two PRWs were also found to be the most commonly used for general plastic surgeons7 and otolaryngologists3. The only other study of PRWs specifically for facial plastic surgeons focused on Yelp reviews in five large US cities22. However, in our preliminary review of websites to include, Yelp had profiles for fewer facial plastic surgeons, and data for individual surgeons were not separated when multiple partners practiced together. This study adds to the literature by examining multiple PRWs and using a randomly selected sample of surgeons from a range of urban, suburban, and rural areas.
The overall proportion of 5-star reviews for facial plastic surgeons in this study was 84.55% on Healthgrades and 78.40% on Vitals, similar to the rates observed in general plastic surgery7. The average Healthgrades rating of facial plastic surgeons was 4.48, which is higher than the 3.97 reported by Sobin and colleagues in a study limited to facial plastic surgeons in a northeastern academic setting3. That study also observed a larger proportion of negative comments (31.6%), almost double the frequency seen in our study (15.37%). This discrepancy may be due to the poorer patient satisfaction seen in academic institutions, as one study of otolaryngologists posits23.
The bimodal distribution of ratings, skewed towards either five stars or one star, is a recognized feature of PRW ratings across subspecialties24. This may be because a certain threshold of emotional involvement is required before a patient makes the conscious effort to seek out a PRW and leave an online review. In any case, examining negative and positive comments may provide both patients and providers with valuable information in real time. Positive comments may provide insight into what is going well within a practice, while negative comments, though a source of physician frustration, are an accessible avenue of transparency through which patients may hold providers and their staff accountable. Interestingly, comments from PRWs for emergency departments share similar linguistic patterns with negative reviews of facial plastic surgeons, as "told," "rude," and "never" were common in both populations25. Positive reviews, in contrast, focused more on "helpfulness" and being "friendly" in emergency department reviews, compared with "results," "looking," and "feeling" in facial plastic surgeon reviews25.
When comparing the prevalence of CAHPS-generated and author-generated themes, our findings identified important aspects of quality patient experiences that are not captured in CAHPS surveys12. Namely, patient perceptions of outcomes emerged as an important theme in PRW comments but are not touched on in CAHPS surveys. It is interesting that the subjective experience captured by these surveys omits the patient's interpretation of whether the treatment or surgery was successful. As supported by our findings, this is arguably one of the most important factors patients look for when choosing a facial plastic surgeon, hence the importance of before-and-after photos and the rise of professional Instagram pages for facial plastic surgery practices. In contrast, several CAHPS themes had a low prevalence in review comments. The lack of comments may indicate that the qualities represented by those themes are expected by most patients, unimportant to most patients, or encountered at a very low frequency.
Limitations
The data obtained from PRWs are not without limitations. Ratings cannot be directly compared across PRWs because the sites use different criteria to determine what constitutes a "5-star rating". The prevalence of fake reviews on PRWs is unknown and may skew data in these open online settings26,27, whereas CAHPS surveys are sent only to verified patients. Despite the questionable validity of PRW patient reviews and comments, it is important to consider them in context, as they are what patients see and use to make decisions about healthcare12.
For quality payment programs through CMS, there is the option to add questions from the Clinician and Group CAHPS survey (CG-CAHPS) to the OAS-CAHPS and S-CAHPS surveys8, which may theoretically increase the number of themes captured. However, these additional questions do not appear to cover the author-generated themes found in this study. Importantly, physician-specific CAHPS scores were not available to compare with these individual physicians' online ratings; however, the two measures have been shown to capture different phenomena when compared in the setting of academic otolaryngologists28.
Last, during theme generation, it is possible that others may have identified additional themes29. Interested readers are directed to Appendix C, which describes each theme in more detail.
Conclusion
Facial plastic surgeons are, overall, rated positively online, and Healthgrades and Vitals appear to be the most important sites for them to focus on, regardless of individual characteristics or region within the US. The comments available on physician review websites provide important information to those seeking care that is not encompassed in the CAHPS surveys provided by CMS. Thus, PRW narrative reviews provide helpful additional data to supplement traditional patient experience surveys like CAHPS.
Acknowledgments
None
Authorship Confirmation Statement
The authors confirm all listed authors had substantial contributions to the conception or
design of the work; or the acquisition, analysis, or interpretation of data for the work;
and drafting the work or revising it critically for important intellectual content; and final
approval of the version to be published; and agree to be accountable for all aspects of
the work in ensuring that questions related to the accuracy or integrity of any part of the
work are appropriately investigated and resolved.
Author Disclosures
The authors confirm they have no competing financial interests, funding, employment, or other relationships that may affect the integrity of the research reported.
References
1. Terlutter R, Bidmon S, Röttl J. Who Uses Physician-Rating Websites?
Differences in Sociodemographic Variables, Psychographic Variables, and
Health Status of Users and Nonusers of Physician-Rating Websites. J Med
Internet Res 2014;16(3):e97
2. Hanauer DA, Zheng K, Singer DC, Gebremariam A, Davis MM. Public
Awareness, Perception, and Use of Online Physician Rating Sites. JAMA.
2014;311(7):734–735. doi:10.1001/jama.2013.283194
3. Sobin L, Goyal P. Trends of online ratings of otolaryngologists: what do your
patients really think of you? JAMA Otolaryngol Head Neck Surg. 2014
Jul;140(7):635-8. doi: 10.1001/jamaoto.2014.818.
4. Woodward C. “Anti-defamation” group seeks to tame the rambunctious world of
online doctor reviews. CMAJ. 2009;180(10):1010.
5. Paavola A. Miami plastic surgeon sues former patients for bad Yelp reviews.
Becker’s Hospital Review. https://www.beckershospitalreview.com/legal-
regulatory-issues/miami-plastic-surgeon-sues-former-patients-for-bad-yelp-
reviews.html. Updated May 3, 2019. Accessed September 1, 2019.
6. Kadry B, Chu LF, Kadry B, Gammas D, Macario A. Analysis of 4999 online
physician ratings indicates that most patients give physicians a favorable rating. J
Med Internet Res. 2011 Nov 16;13(4):e95. doi: 10.2196/jmir.1960.
7. Lewis P, Kobayashi E, Gupta S. An online review of plastic surgeons in southern
California. Ann Plast Surg. 2015 May;74 Suppl 1:S66-70.
8. CAHPS Patient Experience Surveys and Guidance. Agency for Healthcare
Research and Quality, Rockville, MD. https://www.ahrq.gov/cahps/surveys-
guidance/index.html. Updated August 2019. Accessed September 1, 2019.
9. Press Ganey® Associates Medical Practice.
http://www.pressganey.com/index.aspx. Updated September 4, 2019. Accessed
September 4, 2019.
10. Quality Payment Program. US Centers for Medicare and Medicaid Services.
http://www.qpp.cms.gov. Accessed September 1, 2019.
11. Physician Compare. US Centers for Medicare and Medicaid Services.
https://www.medicare.gov/physiciancompare/#. Accessed September 1, 2019.
12. Ranard BL, Werner RM, Antanavicius T, Schwartz HA, Smith RJ, Meisel ZF,
Asch DA, Ungar LH, Merchant RM. Yelp Reviews Of Hospital Care Can
Supplement And Inform Traditional Surveys Of The Patient Experience Of Care.
Health Aff (Millwood). 2016 Apr;35(4):697-705. doi: 10.1377/hlthaff.2015.1030.
13. Creswell JW, Klassen AC, Plano Clark VL, Smith KC. Best Practices for Mixed Methods Research in the Health Sciences. Bethesda, MD: National Institutes of Health; 2011.
14. American Academy of Facial Plastic and Reconstructive Surgery. American
Academy of Facial Plastic and Reconstructive Surgery, Inc.
http://www.aafprs.org. Accessed May 1, 2019.
15. Patient Satisfaction. Healthgrades Operating Company, Inc.
https://www.healthgrades.com/quality/patient-satisfaction. Accessed May 1,
2019.
16. About Zocdoc. Zocdoc Inc. http://support.zocdoc.com. Accessed May 1, 2019.
17. Google My Business Help: Score ratings for local places. Google LLC.
https://support.google.com/business/answer/4801187?hl=en. Accessed May 1,
2019.
18. Vitals.com: Find a Doctor, Doctor Reviews & Ratings. Vitals Consumer Services,
LLC. http://www.vitals.com. Accessed May 1, 2019.
19. Provider Profile. Healthgrades Operating Company, Inc.
https://helpcenter.healthgrades.com/help/provider-profile. Updated September
14, 2017. Accessed May 1, 2019.
20. Geography Program. United States Census Bureau.
https://www.census.gov/geo/reference/gtc/gtc_census_divreg.html?cssp=SERP.
Accessed September 1, 2019.
21. Bardach NS, Asteria-Peñaloza R, Boscardin WJ, Dudley RA. The relationship
between commercial website ratings and traditional hospital performance
measures in the USA. BMJ Qual Saf. 2013 Mar;22(3):194-202. doi:
10.1136/bmjqs-2012-001360. Epub 2012 Nov 23.
22. Shemirani NL, Castrillon J. Negative and Positive Online Patient Review of
Physician- 1 vs 5 Stars. JAMA Facial Plast Surg. 2017;19(5):435-436.
doi:10.1001/jamafacial.2016.2039.
23. Boss EF, Thompson RE. Patient satisfaction in otolaryngology: Can academic
institutions compete? Laryngoscope. 2012 May;122(5):1000-9. doi:
10.1002/lary.23255. Epub 2012 Mar 27.
24. Dorfman RG, Purnell C, Qiu C, Ellis MF, Basu CB, Kim JYS. Happy and
Unhappy Patients: A Quantitative Analysis of Online Plastic Surgeon Reviews for
Breast Augmentation. Plast Reconstr Surg. 2018 May;141(5):663e-673e. doi:
10.1097/PRS.0000000000004268.
25. Agarwal AK, Pulullo AP, Merchant RM. "Told": the Word Most Correlated to
Negative Online Hospital Reviews. J Gen Intern Med. 2019 Jul;34(7):1079-1080.
doi: 10.1007/s11606-019-04870-6.
26. Malbon, J. Taking Fake Online Consumer Reviews Seriously. J Consum Policy.
2013; 36(2): 139-157. https://doi.org/10.1007/s10603-012-9216-7.
27. Bialer J. What I learned from scraping every single ZocDoc doctor. NY Data
Science Academy. https://nycdatascience.com/blog/student-works/web-
scraping/analyzing-zoc-doc-doctors/. Updated February 23, 2017. Accessed
September 1, 2019.
28. Ryan T, Specht J, Smith S, DelGaudio JM. Does the Press Ganey Survey
Correlate to Online Health Grades for a Major Academic Otolaryngology
Department? Otolaryngol Head Neck Surg. 2016 Sep;155(3):411-5. doi:
10.1177/0194599816652386. Epub 2016 May 24.
29. Vaismoradi M, Turunen H, Bondas T. Content analysis and thematic analysis:
Implications for conducting a qualitative descriptive study. Nurs Health Sci. 2013
Sep;15(3):398-405. doi: 10.1111/nhs.12048. Epub 2013 Mar 11.
Table 1. Demographic data of facial plastic surgeons sampled (n=100)

Characteristic        Category   Count
Gender                Male       87
                      Female     13
Geographic Region     South      33
                      East       19
                      Midwest    15
                      West       33
Age (Years)           30-40      11
                      40-50      32
                      50-60      38
                      60+        19
Experience (Years)    0-10       17
                      11-20      33
                      21-30      37
                      30+        13
Table 2. Patient engagement with physician review websites

Website               Engagement Metric   Total (n)   Mean (SD) per Surgeon   Median per Surgeon   Overall Min/Max per Surgeon
All websites (total)  Reviews             10556       105.56 (76.91)          5.00                 0/906
                      Comments            5610        56.10* (48.51)          2.00                 0/760*
Healthgrades          Reviews             3643        36.43 (71.13)           13.00                0/499
                      Comments*           1305        13.05 (19.47)           3.00                 0/75
Vitals                Reviews             4129        41.29 (93.93)           15.50                0/778
                      Comments            2620        26.20 (83.44)           6.50                 0/760
Google                Reviews             1492        14.92 (28.44)           3.00                 0/182
                      Comments            1133        11.33 (24.04)           2.00                 0/161
ZocDoc                Reviews             1292        12.92 (92.29)           0.00                 0/906
                      Comments            552         5.52 (36.48)            0.00                 0/353

*Per website protocols, the Healthgrades number of comments is limited to a maximum of 75 per surgeon. There were 4 surgeons with >75 comments.
Figure 1. Distribution of overall star review scores of facial plastic surgeons on Healthgrades
(n=3643 reviews) and Vitals (n=4129 reviews)
*Healthgrades overall scores reflect likelihood to recommend; Vitals scores reflect the patient's overall experience with the provider.
Figure 2. Word clouds illustrating linguistic patterns of comments with the lowest (n=125) and highest (n=794) star-rated reviews. The letter size and darkness increase with frequency of word use. 2A: One-star reviews. 2B: Five-star reviews.
Table 3. Presence and tone of thematic trends in narrative reviewer comments from Healthgrades for facial plastic surgeons (n=957 reviewer comments)

Source             Theme                       Presence % (n)   Positive Tone % (n)   Negative Tone % (n)
CAHPS*             Overall Experience          99.27 (n=950)    84.63 (n=804)         15.37 (n=146)
                   Physician Interactions      58.90 (n=564)    87.59 (n=494)         12.41 (n=70)
                   Recommendation              31.45 (n=301)    84.39 (n=254)         15.61 (n=47)
                   Staff Interactions          30.72 (n=294)    89.12 (n=262)         10.88 (n=32)
                   Time/Efficiency             19.85 (n=190)    68.95 (n=131)         31.05 (n=59)
                   Pre-procedure Information   13.48 (n=129)    86.05 (n=111)         13.95 (n=18)
                   Discharge Instructions      5.85 (n=56)      82.14 (n=46)          17.86 (n=10)
                   Check-in                    5.64 (n=54)      62.96 (n=34)          37.04 (n=20)
                   Pain                        4.39 (n=42)      73.81 (n=31)          26.19 (n=11)
                   Cleanliness                 1.46 (n=14)      100 (n=14)            0 (n=0)
                   Bleeding                    0.31 (n=3)       33.33 (n=1)           66.66 (n=2)
                   Infection                   0.21 (n=2)       0 (n=0)               100 (n=2)
                   Anesthesia Instructions     0.11 (n=1)       100 (n=1)             0 (n=0)
                   Nausea/Vomiting             0 (n=0)          N/A                   N/A
Author-Inductive   Bedside Manner              58.62 (n=605)    82.48 (n=499)         10.25 (n=62)
                   Overall Outcome             55.28 (n=529)    86.77 (n=459)         13.23 (n=70)
                   Answered Questions          31.24 (n=299)    89.63 (n=268)         10.37 (n=31)
                   Cosmetic Outcome            30.62 (n=293)    88.05 (n=258)         11.95 (n=35)
                   Follow Up                   18.39 (n=176)    78.98 (n=139)         21.02 (n=37)
                   Functional Outcome          15.36 (n=147)    78.91 (n=116)         21.09 (n=31)
                   Knowledge                   14.63 (n=140)    86.43 (n=121)         13.57 (n=19)
                   Finances                    5.22 (n=50)      14.00 (n=7)           86.00 (n=43)
                   Personal Communication      3.66 (n=35)      94.29 (n=33)          5.71 (n=2)

*Note: CAHPS = Consumer Assessment of Healthcare Providers and Systems Survey
Appendix
Table A1. Mood's median test to examine differences in the median number of reviews for each physician review website by region (n=100 physicians)
Table A2. Average and median number of Healthgrades reviews of facial plastic surgeons (n=100) by region
Table A3. Results of follow-up Kruskal-Wallis tests to examine regional differences in the number of reviews on Healthgrades for 100 facial plastic surgeons
Table B1. Description of star ratings for Healthgrades (n=3522 reviews)
Table B2. Description of star ratings for Vitals (n=4166 reviews)
Table B3. Comparing reviewer ratings on Healthgrades by gender, geographic region, age, and experience for 74 surgeons
Table C1. Linking themes from the Outpatient and Ambulatory Surgery CAHPS (OAS CAHPS) and Surgical Care CAHPS (S CAHPS) surveys to representative quotes from Healthgrades narrative review comments (n=957)
Table C2. Linking author-generated inductive themes to representative quotes from Healthgrades narrative review comments (n=957)
Appendix A
Table A1. Mood's median test to examine differences in the median number of reviews for each physician review website by region (n=100 physicians)

Website        Number of Reviews   Overall Median   χ² (df=3)   P-Value
Healthgrades   3643                13.00            8.24        0.04*
Vitals         4129                15.50            3.43        0.34
Google         1492                3.00             1.62        0.68
ZocDoc         1292                0                2.96        0.42

*Significant p-value (<0.05).
Table A2. Average and median number of Healthgrades reviews of facial plastic surgeons (n=100) by region

Region    Number of Physicians   Total Number of Reviews   Mean Reviews per Physician   Median Reviews per Physician
East      19                     1088                      57.26                        22
South     33                     1203                      36.45                        12
West      33                     640                       19.38                        9
Midwest   15                     591                       39.4                         22
Table A3. Results of follow-up Kruskal-Wallis tests to examine regional differences in the number of reviews on Healthgrades for 100 facial plastic surgeons

Regions Compared   χ² (df=1)   P-Value
East-South         8.86        0.003*
East-Midwest       1.11        0.292
East-West          9.71        0.002*
West-Midwest       5.47        0.020*
West-South         0.78        0.376
South-Midwest      0.96        0.327

*Significant p-value (<0.05).
Appendix B
Table B1. Description of star ratings for Healthgrades (n=3522 reviews)

Rating   Total Number (%) of Reviews*   Average Number of Reviews per MD   Min   Max   SD
1        406 (11.14%)                   4.27                               0     27    5.50
2        52 (1.42%)                     0.57                               0     7     1.05
3        15 (<1%)                       0.17                               0     3     0.49
4        90 (2.20%)                     0.95                               0     22    2.67
5        3080 (84.55%)                  32.42                              0     501   72.90

*Only providers with Healthgrades review scores were used (n=95 physicians).
Table B2. Description of star ratings for Vitals (n=4166 reviews)

Rating   Total Number (%) of Reviews*   Average Number of Reviews per MD   Min   Max   SD
1        403 (9.76%)                    4.48                               0     51    5.50
2        155 (3.72%)                    1.72                               0     16    1.05
3        71 (1.7%)                      0.79                               0     6     0.49
4        263 (6.31%)                    2.92                               0     35    2.67
5        3237 (78.40%)                  35.97                              0     728   72.90

*Only providers with Vitals review scores were used (n=90 physicians).
Table B3. Comparing reviewer ratings on Healthgrades by gender, geographic region, age, and experience for 74 surgeons*

Characteristic       Category   Number of Physicians   χ²     P-Value
Gender               Male       70                     0.03   0.85
                     Female     12
Geographic Region    South      28                     4.01   0.26
                     East       11
                     Midwest    11
                     West       23
Age (Years)          30-40      18                     2.68   0.44
                     40-50      18
                     50-60      23
                     60+        15
Experience (Years)   0-10       18                     3.77   0.29
                     11-20      19
                     21-30      22
                     30+        15

*Only providers with valid Healthgrades review scores (≥5 reviews per surgeon) were used.
Appendix C
Table C1. Linking themes from the Outpatient and Ambulatory Surgery CAHPS (OAS CAHPS) and Surgical Care CAHPS (S CAHPS) surveys to representative quotes from Healthgrades narrative review comments (n=957).
Theme
OAS and S CAHPS Questions
Positive Comment Example
Negative Comment Example
Check-in
Question(Q) 3 OAS-CAHPS: Did the check-in
process run smoothly?
“Making appointments
have never been easy
and I’ve never had to wait
longer than a few
minutes, if that. The office
staff made me feel so
welcome.”
"We got a call in the morning
to be there at 10, and had to
rush there since we were
previously told to bet there at
12:30, then he wasn't seen
until 3:30!"
Pre-procedure
information
Q1-OAS: Before your procedure, did your doctor or
anyone from the facility give you all the information
you needed about your procedure?
Q2-OAS: Before your procedure, did your doctor or
anyone from the facility give you easy to understand
instructions about getting ready for your procedure?
Q9-OAS: Did the doctors and nurses explain your
procedure in a way that was easy to understand?
Q3-S: Before your surgery, did anyone in this
surgeon's office give you all the information you
needed about your surgery?
Q4-S: Before your surgery, did anyone in this
surgeon's office give you easy to understand
instructions about getting ready for your surgery?
“She took the time to
explain to us the plan of
care, and made sure my
mom understood each
step in the process. It
was an exceptionally
pleasant experience. We
felt ready for the
procedure and like we
were in good hands.”
"Very helpful in explaining
steps of the procedure."
"The doctor did not inform
me of any of the possible
risks."
“He does a very brief
evaluation and does not
explain pros and cons or
details about what surgery
he may provide.”
“I do not feel like I realized
what a big surgery it was or
the risks before I had it
done. If I had, I’m not sure I
would have gone through
with it.”
Cleanliness
Q4-OAS: Was the facility clean?
"The office was very
clean and inviting."
N/A
Anesthesia
Instructions
Q10-OAS: Anesthesia is something that would make
you feel sleepy or go to sleep during your procedure.
Were you given anesthesia?
Q11-OAS: Did your doctor or anyone from the facility
explain the process of giving anesthesia in a way that
was easy to understand?
Q12-OAS: Did your doctor or anyone from the facility
explain the possible side effects of the anesthesia in
a way that was easy to understand?
"The department was the
best I've seen as was the
anesthesia."
N/A
Q20-S: Did this anesthesiologist encourage you to
ask questions?
Q22-S: Did this anesthesiologist answer your
questions in a way that was easy to understand?
Q24-S: Did talking with this anesthesiologist during
this visit make you feel more calm and relaxed?
Q25-S: Using any number from 0 to 10, where 0 is
the worst anesthesiologist possible and 10 is the best
anesthesiologist possible, what number would you
use to rate all your care from this anesthesiologist?
Staff Interactions
(Including Nurse)
Q5-OAS: Were the clerks and receptionists at the
facility as helpful as you thought they should be?
Q6-OAS: Did the clerks and receptionists at the
facility treat you with courtesy and respect?
Q37-S: During these visits, did clerks and
receptionists at this surgeon's office treat you with
courtesy and respect?
"…from start to finish the
staff at this office was
courteous, helpful and
understanding."
“The staff was so
accommodating and
warm and I felt at ease
the day of my procedure.”
“Please BELIEVE the
reviews. Never encountered
such a rude, curt,
unprofessional, and robotic
front-line people-wouldn’t
dignify calling them team or
staff.”
"Staff was rude and
unprofessional"
Physician
Interactions
Q7-OAS: Did the doctors and nurses treat you with
courtesy and respect?
Q8-OAS: Did the doctors and nurses make sure you
were as comfortable as possible?
Q9-OAS: During your office visits before your
surgery, did this surgeon listen carefully to you?
Q11-S: During your office visits before your surgery,
did this surgeon encourage you to ask questions?
Q12-S: During your office visits before your surgery,
did this surgeon show respect for what you had to
say?
Q15-S: After you arrived at the hospital or surgical
facility, did this surgeon visit you before your
surgery?
Q17-S: Before you left the hospital or surgical facility,
did this surgeon discuss the outcome of your surgery
with you?
Q31-S: After your surgery, did this surgeon listen
carefully to you?
"He was very courteous,
personable, and very
respectful of my needs."
“Both Dr. X and staff are
very caring and
professional and respect
the patients time and
concerns.”
“I felt that throughout the
entire experience I was
treated with the utmost
respect. My voice was
heard! I was treated with
care and concern every
step of the way!”
“I feel like Dr. X really
cares about you
personally. She took the
time to listen to my needs
and was present
“… [the doctor] never even
bothered to ask me if I have
any other questions or
concerns. There was also a
total lack of communication
during the procedure: the
doctor simply did his routine
without even warning me
what I should expect. Such
communication is essential
when somebody is poking
inside your ear with a sharp
stick. When I felt
uncomfortable after the
procedure was done, he
simply dismissed that and
said it was normal.”
“…he was brisk and abrupt
and made me full
uncomfortable about wasting
his time.”
Q33-S: After your surgery, did this surgeon
encourage you to ask questions?
Q34-S: After your surgery, did this surgeon show
respect for what you had to say?
throughout the entire
process.”
“Untrustworthy and does
unnecessary surgeries.”
Efficiency/Time
Q10-S: During your office visits before your surgery,
did this surgeon spend enough time with you?
Q3-S: After your surgery, did this surgeon spend
enough time with you?
“In this world of hurried,
busy doctors, Dr. X takes
time and truly listens to
my concerns and offers
an abundance of
solutions.”
"The wait time is 2+ hours.
Clearly they don’t value your
time."
“He rushed in/out, and made
me feel like I was a hassle
not a PAYING patient.”
Discharge Instructions
CAHPS survey items:
Q13-OAS: Discharge instructions include things like symptoms you should watch for after your procedure, instructions about your medicine, and home care. Before you left the facility, did you get written discharge instructions?
Q14-OAS: Did your doctor or anyone from the facility prepare you for what to expect during your recovery?
Q26-S: Did anyone in this surgeon's office explain what to expect during your recovery period?
Q27-S: Did anyone in this surgeon's office warn you about any signs or symptoms that would need immediate medical attention during your recovery period?
Q28-S: Did anyone in this surgeon's office give you easy to understand instructions about what to do during your recovery period?
Positive comment examples:
"He explained my surgery procedure and recovery in detail."
"The staff are so detailed and are home care packet has all of the information you would every want. I am 5 days out, and there have been no surprises as she explained everything to expect so thoroughly."
"We knew exactly what to expect once we got home and the doctor called to make sure everything was going all right that night!"
Negative comment examples:
"No aftercare instructions were given and these were requested for over a week. Finally, we just stopped requesting. I'm not sure even exactly what my son had done."
"When I asked how his ear looked he replied 'infected'. That was it... I had to stop him from leaving & ask if there was anything I needed to watch out for before our follow up in a week. Anything that would warrant a call or trip to ER. His reply 'seizures, coma' and left the room."
Pain
CAHPS survey items:
Q15-OAS: Some ways to control pain include prescription medicine, over-the-counter pain relievers or ice packs. Did your doctor or anyone from the facility give you information about what to do if you had pain as a result of your procedure?
Q16-OAS: At any time after leaving the facility, did you have pain as a result of your procedure?
Positive comment examples:
"I had zero pain."
"She told me a head of time how much pain to expect and what I could do to help minimize it. However, when I got home, it ended up not being that bad! It was nice to know what to expect."
Negative comment examples:
"…the pain was intense."
"…when he was finally ready to perform the procedure he was rather surly. It was very rough (a piece of my tongue was cut for a biopsy) and he didn't offer a prescription for pain meds. I don't consider myself a sissy but I was miserable for days."
Nausea/Vomiting
CAHPS survey items:
Q17-OAS: Before you left the facility, did your doctor or anyone from the facility give you information about what to do if you had nausea or vomiting?
Q18-OAS: At any time after leaving the facility, did you have nausea or vomiting as a result of either your procedure or the anesthesia?
Positive comment example: N/A
Negative comment example: N/A
Bleeding
CAHPS survey items:
Q19-OAS: Before you left the facility, did your doctor or anyone from the facility give you information about what to do if you had bleeding as a result of your procedure?
Q20-OAS: At any time after leaving the facility, did you have bleeding as a result of your procedure?
Positive comment examples:
"…postop I had absolutely no pain or bleeding."
"They let us know I should call if it started bleeding overnight, but it was fine!"
Negative comment examples:
"I had a hard time with the bleeding."
"I saw him twice, and both times left with a bloody nose. I asked him to be gentle, but he said I must be just sensitive."
Infection
CAHPS survey items:
Q21-OAS: Possible signs of infection include fever, swelling, heat, drainage or redness. Before you left the facility, did your doctor or anyone from the facility give you information about what to do if you had possible signs of infection?
Q22-OAS: At any time after leaving the facility, did you have any signs of infection?
Positive comment example: N/A
Negative comment example:
"I have gone through an infection so bad I was sent to the ER to get an IV."
Overall
CAHPS survey items:
Q23-OAS: Using any number from 0 to 10, where 0 is the worst facility possible and 10 is the best facility possible, what number would you use to rate this facility?
Q35-S: Using any number from 0 to 10, where 0 is the worst surgeon possible and 10 is the best surgeon possible, what number would you use to rate all your care from this surgeon?
Positive comment examples:
"I have a lot of trust in this surgeon and facility."
"I can't imagine a better experience with Dr. X, it was perfect and easy. I'd give 100 out of 5 stars if I could."
Negative comment examples:
"I had a very bad experience."
"This guy gives me the creeps. Stay far away."
"If I could give negative stars, I would. Do your homework."
Recommendation Likelihood
CAHPS survey item:
Q24-OAS: Would you recommend this facility to your friends and family?
Positive comment examples:
"A+ doctor, facility, and staff. I 100% recommend!"
"My entire family has gone to Dr. X. You should too!"
Negative comment examples:
"Please run as fast as you can from this guy."
"Warning! Do not go here. I guarantee the people giving positive reviews have been offered a discount for a good review, because that's the only way this guy can get business."
Table C2. Linking author-generated inductive themes to representative quotes from Healthgrades narrative review comments (n = 957). For each theme, positive and negative comment examples are shown.
Bedside Manner
Positive comment examples:
"He has a warm, friendly, inviting bedside manner."
"[Doctor] is hands down equally as amazing, talented, caring and patient. She goes above and beyond to ensure all patient needs are met."
"He is amazing, super nice, confident in his work. He truly cares about his patients and wants to make sure their care/surgeries are successful."
"As a physician, he is kind and caring, always available, and gets to know each patient personally."
Negative comment examples:
"Very dishonest and rude."
"…he was rude, snippy, and snappy at me. He made disdainful faces like a bratty little boy at me for no reason. He did all of the talking, & didn't ask me any questions about the my particular situation for which I was seeking treatment."
"He has a terrible bedside manner and is downright mean. Run as fast as you can. I wish I had!!!"
"He did all the talking. Didn't do an exam. And…"
Outcome (Overall)
Positive comment examples:
"I am ecstatic with my results."
"The results are life changing, thank you so much!"
"At first, I was a little put off by Dr. X's bedside manner, but the results speak for themselves. I'm so happy!"
"I couldn't have imagined a better outcome. I should have done this a decade ago. You are in good hands with Dr. X."
"I love it! She clearly did what we discussed, it's perfect."
Negative comment examples:
"Did what HE wanted in surgery, I am very dissatisfied"
"Better surgeons at your local butcher shop."
"When evaluating a physician, it doesn't really matter how 'nice' the doctor or staff is - the bottom line is the skill level and outcome. Learn from me. Don't be fooled, go elsewhere."
"More selling than results."
Cosmetic Outcome
Positive comment examples:
"Gave me the cutest little nose, it's just what I wanted."
"My results were amazing. Dr. X took years off of my face and neck, all with no visible scarring."
"I was so nervous about the surgery, but now I am so glad I had it done. I am so happy with the results. It looks so natural, and I finally feel confident and great about how I look in pictures."
"My results were amazing. Dr. X took years off of my face and neck, all with no visible scarring… He performs customized procedures for the best outcome."
"…[Doctor X] has a wonderful gift of making me feel fresh and beautiful without looking too overdone."
Negative comment examples:
"I live in horror everyday with the way I look since the surgery. I feel surgeries are taken very lightly without considering the traumatic affect they can have on the patient later on."
"If I could leave no stars I would. I'd also attach pictures of my very visible, lopsided and uneven nose."
"'Botched' is the only word that can be used to clarify what I am living with after facial surgery which is having to be repaired by a more skilled surgeon."
"Because of how it turned out, even 6 months later I'm embarrassed to go outside… It's making me extremely depressed and emotionally distressed."
Functional Outcome
Positive comment examples:
"I went for a functional rhinoplasty that I had put off for way too long. Not only are the results, I believe are truly life changing."
"I've never breathed or slept this well in my life. I had no idea how sleep deprived I actually was! Since the surgery my snoring is gone, I'm dreaming, thinking clearer, and lost weight."
Negative comment examples:
"Months later I'm still not able to breathe well. It's a scam."
"I ended up having nerve damage from the procedure."
"Although Dr. X did what he said he would, I believe he or his office should strongly warn future patients that there is a good chance they will lose all smell and taste for ever. I NEVER even thought this could happen… It is devastating."
Answered Questions
Positive comment examples:
"She very much cares about her patients as individuals, making herself very available for questions in the weeks/months after my procedure."
"He really listened to what I wanted to accomplish and answered all of my questions. Dr. X explained which type of surgery could help achieve my desired goals."
Negative comment examples:
"He did not explain my condition and he was annoyed when I asked questions to try to understand."
"The consultation was a joke. He pressured me several times to go on Instagram to talk live about my procedure. (I did not consent to this.) How could I ask/answer personal questions while 'live' on social media."
Follow-Up
Positive comment examples:
"They are very caring and thorough, and have given me excellent follow up care."
"I just came back from my 3-month follow-up appointment, and it was wonderful! Dr. X and his staff was so excited to see how while I'm doing. It feels like they are family and really care!"
Negative comment examples:
"I was not encouraged to return or follow up"
"He basically dropped me after the procedure- gave me no education about the condition, and I was not encouraged to return or ever follow up."
"… I cannot even get office to call me back… they collected money, and abandoned me."
Knowledge
Positive comment examples:
"She is very knowledgeable about current studies in the field and shared some of these with me"
"From day one her knowledge and thoroughness impressed me. She spends so much time reviewing what I would like to see as the end result and setting proper expectations. Sign of a smart doctor."
"She is an expert in the field, and knows the latest techniques. She was trained at a top hospital, so know you are getting the best care and the real deal."
Negative comment examples:
"He has no knowledge about the field in which he practices. It's scary."
"She did not spend the necessary time with me to evaluate and just looked at me from across her desk. She would not admit she was over her head and just recommended someone else."
"Do not be taken in by the smooth presentation or persona of this 'charming' man as an it is entirely unrelated to his level of expertise or surgical skills."
Finances
Positive comment examples:
"My surgery was worth every penny."
"I loved how they are transparent about cost up front and help you budget a head. This is so helpful so you know what to plan for and expect. Very few surgeons do this."
"When you are in her office she makes you feel like her top priority. She is never distracted. You are getting what you pay for."
Negative comment examples:
"This Dr. cares about one thing only – money. Patients and their wellbeing mean nothing. This businessman will misdiagnose you, provide false information and data, and then overbill you."
"Stay from this money hungry businessman."
"…we have gone through a fair amount of financial stress due to billing issues."
Personal Communication
Positive comment examples:
"He was right there after when I had an issue, twice over the weekend and very accessible after the fact in his office and on the phone or even an email"
"Dr. X gave her personal cell phone to call with any urgent matter even though she was on vacation!"
Negative comment examples:
"I called his office for help, just days after the surgery and I couldn't get anyone to answer. I ended up going to the hospital instead."
"When I got home on a Friday I realized we didn't even have a number to call if a problem came up (which it did)."