Article

Validating Health Insurance Coverage Survey Estimates: A Comparison of Self-Reported Coverage and Administrative Data Records

Public Opinion Quarterly (Impact Factor: 2.25). 08/2009; DOI: 10.1093/poq/nfn013

ABSTRACT We administered a health insurance coverage survey module to a sample of 4,575 adult Blue Cross and Blue Shield of Minnesota (BCBS) members to examine whether people who have health insurance coverage self-report that they are uninsured. We were also interested in whether respondents correctly classify themselves as having commercial, Medicare, MinnesotaCare, and/or Medicaid coverage (the four sample strata). The BCBS of Minnesota sample is drawn from both public and commercial health insurance coverage strata that are important to policy research involving survey data. Our findings support the validity of our health insurance module for correctly coding people who have health insurance as insured: only 0.4 percent of the BCBS members answered the survey as though they were uninsured. However, we find problems for researchers interested in using survey responses to measure specific types of public coverage. For example, only 21 percent of the self-reported Medicaid coverage came from known Medicaid enrollees, and only 67 percent of the self-reported MinnesotaCare count came from known MinnesotaCare enrollees. We conclude with a discussion of the study's implications for understanding the Medicaid “undercount” and the validity of self-reported health insurance coverage.
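To make the validation arithmetic concrete, the sketch below shows how the two kinds of statistics reported above can be computed when each sampled member carries both an administrative coverage type (the sampling stratum) and a survey response. This is a minimal illustration, not the study's code: the `records` list, its field names, and all values in it are invented.

```python
# Hypothetical validation records: each sampled member has an administrative
# coverage type (the sampling stratum) and the coverage type they reported on
# the survey. All values below are invented for illustration only.
records = [
    {"admin": "Medicaid",      "reported": "Medicaid"},
    {"admin": "Medicaid",      "reported": "commercial"},
    {"admin": "MinnesotaCare", "reported": "MinnesotaCare"},
    {"admin": "commercial",    "reported": "uninsured"},
    {"admin": "Medicare",      "reported": "Medicare"},
]

# False-uninsured rate: every sampled member is a known enrollee, so anyone
# who answers as uninsured is a misreport (0.4 percent in the study).
false_uninsured = sum(r["reported"] == "uninsured" for r in records)
print(f"false-uninsured rate: {false_uninsured / len(records):.1%}")

# Share of self-reports of a given coverage type that come from members
# administratively enrolled in that same type (the statistic reported for
# Medicaid and MinnesotaCare above).
def share_from_known_enrollees(coverage):
    reports = [r for r in records if r["reported"] == coverage]
    if not reports:
        return float("nan")
    matches = sum(r["admin"] == coverage for r in reports)
    return matches / len(reports)

print(f"Medicaid reports from known enrollees: {share_from_known_enrollees('Medicaid'):.0%}")
```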

  • ABSTRACT: Provisions within the Affordable Care Act, including the introduction of subsidized, Exchange-based coverage for lower-income Americans lacking access to employer coverage, are expected to greatly expand the size and importance of the individual market. Using multiple federal surveys and administrative data from the National Association of Insurance Commissioners, we generate national-, regional-, and state-level estimates of the individual market. In 2009, the number of nonelderly persons with individual coverage ranged from 9.55 million in the Medical Expenditure Panel Survey to 25.3 million in the American Community Survey. Notable differences also exist between survey estimates and National Association of Insurance Commissioners administrative counts, an outcome likely driven by variation in the type and measurement of individual coverage considered by surveys relative to administrative data. Future research evaluating the impact of the Affordable Care Act coverage provisions must be mindful of differences across surveys and administrative sources as they relate to the measurement of individual market coverage.
    Medical Care Research and Review, 02/2013 (Impact Factor: 2.57)
  • ABSTRACT: OBJECTIVE: To synthesize evidence on the accuracy of Medicaid reporting across state and federal surveys. DATA SOURCES: All available validation studies. STUDY DESIGN: Compare results from existing research to understand variation in reporting across surveys. DATA COLLECTION METHODS: Synthesize all available studies validating survey reports of Medicaid coverage. PRINCIPAL FINDINGS: Across all surveys, reporting some type of insurance coverage is better than reporting Medicaid specifically. Therefore, estimates of uninsurance are less biased than estimates of specific sources of coverage. The CPS stands out as particularly inaccurate. CONCLUSIONS: Measuring health insurance coverage is prone to some level of error, yet survey overstatements of uninsurance are modest in most surveys. Accounting for all forms of bias is complex. Researchers should consider adjusting estimates of Medicaid coverage and uninsurance in surveys prone to high levels of misreporting.
    Health Services Research, 07/2012 (Impact Factor: 2.49)
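The review abstract above recommends adjusting survey estimates when misreporting is known to be high. The sketch below shows one simple form such an adjustment could take; the counts, reporting rates, and the assumption that all survey Medicaid reports come from true enrollees are hypothetical illustrations, not estimates or methods from either paper.

```python
# Minimal sketch of a misreporting adjustment (all inputs are assumed values,
# not figures from the studies above).
survey_medicaid = 1_200_000    # survey-weighted Medicaid count (invented)
survey_uninsured = 450_000     # survey-weighted uninsured count (invented)

p_report_medicaid = 0.80       # assumed Pr(reports Medicaid | enrolled in Medicaid)
p_report_uninsured = 0.05      # assumed Pr(reports uninsured | enrolled in Medicaid)

# Scale the survey count up by the assumed reporting rate; this ignores
# false-positive Medicaid reports, which a fuller adjustment would model.
adjusted_medicaid = survey_medicaid / p_report_medicaid

# Move enrollees misreported as uninsured out of the uninsured count.
misreported_as_uninsured = adjusted_medicaid * p_report_uninsured
adjusted_uninsured = survey_uninsured - misreported_as_uninsured

print(f"adjusted Medicaid count:  {adjusted_medicaid:,.0f}")
print(f"adjusted uninsured count: {adjusted_uninsured:,.0f}")
```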
