Health Affairs

DataWatch

US Physician Practices Spend More Than $15.4 Billion Annually To Report Quality Measures

Affiliations
  1. Lawrence P. Casalino ( [email protected] ) is the Livingston Farrand Professor of Public Health and chief of the Division of Health Policy and Economics in the Department of Healthcare Policy and Research at Weill Cornell Medical College, in New York City.
  2. David Gans is a senior fellow at the Medical Group Management Association, in Denver, Colorado.
  3. Rachel Weber is a data analyst at the Medical Group Management Association.
  4. Meagan Cea is a research coordinator in the Department of Healthcare Policy and Research at Weill Cornell Medical College.
  5. Amber Tuchovsky is a data analyst at the Medical Group Management Association.
  6. Tara F. Bishop is an associate professor in the Department of Healthcare Policy and Research at Weill Cornell Medical College.
  7. Yesenia Miranda is a research coordinator in the Department of Healthcare Policy and Research at Weill Cornell Medical College.
  8. Brittany A. Frankel is a medical student at Weill Cornell Medical College.
  9. Kristina B. Ziehler is an assistant director at the Medical Group Management Association.
  10. Meghan M. Wong is an assistant director at the Medical Group Management Association.
  11. Todd B. Evenson is chief operating officer at the Medical Group Management Association.
PUBLISHED: Free Access. https://doi.org/10.1377/hlthaff.2015.1258

Abstract

Each year US physician practices in four common specialties spend, on average, 785 hours per physician and more than $15.4 billion dealing with the reporting of quality measures. While much is to be gained from quality measurement, the current system is unnecessarily costly, and greater effort is needed to standardize measures and make them easier to report.

The number of quality measures directed at US health care providers by external entities such as Medicare, Medicaid, and private health insurance plans has increased rapidly during the past decade. 1–3 These measures, such as rates of mammography screening for women or of testing for cholesterol or hemoglobin A1c levels for diabetes, are used to provide publicly reported information for patients and as a basis for financial “pay-for-performance” incentives to physicians. At least 159 measures of outpatient physician care are now publicly available. 1 The movement toward accountable care organizations, the federal Sustainable Growth Rate “fix” legislation, 4 and the private-sector Catalyst for Payment Reform coalition will further emphasize measurement of physician performance. 5

Anecdotally, dealing with these measures imposes a considerable burden on physician practices in terms of understanding the measures, providing performance data, and understanding performance reports from payers, 6 but the extent of that burden has not been quantified. 7 We present results from a national survey of practices in three common physician specialties (cardiology, orthopedics, and primary care) and in multispecialty practices.

Practices reported that their physicians and staff spent 15.1 hours per physician per week dealing with external quality measures, including tracking quality measure specifications; developing and implementing data collection processes; entering information into the medical record; and collecting and transmitting data ( Exhibit 1 ). This is equivalent to 785.2 staff and physician hours per physician per year. The average physician spent 2.6 hours per week (enough time to care for approximately nine additional patients) dealing with quality measures; staff other than physicians spent 12.5 hours per physician per week dealing with quality measures, with the largest proportion (6.6 hours) contributed by licensed practical nurses and medical assistants ( Exhibit 2 ).

Exhibit 1 Hours spent per physician per week dealing with external quality measures, 2014

SOURCE Authors’ analysis of responses to web-based survey of physician practices conducted for this research.

Exhibit 2 Mean hours spent on specific activities related to external quality measures per physician per week, 2014–15

| Group | Total effort | Entering information | Reviewing quality reports from external entities | Tracking quality measure specifications | Developing and implementing processes to collect data | Collecting and transmitting data to be used in quality measurement |
|---|---|---|---|---|---|---|
| Physicians plus staff | 15.1 | 12.5 | 0.5 | 0.7 | 0.8 | 0.7 |
| Physicians | 2.6 | 2.3 | 0.1 | 0.1 | 0.1 | 0.0 |
| Staff | 12.5 | 10.2 | 0.4 | 0.6 | 0.7 | 0.7 |
| Nurse practitioners and physician assistants | 0.9 | 0.8 | 0.0 | 0.0 | 0.0 | 0.0 |
| Registered nurses | 1.4 | 1.3 | 0.0 | 0.0 | 0.0 | 0.0 |
| Licensed practical nurses and medical assistants | 6.6 | 6.1 | 0.1 | 0.1 | 0.1 | 0.3 |
| Administrators | 0.9 | a | 0.2 | 0.3 | 0.3 | 0.2 |
| Information technology experts and electronic health record programmers | 0.3 | a | 0.0 | 0.1 | 0.1 | 0.0 |
| Billing/coding and medical records staff | 2.3 | 2.0 | 0.1 | 0.1 | 0.0 | 0.1 |

SOURCE Authors’ analysis of responses to web-based survey of physician practices conducted for this research. NOTE Statistical significance testing was not performed on the values in this exhibit.

a Not applicable.

The per physician time spent by physicians and staff translates to an average cost of $40,069 per physician per year ( Exhibit 3 ), or a combined total of $15.4 billion annually for general internists, family physicians, cardiologists, and orthopedists in the United States. (See online Appendix A1 for the methods used to calculate costs.) 8 Eighty-one percent of practices reported that they spent more or much more effort dealing with external quality measures than three years ago ( Exhibit 4 ). However, only 27 percent believed that current measures were moderately or very representative of the quality of care.

Exhibit 3 Average amount spent per physician per year dealing with external quality measures, 2014–15

| Specialty | Physicians | Nurse practitioners and physician assistants | Registered nurses | Licensed practical nurses and medical assistants | Administrators | IT experts and EHR programmers | Billing/coding and medical records staff | Total |
|---|---|---|---|---|---|---|---|---|
| All specialties | $19,494 | $2,840 | $1,966 | $7,288 | $5,262 | $630 | $2,588 | $40,069 |
| Primary care | $22,049 | $4,208 | $2,702 | $9,119 | $8,872 | $785 | $2,733 | $50,468 |
| Cardiology | $20,826 | $1,792 | $1,656 | $5,019 | $2,896 | $394 | $2,342 | $34,924 |
| Orthopedics | $15,585 | $1,963 | $1,320 | $6,713 | $2,690 | $612 | $2,589 | $31,471 |

SOURCE Authors’ analysis of responses to web-based survey of physician practices conducted for this research. NOTES National cost estimates do not include multispecialty practices because of the difficulty of estimating costs for these practices. Appendix A1 provides details on the conversion of hours to dollars per year (see Note  8 in text). p=0.13 comparing cardiology to primary care; p=0.07 comparing orthopedics to primary care. IT is information technology. EHR is electronic health record.

Exhibit 4 Physician practices’ perceptions of external quality measures, 2014

SOURCE Authors’ analysis of responses to web-based survey of physician practices conducted for this research. NOTES Responses were on a 5-point Likert scale. For example, for the first item, respondents chose between responses ranging from “not at all representative of the quality of care” to “very representative of the quality of care.” QI is quality improvement.

External entities measure practices’ performance using both claims data and data that practices directly provide, such as patients’ blood pressure levels. 9 These entities often specify measures for the same area of performance slightly differently from one another. For example, for diabetes care, the Medicare Shared Savings Program defines diabetes control as hemoglobin A1c at or below 8 percent, whereas most health plans use the Healthcare Effectiveness Data and Information Set (HEDIS) standard of at or below 9 percent. 10 (p. 1458) This complicates practices’ data collection, reporting, and review processes. 1,10 State and regional agencies currently use 1,367 measures of provider quality, of which only 20 percent are used by more than one state or regional program. 11 A study of twenty-three health insurers found that 546 provider quality measures were used, few of which matched across insurers 10 or with the 1,700 measures used by federal agencies. 1

Study Data And Methods

Data Source

In November 2014 we used the Medical Group Management Association (MGMA) database to invite 1,000 randomly selected practices to respond to a confidential web-based survey, including 250 practices from each of four specialty types: cardiology, orthopedics, primary care (family medicine and general internal medicine), and multispecialty practices that included primary care.

We developed the survey based on our review of the literature; on a survey previously used to estimate the cost to practices of interacting with health insurers; 12 and on interviews with ten leaders of medical groups, medical societies, health plans, evaluators of quality measures, and relevant federal agencies. The survey (Appendix A2) 8 was designed to be completed online by a leader in each practice and focused on time spent by physicians and other staff on specific activities related to reporting and inspecting quality data; questions also addressed practice leaders’ perceptions of the utility of the measures. A total of 394 practices responded (raw response rate, 39.4 percent); after adjustment for practices that were ineligible because they were not the correct specialty type or were not contactable by phone or e-mail, the response rate was 54.3 percent. 13
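The adjusted response rate follows the standard convention of removing ineligible practices from the denominator. A minimal sketch of that arithmetic, noting that the article reports only the resulting rates: the count of ineligible invitees below (roughly 275) is an illustrative assumption, not a figure from the study.

```python
# Eligibility-adjusted response rate, sketched from the reported rates.
# Only the 39.4% raw and 54.3% adjusted rates appear in the article;
# the ineligible count is an ASSUMED illustrative value.

invited = 1000     # practices invited to the survey
responded = 394    # completed responses
ineligible = 275   # ASSUMED: wrong specialty type or not contactable

raw_rate = responded / invited
adjusted_rate = responded / (invited - ineligible)

print(f"raw: {raw_rate:.1%}, adjusted: {adjusted_rate:.1%}")
# prints: raw: 39.4%, adjusted: 54.3%
```

With roughly a quarter of invitees ineligible, the denominator shrinks enough to lift the raw 39.4 percent to the reported 54.3 percent.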

Methods

Appendix A1 presents details of our analytic methods. 8 Briefly, we developed per physician per week estimates of the time spent by physicians and various types of staff on six categories of activity related to external quality measures. We then converted these time estimates into estimates of the cost to practices of dealing with external quality measures. When making comparisons, we used t-tests to compare means.
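The hours-to-dollars conversion detailed in Appendix A1 amounts to multiplying each staff category's weekly hours (Exhibit 2) by 52 weeks and an hourly compensation rate, then summing. A back-of-envelope sketch follows; the hourly rates are illustrative assumptions, not the MGMA compensation figures the study actually used, but they land near the reported $40,069 per physician per year.

```python
# Sketch of converting per-physician weekly hours (Exhibit 2) into an
# annual dollar cost. Hourly compensation rates are ASSUMED illustrative
# values, not the MGMA figures used in the article's Appendix A1.

WEEKS_PER_YEAR = 52

# role: (hours per physician per week, assumed hourly compensation in $)
effort = {
    "physicians":              (2.6, 144.0),
    "NP/PA":                   (0.9, 61.0),
    "registered nurses":       (1.4, 27.0),
    "LPN/medical assistants":  (6.6, 21.2),
    "administrators":          (0.9, 112.0),
    "IT/EHR staff":            (0.3, 40.0),
    "billing/medical records": (2.3, 21.6),
}

annual_cost = sum(h * rate * WEEKS_PER_YEAR for h, rate in effort.values())
print(f"~${annual_cost:,.0f} per physician per year")  # close to the reported $40,069
```

The category hours sum to 15.0 rather than the exhibit's 15.1 because of rounding in the published table.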

Limitations

This study had multiple limitations. First, the sample was limited to MGMA members. However, the MGMA membership list is extensive, including approximately 33,000 medical practice leaders. 14 MGMA data have been used and cited as being reasonably nationally representative by such authoritative organizations as the Medicare Payment Advisory Commission. 15 Second, we included only four specialty practice types—but these are common. Third, our response rate was relatively low, although this in itself does not necessarily lead to bias. 16 Fourth, practices having stronger negative feelings about quality measures may have been more likely to respond to the survey, which would likely bias upward our estimates of the time spent on dealing with measures.

Fifth, all estimates came from a single individual in each practice who had the challenging task of estimating the time spent by different categories of practice staff on various tasks. We made our estimates more conservative by trimming outlier values (see Appendix A1 for details). 8 Direct observation would have been more precise but extremely time consuming and expensive even if carried out only in a small number of practices. Sixth, our cost estimates per physician did not include costs to practices of information technology or office space devoted to dealing with quality measures. Finally, our national cost estimates did not include multispecialty practices because of the difficulty of estimating costs for those practices.

Study Results

At least 90 percent of practices in each specialty received data on quality from external entities; between 70.9 percent (orthopedics) and 91.7 percent (primary care) of practices expended effort dealing with external quality measures ( Exhibit 5 ). On average, physicians and staff spent a total of 15.1 hours per physician per week dealing with quality measures, with the average physician spending 2.6 hours per week and other staff spending 12.5 hours ( Exhibits 1 and 2 ).

Exhibit 5 Characteristics of responding physician practices surveyed about external quality measures, 2014–15

| Specialty | 1–9 physicians | 10–19 physicians | 20 or more physicians | Physician-owned practice | Use an EHR system | Receive data on quality from external entities | Expend effort to provide data for quality measurement to external entities and/or review reports from external entities |
|---|---|---|---|---|---|---|---|
| Primary care | 81.8% | 11.6% | 6.6% | 94.2% | 91.7% | 97.5% | 91.7% |
| Cardiology | 50.0% | 29.0% | 21.1% | 82.9% | 90.7% | 96.1% | 84.2% |
| Orthopedics | 48.5% | 33.0% | 18.5% | 98.1% | 84.3% | 90.3% | 70.9% |
| Multispecialty | 21.3% | 19.2% | 59.6% | 87.2% | 96.8% | 98.9% | 90.4% |
| All practices | 52.5% | 22.3% | 25.1% | 91.4% | 90.8% | 96.1% | 84.5% |

SOURCE Authors’ analysis of responses to web-based survey of physician practices conducted for this research. NOTES N=394. For primary care, n=121. For cardiology, n=76. For orthopedics, n=103. For multispecialty, n=94. EHR is electronic health record.

By far the most time—12.5 hours of physician and staff time per physician per week—was spent on “entering information into the medical record ONLY for the purpose of reporting for quality measures from external entities” ( Exhibit 2 ). The average physician spent 2.3 hours per week entering this information. Licensed practical nurses and medical assistants spent the largest amounts of time—6.1 hours per physician per week—entering information.

Primary care physicians spent 3.9 hours per week dealing with quality measures, compared to 1.7, 1.1, and 3.0 hours for cardiologists, orthopedists, and physicians in multispecialty groups, respectively ( Exhibit 6 ). Primary care practices spent 19.1 hours of physician and staff time per physician per week dealing with quality requirements of external entities; cardiology, orthopedic, and multispecialty practices spent 10.4, 11.3, and 17.6 hours per physician per week, respectively. Time spent varied little by practice size (Appendix A3). 8

Exhibit 6 Mean hours spent per physician per week in dealing with external quality measures, 2014–15

| Group | Total effort | Entering information | Reviewing quality reports from external entities | Tracking quality measure specifications | Developing and implementing processes to collect data | Collecting and transmitting data to be used in quality measurement |
|---|---|---|---|---|---|---|
| Physicians and staff | | | | | | |
| Primary care | 19.1 | 15.3 | 0.8 | 1.1 | 1.1 | 1.0 |
| Cardiology | 10.4 a | 8.4 | 0.4 | 0.4 | 0.5 | 0.7 |
| Orthopedics | 11.3 b | 10.0 | 0.3 | 0.3 | 0.5 | 0.5 |
| Multispecialty | 17.6 | 14.7 | 0.7 | 0.8 | 0.8 | 0.7 |
| Physicians only | | | | | | |
| Primary care | 3.9 | 3.4 | 0.2 | 0.1 | 0.2 | 0.1 |
| Cardiology | 1.7 | 1.6 | 0.1 | 0.1 | 0.0 | 0.0 |
| Orthopedics | 1.1 | 1.1 | 0.0 | 0.0 | 0.0 | 0.0 |
| Multispecialty | 3.0 | 2.6 | 0.2 | 0.1 | 0.1 | 0.0 |

SOURCE Authors’ analysis of responses to web-based survey of physician practices conducted for this research. NOTES The full table, including all types of staff, can be found in Appendix A3 (see Note  8 in text). p values were calculated only for differences between primary care and other specialties for the total effort figures.

a p=0.028 for difference from primary care.

b p=0.05 for difference from primary care.

The time spent by physicians and staff translates to an average cost to a practice of $40,069 per physician per year ( Exhibit 3 ). Primary care practices spent $50,468, compared to $34,924 for cardiology practices and $31,471 for orthopedics practices. If the dollar amounts per physician per year are multiplied by the number of general internists, family physicians, cardiologists, and orthopedists in the United States, the total amount spent annually by physician practices in these specialties dealing with external quality measures is $15.4 billion (Appendix A1). 8 The total amount spent by physicians in all specialties would be higher.

Eighty-one percent of practices reported that the effort they spent on quality measures was increasing compared to three years ago ( Exhibit 4 ). Forty-six percent reported that it was a significant burden to deal with measures that were similar but not identical to each other. Only 27 percent believed that current measures were moderately or very representative of the quality of care. Just 28 percent used their quality scores to focus their quality improvement activities. Specialty practices—especially orthopedic practices—were much less likely than primary care or multispecialty practices to report that measures were representative of quality or to use them to focus their attempts to improve quality. Comments from specialist respondents, especially orthopedic practices, argued that most quality measures were relevant for primary care but not for their specialty (Appendix A4). 8

In the free-text section of the survey, 228 practices (58 percent) provided 308 comments. Five major themes recurred: the burden of current measurement requirements on small practices, recommendations to have measures that are uniform across entities, the need for specialty-specific measures, the need for measures that better represent quality, and the need to easily and accurately extract data from electronic health records (EHRs) (see Appendix A4). 8

Discussion

The cost to physician practices of dealing with quality measures is high and rising. Our time and cost estimates of 15.1 hours per physician per week and $15.4 billion per year for the specialties included are much higher than those from a 2006 survey that included a single question about quality measures and from two early studies of small numbers of practices. 12,17,18 The methods used across the studies varied; in addition, the burden of dealing with quality measures has almost certainly increased since they were conducted.

There is much to gain from quality measurement, but the current system is far from being efficient and contributes to negative physician attitudes toward quality measures. 19 Improving the system rapidly will be difficult. Obstacles include the fragmented US health care system, lack of interoperability across EHRs, lack of EHR functionalities to facilitate retrieval of data for quality measures, the cost of change to external entities and to providers, and opposition from vested interests. 5 Increasing efforts to reduce the number of measures and to standardize their use across external entities are being made by the National Quality Forum, the Institute of Medicine, and America’s Health Insurance Plans, as well as by federal agencies such as the Centers for Medicare and Medicaid Services and the Agency for Healthcare Research and Quality. 5,20–23 Our data suggest that US health care leaders should make these efforts a priority.

ACKNOWLEDGMENTS

The Physicians Foundation funded this research. The authors thank Rachel Jawahar of Weill Cornell Medical College for her assistance with data analysis for this project. Lawrence Casalino serves on the boards of directors of the American Medical Group Foundation and of the Healthcare Research and Education Trust, on the American Hospital Association Committee on Research, and on the Advisory Committee for the American Medical Association’s physician professional satisfaction and practice sustainability project.

NOTES

  • 1 Blumenthal D, Malphrus E, McGinnis JM, editors. Vital signs: core metrics for health and health care progress. Washington (DC): National Academies Press; 2015. p. B-9.
  • 2 Hackbarth GM. Comment on the Centers for Medicare and Medicaid Services list of measures under consideration for December 1, 2014 [Internet]. Washington (DC): Medicare Payment Advisory Commission; 2015 Jan 5 [cited 2016 Jan 19]. Available from: http://www.medpac.gov/documents/comment-letters/medpac-comment-on-the-cms-list-of-measures-under-consideration-for-december-1-2014.pdf
  • 3 Panzer RJ, Gitomer RS, Greene WH, Webster PR, Landry KR, Riccobono CA. Increasing demands for quality measurement. JAMA. 2013;310(18):1971–80.
  • 4 Ryan AM, Press MJ. Value-based payment for physicians in Medicare: small step or giant leap? Ann Intern Med. 2014;160(8):565–6.
  • 5 Blumenthal D, McGinnis JM. Measuring vital signs: an IOM report on core metrics for health and health care progress. JAMA. 2015;313(19):1901–2.
  • 6 Kansagara D, Tuepker A, Joos S, Nicolaidis C, Skaperdas E, Hickam D. Getting performance metrics right: a qualitative study of staff experiences implementing and measuring practice transformation. J Gen Intern Med. 2011;29(Suppl 2):S607–13.
  • 7 Meyer GS, Nelson EC, Pryor DB, James B, Swensen SJ, Kaplan GS, et al. More quality measures versus measuring what matters: a call for balance and parsimony. BMJ Qual Saf. 2012;21(11):964–8.
  • 8 To access the Appendix, click on the Appendix link in the box to the right of the article online.
  • 9 Damberg CL, Sorbero ME, Lovejoy SL, Lauderdale K, Wertheimer S, Smith A, et al. An evaluation of the use of performance measures in health care. Santa Monica (CA): RAND Corporation; 2011.
  • 10 Higgins A, Veselovskiy G, McKown L. Provider performance measures in private and public programs: achieving meaningful alignment with flexibility to innovate. Health Aff (Millwood). 2013;32(8):1453–61.
  • 11 Bazinsky K, Bailit M. The significant lack of alignment across state and regional health measure sets. Needham (MA): Bailit Health Purchasing LLC; 2013 Sep 10.
  • 12 Casalino LP, Nicholson S, Gans DN, Hammons T, Morra D, Karrison T, et al. What does it cost physician practices to interact with health insurance plans? Health Aff (Millwood). 2009;28(4):w533–43. DOI: 10.1377/hlthaff.28.4.w533
  • 13 American Association for Public Opinion Research. Standard definitions: final dispositions of case codes and outcome rates for surveys, seventh edition. Deerfield (IL): AAPOR; 2011.
  • 14 Medical Group Management Association. About MGMA [home page on the Internet]. Englewood (CO): MGMA; 2015 [cited 2016 Jan 14]. Available from: http://www.mgma.com/about/overview
  • 15 Berenson RA, Zuckerman S, Stockley K. What if all physician services were paid under the Medicare fee schedule? An analysis using Medical Group Management Association data. Washington (DC): Medicare Payment Advisory Commission; 2010 Mar 19. (Report No. 10-1).
  • 16 Davern M. Nonresponse rates are a problematic indicator of nonresponse bias in survey research. Health Serv Res. 2013;48(3):905–12.
  • 17 Halladay JR, Stearns SC, Wroth T, Spragens L, Hofstetter S, Zimmerman S, et al. Cost to primary care practices of responding to payer requests for quality and performance data. Ann Fam Med. 2009;7(6):495–503.
  • 18 West DR, Radcliff TA, Brown T, Cote MJ, Smith PC, Dickinson WP. Costs associated with data collection and reporting for diabetes quality improvement in primary care practices: a report from SNOCAP-USA. J Am Board Fam Med. 2012;25(3):275–82.
  • 19 Meltzer DO, Chung JW. The population value of quality indicator reporting: a framework for prioritizing health care performance measures. Health Aff (Millwood). 2014;33(1):132–9.
  • 20 Bipartisan Policy Center. Transitioning from volume to value: consolidation and alignment of quality measures. Washington (DC): BPC; 2015 Apr 27.
  • 21 Cassel CK. Making measurement meaningful. Am J Manag Care. 2015;21(5):332b–c.
  • 22 Conway PH. The Core Quality Measures Collaborative: a rationale and framework for public-private quality measure alignment. Health Affairs Blog [blog on the Internet]. 2015 Jun 23 [cited 2016 Jan 14]. Available from: http://healthaffairs.org/blog/2015/06/23/the-core-quality-measures-collaborative-a-rationale-and-framework-for-public-private-quality-measure-alignment/
  • 23 Berenson RA, Pronovost PJ, Krumholz HM. Achieving the potential of health care performance measures. Princeton (NJ): Robert Wood Johnson Foundation; 2013 May.