Surgical Precision in Clinical Documentation Connects Patient Safety, Quality of Care, and Reimbursement

by Benjamin J. Kittinger, MD; Anthony Matejicka II, DO; and Raman C. Mahabir, MD


Emphasis on quality of care has become a major focus for healthcare providers and institutions. The Centers for Medicare and Medicaid Services has multiple quality-of-care performance programs and initiatives aimed at providing transparency to the public, which allow direct comparison of the services provided by hospitals and individual physicians. These quality-of-care programs highlight the transition to pay for performance, rewarding physicians and hospitals for high quality of care. To improve the use of pay for performance and analyze quality-of-care outcome measures, the Division of Plastic Surgery at Scott & White Memorial Hospital participated in an inpatient clinical documentation accuracy project (CDAP). Performance and improvement on metrics such as case mix index, severity of illness, risk of mortality, and geometric mean length of stay were assessed after implementation. After implementation of the CDAP, the Division of Plastic Surgery showed increases in case mix index, calculated severity of illness, and calculated risk of mortality and a decrease in length of stay. For academic plastic surgeons, quality of care demands precise documentation of each patient. The CDAP provides one avenue to hone clinical documentation and performance on quality measures.

Keywords: case mix index; coding; comorbid conditions; diagnosis-related group; documentation


Achieving financial solvency in the current economic, political, and legal climate has become progressively more difficult for physicians and hospitals in the United States. In 2011, healthcare costs increased to nearly 18 percent of the gross domestic product (GDP), continuing the trend of outpacing GDP growth, while economic forecasts project that healthcare costs will increase to 20 percent of the GDP in the next 10 years.1 Concurrently, the Medicare Sustainable Growth Rate formula has continued to decrease rates of reimbursement for outpatient care. Many state legislatures have set limits on malpractice awards, which has alleviated some financial and legal stress for physicians and hospitals, but in other states, malpractice insurance premiums remain a strain on many physicians’ budgets.2 The numerous unknowns of medical practice and the constant adaptation to new regulatory measures have resulted in a challenging environment for many healthcare providers. The Centers for Medicare and Medicaid Services (CMS) continually modifies its numerous quality measure reporting programs in an effort to meet the demand for high-quality healthcare while decreasing costs. To successfully implement quality measure reporting, CMS has used several approaches, including requiring participation, incentivizing providers with nominal increases in reimbursement, and reducing reimbursement for nonparticipating providers. CMS is also decreasing baseline reimbursement in an effort to fund quality measure reporting programs.3

To provide transparency with regard to performance on select quality measures, data from the Division of Plastic Surgery at Scott & White Memorial Hospital were analyzed before and after the implementation of a top-down, multidisciplinary quality-of-care improvement project. The main focus of the quality improvement project involved the integration and utilization of an inpatient clinical documentation accuracy project (CDAP). Metrics used to assess improvement in clinical documentation performance included baseline and quarterly reviews of case mix index (CMI), severity of illness (SOI), risk of mortality (ROM), and geometric mean length of stay (GMLOS). We hypothesized that more accurate clinical documentation would lead to an increase in CMI.


Quality-of-Care Programs and Initiatives in the Current Healthcare System

The future of inpatient healthcare relies on an efficient, low-cost model of delivery that ensures safety, quality of care, and the use of evidence-based best practices. The model used in most CMS quality improvement programs begins with an initial period of data collection, in which quality-of-care measures are identified and assessed. New measures are added, and existing measures may be adjusted or removed annually. This process allows CMS to establish evolving standards for quality of care. More importantly, providers and institutions are compared on the basis of their performance or improvement on each quality measure, and these data are publicly reported. For CMS, the ultimate result is to stratify reimbursement for top, middle, and low-level performers on the basis of quality measures. This process has resulted in quality initiatives and regulations that have a growing influence on reimbursement for providers and hospitals. The Physician Quality Reporting System (PQRS), the Hospital Inpatient Quality Reporting System, and the Value-Based Purchasing program are examples of the evolution toward pay for performance.

PQRS is an outpatient CMS quality-of-care program that incentivizes providers to participate in quality measure reporting. Incentives take the form of small percentage increases in professional fees disbursed for reporting. These incentives have declined from 2 percent to 0.5 percent over the past several years, with providers receiving a 0.5 percent incentive for quality measure reporting in 2013; as of 2015, providers who do not participate instead receive a deduction in reimbursement. Measures include a wide range of data, such as emergency treatment of acute myocardial infarction with aspirin and management of chronic diseases such as diabetes mellitus.4 CMS has used these data to analyze performance on quality measures and establish levels of performance and improvement, with the ultimate goal of linking those levels directly to reimbursement. Providers who participate in quality reporting with PQRS will eventually be stratified according to their performance and improvement. In the outpatient setting, this process may lead providers to focus their practices on compliant patients with excellent chronic disease management in order to achieve higher reimbursement.

The Hospital Inpatient Quality Reporting System collects data on quality measures from an inpatient perspective. This quality-of-care program, similar to PQRS, currently penalizes hospitals for not reporting data. To fund this quality-of-care program, all hospitals that receive Medicare Part A reimbursement have 1 percent of their baseline reimbursement withheld initially. This 1 percent of reimbursement is ultimately redistributed on the basis of each participating institution’s performance and improvement on quality measures.5 Hospitals that are among the top performers are rewarded for their high performance. The quality-measure data gathered from this reporting program allow CMS to continuously analyze and stratify reimbursement to individual hospitals on the basis of their improvement and performance on quality measures. In addition, these data are publicly available on the Internet, which allows for direct comparison of providers and hospitals.6, 7

In the new culture of pay for performance, plastic surgeons who interact with CMS will need to rely on evidence-based best practices and meticulous documentation, coupled with a knowledge of and strict adherence to quality measures. Without precise clinical documentation, even the best medical care will lead to poor performance on quality measures. Unfortunately, this combination of excellent surgical care and poor documentation most likely represents the current standard in many institutions. As the healthcare system in the United States progresses toward pay for performance, surgeons should make a concerted effort to educate themselves on documentation opportunities. The public availability of these data on the Physician Compare website will undoubtedly lead both patients and hospitals to stratify providers by quality measures and outcomes.8, 9 Recognizing the implications of pay for performance, our institution has developed numerous best-practice teams and a CDAP team. Optimizing clinical documentation ensures that accurate fees are billed and potentially received from CMS, while maximizing performance and improvement on quality measures.

Billing and Reimbursement

Although some hospitals rely on private insurers, CMS represents a major payer for most academic institutions. Importantly, most private insurance companies’ reimbursement and coverage of services are based on the CMS guidelines. CMS uses the Inpatient Prospective Payment System to reimburse hospitals for care.10 For billing purposes, inpatients are grouped into diagnosis-related groups (DRGs) by their primary diagnosis, procedures, secondary diagnoses, and several other factors. Different DRG systems are used depending on the hospital, but the system most commonly used to classify DRGs is the Medicare Severity DRG (MS-DRG) system. The baseline numerical value of a DRG is evaluated and revised annually by CMS. To calculate a DRG payment for reimbursement, the DRG baseline numerical value is multiplied by the base payment rate. Additional factors that are used to modify the DRG include the wage index, a cost-of-living adjustment, and whether the care was provided at a teaching hospital (indirect medical expenses), had excessively high cost (outlier cases), or was provided at a hospital that treats many low-income patients (e.g., disproportionate share hospitals).
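The payment arithmetic described above can be laid out as a rough sketch. All weights, rates, and adjustment values below are illustrative placeholders rather than actual CMS figures, and the real Inpatient Prospective Payment System formulas carry many more adjustments.

```python
# Simplified sketch of a DRG payment under the Inpatient Prospective
# Payment System. All numbers are illustrative, not actual CMS values.

def drg_payment(relative_weight: float,
                base_rate: float,
                wage_index: float = 1.0,
                teaching_adj: float = 0.0,
                dsh_adj: float = 0.0) -> float:
    """Estimate a DRG payment: the DRG's relative weight times the base
    payment rate, modified by the wage index and any teaching (indirect
    medical education) or disproportionate-share (DSH) adjustments."""
    adjusted_base = base_rate * wage_index
    payment = relative_weight * adjusted_base
    # Teaching and DSH adjustments are modeled as simple percentage
    # add-ons here; the actual CMS formulas are considerably more involved.
    return payment * (1 + teaching_adj + dsh_adj)

# A hypothetical DRG with relative weight 1.5 at a teaching hospital:
print(round(drg_payment(1.5, 6000.0, wage_index=1.02, teaching_adj=0.05), 2))
# 9639.0
```

The key point the sketch captures is that the same DRG can yield substantially different payments at different hospitals once the wage index and institutional adjustments are applied.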

Reimbursement may be substantially different between two hospitals depending on the above factors. In addition, the MS-DRG system is weighted to account for differences in resource utilization between patients who undergo similar procedures or have the same principal diagnosis. For example, in the MS-DRG system, a baseline payment is based on the primary diagnosis or treatment rendered. If a patient has a comorbid condition (CC), the DRG payment is weighted to account for greater resource utilization. The DRG payment can be weighted even more heavily if a patient has a major comorbid condition (MCC). CMS maintains an extensive list of CCs and MCCs. In short, the MS-DRG system is a three-tier system of billing and reimbursement based on each patient’s principal diagnosis and CCs or MCCs.

A newer, advanced classification system known as the All Patient Refined DRG (APR-DRG) system allows a hospital to bill CMS by accounting for more patient factors than the simple three-tiered MS-DRG system does. These factors include the patient’s ROM and SOI. Using a numerical scale from 1 to 4, these two indices attempt to quantify the risk of death (i.e., ROM) and the extent of organ system derangement (i.e., SOI). CCs and MCCs are not part of the APR-DRG system but directly influence the ROM and SOI indices. Using the DRG and these two indices allows hospitals to bill in closer proportion to the care provided, on the basis of patient resource utilization. Since October 2012, the APR-DRG system has been used for all Medicaid billing in the state in which this study was conducted.

Applying These Systems in Medical Practice

Consider two patients who undergo free flap reconstruction. In a nonweighted system, both patients are coded into the same DRG on the basis of the Current Procedural Terminology code for the procedure. The result is identical reimbursement for two patients who may have vastly different hospital resource utilization because of their CCs. Examples are provided in Table 1. SOI, ROM, and GMLOS are three calculated metrics derived from the APR-DRG system. SOI is an attempt to classify organ system derangement or physiologic derangement on a scale from 1 (low) to 4 (high). ROM is an attempt to classify risk of death on a scale from 1 (low risk) to 4 (high risk). GMLOS is the reported mean length of hospitalization for all patients with a particular DRG. When the institution’s CMI is more accurately captured, the expected GMLOS increases even though the observed length of stay does not, which reflects positively in quality and outcome measures. Two examples follow.


Correct CC/MCC: In case 1 (see Table 1), a patient has a panniculectomy (removal of excess skin of the abdominal wall causing impairment of normal activities of daily living). If during the patient’s hospital course no documented CCs, MCCs, or complications occur, the length of stay is 1.8 days and the reimbursement is $5,827. If that same patient is treated for either a urinary tract infection or pneumonia during the admission, the length of stay and reimbursement increase to 3.6 days or 7.3 days and $9,802 or $16,527, respectively. If those events occur but are not accurately documented, the hospital sees two ill effects. First, the patient’s observed length of stay is longer than expected, which is equivalent to a poor outcome. Second, the reimbursement is at the lower rate despite the increased use of resources for the complications.
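The trade-off in case 1 can be tallied directly from the Table 1 figures. The sketch below uses the lengths of stay and reimbursements reported above; the scenario labels are ours.

```python
# Reimbursement and expected length of stay for case 1 (panniculectomy),
# using the figures reported in Table 1.
scenarios = {
    "no CC/MCC":                  {"gmlos_days": 1.8, "reimbursement": 5827},
    "UTI documented (CC)":        {"gmlos_days": 3.6, "reimbursement": 9802},
    "pneumonia documented (MCC)": {"gmlos_days": 7.3, "reimbursement": 16527},
}

# If a complication occurs but is never documented, the hospital is paid
# at the base tier while the patient's actual stay runs long:
base = scenarios["no CC/MCC"]
mcc = scenarios["pneumonia documented (MCC)"]
lost_revenue = mcc["reimbursement"] - base["reimbursement"]
print(f"Undocumented pneumonia: paid ${base['reimbursement']:,}, "
      f"forgoing ${lost_revenue:,}, while expected LOS stays at "
      f"{base['gmlos_days']} days instead of {mcc['gmlos_days']}.")
```

Both penalties compound: the observed stay exceeds the expected 1.8 days (registering as a poor outcome), and $10,700 of resource-appropriate reimbursement is forgone.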

Correct DRG: Cases 2 and 3 (see Table 1) give examples of the differences when using the correct DRG. The first line of each case represents the initially reported DRG and corresponding information. The second line of each case represents the corrected DRG and information. By accurately coding the DRG, the hospital would see substantial changes in weight assigned under the weighted system, SOI, ROM, GMLOS, and revenue.


The Division of Plastic Surgery had been identified within our institution as having an opportunity for improvement in documentation. After institutional review board approval, the division engaged in a top-down educational effort aimed specifically at improving the institutional culture related to clinical documentation. Clinical providers at all levels of training, including senior staff and resident physicians, were educated on DRGs and documentation. Preprinted forms were added to every patient’s chart to facilitate capturing CCs and events of the hospitalization. These forms were reviewed daily and were also used as part of the discharge summary. The goal of the project was to assess performance and improvement on clinical documentation using metrics including CMI, SOI, ROM, and length of stay. Clinical documentation experts used a standard query methodology. Metrics were assessed before the clinical documentation improvement education and were reassessed quarterly thereafter; the CDAP was fully implemented on January 1, 2012. The data for all primary patients cared for in the Division of Plastic Surgery during 2012 (the study and follow-up period) were included. All consultation patients were excluded because services billed to CMS are billed under the relevant attending physician on the primary team. The included metrics were analyzed over the study period.

Clinical Documentation Accuracy Team

The CDAP team consisted of residents, staff surgeons, and clinical coders. The main goal was to review the accuracy of inpatient clinical documentation through direct interactions between providers and coders. The process began with mandatory, departmentwide teaching sessions on the importance of inpatient coding and documentation. A one-year review was performed to identify the 30 most common CCs and MCCs encountered on the plastic surgery inpatient census. These 30 items were then listed on the standard progress note. On a daily basis, residents were responsible for completing the progress note, and staff surgeons were responsible for reviewing the note for accuracy. Coders also reviewed the notes and notified the surgeons of any potential discrepancies or omitted diagnoses. Surgeons would then determine the appropriateness of the clarifications and make the final determination to incorporate or decline the changes. Surgeons’ compliance with this initiative was recorded and reported to the chief of the section. Initial compliance was low (less than 50 percent) and the number of coder-physician interactions (number of times the coders notified the surgeons of missed or inaccurately documented CCs and MCCs) was high. To encourage improvement, individual provider data were shared with the entire group at the monthly meeting. Compliance improved to more than 95 percent, and the number of coder–physician interactions decreased. Surgeons became much more involved in the inpatient coding process, as evidenced by both the improved compliance and the “esprit de corps” on rounds, during which coding became an integral part of the routine. This interaction between clinicians and healthcare documentation specialists allowed physicians to ensure that they were capturing all relevant diagnoses for treatments that were provided during inpatient hospitalizations.


The Division of Plastic Surgery has an inpatient unit in a 700-bed, multispecialty, level three facility. Data on 558 inpatient admissions were gathered during the study period. CMI, SOI, ROM, and length of stay were measured at baseline and after implementation of the CDAP. CMI (see Figure 1), SOI (see Figure 2), and ROM (see Figure 2) gradually increased over the course of one fiscal year. The vertical line in each figure represents the time at which the CDAP was fully implemented. The initial decrease in CMI in quarter 2 is seasonal and seen throughout the institution; therefore, the key comparison is between quarter 4 in 2011 and quarter 4 in 2012. Reporting individual increases in CCs and MCCs and changes in DRGs would be overly granular for this report. The increase in the calculated indices (CMI, SOI, and ROM) reflects the aggregate increase in the captured CCs and MCCs and corrected DRGs by the concerted effort of the CDAP. Quality of care is reflected in a comparison of the observed incidents versus the expected incidents. The observed incidents cannot change. The expected incidents are calculated on the basis of CMI, SOI, and ROM. If those measures increase, the expected number of incidents also increases, which changes the ratio of observed versus expected incidents to a more favorable quality outcome. The GMLOS is a calculated index based on the above measures. As those measures increase, an increase in GMLOS is expected. The decrease in actual length of stay reflects other measures being implemented at this same time. Enhanced recovery protocols were also introduced, which resulted in decreased lengths of stay, on average (see Figure 3). This situation created a divergence between the GMLOS calculated by CMS and the actual length of stay.
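The observed-versus-expected logic can be sketched numerically. The incident counts below are hypothetical, and real risk-adjustment models derive the expected count from CMI, SOI, and ROM in far more elaborate, proprietary ways; the sketch only shows why a higher expected count improves the ratio when the observed count is fixed.

```python
# Sketch of the observed-versus-expected (O/E) quality ratio.
# The expected counts are stand-ins for a real risk-adjustment model.

def oe_ratio(observed: int, expected: float) -> float:
    """O/E below 1.0 means fewer incidents than risk adjustment predicts."""
    return observed / expected

observed_incidents = 4   # fixed: what actually happened cannot change
expected_before = 3.2    # expected count with under-documented acuity
expected_after = 5.0     # expected count once CMI, SOI, and ROM are captured

print(round(oe_ratio(observed_incidents, expected_before), 2))  # 1.25
print(round(oe_ratio(observed_incidents, expected_after), 2))   # 0.8
```

With the same four observed incidents, fuller documentation of patient acuity moves the ratio from worse than expected (1.25) to better than expected (0.8).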


Although other studies have shown the utility of CDAPs in improving documentation and reimbursement,11–13 this study is, to our knowledge, the first to examine clinical documentation improvement in the plastic surgery literature.14 The information age has allowed healthcare consumers to compare hospitals and providers on the basis of the quality of services provided. Availability of quality measure outcomes on websites allows patients to compare providers and institutions on their performance on quality-of-care metrics. This availability of data will intensify scrutiny of individual providers and hospitals. To perform at the highest levels of healthcare quality, strict adherence to quality reporting and clinical documentation is paramount.

The current healthcare system is moving toward reimbursement of physicians with an increasing emphasis on quality of care. Value-Based Purchasing and PQRS are examples of programs focused on quality measures that directly influence reimbursement. The pay-for-performance model is expected to change substantially as a result of the introduction of ICD-10 on October 1, 2015. Understanding documentation as it relates to coding and billing is essential to practicing medicine and streamlining patient care.

This study has several potential limitations. Several variables can influence CMI. We hypothesized that more accurate clinical documentation would lead to an increase in CMI during the study period. We showed that the increase in CMI was mirrored by increases in ROM and SOI.

Because CMI is based on an average of the DRGs, one potential explanation for the increase in CMI is that the division performed procedures that had higher numerical DRG values. The corresponding increase in SOI and ROM, however, indicates that the division was accurately documenting patient CCs during this time, which could also explain the increase in the numerical DRG value. Interestingly, patients in the division were discharged earlier than the national average GMLOS indicates. This finding may be related to the heightened awareness of individual patient CCs that the clinical team developed as a result of the implementation of the CDAP. The team certainly paid closer attention to what might otherwise have been considered minute or irrelevant details of each patient’s medical record as a result of the CDAP. We believe this attention to detail resulted in higher quality care overall, while also improving the patients’ transitions to home.
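Because CMI is an average of DRG relative weights across discharges, the effect of capturing CCs and MCCs can be illustrated with hypothetical weights: the same procedures map to higher-weighted DRGs once comorbidities are documented, raising the average.

```python
# CMI as the mean DRG relative weight across a period's discharges.
# The weights below are illustrative, not actual CMS values.

def case_mix_index(drg_weights: list[float]) -> float:
    """Average relative weight over all discharges in the period."""
    return sum(drg_weights) / len(drg_weights)

# Same four procedures; with CCs/MCCs documented, two discharges
# shift to higher-weighted DRGs:
before = [0.8, 1.0, 1.0, 1.2]   # under-documented
after = [0.8, 1.0, 1.5, 2.1]    # CC/MCC captured on two cases

print(round(case_mix_index(before), 2))  # 1.0
print(round(case_mix_index(after), 2))   # 1.35
```

No additional procedures were performed in this toy example; only the documented complexity of the same patients changed, which is the alternative explanation the SOI and ROM trends support.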

For academic plastic surgeons, quality of care demands precise documentation of each patient. This study showed that this goal can be accomplished by ensuring the accuracy of clinical documentation. Plastic surgeons have adhered to a time-honored tradition of innovation and creativity in problem solving. We must continue to adhere to these ideals and must apply the same precision used in the operating room to patients’ medical records. The CDAP provides one avenue to hone clinical documentation and performance on quality measures.


Implementation of a CDAP resulted in increases in CMI, calculated SOI, and calculated ROM and a decrease in length of stay. The Division of Plastic Surgery was able to improve its documentation and, in doing so, improved the recognition of the complexity of the patients it was treating. As transparency in outcomes becomes a reality, it is critical for institutions to be compared with those treating similar patients. In this study, an endeavor to improve documentation proved fruitful in terms of both quality of care and financial reimbursement for the hospital.


Benjamin J. Kittinger, MD, is a resident in the Division of Plastic Surgery in the Department of Surgery at Scott & White Healthcare/Texas A&M Health Science Center College of Medicine in Temple, TX.

Anthony Matejicka II, DO, is a consultant in the Department of Medicine at Community Medical Center–Barnabas Health in Toms River, NJ.

Raman C. Mahabir, MD, is a consultant in the Division of Plastic and Reconstructive Surgery at Mayo Clinic Hospital in Phoenix, AZ.



  1. Keehan, S. P., G. A. Cuckler, A. M. Sisko, A. J. Madison, S. D. Smith, J. M. Lizonitz, et al. “National Health Expenditure Projections: Modest Annual Growth Until Coverage Expands and Economic Growth Accelerates.” Health Affairs 31, no. 7 (2012): 1600–1612.
  2. Encinosa, W. E., and F. J. Hellinger. “Have State Caps on Malpractice Awards Increased the Supply of Physicians?” Health Affairs web suppl. (2005): W5-250–W5-258.
  3. Department of Health and Human Services, Centers for Medicare & Medicaid Services. “Medicare Program; Hospital Inpatient Prospective Payment Systems for Acute Care Hospitals and the Long-Term Care Hospital Prospective Payment System and Fiscal Year 2013 Rates; Hospitals’ Resident Caps for Graduate Medical Education Payment Purposes; Quality Reporting Requirements for Specific Providers and for Ambulatory Surgical Centers.” 42 CFR Parts 412, 413, 424, and 476. Federal Register 77, no. 170 (August 31, 2012): 53268. Available at (accessed February 17, 2015).
  4. Centers for Medicare & Medicaid Services. “Physician Quality Reporting System.” Available at (accessed February 17, 2015).
  5. Centers for Medicare & Medicaid Services. “Hospital Inpatient Quality Reporting Program.” Available at (accessed February 17, 2015).
  6. Centers for Medicare & Medicaid Services. “Hospital Compare.” Available at (accessed February 17, 2015).
  7. Centers for Medicare & Medicaid Services. “Physician Compare.” Available at (accessed February 17, 2015).
  8. Centers for Medicare & Medicaid Services. “Hospital Compare.”
  9. Centers for Medicare & Medicaid Services. “Physician Compare.”
  10. Centers for Medicare & Medicaid Services. “Acute Inpatient PPS.” Available at (accessed February 17, 2015).
  11. Frazee, R. C., A. V. Matejicka II, S. W. Abernathy, M. Davis, T. S. Isbell, J. L. Regner, et al. “Concurrent Chart Review Provides More Accurate Documentation and Increased Calculated Case Mix Index, Severity of Illness, and Risk of Mortality.” Journal of the American College of Surgeons 220, no. 4 (2015): 652–56.
  12. Spellberg, B., D. Harrington, S. Black, D. Sue, W. Stringer, and M. Witt. “Capturing the Diagnosis: An Internal Medicine Education Program to Improve Documentation.” American Journal of Medicine 126, no. 8 (2013): 739–43.
  13. Kim, D., and B. Spellberg. “Does Real-Time Feedback to Residents with or without Attendings Improve Medical Documentation?” Hospital Practice 42, no. 3 (2014): 123–30.
  14. Danzi, J. T., B. Masencup, M. A. Brucker, and C. Dixon-Lee. “Case Study: Clinical Documentation Improvement Program Supports Coding Accuracy.” Topics in Health Information Management 21, no. 2 (2000): 24–29.


Benjamin J. Kittinger, MD; Anthony Matejicka II, DO; and Raman C. Mahabir, MD. “Surgical Precision in Clinical Documentation Connects Patient Safety, Quality of Care, and Reimbursement.” Perspectives in Health Information Management (Winter 2016): 1-11.
