by Donna Wilson, RHIA, CCS; Kim Hampton-Bagshaw, BSBM, CCS; Therese M. Jorwic, MPH, RHIA, CCS, CCS-P; Jean Bishop, CCA; and Elizabeth Giustina, CCS-P
Establishing a standard coding workflow that incorporates continuous improvement and creating performance benchmarks are key to improving the quality of coded data.
HIM professionals who assign codes or manage coded data take responsibility for translating clinical documentation and health services information into bytes consumable by today’s information systems. Each seeks to be a champion of data integrity, doing his or her best to stay above the fray of competing interests for data use.
The data they code can have far-reaching effects beyond provision of care. Administrative code sets are used for a variety of reporting requirements, including registries, indexing of disease and operations, prospective payment systems, fee schedule building, health plan insurance coverage, medical necessity justification, quality measurement, vital statistics, public health, and much more.
Providing quality healthcare requires that this encoded data meet established quality standards.1 The performance measures and work process flow behind these data, however, have yet to be standardized, making automation difficult and unreliable and current industry comparisons imprecise and variable.
The lack of standardization hinders communication about clinical coding performance and work processes, and it impedes future development. A common language or vocabulary is a fundamental component of both performance measurement and workflow. Re-engineering the work process and evaluation measures aims to facilitate the use of informatics tools that make the coding process more reliable and efficient.
Workflow processes vary by purpose, practice setting, and organizational factors, among other variables. Formidable challenges remain in defining the measurement benchmarks, standardization, and continuous quality improvement models that will improve data integrity and code assignment.
To address these challenges, HIM professionals came together through AHIMA in 2007 to re-engineer the coding workflow and performance benchmarks and establish useful tools for a better way to accomplish coding’s ultimate goal—data everyone can count on. The group created a model for the coding workflow process and developed standard tools and best practices for measurement of code assignment reliability and productivity. They laid a foundation for change designed to make the coding process better.
A Universal Coding Workflow
The need for change has been building over several years, accelerated by the increased use of technology for data mining, the quest for automation, the use of code systems for quality-of-care measures tied to payment incentives, and scrutiny of reimbursement accuracy.
The volunteers were charged with developing processes and measurement systems that can assist in both the identification and subsequent revision or elimination of problem codes, data models, and guidelines or other elements that add cost, complexity, and confusion to coded data results.
The work began with evaluating the traditional coding work process to create an incremental model simple enough to be universally understood and wide-ranging enough to support a continuous quality improvement cycle. The seven-step model can be used in any healthcare setting where codes are assigned for data processing. Coding managers should compare this model to their current workflow and determine whether their coding process includes all seven steps, completing the circle with quality assurance.
Two work groups developed standard methods of reliability and productivity measurement. The methods are designed to help both individuals and organizations monitor data integrity. Read about standard methods that take the guesswork out of the coding quality assessment process in the practice brief “Collecting Root Cause to Improve Coding Quality Measurement” on page 71 [Journal of AHIMA 79, no. 3 (Mar. 2008)].
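By way of illustration only, a minimal sketch of how a coder's throughput might be compared against a locally defined production benchmark appears below; the function, figures, and benchmark value are assumptions for this example, not the work groups' published methods.

```python
# Hypothetical sketch: comparing one coder's throughput to a locally defined
# production benchmark. All figures are illustrative, not published standards.

def records_per_hour(records_coded: int, hours_worked: float) -> float:
    """Simple productivity rate: records coded per productive hour."""
    return records_coded / hours_worked

# Example: a coder completes 42 inpatient records in 7.5 productive hours.
rate = records_per_hour(42, 7.5)
benchmark = 5.0  # assumed facility production benchmark, records per hour

status = "meets" if rate >= benchmark else "falls below"
print(f"Productivity: {rate:.1f} records/hour ({status} the benchmark of {benchmark})")
```

In practice, any such benchmark would need adjustment for record type, case complexity, and care setting before it could be used fairly in performance assessment.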
Building measures for evaluation, benchmarking, and documentation improvement at the point of service is a critical factor. Data quality is impossible if the source document is inadequate, ambiguous, inaccurate, or incomplete at the time of coding. Improving source documentation involves clinical documentation improvement programs focused on the results important to patient care and data management.
A Model for the Coding Workflow Process
This model can be used in any healthcare setting where codes are assigned for data processing. It is intended to be simple enough to be universally understood yet comprehensive enough to help create a continuous quality improvement cycle.
Detail on each step in the process may be found in the conference proceedings for the AHIMA 2007 Computer Assisted Coding Standards Workshop and in the resources section of the CAC Standards Community of Practice at www.ahima.org.

[Figure: A Model for the Coding Workflow Process. Source: AHIMA Computer Assisted Coding Community of Practice]
Scanning the Coding Landscape
All process change requires an understanding of the current environment. To assess the challenges ahead, the teams scanned the field to identify what’s “out there” now:
- Code assignments are prone to error due to a variety of factors.
- Inter-rater reliability, even between expert coders, is lower than desired (a simple agreement sketch follows this list).
- Coding is most often performed the “old-fashioned way”: coding professionals read text documents from paper charts or computer screens and enter the codes into data fields by hand.
- There is no recognized or official “right” way to measure cost, productivity, and accuracy of code assignment and associated data abstraction. A reference standard (sometimes referred to as a “gold standard”) does not exist.
- Existing coding workflows do not always include a quality review and improvement component (steps 6 and 7 in the coding workflow process figure). These steps include comparing code assignment variances between individuals from the same source data, tracking deficiencies in the codes or code sets to influence change to reduce variability, and correcting the root causes of variance that affect data quality and integrity.
- Automation is possible for some tasks, but barriers in the current process, most often external regulatory factors, make it difficult to accomplish.
- National metrics would uncover “points of pain and weakness” in the existing HIPAA code sets and transactions by allowing valid comparability and uniform variance tracking.
- Rather than require multiple layers of evaluation and review for accuracy checking, healthcare organizations should use tools similar to those used by payers to evaluate the performance of individual coders, specific codes, or code sets.
- Review processes must include the application of appropriate data integrity remedies so variances do not continue to replicate.
- Standardized metrics for individual performance assessment create fairness and transparency for coding professionals, foster career growth, and reduce compliance risk.
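As a minimal sketch of the kind of consistency measurement described above, the example below compares two coders' code assignments for the same record to each other and to a reference standard; the coders, code values, and reference set are invented for illustration and do not represent an AHIMA-endorsed method.

```python
# Hypothetical sketch: comparing two coders' code assignments for the same
# record to each other and to a reference standard. Codes are illustrative.

def agreement_rate(a: set[str], b: set[str]) -> float:
    """Inter-rater agreement: codes assigned by both, as a share of all codes assigned."""
    union = a | b
    return len(a & b) / len(union) if union else 1.0

def variances(assigned: set[str], reference: set[str]) -> dict[str, set[str]]:
    """Codes omitted from, or assigned beyond, the reference standard."""
    return {"omitted": reference - assigned, "added": assigned - reference}

# Example: two coders reviewing the same record (illustrative ICD-9-CM codes).
coder_a = {"250.00", "401.9", "428.0"}
coder_b = {"250.00", "401.9", "585.9"}
reference = {"250.00", "401.9", "428.0", "585.9"}  # assumed reference standard

print(f"Inter-rater agreement: {agreement_rate(coder_a, coder_b):.0%}")
for name, codes in (("A", coder_a), ("B", coder_b)):
    print(f"Coder {name} vs. reference standard: {variances(codes, reference)}")
```

The set-based comparison keeps the variance categories aligned with the “codes omitted” and “unnecessary codes assigned” language used in the terminology table later in this article.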
Choosing Words with Care
Coding quality initiatives often cross into financial and regulatory compliance territory, where words may carry different and exact definitions. In the table below, the words in the left column may in some situations carry special meaning for others or invoke an unintended standard. They should be used with care or avoided in favor of more neutral terms. The terms in the right column are frequently used in review processes involving clinical code assignment and help decrease the potential for misunderstanding.
| Words to Use with Care or to Avoid Completely | Alternate, Neutral Terms |
|---|---|
| Audit | Trace, monitor, evaluate, assess, inspect, check, analyze, measure performance |
| Errors, error rates | Variance, inconsistent with reference standard, consistency rate |
| Undercoding | Codes omitted, level of care lower than documentation guidelines allow |
| Overcoding | Unnecessary or otherwise inappropriate codes assigned, level of care higher than documentation guidelines allow |
| Incentive targets | Production benchmark, performance measure, performance standard |
| Gold standard | Reference standard, benchmark |
| Accuracy | Variance, comparison to standard |
| Precision | Specificity |
A “Controlled Vocabulary” for Process Improvement
There are many ways HIM professionals can express their organization’s intent to follow recognized best practices and compliance standards in process improvement. However, they should be conscious of the words they choose, because some words have sensitive and highly specific meanings depending on the audience.
Coding measurement often crosses into an organization’s financial and regulatory compliance territory. It is helpful to remember that some words may imply different things in different departments. When discussing, documenting, or planning coding process re-engineering or performance assessment, HIM professionals should use certain words with care or avoid them completely. The coding profession should consider creating its own controlled vocabulary to communicate clearly to others about measurements, their purpose, and their effectiveness.
The terms shown on the right in the table above are typically neutral in a process involving clinical code assignment. The terms on the left, however, should be used with care; it may be preferable to avoid them altogether.
In some situations, for example, the assertion that a coding manager conducted an “audit” may imply a type of review that the legal entity or professional providing the service (e.g., the coding manager or consultant reviewer) is neither qualified nor authorized to perform. That is especially true if the associated financial transactions are affected, such as in claims submission or resubmission for reimbursement.
Auditing and monitoring as a joint concept were introduced by the Federal Sentencing Guidelines and related Office of Inspector General compliance program guidance.2 The phrase has taken on special meaning since the enactment of the Sarbanes-Oxley legislation.3
There are two formal types of auditing that are generally understood—financial statement auditing and internal auditing for controls and risk assessment. Each type is governed by professional standards that predate corporate compliance plans for healthcare organizations. In both cases, the concept of auditing is specific and includes the concepts of independence and objectivity.
Standards such as the Generally Accepted Auditing Standards for financial statement audits and the Standards for the Professional Practice of Internal Auditing for internal audits also refer to objectivity-related concepts such as professional care and professional skepticism.4,5 It is generally understood that to be called an “audit,” the activity should have been conducted by the internal audit department, a division of the healthcare enterprise, or another independent party with reporting responsibility to the CEO or the organization’s board of directors. It is possible that a coding compliance review conducted under the authority and at the direction of the chief compliance officer, who reports to the CEO or board of directors, may properly be called an audit.
All other actions that support management activities designed to ensure compliance with regulatory requirements for code assignment, including selected activities completed as part of a work plan, should generally be labeled as monitoring. This conservative approach focuses on the data quality, education, or training aspects without introducing overlaying financial implications.
Using alternative terms builds a more meaningful and positive rapport between reviewer and coder, supporting consistency and compliance with both internal and external rules. It also reduces the risks associated with project findings by focusing on the continuous quality improvement process rather than the punitive aspects of not “getting it right,” when the definition of “right” involves multiple factors, some beyond the control of the individual selecting codes.
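One lightweight way such a controlled vocabulary might be put to work is a simple lookup that flags sensitive terms in draft review reports and suggests neutral alternatives drawn from the table above. The sketch below is a hypothetical illustration, not an AHIMA tool, and the term mappings are a simplified selection from that table.

```python
# Hypothetical sketch: flagging "use with care" terms in a draft review report
# and suggesting neutral alternatives drawn from the table above.

NEUTRAL_TERMS = {
    "audit": "monitoring",
    "error rate": "variance",
    "undercoding": "codes omitted",
    "overcoding": "unnecessary or inappropriate codes assigned",
    "incentive target": "production benchmark",
    "gold standard": "reference standard",
}

def flag_sensitive_terms(text: str) -> list[tuple[str, str]]:
    """Return (sensitive term, suggested neutral term) pairs found in the text."""
    lowered = text.lower()
    return [(term, neutral) for term, neutral in NEUTRAL_TERMS.items() if term in lowered]

draft = "The Q2 coding audit measured an error rate against the gold standard."
for term, neutral in flag_sensitive_terms(draft):
    print(f"Consider replacing '{term}' with '{neutral}'")
```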
Information Guardians
HIM professionals have been referred to as “information guardians” in the healthcare delivery system. Standard methods of measurement serve that higher purpose and protect the integrity of translations from clinical data into codes for other important, secondary uses.
Reliable information for medical research, biosurveillance, decision support, consumer use, health policy development, and patient safety assurance depends on those responsible for code assignment or management of the process. They must take this role seriously and constantly watch for ways to make the process better.
This standards work began with workflow design and performance measures. Now it needs scrutiny from HIM professionals and other industry stakeholders who use coded data, to refine it and confirm its “goodness of fit.”
This work is a vital step toward a better coding process: embracing a standard workflow and using standard performance measures for coding accuracy, consistency reviews, and production. Together, the healthcare industry can create reliable coding results and valid measures for performance assessment.
Notes
1. AHIMA. “Quality Healthcare Data and Information.” Position statement. December 2007.
2. U.S. Federal Sentencing Guidelines. Available online at www.ussc.gov/guidelin.htm.
3. Sarbanes-Oxley Act of 2002. Public law. Available online at http://thomas.loc.gov/cgi-bin/query/z?c107:H.R.3763.ENR:.
4. Generally Accepted Auditing Standards. Available online at www.gao.gov/govaud/ybk01.htm.
5. Institute of Internal Auditors. International Standards for the Professional Practice of Internal Auditing. Available online at www.theiia.org/guidance/standards-and-practices.
Donna Wilson (donna.wilson@rsfh.com) is revenue integrity manager for Roper St. Francis Healthcare in Charleston, SC. Kim Hampton-Bagshaw is director of coding services for Pyramid Healthcare Solutions. Therese M. Jorwic is an assistant professor at the University of Illinois at Chicago and a consultant for MC Strategies in Atlanta. Jean Bishop is senior manager with Deloitte Financial Advisory Services LLP (the views expressed in this article are those of the authors, and do not necessarily reflect the views of Deloitte Financial Advisory Services LLP). Elizabeth Giustina is with MedPartners in Jacksonville, FL. The authors would like to acknowledge the assistance of Rita Scichilone, MHSA, RHIA, CCS, CCS-P, and Carol Spencer, RHIA.
Article citation:
Wilson, Donna D.; Hampton-Bagshaw, Kim; Jorwic, Therese M.; Bishop, Jean; Giustina, Elizabeth. "New Focus on Process and Measure: Raising Data Quality with a Standard Coding Workflow and Benchmarks." Journal of AHIMA 79, no. 3 (March 2008): 54-58.