Ruby E. Nicholson, RHIT, and David R. Penney
The health information professional consistently monitors coding, documentation, and data quality to ensure reimbursement and compliance with healthcare standards. As healthcare costs rise and consumer needs grow and change, it is important for the healthcare industry to evaluate trends and develop systems of care that are effective, efficient, and affordable for both the patient and the provider. Quality data and accurate information are essential to decision making, and the health information professional's diligence in assuring both is critical to the healthcare provider's future.
While it is impossible to present all of the information available for healthcare decision making in one brief paper, we will explore a few data quality initiatives that have been instrumental in a community outpatient behavioral health clinic. Keep in mind that each organization has its own character based on the types of services delivered and the systems designed to support them. The scenarios presented are intended to offer some insight into possibilities and to illustrate how one organization used data to provide meaningful information. The example also demonstrates the importance of the health information professional's role in this process.
Oversight agencies or payers--often federal and state government entities--govern much of a provider's data collection. While this information helps determine system and funding needs, we will be discussing using this same data from the individual provider's perspective. Since development of systems of care that meet the consumer's needs is our primary concern, we will look at data that provides information on the effectiveness and efficiency of services as well as issues related to costs.
The diagnostic profile of persons served is a key variable in any outcomes analysis. Unfortunately, this information is sometimes inaccurate or incomplete. Unlike the physical health setting, where physicians provide diagnoses, in the behavioral health setting a variety of disciplines formulate the diagnostic profile of the patient. Sometimes there can even be conflicting information as to the primary diagnosis, or there can be co-occurring diagnoses for co-occurring services. Behavioral health providers use a multiaxial system (DSM-IV) to develop a clinical profile of an individual's diagnoses, psychosocial and environmental problems, and level of functioning. The five axes of the DSM-IV not only give a comprehensive overview of the patient and identify a clinical focus but also provide information for reimbursement and outcomes management. This information forms the basis for clinical practice, service design, and analysis of the effectiveness and efficiency of treatment services. It is therefore critical that the data obtained through a multiaxial assessment be accurate. The analysis that follows explains some of the problems that occur and one organization's approach to ensuring quality data and the reliability of the outcomes information derived from it.
Inaccuracies between the data in the database and the information in the clinical record for Axis I (primary diagnosis), along with inconsistencies in the scoring of the patient's level of functioning (Axis V, Global Assessment of Functioning [GAF]), were identified as significant problems. Both required the development of data quality checks for the HIM and quality improvement (QI) staff. Systems were designed to review diagnoses, analyze data, and ensure accurate data was captured. QI staff members were already reviewing treatment plans for specific requirements when entering review dates into the tasks file. This review was extended to include checking the diagnoses in the database against the diagnoses on the treatment plan. As a result of this monitoring, we found that the diagnoses were often changed by other disciplines (nurse, case manager, social worker, etc.) at treatment plan review while the treating psychiatrist had documented different diagnoses. A process was developed whereby doctors' transcribed notes were reviewed for any diagnosis change and the database updated. Any changes were noted in the database and notification sent to the staff responsible for updating the treatment plan. If there was disagreement about the diagnosis, the doctor's diagnosis took precedence until both parties could discuss it and come to agreement. Requests from any staff other than the medical staff to update the diagnosis in the database required a change sheet for QI/HIM review. If the requested diagnosis conflicted with the database, it was not entered until the staff member had come to an agreement with the treating psychiatrist. This process not only corrected database and documentation inconsistencies but also provided opportunities for further discussion of client care and improvement in the quality of services.
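The precedence rule described above can be sketched in a few lines of code. This is an illustrative sketch only, not the Center's actual system; the function name, diagnosis codes, and data shapes are all hypothetical.

```python
# Illustrative sketch of the diagnosis reconciliation rule: compare the
# database diagnosis with the treatment plan diagnosis, deferring to the
# treating psychiatrist's diagnosis whenever the sources disagree.
# All codes below are hypothetical examples.

def check_diagnosis(db_dx: str, plan_dx: str, psychiatrist_dx: str):
    """Return the diagnosis of record and a discrepancy note (or None)."""
    if db_dx == plan_dx == psychiatrist_dx:
        return db_dx, None
    # The doctor's diagnosis takes precedence until the discrepancy
    # is discussed and resolved with the other discipline.
    note = f"discrepancy: db={db_dx}, plan={plan_dx}, md={psychiatrist_dx}"
    return psychiatrist_dx, note

# Example: the treatment plan was changed at review, but the psychiatrist's
# transcribed note still carries the original diagnosis.
dx, note = check_diagnosis("296.32", "300.02", "296.32")
```

In a real workflow the discrepancy note would drive the change-sheet and notification steps described above rather than silently overwriting either record.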
Timely correction of diagnoses was critical to the reimbursement process and necessary to prevent incorrect or improper billing. Since many patients receive multiple types of services at the Center, it was also necessary to make sure the appropriate diagnosis was applied to the services provided. (If mental health counseling was being provided, the patient's primary diagnosis could not be polysubstance abuse; neither could billing occur for a substance abuse session for a patient whose sole diagnosis was bipolar disorder.) While compliance reviews could randomly catch some problems, it was necessary to build error reports into the database that would identify services lacking a reimbursable diagnosis. HIM staff members were also trained to identify discrepancies during data entry at program enrollment.
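An error report of this kind amounts to cross-checking each billed service against the set of diagnoses reimbursable for that service type. The sketch below illustrates the idea under stated assumptions: the service type names, diagnosis code families, and record layout are hypothetical, not the Center's actual schema.

```python
# Illustrative sketch: flag services billed against a primary diagnosis
# that is not reimbursable for that service type (e.g., a substance abuse
# diagnosis billed under mental health counseling). Codes are hypothetical.

REIMBURSABLE = {
    "mental_health_counseling": {"296.xx", "300.xx"},  # e.g., mood/anxiety families
    "substance_abuse_session": {"303.xx", "304.xx"},   # e.g., alcohol/drug families
}

def diagnosis_family(code: str) -> str:
    """Reduce a diagnosis code to its family, e.g., '304.80' -> '304.xx'."""
    return code.split(".")[0] + ".xx"

def error_report(services):
    """Return services whose primary diagnosis is not reimbursable for the type."""
    errors = []
    for svc in services:
        allowed = REIMBURSABLE.get(svc["service_type"], set())
        if diagnosis_family(svc["primary_dx"]) not in allowed:
            errors.append(svc)
    return errors

services = [
    # Counseling billed against polysubstance abuse -- should be flagged.
    {"client": "A", "service_type": "mental_health_counseling", "primary_dx": "304.80"},
    # Substance abuse session with an alcohol dependence diagnosis -- valid.
    {"client": "B", "service_type": "substance_abuse_session", "primary_dx": "303.90"},
]
flagged = error_report(services)
```

Run as a scheduled report against the billing database, a check like this surfaces non-reimbursable pairings before claims go out, rather than relying on random compliance review to catch them.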
The inherent problem found with the GAF scores for Axis V was rater reliability. It became even more apparent when GAF scores were missing upon entry into or discharge from Kent Center programs. Further analysis showed that there was very little movement in scale scores regardless of the time spent in treatment. To remedy the problem with these scores, we brought in trainers to educate staff on how to use the GAF. Once staff was fully versed in GAF scoring, we observed both positive and negative movement in pre- and post-treatment scores. (See Figure 1.)
The problem with missing scores tied into another problem we were encountering: missing data fields in admission paperwork. A monitoring system was set up so that each error was tracked by staff member and by program. Monthly reports were sent to management describing what kinds of mistakes were being made and who was making them, along with suggestions for staff training. When comparing September 2002 through March 2003 with September 2003 through March 2004, there was an 81 percent decrease in the number of errors in the admission paperwork. (See Figure 2.) This equated to increased productivity of the HIM staff; increased accuracy and timeliness in reporting data fields; and, to a lesser extent, fewer documents being passed back and forth between staff and unavailable when needed. The improvements in Axis V data now provide the Center with more accurate information on the functional level of patients at discharge and the ability to analyze whether the type and amount of service provided was beneficial to the patient. Diagnostic information from Axes I and V is also analyzed to determine the types of consumer needs and the best practices to meet those needs.
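The tracking described above reduces to two simple computations: tallying errors by staff member and by program for the monthly report, and computing the percent change between comparison periods. A minimal sketch, with hypothetical names and counts:

```python
# Illustrative sketch: tally admission-paperwork errors for a monthly
# management report and compute the percent decrease between two periods.
# Staff names, programs, and counts are hypothetical examples.
from collections import Counter

def tally_errors(error_log):
    """Count errors per staff member and per program."""
    by_staff = Counter(e["staff"] for e in error_log)
    by_program = Counter(e["program"] for e in error_log)
    return by_staff, by_program

def percent_decrease(before: int, after: int) -> float:
    """Percent decrease in error counts between two comparison periods."""
    return round(100 * (before - after) / before, 1)

log = [
    {"staff": "Smith", "program": "Adult OP", "field": "GAF at admission"},
    {"staff": "Smith", "program": "Adult OP", "field": "admission date"},
    {"staff": "Jones", "program": "Child OP", "field": "GAF at admission"},
]
by_staff, by_program = tally_errors(log)
change = percent_decrease(100, 19)  # e.g., 100 errors falling to 19
```

Note that an 81 percent decrease corresponds to, for example, 100 errors in the earlier period falling to 19 in the later one; the per-staff and per-program breakdowns are what make the training suggestions in the monthly report actionable.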
Another data quality concern that The Kent Center encountered was inaccuracy in the use of an assessment tool for children, the Child and Adolescent Functional Assessment Scale (CAFAS). Implementation of this tool requires that staff from an entire program attend a one-day training at which they are tested for reliability and competency in administering it; maintaining reliability and competency also requires annual retesting. The problem that arose was that a large portion of our clinical staff was not versed in using computer applications. This required training staff not only in navigating and operating the software but, in some cases, in operating a personal computer. Since inception in August 2002, QI staff have spent an average of 8.39 hours a month training staff, performing data cleanup, executing administrative duties, and reporting the data. While the time spent on each task has fluctuated, the transition from trainer to administrator shifted the distribution of staff time from roughly one-quarter administrative and three-quarters training to one-quarter training and three-quarters administrative. This is due not only to staff needing less training as they became familiar with the software but also to the requirement that reports and data exports be performed, which increases the time spent cleaning and formatting the data. Overall, the average monthly time spent on CAFAS duties by administrative staff has decreased by roughly 30 minutes per month since inception, and the trend in hours spent continues downward. As for outcome results, we now have excellent information showing that functional status upon admission validates the need for services and that the provision of services has made a difference in the child's functional level at discharge. (See Figures 3 and 4.)
HIM staff have always been a part of documentation audits; however, stricter licensing and accreditation requirements and the OIG's compliance initiatives have raised the need for accurate monitoring data for management. The compliance program at The Kent Center tracked a variety of errors, including missing treatment plans, progress notes, data entry and billing errors, privacy issues, and other compliance-related concerns. A database to house compliance audit findings was created, and reports were written for both administrators and managers on what was being found in the audits. Forms were created to inform staff of lost revenues due to errors attributable to them. Trainings were conducted for individual programs using data from that program's compliance audits, and The Kent Center's documentation training targeted the errors found in the audits. The increased auditing, monitoring, and notification have led to a 27 percent decrease in compliance errors. (See Figure 5.) Information from compliance audits has helped all staff learn the importance of their part in ensuring accurate data and documentation. Incomplete, inaccurate, and missing documentation all equate to lost revenues.
The Kent Center's licensing body requires data submissions that include 11 targeted health conditions: alcohol abuse, hypertension, hypercholesterolemia, smoking, asthma, hepatitis, obesity, drug abuse, diabetes, COPD, and life-threatening viral infections. These areas are assessed via yes/no questions that staff ask the client at admission and annually thereafter. This information is used to develop service plans as well as systems of care for individuals who are receiving behavioral health services. While the accuracy of this data is important for individual treatment planning, it is vital for the development of statewide systems of care and for assessing the types of co-occurring conditions that incur healthcare costs and affect behavioral health issues. Statewide systems can measure the numbers and types of services accessed by behavioral health patients and provide the basis for future planning.
Obviously, costs for service delivery are a concern for every healthcare provider. A great deal of time and money has been spent implementing compliance programs. However, data may identify cost inefficiencies in other areas as well. One such inefficiency is appointments not kept, which The Kent Center refers to as FKAs (failed to keep appointment). The loss of revenue, interruption of the clinical milieu, and decrease in positive outcomes were only a few results of the growing FKA problem. This is an enormous issue in an outpatient behavioral health setting and one that requires close monitoring and accurate data.
When we started analyzing the appointment data, a problem quickly arose: there was a discrepancy between what clinicians were documenting on their time logs and what the receptionist was reporting at the front desk. Sometimes clinicians were not informing the front desk that clients had called and cancelled; while the clinician was documenting a cancellation, the receptionist was documenting an FKA. Another problem arose in that some clinicians did not understand the need to document FKA and cancellation data on their time logs; these clinicians would use "Admin Time" to fill the gaps left by clients who did not show or who cancelled. Yet another problem was that the client sometimes did not check in appropriately with front desk staff, which led to time on the clinician's log but an FKA in the front desk documentation. All of these issues meant we had significant problems with the reliability of the FKA data. Staff education and training were provided, and clients were educated about checking in upon arrival. The trending of FKAs has improved, and we now have more reliable data.
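Detecting these discrepancies is a matter of reconciling the two record sources appointment by appointment. The sketch below illustrates the comparison; the status labels, appointment IDs, and data shapes are hypothetical, not the Center's actual system.

```python
# Illustrative sketch: reconcile the clinician's time log with the front-desk
# record for the same appointments, flagging any disagreement (e.g., the
# clinician logged a cancellation while the front desk recorded an FKA).
# All data below is hypothetical.

def reconcile(clinician_log, front_desk_log):
    """Return (appointment, clinician status, front desk status) for mismatches."""
    discrepancies = []
    for appt_id, clinician_status in clinician_log.items():
        desk_status = front_desk_log.get(appt_id)
        if desk_status is not None and desk_status != clinician_status:
            discrepancies.append((appt_id, clinician_status, desk_status))
    return discrepancies

clinician = {"A1": "kept", "A2": "cancelled", "A3": "kept"}
front_desk = {"A1": "kept", "A2": "FKA", "A3": "FKA"}  # A3: client never checked in
issues = reconcile(clinician, front_desk)
```

Each flagged pair points at one of the failure modes described above (an unreported cancellation, "Admin Time" masking a no-show, or a missed check-in), so the discrepancy list doubles as a targeted training report.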
Auditing, analysis, trending, education and training, monitoring, and re-auditing are all part of ensuring that quality data is available. Data drives reimbursement, outcomes, systems design, and decision making. HIM professionals have the responsibility of providing or overseeing many of the day-to-day functions that ensure data, documentation, and information are accurate and timely. Information validates the need for services and the effectiveness of the services provided. Outcome data assist in shaping best practice decisions and positively affect the bottom line. It is not difficult to conclude that quality data provides quality information, which leads to improved systems of care.
|Source: 2004 IFHRO Congress & AHIMA Convention Proceedings, October 2004|