Janis L Huston, PhD, MEd, RHIA
Dr. Georg A. Brox
Introduction to the Information for Health NHS Strategy Document
The National Health Service (NHS) of the United Kingdom (UK) published its Information for Health agenda in 1998 to address such issues as information management, information technology (IT), security, and data quality. The strategy document proposed a stringent timetable for implementing an NHS-wide electronic health record (EHR), delineating specific levels of development ranging from clinical administrative data, through integrated clinical diagnosis and treatment support, clinical activity support, clinical knowledge and decision support, and specialty-specific support, to the highest level of the advanced multimedia and telemedical record.
It was an ambitious document that any health information management (HIM) professional could get excited about, as it appeared that the UK was well down the path of stringent data security policies and short-term targets for full implementation of the EHR. However, after realizing that the HIM profession did not even exist in the UK, and after observing that the state of paper-based records was in total chaos, it became evident how unrealistic the targets set out by the agenda really were. With no documentation standards in place for clinical recording, and with no one responsible for monitoring the information in patient records, the goal of high-quality computer-based patient data appears even more remote.
The Data Accreditation Programme for Assessment of Data Quality
Part of the Information for Health strategy document addressed the need for improvement of data quality. Efficiency teams had found that both time and effort were being wasted by NHS organisations because of poor data quality. The NHS finally recognized the importance of high-quality, consistent, timely, and comprehensive information for supporting patient care, for management and planning purposes, and for accountability. A data accreditation process was proposed to verify that agreed data quality standards were met by health care providers, giving confidence that the information provided was fit for purpose. A Data Accreditation Programme was subsequently developed and put forward in the Information for Health strategy proposal as a recommended action to improve the level of data quality.
The Data Accreditation methodology for acute care providers was first tested in several NHS pilot sites. The methodology involved a three-stage general audit process that was initially voluntary but became mandatory for acute care from 2000/2001. Yet by March 2002, a full two years after the mandate took effect, only one acute care facility had successfully completed the second stage of the accreditation process.
The Initial Data Accreditation Methodology
Data Accreditation initially involved only four data groups in acute care: inpatient and outpatient data, as well as others unique to the NHS system such as waiting list management data. Generally, it was a three-stage process of reviews aimed at determining whether predefined national data quality standards had been achieved by the provider. Among these ten data quality standards, or criteria, were:
- Security and confidentiality
- Validation and quality assurance
- Health records management
- Completeness and validity
- Timeliness
- Accuracy
The three stages of Data Accreditation were 1) the Checklist Review, 2) the Review of Management Processes, and 3) the Data Outputs Review. The process involved an assessment of both internal and external use of the data, and it was designed as an ongoing cycle of review. The accreditation process related purely to data on the hospital patient administration system (PAS), which involved mostly administrative or non-clinical patient data in the Stage 1 and Stage 2 reviews; clinical coding data was to be audited in the Stage 3 Data Outputs Review. These PAS data were considered the cornerstone upon which to build the proposed NHS electronic health record.
The Three Stage Data Accreditation Review Process
Stage One--The Checklist Review comprised an internal assessment carried out by a data quality review team that had to be put into place in each hospital. The purpose of this review was to provide a snapshot of the hospital's current performance in managing patient data. It was a brief, high-level review involving questionnaires for staff who collected, managed, and used the data. From the Checklist, in theory, the data could then be analyzed for completeness and validity. It was hoped that this Review would establish whether the health care provider was complying with relevant legislation such as the Data Protection Act 1998. It was also at this stage that it could be determined whether there were current documented policies and procedures for data capture and management in each area of the hospital where data were collected, and whether a comprehensive training program was in place. The Review was also meant to establish whether there were effective procedures in place for auditing data quality, validation and quality control, accountability, and communications.
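A checklist review of this kind amounts to aggregating yes/no answers by hospital area to produce the snapshot described above. The sketch below uses a hypothetical questionnaire structure of our own devising, not the actual NHS checklist, to illustrate that aggregation.

```python
# Hypothetical Stage One checklist responses: area -> {question: answered_yes}
responses = {
    "Outpatients": {"documented data-capture procedures": True,
                    "staff training programme in place": True,
                    "data quality audit procedure": False},
    "Waiting list office": {"documented data-capture procedures": False,
                            "staff training programme in place": True,
                            "data quality audit procedure": False},
}


def snapshot(responses):
    """Summarise, per area, how many checklist items are currently met."""
    return {area: (sum(answers.values()), len(answers))
            for area, answers in responses.items()}


for area, (met, total) in snapshot(responses).items():
    print(f"{area}: {met}/{total} checklist items met")
```

Such a summary only flags where policies and training are missing; it says nothing yet about the quality of the data itself, which is what the later stages were meant to probe.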
Stage Two--The Review of Management Processes included both an internal and an external detailed assessment of the processes and procedures used by the health care provider to support and manage the patient data in the main patient-based systems. Essentially, this stage was meant to ensure that policies and procedures were written, were current, and were actually being used by the staff entering patient data into the system. The purpose of this Stage was to assess the processes against the first seven of the ten standards, to provide and validate evidence that policies and procedures matched current practice, and to identify any remedial actions that were necessary.
Stage Three--The Review of Data Outputs again comprised a two-step phase, intended to include both internal and external audit of clinical data. At this stage, the last three standards from the list above were audited: completeness and validity, timeliness, and accuracy. This part of the process dealt with clinical patient data as the data outputs, auditing clinically coded data by reviewing 300 patient records to validate the codes entered into the PAS and to substantiate them against documentation in the patient record.
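At its core, such an audit compares the codes held on the PAS against the codes the record documentation actually supports, record by record. The sketch below is a hypothetical illustration of that calculation, not the NHS audit instrument; the sample records and codes are invented.

```python
def coding_accuracy(audited_records):
    """Return the share of audited records whose PAS codes were fully
    substantiated by the documentation in the patient record.

    audited_records: list of (codes_on_pas, codes_supported_by_record)
    pairs, each a set of clinical codes. A record counts as accurate
    only when the two sets match exactly.
    """
    if not audited_records:
        return 0.0
    accurate = sum(1 for pas, record in audited_records if pas == record)
    return accurate / len(audited_records)


# Three invented records standing in for a 300-record audit sample.
sample = [
    ({"I21.9"}, {"I21.9"}),            # code substantiated by the record
    ({"J18.9", "E11.9"}, {"J18.9"}),   # extra code not supported
    ({"K35.8"}, {"K35.8"}),            # code substantiated by the record
]
print(coding_accuracy(sample))  # 2 of 3 records accurate
```

Of course, a real audit would also grade partial matches and distinguish primary from secondary diagnoses, but even this crude rate depends entirely on the documentation being present in the record, which is precisely the gap the authors describe below.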
External Assessors’ Experiences in the Accreditation Process
Both authors were Certified External Assessors for the Data Accreditation Programme. Involvement from the beginning of this process allowed them to observe and provide input into the development of the Programme. Participation in the Data Accreditation Working Group at the national level provided the opportunity to help steer the program in the right direction, based on professional experiences in accreditation and audit from outside the UK system.
Part of the developmental work included clarification of the ten data quality standards in the form of a Compliance Matrix. The Matrix included the definition of each standard, an example of Best Practice, a full clarification and interpretation of the standard, guidance as to what the External Assessor would be looking for during an assessment, the weighting of the requirement, and determination of the acceptable level required for accreditation. Also included in the Matrix were suggested sources of evidence to determine compliance with the standard.
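The elements of the Matrix described above can be sketched as a simple data model. The field names, example entry, and threshold logic below are illustrative assumptions, not the NHS Information Authority's actual schema.

```python
from dataclasses import dataclass, field


@dataclass
class MatrixEntry:
    """One row of a compliance matrix, mirroring the elements described
    above: definition, best practice, assessor guidance, weighting,
    acceptable level, and suggested sources of evidence."""
    standard: str             # the data quality standard being assessed
    definition: str           # what the standard means
    best_practice: str        # an example of best practice
    guidance: str             # what the external assessor looks for
    weighting: int            # relative importance of the requirement
    pass_threshold: float     # acceptable level for accreditation (0 to 1)
    evidence_sources: list = field(default_factory=list)

    def compliant(self, observed_score: float) -> bool:
        """Has the provider met the acceptable level for this standard?"""
        return observed_score >= self.pass_threshold


# Illustrative entry for the first standard listed earlier; the wording
# and numbers are hypothetical.
entry = MatrixEntry(
    standard="Security and confidentiality",
    definition="Patient data are protected against unauthorised access.",
    best_practice="Role-based access controls with audited log-ins.",
    guidance="Assessor checks access policies and staff awareness.",
    weighting=3,
    pass_threshold=0.8,
    evidence_sources=["access policy document", "training records"],
)
print(entry.compliant(0.85))  # True: observed score meets the threshold
```

Laying the standards out this way is what made the Matrix useful for self-assessment: each row tells a review team what to measure, how it is weighted, and what evidence to collect before requesting an external audit.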
The Compliance Matrix was a valuable tool for each hospital data quality review team, helping them ensure during the self-assessment phase that they were prepared to request an external audit. The Matrix clearly laid out what was required to meet each standard, how to demonstrate compliance, and what types of supporting evidence to supply. The process was very straightforward, and to the authors, whose experience included such rigorous accreditation processes as the Joint Commission on Accreditation of Healthcare Organizations (Joint Commission) and the College of American Pathologists (CAP) laboratory accreditation processes in the United States (US), it appeared quite elementary. After all, the standards were mostly focused simply on policies and procedures, which should be standard tools already in place in any health care organization.
What the Assessors found during their Stage 2 reviews, however, was that not only were no HIM professionals (or equivalent) involved in data quality, but no one at all was dedicated to that role within the acute care organizations. Simple management tools such as organizational charts were non-existent for the most part and, while job descriptions might exist, they were not necessarily relevant to the jobs currently being done. Even more shocking was the fact that there were no standard policy and procedure manuals in existence in these hospitals, particularly any addressing the management of patient information. Something as basic as policies and procedures appeared to be a foreign concept to these health care facilities. Managers seemed to have no knowledge of how to write these documents, let alone how to differentiate between a policy and a procedure. Everything that is taught in the basic entry-level HIM management curriculum in the US and Australia appeared to be a new concept to managers working in the NHS.
Impact of the Lack of Policies and Procedures and Clear Lines of Accountability
It is alarming to think that the National Health Service has been operating for over 50 years without utilizing the most basic management tools, particularly with regard to the management of confidential patient information. This is especially true since NHS reforms in the early 1990s changed the health service into a more management-based organization. The glaring gap that is filled by the HIM professional in other health care systems, such as those of the US and Australia, was blatantly highlighted by this exercise of review in the Data Accreditation process. The concept of the program and its goals were sound in theory, but the distance between what should be normal best practice and what was actually happening at the front lines was appallingly wide. The lofty targets and plans for the electronic health record laid out in the Information for Health strategy document are, in the opinion of the authors, undermined by the absence of these basics at the ground level. To expect a computerized system to fill the blatant gaps found in the chaotic paper-based systems is unrealistic and naïve at best.
The deeper one drills down into the basic problems with information management in the NHS, the more one realizes that no one at all is held accountable for the recording of patient data (clinical or non-clinical), nor are there mandated documentation standards, or even guidelines, for health care providers to follow. Unfortunately, even if there were mandates for clinicians, no one has been assigned the responsibility to monitor and enforce those standards.
The Data Accreditation Programme has served its purpose in highlighting the deficiencies in the system, yet there are unfortunately no plans in place to rectify those deficiencies before moving on to the next level of EHR development. Most of the NHS hospitals that completed Stage One and Stage Two assessments may have passed the low standards required, but the mere ticking of a box on a checklist will not improve the quality of the data. The Data Accreditation audit process served merely to emphasize the symptoms of some major managerial problems.
Current Status of the Data Accreditation Programme
As has been the pattern in the NHS, when one initiative does not succeed, its name is changed and a similar one is put in its place. In this case, even though momentum was gathering and the work was starting to highlight the need to focus on data quality in acute care hospitals, the program disappeared after Stage Two assessments had been carried out on most NHS hospitals. The Stage Three Review of Data Outputs never materialized under the Data Accreditation name.
Nevertheless, all is not lost. The latest initiative in this area is known as the NHS Information Governance program, which has adopted some of the Data Accreditation standards and expanded them to cover other types of health care providers besides acute care, such as primary care and mental health. The Information Governance Standards, developed by the Department of Health and the NHS Information Authority, are quite comprehensive and have been applied to a self-assessment tool called the NHS Information Governance Toolkit, which is available to NHS organizations through the NHS Intranet. The Toolkit has incorporated a number of the Data Accreditation standards, particularly from the Stage Three Review of Data Outputs, and will focus on the quality of clinically coded data. The Information Governance Standards now mandate at least annual external audits of clinical coding quality, with a review of a minimum of 100 patient records. This initiative is in its early stages, and Information Governance will hopefully help to move forward the ambitious agenda for managing patient information outlined in the original NHS Information for Health plan.
References

Audit Commission. "Data Remember: Improving the Quality of Patient-Based Information in the NHS." London: The Audit Commission, 2002.
Audit Commission. "Setting the Records Straight: A Study of Hospital Medical Records." Wetherby: Audit Commission Publications, 1995.
Audit Commission. "Setting the Records Straight: A Review of Progress in Health Records Service." Abingdon: Audit Commission Publications, 1999.
Brox, Georg A., and Janis L. Huston. "The Application of the MPEG-4 Standard on Telepathology Images for Electronic Patient Records: A Comparative Analysis." Journal of Telemedicine and Telecare 9 (suppl 1) (2003): 19-21.
Brox, Georg A., and Janis L. Huston. "The Impact of MPEG-4 Standard on Electronic Reporting for Mobile Multimedia Patient Records." Journal of Telemedicine and Telecare 8 (suppl 2) (2002): 115-117.
Department of Health. Data Protection Act 1998 Protection and Use of Patient Information. London: Department of Health, 2000.
Department of Health. Delivering 21st Century IT Support for the NHS: National Strategic Programme. London: Department of Health, 2002.
Huston, Janis L. "A Telemedicine Record Model." Journal of Telemedicine and Telecare 3 (suppl 1), (1997): 86-88.
Huston, Janis L. "Managing Telehealthcare Information." Journal of Healthcare Information Management (HIMSS) 13 no. 4 (1999): 49-58.
Huston, Janis L. "Telemedical Record Documentation." Topics in Health Information Management 19 no. 3 (1999): 59-65.
Huston, Janis L. "Telemedical Record Documentation: A Preliminary Survey." Journal of Telemedicine and Telecare 5 (suppl 1) (1999): 6-8.
Huston, Janis L. "Telemedical Records: Documentation, Legal, and Privacy Concerns." In Confidence 6 no.4 (1998): 1-4.
Huston, Janis L. "Telemedical Records: The Weak Link in Telemedicine." Journal of the American Health Information Management Association 67 no.6 (1996): 69-71.
Huston, Janis L. "The Need for Mandatory Clinical Recording Standards." Clinical Medicine (Journal of the Royal College of Physicians) (in press).
Huston, Janis L. "The State of Patient Records in the UK: A View from Abroad." Biomedical Informatics Today no. 34 (2002): 4-5.
Huston, Janis L., and Georg A. Brox. "A Breach from Within: How Secure Are Your Electronic Telemedical Records?" Journal of Telemedicine and Telecare 7 (suppl 1) (2001): 79-80.
Huston, Janis L., and Georg A. Brox. "Professional Ethics at the Bottom Line." Health Care Manager (in press).
Huston, Janis L., and Timothy A. Smith. "Evaluating a Telemedicine Delivery System." Topics in Health Information Management 16 no.3 (1996): 65-71.
Huston, Terry L., and Janis L. Huston. "Security in the Management of Information Systems." Health Care Supervisor 16 no. 4 (1998): 28-34.
Huston, Terry L., and Janis L Huston. "Is Telemedicine a Practical Reality?" Communications of the Association for Computing Machinery (ACM) 43 no. 6 (2000): 91-95.
National Health Service. Building the Information Core: Implementing the NHS Plan. London: Department of Health, 2001.
National Health Service. The NHS Plan: A Summary. London: Department of Health, 2000.
National Health Service Executive. Information for Health: An Information Strategy for the Modern NHS 1998-2005. United Kingdom: National Health Service, 1998.
National Health Service Executive. Working Together with Health Information: A Partnership Strategy for Education, Training and Development. United Kingdom: National Health Service, 1999.
National Health Service Information Authority. Data Accreditation for Acute Providers. Loughborough, Leicestershire: NHS Information Authority, 2000.
Wanless, Derek. Wanless Report: Securing Our Future Health Taking a Long Term View. London: HM Treasury, 2002.
Source: 2004 IFHRO Congress & AHIMA Convention Proceedings, October 2004