by Jessica Ryan, MA; Karen Patena, MBA, RHIA, FAHIMA; Wallace Judd, PhD; and Mike Niederpruem, MS, MA, CAE
Abstract
As the health information management (HIM) profession continues to expand and become more specialized, there is an ever-increasing need to identify emerging HIM workforce roles that require a codified level of proficiency and professional standards. The Commission on Certification for Health Informatics and Information Management (CCHIIM) explored one such role—clinical documentation improvement (CDI) practitioner—to define the tasks and responsibilities of the job as well as the knowledge required to perform them effectively. Subject-matter experts (SMEs) defined the CDI specialty by following best practices for job analysis methodology. A random sample of 4,923 CDI-related professionals was surveyed regarding the tasks and knowledge required for the job. The survey data were used to create a weighted blueprint of the six major domains that make up the CDI practitioner role, which later formed the foundation for the clinical documentation improvement practitioner (CDIP) credential. As a result, healthcare organizations can be assured that their certified documentation improvement practitioners have demonstrated excellence in clinical care, treatment, coding guidelines, and reimbursement methodologies.
Keywords: job analysis, survey, clinical documentation improvement (CDI), documentation, Commission on Certification for Health Informatics and Information Management (CCHIIM), credential, exam, health information management (HIM) job roles
Introduction
As the health information management (HIM) profession continues to expand and become more specialized, there is an ever-increasing need to identify emerging HIM workforce roles that require a codified level of proficiency and professional standards. These evolving roles often advance into specialty areas or concentrations within the larger HIM industry and morph into in-demand positions with specialized competencies. The Commission on Certification for Health Informatics and Information Management (CCHIIM) explored one such role—clinical documentation improvement (CDI) practitioner—to define the tasks and responsibilities that the job comprises as well as the knowledge required to perform them effectively. An in-depth job analysis was conducted to codify the role, which later formed the foundation for developing the clinical documentation improvement practitioner (CDIP) credential. As a result, healthcare organizations can now have the confidence that their certified documentation improvement practitioners have demonstrated excellence in clinical care, treatment, coding guidelines, and reimbursement methodologies.
Background
Emerging professions or job roles bring an exciting air of possibility and uncertainty. Professional regulation, standards, and universal competency levels for these new roles are often ambiguous at best, leaving employers and job incumbents alike searching for a legitimate measure of job competence. A job analysis is the best tool to fully study and delineate these new workforce roles. The job analysis can later be used to form the foundation for a certification examination designed to assess the competency level of those interested in pursuing this role.
A job analysis (also known as a practice analysis, job/task analysis, or role delineation study) is conducted to determine the relevant tasks and knowledge, skills, and abilities (KSAs) needed to competently perform those tasks for a particular role. The main goal of a job analysis is to clearly and concisely define, through subject-matter expert (SME) validation, what professionals in that role do on the job.1,2 The job analysis is an essential method for demonstrating the job relatedness of certification examination content, as the empirical study of a workforce role provides a linkage between job-related data and exam content.3 The importance of job analyses is further outlined through National Commission for Certifying Agencies (NCCA) and American National Standards Institute (ANSI) standards and guidelines. NCCA Standard 11 states: “The certification program must employ assessment instruments that are derived from the job/practice analysis and that are consistent with generally accepted psychometric principles.”4 The ANSI standard ANSI/ISO/IEC 17024:2003 further notes that a properly executed job analysis forms the basis of a valid, reliable, and fair assessment that reflects the KSAs required for competent job performance.5
A sound, comprehensive job analysis is integral to the legal defensibility of a credentialing exam, as the content domains and knowledge topics tested must be clearly linked to job-related performance criteria, resulting in content validity.6 Job analyses are often used as evidence of content validation during high-stakes examination legal challenges. Standard 14.14 of the Standards for Educational and Psychological Testing notes: “The content domain to be covered by a credentialing test should be defined clearly and justified in terms of the importance of the content for credential-worthy performance in an occupation or profession. A rationale should be provided to support a claim that the knowledge or skills being assessed are required for credential-worthy performance in an occupation and are consistent with the purpose for which the licensing or certification program was instituted.”7 In addition, the following criteria must be met in order for a job analysis to produce a content-valid examination:
- The exam domains, or main subject matter areas, must be accurately weighted to reflect their relative importance on the job;
- The difficulty level should match minimal competence for the credential; and
- The job analysis should cover the full range of tasks performed in that role.8
CCHIIM conducts routine environmental scans to monitor any changes or growth opportunities in the health information and informatics workforce that affect the profession, and as a result, the commission decided to conduct a CDI practitioner job analysis. Numerous industry trends, such as the increased adoption of electronic health records (EHRs), an increase in health insurance fraud, and the need for complete and accurate documentation to support the requirements of the International Classification of Diseases, Tenth Revision, Clinical Modification (ICD-10-CM), all suggest the need for a highly qualified, specialized set of documentation improvement specialists who meet stringent professional guidelines.9 Additionally, the general emphasis on revenue cycle processes, regulatory requirements, and continuous quality improvement converges to necessitate this type of credential. Because clinical documentation specialists have expertise in clinical care, coding guidelines, and reimbursement methodologies, a nationally recognized CDI-related credential would distinguish those practitioners as competent to provide direction relative to clinical documentation in the patient's health record, thus promoting the HIM profession overall.
To explore the business need for and feasibility of developing a new CDI credential, CCHIIM conducted a thorough needs analysis and idea brief outlining the business impact, strategic context (including industry trends and member/customer needs), value proposition, and sustainability of the proposed exam. The commission concluded that the exam would be a natural extension of American Health Information Management Association (AHIMA) offerings that already support clinical documentation improvement, including CDI practice briefs, a CDI tool kit for healthcare organizations and professionals, a practice community, related educational resources, and a resolution on quality data and documentation in EHRs approved in 2007 by the AHIMA House of Delegates.10-12 Additionally, creating a salient credential to validate the clinical documentation role was found to be both reactive and forward-thinking: it would respond to market demand from clinical documentation specialists already working in the HIM continuum while also providing an opportunity to expand the field and welcome complementary healthcare professionals to the HIM arena. This research served to solidify the general scope of a CDI-related credential and to justify further exploration of developing the exam.
Methods
A task force composed of 19 CDI SMEs met for two days in May 2011 to create a job analysis survey to be sent to CDI industry practitioners. The SMEs on the task force were selected based on their clinical documentation expertise, as all were currently working in roles focused on clinical documentation improvement, education, and/or medical coding quality. A mix of SMEs, as reflected in Table 1, was chosen to reflect diversity in work setting, geographical location, supervisory level, and gender in order to obtain a representative sample of the specialty as a whole.
Table 1: Job Analysis Task Force Demographics
| Characteristic | Category | % (N) |
|---|---|---|
| Gender | Male | 10.5% (2) |
| | Female | 89.5% (17) |
| Geographic location | Northeast | 15.8% (3) |
| | South | 52.6% (10) |
| | Midwest | 10.5% (2) |
| | West | 21.1% (4) |
| | Pacific (Alaska and Hawaii) | 0% (0) |
| Work setting | Hospital/health system | 84.2% (16) |
| | Consulting firm | 5.3% (1) |
| | Information technology (IT) vendor | 5.3% (1) |
| | Government agency | 5.3% (1) |
| Supervisory level | Specialist | 47.4% (9) |
| | Consultant | 10.5% (2) |
| | Manager | 26.3% (5) |
| | Director/senior director | 15.8% (3) |
The job analysis task force was charged with developing a comprehensive list of knowledge and task statements required of the CDI practitioner role. Additionally, the group had to define the major domains (also known as topics or content areas) that represent the primary job responsibilities or facets of the job. The group determined that the knowledge and task statements would each be mapped to one of the six domains represented in Table 2.
Table 2: CDI Content Domains
| No. | Domain |
|---|---|
| 1 | Clinical & Coding Practice |
| 2 | Leadership |
| 3 | Record Review & Document Clarification |
| 4 | CDI Metrics & Statistics |
| 5 | Research & Education |
| 6 | Compliance |
To help define the scope of the related credential, the task force used an initial list of knowledge and task topics prepared in advance by AHIMA staff together with a small team of experienced CDI specialists. The task force then refined this task and knowledge list and supplemented it with their own insights based on their shared experience on the job. Additionally, the group developed “future topics” to identify potential developmental areas and predicted future job requirements for the CDI field as it continues to evolve. These included tasks that CDI practitioners may not be presently engaged in but will likely be asked to perform in the future, and knowledge areas that CDI practitioners will likely need to learn for the future.
These knowledge areas, tasks, and future topics were used to create the job analysis validation survey. In addition to defining the role in terms of the required knowledge and tasks performed on the job both currently and in the future, the task force also created survey scales regarding frequency and importance (listed in Table 3 and Table 4) to be used in the job analysis survey. A discrete, five-point Likert scale was selected to evaluate frequency, with possible response choices of “Never” (1), “Quarterly” (2), “Monthly” (3), “Weekly” (4), and “Daily” (5). A discrete, three-point Likert scale was used for the importance ratings, with the possible responses of “Not Important” (1), “Somewhat Important” (2), and “Very Important” (3). The task force members selected these rating scales because they felt that they best approximated the rate of occurrence and general importance levels relative to the job.
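For illustration only, the two rating scales described above can be written as simple lookup tables. The labels and point values come directly from the survey design; the variable names and the use of Python are illustrative additions rather than part of the original study materials.

```python
# Rating scales used in the job analysis survey, as described above.
FREQUENCY_SCALE = {"Never": 1, "Quarterly": 2, "Monthly": 3, "Weekly": 4, "Daily": 5}
IMPORTANCE_SCALE = {"Not Important": 1, "Somewhat Important": 2, "Very Important": 3}
```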
Table 3: Knowledge and Task Survey Questions

Table 4: Future Knowledge and Task Survey Questions

To identify the appropriate group of practitioners to survey, a random sample was drawn from a targeted sector of the AHIMA membership database. To meet the criteria for inclusion in the survey, individuals had to work in one of four roles, practice in one of three clinical settings, and hold at least one of three credentials, as shown in Table 5. At the time, 12,914 individuals in the AHIMA membership database met those requirements. A random number generator assigned each member a number from 1 to 12,914, with replacement; the 4,923 members who received numbers below 5,000 were included in the survey. Based on the criteria of clinical setting, supervisory level, RHIA and RHIT certification, CCS certification, and RN registration, the sample selected was within 1.5 percent of the distribution of members for each criterion.
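The selection procedure can be sketched as follows. This is a minimal illustration of the approach described above, not the commission's actual script; the seed and variable names are assumptions made for the sketch.

```python
import random

random.seed(2011)  # illustrative seed so the sketch is reproducible; not from the study

POPULATION_SIZE = 12_914  # AHIMA members meeting the inclusion criteria
CUTOFF = 5_000            # members whose assigned number fell below this were surveyed

# Assign each qualifying member a random number from 1 to 12,914 (with replacement,
# as described above), then keep the members whose number falls below the cutoff.
assigned = {member_id: random.randint(1, POPULATION_SIZE)
            for member_id in range(1, POPULATION_SIZE + 1)}
sample = [member_id for member_id, number in assigned.items() if number < CUTOFF]

print(f"Sample size: {len(sample)}")  # the study's actual draw yielded 4,923
```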
Table 5: Survey Demographic Variables
| Role | Setting | Credentials |
|---|---|---|
| HIM technician | Acute care | RN |
| Director | Integrated healthcare | RHIA |
| Manager | Long-term care | RHIT |
| Clinician | | CCS |

Note: RN, registered nurse; RHIA, registered health information administrator; RHIT, registered health information technician; CCS, certified coding specialist.
Survey invitations were e-mailed to the 4,923 potential respondents on Friday, June 24, 2011, and the survey closed at midnight on Tuesday, July 12, 2011. The response rate was 14.7 percent, with 733 respondents completing the survey and demographic questions. The sampling error was +/- 1.1 percent at the 95 percent confidence level.
In July 2011, the job analysis task force reconvened to review the survey results. The original weightings in the preliminary exam blueprint were compared with the weights derived from the job analysis validation survey, and to reconcile the two, the task force voted on target weights for each content area within the knowledge and task domains. The percentage weighting of each domain was based on the aggregate importance and frequency ratings given to that domain: domains containing tasks and knowledge statements rated as more important or more frequently performed received higher percentage weights.
For each of the target weights, a range of +/- 2 percent was calculated to create the maximum and minimum percentages for each domain. These maximum and minimum percentage weightings became the weightings for the final exam blueprint and determined the total number of test items included in each domain. A percentage range, as opposed to an absolute percentage, was created to allow for variance between preliminary blueprint expectations and survey responses, serving as a buffer for the margin of error. Additionally, the maximum and minimum domain percentages allowed for some leeway to slightly adjust weightings by topic area as necessary based on industry changes.
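As a simple illustration of this buffer, the minimum and maximum weightings follow directly from each target weight. The helper below is hypothetical (not from the study) and assumes the symmetric two-percentage-point band described above.

```python
def blueprint_range(target_pct: float, buffer_pct: float = 2) -> tuple[float, float]:
    """Return the (minimum, maximum) exam-blueprint weighting band for a domain."""
    return target_pct - buffer_pct, target_pct + buffer_pct

# Example: the Record Review & Document Clarification task domain carries a 26% target,
# which yields the 24%-28% band shown in Table 6.
print(blueprint_range(26))  # (24.0, 28.0)
```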
Results
The final domain weightings, including the maximum and minimum percentage ranges, preliminary weightings, survey weightings, and target weightings, are shown in Table 6. The target weighting was determined by the task force after comparing the survey data with the original preliminary blueprint.
Table 6: CDI Exam Blueprint, Including Presurvey (Original), Postsurvey, and Target Percentage Weightings by Domain

Task domains

| No. | Domain | Original Weighting | Survey Weighting | Target Weighting | Max | Min |
|---|---|---|---|---|---|---|
| 1 | Clinical & Coding Practice | 26% | 23.2% | 24% | 26% | 22% |
| 2 | Leadership | 15% | 14.1% | 15% | 17% | 13% |
| 3 | Record Review & Document Clarification | 26% | 25.8% | 26% | 28% | 24% |
| 4 | CDI Metrics & Statistics | 15% | 19.1% | 16% | 18% | 14% |
| 5 | Research & Education | 9% | 12.4% | 13% | 15% | 11% |
| 6 | Compliance | 9% | 5.4% | 6% | 8% | 4% |
| | Total | 99% | 100.0% | 100% | 112% | 88% |

Knowledge domains

| No. | Domain | Original Weighting | Survey Weighting | Target Weighting | Max | Min |
|---|---|---|---|---|---|---|
| 1 | Clinical & Coding Practice | 21% | 33.4% | 28% | 30% | 26% |
| 2 | Leadership | 13% | 16.0% | 16% | 18% | 14% |
| 3 | Record Review & Document Clarification | 22% | 14.0% | 21% | 23% | 19% |
| 4 | CDI Metrics & Statistics | 12% | 9.9% | 12% | 14% | 10% |
| 5 | Research & Education | 20% | 16.5% | 14% | 16% | 12% |
| 6 | Compliance | 13% | 10.3% | 10% | 12% | 8% |
| | Total | 101% | 100.0% | 100% | 112% | 88% |
Table 7 and Table 8 present the frequency and importance survey ratings for each task and knowledge statement, ranked from highest to lowest within each domain. The weighted average of each task and knowledge rating was calculated from the aggregate survey responses. Because the frequency ratings used a scale of 1 to 5 and the importance ratings used a scale of 1 to 3, the importance rating was multiplied by a scaling factor of 1.667 so that it carried the same weight as the frequency rating. These corrected mean frequency and importance ratings were used to rank the tasks and knowledge statements within their domains and to calculate the weight for each domain.
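The rating correction can be sketched as follows. This reflects one plausible reading of the calculation described above, in which the combined value is the simple mean of the mean frequency rating and the rescaled mean importance rating; the function and example values are illustrative, not the study's actual computation.

```python
SCALING_FACTOR = 5 / 3  # ~1.667: maps the 1-3 importance scale onto the 1-5 frequency scale

def combined_rating(mean_frequency: float, mean_importance: float) -> float:
    """Combine a statement's mean frequency and rescaled mean importance into one ranking value."""
    return (mean_frequency + SCALING_FACTOR * mean_importance) / 2

# Hypothetical example: a task with a mean frequency of 3.1 and a mean importance of 2.2.
print(round(combined_rating(3.1, 2.2), 3))  # 3.383
```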
Table 7: Task Frequency and Importance Average Weightings

| Domain | Task | Frequency and Importance Average |
|---|---|---|
| 1. Clinical & Coding Practice | Use reference resources for code assignment | 3.367 |
| 1 | Identify the principal and secondary diagnoses in order to accurately reflect the patient's hospital course | 3.361 |
| 1 | Use coding software | 3.326 |
| 1 | Assign and sequence ICD-9-CM codes | 3.313 |
| 1 | Use coding conventions | 3.230 |
| 1 | Display knowledge of payer requirements for appropriate code assignment (e.g., CMS, APR, APG) | 3.016 |
| 1 | Assign appropriate DRG codes | 2.824 |
| 1 | Communicate with the coding/HIM staff to resolve discrepancies between the working and final DRGs | 2.740 |
| 1 | Participate in educational sessions with staff to discuss infrequently encountered cases | 2.655 |
| 1 | Assign CPT and/or HCPCS codes | 2.630 |
| 1 | Communicate with coding/HIM staff to resolve discrepancies in documentation for CPT assignment | 2.563 |
| 2. Leadership | Maintain affiliation with professional organizations devoted to the accuracy of diagnosis coding and reporting | 2.876 |
| 2 | Promote CDI efforts throughout the organization | 2.692 |
| 2 | Foster working relationship with CDI team members for reconciliation of queries | 2.677 |
| 2 | Establish a chain of command for resolving unanswered queries | 2.662 |
| 2 | Develop documentation improvement projects | 2.480 |
| 2 | Collaborate with physician champions to promote CDI initiatives | 2.331 |
| 2 | Establish consequences for noncompliance to queries or lack of responses to queries in collaboration with providers | 2.297 |
| 2 | Develop CDI policies and procedures in accordance with AHIMA practice briefs | 2.085 |
| 3. Record Review & Document Clarification | Identify opportunities for documentation improvement by ensuring that diagnoses and procedures are documented to the highest level of specificity | 3.200 |
| 3 | Query providers in an ethical manner to avoid potential fraud and/or compliance issues | 3.072 |
| 3 | Formulate queries to providers to clarify conflicting diagnoses | 2.945 |
| 3 | Ensure provider query response is documented in the medical record | 2.929 |
| 3 | Formulate queries to providers to clarify the clinical significance of abnormal findings identified in the record | 2.896 |
| 3 | Track responses to queries and interact with providers to obtain query responses | 2.785 |
| 3 | Interact with providers to clarify POA | 2.567 |
| 3 | Identify postdischarge query opportunities that will affect SOI, ROM, and ultimately case weight | 2.561 |
| 3 | Collaborate with the case management and utilization review staff to effect change in documentation | 2.525 |
| 3 | Interact with providers to clarify HAC | 2.327 |
| 3 | Interact with providers to clarify the documentation of core measures | 2.287 |
| 3 | Interact with providers to clarify PSI | 2.260 |
| 3 | Determine facility requirements for documentation of query responses in the record to establish official policy and procedures related to CDI query activities | 2.154 |
| 3 | Develop policies regarding various stages of the query process and time frames to avoid compliance risk | 2.113 |
| 4. CDI Metrics & Statistics | Track denials and documentation practices to avoid future denials | 2.276 |
| 4 | Trend and track physician query response | 2.270 |
| 4 | Track working DRG (CDS) and coder final code | 2.265 |
| 4 | Perform quality audits of CDI content to ensure compliance with institutional policies and procedures or national guidelines | 2.232 |
| 4 | Trend and track physician query content | 2.214 |
| 4 | Trend and track physician and query provider | 2.181 |
| 4 | Trend and track physician query volume | 2.115 |
| 4 | Measure the success of the CDI program through dashboard metrics | 1.969 |
| 4 | Track data for physician benchmarking and trending | 1.964 |
| 4 | Compare institution with external institutional benchmarks | 1.948 |
| 4 | Track data for CDI benchmarking and trending | 1.945 |
| 4 | Track data for specialty benchmarking and trending | 1.901 |
| 4 | Use CDI data to adjust departmental workflow | 1.880 |
| 5. Research & Education | Articulate the implications of accurate coding | 3.106 |
| 5 | Educate providers and other members of the healthcare team about the importance of the documentation improvement program and the need to assign diagnoses and procedures, when indicated, to their highest level of specificity | 2.625 |
| 5 | Articulate the implications of accurate coding with respect to research, public health reporting, case management, and reimbursement | 2.582 |
| 5 | Monitor changes in the external regulatory environment in order to maintain compliance with all applicable agencies | 2.535 |
| 5 | Educate the appropriate staff on the clinical documentation improvement program including accurate and ethical documentation practices | 2.441 |
| 5 | Develop educational materials to facilitate documentation that supports severity of illness, risk of mortality, and utilization of resources | 2.174 |
| 5 | Research and adapt successful best practices within the CDI specialty that could be utilized at one's own organization | 2.102 |
| 6. Compliance | Apply AHIMA best practices related to CDI activities | 2.720 |
| 6 | Apply regulations pertaining to CDI activities | 2.651 |
| 6 | Consult with compliance and HIM departments regarding legal issues surrounding CDI efforts | 2.278 |

Note: ICD-9-CM, International Classification of Diseases, Ninth Revision, Clinical Modification; CMS, Centers for Medicare and Medicaid Services; APR, All Patient Refined; APG, Ambulatory Patient Groups; DRG, diagnosis-related group; HIM, health information management; CPT, Current Procedural Terminology; HCPCS, Healthcare Common Procedure Coding System; CDI, clinical documentation improvement; AHIMA, American Health Information Management Association; POA, present on admission; SOI, severity of illness; ROM, risk of mortality; HAC, hospital-acquired condition; PSI, patient safety indicator; CDS, clinical documentation specialist.
Table 8: Knowledge Frequency and Importance Average Weightings

| Domain | Knowledge Item | Frequency and Importance Average |
|---|---|---|
| 1. Clinical & Coding Practice | Medical terminology and anatomy and physiology | 4.89 |
| 1 | Diagnostic, laboratory, and surgical procedures | 4.79 |
| 1 | Pathophysiology and disease processes and treatment | 4.67 |
| 1 | Definitions of principal and secondary diagnoses | 4.57 |
| 1 | Pharmacology | 4.51 |
| 1 | Complex clinical documentation | 4.49 |
| 1 | Encoder software, DRG grouper, and coding manuals | 4.48 |
| 1 | Assigning ICD-9-CM coding | 4.45 |
| 1 | Procedural techniques | 4.45 |
| 1 | Coding references | 4.41 |
| 1 | Definition of CCs, MCCs | 4.23 |
| 1 | DRG reimbursement methodologies | 3.79 |
| 1 | Assigning CPT coding | 3.53 |
| 2. Leadership | Effective communication skills | 4.87 |
| 2 | AHIMA Practice Briefs | 3.97 |
| 2 | Professional organizations available for resource | 3.87 |
| 2 | Conflict resolution | 3.86 |
| 2 | Presentation skills | 3.73 |
| 2 | Performance audits | 3.68 |
| 2 | Interpretation of statistical reports | 3.48 |
| 3. Record Review & Document Clarification | Medical record structure | 4.55 |
| 3 | Best practices for clinical documentation | 4.35 |
| 3 | Best practices for data integrity | 4.18 |
| 3 | AHIMA and compliance standards related to query process | 4.02 |
| 3 | Core measures | 3.53 |
| 3 | National patient safety indicators | 3.33 |
| 4. CDI Metrics & Statistics | Effective reporting and communication techniques | 4.15 |
| 4 | Presentation and spreadsheet software knowledge | 3.41 |
| 4 | Statistical reports | 3.38 |
| 4 | Development of statistical graphs and reports | 3.10 |
| 4 | CDI benchmark metrics | 2.87 |
| 5. Research & Education | Communication skills | 4.82 |
| 5 | Writing skills | 4.53 |
| 5 | Web navigational skills | 4.46 |
| 5 | Coding Clinics and other reference resources | 4.30 |
| 5 | Variety of uses of clinical data within an organization | 3.99 |
| 5 | CDI trends and best practices | 3.38 |
| 5 | Effective presentation techniques for behavior modification | 2.80 |
| 6. Compliance | Privacy concepts | 4.82 |
| 6 | Security concepts | 4.72 |
| 6 | Fraud and abuse regulations | 4.26 |
| 6 | Key components of data record exchange | 3.81 |

Note: DRG, diagnosis-related group; ICD-9-CM, International Classification of Diseases, Ninth Revision, Clinical Modification; CC, complication or comorbidity; MCC, major complication or comorbidity; CPT, Current Procedural Terminology; AHIMA, American Health Information Management Association; CDI, clinical documentation improvement.
The Record Review & Document Clarification and Clinical & Coding Practice domains received the highest target weightings on the exam blueprint (26 percent and 24 percent respectively) because they had the greatest number of task or knowledge items that also had the highest frequency and importance weightings based on the survey responses. Because these areas make up the greatest proportion of the work done on the job and the knowledge required to complete those tasks, they form the largest proportion of the exam. Conversely, the Compliance domain has the smallest overall target weighting on the exam blueprint (6 percent) because it had fewer task or knowledge items, which also had the lowest frequency and importance ratings.
Table 9 and Table 10 present the survey ratings for the "future" task and knowledge topics included in the survey. The data show that the majority of survey respondents felt that all of the future knowledge topics would be needed in the short term (within six months to one year), with the knowledge areas related to electronic health records (EHRs) rated most highly. The future task topic data show how many respondents were already performing each task, how frequently they performed it, and how important they rated it. Those who indicated that they did not currently perform a task were asked when they expected themselves or their organization to begin performing it. Respondents were also asked to indicate the domain to which they felt each future task belonged. The data show that 10 to 40 percent of respondents were already performing one or more of the future tasks, while the majority of those not performing a given task indicated that they would either begin within the next six months to one year or never perform it.
Table 9: Future Task Survey Ratings

1. Do you create data definitions for your organization?
- Domain placement (responses): Coding 202; Leadership 174; Record Review 182; Metrics 107; Education 67; Compliance 0
- Currently perform this task: Yes 30%; 168 yes, 564 no
- Frequency: Never 83; Quarterly 59; Monthly 11; Weekly 15; Daily 0
- Importance: Not Important 13; Somewhat Important 48; Very Important 107
- Expected to perform: 6 months to 1 year 143; next 1-2 years 66; next 2-4 years 10; next 4+ years 2; never 9; unable to determine 334

2. Are you involved in EHR content design?
- Domain placement (responses): Coding 158; Leadership 251; Record Review 231; Metrics 20; Education 73; Compliance 0
- Currently perform this task: Yes 30%; 167 yes, 566 no
- Frequency: Never 46; Quarterly 50; Monthly 42; Weekly 29; Daily 0
- Importance: Not Important 5; Somewhat Important 31; Very Important 131
- Expected to perform: 6 months to 1 year 235; next 1-2 years 68; next 2-4 years 17; next 4+ years 3; never 28; unable to determine 214

3. Are you involved in EHR and documentation improvement workflow and gap analysis?
- Domain placement (responses): Coding 140; Leadership 228; Record Review 215; Metrics 94; Education 55; Compliance 0
- Currently perform this task: Yes 24%; 140 yes, 592 no
- Frequency: Never 25; Quarterly 57; Monthly 37; Weekly 21; Daily 0
- Importance: Not Important 2; Somewhat Important 32; Very Important 106
- Expected to perform: 6 months to 1 year 204; next 1-2 years 74; next 2-4 years 13; next 4+ years 5; never 16; unable to determine 280

4. Do you help define what data is included or excluded from the EHR?
- Domain placement (responses): Coding 143; Leadership 247; Record Review 252; Metrics 36; Education 55; Compliance 0
- Currently perform this task: Yes 26%; 153 yes, 580 no
- Frequency: Never 37; Quarterly 53; Monthly 33; Weekly 30; Daily 0
- Importance: Not Important 3; Somewhat Important 28; Very Important 122
- Expected to perform: 6 months to 1 year 219; next 1-2 years 74; next 2-4 years 11; next 4+ years 4; never 15; unable to determine 257

5. Do you evaluate usability of data in the EHR?
- Domain placement (responses): Coding 163; Leadership 235; Record Review 228; Metrics 49; Education 58; Compliance 0
- Currently perform this task: Yes 28%; 162 yes, 571 no
- Frequency: Never 33; Quarterly 47; Monthly 30; Weekly 52; Daily 0
- Importance: Not Important 3; Somewhat Important 29; Very Important 130
- Expected to perform: 6 months to 1 year 210; next 1-2 years 78; next 2-4 years 14; next 4+ years 2; never 10; unable to determine 257

6. Do you design EHR alerts, reminders, clinical decision support to support documentation improvement?
- Domain placement (responses): Coding 156; Leadership 231; Record Review 231; Metrics 47; Education 68; Compliance 0
- Currently perform this task: Yes 21%; 126 yes, 607 no
- Frequency: Never 43; Quarterly 42; Monthly 19; Weekly 22; Daily 0
- Importance: Not Important 7; Somewhat Important 30; Very Important 89
- Expected to perform: 6 months to 1 year 231; next 1-2 years 82; next 2-4 years 13; next 4+ years 0; never 17; unable to determine 264

7. Do you educate others in the proficient use of the EHR?
- Domain placement (responses): Coding 140; Leadership 264; Record Review 137; Metrics 15; Education 177; Compliance 0
- Currently perform this task: Yes 28%; 162 yes, 571 no
- Frequency: Never 33; Quarterly 45; Monthly 39; Weekly 45; Daily 0
- Importance: Not Important 3; Somewhat Important 30; Very Important 129
- Expected to perform: 6 months to 1 year 259; next 1-2 years 72; next 2-4 years 10; next 4+ years 4; never 14; unable to determine 212

8. Do you provide feedback on EHR systems usability to physicians and other clinicians?
- Domain placement (responses): Coding 123; Leadership 294; Record Review 138; Metrics 27; Education 151; Compliance 0
- Currently perform this task: Yes 30%; 168 yes, 565 no
- Frequency: Never 47; Quarterly 47; Monthly 39; Weekly 35; Daily 0
- Importance: Not Important 7; Somewhat Important 39; Very Important 122
- Expected to perform: 6 months to 1 year 201; next 1-2 years 87; next 2-4 years 12; next 4+ years 2; never 13; unable to determine 249

9. Are you involved in implementing care protocols?
- Domain placement (responses): Coding 176; Leadership 307; Record Review 102; Metrics 36; Education 111; Compliance 0
- Currently perform this task: Yes 17%; 106 yes, 626 no
- Frequency: Never 66; Quarterly 21; Monthly 6; Weekly 13; Daily 0
- Importance: Not Important 15; Somewhat Important 24; Very Important 67
- Expected to perform: 6 months to 1 year 223; next 1-2 years 65; next 2-4 years 12; next 4+ years 4; never 15; unable to determine 307

10. Do you create continuum of care documents?
- Domain placement (responses): Coding 168; Leadership 259; Record Review 185; Metrics 28; Education 93; Compliance 0
- Currently perform this task: Yes 11%; 75 yes, 658 no
- Frequency: Never 28; Quarterly 26; Monthly 5; Weekly 16; Daily 0
- Importance: Not Important 3; Somewhat Important 18; Very Important 54
- Expected to perform: 6 months to 1 year 219; next 1-2 years 77; next 2-4 years 16; next 4+ years 3; never 17; unable to determine 326

11. Do you compile disparate data into understandable summary form?
- Domain placement (responses): Coding 111; Leadership 223; Record Review 166; Metrics 148; Education 85; Compliance 0
- Currently perform this task: Yes 13%; 82 yes, 651 no
- Frequency: Never 31; Quarterly 23; Monthly 12; Weekly 16; Daily 0
- Importance: Not Important 2; Somewhat Important 24; Very Important 56
- Expected to perform: 6 months to 1 year 168; next 1-2 years 85; next 2-4 years 16; next 4+ years 4; never 17; unable to determine 361

12. Are you involved in implementing critical paths or evidence-based medicine?
- Domain placement (responses): Coding 158; Leadership 245; Record Review 111; Metrics 66; Education 153; Compliance 0
- Currently perform this task: Yes 10%; 66 yes, 667 no
- Frequency: Never 26; Quarterly 20; Monthly 10; Weekly 10; Daily 0
- Importance: Not Important 7; Somewhat Important 17; Very Important 42
- Expected to perform: 6 months to 1 year 202; next 1-2 years 76; next 2-4 years 21; next 4+ years 3; never 26; unable to determine 339

13. Are you involved in the integration of data from external sources into the medical record?
- Domain placement (responses): Coding 135; Leadership 214; Record Review 250; Metrics 43; Education 91; Compliance 0
- Currently perform this task: Yes 17%; 105 yes, 628 no
- Frequency: Never 17; Quarterly 27; Monthly 24; Weekly 37; Daily 0
- Importance: Not Important 0; Somewhat Important 28; Very Important 77
- Expected to perform: 6 months to 1 year 197; next 1-2 years 84; next 2-4 years 16; next 4+ years 3; never 14; unable to determine 314

14. Do you help define sources of clinical data for quality measures and reporting?
- Domain placement (responses): Coding 139; Leadership 211; Record Review 162; Metrics 151; Education 70; Compliance 0
- Currently perform this task: Yes 22%; 130 yes, 603 no
- Frequency: Never 37; Quarterly 52; Monthly 11; Weekly 30; Daily 0
- Importance: Not Important 4; Somewhat Important 39; Very Important 87
- Expected to perform: 6 months to 1 year 213; next 1-2 years 68; next 2-4 years 17; next 4+ years 1; never 12; unable to determine 292

15. Do you review and recommend revisions to computer-assisted coding?
- Domain placement (responses): Coding 377; Leadership 170; Record Review 92; Metrics 43; Education 51; Compliance 0
- Currently perform this task: Yes 18%; 110 yes, 623 no
- Frequency: Never 49; Quarterly 23; Monthly 9; Weekly 29; Daily 0
- Importance: Not Important 8; Somewhat Important 25; Very Important 77
- Expected to perform: 6 months to 1 year 159; next 1-2 years 92; next 2-4 years 33; next 4+ years 6; never 37; unable to determine 296

16. Do you communicate HIM principles and expertise in regards to clinical data content and integrity to clinicians?
- Domain placement (responses): Coding 362; Leadership 171; Record Review 123; Metrics 20; Education 57; Compliance 0
- Currently perform this task: Yes 40%; 211 yes, 522 no
- Frequency: Never 35; Quarterly 61; Monthly 41; Weekly 73; Daily 0
- Importance: Not Important 2; Somewhat Important 25; Very Important 183
- Expected to perform: 6 months to 1 year 174; next 1-2 years 62; next 2-4 years 7; next 4+ years 1; never 13; unable to determine 265
Table 10: Future Knowledge Topic Survey Ratings

| Future Knowledge Topic | 6 months to 1 year | 1-2 years | 2-4 years | 4+ years | Never | Unable to determine |
|---|---|---|---|---|---|---|
| Navigation of electronic health records (EHRs) | 482 | 81 | 26 | 4 | 29 | 109 |
| EHR reporting metrics, standards and criteria | 427 | 75 | 20 | 3 | 44 | 162 |
| EHR design for patient safety | 424 | 96 | 20 | 4 | 38 | 149 |
| Principles of usability of EHRs | 422 | 93 | 23 | 6 | 40 | 147 |
| The legal health record | 401 | 102 | 27 | 8 | 45 | 148 |
| Meaningful use criteria | 385 | 114 | 28 | 7 | 36 | 161 |
| Quality measures | 360 | 114 | 25 | 4 | 33 | 195 |
| Automated data sources for quality measures | 357 | 109 | 21 | 4 | 47 | 193 |
| Computer-assisted coding application software | 340 | 114 | 16 | 6 | 42 | 213 |
| Resources to assist in data dictionary creation | 335 | 103 | 20 | 5 | 80 | 188 |
| Clinical data content design and construction | 333 | 108 | 21 | 6 | 57 | 206 |
| Best practices for data integrity automation | 326 | 122 | 29 | 8 | 75 | 171 |
| Best practices for clinical documentation automation | 305 | 116 | 19 | 4 | 58 | 229 |
| Sources of data for clinical quality measures | 284 | 111 | 27 | 6 | 67 | 236 |
| Continuity of Care documents | 282 | 114 | 24 | 10 | 71 | 230 |
| Process flow mapping and workflow analytics | 273 | 108 | 22 | 5 | 77 | 246 |
| Principles of change management | 270 | 105 | 25 | 5 | 86 | 240 |
Finally, the survey respondent demographics are presented in Tables 11-20. Respondents' geographic area, work setting, practice setting, facility size, type of health record system, employee status, department, job title, and age were all captured to ascertain the representativeness of the sample. All demographic characteristics were appropriately distributed, as they closely matched the population's demographic profile.
Table 11: Geographic Area of Survey Respondents

| Geographic Area | % |
|---|---|
| Mid-Atlantic | 17.6% |
| South | 25.1% |
| Midwest | 36.5% |
| Southwest | 6.9% |
| West Coast | 13.8% |
| Total | 100.0% |

Table 12: Survey Respondents' Location of Employment

| Location of Employment | % |
|---|---|
| Urban | 52.8% |
| Rural | 34.3% |
| Academic | 12.9% |
| Total | 100.0% |

Table 13: Work Setting of Survey Respondents

| Work Setting | % |
|---|---|
| Inpatient | 36.4% |
| Outpatient | 8.0% |
| Both | 55.6% |
| Total | 100.0% |

Table 14: Survey Respondents' Practice Setting

| Practice Setting | % |
|---|---|
| Acute Care | 83.7% |
| Integrated Healthcare | 10.7% |
| Long-Term Care | 5.6% |
| Total | 100.0% |

Table 15: Survey Respondents' Facility Size

| Facility Size | % |
|---|---|
| <50 Beds | 15.1% |
| 50-100 Beds | 9.8% |
| 101-500 Beds | 51.2% |
| 501-1,000 Beds | 17.8% |
| >1,000 Beds | 6.0% |
| Total | 100.0% |

Table 16: Type of Health Record System Currently in Use at Respondents' Work Setting

| Record System | % |
|---|---|
| Electronic | 38.6% |
| Paper | 13.9% |
| Hybrid | 47.5% |
| Total | 100.0% |

Table 17: Survey Respondents' Employee Status

| Employee Status | % |
|---|---|
| Employee | 96.2% |
| Consultant | 3.8% |
| Total | 100.0% |

Table 18: Survey Respondents' Administrative Reporting Chain of Command

| Reporting Department | % |
|---|---|
| HIM | 76.9% |
| Quality | 6.0% |
| Other | 17.1% |
| Total | 100.0% |

Table 19: Survey Respondents' Job Titles

| Job Title | % |
|---|---|
| HIM Technician | 28.2% |
| Director | 15.0% |
| Manager | 19.3% |
| Clinician | 4.5% |
| Other | 33.1% |
| Total | 100.0% |

Table 20: Survey Respondents' Age

| Age | % |
|---|---|
| <25 years | 1.0% |
| 25-35 years | 16.1% |
| 36-45 years | 24.2% |
| 46-55 years | 35.5% |
| >55 years | 23.3% |
| Total | 100.0% |
Discussion
The opinions and experience of a representative sample of CDI specialists were obtained through the job analysis process to build a solid, legally defensible foundation for the CDIP credential based on job-related competency. This foundation takes shape in the exam blueprint, which outlines the main content domains tested on the exam; the weighting for each domain proportionately reflects the major components of the CDI practitioner job role. By following job analysis and test development best-practice methodology, CCHIIM was able to codify the clinical documentation improvement specialty by defining the critical factors of the job role and developing a standardized tool for evaluating CDI practice competency. This credential will strengthen the CDI role by instilling employer confidence in CDIP-credentialed individuals, who have met measured, defined, and validated professional standards.
Additionally, the job analysis will help provide direction for the specialty as it continues to grow. The job analysis included measurement of both current and future task and knowledge statements to track how the CDI practitioner role may evolve and what knowledge and abilities will be required of these workers as they grow in their roles. These “future” topics will be monitored and reevaluated in the next job analysis (typically conducted every three to five years, or sooner if the specialty undergoes an extreme transformation) to determine what adjustments should be made to the CDIP exam blueprint to best represent the profession.13
Numerous steps were taken to minimize job analysis survey bias. Survey incentives (the award of one continuing education unit [CEU] and entry into an American Express gift card drawing) were offered to limit nonresponse bias and increase the response rate, and e-mailed survey reminders were sent to reach as many respondents as possible. Undercoverage bias was addressed by ensuring that the demographic composition of the sample mirrored that of the population. The distribution of respondents meeting the parameters of the population (credentials, work setting, and job role) showed no significant demographic differences from the sample cohort as a whole. Therefore, neither undercoverage nor nonresponse bias was found to be a significant problem in the sample.
As Watzlaf, Rudman, Hart-Hester, and Ren noted in their 2009 article, the roles and job functions of HIM professionals are continuously changing and becoming more specialized.14 New specializations continue to emerge because of a variety of regulatory and environmental factors, and the new specializations in turn increase the need to certify individuals working in these nontraditional roles to ensure the integrity and quality of their work. HIM certification bodies must stay on top of these trends in order to provide meaningful professional guidelines and standards of excellence for these growing fields. As the CDI role and the entire HIM industry evolve, CCHIIM will continue to routinely examine job roles and functions and update the requisite body of knowledge and competency required for HIM excellence through job analyses and exam blueprint updates.
Limitations
While care was taken to ensure the representativeness of the sample and to obtain a satisfactory response rate, the study has some limitations. Because the population and resulting sample were drawn solely from the AHIMA membership database (owing to financial constraints and other factors), the survey results might have been strengthened by casting a wider net and surveying individuals who perform CDI work but are not AHIMA members.
Additionally, there is some debate about the use of five-point and three-point scales (as used for frequency and importance in this survey) versus four-point, forced-choice scales in survey research. Some argue for four-point rating scales because they eliminate the tendency toward the middle and force respondents to pick a side, unlike three- or five-point scales with a "neutral" midpoint. However, four-point scales can force respondents to answer in a way that does not truly reflect their opinions when they are genuinely neutral or middle-of-the-road on a given topic.15 Forcing respondents to give an untrue answer would unnecessarily skew results. These considerations led to the decision to use three- and five-point survey scales. Respondents were also given the opportunity to write in comments about their ratings or the survey questions for each domain.
Conclusion
To fill an industry need for a validated professional standard of CDI excellence, CCHIIM explored the possibility of creating a new CDI credential for this growing field. To do so, a job analysis was conducted to thoroughly yet concisely define the requisite tasks and knowledge areas of the CDI practitioner role. The job analysis data were used to develop the CDIP exam blueprint in accordance with test development best-practice methodology: domain weightings were determined on the basis of SME rankings of task and knowledge criticality and frequency. Because validated, job-specific content is the crux of the CDIP exam, those who list the CDIP credential after their name have demonstrated competency and expertise in the codified CDI body of knowledge. As a result, the HIM field as a whole is strengthened by a defined, measurable, and forward-looking standard of proficiency for ensuring the quality of patient health information.
Jessica Ryan, MA, is a learning specialist at the University of Chicago Medical Center in Chicago, IL.
Karen Patena, MBA, RHIA, FAHIMA, is a clinical associate professor and director of health information management programs at the University of Illinois at Chicago in Chicago, IL.
Wallace Judd, PhD, is a psychometrician at Authentic Testing, Inc., in Gaithersburg, MD.
Mike Niederpruem, MS, MA, CAE, is the director of education and research at the Dental Auxiliary Learning and Education Foundation in Chicago, IL.
Notes
1 Wang, N., D. Schnipke, and E. A. Witt. “Use of Knowledge, Skill, and Ability Statements in Developing Licensure and Certification Examinations.” Educational Measurement: Issues and Practice 24 (2005): 15–22. Available at https://eric.ed.gov/?id=EJ718248 (doi:10.1111/j.1745-3992.2005.00003.x).
2 Mehrens, William A., and W. James Popham. “How to Evaluate the Legal Defensibility of High-Stakes Tests.” Applied Measurement in Education 5, no. 3 (1992): 265–83.
3 Chinn, Roberta N., and Norman R. Hertz. Job Analysis: A Guide for Credentialing Organizations. Lexington, KY: Council on Licensure, Enforcement, and Regulation (CLEAR), 2010, p. 14.
4 National Commission for Certifying Agencies. Standards for the Accreditation of Certification Programs. Washington, DC: Institute for Credentialing Excellence, 2004.
5 American National Standards Institute. Guidance on Psychometric Requirements for ANSI Accreditation (Public Guidance No. PCAC-GI-502). 2009. https://www.ansica.org/wwwversion2/outside/ALLviewDoc.asp?dorID=62&menuID=2.
6 Mehrens, William A., and W. James Popham. “How to Evaluate the Legal Defensibility of High-Stakes Tests.”
7 American Educational Research Association, American Psychological Association, and National Council on Measurement in Education. Standards for Educational and Psychological Testing. Washington, DC: American Psychological Association, 1999, p. 161.
8 Chinn, Roberta N., and Norman R. Hertz. Job Analysis: A Guide for Credentialing Organizations, p. 14.
9 Rudman, William J., John S. Eberhardt III, William Pierce, and Susan Hart-Hester. “Healthcare Fraud and Abuse.” Perspectives in Health Information Management 6 (Fall 2009).
10 AHIMA. “Guidance for Clinical Documentation Improvement Programs.” Journal of AHIMA 81, no. 5 (May 2010): expanded web version.
11 AHIMA. “Clinical Documentation Improvement Toolkit.” 2010.
12 AHIMA Physician Practice Council. “Resolution on Quality Data and Documentation in the EHR.” FORE Library: HIM Body of Knowledge (2007).
13 Raymond, Mark R. “Job Analysis and the Specification of Content for Licensure and Certification Examinations.” Applied Measurement in Education 14, no. 4 (2001): 369–415. Available at doi:10.1207/S15324818AME1404_4.
14 Watzlaf, Valerie J. M., William J. Rudman, Susan Hart-Hester, and Ping Ren. “The Progression of the Roles and Functions of HIM Professionals: A Look into the Past, Present, and Future.” Perspectives in Health Information Management 6 (Summer 2009). Available at http://perspectives.ahima.org/the-progression-of-the-roles-and-functions-of-him-professionals-a-look-into-the-past-present-and-future/.
15 Young, Scott A., and Karen M. Barbera. “Content Essentials: A Primer in Survey Development.” Valtera, 2010. Available at http://info.valtera.com/bid/107384/Content-Essentials-A-Primer-in-Survey-Development.
Article citation:
Ryan, Jessica; Patena, Karen; Judd, Wallace; Niederpruem, Mike.
"Validating Competence: A New Credential for Clinical Documentation Improvement Practitioners"
Perspectives in Health Information Management
(Spring, April 2013).