Does an Education Intervention Improve Physician Signature Legibility? Pilot Study of a Prospective Chart Review

by James K. Glisson, MD, PharmD; Mary E. Morton, PhD, RHIA; Allyn H. Bond, MD; and Michael Griswold, PhD

Abstract

Illegible physician signatures in patient records can lead to inaccurate documentation, improper billing, and potential legal issues. Many studies in the current literature address legibility of prescriptions and medication orders; however, few focus on legibility of physicians’ signatures.

The purpose of this quality improvement survey was to evaluate physician signature legibility on patient charts at the University of Mississippi Medical Center (UMMC) Adult Internal Medicine Clinic. Effective July 1, 2009, UMMC entered into a collaboration with Jackson-Hinds Comprehensive Health Center (JHCHC), a federally qualified health center, and the clinic is now known as the Federally Qualified Health Center at the Jackson Medical Mall. In this pilot study, we examined clinic notes and billing sheets for legible physician signatures over a three-month period. Midway through the study, an intervention group was given name stamps and a standardized discussion of the importance of signature legibility and proper name stamp usage. Legibility of resident signatures in the intervention group increased from 26 percent to 60 percent, and legibility of attending signatures in the intervention group increased from 1.4 percent to 86 percent. These results suggest that resident education can significantly change practice behavior.

Key words: documentation/standards, education/medical/graduate, handwriting, legibility, medical records/standards, physician handwriting, quality assurance/healthcare, quality of care, resident education

Introduction

The patient record serves as a communication tool between caregivers, provides justification for reimbursement of services, and serves as a medicolegal document.1 Prior studies support the belief that physicians’ handwriting is often illegible.2-5 Lack of a legible physician signature can lead to inaccurate documentation, improper billing, potential legal issues, lost time and money, and frustration for members of the healthcare team.6-10 One academic medical center reported that over 61 percent of its support staff spent more than ten minutes clarifying illegible orders.11

Poor legibility can also lead to medical errors, a problem brought to light in the Institute of Medicine’s highly publicized report To Err Is Human: Building a Safer Health System.12 This heightened awareness triggered a number of initiatives focused on improving the safety of care delivered in the United States, including passage of the Patient Safety and Quality Improvement Act of 2005 and creation of the Joint Commission’s National Patient Safety Goals (NPSG) program.13,14 While implementation of electronic health records (EHRs) will minimize handwritten documentation, recent EHR adoption rates in the ambulatory care environment have been estimated at 21.8 percent for a basic system and 6.9 percent for a fully functional system.15 Even though the American Recovery and Reinvestment Act of 2009 (ARRA) provides financial incentives to healthcare providers for early adoption of health information technology, some still speculate that critical mass adoption by the specified deadline is unlikely.16-19 As a result, the need for some handwritten signatures will likely persist for some time.

The Medicare Conditions of Participation and the Joint Commission accreditation standards require medical records to be legible and authenticated.20,21 Many studies in the current literature address legibility of prescriptions and medication orders; however, few focus on legibility of medical record documentation and physicians’ signatures.22-27 To our knowledge, no research has been published in the English-language literature on educating residents about the importance of handwriting legibility, although one study noted significant improvement in overall record documentation practices after feedback from the attending physician.28 Interventions have been found to improve the quality of documentation, as has the use of name stamps.29-33 The purpose of the present survey was to evaluate physician signature legibility on patient charts at the Adult Internal Medicine Residency Clinic at the University of Mississippi Medical Center (UMMC). The medical director designed this study after being approached by multiple caregivers who were unable to clarify orders because of illegible signatures. After assessing a preliminary sample of records, the issue was deemed significant enough to warrant further study. We hypothesized that physician signature legibility would initially be poor, with less than 30 percent of signatures recognized by two independent reviewers, and we proposed that a physician education initiative would improve legibility.

Methods

A convenience sample of patient charts from September to December 2009 was used for the study. Any patient seen in the Adult Internal Medicine Residency Clinic during the time of the chart review was a candidate for inclusion. Physicians were not informed of the chart review in order to minimize bias or changes in behavior. Charts were reviewed by students and faculty from the UMMC School of Health Related Professions. A total of six reviewers were utilized in both the pre- and postintervention phases. These individuals were considered independent reviewers because they were not familiar with the signatures of the individual physicians, whereas others involved with the clinic may have been able to recognize specific physician signatures. Two students or faculty reviewed each chart independently to minimize error and bias. Additionally, all charts included were reviewed by the primary investigator, who is the medical director of the clinic.

The correct identity of all signatures of both residents and attendings was determined by the primary investigator using the clinic schedule and the patient’s medical record number. The residents all signed a signature form at the beginning of their residency, and this was used to help clarify difficult signatures if the clinic schedule could not resolve the issue of an unclear signature. Data entered by the initial reviewers, such as the presence or absence of a signature, were validated by the medical director. In the case of an unreadable signature whose owner could not be identified by the clinic schedule or the master resident signature list, the signature was deemed unreadable to all and noted as such. The Institutional Review Board of the University of Mississippi Medical Center approved this research with a waiver of consent.

Inclusion Criteria

All internal medicine residents or attendings in this clinic were included in the initial pool. A total of 63 internal medicine residents and 18 attendings participate in this clinic. Charts utilized by any of these physicians were included unless noted otherwise in the exclusion criteria.

Exclusion Criteria

Medical students were not included, although the staff and residents who worked with them were. Three attendings were excluded because they participated in this research and in other quality control measures with residents. Three attendings who are chief residents and two others who infrequently attend the clinic were excluded from the “treatment” group because their chart yield would be low; thus, only 10 attendings were eligible for the intervention. A total of 10 residents would not be in clinic for at least half of the intervention phase; they were excluded from the randomization, but data were collected on them when possible.

Randomization

Randomization was performed in SAS by generating a random number list, which was superimposed on the existing resident and attending rosters; each physician was then assigned to either an intervention or a control group. Figure 1 documents the number of resident (R) and attending (A) physicians in the control (R = 30, A = 7) and intervention (R = 23, A = 3) groups.
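The scheme described above can be sketched in plain Python. This is a hypothetical illustration only; the study used SAS, and the seed and roster names below are invented:

```python
import random

def randomize(roster, n_intervention, seed=2009):
    """Pair each physician with a random number, sort by that number,
    and split by rank into intervention and control groups (a sketch of
    the random-number-list approach; seed and roster are illustrative)."""
    rng = random.Random(seed)
    keyed = sorted(roster, key=lambda name: rng.random())
    return keyed[:n_intervention], keyed[n_intervention:]

# 53 randomized residents, 23 assigned to the intervention group,
# matching the group sizes reported in Figure 1.
residents = ["resident_%02d" % i for i in range(1, 54)]
intervention, control = randomize(residents, n_intervention=23)
```

Any fixed seed makes the assignment reproducible, which is useful when the allocation list must later be audited against the clinic roster.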

Figure 1: Study Flow Diagram

Intervention

The intervention protocol is documented below. The medical director conducted individual education sessions with each physician in the intervention group, providing specific instructions on how to use the name stamp. The name stamp was monogrammed in block letters with the physician’s name, credentials, and “Department of Internal Medicine.” The stamps did not bear the physician’s actual signature, a practice prohibited by the Medicare Conditions of Participation and many accrediting organizations.34 Physicians were also instructed that if the name stamp was lost or broken, they should report this to the medical director and thereafter hand-print their name beside each signature. As noted above, some of the staff were excluded from the intervention because their chart yield was expected to be low during the study period. Medical students and non–internal medicine residents were not individually tracked or included in the intervention; for their charts, data were collected on the attending only.

The safety issues surrounding illegible signatures were addressed individually with each participant in the intervention group. Participants were told that the clinic staff had had trouble identifying the writer of a specific clinic note and that, as a result, the name stamps and the following intervention were being implemented with a selected group of physicians. The intervention group was instructed to use the name stamp, sign charts with a legible signature, and print their name after the signature to avoid any confusion about the author of a note. On a biweekly basis, the medical director reminded the residents in the intervention group to continue to use the name stamp and to write legibly. We desired to mimic real-life conditions as much as possible; thus, no penalty for poor cooperation was implemented.

To summarize, the intervention protocol was as follows:

  1. Encourage residents/attendings to sign legibly.
  2. Encourage residents/attendings to print their name by the signature.
  3. Encourage residents/attendings to use the name stamp.
  4. Emphasize safety issues with illegible signatures.
  5. If the name stamp is lost or unavailable, residents/attendings should print their name by the signature.

Data Collection and Crossover Bias

To prevent crossover bias, the residents in the treatment group were asked not to disclose that the name stamp was part of a study or that it had been given to them by the medical director. Disclosure could have introduced bias, especially in clinics in which the medical director was one of the attendings. It should be noted that some name stamps were already in use within the clinic, as one of the attendings occasionally distributes them to residents on certain rotations; therefore, residents in the control group were unlikely to be suspicious of stamp usage by others.

Statistical Analysis

We were unable to find a standardized definition of legibility; however, some organizations consider documentation to be illegible if it cannot be read by two people.35 Therefore, in this study, a signature was deemed legible if it had a name stamp and/or if it could be read by both reviewers; a legible signature thus did not rely solely on the name stamp. If the reviewers disagreed regarding a signature’s legibility, it was deemed not legible; this stricter criterion is referred to as the “worst-case scenario.” A second analysis was also performed in which a signature was deemed legible if either of the two reviewers could read it, referred to as the “best-case scenario.” Only the worst-case scenario was used in the final analysis. Data were collected in an initial, preintervention period as well as in a postintervention period. Groups were compared on binary outcomes with logistic regression. Generalized estimating equations (GEE) modeling with robust estimation of standard errors was then performed to account for repeated measures from individual physicians.36
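For intuition, the unadjusted version of the odds ratios and 95 percent confidence intervals used in such comparisons can be computed directly from 2x2 counts with the Wald method. This is a minimal sketch with invented counts; the study’s actual analysis used logistic regression and GEE with robust standard errors, which require a dedicated statistics package:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Wald odds ratio and 95% CI from a 2x2 table of counts:
    a = legible post, b = illegible post, c = legible pre, d = illegible pre.
    The counts passed in below are hypothetical, not the study's raw data."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # standard error of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical example: 30 of 50 signatures legible postintervention
# versus 13 of 50 preintervention.
or_, lo, hi = odds_ratio_ci(30, 20, 13, 37)
```

Note that when any cell count is small (as with the attendings’ 1.4 percent preintervention legibility), the confidence interval becomes very wide, which is why intervals such as (18, 11204) appear in Table 2.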

Results

A total of 343 charts were reviewed. Three attendings and 23 residents were randomized into the stamp (intervention) group. See Table 1 and Figure 1 for participant demographics and the study flow diagram, respectively. One resident’s name stamp broke during the study, and he was told to print his name after his signature and encouraged to continue to sign legibly. Another resident randomized to the intervention group refused to participate in the study and refused to make any changes to his handwriting. One attending in the intervention group left the university during the postintervention phase.

Table 1: Demographic Distribution of Participants (N = 63)

                                             Control Group   Stamp Group
Residents (N = 53)
  Gender              Female N (%)           16 (53%)        9 (39%)
                      Male N (%)             14 (47%)        14 (61%)
  Ethnicity           White N (%)            9 (30%)         5 (22%)
                      Nonwhite N (%)         21 (70%)        18 (78%)
  Years in residency  First-year N (%)       10 (33%)        7 (30%)
                      Second-year N (%)      8 (27%)         10 (44%)
                      Third-year N (%)       12 (40%)        6 (26%)
  Medical school      U.S. N (%)             24 (77%)        23 (92%)
  location            Non-U.S. N (%)         7 (23%)         2 (8%)

Attendings (N = 10)
  Gender              Female N (%)           3 (43%)         2 (67%)
                      Male N (%)             4 (57%)         1 (33%)
  Ethnicity           White N (%)            5 (71%)         3 (100%)
                      Nonwhite N (%)         2 (29%)         0 (0%)
  Years in practice   Mean (SD)              8.14 (9.63)     17.67 (12.42)

The following discussion assumes the “worst-case scenario” described previously (see Table 2). Legibility of resident signatures in the stamp group increased from 26 percent preintervention to 60 percent postintervention (OR = 4.44 (1.83, 10.8), p = .001). Legibility of attending signatures in the stamp group increased from 1.4 percent preintervention to 86 percent postintervention (OR = 448 (18, 11204), p = .0001). The data thus support our original hypothesis that both groups would have less than 30 percent legibility before the intervention. In the resident group, the odds of improved legibility were more than five times greater in the stamp group than in the control group (OR = 5.49 (1.42, 21.28), p = .014); in the attending group, the odds of improved legibility were 316 times greater in the stamp group than in the control group (OR = 316 (11, 9128), p = .001). See Table 2 and Figure 2 for additional information on pre- and postintervention legibility. Note that the odds ratios are so large because of the magnitude of the effect, not because of any model-fitting difficulties. Table 3 shows the percent agreement of reviewers as well as the kappa statistic for each group; a kappa statistic of 0.41–0.6 indicates moderate agreement, and 0.61–0.8 indicates substantial agreement.37 The results of the worst-case and best-case analyses were qualitatively similar; therefore, only the former is included here, and results of both analyses are available from the authors by request. Finally, we examined all data for differences based on ethnicity and gender, and none were statistically significant.

Table 2: Worst-Case Scenario Results

                       Control Group                     Intervention Group                Comparison between Intervention and Control Groups (OR)

All physicians
  Preintervention      28% (15%, 48%)                    13% (6%, 28%)                     0.39 (0.11, 1.38)
  Postintervention     30% (15%, 51%)                    76% (49%, 91%)                    7.50 (1.69, 33.34)
  Within-group change  OR = 1.06 (0.52, 2.18), p = .087  OR = 20.34 (3.22, 128), p = .001  Stamp effect: OR = 19.15 (2.65, 138), p = .003

Residents
  Preintervention      32% (19%, 48%)                    26% (15%, 40%)                    0.75 (0.29, 1.94), p = .55
  Postintervention     27% (13%, 48%)                    60% (42%, 76%)                    4.11 (1.27, 13.3), p = .019
  Within-group change  OR = 0.81 (0.29, 2.25), p = .679  OR = 4.44 (1.83, 10.8), p = .001  Stamp effect: OR = 5.49 (1.42, 21.28), p = .014

Attendings
  Preintervention      26% (7%, 61%)                     1.4% (0.1%, 13%)                  0.041 (0.002, 0.68), p = .026
  Postintervention     33% (9%, 71%)                     86% (51%, 97%)                    12.8 (1.15, 143), p = .038
  Within-group change  OR = 1.41 (0.54, 3.74), p = .48   OR = 448 (18, 11204), p = .0001   Stamp effect: OR = 316 (11, 9128), p = .001

Figure 2: Pre- and Postintervention Legibility by Group

Table 3: Inter-rater Reliability for Worst-Case Scenario

                          Residents                Attendings               All Physicians
                          Second Rater             Second Rater             Second Rater
                          Not Legible   Legible    Not Legible   Legible    Not Legible   Legible
First rater  Not legible  42.86%        4.37%      53.13%        4.90%      48.17%        4.65%
             Legible      19.83%        32.94%     9.54%         32.43%     14.51%        32.68%

Percent agreement         75.8% (71%–80%)          85.56% (82%–89%)         80.85% (78%–84%)
Kappa statistic           0.52 (0.44–0.61)         0.70 (0.62–0.77)         0.61 (0.55–0.67)
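The kappa statistics in Table 3 follow from Cohen’s standard formula, which compares observed agreement with the agreement expected by chance from each rater’s marginal proportions. The Python sketch below (illustrative only; it is not the software used in the study) recovers the table’s values from the cell percentages:

```python
def cohens_kappa(both_yes, yes_no, no_yes, both_no):
    """Cohen's kappa from a 2x2 agreement table. Arguments are cell
    counts or percentages: both_yes = both raters say legible,
    yes_no = first rater legible / second not, and so on."""
    n = both_yes + yes_no + no_yes + both_no
    p_observed = (both_yes + both_no) / n
    # Chance agreement from the marginal proportions of each rater.
    p_yes = ((both_yes + yes_no) / n) * ((both_yes + no_yes) / n)
    p_no = ((no_yes + both_no) / n) * ((yes_no + both_no) / n)
    p_chance = p_yes + p_no
    return (p_observed - p_chance) / (1 - p_chance)

# "All Physicians" cell percentages from Table 3 yield kappa of about 0.61.
kappa = cohens_kappa(32.68, 14.51, 4.65, 48.17)
```

Feeding in the resident and attending cells the same way reproduces the 0.52 and 0.70 values reported in the table.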

Discussion

While this is a small-scale study, one cannot ignore the profound effect of the education intervention, especially among members of the attending stamp group. The substantial difference between the attending and resident odds ratios may reflect faculty’s more acute recognition of the serious ramifications of poor legibility; even so, the improvement in resident handwriting was substantial. Physicians in the intervention group improved their written signature legibility during the postintervention phase and also showed improved legibility in additional written documentation (not just signatures), suggesting that the educational component of the intervention was successful. Further investigation of resident behaviors may determine the impact of such an intervention on the quality of handwriting in postresidency practice.

Our results are somewhat consistent with a limited number of prior studies. Medford and France also noted a substantial improvement in signature legibility (81 percent) after implementing use of name stamps for authenticating case notes.38 Boehringer et al. observed similar findings after introducing name stamps to 34 percent of the residents on staff.39 A study by Daly et al. revealed 100 percent signature legibility when signatures were accompanied by a printed or stamped name.40 Other studies have observed noteworthy improvements in legibility postintervention.41,42

Our study differs from previous studies in that it integrates both an educational intervention and the use of a name stamp. While some prior studies have incorporated one-on-one feedback with individual physicians regarding the quality of their own documentation, we did not find any studies in the literature where educational sessions were conducted with an intervention group. Our study is also unique in its use of multiple reviewers to determine signature legibility. Prior studies have utilized clerical medical records staff or a single reviewer to audit the legibility of signatures.43

Limitations

This investigation is not without limitations. Only internal medicine residents and attendings at our university clinic were included, which may not reflect the behavior of physicians in other clinical specialties or those practicing in nonacademic settings. Because no standardized definition of legibility exists, multiple reviewers were used to account for and report variation in legibility ratings. Additionally, despite efforts to prevent bias, some physicians may have disclosed the nature of this research to participating colleagues. Furthermore, one resident refused to participate in the intervention group, and only 3 of 10 attendings were randomized into the intervention group. Finally, while the educational component appears to have been the influential factor in creating behavioral change, data were collected on the legibility of signatures only; a follow-up study collecting data on overall legibility outcomes would be required to confirm this assumption. A multicenter investigation involving physicians in various specialties would provide more comprehensive information on physician legibility. Despite these limitations, however, such a significant benefit in the intervention group cannot be disregarded.

Conclusions

This study was conceived by the clinic director because of illegible signatures in the medical record creating problems in arranging follow-up and communicating abnormal lab results to the appropriate physician. It is the duty of the residency program to teach the importance of legible documentation, including signatures, to its trainees with the goal of lifelong behavior modification. Health information management (HIM) practitioners can assist physician faculty in developing educational programs targeted at improving handwriting legibility. Similar programs might also be incorporated into existing clinical documentation improvement programs. In nonacademic settings, service-line coders with responsibilities for educating clinical staff in documentation and compliance issues may provide guidance and ongoing feedback to providers. Legible documentation to support reimbursement is becoming even more critical as external auditing programs are put into place. Clear documentation will be crucial for supporting a defensive strategy in the Recovery Audit Contractor (RAC) and the Medicaid Integrity Contractor (MIC) audits.44

Although this was a pilot study, it indicates that educating residents can change practice behavior. A larger-scale study conducted over a longer period, extending into the postresidency period, could confirm the significance of an education intervention on handwriting behavior. Poor physician handwriting can cause errors in medical management and contribute in other ways to adverse patient events, and an illegible physician signature has numerous implications for patient safety: illegible signatures may lead to inaccurate documentation by other healthcare professionals and to delays in patient care, especially in emergent situations. As demonstrated in this investigation, a standardized educational initiative improved physician signature legibility. The transition to an EHR system will help accurately document physician encounters; however, some institutions or clinics may not transition to this technology in the near future and will continue to rely on at least some paper documentation. Thus, methods such as the one demonstrated in this research can help alleviate issues related to physician identity and signature legibility.

Disclaimer

This research was presented in part at the American Health Information Management Association (AHIMA) Assembly on Education Summer Symposium, New Orleans, Louisiana, July 28, 2010.

James K. Glisson, MD, PharmD, is an assistant professor in the Department of Internal Medicine at the School of Medicine at the University of Mississippi Medical Center in Jackson, MS.

Mary E. Morton, PhD, RHIA, is an assistant professor in the Health Informatics and Information Management Program in the School of Health Related Professions at the University of Mississippi Medical Center in Jackson, MS.

Allyn H. Bond, MD, is a resident in the Department of Internal Medicine at the University of Mississippi Medical Center in Jackson, MS.

Michael Griswold, PhD, is an associate professor in the Department of Biostatistics in the School of Medicine at the University of Mississippi Medical Center in Jackson, MS.

Notes

1 Glondys, Barbara. “Ensuring Legibility of Patient Records (AHIMA Practice Brief).” Journal of the American Health Information Management Association 74, no. 5 (2003): 64A–D.

2 Hobson, Jonathan C., Sameer Khemani, and Arvind Singh. “Prospective Audit of the Quality of ENT Emergency Clinic Notes Before and After Introduction of a Computerized Template.” Journal of Laryngology & Otology 119, no. 4 (2005): 264–66.

3 Lyons, Ronan, Christopher Payne, Michael McCabe, and Colin Fielder. “Legibility of Doctors’ Handwriting: Quantitative Comparative Study.” British Medical Journal 317, no. 7162 (1998): 863–64.

4 Rodriguez-Vera, F. J., Y. Marin, A. Sanchez, C. Borrachero, and E. Pujol. “Illegible Handwriting in Medical Records.” Journal of the Royal Society of Medicine 95, no. 11 (2002): 545–46.

5 Bruner, Anne, and Morton L. Kasdan. “Handwriting Errors: Harmful, Wasteful and Preventable.” Journal of the Kentucky Medical Association 99, no. 5 (2001): 189–92.

6 Rodriguez-Vera, F. J., Y. Marin, A. Sanchez, C. Borrachero, and E. Pujol. “Illegible Handwriting in Medical Records.”

7 Bruner, Anne, and Morton L. Kasdan. “Handwriting Errors: Harmful, Wasteful and Preventable.”

8 Daly, P., F. J. Moloney, M. Doyle, and J. B. O’Mahony. “Legibility of Doctor’s Signatures: Novel Approaches to Improving an Age-Old Problem.” Irish Medical Journal 99, no. 7 (2006): 214–15.

9 Panigrahi, A. R., and C. Cunningham. “Legibility and Authorship of Clinical Notes.” Journal of the Royal Society of Medicine 96, no. 4 (2003): 208.

10 Kozak, E. A., R. S. Dittus, W. R. Smith, J. F. Fitzgerald, and C. D. Langfeld. “Deciphering the Physician Note.” Journal of General Internal Medicine 9, no. 1 (1994): 52–54.

11 Boehringer, Peter A., Jeanette Rylander, Dominic T. Dizon, and Michael W. Peterson. “Improving the Quality of the Order-Writing Process for Inpatient Orders in a Teaching Hospital.” Quality Management in Health Care 16, no. 3 (2007): 215–18.

12 Institute of Medicine. To Err Is Human: Building a Safer Health System. Washington, DC: National Academies Press, 1999.

13 Agency for Healthcare Research and Quality (AHRQ). “Patient Safety and Quality Improvement Act of 2005 (Patient Safety Act): An Overview.” Accessed March 7, 2011.

14 The Joint Commission. Facts about the National Patient Safety Goals. 2009. Accessed June 23, 2011.

15 Hsiao, Chun-Ju, Esther Hing, Thomas C. Socey, and Bill Cai. Electronic Medical Record/Electronic Health Record Systems of Office-based Physicians: United States, 2009 and Preliminary 2010 State Estimates. National Center for Health Statistics. December 2010. Accessed February 15, 2011.

16 Steinbrook, R. “Health Care and the American Recovery and Reinvestment Act.” New England Journal of Medicine 360, no. 11 (2009): 1057–60.

17 Menachemi, N. “Barriers to Ambulatory EHR: Who Are ‘Imminent Adopters’ and How Do They Differ from Other Physicians?” Informatics in Primary Care 14, no. 2 (2006): 101–8.

18 Ford, E. W., N. Menachemi, L. T. Peterson, and T. R. Huerta. “Resistance Is Futile: But It Is Slowing the Pace of EHR Adoption Nonetheless.” Journal of the American Medical Informatics Association 16, no. 3 (2009): 274–81.

19 Kaplan, B., and K. Harris-Salamone. “Health IT Success and Failure: Recommendations from Literature and an AMIA Workshop.” Journal of the American Medical Informatics Association 16, no. 3 (2009): 291–99.

20 Centers for Medicare and Medicaid Services. “Conditions of Participation for Hospitals.” Code of Federal Regulations, 2009. 42 CFR, Chapter IV, Part 482.

21 The Joint Commission. Comprehensive Accreditation Manual for Hospitals (E-dition). Oakbrook Terrace, IL: Joint Commission Resources, 2010.

22 Hobson, Jonathan C., Sameer Khemani, and Arvind Singh. “Prospective Audit of the Quality of ENT Emergency Clinic Notes Before and After Introduction of a Computerized Template.”

23 Rodriguez-Vera, F. J., Y. Marin, A. Sanchez, C. Borrachero, and E. Pujol. “Illegible Handwriting in Medical Records.”

24 Daly, P., F. J. Moloney, M. Doyle, and J. B. O’Mahony. “Legibility of Doctor’s Signatures: Novel Approaches to Improving an Age-Old Problem.”

25 Medford, Andrew R. L., and Anthony J. France. “Pocket-Size Self-Inking Rubber Stamps Improve Legibility of Case Notes.” Quality in Primary Care 12, no. 2 (2004): 147–49.

26 Dexter, Sara C., Daichi Hayashi, and James R. Tysome. “The Ankle Score: An Audit of Otolaryngology Emergency Clinic Record Keeping.” Annals of the Royal College of Surgeons of England 90, no. 3 (2008): 231–34.

27 Lefter, Liviu P., Stuart R. Walker, Fleur Dewhurst, and R. W. L. Turner. “An Audit of Operative Notes: Facts and Ways to Improve.” ANZ Journal of Surgery 78, no. 9 (2008): 800–802.

28 Opila, Donald A. “The Impact of Feedback to Medical Housestaff on Chart Documentation and Quality of Care in the Outpatient Setting.” Journal of General Internal Medicine 12, no. 6 (1997): 352–56.

29 Medford, Andrew R. L., and Anthony J. France. “Pocket-Size Self-Inking Rubber Stamps Improve Legibility of Case Notes.”

30 Dexter, Sara C., Daichi Hayashi, and James R. Tysome. “The Ankle Score: An Audit of Otolaryngology Emergency Clinic Record Keeping.”

31 Opila, Donald A. “The Impact of Feedback to Medical Housestaff on Chart Documentation and Quality of Care in the Outpatient Setting.”

32 Daly, P., F. J. Moloney, M. Doyle, and J. B. O’Mahony. “Legibility of Doctor’s Signatures: Novel Approaches to Improving an Age-Old Problem.”

33 Medford, Andrew R. L., and Anthony J. France. “Pocket-Size Self-Inking Rubber Stamps Improve Legibility of Case Notes.”

34 Centers for Medicare & Medicaid Services (CMS). “Signature Requirements Clarification.” Medicare Program Integrity Manual. Transmittal 248, Change Request 5971 (March 28, 2008). Accessed March 7, 2011.

35 Dunn, Rose T. “Legibility Creates Documentation Challenges.” MIC Monitor, October 20, 2010. Accessed February 15, 2011.

36 Liang, K., and S. D. Zeger. “Longitudinal Data Analysis Using Generalized Linear Models.” Biometrika 73, no. 1 (1986): 13–22.

37 Landis, J. R., and G. G. Koch. “The Measurement of Observer Agreement for Categorical Data.” Biometrics 33, no. 1 (1977): 159–74.

38 Medford, Andrew R. L., and Anthony J. France. “Pocket-Size Self-Inking Rubber Stamps Improve Legibility of Case Notes.”

39 Boehringer, Peter A., Jeanette Rylander, Dominic T. Dizon, and Michael W. Peterson. “Improving the Quality of the Order-Writing Process for Inpatient Orders in a Teaching Hospital.”

40 Daly, P., F. J. Moloney, M. Doyle, and J. B. O’Mahony. “Legibility of Doctor’s Signatures: Novel Approaches to Improving an Age-Old Problem.”

41 Dexter, Sara C., Daichi Hayashi, and James R. Tysome. “The Ankle Score: An Audit of Otolaryngology Emergency Clinic Record Keeping.”

42 Opila, Donald A. “The Impact of Feedback to Medical Housestaff on Chart Documentation and Quality of Care in the Outpatient Setting.”

43 Medford, Andrew R. L., and Anthony J. France. “Pocket-Size Self-Inking Rubber Stamps Improve Legibility of Case Notes.”

44 Dunn, Rose T. “Legibility Creates Documentation Challenges.”


Article citation:
Glisson, James K.; Morton, Mary E.; Bond, Allyn H.; Griswold, Michael. “Does an Education Intervention Improve Physician Signature Legibility? Pilot Study of a Prospective Chart Review.” Perspectives in Health Information Management (Summer, July 2011).