Preliminary development of the physician documentation quality instrument. Academic Article

Overview

MeSH

  • Documentation
  • Factor Analysis, Statistical
  • Physicians

MeSH Major

  • Medical Records Systems, Computerized
  • Quality Assurance, Health Care

Abstract

  • This study sought to design and validate a reliable instrument to assess the quality of physician documentation. Adjectives describing clinician attitudes about high-quality clinical documentation were gathered through literature review, assessed by clinical experts, and transformed into a semantic differential scale. Using the scale, physicians and nurse practitioners scored the importance of the adjectives for describing quality in three note types: admission, progress, and discharge notes. Psychometric methods including exploratory factor analysis were applied to provide preliminary evidence for construct validity and internal consistency reliability. A 22-item Physician Documentation Quality Instrument (PDQI) was developed. Exploratory factor analysis (n = 67 clinician respondents) on the three note types resulted in solutions ranging from four (discharge) to six (admission and progress) factors, and explained 65.8% (discharge) to 73% (admission and progress) of the variance. Each factor solution was unique. However, four sets of items consistently factored together across all note types: (1) up-to-date and current; (2) brief, concise, succinct; (3) organized and structured; and (4) correct, comprehensible, consistent. Internal consistency reliabilities were: admission note (factor scales = 0.52-0.88, overall = 0.86), progress note (factor scales = 0.59-0.84, overall = 0.87), and discharge summary (factor scales = 0.76-0.85, overall = 0.88). The exploratory factor analyses and reliability analyses provide preliminary evidence for the construct validity and internal consistency reliability of the PDQI. Two novel dimensions of the document-quality construct were developed related to form (Well-formed, Compact). Additional work is needed to assess the intrarater and interrater reliability of applying the proposed instrument and to examine the reproducibility of the factors in other samples.
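The internal consistency reliabilities reported in the abstract are of the kind conventionally computed as Cronbach's alpha over the items in each factor scale. As an illustrative aside (not code from the study itself), a minimal sketch of that computation in plain Python follows; the `responses` data are hypothetical example scores, not data from the PDQI study:

```python
from statistics import variance


def cronbach_alpha(responses):
    """Cronbach's alpha for a list of respondent rows.

    responses: list of rows, one per respondent, each a list of
    item scores (all rows must have the same number of items).
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores))
    """
    k = len(responses[0])                       # number of items
    items = list(zip(*responses))               # transpose: one tuple per item
    item_vars = sum(variance(col) for col in items)
    total_var = variance([sum(row) for row in responses])
    return (k / (k - 1)) * (1 - item_vars / total_var)


# Hypothetical scores: 4 respondents rating 3 items on a 1-5 scale.
responses = [
    [1, 2, 1],
    [2, 2, 3],
    [4, 5, 4],
    [5, 4, 5],
]
print(round(cronbach_alpha(responses), 3))
```

With perfectly correlated items the statistic reaches 1.0; values in the 0.5-0.9 range, like those reported for the PDQI factor scales, indicate moderate to strong internal consistency.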

publication date

  • July-August 2008

has subject area

  • Documentation
  • Factor Analysis, Statistical
  • Medical Records Systems, Computerized
  • Physicians
  • Quality Assurance, Health Care

Research

keywords

  • Journal Article
  • Validation Studies

Identity

Language

  • eng

PubMed Central ID

  • PMC2442259

Digital Object Identifier (DOI)

  • 10.1197/jamia.M2404

PubMed ID

  • 18436914

Additional Document Info

start page

  • 534

end page

  • 541

volume

  • 15

number

  • 4