Inter-rater Reliability of a Clinical Documentation Rubric Within Pharmacotherapy Problem-Based Learning Courses

American Journal of Pharmaceutical Education (2020)

Abstract
Objective. To evaluate the inter-rater reliability (IRR) of a clinical documentation rubric used by different evaluators in pharmacotherapy problem-based learning (PBL) courses. Methods. A rubric was adapted for grading student pharmacists' clinical documentation in pharmacotherapy PBL courses. Multiple faculty evaluators used the rubric to assess student pharmacists' clinical documentation. The mean rubric score and standard deviation were calculated for each round of evaluation. Intraclass correlation coefficients (ICC) were calculated to determine the IRR of the rubric. Results. Three hundred seventeen clinical documentation submissions were each scored twice by multiple evaluators using the rubric. The mean score was 9.1 (SD = 0.9) for both the initial and second evaluations, with no significant difference between the two. The overall ICC across multiple graders was 0.7, indicating good IRR. Conclusion. The clinical documentation rubric demonstrated overall good IRR among multiple evaluators when used in pharmacotherapy PBL courses. The rubric will undergo additional evaluation and continuous quality improvement to ensure that student pharmacists receive the formative feedback they need.
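The abstract does not state which ICC model the authors used. As a minimal sketch only, assuming a two-way random-effects, absolute-agreement, single-rater model (ICC(2,1) in Shrout and Fleiss notation) and a hypothetical submissions-by-evaluators score matrix (not the study's data), the coefficient could be computed as follows:

    import numpy as np

    def icc_2_1(scores: np.ndarray) -> float:
        """ICC(2,1): two-way random effects, absolute agreement, single rater.

        scores: (n_subjects, n_raters) matrix, e.g. one row per documentation
        submission and one column per evaluator (illustrative layout only).
        """
        n, k = scores.shape
        grand_mean = scores.mean()
        row_means = scores.mean(axis=1)   # per-submission means
        col_means = scores.mean(axis=0)   # per-evaluator means

        # Two-way ANOVA sums of squares
        ss_total = ((scores - grand_mean) ** 2).sum()
        ss_rows = k * ((row_means - grand_mean) ** 2).sum()   # between-submission
        ss_cols = n * ((col_means - grand_mean) ** 2).sum()   # between-evaluator
        ss_error = ss_total - ss_rows - ss_cols

        ms_rows = ss_rows / (n - 1)
        ms_cols = ss_cols / (k - 1)
        ms_error = ss_error / ((n - 1) * (k - 1))

        # Shrout & Fleiss (1979) ICC(2,1)
        return (ms_rows - ms_error) / (
            ms_rows + (k - 1) * ms_error + k * (ms_cols - ms_error) / n
        )

    # Hypothetical scores from two evaluators for four submissions (not study data)
    ratings = np.array([
        [9.0, 9.5],
        [8.5, 8.0],
        [10.0, 9.5],
        [7.5, 8.0],
    ])
    print(f"ICC(2,1) = {icc_2_1(ratings):.2f}")

The score matrix above is fabricated for illustration; applying the same computation to the study's 317 twice-scored submissions would yield the reported overall ICC.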
Key words
evaluation, rubric, inter-rater reliability, clinical documentation