
Agreement of physician and patient ratings of communication in medical encounters: A systematic review and meta-analysis of interrater agreement

PATIENT EDUCATION AND COUNSELING (2020)

Abstract
Objective: To determine the agreement of physician and patient ratings of communication in medical face-to-face consultations.
Methods: A systematic search of twelve databases was conducted. Studies investigating agreement between physician and patient ratings of communication in medical face-to-face encounters and reporting interrater agreement were included. Methodological quality was assessed, and study characteristics and physician-patient agreement were summarized narratively. A meta-analysis was conducted for the subsample of included studies investigating shared decision making.
Results: Of the 17 included studies, ten did not demonstrate any correspondence between physician and patient ratings. The remaining seven studies revealed poor to fair absolute agreement (kappa between .13 and .42; kappa(w) between .31 and .49; 95% CI 0.13 to 0.76) and poor to moderate consistency (r = .17 and .06; r(polyc) between .39 and .63; p < .05). Meta-analysis of six studies yielded a small association (r(polyc) = .15).
Conclusion: Physicians and patients evaluate communication differently and, at best, agree only slightly in their ratings, indicating that the construct of communication is not measured in a stable manner. (C) 2020 Elsevier B.V. All rights reserved.
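The abstract reports agreement in terms of Cohen's kappa, weighted kappa, and correlation coefficients. As a minimal illustrative sketch (not the paper's analysis code, and using entirely hypothetical ratings), the snippet below shows how these statistics are typically computed for paired physician and patient ratings of the same encounters.

```python
# Illustrative sketch only: NOT the paper's analysis. The ratings below are
# hypothetical, invented purely to demonstrate the agreement statistics named
# in the abstract (unweighted kappa, weighted kappa, Pearson r).
import numpy as np
from scipy.stats import pearsonr
from sklearn.metrics import cohen_kappa_score

# Hypothetical ordinal ratings (1-5 scale) of the same consultations,
# one score per encounter from the physician and from the patient.
physician = np.array([4, 5, 3, 4, 2, 5, 3, 4, 4, 3])
patient   = np.array([3, 5, 2, 4, 4, 4, 2, 3, 5, 3])

# Absolute agreement on the categorical scale (unweighted Cohen's kappa).
kappa = cohen_kappa_score(physician, patient)

# Weighted kappa (kappa_w) gives partial credit for near-misses on an
# ordinal scale; linear weights are a common choice.
kappa_w = cohen_kappa_score(physician, patient, weights="linear")

# Consistency: do the two raters rank encounters similarly? Shown here as
# Pearson r (polychoric correlation, r_polyc, would need a dedicated package).
r, p_value = pearsonr(physician, patient)

print(f"kappa   = {kappa:.2f}")
print(f"kappa_w = {kappa_w:.2f}")
print(f"r       = {r:.2f} (p = {p_value:.3f})")
```

Values close to 0 on these measures indicate agreement no better than chance, which is the pattern the review describes for most included studies.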
Key words
Interrater agreement, physician-patient communication, physician-patient interaction, shared decision making, systematic review, meta-analysis