OBJECTIVE: This study estimated the inter- and intraobserver reliability of a set of noninstrumented physical examination measures for knee pain in older adults. STUDY DESIGN AND SETTING: Forty-five patients from primary care and 13 patients from secondary care were each examined by two of a team of three physical therapists, and were reexamined by one of these observers 1 month later. The examination items were standardized and included dichotomous, ordinal, and continuous variables considered relevant to a primary care context. RESULTS: For individual dichotomous items, median interobserver and intraobserver agreement (kappa) was 0.22 (interquartile range [IQR] = 0.12-0.35) and 0.41 (IQR = 0.28-0.56), respectively. For ordinally rated variables, weighted kappa ranged from -0.08 to 0.43 for interobserver agreement and from 0.00 to 0.79 for intraobserver agreement. The median intraclass correlation coefficient for continuous examination variables was 0.80 (range 0.68-0.89) for interobserver agreement and 0.84 (range 0.67-0.95) for intraobserver agreement. CONCLUSION: For trained but nonexpert examiners, agreement was generally poor for dichotomous and ordinal examination items; however, kappa values are liable to be depressed by the low prevalence of clinical signs in this sample. Agreement on continuous variables was notably better.
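The conclusion notes that kappa values can be depressed when a clinical sign is rare, even if the raters agree on most patients. A minimal sketch of Cohen's kappa for a dichotomous examination item makes this concrete: the two 2x2 tables below have identical raw agreement (80%), but the low-prevalence table yields a much smaller kappa. The counts are illustrative only and are not data from this study.

```python
def cohens_kappa(a, b, c, d):
    """Cohen's kappa from a 2x2 agreement table for one dichotomous item:
        a = both raters score positive,  d = both score negative,
        b, c = the two kinds of disagreement.
    """
    n = a + b + c + d
    po = (a + d) / n                       # observed proportion of agreement
    p_pos = ((a + b) / n) * ((a + c) / n)  # chance agreement on "positive"
    p_neg = ((c + d) / n) * ((b + d) / n)  # chance agreement on "negative"
    pe = p_pos + p_neg                     # total chance-expected agreement
    return (po - pe) / (1 - pe)

# Balanced prevalence (50% positive): 80% raw agreement -> kappa = 0.60
k_balanced = cohens_kappa(a=20, b=5, c=5, d=20)

# Rare sign (~14% positive): the same 80% raw agreement -> kappa ~= 0.17
k_rare = cohens_kappa(a=2, b=5, c=5, d=38)
```

Because chance-expected agreement is already high when almost everyone is negative, the same observed agreement leaves far less "room" above chance, which is the prevalence effect the authors invoke when interpreting their low kappa values.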
Authors: Nasimah Maricar; Michael J Callaghan; Matthew J Parkes; David T Felson; Terence W O'Neill Journal: J Rheumatol Date: 2016-10-01 Impact factor: 4.666
Authors: Nasimah Maricar; Michael J Callaghan; Matthew J Parkes; David T Felson; Terence W O'Neill Journal: Semin Arthritis Rheum Date: 2015-10-22 Impact factor: 5.532