Andres De Los Reyes, Fanita A. Tyrell, Ashley L. Watts, Gordon J. G. Asmundson.
Abstract
On page 1 of his classic text, Millsap (2011) states, "Measurement invariance is built on the notion that a measuring device should function the same way across varied conditions, so long as those varied conditions are irrelevant [emphasis added] to the attribute being measured." By construction, measurement invariance techniques require not only detecting varied conditions but also ruling out that these conditions inform our understanding of measured domains (i.e., conditions that do not contain domain-relevant information). In fact, measurement invariance techniques possess great utility when theory and research inform their application to specific, varied conditions (e.g., cultural, ethnic, or racial background of test respondents) that, if not detected, introduce measurement biases, and, thus, depress measurement validity (e.g., academic achievement and intelligence). Yet, we see emerging bodies of work where scholars have "put the cart before the horse" when it comes to measurement invariance, and they apply these techniques to varied conditions that, in fact, may reflect domain-relevant information. These bodies of work highlight a larger problem in measurement that likely cuts across many areas of scholarship. In one such area, youth mental health, researchers commonly encounter a set of conditions that nullify the use of measurement invariance, namely discrepancies between survey reports completed by multiple informants, such as parents, teachers, and youth themselves (i.e., informant discrepancies). In this paper, we provide an overview of conceptual, methodological, and measurement factors that should prevent researchers from applying measurement invariance techniques to detect informant discrepancies. Along the way, we cite evidence from the last 15 years indicating that informant discrepancies reflect domain-relevant information. We also apply this evidence to recent uses of measurement invariance techniques in youth mental health. 
Based on prior evidence, we highlight the implications of applying these techniques to multi-informant data, when the informant discrepancies observed within these data might reflect domain-relevant information. We close by calling for a moratorium on applying measurement invariance techniques to detect informant discrepancies in youth mental health assessments. In doing so, we describe how the state of the science would need to fundamentally "flip" to justify applying these techniques to detect informant discrepancies in this area of work.
Keywords: Converging Operations; Diverging Operations; Operations Triad Model; domain-relevant information; informant discrepancies
Year: 2022 PMID: 35983202 PMCID: PMC9378825 DOI: 10.3389/fpsyg.2022.931296
Source DB: PubMed Journal: Front Psychol ISSN: 1664-1078
FIGURE 1. How theory and research about informant discrepancies ought to inform decision-making on when to apply measurement invariance techniques to detect informant discrepancies in youth mental health research. In research on multi-informant assessments of youth mental health, competing theories exist for what informant discrepancies reflect. One theory (i.e., depression-distortion hypothesis; Richters, 1992) claims that discrepancies reflect mood-congruent rater biases, such that a negative mood state compels an informant (e.g., parent) to attend to, encode, recall, and rate more negative youth behaviors, relative to informants who do not experience such mood states (e.g., teacher); accordingly, informant discrepancies reflect measurement confounds. In contrast, the other theory (i.e., situational specificity; Achenbach et al., 1987) claims that discrepancies reflect the notion that youth vary in the contexts in which they display mental health concerns, and the informants from whom assessors solicit reports (e.g., parents, teachers, and youth) vary in the contexts in which they observe youth; accordingly, informant discrepancies reflect domain-relevant information.
FIGURE 2. Graphical depiction of the discrete nature of decisions regarding the use of measurement invariance techniques to detect informant discrepancies in youth mental health research. Given the existence of competing theories about what informant discrepancies reflect (see Figure 1), one requires a “referee” to decide when to apply measurement invariance techniques. The only reasonable means by which to resolve the issue of competing theories is to put the theories “to the test” using empirical data produced by well-constructed validation studies. As (A) depicts, the definition offered by Millsap (2011) logically results in one deciding to use measurement invariance techniques only when empirical data (i.e., results from well-constructed validation studies) clearly tilt in favor of inferring that the measurement conditions one seeks to detect (e.g., informant discrepancies) reflect measurement confounds. Note that this is a distinct set of decisions from determining the degree to which informant discrepancies reflect domain-relevant information (B). Indeed, both of these decisions ought to be grounded in empirical data linked to domain-relevant data conditions. That said, whereas the decision to use measurement invariance techniques is discrete, determining the composition of variance in informant discrepancies is dimensional, because measures of all psychological domains likely reflect a “mix” of domain-relevant information and measurement confounds (i.e., no one measurement is “perfect”).
FIGURE 3. Graphical representation of the research concepts that comprise the Operations Triad Model. The top half (A) represents Converging Operations: a set of measurement conditions for interpreting patterns of findings based on the consistency with which findings yield similar conclusions. The bottom half denotes two circumstances within which researchers identify discrepancies across empirical findings derived from multiple informants’ reports and, thus, discrepancies in the research conclusions drawn from these reports. On the left (B) is a graphical representation of Diverging Operations: a set of measurement conditions for interpreting patterns of inconsistent findings based on hypotheses about variations in the behavior(s) assessed. The solid lines linking informants’ reports, empirical findings derived from these reports, and conclusions based on empirical findings denote the systematic relations among these three study components. The dual arrowheads in the figure representing Diverging Operations convey the idea that one ties meaning to the discrepancies among empirical findings and research conclusions and, thus, how one interprets informants’ reports to vary as a function of variation in the behaviors being assessed. On the right (C) is a graphical representation of Compensating Operations: a set of measurement conditions for interpreting patterns of inconsistent findings based on methodological features of measures or informants. The dashed lines denote the lack of systematic relations among informants’ reports, empirical findings, and research conclusions. Originally published in De Los Reyes et al. (2013a). ©Annual Review of Clinical Psychology. Copyright 2012 Annual Reviews. All rights reserved. The Annual Reviews logo and other Annual Reviews products referenced herein are either registered trademarks or trademarks of Annual Reviews. All other marks are the property of their respective owner and/or licensor.
Pieces of evidence (i.e., exhibits) that point toward informant discrepancies in youth mental health assessments as cases of domain-relevant information.
| Exhibit | Description | Citation support |
| A | The notion of situational specificity | |
| B | Youth mental health researchers rely on reports completed by structurally different informants | |
| C | Structurally different informants tend to complete parallel instruments that hold crucial measurement properties constant (e.g., item content, scaling, response options), thus reducing the likelihood that random error variance explains informant discrepancies | |
| D | Several decades of research consistently point to large discrepancies between structurally different informants’ reports | |
| E | Multi-informant assessments conducted across the globe consistently reveal discrepancies between structurally different informants’ reports | |
| F | Researchers observe discrepancies between structurally different informants’ reports, regardless of the instruments used to collect these reports and how well-established they might be | |
| G | The best, most high-quality studies available to understand what informant discrepancies might reflect tend to show that these discrepancies reflect domain-relevant factors | |
| H | In tests of criterion-related validity that use independent assessments as criterion variables (i.e., observed behavior), approaches that meaningfully integrate data from structurally different informants’ reports outperform approaches that assume informant discrepancies reflect measurement confounds | |
| I | Any evidence of links between rater characteristics and informant discrepancies can be parsimoniously explained by domain-relevant factors | |
| J | Informant discrepancies tend not to be moderated by demographic characteristics of the youth being rated | |
Examples of measurement invariance studies focused on informant discrepancies in assessments of domains relevant to understanding youth mental health.
| Authors (Year) | Article title | Citations as precedentsᵃ |
| | Measuring perceptions of the therapeutic alliance in individual, family, and group therapy from a systemic perspective: Structural validity of the SOFTA-s | |
| | A trifactor model for integrating ratings across multiple informants | |
| | Psychometric models for scoring multiple reporter assessments: Applications to integrative data analysis in prevention science and beyond | |
| | Measurement invariance of Alabama Parenting Questionnaire across age, gender, clinical status, and informant | |
| | Evaluating construct equivalence of youth depression measures across multiple measures and multiple studies | |
| | Teacher vs. parent informant measurement invariance of the Strengths and Difficulties Questionnaire | |
| | Evaluating maternal psychopathology biases in reports of child temperament: An investigation of measurement invariance | |
| | Is parent–child disagreement on child anxiety explained by differences in measurement properties? An examination of measurement invariance across informants and time | |
| | Agreement in youth-parent perceptions of parenting behaviors: A case for testing measurement invariance in reporter discrepancy research | |
| | A psychometric analysis of the Social Anxiety Scale for Adolescents among youth with autism spectrum disorder: Caregiver–adolescent agreement, factor structure, and validity | |
| | Latent congruence model to investigate similarity and accuracy in family members’ perception: The challenge of cross-national and cross-informant measurement (non)invariance | |
| | Are we thinking about the same disorder? A trifactor model approach to understand parents’ and their adolescents’ reports of borderline personality pathology | |
ᵃ Citations note publications that pre-dated the publication year of the article, and that served as precedents to guide researchers toward refraining from applying measurement invariance techniques to study informant discrepancies.
FIGURE 4. How the state of the science on informant discrepancies in youth mental health assessments would need to tilt to justify the use of measurement invariance techniques to detect informant discrepancies. (A) depicts the current state of the science on these discrepancies, which clearly tilts in favor of informant discrepancies more likely reflecting domain-relevant information rather than measurement confounds. Justifying the use of measurement invariance techniques to detect informant discrepancies in youth mental health assessments would require theory and evidence that render informant discrepancies synonymous with measurement confounds. This would require a complete “flip” of the theoretical and evidentiary bases of informant discrepancies as they manifest in youth mental health assessments (B), such that the state of the science would need to clearly tilt in favor of the likelihood that informant discrepancies in youth mental health assessments reflect measurement confounds.
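Editor's note: the article repeatedly refers to "applying measurement invariance techniques" without spelling out the underlying statistical step. In the common multigroup-CFA workflow (not described in this record), a configural model with loadings free across informant groups is compared against a metric model with loadings constrained equal, via a chi-square difference test. The sketch below illustrates only that comparison logic; the function name, the fit statistics, and the parent/teacher grouping are all hypothetical placeholders, not values from the article.

```python
# Illustrative sketch of the nested-model comparison underlying a metric-
# invariance test. All numbers below are hypothetical, not from the article.

def chi_square_difference(chisq_constrained, df_constrained,
                          chisq_free, df_free, critical_value):
    """Return (delta_chisq, delta_df, invariance_rejected).

    Invariance is rejected when the constrained (metric) model fits
    significantly worse than the free (configural) model.
    """
    delta_chisq = chisq_constrained - chisq_free
    delta_df = df_constrained - df_free
    return delta_chisq, delta_df, delta_chisq > critical_value

# Hypothetical fit statistics for parent-report vs. teacher-report groups:
configural = (48.2, 38)   # (chi-square, df), loadings free across groups
metric = (61.7, 44)       # loadings constrained equal across groups

# 12.59 is the 0.05 critical value of the chi-square distribution with 6 df.
d_chisq, d_df, rejected = chi_square_difference(*metric, *configural, 12.59)
```

Here `d_chisq` is 13.5 on `d_df` = 6, exceeding 12.59, so the equality constraint would be rejected. The article's argument is precisely that such a rejection is uninterpretable as bias when the "varied condition" (the informant) carries domain-relevant information.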