
Inter-rater agreement in multivariate settings: A Bayesian approach

Massimo Nucci, Andrea Spoto, Gianmarco Altoè
2015

Abstract

The evaluation of agreement among a number of experts about a specific topic is an important and scarcely explored issue, especially in multivariate settings. The classical indexes (such as Cohen's kappa) have mainly been proposed for evaluating the agreement between two experts in the univariate case. The evaluation of agreement among more than two experts in the multivariate case is a still under-explored topic. This problem is particularly crucial in Formal Psychological Assessment (FPA), where the so-called clinical context can be described as a Boolean matrix in which a 1 in cell ia means that item i investigates attribute a. The clinical context can be constructed by querying a number of experts about the assignment, to each item, of the set of attributes it investigates. We propose a model for evaluating the agreement of a number of experts in a task of clinical context construction. The model takes into account two main factors that could explain observed differences in how experts complete the task: first, experts may hold different theoretical beliefs about how items and attributes are related; second, given a certain strength of a specific link, experts may apply different thresholds when assigning an attribute to an item. By means of a Bayesian approach, a number of models have been simulated in order to estimate the role of both theoretical beliefs and experts' thresholds in a wide range of potential scenarios. The results show how the proposed approach may be fruitfully applied to evaluate agreement among experts while accounting simultaneously for the two factors. The application of the proposed approach to a number of research situations beyond FPA is discussed.
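As a minimal illustration of the classical baseline the abstract mentions, the sketch below computes Cohen's kappa between two experts' Boolean item-by-attribute matrices (a 1 in cell (i, a) meaning item i investigates attribute a). The matrices, the helper function, and the flattening step are hypothetical examples for illustration only, not taken from the paper; the paper's Bayesian model goes beyond this pairwise, chance-corrected index.

```python
# Hypothetical sketch: Cohen's kappa on two experts' Boolean
# item-by-attribute ("clinical context") matrices.
# All data below are illustrative, not from the paper.

def cohens_kappa(a, b):
    """Cohen's kappa for two equal-length sequences of binary ratings."""
    n = len(a)
    # Observed proportion of agreement
    po = sum(x == y for x, y in zip(a, b)) / n
    # Expected chance agreement from each rater's marginal rate of 1s
    p1a = sum(a) / n
    p1b = sum(b) / n
    pe = p1a * p1b + (1 - p1a) * (1 - p1b)
    return (po - pe) / (1 - pe)

# Two experts each assign attributes (columns) to items (rows).
expert1 = [[1, 0, 1],
           [0, 1, 0],
           [1, 1, 0]]
expert2 = [[1, 0, 1],
           [0, 1, 1],
           [1, 0, 0]]

# Flatten each Boolean matrix into one sequence of cell-wise ratings,
# then treat the two sequences as paired binary judgments.
flat1 = [v for row in expert1 for v in row]
flat2 = [v for row in expert2 for v in row]
print(round(cohens_kappa(flat1, flat2), 3))  # → 0.55
```

Flattening the matrix reduces the multivariate assignment task to a single univariate comparison, which is exactly the limitation the abstract points out: it handles only two experts at a time and cannot separate differing theoretical beliefs from differing assignment thresholds.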

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11577/3162219