
BERTAgent: The Development of a Novel Tool to Quantify Agency in Textual Data

Suitner C.; Erseghe T.
2025

Abstract

Pertaining to goal orientation and achievement, agency is a fundamental aspect of human cognition and behavior. Accordingly, detecting and quantifying the linguistic encoding of agency is critical for the analysis of human actions, interactions, and social dynamics. Available agency-quantifying computational tools rely on word-counting methods, which are typically insensitive to the semantic context in which words are used and are consequently prone to miscoding, for example in cases of polysemy. Additionally, some currently available tools do not take into account differences in the intensity and directionality of agency. To overcome these shortcomings, we present BERTAgent, a novel tool to quantify semantic agency in text. BERTAgent is a computational language model that utilizes the transformer architecture, a popular deep learning approach to natural language processing. BERTAgent was fine-tuned on textual data that were evaluated by human coders with respect to the level of conveyed agency. In four validation studies, BERTAgent exhibits improved convergent and discriminant validity compared to previous solutions. Additionally, the detailed description of BERTAgent's development procedure serves as a tutorial for the advancement of similar tools, providing a blueprint for leveraging existing lexicographical data sets in conjunction with deep learning techniques to detect and quantify other psychological constructs in textual data.
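To illustrate the kind of pipeline the abstract describes (a transformer fine-tuned to output a scalar agency score per sentence), the following is a minimal sketch built on the Hugging Face transformers library. The checkpoint name "distilroberta-base" and the single-output regression head are illustrative assumptions, not the published BERTAgent package or its actual API; a real deployment would load the fine-tuned BERTAgent weights instead.

```python
# Minimal sketch: scoring sentences for agency with a transformer regression head.
# Assumptions: "distilroberta-base" is a stand-in checkpoint; in practice one would
# load the fine-tuned BERTAgent model rather than a freshly initialized head.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

MODEL_NAME = "distilroberta-base"  # placeholder checkpoint, not the released BERTAgent weights

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
# num_labels=1 gives a single regression output, i.e., one agency score per input sentence
model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME, num_labels=1)
model.eval()

sentences = [
    "She worked hard and achieved every goal she had set.",
    "Nothing could be done; he simply gave up.",
]

with torch.no_grad():
    batch = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")
    scores = model(**batch).logits.squeeze(-1)  # one scalar per sentence

for text, score in zip(sentences, scores.tolist()):
    print(f"{score:+.3f}  {text}")
```

With a properly fine-tuned checkpoint, higher scores would indicate stronger conveyed agency and lower (or negative) scores weaker or disempowered agency; with the placeholder checkpoint above, the untrained regression head produces arbitrary values and serves only to show the interface.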
Files for this record:
No files are associated with this record.

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11577/3559998
Citations
  • PubMed Central: ND
  • Scopus: 1
  • Web of Science (ISI): 1
  • OpenAlex: ND