Gender knowledge and Artificial Intelligence

Roda A.
2022

Abstract

Among the various types of bias that can be recognised in the behaviour of algorithms that learn from data, gender-related biases are particularly significant in certain contexts, such as the Italian one, which is traditionally tied to a patriarchal vision of society. This is all the more true in university education, where female students are strongly under-represented in STEM faculties and, in particular, in Computer Science courses. After a brief review of gender biases reported in Machine Learning-based systems, the paper presents the experience of the course “Gender Knowledge and Ethics in Artificial Intelligence”, active since A.Y. 2021-22 at the School of Engineering of the University of Padova.
CEUR Workshop Proceedings, 2022

Use this identifier to cite or link to this document: https://hdl.handle.net/11577/3470848