Stereotyping in Language (Technologies): An Examination of Racial and Gender Stereotypes in Natural Language and Language Models

Saved in:
Bibliographic Details
Published in: ProQuest Dissertations and Theses (2025)
Main Author: Lee, Hojun
Published: ProQuest Dissertations & Theses
Online Access: Citation/Abstract; Full Text - PDF
Description
Abstract: This dissertation examines stereotyping across natural language and language technologies through three interconnected studies. The first chapter applies a contemporary model of race relations from social psychology to investigate America's racial framework within American English, revealing how language encodes hierarchical associations between racial/ethnic groups and attributes of superiority and Americanness. The second chapter extends this analysis to Large Language Models (LLMs), finding that these language technologies portray socially subordinate groups as more homogeneous than dominant groups. The third chapter investigates stereotyping in Vision Language Models (VLMs), showing that these language technologies generate more uniform representations for women than for men and for White Americans than for Black Americans, with uniformity increasing for more gender-prototypical appearances. This research demonstrates how stereotypes persist across natural language and AI systems, with many biases in language technologies mirroring established patterns of human social cognition. By integrating natural language processing and the social sciences through interdisciplinary research, this dissertation documents bias patterns in language technologies and demonstrates how social psychological theories provide valuable tools for detecting and measuring stereotyping in AI systems.
ISBN:9798310392960
Source: ProQuest Dissertations & Theses Global