What students really think: unpacking AI ethics in educational assessments through a triadic framework

Saved in:
Detailed bibliography
Published in: International Journal of Educational Technology in Higher Education vol. 22, no. 1 (Dec 2025), p. 56
Main Author: Lim, Tristan
Other Authors: Gottipati, Swapna; Cheong, Michelle
Published: Springer Nature B.V.
Subjects:
Online Access: Citation/Abstract
Full Text
Full Text - PDF

MARC

LEADER 00000nab a2200000uu 4500
001 3257250012
003 UK-CbPIL
022 |a 2365-9440 
022 |a 1698-580X 
024 7 |a 10.1186/s41239-025-00556-8  |2 doi 
035 |a 3257250012 
045 2 |b d20251201  |b d20251231 
084 |a 142233  |2 nlm 
100 1 |a Lim, Tristan  |u Singapore University of Social Sciences, Singapore, Singapore (GRID:grid.443365.3) (ISNI:0000 0004 0388 6484) 
245 1 |a What students really think: unpacking AI ethics in educational assessments through a triadic framework 
260 |b Springer Nature B.V.  |c Dec 2025 
513 |a Journal Article 
520 3 |a The rise of AI in educational assessments has significantly enhanced efficiency and accuracy. However, it also introduces critical ethical challenges, including bias in grading, data privacy risks, and accountability gaps. These issues can undermine trust in AI-driven assessments and compromise educational fairness, making a structured ethical framework essential. To address these challenges, this study empirically validates an existing triadic ethical framework for AI-assisted educational assessments, originally proposed by Lim, Gottipati and Cheong (In: Keengwe (ed) Creative AI tools and ethical implications in teaching and learning, IGI Global, 2023), grounded in student perceptions. The framework encompasses three ethical domains—physical, cognitive, and informational—which intersect with five key assessment pipeline stages: system design, data stewardship, assessment construction, administration, and grading. By structuring AI-driven assessments within this ethical framework, the study systematically maps key concerns, including fairness, accountability, privacy, and academic integrity. To validate the proposed framework, Structural Equation Modeling (SEM) was employed to examine its relevance and alignment with learners' ethical concerns. Specifically, the study aims to (1) evaluate how well the triadic framework aligns with learners' perceptions of ethical issues using SEM analysis, and (2) examine relationships among the assessment pipeline stages, ethical considerations, pedagogical outcomes, and learner experiences. Findings reveal robust connections between AI-assisted assessment stages, ethical concerns, and learners' perspectives. By bridging theoretical validation with practical insights, this study emphasizes actionable strategies to support the development of AI-driven assessment systems that balance technological efficiency, pedagogical effectiveness, and ethical responsibility. 
653 |a Students 
653 |a Assessments 
653 |a Educational evaluation 
653 |a Ontology 
653 |a Systems design 
653 |a Automation 
653 |a Privacy 
653 |a Cognitive ability 
653 |a Ethics 
653 |a Research & development--R&D 
653 |a Feedback 
653 |a Cognition & reasoning 
653 |a Accountability 
653 |a Artificial intelligence 
653 |a Educational objectives 
653 |a Education 
653 |a Instructional scaffolding 
653 |a Personalized learning 
653 |a Literature reviews 
653 |a Surveillance 
653 |a Perceptions 
653 |a Efficiency 
653 |a Fairness 
653 |a Risk assessment 
653 |a Teaching 
653 |a Ethical dilemmas 
653 |a Morality 
653 |a Frame analysis 
653 |a Evaluation 
653 |a Learning 
653 |a Management 
653 |a Effectiveness 
653 |a Structural equation modeling 
653 |a Learning Analytics 
653 |a Dropout Rate 
653 |a Intelligent Tutoring Systems 
653 |a Influence of Technology 
653 |a Diagnostic Tests 
653 |a Learning Theories 
653 |a Educational Technology 
653 |a Grading 
653 |a Student Experience 
653 |a Network Analysis 
653 |a Integrity 
653 |a Interpersonal Relationship 
653 |a At Risk Students 
653 |a Instructional Effectiveness 
653 |a Adaptive Testing 
653 |a Educational Assessment 
653 |a Formative Evaluation 
653 |a Outcomes of Education 
653 |a Language Processing 
653 |a Data Processing 
653 |a Algorithms 
700 1 |a Gottipati, Swapna  |u Singapore Management University, School of Computing and Information Systems, Singapore, Singapore (GRID:grid.412634.6) (ISNI:0000 0001 0697 8112) 
700 1 |a Cheong, Michelle  |u Singapore Management University, School of Computing and Information Systems, Singapore, Singapore (GRID:grid.412634.6) (ISNI:0000 0001 0697 8112) 
773 0 |t International Journal of Educational Technology in Higher Education  |g vol. 22, no. 1 (Dec 2025), p. 56 
786 0 |d ProQuest  |t Political Science Database 
856 4 1 |3 Citation/Abstract  |u https://www.proquest.com/docview/3257250012/abstract/embedded/6A8EOT78XXH2IG52?source=fedsrch 
856 4 0 |3 Full Text  |u https://www.proquest.com/docview/3257250012/fulltext/embedded/6A8EOT78XXH2IG52?source=fedsrch 
856 4 0 |3 Full Text - PDF  |u https://www.proquest.com/docview/3257250012/fulltextPDF/embedded/6A8EOT78XXH2IG52?source=fedsrch