Validation and cut-off scoring of the assessment implementation measure (AIM) tool in undergraduate medical education
| Published in: | BMC Medical Education, vol. 25 (2025), p. 1-15 |
|---|---|
| Main author: | Mohammad, Khadija |
| Other authors: | Sajjad, Madiha; Rehan Ahmed Khan |
| Publisher: | Springer Nature B.V. |
| Subjects: | |
| Online access: | Citation/Abstract; Full Text; Full Text - PDF |
MARC
| LEADER | 00000nab a2200000uu 4500 | ||
|---|---|---|---|
| 001 | 3257228309 | ||
| 003 | UK-CbPIL | ||
| 022 | |a 1472-6920 | ||
| 024 | 7 | |a 10.1186/s12909-025-07862-9 |2 doi | |
| 035 | |a 3257228309 | ||
| 045 | 2 | |b d20250101 |b d20251231 | |
| 084 | |a 58506 |2 nlm | ||
| 100 | 1 | |a Mohammad, Khadija | |
| 245 | 1 | |a Validation and cut-off scoring of the assessment implementation measure (AIM) tool in undergraduate medical education | |
| 260 | |b Springer Nature B.V. |c 2025 | ||
| 513 | |a Journal Article | ||
| 520 | 3 | |a Background: The quality of assessment in undergraduate medical colleges remains underexplored, particularly concerning the availability of validated instruments for its measurement. Bridging the gap between established assessment standards and their practical application is crucial for improving educational outcomes. To address this, the ‘Assessment Implementation Measure’ (AIM) tool was designed to evaluate the perception of assessment quality among undergraduate medical faculty members. While the content validity of the AIM questionnaire has been established, limitations in sample size have precluded the determination of construct validity and a statistically defined cutoff score. Objective: To establish the construct validity of the Assessment Implementation Measure (AIM) tool, and to determine statistically the cutoff scores of the AIM tool and its domains for classifying assessment implementation quality. Methods: This study employed a cross-sectional validation design to establish the construct validity and a statistically valid cutoff score for the AIM tool, so that the quality of assessment implementation could be accurately classified as either high or low. A sample of 347 undergraduate medical faculty members was used for this purpose. The construct validity of the AIM tool was established through exploratory factor analysis (EFA), reliability was confirmed via Cronbach's alpha, and cutoff scores were calculated via the receiver operating characteristic (ROC) curve. Results: EFA of the AIM tool revealed seven factors accounting for 63.961% of the total variance. One item was removed, resulting in 29 items with factor loadings above 0.40. The tool’s overall reliability was excellent (Cronbach’s alpha = 0.930), and the seven domains ranged from 0.719 to 0.859; however, the ‘Ensuring Fair Assessment’ domain demonstrated a weak Cronbach’s alpha of 0.570. The cutoff score for differentiating high and low assessment quality was calculated as 77 out of 116 using the ROC curve, and the cutoff scores for the seven domains ranged from 5.5 to 18.5. The tool's area under the curve (AUC) was 0.994, and for the seven factors it ranged from 0.701 to 0.924. Conclusion: The validated AIM tool and its statistically established cutoff score provide a standardized measure for institutions to evaluate and improve their assessment programs. EFA grouped 29 of the 30 items into seven factors, demonstrating good construct validity. The tool demonstrated good reliability via Cronbach’s alpha, and a cutoff score of 77 was calculated through ROC curve analysis. This tool can guide faculty development initiatives and support quality assurance processes in medical schools. (An illustrative sketch of the ROC-based cutoff and reliability calculations appears after the MARC record below.) | |
| 610 | 4 | |a LinkedIn Corp WhatsApp Inc | |
| 653 | |a Teaching | ||
| 653 | |a Students | ||
| 653 | |a Medical education | ||
| 653 | |a Discriminant analysis | ||
| 653 | |a Validation studies | ||
| 653 | |a Sample size | ||
| 653 | |a Quality standards | ||
| 653 | |a Validity | ||
| 653 | |a Principal components analysis | ||
| 653 | |a Educational objectives | ||
| 653 | |a Variables | ||
| 653 | |a Data collection | ||
| 653 | |a Eigenvalues | ||
| 653 | |a Learning | ||
| 653 | |a Educational Quality | ||
| 653 | |a Cutting Scores | ||
| 653 | |a Indexes | ||
| 653 | |a Educational Practices | ||
| 653 | |a Sampling | ||
| 653 | |a Educational Development | ||
| 653 | |a Construct Validity | ||
| 653 | |a Graduates | ||
| 653 | |a Factor Analysis | ||
| 653 | |a Stakeholders | ||
| 653 | |a College Faculty | ||
| 653 | |a Program Evaluation | ||
| 653 | |a Medical Evaluation | ||
| 653 | |a Factor Structure | ||
| 653 | |a Accountability | ||
| 653 | |a Faculty Development | ||
| 653 | |a Instructional Effectiveness | ||
| 653 | |a Data Analysis | ||
| 653 | |a Educational Assessment | ||
| 653 | |a Outcomes of Education | ||
| 653 | |a Content Validity | ||
| 700 | 1 | |a Sajjad, Madiha | |
| 700 | 1 | |a Rehan Ahmed Khan | |
| 773 | 0 | |t BMC Medical Education |g vol. 25 (2025), p. 1-15 | |
| 786 | 0 | |d ProQuest |t Healthcare Administration Database | |
| 856 | 4 | 1 | |3 Citation/Abstract |u https://www.proquest.com/docview/3257228309/abstract/embedded/H09TXR3UUZB2ISDL?source=fedsrch |
| 856 | 4 | 0 | |3 Full Text |u https://www.proquest.com/docview/3257228309/fulltext/embedded/H09TXR3UUZB2ISDL?source=fedsrch |
| 856 | 4 | 0 | |3 Full Text - PDF |u https://www.proquest.com/docview/3257228309/fulltextPDF/embedded/H09TXR3UUZB2ISDL?source=fedsrch |
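
The abstract (MARC field 520) reports that reliability was estimated with Cronbach's alpha and that the 77/116 cutoff was derived from a ROC curve. The Python sketch below is purely illustrative and is not the authors' code: it shows one common way such values are obtained, using Youden's J to select the ROC cutoff. The simulated responses, the assumed 4-point item scale (29 items × 4 = 116), and the placeholder reference standard are all assumptions made for the example.

```python
# Hypothetical sketch (not the study's actual procedure): computing Cronbach's
# alpha and a ROC-based cutoff for a 29-item questionnaire assumed to use a
# 4-point scale (total score range 29-116, consistent with the reported 116 maximum).
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the total score
    k = items.shape[1]
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

def roc_cutoff(total_scores: np.ndarray, reference_labels: np.ndarray):
    """Cutoff maximizing Youden's J (sensitivity + specificity - 1), plus AUC."""
    fpr, tpr, thresholds = roc_curve(reference_labels, total_scores)
    best = (tpr - fpr).argmax()
    return thresholds[best], roc_auc_score(reference_labels, total_scores)

# Illustrative run on simulated data (the study surveyed 347 faculty members).
rng = np.random.default_rng(0)
responses = rng.integers(1, 5, size=(347, 29))      # 29 items on a 4-point scale
totals = responses.sum(axis=1)
labels = (totals > np.median(totals)).astype(int)   # placeholder reference standard

print("Cronbach's alpha:", round(cronbach_alpha(responses), 3))
cutoff, auc = roc_cutoff(totals, labels)
print("ROC cutoff:", cutoff, "AUC:", round(auc, 3))
```

Youden's J is only one of several criteria for choosing a ROC cutoff; the paper may have used a different reference standard or optimality rule, and the numbers produced by this simulation will not reproduce the reported values (alpha = 0.930, cutoff = 77, AUC = 0.994).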