A Comparison of College Readiness among Adult and Traditional-Age College Students

By DAACS | January 16, 2026


Author: Oxana Rosca

Abstract

This dissertation provides a comprehensive validity investigation of the DAACS Reading and Mathematics assessments to ensure that college readiness comparisons between adult learners (age 24+) and traditional-age students rest on psychometrically sound measurement. Accordingly, the dissertation emphasizes the collection of internal structure validity evidence for the DAACS mathematics and reading assessments.

The internal structure of the DAACS Reading Assessment was examined using confirmatory factor analysis and bifactor two-parameter logistic (2PL) item response theory (IRT) models to distinguish a general reading comprehension factor from passage-specific residual factors. Model fit indices, local dependence diagnostics, and bifactor indices such as explained common variance (ECV) provided strong evidence for a dominant general factor underlying reading performance, supporting the interpretability of unidimensional IRT-based scores.

In the Mathematics Assessment, unidimensional 2PL IRT models were estimated across adaptive multistage item sets. Item discrimination and difficulty parameters, standard errors, and local dependence statistics indicated strong item functioning for the vast majority of items. Across both assessments, the resulting IRT models supported score interpretations grounded in coherent latent constructs of reading and mathematical readiness.
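The ECV index mentioned above can be illustrated with a small numeric sketch. The loadings below are hypothetical, not taken from the dissertation; they simply show how ECV summarizes the share of common variance attributable to the general factor in a bifactor solution.

```python
import numpy as np

# Hypothetical standardized loadings from a bifactor reading model:
# one general comprehension factor plus passage-specific group factors.
general = np.array([0.62, 0.55, 0.70, 0.48, 0.66, 0.58])   # general-factor loadings
specific = np.array([0.25, 0.30, 0.18, 0.35, 0.22, 0.28])  # group-factor loadings

# Explained common variance: proportion of common variance
# accounted for by the general factor alone.
ecv = np.sum(general**2) / (np.sum(general**2) + np.sum(specific**2))
print(f"ECV = {ecv:.3f}")
```

Values near 1 indicate that a single general factor dominates, which is the pattern the dissertation reports as support for unidimensional IRT-based scoring.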

Fairness validity evidence was gathered through a hybrid DIF approach using logistic regression models with IRT-based ability estimates as the matching variable. However, the logistic regression DIF analyses produced preliminary findings due to limited sample size and statistical power, and these DIF results are currently being updated using additional data collected during the 2023 data-collection cycle.

Overall, this dissertation contributes evidence that is essential for ensuring that DAACS Reading and Mathematics scores are interpretable and appropriate for guiding institutional decision-making. By establishing that DAACS scores reflect meaningful differences in underlying skills, the study provides a defensible psychometric foundation for future research and for the equitable use of diagnostic assessment in higher education.

Download the full article here: https://scholarsarchive.library.albany.edu/etd/342/