Vivek Nair
Independent Researcher
Kerala, India
Abstract
This manuscript examines the integration of teacher-designed assessment banks with adaptive testing frameworks to enhance personalized learning, assessment accuracy, and instructional efficacy. Teacher-designed assessment banks comprise repositories of carefully crafted items aligned with curricular objectives, reflecting educators’ deep understanding of learner needs and pedagogical goals. Adaptive testing, powered by algorithmic item selection based on examinee performance, dynamically adjusts difficulty to optimize measurement precision and learner engagement. By combining these two approaches, educators can deliver assessments that are both contextually relevant and psychometrically robust.
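To make the item-selection mechanism concrete, the sketch below shows one common approach: maximum-information selection under a two-parameter logistic (2PL) IRT model. This is an illustration only, not the platform code used in the study; the parameters `a` (discrimination) and `b` (difficulty) and all function names are hypothetical.

```python
import numpy as np

def p_correct(theta, a, b):
    """2PL probability of a correct response at ability theta."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def item_information(theta, a, b):
    """Fisher information a 2PL item contributes at ability theta."""
    p = p_correct(theta, a, b)
    return a ** 2 * p * (1.0 - p)

def select_next_item(theta_hat, pool, administered):
    """Choose the unadministered item that is most informative at the
    current ability estimate (the maximum-information rule)."""
    candidates = [i for i in range(len(pool)) if i not in administered]
    return max(candidates,
               key=lambda i: item_information(theta_hat, *pool[i]))
```

After each response, the ability estimate is updated (e.g., by maximum likelihood or expected a posteriori estimation) and the selection rule is applied again; this loop is what lets the test dynamically adjust difficulty as described above.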
Building on foundational theories of item response theory (IRT) and computerized adaptive testing (CAT), this study employs a mixed-methods investigation involving 150 secondary school teachers and 600 students across mathematics, science, and language arts. Teacher-authored item pools were developed, calibrated, and field-tested within a custom CAT platform. Quantitative analyses reveal that adaptive tests drawing from these banks achieved high reliability (marginal reliability coefficients ≥ 0.90), strong convergent validity with state achievement measures (r = .86–.89), and required approximately 40% fewer items to reach precision thresholds than fixed-form tests. Qualitative feedback indicates that students experienced greater confidence, reduced anxiety, and heightened engagement when assessment difficulty was matched "just right" to their ability levels. Teachers valued the curricular alignment and contextual relevance of their items, though they identified calibration procedures and time investment as implementation challenges.
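The reported reliability and test-length figures can be connected through standard CAT arithmetic. Under a 2PL model, and assuming a standardized ability scale with unit variance (an assumption, not stated in the abstract), the precision threshold behind the stopping rule looks like this:

```latex
% Test information accumulates over administered items j:
I(\theta) = \sum_{j} a_j^2 \, P_j(\theta)\bigl(1 - P_j(\theta)\bigr),
\qquad
SE(\hat\theta) = \frac{1}{\sqrt{I(\hat\theta)}}.
% With Var(\theta) = 1, marginal reliability is approximately
% \rho \approx 1 - SE^2(\hat\theta), so \rho \ge 0.90 is reached once
% SE(\hat\theta) \le \sqrt{0.10} \approx 0.32, i.e. once accumulated
% information reaches about 1/0.10 = 10.
```

Because adaptive selection administers each item near the ability level where it is most informative, this threshold is typically crossed with substantially fewer items than a fixed form needs, which is consistent with the roughly 40% reduction reported above.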
This manuscript concludes by offering practical guidelines for developing high-quality teacher-designed item banks, implementing adaptive algorithms in resource-constrained settings, and designing sustained professional development to build teacher capacity in psychometric principles. It also highlights emerging opportunities—such as automated item generation, adaptive feedback loops, and equity-focused differential item functioning (DIF) analyses—to further refine the synergy between teacher expertise and adaptive testing technology.
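As one concrete direction for the equity-focused DIF analyses mentioned above, the sketch below implements the Mantel-Haenszel procedure, one widely used DIF method chosen here purely for illustration; the input layout and function name are hypothetical.

```python
import numpy as np

def mh_d_dif(ref_correct, ref_n, foc_correct, foc_n):
    """Mantel-Haenszel DIF for one item, stratified by total score.
    Each argument is a per-stratum array: correct-response counts and
    group sizes for the reference and focal groups. Returns the MH
    D-DIF statistic on the ETS delta scale (|D| < 1 is negligible)."""
    num = den = 0.0
    for r, nr, f, nf in zip(ref_correct, ref_n, foc_correct, foc_n):
        t = nr + nf                      # total examinees in stratum
        if t == 0:
            continue
        num += r * (nf - f) / t          # reference right, focal wrong
        den += f * (nr - r) / t          # focal right, reference wrong
    alpha = num / den                    # common odds ratio across strata
    return -2.35 * np.log(alpha)         # ETS delta transformation
```

Flagging teacher-authored items this way before they enter the adaptive pool would let item writers revise or retire items that function differently across demographic groups of comparable ability.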
Keywords
Assessment banks; adaptive testing; personalized learning; item response theory; teacher-authored items