DOI: https://doi.org/10.63345/ijre.v14.i6.4
Dr. Daksha Borada
IILM University
Greater Noida, Uttar Pradesh 201306, India
Abstract
The advent of digital learning environments in higher education has produced massive repositories of student interaction and performance data which, if properly harnessed, can transform curriculum design into a dynamic, evidence‑based process. Learning analytics (LA) encompasses the measurement, collection, analysis, and reporting of data about learners and their contexts, with the aim of optimizing learning and the environments in which it occurs. While the theoretical benefits of LA, such as personalized learning pathways, early identification of at‑risk students, and adaptive instructional strategies, have been widely acknowledged, practical integration into curriculum development remains limited by technical, organizational, and ethical barriers. This study examines how faculty and instructional designers perceive, use, and encounter obstacles to LA when adapting curricula. A mixed‑methods approach was adopted: a structured survey of 200 higher education practitioners across five universities, complemented by thematic analysis of open‑ended responses. Quantitative data show that metrics on assignment performance trends, student engagement, predictive risk alerts, time‑on‑task, and forum participation are the measures most frequently used to inform curriculum decisions. However, only 45% of respondents enact formal adaptations on a quarterly basis; the remainder limit changes to semester‑end revisions or ad‑hoc tweaks. Qualitative findings highlight critical impediments, including fragmented data systems, insufficient analytics expertise, ambiguous governance policies, and privacy concerns. Crucially, correlation analysis indicates that practitioners with formal LA training are more than twice as likely to implement real‑time curriculum modifications.
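The "more than twice as likely" finding is a relative‑likelihood (risk ratio) comparison between trained and untrained practitioners. A minimal sketch of that calculation is shown below; the counts are entirely hypothetical, invented only to illustrate the arithmetic, since the abstract reports just the ratio.

```python
def risk_ratio(trained_yes, trained_total, untrained_yes, untrained_total):
    """Relative likelihood of enacting real-time curriculum modifications,
    comparing practitioners with formal LA training to those without."""
    p_trained = trained_yes / trained_total        # proportion among trained
    p_untrained = untrained_yes / untrained_total  # proportion among untrained
    return p_trained / p_untrained

# Hypothetical survey counts (not from the study): 48 of 80 trained
# practitioners vs. 30 of 120 untrained practitioners report real-time changes.
rr = risk_ratio(48, 80, 30, 120)
print(round(rr, 2))  # 2.4 — i.e., "more than twice as likely"
```

With any counts where the trained group's proportion exceeds double the untrained group's, the ratio lands above 2, matching the qualitative claim in the abstract.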
Drawing on these insights, we propose a systematic framework for embedding LA into curriculum design cycles: (1) establish integrated analytics dashboards; (2) develop clear data governance and ethical guidelines; (3) provide targeted professional development for faculty and designers; (4) create cross‑functional analytics teams; and (5) implement iterative feedback loops to monitor impact. By adopting this framework, institutions can move beyond isolated pilot projects to sustainable, data‑driven curriculum adaptation that enhances learning outcomes and institutional agility.
Keywords
Learning Analytics, Curriculum Adaptation, Higher Education, Data‑Driven Instruction, Educational Technology