Human-Centred Development of Indicators for Self-Service Learning Analytics

A Transparency through Exploration Approach

Authors

DOI:

https://doi.org/10.18608/jla.2026.8921

Keywords:

human-centred learning analytics, trustworthy learning analytics, transparent learning analytics, self-service learning analytics, open learning analytics, transparency, trust, acceptance

Abstract

The aim of learning analytics (LA) is to turn educational data into insights, decisions, and actions that improve learning and teaching. The reasoning behind these insights, decisions, and actions is often not transparent to end-users, which can lead to trust and acceptance issues when interventions, feedback, and recommendations fail. In this paper, we shed light on achieving transparent LA by following a transparency through exploration approach. To this end, we present the design, implementation, and evaluation of the Indicator Editor, a tool that aims to support self-service LA (SSLA) by empowering end-users to take control of the indicator implementation process. We systematically designed and implemented the Indicator Editor through an iterative human-centred design (HCD) approach. Further, we conducted a qualitative user study (n = 15) to investigate the impact of following an SSLA approach on users' perceptions of and interactions with the Indicator Editor. Our study provides qualitative evidence that supporting user interaction and providing user control in the indicator implementation process can positively affect crucial aspects of LA, namely transparency, trust, satisfaction, and acceptance.

References

Abdi, S., Khosravi, H., Sadiq, S., & Gasevic, D. (2020). Complementing educational recommender systems with open learner models. In Proceedings of the 10th International Conference on Learning Analytics and Knowledge (LAK 2020), 23–27 March 2020, Frankfurt, Germany (pp. 360–365). ACM. https://doi.org/10.1145/3375462.3375520

Ahn, J., Campos, F., Hays, M., & DiGiacomo, D. (2019). Designing in context: Reaching beyond usability in learning analytics dashboard design. Journal of Learning Analytics, 6(2), 70–85. https://doi.org/10.18608/jla.2019.62.5

Ahn, J., Campos, F., Nguyen, H., Hays, M., & Morrison, J. (2021). Co-designing for privacy, transparency, and trust in K-12 learning analytics. In Proceedings of the 11th International Conference on Learning Analytics and Knowledge (LAK 2021), 12–16 April 2021, Irvine, California, USA (pp. 55–65). ACM. https://doi.org/10.1145/3448139.3448145

Alfredo, R., Echeverria, V., Jin, Y., Yan, L., Swiecki, Z., Gaševíc, D., & Martinez-Maldonado, R. (2024). Human-centred learning analytics and AI in education: A systematic literature review. Computers and Education: Artificial Intelligence, 6, 100215. https://doi.org/10.1016/j.caeai.2024.100215

Alvarez, C. P., Martinez-Maldonado, R., & Buckingham Shum, S. (2020). LA-DECK: A card-based learning analytics co-design tool. In Proceedings of the 10th International Conference on Learning Analytics and Knowledge (LAK 2020), 23–27 March 2020, Frankfurt, Germany (pp. 63–72). ACM. https://doi.org/10.1145/3375462.3375476

Alzahrani, A. S., Tsai, Y.- S., Aljohani, N., Whitelock-Wainwright, E., & Gasevic, D. (2023). Do teaching staff trust stakeholders and tools in learning analytics? A mixed methods study. Educational Technology Research and Development, 71(4), 1471–1501. https://doi.org/10.1007/s11423-023-10229-w

Amershi, S., Cakmak, M., Knox, W. B., & Kulesza, T. (2014). Power to the people: The role of humans in interactive machine learning. AI Magazine, 35(4), 105–120. https://doi.org/10.1609/aimag.v35i4.2513

Ananny, M., & Crawford, K. (2018). Seeing without knowing: Limitations of the transparency ideal and its application to algorithmic accountability. New Media & Society, 20(3), 973–989. https://doi.org/10.1177/1461444816676645

Andjelkovic, I., Parra, D., & O’Donovan, J. (2016). Moodplay: Interactive mood-based music discovery and recommendation. In Proceedings of the 2016 Conference on User Modeling Adaptation and Personalization (UMAP 2016), 13–17 July 2016, Halifax, Nova Scotia, Canada (pp. 275–279). ACM. https://doi.org/10.1145/2930238.2930280

Barria Pineda, J., & Brusilovsky, P. (2019). Making educational recommendations transparent through a fine-grained open learner model. In C. Trattner, D. Parra, & N. Riche (Eds.), Proceedings of Workshop on Intelligent User Interfaces for Algorithmic Transparency in Emerging Technologies at the 24th ACM Conference on Intelligent User Interfaces (IUI 2019), 20 March 2019, Los Angeles, California, USA (Vol. 2327). CEUR Workshop Proceedings.

Barria-Pineda, J., Akhuseyinoglu, K., & Brusilovsky, P. (2019). Explaining need-based educational recommendations using interactive open learner models. In Adjunct Publication of the 27th Conference on User Modeling, Adaptation and Personalization (UMAP 2019), 9–12 June 2019, Larnaca, Cyprus (pp. 273–277). ACM. https://doi.org/10.1145/3314183.3323463

Bennett, L., & Folley, S. (2019). Four design principles for learner dashboards that support student agency and empowerment. Journal of Applied Research in Higher Education, 12(1), 15–26. https://doi.org/10.1108/jarhe-11-2018-0251

Bodily, R., Kay, J., Aleven, V., Jivet, I., Davis, D., Xhakaj, F., & Verbert, K. (2018). Open learner models and learning analytics dashboards: A systematic review. In Proceedings of the Eighth International Conference on Learning Analytics and Knowledge (LAK 2018), 7–9 March 2018, Sydney, Australia (pp. 41–50). ACM. https://doi.org/10.1145/3170358.3170409

Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3(2), 77–101. https://doi.org/10.1191/1478088706qp063oa

Buckingham Shum, S., Ferguson, R., & Martinez-Maldonado, R. (2019). Human-centred learning analytics. Journal of Learning Analytics, 6(2), 1–9. https://doi.org/10.18608/jla.2019.62.1

Buckingham Shum, S., Martínez-Maldonado, R., Dimitriadis, Y., & Santos, P. (2024). Human-centred learning analytics: 2019–24. British Journal of Educational Technology, 55(3), 755–768. https://doi.org/10.1111/bjet.13442

Campos, F., Nguyen, H., Ahn, J., & Jackson, K. (2024). Leveraging cultural forms in human-centred learning analytics design. British Journal of Educational Technology, 55(3), 769–784. https://doi.org/10.1111/bjet.13384

Carter, L., & Belanger, F. (2005). The utilization of e-government services: Citizen trust, innovation and acceptance factors. Information Systems Journal, 15(1), 5–25. https://doi.org/10.1111/j.1365-2575.2005.00183.x

Chatti, M. A., Dyckhoff, A. L., Schroeder, U., & Thüs, H. (2012). A reference model for learning analytics. International Journal of Technology Enhanced Learning, 4(5-6), 318–331. https://doi.org/10.1504/ijtel.2012.051815

Chatti, M. A., & Muslim, A. (2019). The PERLA framework: Blending personalization and learning analytics. International Review of Research in Open and Distributed Learning, 20(1). https://doi.org/10.19173/irrodl.v20i1.3936

Chatti, M. A., Muslim, A., Guesmi, M., Richtscheid, F., Nasimi, D., Shahin, A., & Damera, R. (2020). How to design effective learning analytics indicators? A human-centered design approach. In C. Alario-Hoyos, M. Rodríguez-Triana, M. Scheffel, I. Arnedillo-Sanchez, & S. Dennerlein (Eds.), Addressing global challenges and quality education. EC-TEL 2020. Lecture notes in computer science (pp. 303–317, Vol. 12315). Springer. https://doi.org/10.1007/978-3-030-57717-9_22

Chatti, M. A., Muslim, A., Guliani, M., & Guesmi, M. (2020). The LAVA model: Learning analytics meets visual analytics. In D. Ifenthaler & D. Gibson (Eds.), Adoption of data analytics in higher education learning and teaching (pp. 71–93). Springer. https://doi.org/10.1007/978-3-030-47392-1_5

Chen, B., & Zhu, H. (2019). Towards value-sensitive learning analytics design. In Proceedings of the Ninth International Conference on Learning Analytics and Knowledge (LAK 2019), 4–8 March 2019, Tempe, Arizona, USA (pp. 343–352). ACM. https://doi.org/10.1145/3303772.3303798

Clow, D. (2012). The learning analytics cycle: Closing the loop effectively. In Proceedings of the Second International Conference on Learning Analytics and Knowledge (LAK 2012), 29 April–2 May 2012, Vancouver, British Columbia, Canada (pp. 134–138). ACM. https://doi.org/10.1145/2330601.2330636

Conati, C., Porayska-Pomsta, K., & Mavrikis, M. (2018). AI in education needs interpretable machine learning: Lessons from Open Learner Modelling. arXiv preprint arXiv:1807.00154. https://arxiv.org/abs/1807.00154

Conijn, R., Kahr, P., & Snijders, C. C. (2023). The effects of explanations in automated essay scoring systems on student trust and motivation. Journal of Learning Analytics, 10(1), 37–53. https://doi.org/10.18608/jla.2023.7801

Cramer, H., Evers, V., Ramlal, S., Van Someren, M., Rutledge, L., Stash, N., Aroyo, L., & Wielinga, B. (2008). The effects of transparency on trust in and acceptance of a content-based art recommender. User Modeling and User-Adapted Interaction, 18(5), 455–496. https://doi.org/10.1007/s11257-008-9051-3

Cukurova, M., Zhou, Q., Spikol, D., & Landolfi, L. (2020). Modelling collaborative problem-solving competence with transparent learning analytics: Is video data enough? In Proceedings of the 10th International Conference on Learning Analytics and Knowledge (LAK 2020), 23–27 March 2020, Frankfurt, Germany (pp. 270–275). ACM. https://doi.org/10.1145/3375462.3375484

Davis, F. D. (1989). Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Quarterly, 13(3), 319–340. https://doi.org/10.2307/249008

de Quincey, E., Briggs, C., Kyriacou, T., & Waller, R. (2019). Student centred design of a learning analytics system. In Proceedings of the Ninth International Conference on Learning Analytics and Knowledge (LAK 2019), 4–8 March 2019, Tempe, Arizona, USA (pp. 353–362). ACM. https://doi.org/10.1145/3303772.3303793

Dimitriadis, Y., Martınez-Maldonado, R., & Wiley, K. (2021). Human-centered design principles for actionable learning analytics. In T. Tsiatsos, S. Demetriadis, A. Mikropoulos, & V. Dagdilelis (Eds.), Research on E-learning and ICT in education: Technological, pedagogical and instructional perspectives (pp. 277–296). Springer. https://doi.org/10.1007/978-3-030-64363-8_15

Dollinger, M., Liu, D., Arthars, N., & Lodge, J. M. (2019). Working together in learning analytics towards the co-creation of value. Journal of Learning Analytics, 6(2), 10–26. https://doi.org/10.18608/jla.2019.62.2

Dollinger, M., & Lodge, J. M. (2018). Co-creation strategies for learning analytics. In Proceedings of the Eighth International Conference on Learning Analytics and Knowledge (LAK 2018), 7–9 March 2018, Sydney, Australia (pp. 97–101). ACM. https://doi.org/10.1145/3170358.3170372

Drachsler, H., & Greller, W. (2016). Privacy and analytics: It’s a DELICATE issue a checklist for trusted learning analytics. In Proceedings of the Sixth International Conference on Learning Analytics and Knowledge (LAK 2016), 25–29 April 2016, Edinburgh, Scotland, UK (pp. 89–98). ACM. https://doi.org/10.1145/2883851.2883893

Du, M., Liu, N., & Hu, X. (2020). Techniques for interpretable machine learning. Communications of the ACM, 63(1), 68–77. https://doi.org/10.1145/3359786

Duan, X., Pei, B., Ambrose, G. A., Hershkovitz, A., Cheng, Y., & Wang, C. (2024). Towards transparent and trustworthy prediction of student learning achievement by including instructors as co-designers: A case study. Education and Information Technologies, 29(3), 3075–3096. https://doi.org/10.1007/s10639-023-11954-8

Dudley, J. J., & Kristensson, P. O. (2018). A review of user interface design for interactive machine learning. ACM Transactions on Interactive Intelligent Systems (TiiS), 8(2), 1–37. https://doi.org/10.1145/3185517

Dyckhoff, A. L., Zielke, D., Bultmann, M., Chatti, M. A., & Schroeder, U. (2012). Design and implementation of a learning analytics toolkit for teachers. Journal of Educational Technology & Society, 15(3), 58–76. https://www.jstor.org/stable/jeductechsoci.15.3.58

Elias, T. (2011). Learning analytics: Definitions, processes and potential. https://scispace.com/pdf/learning-analytics-definitions-processes-and-potential-ps57ps7au6.pdf

Gedikli, F., Jannach, D., & Ge, M. (2014). How should I explain? A comparison of different explanation types for recommender systems. International Journal of Human-Computer Studies, 72(4), 367–382. https://doi.org/10.1016/j.ijhcs.2013.12.007

Gedrimiene, E., Celik, I., Mäkitalo, K., & Muukkonen, H. (2023). Transparency and trustworthiness in user intentions to follow career recommendations from a learning analytics tool. Journal of Learning Analytics, 10(1), 54–70. https://doi.org/10.18608/jla.2023.7791

Gefen, D., Karahanna, E., & Straub, D. W. (2003). Trust and TAM in online shopping: An integrated model. MIS Quarterly, 27(1), 51–90. https://doi.org/10.2307/30036519

Hakami, E., & Hernández Leo, D. (2020). How are learning analytics considering the societal values of fairness, accountability, transparency, and human well-being? A literature review. In A. Martinez-Mones, A. Alvarez, M. Caeiro-Rodriguez, & Y. Dimitriadis (Eds.), Learning Analytics Summer Institute Spain 2020: Learning Analytics. Time for Adoption?(LASI-SPAIN 2020), 15–16 June 2020, Valladolid, Spain (pp. 121–141). CEUR Workshop Proceedings. https://doi.org/10.1007/978-3-030-47392-1_2

Hanington, B., & Martin, B. (2019). Universal methods of design expanded and revised: 125 ways to research complex problems, develop innovative ideas, and design effective solutions. Rockport Publishers.

Haythornthwaite, C. (2017). An information policy perspective on learning analytics. In Proceedings of the Seventh International Conference on Learning Analytics and Knowledge (LAK 2017), 13–17 March 2017, Vancouver, British Columbia, Canada (pp. 253–256). ACM. https://doi.org/10.1145/3027385.3027389

He, C., Parra, D., & Verbert, K. (2016). Interactive recommender systems: A survey of the state of the art and future research challenges and opportunities. Expert Systems with Applications, 56, 9–27. https://doi.org/10.1016/j.eswa.2016.02.013

Hellmann, M., Hernandez-Bocanegra, D. C., & Ziegler, J. (2022). Development of an instrument for measuring users’ perception of transparency in recommender systems. In A. Smith-Renner & O. Amir (Eds.), HUMANIZE: Transparency and Explainability in Adaptive Systems through User Modeling Grounded in Psychological Theory: Workshops at the International Conference on Intelligent User Interfaces (IUI 2022), 21–22 March 2022, Helsinki, Finland (virtual) (pp. 156–165). CEUR Workshop Proceedings. https://ceur-ws.org/Vol-3124/paper17.pdf

Hilliger, I., De Laet, T., Henríquez, V., Guerra, J., Ortiz-Rojas, M., Zúñiga, M. Á., Baier, J., & Pérez-Sanagustín, M. (2020). For learners, with learners: Identifying indicators for an academic advising dashboard for students. In C. Alario-Hoyos, M. Rodríguez-Triana, M. Scheffel, I. Arnedillo-Sanchez, & S. Dennerlein (Eds.), Addressing global challenges and quality education. EC-TEL 2020. Lecture notes in computer science (pp. 117–130, Vol. 12315). Springer. https://doi.org/10.1007/978-3-030-57717-9_9

Hilliger, I., Miranda, C., Celis, S., & Pérez-Sanagustín, M. (2024). Curriculum analytics adoption in higher education: A multiple case study engaging stakeholders in different phases of design. British Journal of Educational Technology, 55(3), 785–801. https://doi.org/10.1111/bjet.13374

Hoel, T., Griffiths, D., & Chen, W. (2017). The influence of data protection and privacy frameworks on the design of learning analytics systems. In Proceedings of the Seventh International Conference on Learning Analytics and Knowledge (LAK 2017), 13–17 March 2017, Vancouver, British Columbia, Canada (pp. 243–252). ACM. https://doi.org/10.1145/3027385.3027414

Holstein, K., McLaren, B. M., & Aleven, V. (2019). Co-designing a real-time classroom orchestration tool to support teacher AI complementarity. Journal of Learning Analytics, 6(2), 27–52. https://doi.org/10.18608/jla.2019.62.3

Hosseini, M., Shahri, A., Phalp, K., & Ali, R. (2018). Four reference models for transparency requirements in information systems. Requirements Engineering, 23(2), 251–275. https://doi.org/10.1007/s00766-017-0265-y

Hutchins, N. M., & Biswas, G. (2024). Co-designing teacher support technology for problem-based learning in middle school science. British Journal of Educational Technology, 55(3), 802–822. https://doi.org/10.1111/bjet.13363

Jiang, L., Liu, S., & Chen, C. (2019). Recent research advances on interactive machine learning. Journal of Visualization, 22, 401–417. https://doi.org/10.1007/s12650-018-0531-1

Jin, Y., Tintarev, N., & Verbert, K. (2018). Effects of personal characteristics on music recommender systems with different levels of controllability. In Proceedings of the 12th ACM Conference on Recommender Systems (RecSys 2018), 2 October 2018, Vancouver, British Columbia, Canada (pp. 13–21). ACM. https://doi.org/10.1145/3240323.3240358

Jivet, I., Wong, J., Scheffel, M., Valle Torre, M., Specht, M., & Drachsler, H. (2021). Quantum of choice: How learners’ feedback monitoring decisions, goals and self-regulated learning skills are related. In Proceedings of the 11th International Conference on Learning Analytics and Knowledge (LAK 2021), 12–16 April 2021, Irvine, California, USA (pp. 416–427). ACM. https://doi.org/10.1145/3448139.3448179

Joarder, S., & Chatti, M. A. (2025). The ISC Creator: Human-centered design of learning analytics interactive indicator specification cards. Proceedings of the 17th International Conference on Education Technology and Computers (ICETC 2025), 18–21 September 2025, Barcelona, Spain. https://arxiv.org/abs/2504.07811

Jugovac, M., & Jannach, D. (2017). Interacting with recommenders—overview and research directions. ACM Transactions on Interactive Intelligent Systems (TiiS), 7(3), 1–46. https://doi.org/10.1145/3001837

Kaur, D., Uslu, S., Rittichier, K. J., & Durresi, A. (2022). Trustworthy artificial intelligence: A review. ACM Computing Surveys (CSUR), 55(2), 1–38. https://doi.org/10.1145/3491209

Keim, D. A., Mansmann, F., Schneidewind, J., & Ziegler, H. (2006). Challenges in visual data analysis. In E. Banissi, R. A. Burkhard, A. Ursyn, J. J. Zhang, M. Bannatyne, C. Maple, A. J. Cowell, G. Y. Tian, & M. Hou (Eds.), Proceedings of the Tenth International Conference on Information Visualisation (IV 2006), 5–7 July 2006, London, UK (pp. 9–16). IEEE. https://doi.org/10.1109/iv.2006.31

Khalil, M., Prinsloo, P., & Slade, S. (2023). Fairness, trust, transparency, equity, and responsibility in learning analytics. Journal of Learning Analytics, 10(1), 1–7. https://doi.org/10.18608/jla.2023.7983

Khosravi, H., Buckingham Shum, S., Chen, G., Conati, C., Tsai, Y.-S., Kay, J., Knight, S., Martinez-Maldonado, R., Sadiq, S., & Gasevic, D. (2022). Explainable artificial intelligence in education. Computers and Education: Artificial Intelligence, 3, 100074. https://doi.org/10.1016/j.caeai.2022.100074

Knijnenburg, B. P., Reijmer, N. J., & Willemsen, M. C. (2011). Each to his own: How different users call for different interaction methods in recommender systems. In Proceedings of the Fifth ACM Conference on Recommender Systems (RecSys 2011), 23–27 October 2011, Chicago, Illinois, USA (pp. 141–148). ACM. https://doi.org/10.1145/2043932.2043960

Knijnenburg, B. P., Willemsen, M. C., Gantner, Z., Soncu, H., & Newell, C. (2012). Explaining the user experience of recommender systems. User Modeling and User-Adapted Interaction, 22(4), 441–504. https://doi.org/10.1007/s11257-011-9118-4

Lang, C., & Davis, L. (2023). Learning analytics and stakeholder inclusion: What do we mean when we say “human-centered”? In Proceedings of the 13th International Conference on Learning Analytics and Knowledge (LAK 2023), 13–17 March 2023, Arlington, Texas, USA (pp. 411–417). ACM. https://doi.org/10.1145/3576050.3576110

Lang, C., Macfadyen, L. P., Slade, S., Prinsloo, P., & Sclater, N. (2018). The complexities of developing a personal code of ethics for learning analytics practitioners: Implications for institutions and the field. In Proceedings of the Eighth International Conference on Learning Analytics and Knowledge (LAK 2018), 7–9 March 2018, Sydney, Australia (pp. 436–440). ACM. https://doi.org/10.1145/3170358.3170396

Lawrence, L., Echeverria, V., Yang, K., Aleven, V., & Rummel, N. (2024). How teachers conceptualise shared control with an AI co-orchestration tool: A multiyear teacher-centred design process. British Journal of Educational Technology, 55(3), 823–844. https://doi.org/10.1111/bjet.13372

Li, W., Brooks, C., & Schaub, F. (2019). The impact of student opt-out on educational predictive models. In Proceedings of the Ninth International Conference on Learning Analytics and Knowledge (LAK 2019), 4–8 March 2019, Tempe, Arizona, USA (pp. 411–420). ACM. https://doi.org/10.1145/3303772.3303809

Ma, S., Zhou, T., Nie, F., & Ma, X. (2022). Glancee: An adaptable system for instructors to grasp student learning status in synchronous online classes. In S. Barbosa, C. Lampe, C. Appert, D. A. Shamma, S. Drucker, J. Williamson, & K. Yatani (Eds.), Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems (CHI 2022), 29 April–5 May 2022, New Orleans, Louisiana, USA (pp. 1–25). ACM. https://doi.org/10.1145/3491102.3517482

Martínez-Maldonado, R. (2023). Human-centred learning analytics: Four challenges in realising the potential. Learning Letters, 1, 6. https://doi.org/10.59453/fizj7007

Martínez-Maldonado, R., Pardo, A., Mirriahi, N., Yacef, K., Kay, J., & Clayphan, A. (2015). LATUX: An iterative workflow for designing, validating, and deploying learning analytics visualizations. Journal of Learning Analytics, 2(3), 9–39. https://doi.org/10.18608/jla.2015.23.3

McKnight, D. H., Choudhury, V., & Kacmar, C. (2002). Developing and validating trust measures for e-commerce: An integrative typology. Information Systems Research, 13(3), 334–359. https://doi.org/10.1287/isre.13.3.334.81

Miller, T. (2022). Are we measuring trust correctly in explainability, interpretability, and transparency research? arXiv preprint arXiv:2209.00651. https://arxiv.org/abs/2209.00651

Muslim, A., Chatti, M. A., Mughal, M., & Schroeder, U. (2017). The goal-question-indicator approach for personalized learning analytics. In P. Escudeiro, G. Costagliola, S. Zvacek, J. Uhomoibhi, & B. M. McLaren (Eds.), Proceedings of the Ninth International Conference on Computer Supported Education (CSEDU 2017), 21–23 April 2017, Porto, Portugal (pp. 371–378, Vol. 1). SCITEPRESS Digital Library. https://doi.org/10.5220/0006319803710378

Ngo, T., Kunkel, J., & Ziegler, J. (2020). Exploring mental models for transparent and controllable recommender systems: A qualitative study. In T. Kuflik, I. Torre, R. Burke, & C. Gena (Eds.), Proceedings of the 28th ACM Conference on User Modeling, Adaptation and Personalization (UMAP 2020), 14–17 July 2020, Genoa, Italy (pp. 183–191). ACM. https://doi.org/10.1145/3340631.3394841

Norman, D. (2013). The design of everyday things: Revised and expanded edition. Basic Books.

Nunes, I., & Jannach, D. (2017). A systematic review and taxonomy of explanations in decision support and recommender systems. User Modeling and User-Adapted Interaction, 27(3), 393–444. https://doi.org/10.1007/s11257-017-9195-0

Oliver-Quelennec, K., Bouchet, F., Carron, T., Fronton Casalino, K., & Pinc¸ on, C. (2022). Adapting learning analytics dashboards by and for university students. In I. Hilliger, P. Mu ˜noz-Merino, T. De Laet, A. Ortega-Arranz, & T. Farrell (Eds.), Educating for a new future: Making sense of technology-enhanced learning adoption. EC-TEL 2022. Lecture notes in computer science (pp. 299–309, Vol. 13450). Springer. https://doi.org/10.1007/978-3-031-16290-9_22

Ooge, J., Dereu, L., & Verbert, K. (2023). Steering recommendations and visualising its impact: Effects on adolescents’ trust in e-learning platforms. In Proceedings of the 28th International Conference on Intelligent User Interfaces (IUI 2023), 27-31 March 2023, Sydney, Australia (pp. 156–170). ACM. https://doi.org/10.1145/3581641.3584046

Pardo, A., & Siemens, G. (2014). Ethical and privacy principles for learning analytics. British Journal of Educational Technology, 45(3), 438–450. https://doi.org/10.1111/bjet.12152

Pozdniakov, S., Martinez-Maldonado, R., Tsai, Y.- S., Cukurova, M., Bartindale, T., Chen, P., Marshall, H., Richardson, D., & Gasevic, D. (2022). The question-driven dashboard: How can we design analytics interfaces aligned to teachers’ inquiry? In Proceedings of the 12th International Conference on Learning Analytics and Knowledge (LAK 2022), 21–25 March 2022, online (pp. 175–185). ACM. https://doi.org/10.1145/3506860.3506885

Prieto-Álvarez, C. G., Martínez-Maldonado, R., & Anderson, T. D. (2018). Co-designing learning analytics tools with learners. In J. Lodge, J. Horvath, & L. Corrin (Eds.), Learning analytics in the classroom (pp. 93–110). Routledge. https://doi.org/10.4324/9781351113038-7

Prinsloo, P., & Slade, S. (2013). An evaluation of policy frameworks for addressing ethical considerations in learning analytics. In Proceedings of the Third International Conference on Learning Analytics and Knowledge (LAK 2013), 8–13 April 2013, Leuven, Belgium (pp. 240–244). ACM. https://doi.org/10.1145/2460296.2460344

Prinsloo, P., & Slade, S. (2015). Student privacy self-management: Implications for learning analytics. In Proceedings of the Fifth International Conference on Learning Analytics and Knowledge (LAK 2015), 16–20 April 2015, Poughkeepsie, New York, USA (pp. 83–92). ACM. https://doi.org/10.1145/2723576.2723585

Pu, P., Chen, L., & Hu, R. (2011). A user-centric evaluation framework for recommender systems. In Proceedings of the Fifth ACM Conference on Recommender Systems (RecSys 2011), 23–27 October 2011, Chicago, Illinois, USA (pp. 157–164). ACM. https://doi.org/10.1145/2043932.2043962

Rehrey, G., Shepard, L., Hostetter, C., Reynolds, A., & Groth, D. (2019). Engaging faculty in learning analytics: Agents of institutional culture change. Journal of Learning Analytics, 6(2), 86–94. https://doi.org/10.18608/jla.2019.62.6

Roberts, L. D., Howell, J. A., & Seaman, K. (2017). Give me a customizable dashboard: Personalized learning analytics dashboards in higher education. Technology, Knowledge and Learning, 22, 317–333. https://doi.org/10.1007/s10758-017-9316-1

Sarmiento, J. P., Campos, F., & Wise, A. (2020). Engaging students as co-designers of learning analytics. In Companion Proceedings of the 10th International Conference on Learning Analytics and Knowledge (LAK 2020), 23–27 March 2020, Frankfurt, Germany (pp. 29–32). ACM. https://www.researchgate.net/publication/341030912_Engaging_Students_as_Co_Designers_of_Learning_Analytics

Sarmiento, J. P., & Wise, A. F. (2022). Participatory and co-design of learning analytics: An initial review of the literature. In Proceedings of the 12th International Conference on Learning Analytics and Knowledge (LAK 2022), 21–25 March 2022, online (pp. 535–541). ACM. https://doi.org/10.1145/3506860.3506910

Scheffel, M., Drachsler, H., & Specht, M. (2015). Developing an evaluation framework of quality indicators for learning analytics. In Proceedings of the Fifth International Conference on Learning Analytics and Knowledge (LAK 2015), 16–20 April 2015, Poughkeepsie, New York, USA (pp. 16–20). ACM. https://doi.org/10.1145/2723576.2723629

Shibani, A., Knight, S., & Buckingham Shum, S. (2019). Contextualizable learning analytics design: A generic model and writing analytics evaluations. In Proceedings of the Ninth International Conference on Learning Analytics and Knowledge (LAK 2019), 4–8 March 2019, Tempe, Arizona, USA (pp. 210–219). ACM. https://doi.org/10.1145/3303772.3303785

Shibani, A., Knight, S., & Buckingham Shum, S. (2022). Questioning learning analytics? Cultivating critical engagement as student automated feedback literacy. In Proceedings of the 12th International Conference on Learning Analytics and Knowledge (LAK 2022), 21–25 March 2022, online (pp. 326–335). ACM. https://doi.org/10.1145/3506860.3506912

Shneiderman, B. (2020). Bridging the gap between ethics and practice: guidelines for reliable, safe, and trustworthy human-centered AI systems. ACM Transactions on Interactive Intelligent Systems (TiiS), 10(4), 1–31. https://doi.org/10.1145/3419764

Shneiderman, B. (2022). Human-centered AI. Oxford University Press. https://doi.org/10.1093/oso/9780192845290.001.0001

Shreiner, T. L., & Guzdial, M. (2022). The information won’t just sink in: Helping teachers provide technology-assisted data literacy instruction in social studies. British Journal of Educational Technology, 53(5), 1134–1158. https://doi.org/10.1111/bjet.13255

Shute, V. J., Smith, G., Kuba, R., Dai, C.- P., Rahimi, S., Liu, Z., & Almond, R. (2021). The design, development, and testing of learning supports for the physics playground game. International Journal of Artificial Intelligence in Education, 31(3), 357–379. https://doi.org/10.1007/s40593-020-00196-1

Siepmann, C., & Chatti, M. A. (2023). Trust and transparency in recommender systems. In Human-Centred Explainable AI Workshop at the 2023 CHI Conference on Human Factors in Computing Systems (CHI 2023), 23–28 April 2023, Hamburg, Germany. https://arxiv.org/abs/2304.08094

Slade, S., & Prinsloo, P. (2013). Learning analytics: Ethical issues and dilemmas. American Behavioral Scientist, 57(10), 1510–1529. https://doi.org/10.1177/0002764213479366

Slade, S., Prinsloo, P., & Khalil, M. (2019). Learning analytics at the intersections of student trust, disclosure and benefit. In Proceedings of the Ninth International Conference on Learning Analytics and Knowledge (LAK 2019), 4–8 March 2019, Tempe, Arizona, USA (pp. 235–244). ACM. https://doi.org/10.1145/3303772.3303796

Spinner, T., Schlegel, U., Schafer, H., & El-Assady, M. (2019). explAIner: A visual analytics framework for interactive and explainable machine learning. IEEE Transactions on Visualization and Computer Graphics, 26(1), 1064–1074. https://doi.org/10.1109/tvcg.2019.2934629

Sundar, S. S. (2020). Rise of machine agency: A framework for studying the psychology of human–AI interaction (HAII). Journal of Computer-Mediated Communication, 25(1), 74–88. https://doi.org/10.1093/jcmc/zmz026

Swenson, J. (2014). Establishing an ethical literacy for learning analytics. In Proceedings of the Fourth International Conference on Learning Analytics and Knowledge (LAK 2014), 24–28 March 2014, Indianapolis, Indiana, USA (pp. 246–250). ACM. https://doi.org/10.1145/2567574.2567613

Teasley, S. D. (2017). Student facing dashboards: One size fits all? Technology, Knowledge and Learning, 22(3), 377–384. https://doi.org/10.1007/s10758-017-9314-3

Tintarev, N., & Masthoff, J. (2015). Explaining recommendations: Design and evaluation. In F. Ricci, L. Rokach, & B. Shapira (Eds.), Recommender systems handbook (pp. 353–382). Springer. https://doi.org/10.1007/978-1-4899-7637-6_10

Topali, P., Ortega-Arranz, A., Rodríguez-Triana, M. J., Er, E., Khalil, M., & Akçapınar, G. (2025). Designing human-centered learning analytics and artificial intelligence in education solutions: A systematic literature review. Behaviour & Information Technology, 44(5), 1071–1098. https://doi.org/10.1080/0144929x.2024.2345295

Tsai, C. -H., & Brusilovsky, P. (2017). Providing control and transparency in a social recommender system for academic conferences. In Proceedings of the 25th Conference on User Modeling, Adaptation and Personalization (UMAP 2017), 9–12 July 2017, Bratislava, Slovakia (pp. 313–317). ACM. https://doi.org/10.1145/3079628.3079701

Tsai, C.-H., & Brusilovsky, P. (2021). The effects of controllability and explainability in a social recommender system. User Modeling and User-Adapted Interaction, 31(3), 591–627. https://doi.org/10.1007/s11257-020-09281-5

Tsai, Y.-S., & Gašević, D. (2017). Learning analytics in higher education—challenges and policies: A review of eight learning analytics policies. In Proceedings of the Seventh International Conference on Learning Analytics and Knowledge (LAK 2017), 13–17 March 2017, Vancouver, British Columbia, Canada (pp. 233–242). ACM. https://doi.org/10.1145/3027385.3027400

Tsai, Y.-S., Perrotta, C., & Gašević, D. (2020). Empowering learners with personalised learning approaches? Agency, equity and transparency in the context of learning analytics. Assessment & Evaluation in Higher Education, 45(4), 554–567. https://doi.org/10.1080/02602938.2019.1676396

Tsai, Y.-S., Whitelock-Wainwright, A., & Gašević, D. (2020). The privacy paradox and its implications for learning analytics. In Proceedings of the 10th International Conference on Learning Analytics and Knowledge (LAK 2020), 23–27 March 2020, Frankfurt, Germany (pp. 230–239). ACM. https://doi.org/10.1145/3375462.3375536

Usmani, U. A., Happonen, A., & Watada, J. (2023). Human-centered artificial intelligence: Designing for user empowerment and ethical considerations. In Proceedings of the Fifth International Congress on Human-Computer Interaction, Optimization and Robotic Applications (HORA 2023), 8–10 June 2023, Istanbul, Türkiye. IEEE. https://doi.org/10.1109/hora58378.2023.10156761

Venkatesh, V., Morris, M. G., Davis, G. B., & Davis, F. D. (2003). User acceptance of information technology: Toward a unified view. MIS Quarterly, 27(3), 425–478. https://doi.org/10.2307/30036540

Verbert, K., De Laet, T., Millecamp, M., Broos, T., Chatti, M. A., & Muslim, A. (2020). XLA: Explainable Learning Analytics. In Companion Proceedings of the 10th International Conference on Learning Analytics and Knowledge (LAK 2020), 23–27 March 2020, Frankfurt, Germany (pp. 477–479). ACM. https://lirias.kuleuven.be/retrieve/606655

Verbert, K., Duval, E., Klerkx, J., Govaerts, S., & Santos, J. L. (2013). Learning analytics dashboard applications. American Behavioral Scientist, 57(10), 1500–1509. https://doi.org/10.1177/0002764213479363

Verbert, K., Parra, D., Brusilovsky, P., & Duval, E. (2013). Visualizing recommendations to support exploration, transparency and controllability. In Proceedings of the 2013 International Conference on Intelligent User Interfaces (IUI 2013), 19–22 March 2013, Santa Monica, California, USA (pp. 351–362). ACM. https://doi.org/10.1145/2449396.2449442

Viberg, O., Jivet, I., & Scheffel, M. (2023). Designing culturally aware learning analytics: A value sensitive perspective. In O. Viberg & A. Grönlund (Eds.), Practicable learning analytics (pp. 177–192). Springer. https://doi.org/10.1007/978-3-031-27646-0_10

West, D., Luzeckyj, A., Toohey, D., Vanderlelie, J., & Searle, B. (2020). Do academics and university administrators really know better? The ethics of positioning student perspectives in learning analytics. Australasian Journal of Educational Technology, 36(2), 60–70. https://doi.org/10.14742/ajet.4653

Whitelock-Wainwright, A., Gašević, D., & Tejeiro, R. (2017). What do students want? Towards an instrument for students’ evaluation of quality of learning analytics services. In Proceedings of the Seventh International Conference on Learning Analytics and Knowledge (LAK 2017), 13–17 March 2017, Vancouver, British Columbia, Canada (pp. 368–372). ACM. https://doi.org/10.1145/3027385.3027419

Wiley, K., Dimitriadis, Y., & Linn, M. (2024). A human-centred learning analytics approach for developing contextually scalable K-12 teacher dashboards. British Journal of Educational Technology, 55(3), 845–885. https://doi.org/10.1111/bjet.13383

Wilson, J., Huang, Y., Palermo, C., Beard, G., & MacArthur, C. A. (2021). Automated feedback and automated scoring in the elementary grades: Usage, attitudes, and associations with writing outcomes in a districtwide implementation of MI Write. International Journal of Artificial Intelligence in Education, 31(2), 234–276. https://doi.org/10.1007/s40593-020-00236-w

Yang, F., Huang, Z., Scholtz, J., & Arendt, D. L. (2020). How do visual explanations foster end users’ appropriate trust in machine learning? In Proceedings of the 25th International Conference on Intelligent User Interfaces (IUI 2020), 17–20 March 2020, Cagliari, Italy (pp. 189–201). ACM. https://doi.org/10.1145/3377325.3377480

Zhao, R., Benbasat, I., & Cavusoglu, H. (2019). Do users always want to know more? Investigating the relationship between system transparency and users’ trust in advice-giving systems. In Proceedings of the 27th European Conference on Information Systems (ECIS 2019), 8–14 June 2019, Stockholm, Sweden. AIS eLibrary. https://aisel.aisnet.org/ecis2019rip/42/

Published

2026-03-22

How to Cite

Joarder, S., & Chatti, M. A. (2026). Human-Centred Development of Indicators for Self-Service Learning Analytics: A Transparency through Exploration Approach. Journal of Learning Analytics, 13(1), 163–189. https://doi.org/10.18608/jla.2026.8921