Making Sense of Sensemaking: Understanding How K–12 Teachers and Coaches React to Visual Analytics
Keywords: Learning Analytics, Sensemaking, Dashboard Design, Visualization, Human–Computer Interaction
With the spread of learning analytics (LA) dashboards in K–12 schools, educators are increasingly expected to make sense of data to inform their instruction. However, many features of school settings, such as the specialized vantage points of educators, can lead to different ways of looking at data. This observation motivates the need to carefully observe and account for how data sensemaking occurs and how it may differ across K–12 professional roles. Our mixed-methods study reports on interviews and think-aloud sessions with middle-school mathematics teachers and instructional coaches from four school districts in the United States. By exposing educators to an LA dashboard, we map their varied reactions to visual data and identify prevalent sensemaking patterns. We find that emotional, analytical, and intentional responses inform educators' sensemaking, and that different roles within a school afford unique vantage points on data. Based on these findings, we offer a typology for representing sensemaking in K–12 school contexts and reflect on how visual LA process models might be expanded.