Detecting Learning Strategies with Analytics: Links with Self-reported Measures and Academic Performance


  • Dragan Gašević The University of Edinburgh
  • Jelena Jovanović University of Belgrade
  • Abelardo Pardo The University of Sydney
  • Shane Dawson University of South Australia



Keywords: learning analytics, learning strategy, approaches to learning, self-reported measures


The use of analytic methods for extracting learning strategies from trace data has attracted considerable attention in the literature. However, there is a paucity of research examining associations between learning strategies extracted from trace data, responses to well-established self-report instruments, and performance scores. This paper focuses on the link between the learning strategies identified in trace data and students' self-reported approaches to learning. It reports the findings of a study conducted in an undergraduate engineering course (N=144) that followed a flipped classroom design. The study found that learning strategies extracted from trace data can be interpreted in terms of deep and surface approaches to learning. Significant links with self-report measures were detected, with small effect sizes, for both the overall deep approach to learning scale and the deep strategy scale. However, no significant links were observed for the surface approach to learning and surface strategy scales, nor were there significant associations with the motivation scales of approaches to learning. Significant effects on academic performance were found, consistent with the literature based on self-report instruments: students who followed a deep approach to learning achieved significantly higher performance.
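The reference list points to TraMineR and Levenshtein distance, which suggests a sequence-analysis treatment of trace data. As a minimal sketch only, the snippet below shows how per-student action sequences might be compared by edit distance to build the pairwise distance matrix that clustering-based strategy detection typically starts from. The session strings and action codes are hypothetical, not the study's actual data.

```python
# Minimal sketch: comparing students' trace sequences by edit distance,
# in the spirit of sequence-mining tools such as TraMineR.
# Each character is one hypothetical learning action,
# e.g. R = reading, V = video, Q = quiz.

def levenshtein(a: str, b: str) -> int:
    """Edit distance (Levenshtein, 1966) via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        prev = curr
    return prev[-1]

# Hypothetical per-student traces: two sequences that interleave reading
# with quizzes, and one that consists of quiz attempts only.
traces = {"s1": "RVRQQ", "s2": "RVQRQ", "s3": "QQQQQ"}

# Pairwise distance matrix: the usual input to hierarchical clustering,
# from which strategy groups would then be derived.
names = sorted(traces)
dist = {(x, y): levenshtein(traces[x], traces[y]) for x in names for y in names}
```

In practice the clustering step (e.g. agglomerative clustering over this matrix) and the choice of alignment costs would follow the sequence-analysis literature cited above; this fragment only illustrates the distance computation.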


References

Azevedo, R. (2015). Defining and measuring engagement and learning in science: Conceptual, theoretical, methodological, and analytical issues. Educational Psychologist, 50(1), 84–94.

Beheshitha, S. S., Hatala, M., Gašević, D., & Joksimović, S. (2016). The role of achievement goal orientations when studying effect of learning analytics visualizations. Proceedings of the 6th International Conference on Learning Analytics and Knowledge (LAK ʼ16), 25–29 April 2016, Edinburgh, UK (pp. 54–63). New York: ACM.

Biggs, J. (1987). Student approaches to learning and studying (Research Monograph). Australian Council for Educational Research.

Biggs, J., Kember, D., & Leung, D. Y. P. (2001). The revised two-factor Study Process Questionnaire: R-SPQ-2F. British Journal of Educational Psychology, 71(1), 133–149.

Bjork, E. L., & Bjork, R. A. (2011). Making things hard on yourself, but in a good way: Creating desirable difficulties to enhance learning. In M. A. Gernsbacher, R. W. Pew, L. M. Hough, & J. R. Pomerantz (Eds.), Psychology and the real world: Essays illustrating fundamental contributions to society (pp. 56–64).

Bliuc, A.-M., Ellis, R. A., Goodyear, P., & Piggott, L. (2010). Learning through face-to-face and online discussions: Associations between students’ conceptions, approaches and academic performance in political science. British Journal of Educational Technology, 41(3), 512–524.

Colvin, C., Rogers, T., Wade, A., Dawson, S., Gašević, D., Buckingham Shum, S., … Fisher, J. (2015). Student retention and learning analytics: A snapshot of Australian practices and a framework for advancement (Research Report). Canberra, Australia: Office of Learning and Teaching, Australian Government.

Dawson, S., Drachsler, H., Rosé, C. P., Gašević, D., & Lynch, G. (Eds.). (2016). Proceedings of the 6th International Conference on Learning Analytics and Knowledge (LAK ʼ16), 25–29 April 2016, Edinburgh, UK. New York: ACM.

Del Valle, R., & Duffy, T. M. (2009). Online learning: Learner characteristics and their approaches to managing learning. Instructional Science, 37(2), 129–149.

Elliot, A. J., & McGregor, H. A. (2001). A 2 X 2 achievement goal framework. Journal of Personality and Social Psychology, 80, 501–519.

Ellis, R. A., Goodyear, P., Calvo, R. A., & Prosser, M. (2008). Engineering students’ conceptions of and approaches to learning through discussions in face-to-face and online contexts. Learning and Instruction, 18(3), 267–282.

Ellis, R. A., Marcus, G., & Taylor, R. (2005). Learning through inquiry: Student difficulties with online course-based material. Journal of Computer Assisted Learning, 21(4), 239–252.

Entwistle, N. J. (2009). Teaching for understanding at university: Deep approaches and distinctive ways of thinking. Basingstoke, UK: Palgrave Macmillan.

Freeman, S., Eddy, S. L., McDonough, M., Smith, M. K., Okoroafor, N., Jordt, H., & Wenderoth, M. P. (2014). Active learning increases student performance in science, engineering, and mathematics. Proceedings of the National Academy of Sciences, 111(23), 8410–8415.

Gabadinho, A., Ritschard, G., Mueller, N. S., & Studer, M. (2011). Analyzing and visualizing state sequences in R with TraMineR. Journal of Statistical Software, 40(4), 1–37.

Gašević, D., Dawson, S., Rogers, T., & Gašević, D. (2016). Learning analytics should not promote one size fits all: The effects of instructional conditions in predicting learning success. The Internet and Higher Education, 28, 68–84.

Gašević, D., Dawson, S., & Siemens, G. (2015). Let’s not forget: Learning analytics are about learning. TechTrends, 59(1), 64–71.

Khan, I., & Pardo, A. (2016). Data2U: Scalable real time student feedback in active learning environments. Proceedings of the 6th International Conference on Learning Analytics and Knowledge (LAK ʼ16), 25–29 April 2016, Edinburgh, UK (pp. 249–253). New York: ACM.

Kovanović, V., Gašević, D., Joksimović, S., Hatala, M., & Adesope, O. (2015). Analytics of communities of inquiry: Effects of learning technology use on cognitive presence in asynchronous online discussions. The Internet and Higher Education, 27, 74–89.

Levenshtein, V. I. (1966). Binary codes capable of correcting deletions, insertions and reversals. Soviet Physics Doklady, 10, 707–710.

Lodge, J., & Lewis, M. (2012). Pigeon pecks and mouse clicks: Putting the learning back into learning analytics. Proceedings of the 29th Annual Conference of the Australasian Society for Computers in Learning in Tertiary Education (ASCILITE 2012), 25–28 November 2012, Wellington, New Zealand (pp. 560–564). Australasian Society for Computers in Learning in Tertiary Education.

Lust, G., Elen, J., & Clarebout, G. (2013a). Regulation of tool-use within a blended course: Student differences and performance effects. Computers & Education, 60(1), 385–395.

Lust, G., Elen, J., & Clarebout, G. (2013b). Students’ tool-use within a web enhanced course: Explanatory mechanisms of students’ tool-use pattern. Computers in Human Behavior, 29(5), 2013–2021.

Lust, G., Vandewaetere, M., Ceulemans, E., Elen, J., & Clarebout, G. (2011). Tool-use in a blended undergraduate course: In Search of user profiles. Computers & Education, 57(3), 2135–2144.

Martin, A. J., Papworth, B., Ginns, P., Malmberg, L.-E., Collie, R. J., & Calvo, R. A. (2015). Real-time motivation and engagement during a month at school: Every moment of every day for every student matters. Learning and Individual Differences, 38, 26–35.

Miyamoto, Y. R., Coleman, C., Williams, J. J., Whitehill, J., Nesterko, S., & Reich, J. (2015). Beyond time-on-task: The relationship between spaced study and certification in MOOCs. Journal of Learning Analytics, 2(2), 47–69.

O’Flaherty, J., Phillips, C., Karanicolas, S., Snelling, C., & Winning, T. (2015). The use of flipped classrooms in higher education: A scoping review. The Internet and Higher Education, 25, 85–95.

Pardo, A., Ellis, R. A., & Calvo, R. A. (2015). Combining observational and experiential data to inform the redesign of learning activities. Proceedings of the 5th International Conference on Learning Analytics and Knowledge (LAK ʼ15), 16–20 March 2015, Poughkeepsie, NY, USA (pp. 305–309). New York: ACM.

Pardo, A., Han, F., & Ellis, R. A. (2016). Exploring the relation between self-regulation, online activities, and academic performance: A case study. Proceedings of the 6th International Conference on Learning Analytics and Knowledge (LAK ʼ16), 25–29 April 2016, Edinburgh, UK (pp. 422–429). New York: ACM.

Pardo, A., & Mirriahi, N. (in press). Design, deployment and evaluation of a flipped learning first year engineering course. In C. Reidsema, L. Kavanagh, R. Hadgraft, & N. Smith (Eds.), Flipping the Classroom: Practice and Practices. Singapore: Springer.

Rogers, T., Gašević, D., & Dawson, S. (2016). Learning analytics and the imperative for theory driven research. In C. Haythornthwaite, R. Andrews, J. Fransman, & E. Meyers (Eds.), The SAGE Handbook of E-Learning Research, 2nd ed. (pp. 232–250). London, UK: SAGE Publications.

Siemens, G., & Gasevic, D. (2012). Guest editorial: Learning and knowledge analytics. Educational Technology & Society, 15(3), 1–2.

Svihla, V., Wester, M. J., & Linn, M. C. (2015). Distributed revisiting: An analytic for retention of coherent science learning. Journal of Learning Analytics, 2(2), 75–101.

Trigwell, K., & Prosser, M. (1991). Improving the quality of student learning: The influence of learning context and student approaches to learning on learning outcomes. Higher Education, 22(3), 251–266.

Trigwell, K., Prosser, M., & Waterhouse, F. (1999). Relations between teachers’ approaches to teaching and students’ approaches to learning. Higher Education, 37(1), 57–70.

van Leeuwen, A. (2015). Learning analytics to support teachers during synchronous CSCL: Balancing between overview and overload. Journal of Learning Analytics, 2(2), 138–162.

Weinstein, C. E., Husman, J., & Dierking, D. R. (2000). Self-regulation interventions with a focus on learning strategies. In P. R. Pintrich & M. Boekaerts (Eds.), Handbook on self-regulation (pp. 727–747). New York: Academic Press.

Winne, P. H., & Jamieson-Noel, D. (2002). Exploring students’ calibration of self reports about study tactics and achievement. Contemporary Educational Psychology, 27(4), 551–572.

Winne, P. H., & Jamieson-Noel, D. (2003). Self-regulating studying by objectives for learning: Students’ reports compared to a model. Contemporary Educational Psychology, 28(3), 259–276.

Wise, A. F. (2014). Designing pedagogical interventions to support student use of learning analytics. Proceedings of the 4th International Conference on Learning Analytics and Knowledge (LAK ʼ14), 24–28 March 2014, Indianapolis, IN, USA (pp. 203–211). New York: ACM.

Wise, A. F., & Shaffer, D. W. (2015). Why theory matters more than ever in the age of big data. Journal of Learning Analytics, 2(2), 5–13.

Wise, A. F., Speer, J., Marbouti, F., & Hsiao, Y.-T. (2013). Broadening the notion of participation in online discussions: Examining patterns in learners’ online listening behaviors. Instructional Science, 41(2), 323–343.

Zhou, M., & Winne, P. H. (2012). Modeling academic achievement by self-reported versus traced goal orientation. Learning and Instruction, 22(6), 413–419.

Zimmerman, B. J. (2000). Self-efficacy: An essential motive to learn. Contemporary Educational Psychology, 25(1), 82–91.




How to Cite

Gasevic, D., Jovanovic, J., Pardo, A., & Dawson, S. (2017). Detecting Learning Strategies with Analytics: Links with Self-reported Measures and Academic Performance. Journal of Learning Analytics, 4(2), 113–128.



Special section: Shape of Educational Data
