Formative Assessment and Learning Analytics.

Dirk T. Tempelaar, André Heck, Hans Cuypers, Henk van der Kooij, Evert van de Vrie. SURF project

Abstract
Learning analytics seeks to enhance the learning process through systematic measurement of learning-related data and by informing learners and teachers of the results of these measurements, so as to support the regulation of the learning process. Learning analytics draws on various sources of information, two main types being intentional data and learner-activity-related metadata [1]. This contribution aims to provide a practical application of Shum and Crick’s theoretical framework [1] of a learning analytics infrastructure that combines learning dispositions data with data extracted from computer-based, formative assessments. The latter data component is derived from one of the educational projects of ONBETWIST, part of the SURF program ‘Testing and Test Driven Learning’.

Conclusions
Intensive use of practice-test environments makes a major difference to academic performance. But in a student-centered curriculum it is not sufficient that teachers are convinced of the benefits of test-based learning in digital learning environments. Students regulate their own learning process and make their own choices about how intensively to practice; they are therefore the ones who need to become convinced of the usefulness of these digital tools.
Here learning analytics can play an important role: it provides a wealth of information that students can use to adapt their personal learning environment to their own strengths and weaknesses.
For example, in our experiment students were informed about their personal learning dispositions, attitudes, and values, together with information on how these dispositions interact with the choices they can make in composing their learning blend. At the same time, the wealth of information available from learning analytics is also its problem: that information requires individual processing. Some information matters more for one student than for another, so a personal selection of information has to take place. Learning analytics deployed within a system of student-centered education thus has its own challenges.

The aim of this contribution extends beyond demonstrating the practical importance of Shum and Crick’s learning analytics infrastructure. This research also provides many clues as to what individualized information feedback could look like. In the learning blend described in this case study, the face-to-face component, problem-based learning (PBL), constitutes the main instructional method. The digital component is intended as a supplementary learning tool, primarily for students for whom the transition from secondary to university education entails above-average hurdles.
Some of these problems are cognitive in nature: for example, international students who never received statistics education as part of their high-school mathematics program, or other first-year students who were taught certain topics without achieving the required proficiency levels. For these kinds of cognitive deficiencies, the digital test-directed environments proved to be an effective supplement to PBL.

This applies not only to adjustment problems resulting from gaps in prior knowledge, however. Students encounter several other types of adjustment problems for which the digital tools appear to be helpful. The learning dispositions addressed above are a good example: student-centered education in fact presupposes deep, self-regulated learning, yet many students have little experience with it and feel more at home with step-wise, externally regulated learning.
As the analyses demonstrate, the digital test environments help in this transformation. They also make clear that the test environments are instrumental for students with non-adaptive cognitions about learning mathematics and statistics, such as anxiety. This outcome is intuitive: for some students, individual practice sessions with computerized feedback constitute a safer learning environment than the PBL tutorial group sessions.

Finally, the learning analytics outcomes also make clear where the limits of digital practice lie: with students who exhibit non-adaptive learning behaviors and negative learning emotions. If learning induces boredom and provokes self-handicapping, even the challenges of test-based learning will fall short.