Learning analytics will play a key role in determining the effectiveness of the courses produced as part of our curriculum development. EDSA will analyse the resulting data to gain insights into how to improve module outlines, content, and delivery methods. Analytics will be produced across a range of audience groups to ensure that the EDSA materials cater for the needs of data science professionals and students regardless of language, industry sector, country or gender.
Learning analytics will be used throughout the project to gauge engagement with the learning materials we produce. This will help us ensure that the content is correct, that the delivery method is suitable, and that the materials are appropriate for the broad range of audience groups we are targeting in the project.
The EDSA project has a duration of three years and is coordinated by Prof John Domingue (The Open University). John was recently featured in a BBC News article entitled “OU students’ progress to be monitored by software”, reflecting the consortium’s expertise in innovating and leading in this area.
Learning analytics will make it possible to obtain valuable information about how learners interact with the EDSA courseware, complementing the judgements they provide via questionnaires. Our approach is based on tracking learner activities, each of which is an interaction between a subject (the learner) and an object (a learning activity), linked by a verb (the action performed). To express these learner activities we will use the Experience API (xAPI, formerly known as the Tin Can API), and to store and visualise them we will use Learning Locker, an open-source Learning Record Store (LRS).
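To illustrate the subject–verb–object structure described above, the sketch below builds a minimal xAPI statement in Python. The actor, verb and activity identifiers are illustrative placeholders, not actual EDSA learners or endpoints; only the three-part statement shape follows the xAPI specification.

```python
import json

# A minimal xAPI statement: a subject (actor) performs an action (verb)
# on a learning activity (object). All identifiers are illustrative.
statement = {
    "actor": {
        "objectType": "Agent",
        "name": "Example Learner",
        "mbox": "mailto:learner@example.org",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "objectType": "Activity",
        "id": "http://example.org/edsa/modules/process-mining/week-1",
        "definition": {"name": {"en-US": "Process Mining, Week 1"}},
    },
}

# Serialised as JSON, such a statement can be sent over HTTP to an LRS
# (e.g. Learning Locker) for storage and later visualisation.
payload = json.dumps(statement)
print(payload[:50])
```

An LRS accepts such statements over its REST interface, so any EDSA course platform that can issue HTTP requests can contribute learner activity records without a bespoke integration.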
Case Study: The EDSA Process Mining MOOC
Recently, TU/e applied process mining techniques to recorded student behaviour in the Process Mining MOOC on Coursera, in order to perform a preliminary analysis of the learning process within the MOOC. The MOOC itself is set up in a structured way: there are six weeks of content, and for each content week a quiz must be completed, with a deadline at least two weeks after that week became available. At the end of the course there is also a final quiz. Students aiming for a certificate with distinction must additionally complete a quiz, a peer assessment and a peer review.
When we analysed the registration behaviour, we noticed a peak in subscriptions around the opening time, especially among the students that would go on to earn a certificate. We then compared three groups: students that did not get a certificate, those that got a normal certificate and those that got a distinction certificate. Each group was further split into the non-signature track and the signature track, i.e. those that did not and those that did pay for their certificate, the latter implying higher motivation and participation rates. The behaviour of the successful students was clearly timelier than that of the ‘failed’ students, and the signature-track students were timelier still.
We then created the expected learning model, in which students study the weeks sequentially, and verified how well this model fits each of the groups. Although we expected the successful signature-track students to fit this model best, it turned out that the students that failed the course but watched all weeks studied the most sequentially. Further investigation is needed to discover the cause of this difference.
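The idea of checking how well observed behaviour fits a sequential model can be sketched with a crude order-concordance measure: the fraction of consecutive events in a learner's trace that respect the expected week order. This is a simplified stand-in for proper conformance-checking fitness as used in process mining, and the traces below are invented for illustration, not real MOOC data.

```python
def sequential_fitness(trace, expected_order):
    """Fraction of consecutive event pairs in `trace` that respect the
    expected sequential order -- a simplified proxy for conformance-
    checking fitness against a strictly sequential process model."""
    rank = {event: i for i, event in enumerate(expected_order)}
    pairs = list(zip(trace, trace[1:]))
    if not pairs:
        return 1.0
    in_order = sum(1 for a, b in pairs if rank[a] <= rank[b])
    return in_order / len(pairs)

# Expected model: study the six content weeks strictly in order.
expected = [f"week{i}" for i in range(1, 7)]

# Illustrative traces: one learner studies sequentially, another
# jumps back and forth between weeks.
sequential_learner = ["week1", "week2", "week3", "week4", "week5", "week6"]
jumping_learner = ["week1", "week3", "week2", "week5", "week4", "week6"]

print(sequential_fitness(sequential_learner, expected))  # 1.0
print(sequential_fitness(jumping_learner, expected))     # 0.6
```

A real analysis would replay full event logs against a process model (e.g. with token-based replay or alignments), but even this toy metric shows how groups of learners can be ranked by how sequentially they studied.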