SoLAR Flare UK – Retention discussion

Notes from a small group discussion at #flareUK about learning analytics and retention. Written up on an informal Chatham House Rule basis – not everyone in the group agrees with all of this, but here it is!

[Image: The Lonely Church]

Retention and success are distinct, but linked; it depends on the definition. There's a question of institutional survival versus driving retention for individuals, where it might sometimes be better for them individually to drop out.

Retention is the killer app for learning analytics – for each course, and for the whole qualification.

There's a role for analytics to improve the match between students and courses – analytics happening pre-registration. Very difficult to do! Not everyone includes this in learning analytics – it has implications for learning, but isn't learning. Feeder institutions: understanding the patterns of achievement by those students. In the UK we have access to national datasets.

Some institutions are aiming for quick drop-out – better for both students and the university for the drop-out to happen sooner rather than later. An OU project modelled this based on demographics, aiming to intervene with a simple contact by the tutor for at-risk students. It did make a substantial shift from passive to active withdrawal. But which students had received an intervention wasn't captured, so it was hard to understand how effective it was. Another issue: student withdrawal really happened in the first semester, but was only captured later – the delay arising precisely because those students weren't engaging with the systems.
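As a sketch of the pattern (not the OU's actual model – the fields, weights and threshold here are all invented for illustration), the key design point is to log the intervention itself, which is the step that made the OU project hard to evaluate:

```python
# Minimal sketch: score students for risk from demographic data, and --
# crucially -- record which of them actually received the tutor contact,
# so the intervention's effect can be evaluated later.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Student:
    student_id: str
    prior_qualifications: int   # hypothetical coding: 0 = none, 3 = degree-level
    credits_attempted: int

def risk_score(s: Student) -> float:
    """Toy demographic model: the weights are illustrative, not a real OU model."""
    score = 0.0
    if s.prior_qualifications == 0:
        score += 0.4
    if s.credits_attempted > 60:    # heavy study load as an assumed risk factor
        score += 0.3
    return min(score, 1.0)

intervention_log: list[dict] = []

def contact_if_at_risk(s: Student, threshold: float = 0.5) -> None:
    if risk_score(s) >= threshold:
        # Logging the intervention is the step the project described above missed.
        intervention_log.append({
            "student": s.student_id,
            "action": "tutor contact",
            "when": datetime.now().isoformat(),
        })

contact_if_at_risk(Student("A123", prior_qualifications=0, credits_attempted=90))
print(intervention_log)
```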

When is it appropriate to deliver resources, and to whom? Learning analytics can determine that. One of the SURF projects was a prototype advising students what course to take next, and why. Seeing them go through a curriculum, it can tell them whether a choice is wise or not, based on ten years of data on student progress. Students were enthusiastic about it; educational management less keen.
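A minimal sketch of how such an advisor could work, assuming a simple pass-rate lookup over historical (previous course, next course) pairs – the course codes, data, and the 30-attempt cut-off are invented for illustration:

```python
# Sketch of a SURF-style course advisor: given years of
# (previous course, next course, passed?) records, estimate how students
# with a similar history fared on the course being considered.
from collections import defaultdict

# (course just completed, course taken next) -> [passes, attempts]
history: dict[tuple[str, str], list[int]] = defaultdict(lambda: [0, 0])

def record(prev: str, nxt: str, passed: bool) -> None:
    stats = history[(prev, nxt)]
    stats[0] += int(passed)
    stats[1] += 1

def advise(prev: str, nxt: str) -> str:
    passes, attempts = history[(prev, nxt)]
    if attempts < 30:               # arbitrary minimum for a stable estimate
        return "Not enough data to advise."
    rate = passes / attempts
    return f"{rate:.0%} of students who took {nxt} after {prev} passed it."

record("MATH101", "STAT201", True)
# ... ten years of records ...
print(advise("MATH101", "STAT201"))
```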

Knewton are working on this: an adaptive learning platform, looking at the interactions – what students struggle with and what they do well at – and adapting the content based on the progress they're making. A personalised learning journey! It's very difficult to do.
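Knewton's actual algorithms are proprietary; as a toy illustration of the adaptive idea only, here's a sketch that steps content difficulty up or down based on the learner's recent success rate (the thresholds and levels are invented):

```python
# Sketch of adaptive content routing: move the learner to easier or harder
# material depending on how their recent attempts have gone.
def next_difficulty(recent_results: list[bool], current: int) -> int:
    """Return an adjusted difficulty level (1 = easiest, 5 = hardest)."""
    if not recent_results:
        return current
    success_rate = sum(recent_results) / len(recent_results)
    if success_rate > 0.8:          # coasting: step up
        return min(current + 1, 5)
    if success_rate < 0.5:          # struggling: step down
        return max(current - 1, 1)
    return current                  # in the productive zone: stay put

print(next_difficulty([True, True, True, True, True], current=3))  # -> 4
```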

It can become a bad self-fulfilling prophecy – e.g. telling students they're not likely to do well. But as a university, you want to know that. It depends on what you do with it – e.g. saying here are the resources they need to do better. There's an ethical issue in generating information that's not provided to them; a question of how to portray it; a danger of it being Big Brother-ish. Whatever we collect about students we have to provide to them – a data protection issue. Also a question of how we base our actions on it: we present the data to them, but not necessarily how we offer extra support. It may be unethical not to collect and use the data – if we know it can help. If a student opts out of that data use, that puts them at greater risk of failing, so arguably it's unethical not to use it. A tricky issue of what to tell them, and how. We don't want it to lead to demotivation. It might lead to changed behaviour, with students using tools because they know they're being monitored. There are surveys and research on that; some students like to break the system just because they can. And nobody wants to be targeted with spam based on what organisations think their preferences are.

Signals is a good exemplar of practice – students can see their own signal as well as tutors. You can't explain everything to the students, and they don't want to know. Being humble about the quality of our predictions and algorithms matters. Educational Data Mining focuses on those, but few projects focus on feeding back to students – so this is a key focus for learning analytics. It won't be perfect; it's about working out what will work, and what students can do about it. It's not the answer, it's one of the tools in the educator's toolbox. You might have a hunch about what's going wrong, or Signals might give you a cue about it, and you can take a more holistic view. Present feedback to the student with the evidence and rationale – e.g. about course selection, if you give them the evidence they can make an informed choice.
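Purdue's precise Course Signals algorithm isn't public in full detail, though it's known to combine current performance, effort relative to peers, and prior history into a traffic light. A minimal sketch of that pattern – with purely illustrative weights and thresholds – which also returns the contributing factors, so the feedback can carry its evidence and rationale:

```python
# Sketch of a Signals-style traffic light: weighted factors combined into
# a red/amber/green signal, returned together with the per-factor evidence.
def traffic_light(grade_pct: float, effort_vs_peers: float,
                  prior_gpa: float) -> tuple[str, dict]:
    """All inputs normalised to 0..1; weights are illustrative only."""
    factors = {
        "current grade": 0.5 * grade_pct,
        "engagement vs peers": 0.3 * effort_vs_peers,
        "prior performance": 0.2 * prior_gpa,
    }
    score = sum(factors.values())
    colour = "green" if score >= 0.7 else "amber" if score >= 0.4 else "red"
    return colour, factors          # return the evidence, not just the verdict

colour, evidence = traffic_light(grade_pct=0.55, effort_vs_peers=0.3, prior_gpa=0.6)
print(colour, evidence)             # amber, with the per-factor breakdown
```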

Quality is best when you have the feedback linked together – actionable intelligence; that's John Campbell's perspective. Combining interventions from learning analytics with dashboard feedback is more effective. We had this in chemistry but didn't have it in biology; there was an effect that carried over, because the students learned more effectively and carried the behaviours over – and demanded their traffic light and the pointer to support resources.

A traffic light can be a bit simplistic. You do want to know more – why you've got that light. Supplement it with some of the underlying data – e.g. engagement, and how that compares to similar students. It's wrapped up with individual aspiration and motivation; very contextual.
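For instance, one way to surface that underlying data – a sketch, with invented numbers – is to show the student where their engagement sits relative to similar students:

```python
# Sketch: replace a bare traffic light with a peer comparison, here just a
# percentile over weekly VLE logins among comparable students.
def engagement_percentile(student_logins: int, peer_logins: list[int]) -> float:
    """Fraction of peers with engagement at or below this student's."""
    if not peer_logins:
        return 0.0
    return sum(p <= student_logins for p in peer_logins) / len(peer_logins)

peers = [2, 5, 7, 9, 12, 14, 20]    # weekly logins of comparable students
pct = engagement_percentile(6, peers)
print(f"You logged in more than {pct:.0%} of students on similar courses.")
```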

Retention vs success. Retention isn't the goal, it's a byproduct – the goal is success. We keep coming back to this! Success is much more difficult to measure, though. Sell it to managers by retention, sell it to students by success. It's a sliding scale – retention is part of success. You can stay in but not be successful. There's a question of the level of assessment outcome – not just intervening around the complete/dropout borderline, but at the first/2:1 borderline. People coasting – e.g. a green light, but could do better. There's an option to override the green light as a tutor: you can give an orange light. On success – we're not good at collecting data about what a student sees as success, what their goal is. You can't answer the question without that. Some students just want to get through the course, so barely enough will do; others won't go for anything less than the best. Success is quite qualitative, whereas retention is almost binary – here or not here.

Context for each individual student – motivation, expectation – personalised. Personalised contact is critical: the personal knowledge you have about the student, e.g. because they're ill. Analytics as the basis for a personal discussion. Relationship management.

Using analytics to identify the quality of teaching is another area. Quality of teaching is something you really care about as a student. Ethical imperative to improve teaching.

Data useful after student has left – for references and things like that. Another set of opportunities.

Cradle-to-grave user profile – from earliest engagement through to graduation and beyond. Retention over a university career – 20 years or more – so at a course level, at qualification/degree level, through to lifelong learning.

How to pass a portfolio of analytics through from primary to secondary to university level.


This work by Doug Clow is licensed under a Creative Commons Attribution 3.0 Unported License.
No further permission needed to reuse/remix (with attribution), but it’s nice to be notified if you do.
