SoLAR Flare UK – Feedback session

[Kindle-friendly PDF of this post]

Liveblog notes from SoLAR Flare UK – feedback from small groups and large group discussion.

...

Retention and success

Feedback from me – as in the previous liveblog.

Dashboards

People head towards these, but they can’t do everything. Four key issues: 1. the need to talk to multiple audiences, serving different purposes; interpretation and clarity are important. 2. Multiple purposes – aggregation, intervention, motivations, ipsative vs norm-referenced. 3. Ethics – emotions around anxiety, surveillance, privacy, transparency. 4. Operationalisation and institutional responses.

Analytics for student success and retention: issues.

Minority Report as a model – can you predict the future? Danger of a pre-crime/pre-fail unit. If everyone’s paying £9k, why pour all the resources into those who are going to fail? Is it finding people at boundaries – what about people in the middle? Self-fulfilling failure prophecy: if we present to them “students like you also failed”, that isn’t very motivating. Danger of personalisation expectations backfiring – “Dear <field1>”. You can’t step in the same river twice. Important to record the interventions that have been made, as well as the other tracking information. Danger of “can’t count, doesn’t count”.

Pedagogy and Learning Analytics

How do we measure learning? Rather than ‘success’ in assessments. Approximate measures for learning – make sure they’re not too approximate, not accrediting proxies; make the process clear to students and teachers. Can we pick up individual proxies, or look for suites of analytics? Shouldn’t assessment be our ‘best measure’ of learning? How do we define a pedagogy that can pick out the knowledge we want from our students, in such a way as to accredit not just a recipe, not just box-ticking that they followed the right procedures? Analytics as snapshots, continuity and change metrics; what can we do if we measure over time, looking at change and new behaviours? Overarching issue: to have analytics driven by what we want to achieve rather than what data is available.

Examples of dialogue analysis – analysis of use of social networks. Hypothesis test our assumptions about learning as teachers, so we can have a dynamic pedagogy.

We might be able to pick up a correlation between pass rate and online activity.
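
As a rough illustration of the kind of check meant here – with made-up numbers, not data from any project mentioned at the event – the basic version is just a correlation between an activity count and the outcome:

```python
# A minimal sketch: does online activity correlate with passing?
# The figures below are invented for illustration only.

from statistics import correlation  # Python 3.10+

# Hypothetical per-student records: total VLE clicks, and pass (1) or fail (0).
vle_clicks = [120, 45, 300, 10, 220, 80, 150, 5, 95, 260]
passed     = [1,   0,  1,   0,  1,   1,  1,   0, 0,  1]

# Pearson correlation between activity and outcome (point-biserial, since the
# outcome is binary). A loose positive value would fit the caveat raised later:
# measurable, but only loosely connected to learning.
r = correlation(vle_clicks, [float(p) for p in passed])
print(f"correlation between clicks and passing: {r:.2f}")
```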

Need for tools to be responsive.

Data sources

Many issues, similar to other areas. Availability and planning for data. Seeing close, iterative cycles. midata – in consultation at the moment, a UK legislation/programme to give consumers access to the data held about them. Micro-, meso- and macro-level focus – student, department, institution – but also a nano-level of granularity, down to very small bits; a question of whether that data’s available.

Have a purpose for the data and think what you might do with it, rather than just gathering lots of data.

Avoid the situation of paying to access your own data. Example of IBM doing work around learning analytics – is the data available back to customers? Only if they ask for it. Put demands on vendors that the data is part of the product you’re paying for, ensuring you’re not locked out of it. Companies are springing up in the education sector that create APIs to access data. Student information services that have an API to several separate sources, to provide a service that lets you access data. Paying for what you’ve already got but can’t get access to in an easy way.

Culture change, becoming more data-aware. Senior management decision-making culture – that’s a big shift in certain institutions.

Group discussion

Struck by the diversity of people here – institutional support, statisticians, English lecturers, educational technologists. Very wide range.

How many face students on a weekly basis? Only a handful. Most people aren’t directly student-facing – how do we get more of those people into that conversation? They’d say they’re too busy facing students. If we get more teachers involved, it would help address questions e.g. about how we mediate feedback; teachers should be engaged in this discussion and help us get to the answers. Many here are working with teachers and learners. The teacher can sense where things are going wrong. How we can bridge educators’ understanding with what’s happening in the background with numbers is very important. It may tell a different story. The numbers on their own might be open to interpretation.

If we scale up education, there’s an issue of how you provide quality, personalised feedback at scale; bricks and mortar can’t meet the demand at scale. Analytics may be a key part in the jigsaw; need machine support to direct the valuable but scarce human attention. Do people know what they want – they may not be in a position to say, yet. Need to present what we can do, but important to engage with their needs. Participatory design – as for software development.

The idea of triggers – institutions can do a health check regardless of where they are; think about how they can respond, as an institution, at scale.

It’s not a green field site – there are procedures already in place, monitored by the QAA et al. Question is how those processes can be better informed, make better use of new data, and what the implications for change are.

There will be a million and one learning analytics tools and ideas about what works well. It’s like the big SETI telescope array – scanning the space, trying to pick up the signals. Pull them around a learner to create collective intelligence.

Participatory design is tricky – letting people try things out. There are tools there – dashboards; think about commercial vendors’ offerings. As a community we need to share the bits and pieces we’ve got so we can do more of the personalisation. Danger of being vendor-driven, with dashboards that are visually appealing, but there are tricky questions to be asked. Users are not always engaged in purchase decisions; that needs to be considered.

Need to accept we’re in the initial stages of applying this. We’re asking questions, rather than finding the answers already. Many projects are gaining insight, but ending up with more questions at the end. We don’t need to stop here. Learning analytics is not the holy grail. Analytics as the answer to all problems – it’s like the Gartner Hype Cycle, at the peak with disillusionment about to set in.

We were collecting data in the past, possibly working with it in Excel and doing something with it. The simple things that mattered before, we can now crunch automatically. Putting end users in the driving seat is potentially new. Previously the stakeholders didn’t have access to these representations – only researchers did. We don’t know whether educators and learners can make sense of them. We know they can make sense of red/amber/green. Early techie adopters can make sense of social network visualisations.
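
The red/amber/green point is worth making concrete: the appeal is that the mapping from metric to flag is trivially simple. A minimal sketch, with an invented metric and invented thresholds – any real scheme would need much more care about what the metric means:

```python
# Hypothetical traffic-light mapping from a single engagement metric to a
# red/amber/green flag. Metric name and thresholds are illustrative assumptions.

def rag_status(weekly_logins: int, amber_threshold: int = 3, green_threshold: int = 7) -> str:
    """Map a simple activity count onto a red/amber/green flag."""
    if weekly_logins >= green_threshold:
        return "green"
    if weekly_logins >= amber_threshold:
        return "amber"
    return "red"

for student, logins in [("A", 0), ("B", 4), ("C", 12)]:
    print(student, rag_status(logins))
```

The simplification is exactly what makes it legible to educators and learners – and exactly what hides the nuance a social network visualisation would show.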

Metaphor for feedback at the point where data is visible to someone else. The QAA assumes that institutions do annual monitoring to connect issues with planning and action. We have a feedback model for students in which assessors give students advice about their performance. A feedback model is the idea of a system: it’s not information or sense-making that creates a change, it’s action and change. You can have information as granular as you like, but unless there’s a capacity to act, it’s not feedback, it’s just information. Improving assessment feedback regimes has struggled with this; it’s a big issue for this community. One way to attack part of that is participatory design – opportunities to ask challenging questions, test assertions – user-centred questions.

Well, ten years from now this’ll be different. My five-year-old is assessing his own learning; the school monitors whether he’s on track. Those learners are going to be used to being monitored. The flip side will be them being used to monitoring their own learning. Learners will treat their learning differently. Learners have access to data that others don’t; danger of a top-down view restricting pedagogy. There are things I don’t want my institution to have access to, but I’m prepared to join them together for myself. Question of the level of control over your data trail. The student is a key decision-maker – an active agent. But many learners are not very active. If we were to engage learners in co-designing the analytics by which they would be judged – what would the pedagogical value be there? What would you be happy to be judged by? That discussion could be very powerful pedagogically. Analogous to asking students to assess each other – they have to engage in depth.

Who’s involved in making sense of the data, and in designing the interventions? Bringing in qualitative data as well.

What would it mean to apply analytics to our current conversation? What would the benefit be? We know that people who don’t turn up to our tutorials can get perfectly good results at the end. It should count if they do turn up to events. But we know there’s only a loose connection between participation and the end results. Should we count people talking, or is it about who goes on to develop systems? What’s the measure? There are many situations where there are things you can measure with only a loose connection to what you’re getting out at the end. There are things we definitely can measure that are only very loosely connected to learning.

Easy to envisage a future for analytics that shores up the current system. What can we do that hastens the new dawn? How can we tweak analytics to encourage more active learning? Make sure that vendors are in the dialogue – to have credibility with educators, turn the R&D space into product. Right now, the dashboard from most vendors counts e.g. who talked, how much time, how much airspace – no notion of quality. More advanced would show who responded to whom, who disagreed with X, who supported Y. Innovation is creeping in, and that’s what faculty members will see – unless the university is building its own analytics. Another level could add on feedback to learners and teachers. If we ask people what they want to be assessed on – likely to vary, and it won’t be things we pick up on unless we ask learners how they can show what they’ve learned. Mediating between the analytics and the people making use of them; nobody’s done this in detail before. People making use of the analytics make better use of them, but also feed back into the analytics process to improve them. Data wranglers have found that the way they were interpreting data was different to how others were, which took some investigation. Struggle to understand what the really important things are to track, and managing the implications. We do know a lot of answers to the questions – the sad story is that it doesn’t depend on effort and native ability but on parental income. We have lots of good indicators of what the big factors are, what correlates with the outcomes. Or do we? (Divergence of opinion in the room.)
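
A hedged sketch of the two dashboard levels described above – simple counts of who talked and for how long, versus a ‘who responded to whom’ view. The forum-post structure is a made-up example, not any vendor’s actual data model:

```python
# Two levels of discussion analytics over hypothetical forum posts.

from collections import Counter, defaultdict

# Hypothetical posts: (author, author_replied_to or None, word_count)
posts = [
    ("alice", None,    120),
    ("bob",   "alice",  40),
    ("carol", "alice",  85),
    ("alice", "bob",    60),
    ("bob",   "carol",  30),
]

# Level 1: airspace – how much each person talked (no notion of quality).
airtime = Counter()
for author, _, words in posts:
    airtime[author] += words
print("words per participant:", dict(airtime))

# Level 2: a reply network – who responded to whom, the raw material for
# the more advanced 'disagreed with X, supported Y' kind of view.
replies = defaultdict(Counter)
for author, reply_to, _ in posts:
    if reply_to is not None:
        replies[author][reply_to] += 1
for author, targets in replies.items():
    print(author, "replied to", dict(targets))
```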

In some senses, learning analytics is the response to the loss of a teacher who has time to see everything. Gradually we’re building up the system for the institution to understand things that were the job of the teacher. A discourse around the loss and rebuilding of something. Or: professionalisation and improving – doing a lot better what we were trying to do before.


This work by Doug Clow is licensed under a Creative Commons Attribution 3.0 Unported License.
No further permission needed to reuse/remix (with attribution), but it’s nice to be notified if you do.
