I’m at the Second International Conference on Learning Analytics and Knowledge (LAK12) in Vancouver, Canada.
The conference hashtag is #lak12, and updates are on @lakconf; there’ll also be a livestream video for the main conference.
This is the first of several liveblogs from me from the conference. These are notes mainly for me, but posted here in the hope they might be useful to others.
Today (Sunday 29 April) is the pre-conference workshop day; the conference proper runs on Monday, Tuesday and Wednesday.
The workshops are taking place on the UBC campus in Vancouver, which is a fascinating mix of buildings set in stunning Pacific coastal parkland. If I had to emigrate I wouldn’t mind working here.
Sharon Slade and Fenella Galpin – Learning Analytics and Higher Education: Ethical Perspectives
Sharon and Fenella are from the Open University, UK.
Students leave behind a lot of information, with little or no realisation of what we do with it. Does it matter?
They reviewed the data we capture about students – e.g. gender, log-in frequency, study goals, assignment scores. The OU captures a lot of data like this, and uses it to calculate, for example, vulnerability.
What’s the concern? Most papers focus on privacy – what more might there be?
- Privacy – do they know we gather information, and do we tell them what we do with it?
- Transparency and robustness
- Power – who decides what happens, who gets support?
- Ownership – who else can mine it? Can learners opt out, and if so, what happens? How long do we keep it?
- Responsibility – for accuracy? Can students disguise themselves? To ensure equitable treatment?
The OU already provides differential support – e.g. ‘widening participation’ students receive extra support.
Snakes and Ladders activity
We played a game that the presenters had prepared, to illustrate the chance aspects. There were little coloured cards with attributes – what gender you are, where you live, how you like to study, other characteristics. Then you play on a snakes and ladders board with custom snakes and ladders illustrating the impact of learning analytics on learning progress. One player is a completely self-sufficient student … and they get to start, and ignore the snakes and ladders and everything on the board. The players tended to spread around over the board, with the self-sufficient student making steady progress.
In our group, people kept landing on the same space (the tutor objects to a system, all go back 5 spaces); we got frustrated with that and all decided to fire the tutor and ignore the effect of that space if anyone else landed on it.
The student perspective
Discussion of how it felt to be a student – feeling advantaged or disadvantaged, fair/unfair, impact of labels. Some points from the discussion:
- Issue when people got additional support but weren’t seen as being any worse off to start with.
- Another one about labels vs descriptors – factual, more objective things start as descriptors, but become a label when that links to decisions about resources.
- One snake – opting out of the learning analytics system – was raised too: why’s that bad? Support can be provided through the system or otherwise, so opting out could be positive or negative.
- Someone lost a label, and could see the impact of that – we’re not as transparent as this with our own students.
- Talk about ‘being punished’ and ‘harmed’ by what’s essentially arbitrary; the perception was of something done against you.
- As students, it felt like we had no control or influence over what the tutors were doing with analytics, except when it dings you. Like when we as a group got fed up with the same penalty space and decided to ignore that rule.
- Danger when learning analytics is very rule-based; can lead to unethical outcomes. Decision-makers are key.
- We as students had no appeal route, no way to articulate an objection.
- Are there dangers to opening up the transparency around this? Almost certainly.
- There’s a danger in reifying the constructs from the learning analytics system; they may be misleading since it’s an aggregate system.
- Mistake to have too much faith in the power of big data.
Exploring from different perspectives
We were each given a small coloured card (some orange, some pink, some purple) to be our ‘graduation certificate’, which sorted us into three groups to explore questions – students, tutors, institution.
There were a whole series of questions on pre-printed sheets of paper, around the potential impact and issues of learning analytics. Our group was tutors.
- Interesting discussion about Signals at Purdue – since we had John Campbell at the table. The signals are coarse representations of a number, because they don’t believe their model is accurate enough to justify more than that (there’s a sketch of this idea after this list). Interventions are generated by the instructor, usually drawing on existing support mechanisms that were not previously used as heavily as they might be.
- Some instructors are crying out for information about students at risk – but not to be told what to do about it.
- Issue around releasing information about how the model works; potentially positive, but can lead to unhelpful assumptions and labelling. Many or most tutors don’t want to know the nitty-gritty detail, because they know their behaviour will change.
- Signals is trying to model/capture ‘help-seeking behaviour’ at the moment. There’s an issue if students perceive asking for help as a failure – the aim is to recast asking for help as a successful behaviour.
- Big question around how to deploy support resource ethically and appropriately. Not a new one, but made more pressing by learning analytics.
- Many stakeholders seem to want to read more into the outputs of models than they can legitimately support.
- Assessment scores are hard to use as metrics if not standardised – ‘great, 20% As again’. Many courses don’t have clear outcomes and good measures for assessing them. Comparison often only to the cohort of peers.
- This could drive some really big questions in the future – about value of textbooks, assessment, and so on.
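An aside from me, not the presenters: as a minimal sketch of what a ‘coarse representation of a number’ might look like, here’s a continuous risk estimate collapsed into a red/yellow/green signal. The function and thresholds are hypothetical illustrations, not Purdue’s actual model.

```python
# A minimal, hypothetical sketch (not Purdue's actual model): collapse a
# continuous risk estimate into a coarse three-level signal, on the
# principle that if the model isn't accurate enough to justify finer
# distinctions, it shouldn't report them.

def traffic_light(risk_score: float) -> str:
    """Map a 0-1 risk estimate to a coarse red/yellow/green signal."""
    if risk_score >= 0.7:   # hypothetical cut-off
        return "red"        # at risk - intervention worth considering
    if risk_score >= 0.4:   # hypothetical cut-off
        return "yellow"     # borderline - worth watching
    return "green"          # on track

# Hypothetical students with model-estimated risk scores
for student, score in [("A", 0.82), ("B", 0.55), ("C", 0.12)]:
    print(student, traffic_light(score))
```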
The institutional group found a lot of commonalities across their diverse institutions. Big issue around the locus of control. The sophistication of analysis isn’t yet there to make the focused teaching-and-learning interventions we might like. Question around for-profits: what’s the institutional stake? Is it improving learning, or the cash bottom line? This is a big driving force – retention, years to graduation, number of degrees. These early analytics focus on early alert, intervention, and helping students who may fall out of the mainstream – but what about the successful students, and learning from them? Or identifying what makes a quality course and spreading that? If there are targets, how do people work to them?
The student group talked a lot about how the system might be set up. Central question of the locus of control. Most students wouldn’t expect to know what the system captures about them – most of us don’t know what Facebook captures. Perceptions and social issues are important in labelling. Are you your academic record? College as a fresh start – should your record be cleaned? When do we forget? People are different at 15 than at 20. Labelling, and externalising the responsibility for performance. Feelings could be a significant source of tension. Bell curve grading – evaluation versus peers, rather than against a fixed standard – really changes student behaviour (a toy sketch of the contrast follows below). When students know only one person can get the top marks, the drive to share and help each other disappears, because others’ misfortune will help your fortune.
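An aside from me, not the discussion itself: a toy sketch contrasting criterion-referenced grading (a fixed standard) with norm-referenced grading (evaluation versus peers). The scores and cut-offs are entirely hypothetical.

```python
# A toy contrast (hypothetical scores and cut-offs) between grading
# against a fixed standard and grading relative to the cohort.

from statistics import mean, stdev

cohort = [55, 62, 68, 71, 74, 78, 81, 85, 90]

def fixed_standard(score: float) -> str:
    """Criterion-referenced: your grade depends only on your own score."""
    return "pass" if score >= 70 else "fail"   # hypothetical threshold

def versus_peers(score: float, cohort: list[float]) -> float:
    """Norm-referenced: standing is a z-score against the cohort, so a
    peer doing worse literally improves your relative position."""
    return (score - mean(cohort)) / stdev(cohort)

for s in cohort:
    print(s, fixed_standard(s), round(versus_peers(s, cohort), 2))
```

The z-score makes the behavioural point concrete: under norm-referencing, a peer scoring lower raises your own relative standing.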
Towards a code of conduct
Links/overlaps with existing policies on data usage – e.g. the OU’s Data Protection Policy.
Many issues identified in Campbell, DeBlois and Oblinger (2007) EDUCAUSE Review 42(2) [PDF].
Ethical issues around control and experimental groups. Obligation to teach all we admit? As soon as you measure, you focus on what you can measure – ethical issue. Students can game the model; institutions can game the model too.
A lot of these issues are not at all new!
Tough to know what’s gaming, and what’s just off-mission. Harder to see that for institutions.
Signals (at Purdue) isn’t mandatory; some department heads try to make it required, but then it dies a fast death. One group of faculty think ‘these are adults, they need to sink or swim, we shouldn’t coddle them’. Another group think ‘we should do whatever we can to make them independent learners’. But faculty are supportive of providing as much feedback as possible. They’re frustrated about students not showing up to things that are designed to help them; Signals has been a driver of uptake of these. Some deans and department heads want class-by-class access to the data, but the team says no, because the model isn’t strong enough to support more than a simple red/yellow/green.
Learning analytics is more acceptable in large first- and second-year classes; in smaller, professional programmes (e.g. dentistry), much less so.
Things are different enough at different places that a single code of conduct might not be appropriate. But there is power in a standard code of conduct, perhaps as a framework.
Next steps
A Google Group is being set up to coordinate this; they’ll post up all the materials used today and the results from today.
[Later: Google Group is here: https://groups.google.com/forum/?hl=en&fromgroups#!forum/lak12-ethics]
It’s complex. Goes way beyond just privacy. We’re all very interested.
–
This work by Doug Clow is copyright but licensed under a Creative Commons BY Licence.
No further permission needed to reuse or remix (with attribution), but it’s nice to be notified if you do use it.