More liveblogging from LAK13 conference – Friday morning
Panel: Perspectives on Learning Analytics Research

Paulo Blikstein, Stephanie Teasley and Alyssa Wise
Stephanie Teasley, U Michigan
New York Times article – about a product called CourseSmart (“Teacher Knows If You’ve Done The E-Reading”, April 8). ‘The plan is to introduce the program broadly this fall’. If we don’t act, critique and improve, this moves ahead anyway – even though it’s only just happened and we don’t know what it’s about, it’s already being pushed forward.
Hyperbole or reality? Many terms and phrases to do with MOOCs, online learning – big data, game changer, personalised learning, disruptive innovation, etc. Administrators are taking the hyperbole seriously. It’s a call to arms.
At the NSF, the message was that the budget proposal sent to Congress for an increase included one area around cyberlearning and online education – interdisciplinary, with enormous potential – and making good on that is up to us.
Quick thoughts about challenges. We are a new field. Only the third conference. Don’t fight old battles (quant v qual) or create artificial battlefields (LA v EDM). Expect rigor – not another thing that gets hot but never realises the promise. Have rigor – not only by one standard; we’re interdisciplinary, being rigorous in the method we each work in. Identify which data is significant and, more importantly, be able to say why. We have a lot of data – but why do we care about it, and how can we take action? Aggregate knowledge across data types and datasets; take things that you know from your students and think about how we can put the datasets together. Many journals require authors to make data available; there are issues with educational data. We should be able to figure out ways to deal with privacy to create some really large datasets.
Many, many dimensions – picked out three. (Rubik’s cube visualisation.) But our goal isn’t only to talk to people like us.
We need to do it well.
Paulo Blikstein, Stanford
“We teach what we can measure.” If we don’t measure what we care about, it will never be taught. For the vast majority of K-12 and a lot of HE.
I came in to LA because I wanted to teach things not easily measured by multiple-choice tests. The key goal is to identify the types of learning we care about – e.g. project-based learning. There’s extensive literature saying how, especially for kids who are not top of the class, having tangible materials and working in teams is tremendously beneficial. These types of learning are more inclusive, help people learn better. It’s our responsibility to design analyses to show that those kinds of learning can be measured in an objective way. Open-ended learning – can you use different sensing techniques to produce a dashboard to help assess that? So not just looking at what’s taught today and optimising it and making it massive. This community could have a really huge impact. Overnight. Schools have huge incentives to adopt measurable programs.
The bias towards “cheap data” – easy to collect, already available. We should work on data analysis, but also on new data collection techniques. Multimodal – video, biosensors. Be careful not to be seduced by large, easily captured datasets that might not have a big impact on changing education. Sometimes you have to spend a year designing a data collection technique, but it might have a huge impact in the future.
We need to push the system to embrace multimodality and the social aspects of learning. People get information in different ways, react to learning differently. It’s easy to collect data from people who are using a computer. But there’s more going on around them – talking to other people, other interactions – that we can’t completely ignore while imagining we’re picturing the entire learning experience of students.
Finally, there are big political and economic interests at play. The community is unique, we are in an interesting time, with huge interest in education from companies and governments. When you see some unanimity you should be careful – when everyone is talking the same way about the same thing, I get nervous. Those interests might not be those of the students.
We need to be disruptors and public intellectuals. A lot of impactful things are going on. Universities being dismantled for MOOCs. K-12 systems being revamped to use blended or online learning. It’d be easier to say: we just look at data, we sit at our computers and run R scripts. That’s not what this community should be about. When we see the clichés that are not research-based, we should speak up. We have a big opportunity and responsibility there. If you study emergent or complex systems, it’s not a linear thing. If there’s an article in the NYT or on CNN, it’s all suddenly true! Dismantling this smokescreen, doing research – a big responsibility for the EDM and LA communities.
Alyssa Wise, Simon Fraser University
The other two set up the context. I’ll do something a little different. Suggest a contentious and provocative vision that’s different from what we’re currently doing. My title would be “Do we really need another dashboard?”
Why am I not enamoured of them? There’s a place for them; they’re useful. But they’re separated out of the learning activity; they’re large and universal. Instead of big, overarching ones, we should have small systems tightly tied to the learning activity. I understand the attraction of scale, but maybe there’s a different way to go.
It’s a space that’s moving fast. We have a dual challenge – being rigorous, quality thinking, but at the same time moving at a pace that’s much quicker; it takes time to do this well. In trying to do something big we have to sacrifice some quality – not sure that’s a tradeoff worth making. Smaller projects – targeting more specifically.
Like in my talk – embedded analytics, tightly tied to what the students are doing. The interface for the activity and the analytics are one and the same. We should be designing analytics-enhanced learning tools. A bigger demand, but a better one. Not building the one system, but a lot of systems that are good for the learning activity we’re focusing on. We can move more quickly, providing an alternative to some of the large systems.
This is so important. We teach what we can measure. Our LA will drive our pedagogy. Think about consequential validity – what assumptions are embedded in our systems? Are people going to need to teach in particular ways? E.g. reducing the incentive to change assignments. There’s important value in diversity and innovation.
We need to keep the human and the person at the centre of our focus. Data, algorithms, theory are important. But we are trying to help people learn. We shouldn’t be data miners, but data geologists. Or data archaeologists – we’re trying to tell a story. The easy-to-collect data isn’t capturing everything.
Many different methodologies we could deploy, but supporting human activity is very important. Agency – a lot of power involved in data. More excited about data used to empower people. Leave some power in their hands, some of the sensemaking.
Finally, about methods. Don’t play loose with theory or methods. Most people in this room may be more excited about one or the other; we should be a little keener to play in the other half. We need to think about the hallmarks of quality LA. Some standards in methods. Looking at the system as a whole, we don’t all have the same things in mind.
Discussion
Q: Inspired by your talk. Do you envision an app store – learning tools empowering the students? It’s still standardised and global, but flexible, with small applications for enhancing your personal, human (context?)
Alyssa: That did occur to me. There could be an app store with diversity. I don’t know how much of this works as standalone pieces. Maybe there’s an analytics app store. There are lots of different choices.
Q: We’re playing very safe here. How many things blew your mind? We should be like children on an Easter Egg hunt, running around finding interesting knowledge. We should try to fail more. Let’s see if the heartbeat of the children has an effect on their learning. We should do more risky research.
Stephanie: I like it. Risk-taking has to happen. Not sure I’ve had my mind blown at all. Rather than play favourites, maybe other people would like to say something that came close to blowing their mind, or something not particularly safe. The best paper here, that blew my mind.
George: I’m reminded of a book on religious experience – not so much a big bright light as a slow increase in intensity. It’s been a progressive awakening. We have a dual role to play. We should ask provocative but practical questions. My experience is a progressive understanding of LA’s significance. A duty of care to honour the structure of learning.
Q: I come from the ed tech field. This is mind blowing to me. Learning about data collection and data analysis. Being able to add that in to the educational technology field that has been somewhat static. There are two areas that need to be developed within LA. One is information visualisation. Ways of representing and interacting with data, that’s an area where we need to move forward. Second, tremendous research in computer-based instruction. A lot is fairly old research, coming up again now. Ed tech has moved on a long way in terms of the pedagogy.
Greg: What blew my mind is that… I’ve heard a lot of talks by computer scientists; there’s often a lot of handwaving and imaginary learners when they talk about education. It’s the extent to which everyone here was grounded in theory about learning. That’s really exciting.
Stephanie: I love the concept of imaginary learners. Or extreme examples. As if they represent the general population.
Phil: I’m an educational psychologist. I’m also from SFU, where Alyssa is. If you’re developing an analytic, a report – what should it change? I’ve seen presentations about new kinds of report where it’s unclear what would happen to a learner or teacher on being presented with it. Second, what cognitive or motivational or social process is reflected in the data that are analysed? Connected to the idea of cheap data. It’s fun to play in lots of data, lots of methodological affordances. Most are distant proxies for what you’re really interested in. The challenge is to get closer to the cognitive, motivational, social processes. Every learner I’ve met – unless punished for it – underlines or highlights. They’re marking that information as distinct. What standards are they using in doing that? Do those standards promote learning? I don’t see enough of that. It’s an important challenge. The old garbage-in, garbage-out maxim is worrying.
Paulo: GIGO is a concern we should have. There are great things you can do with numerical methods. But sometimes they’re used to mystify what’s actually happening. With big methods in machine learning, people assume we know what we’re talking about. I’ve seen meaningless blind application of methods to data that’s not well known. It’s easy to be seduced by new tools – put stuff in RapidMiner and find lots of clusters, but what does it mean?
Alyssa: Often we start with methods and see what we can do with them. If we start with questions, the methods we use are those we know about. I want to know more about what kinds of methods help with what kinds of questions. Phil’s point about cognitive, social, motivational processes – some people do that; others come up with indicators that are predictive but may not have a meaning. We haven’t understood them – is that useful?
Phil: Discovery is a wonderful thing. We should pay attention to it. But we need to dig in to it – what is it indicative of? Correlation doesn’t mean causation.
?Avi, Turkey: My mind was blown yesterday by an EDM presentation. His grasp of those parameters – talking about kappa like it’s his best friend. I liked that. I was working on signal processing through the 90s; all these techniques – classifiers and so on – we used in the 70s and 80s in audio coding. I felt a sense of familiarity in that presentation. We should look at those parameters and see what they amount to. I’ve not seen intervention papers here. That’s critical – the way you use a piece of information makes all the difference. There’s a whole field of behavioural psychology. Results of LA have to be coupled with behavioural psych.
Alyssa: Yes.
Someone in a startup in the field of dyslexia: Driver’s seat, co-piloting – the academic community and the entrepreneurial community. The numbers are staggering. A year ago, the first accelerator on learning tech in California; since then, about 20 accelerators in the US, London and elsewhere. So expect 400 to 1,000 startups in education. This is a major accelerating force – maybe for good, or for bad. How should this community work with entrepreneurs? They are a fact of life, much more important than they have been until a year or two ago.
Paulo: In Silicon Valley, we bump in to them all the time. They talk to us on campus. Everybody has the same idea – education is so inefficient, why don’t we put it online? I’ve heard that 20 times this week. People need to think more out of the box, not just ‘we’ll scale up X’, ‘other industries have been transformed by tech except education’… I’ve heard that every other week. Entrepreneurs bring energy, a speed to implementation that academia can’t offer. The relationship is a little weird; it goes both ways. Entrepreneurs don’t want to listen to research that’ll crush their dreams. Hearing, e.g., that it’s been tried many times before – not keen. The way entrepreneurs consume research should be different; maybe the collaborations should be more productive too. There are problems the other way – academics saying, how about a five-year study before we start? The numbers that you mentioned – those things are accelerating in a way I’ve not seen before. Adjusting how those collaborations happen is important. It’s also important to think about how we allow multiple forms of training to happen. Entrepreneurs want the basics of the field, not a PhD – there aren’t good ways for that professional development to happen. Same for teachers, principals. We don’t have good models right now.
Alyssa: We’re all concerned about what the future should look like. Entrepreneurs are more concerned about what it will look like. But it’s a tricky place for us to go – what we try to do here is uncontaminated by real-world concerns, profitability. A tension we should be more willing to live with. With some of the companies, trying to help them do things a bit better even if it’s not how we would do it. There’s a lot of energy and motivation in terms of impact. I’m concerned not with what should happen but what will.
Chris: At LAK11 and LAK12, my mind was blown by Course Signals, and SNAPP from Shane Dawson’s group. I didn’t see things like that at this conference. The immediate scalability throughout the institution – it could affect 20,000 students, or half the faculty and instructors. I’m less interested in highly refined questions and theory, small groups. More interested in results tried on thousands or tens of thousands. A lot of ed tech communities – lots of them – often focus on small studies, with strong questions, or clear ones. I thought that’s one difference between this year’s LAK and the others. What’s the new big tech that someone’s going to spin off, so I can say to my provost – here’s where your interest in LA can pay off?
Dan: I’m going to change what I was going to say. This has to do with what we’re about here. Is LAK mainly about MOOCs? Large scalability has been a motivation for LAK – a whole lot of learners throw off a whole lot of data. But on the other hand, the objective here – big players are moving quickly; bring the learning science in to that. Maybe we’ve succeeded too far in that direction. Is LAK going to be a conference where different stakeholders meet? A boundary conference? Or a core identity? Or both – a core/periphery structure? The core focus looking at new analytic methods, or understanding existing ones in how they inform learning in an actionable way – I want a stronger emphasis on that. Hard to do. The eye-tracking paper, the poster looking at graph motifs – they have potential. There’s an issue there: once you have it nailed down, then it doesn’t belong here but at e.g. a learning sciences conference. We rejected a good paper that was learning science using existing techniques to address a problem. Are we informing big fast-moving things with our papers? New innovative techniques?
Linda Baer: The core function of LAK compared to analytics broadly. We have to be aware that the more big data continues to impact education, the more we need research to inform what is happening. Urgency – politicians are grabbing stuff to be more efficient and effective. How do we stay ahead of the game with this community of learners and researchers who care about analytics – do it right, but stay ahead of those who do it regardless?
Alyssa: I don’t know that we can be ahead of them. We can think deeply, but not be right at the edge. Getting ahead isn’t feasible.
Paulo: Every community needs both kinds of people. Some people are already doing that. It doesn’t have to be an explicit choice – senior people talking to the press, communicating a message, making a connection between this community and the public. Sometimes it’s more explicitly, strategically done; sometimes not. Select, e.g., papers that are seminal in the field and have sexy results, and have that ready to tell the journalists – e.g. MOOCs are not that great. Journalists and the press get excited about new things. After that peak, there’s an incentive for other journalists to debunk the myth and sell more newspapers. Detect those cycles; deliver a message that’s not the overhyped message.
Stephanie: I like your call to arms to be public scholars. It’s becoming more common for academics to be visible in the public press. There used to be an inverse correlation between the perception of seriousness and your public profile. We can be more public when things like this article come out, pointing out where there are unfounded and ungrounded statements. Many universities have publicity machines, always trawling for stories about research that’s going on in the academy. We should be proactive, not reactive.
Naomi: We present different students with different analytics over time. Students have a disjointed experience of study. How can we pull that experience together?
Alyssa: Good question. The analytics are integrated with the learning activity they’re a part of. In different courses it’s often quite distinct; merge the analytics with that. How to bridge across that is also important. I’m not saying there’s no room for overarching analytics. Some might be, e.g., more academic analytics. That’s not the only thing, but it’s a point to take in to account.
Caroline: Linda’s comment about keeping ahead. I don’t think we’ll get ahead. But we’ve learned from systems development, moving from the heavyweight lifecycle to nimble, agile computing. Nimble, agile analytics – use what we have now in a more action-research way, and we have a better chance of keeping up. We’re swimming in it; we can’t stand on the shore waiting. Embrace the change and go with it. I like this theme of closing the loop – not just analysing, using theory, but bringing it back to the learners [Learning Analytics Cycle!]. Outcomes assessment – it’s a reasonable thing to do to see if your intervention works! Last thing: one of our problems is dealing with multiple stakeholders – who are we talking to? Our core here is talking among researchers; that’s a good niche. But outside we have to be public intellectuals.
Phil Long: Reminding myself – Dan made a comment about boundary agents. We can be boundary agents among these stakeholders, provide a framework for the hype. People express their thoughts by buying something, think they’ve solved it, and move on. As an interdisciplinary group, work with other stakeholders, policy people. Inform learners of options that could improve the management of their own learning. A juxtaposition – do we need another dashboard? And yet for many learners the dashboard is how their quick-and-dirty interactions are mediated. What other mechanisms of real-time feedback are effective and useful?
Greg: Speaking as a techie who wonders how research in CS can be useful – really move things forward, as opposed to doing something everyone’s done but in a new way. E.g. yesterday’s computational linguistics methods, network methods. How to channel energies so as to say: not building another method, but take this problem and try to create a computational method that can do better along some metric. Alyssa – share your tiny system. EDM had a competition – what other ways can we take your data and try to present it to learners, in the 50,000 ways we can imagine, and see if they work as well or better?
Chris: Linda’s comment about the companies that provide learner analytics and their visibility here. A few startup companies. Where are the Blackboards, the Echo360s? What is the relevance of this community to these partners, who seem to have disengaged a little bit? What will that mean over a few years? Change those products and they get sold back as products. ITSes have 30 years of research behind them but are not broadly used. It’s stayed as a narrowly focused research question; it’s used in K-12 but in specific domains, unlike the LCMS which is used everywhere.
Stephanie: What many of you have said is interesting; I’m excited to be part of this community, challenging each other. I look forward to you bringing these ideas, and ways to address them, to the next LAK.
Paulo: I want to mention a couple of quick things. In the learning sciences community, one fundamental value is that a lot of people care about inquiry rather than scripted curriculum. A small thing – if you ask people, they say they’re trying to make things more inquiry-based. That gave the community an identity, made a difference. For this community, we should make some of those choices. E.g. between technology as a way to make education cheaper, or to make it better. We might say: make it better. There are many choices like that we can make. We’re not like physicists locked in a lab. The ethical choices we can make are a really important thing. Pressures from the private sector, government. There are always values at play, moral choices that we can make, which will make a difference. What are those choices? We’re changing the lives of children. This community has the power to decide how things will be, through our research and our public voice. We should think carefully.
Alyssa: It comes down to taking the notion of what we think learning should look like, and engaging with the people deciding what it will look like. It’s a tension; some compromises are involved. How can we do something that we can feel good about, while not staying in our own sandbox? There’s ed tech work with strong foundations that’s not taken up widely. We need to create viable alternatives. That does involve compromise. We need a strong sense of what values we have to hold on to. I was impressed, on the mind-blowing question, that there was a lot of diversity – there are interesting and new things in the agglomeration of what we’re doing. Not breaking boundaries in the home discipline, but coming together in a unique way.
–
This work by Doug Clow is copyright but licensed under a Creative Commons BY Licence.
No further permission needed to reuse or remix (with attribution), but it’s nice to be notified if you do use it.