LAK12: Weds morning

Liveblog notes from Wednesday morning plenary session at LAK12 – George Siemens’ keynote.

Convention Centre

(This photo is taken from more-or-less where the conference centre/hotel is, and I have a view of this from my room’s window. Except this morning it was drizzly and almost totally grey. Welcome, as they say, to Vancouver.)

Shane Dawson thanks Al Essa and Desire2Learn for sponsoring the conference dinner at the Aquarium. Invites everyone to the SoLAR AGM. Gives out SoLAR T-shirts as prizes for helping, tweeting, and so on. Showed visualisation of reference map from the papers at the conference – it’s interactive so you can explore it; will be on the LAK12 website presently.

Dragan Gasevic introduces George.

George Siemens keynote:

Learning analytics: Envisioning a research discipline and a domain of practice

Somersault over Kits Beach
[NB Somersaulter in photo is not George, although I’m sure he can do it.]

Thanks the team putting the event together.

Looking at distinction between research and practice. Where are we? What needs to be done?

There’s no guarantee that LA will develop as a field. If we assume we’re on to something, and that there will be ongoing research interest – how do we map out a field without becoming prescriptive?

Stating the obvious: growth in LA interest, special journal issues, CFPs. In universities, VPs, Deans will go to conferences and hear about big data and analytics and want one of those in their institution. Granting agencies will turn attention to big data and analytics. Might not be learning analytics-specific.

Another ‘encouraging but disconcerting’ is rapid development in the vendor space. Rapid redeployment to meet new buzzwords. Terrific opportunities to partner with the vendors, but some distinct personality differences between vendors and researchers.

We’re externalising more of our activity on a daily basis. A stunning amount of our activity is captured – not always because we make it explicit: beliefs, for example, can be deduced from connections. Plus new sensors capturing data.

Scope and scale of analytics – top-down/bottom-up. An educator is interested in their class of students, effects of individual interventions – the fun stuff is happening bottom-up. But the big impact is from the top-down. ‘Small data’, often done with low-threshold tools, e.g. SNAPP. As you move up, you shift gear – a coordinated team; data integration; automated discovery and intervention – and that’s towards big data.

Domains of LA activity & impact: learning – has been a bit subverted. Many others, including systemic impact.

Ellul on technology: the all-embracing approach of technique avoids shock and sensational events. When you move to a heavily analytics-structured world, there’s a risk that you strip out serendipity, the magic of unanticipated connections. Are we at a stage where LA will lock down and reinforce everything we don’t like about the current education system? Or are we going to open it up? Not an inevitable outcome, but clouds are on the horizon.

Challenge of the research-practice gap. Education is great at not applying its research. Much innovative insight, e.g. learning science. We know more about good education than we apply in the classroom. My background was in the hospitality industry – I ran restaurants for over eight years before going into education. Significant – everyone whose Mom had a good meatloaf recipe thought they could run a restaurant; and everyone thinks they can teach better than the school.

Explosion of interest in disruption of education, in the language. Don’t like the term, but it is framing the discussion. Research institutes – $60m into the open course initiative from Harvard/MIT (edX) to compete with Udacity, Coursera, etc. Models are getting more complex. Our understanding isn’t. We’re doing things that are interesting, but don’t know why, or what’ll happen. A lot of technological change, entrepreneurialism – without knowing what will happen. Students may suffer for years down the line if we’re wrong.

This conference: focus on the inter-disciplinary – technological, pedagogical, and social domains must be brought into dialogue. Need the breadth. Learner-centred education is good, but education is really society-centred. Holistic, integrated research. The US Dept of Ed sees an increase in models of collaboration (Enhancing Teaching and Learning through EDM and Learning Analytics, 2012).

Four areas to think about.

  1. new tools, techniques, people
  2. data – openness, ethics, scope
  3. target of analytics
  4. connections to related fields

New tools, techniques, people

Tools should be practitioner-focused, not just researcher-focused. If a new system takes months to learn, it’s probably not terribly successful. Suppress complexity to give a smooth UX. Flow of relevance between practice and research. Tools are largely appropriated from elsewhere, not specifically learning analytics tools.

Techniques – a lot of importing, doing something with it that gives it an LA emphasis. Analysis is now a value point – Pearson OpenClass. ‘Free’ tool lets Pearson know what users do with their products. Many techniques imported, a reflection of LA’s youth.

People – critical issue. Need multi-skill/capability. Certificates, seminars, professional development. Capacity building. What does a good LA project look like?

Data – openness, ethics, scope

Big word here is data. No, ethics. (!)

Openness – the companies doing the most interesting things are proprietary, not sharing all their insights. Not being paranoid. Knewton, Pearson, integrated into university systems; ASU has become a conduit for education. Conduits are helpful but not a huge source of value.

We want to connect with vendors, they’re important; but recognise difference of views. They’re not open, accessible, testable – and hence not improvable.

Openness – glad John rapped LAK’s knuckles yesterday, emphasising that EDM has an open access journal. The new journal will be open access; the conference proceedings are closed, because of ACM – needed for travel justification, especially from Europe. Beyond just publication – it’s about datasets, transparency of algorithms, rendering them more culturally appropriate. Completion rates – a big focus – are a very US-centric problem; Canada too, but in Europe less of an issue because of a different structure of enrolment. Big challenge around ownership of data, and who owns the analysis of the data – can I see what others see?

Ethics! Interesting study at ASU that didn’t clear the research board: swipe-card data analysis, how students move through the system. Clever on one level, but all sorts of issues. Did students know? Data as a transaction – you get my data in exchange for a service. But you don’t expect that at a university. It’s enormously unsettled; talked a lot about it in the Linked Data session. A big thorn in our side is how we address this. Researchers and participants in the community need to be pro-active here. If you don’t regulate yourself, there’s a benefit to someone coming in from outside and regulating you.

Scope – very LMS-data-driven, and a digital-only view of analytics. That’s a fraction of actual learning activity; we won’t find the big insights into learning if we confine ourselves to the LMS. Don’t need a Kinect in every classroom, tracking mobiles – but for research purposes that might be appropriate.

Integration and usability – bringing datasets together; harmonising data definitions. Analytics initiatives – can they bring all the data together? Each additional node provides exponential value to the analytics, but growing ethical and technical issues.

SoLAR open learning analytics architecture – intended to be a Moodle for the learning analytics space. Within the next decade there’ll be enterprise-level analytics, like our existing ERPs. The value of the analytics is in multiple data sources, so expect specific analytics tools. Ensure openness for researchers.

Target of analytics activity

Nice graph … so what? The outcome of the analytics process is not insight generation, but question generation. Why did those students in that group fare this much better? Why did this person have a black hole for grades? Analytics should prompt us to ask these sorts of questions. Beyond technical activity and into sense-making for learners, educators, administrators.

Example visualisation – http://thedoghousediaries.com/3426 – simple, actionable intelligence.

Can we move beyond tweaking this system to something larger? For education, it’s unknowable how it will develop – it’s a large complex system. Need to recognise whether we lock down, or liberate.

Connections to related fields and practitioners

As long as someone’s doing the work, I’d rather connect to their work than duplicate it. Connected specialisation. No one person can build a 747. The education system is moving in a similar direction. If there’s good-quality research in learning sciences, EDM, wherever, we need to ferret that out so we’re not a self-citing group of friends.

Not everyone has a Tony Hirst. Shows his 42-page analysis of Twitter feeds [PDF] from #lak12 – who retweets whom, and so on. With a limited amount of time, you can pull together a very interesting analysis. Reach and influence graph (fig 23).
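[Tony’s actual analysis was done with his own scripts against the Twitter API; purely as an illustration, here’s a minimal sketch – with invented data – of the kind of ‘who retweets whom’ network analysis the report describes, in Python with networkx.]

```python
# Sketch of a 'who retweets whom' analysis of conference tweets.
# The tweet data here is invented for illustration; the real analysis
# pulled #lak12 tweets from the Twitter API.
import networkx as nx

# Each tuple: (retweeter, original_author)
retweets = [
    ("ana", "keynote1"),
    ("ben", "keynote1"),
    ("cat", "blogger1"),
    ("ana", "blogger1"),
    ("dan", "keynote1"),
]

G = nx.DiGraph()
for retweeter, author in retweets:
    # Accumulate a weight for repeated retweets of the same author
    if G.has_edge(retweeter, author):
        G[retweeter][author]["weight"] += 1
    else:
        G.add_edge(retweeter, author, weight=1)

# In-degree centrality as a rough proxy for reach and influence:
# who gets retweeted by the most distinct people?
influence = nx.in_degree_centrality(G)
for user, score in sorted(influence.items(), key=lambda kv: -kv[1]):
    print(f"{user}: {score:.2f}")
```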

Questions

What’s the balance between computers and people? The value of the end-user experience with computers. Hype would lead you to believe it’s all machines, all the time. Unsettling; at minimum we need to question it.

Question raised – what’s the role of theory in this discussion? There’s a broad movement in the data science field that says it all emerges from the data. Wired 2008 article on the end of the scientific method – with enough data, just crunch it, and get insight. Critical questions are the start of all insight, not algorithms. What’s theory’s role? To be validated by the data? (Or refuted.) Or for theory to emerge from the data? We need to think about this relationship to move learning analytics forward as a science.

What do we need to do? How will we know we’ve been successful? Don’t have an answer. Quick starting points:

  • Research-informed tools for analytics, being used in practice, not just labs. Have this sense that the LA space has promise for a different researcher/practice relationship than other educational research. (Gets a thumbs-up from Ruth Deakin Crick in the audience.) Iterative cycle.
  • Contributions to the science of learning: understanding the impact of instructor activities, the impact of learner activity x vs activity y. Can overstate, but accurate to say that we will understand a lot more about the learning process through analytics than we have through previous models of research.

What are we missing in LAK? Have laid out four criteria, but what’s missing? What’s wrong? Own the solution. Have structured SoLAR so as not to get people to do what it’s already decided – rather, to support people coming in who want to own a particular problem. Currently a very egalitarian mindset, camaraderie.

SoLAR

Many opportunities to become involved. Become a founding organisation – universities, corporations.

Purdue conference – SoLAR Flare Practitioner’s Conference, October 1-3, West Lafayette, IN.

Distributed research lab, interdisciplinary space.

Maggie

Questions/discussion

Simon Buckingham Shum: Should LAK accept papers that could appear anywhere else, and if not, what’s the difference?

G: Redundancy is important in any ecosystem. If there was an application around the paper that would inform our LAK principles, I’d say yes. But if we accept an EDM community paper, we need to ensure it fits with e.g. the sensemaking/practitioner context. Short answer yes, concepts from EDM are relevant, but they need reflection into the LAK community. Likewise the other way – expect EDM to apply their standards. It might just be a matter of timing for your brilliant paper. We need to make sure papers from related disciplines recognise the community they’re submitting to.

Someone 1: Presentations heavily skewed towards HE, informal learning. How can we engage primary, secondary ed?

G: Tools that target K-12 education. Encourage researchers in faculties of education to become involved with the LAK/EDM communities. The work SRI did on the paper with the US Dept of Education was focused on the K-12 space. Beyond that, not sure what specifically.

Someone 2: In an HE setting, what does LA as a field have to say about implementation – research into how you move innovations into the classroom, onboarding instructors?

G: Significant in terms of impact. Want that relationship to practice. Don’t know what model yet; more than one approach. Develop tools like SNAPP: it takes five minutes to pop into your browser, go to your LMS forum, and you can run an SNA. Might take a bit of time to figure out. But that’s the elevator pitch for an LA tool. A few people in the research space need to pump up competence with tools like Gephi, R, SAS. But to ensure impact in classrooms, it’s all about usability of the tools. Goes back to Erik’s point about the UX.
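[SNAPP itself runs as a browser bookmarklet over the LMS forum page; as a hypothetical sketch of the underlying idea – build a who-replies-to-whom network from forum posts and see who’s central and who’s peripheral – here’s a few lines of Python with networkx, using made-up reply data.]

```python
# Hypothetical sketch of the kind of SNA a tool like SNAPP performs on
# an LMS discussion forum: build a who-replies-to-whom network and
# flag poorly connected participants. Reply data invented for illustration.
import networkx as nx

# Each tuple: (replier, person_replied_to)
replies = [
    ("student_a", "instructor"),
    ("instructor", "student_a"),
    ("student_b", "instructor"),
    ("student_c", "student_b"),
    ("student_a", "student_c"),
]

G = nx.DiGraph()
G.add_edges_from(replies)

# Degree centrality highlights hubs (often the instructor) and, at the
# other end, students on the periphery of the discussion.
centrality = nx.degree_centrality(G)
for person, score in sorted(centrality.items(), key=lambda kv: kv[1]):
    print(f"{person}: {score:.2f}")
```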

Someone 3 (Dai?): Open source issues around proprietary data. Add to the list the way the regulatory frameworks of different states impact the meanings we can make of the data. Cast our net wider.

G: Should emphasise there’s a variety of definitions. Recall the joyous days of defining learning objects! Ideally we wouldn’t do that. There are different dimensions. Academic analytics targets the administrative level more, e.g. VPs, completion rates. Analytics more broadly, business intelligence – what happens between contexts, e.g. grant/loan agencies. It’s about connecting to other areas. There’s a lot going on in EDM, BI; vendors are selling analytics tools to administrators already. Systemic impact not always clear. It’s the Viagra effect – you buy just because you’ve seen so many commercials. (Did I say that out loud?) We talk learning analytics – it reaches into other areas, which will be filled by others. The learner-teacher interaction needs to be targeted by us.

Someone 4: Seems this has an impact on tools as well. Intervening with individual learners. Institutions interested in LA and its impact on instructors. Academic programs, tuning them on LA results. Institutional plans, recruitment, enrolment. A wide range. Education is a big business; here’s another suite of tools that opens up so many possibilities that it’s hard to remain focused. Learner analytics, learning analytics, or which?

G: It will be defined by the discipline rather than our hopes. You can make progress – there are ethics and ego. Tough battle e.g. at Texas A&M around evaluating faculty – not very well received. Faculty evaluations online – no such thing as a quick improvement here. Strong faculty pushback around analytics on faculty members. Ideally I’d say learning: focus broadly, including learner, academic, policy – funding too. All the entities involved in an effective learning experience need to be under the microscope.

Erik Duval: You mentioned you wanted to go beyond capturing just digital activities – can you expand? E.g. using microphones in the classroom.

G: Our thinking is in line, which is a compliment to me, not you. Collection of data that happens in spaces like this – it is a physical space, and a lot is not captured. We’re LMS-centred, and that’s negative, unfortunate. Thinking learning analytics = online learning is unfortunate. ‘Honest signals’ paper – signals in physical spaces; capture that, analyse it. The ethics issue ramps up much faster once you’re recording, observing interactions. E.g. RFID tags on schoolkids moving around their space. Interesting insight, but the important emphasis is that we consider the learning process. If we’re collecting physical-world data, what questions can’t we answer with our current data?

David Wiley: I wonder, is ‘what’s good for the goose is good for the gander’ a principle here? Should a faculty member who won’t be evaluated on their own data be allowed to impose similar stuff on students? A sign-up for faculty, e.g.: I believe data can make us better – I’ll do it to you (students) but will let it be done to me. If not, do you have a right?

G: Great point. An ethical, procedural, political point. It cuts both ways. May be a declaration of faith, which could be problematic. It’s holding a mirror up to yourself as much as to the students. Analytics need to be student-facing. Why don’t students see that stuff? E.g. quantified self. Give them the tools to understand themselves. Multiple levels of reflection.

David: Why don’t students see that information about you? E.g. ungraded papers sitting in your inbox? We can’t have our cake and eat it too.

[applause in the audience]

Jon Dron: The mood has dropped. I share your fear of hardening at this early stage. The model you present – a little different to some people’s perception – might actually reinforce or cause that hardening. It’s about your engine. I like the analytics engine, the intervention engine. Really worried about the personalisation and adaptation systems. Parallel to intelligent tutors and user modelling, adaptive hypermedia.

G: Great question. Need to reflect on those points. I agree that the recommendation/personalisation engine is an issue. My Amazon account – never let your children get on your Amazon account. I get references to bare-chested men on horses because of my daughter’s purchases. YouTube is ignorant – I got a recommendation for a Britney Spears video. Maybe they were on to something? If personalisation is reduced to an algorithm – the best form of intervention is 1:1 education. The issue is ramping it up: harden parts of the soft system to make it scale. Important questions to think about.

Griff: We’ve learned from George not to buy Viagra through our Amazon accounts. Just because faculty think they’re not releasing this data doesn’t mean it’s not there. Students put it on Facebook, social networks. On a hiring committee – we looked at RateMyProfessor.

G: Issue of structured/unstructured, and what’s appropriate. Klout. Quality of assessment depends on quality of data.

Caroline Haythornthwaite: What have we learned from IT development? We don’t shove systems down people’s throats. HCI, participatory design, participatory action research – a much better way to think about bringing stakeholders into the design process. Value-centred design hits the ethics area. One thing missing here is talk about design.

G: Lori Lockyer has done a lot of thinking in this space – understanding how we develop our tools and methods. At Athabasca, Terry Anderson and Jon Dron are rolling out the Landing, a social space. Different metrics for tools that involve people doing things than for managing data. If a tool is to involve human beings, it needs a different design model – iterative, progressive in nature, co-evolving with the individuals in the space.

Someone 5: The notion of us doing stuff to students and them doing it to us, that language disturbs me a bit. Strong presentations are opening up space for partnership. Think about the language.

David: I used it to be disturbing.

Someone 5: What’s missing? A lot of interesting talks focused on learning, on what students were doing. Need to bring in the design of the educational activity – we know they were doing a discussion, but how did that play into the educational activity? It informs a lot of what we show back to people. Invite people who aren’t here yet to join the conversation. Educational researchers who may not flock to us may have something to offer. How do we be open and inviting to those who don’t have an affinity for data?

G: Terrific question. Dan and I discussed this last night. Bring in other communities with relevant research; we don’t want them to be inundated with concepts. Each field will have nuanced languages. This year we intentionally reached out to the EDM communities. Maybe next year, finding keynotes from those spaces. An option is to see who the natural partners are that we can interact with.

Someone 6: An underlying piece, around the LMS. Some of the things raised – like ‘why can’t we let students see this data?’ – LMSes have had already. Sometimes I feel the same way; people look at it as a closed box. Modern LMSes aren’t like that. Why slam it?

G: Because you’re obligated to at an ed tech conference. [laughter] You’re right, LMSes have changed. Talked to Desire2Learn, huge improvements, thinking pedagogically. More so than LMS vendors did in the 90s.

Someone 6: I don’t agree with that. Original vision was quite pedagogically-focused. Not acknowledging educators who spent a lot of time on it, seeing it close down and then open up again.

G: If asked whether LMSes have been good for learning, I’d say no. Influenced pedagogy for the negative. But changing. D2L toolset, wikis, blogs, analytics. They have come out of the valley of the shadow of death. More social spaces, more acknowledgement. But still the requirement that if you want to learn you have to do it in the space.

Someone 6: It wasn’t the LMS that did that, it was faculty members just going online; some were inside and outside the whole time. Not acknowledging that that particular product has a space – it’s one of the biggest places we get…

G: We’re out of time, thanks for your comments.


This work by Doug Clow is copyright but licensed under a Creative Commons BY Licence.
No further permission needed to reuse or remix (with attribution), but it’s nice to be notified if you do use it.

