LASI13: Monday (1) Welcome & Why

This is the first of many liveblog posts from LASI13 (#lasi13), the Learning Analytics Summer Institute, at Stanford University, CA, 1-5 July 2013.

The liveblog posts will be gathered together here as I post them.


Plenary: Welcome and why are we here?

Taylor Martin: The State of the Field

LASI, aka ‘Big Data Summer Camp’. Don’t panic – we don’t want to be in a safe space. But don’t panic.

Welcome on behalf of Roy Pea, who is not joining us this week, to help his recovery.

Converging Developments

Data science growing, new domain expertise. New cyberlearning infrastructure – inBloom and state efforts in US.

Key Problems

Vital needs – personalized learning, learner-centred; connected learning can lose the learner. Students front and centre. But balance that with exciting spaces – Scratch, Maker, etc – that enable people to follow their own needs. We’re getting lots more data.

Need to keep this in the context of learning environments as enacted. Don’t lose the CSCL understanding of learning in context.

Exceptional Opportunities

It’s the new microscope. A new tool leads to better measures of learning and teaching, and builds the field – illuminating long-standing issues around the design of learning environments. And theory building too!

Five Essential Points to Grow the Field

  • University leadership, growing programs. Talk to each other here.
  • Funding – we have a session.
  • New workflow tools – in the workshops.
  • Industry engagement and private/public partnerships – again here.
  • Collaboratories and knowledge networks – plans for leaving this workshop. Birds of a feather sessions.

Thanks to sponsors.

Instructs people to move out of their comfort zone, hatch new plots and schemes, go to at least one workshop where you wonder ‘who cares about that?!’ – meet people you don’t know. Cross-time-zone, cross-sector, cross-method, cross-role.

George Siemens: SoLAR Activities

Welcomes everyone again. Two things – what they hope will come out, and what SoLAR is and is about.

There are about 8 or 9 or 10 LASI-Local sessions. Hong Kong, Ecuador, Europe, many others. Network model – don’t treat the experience of coming together as being about a geographical space; it’s a global activity. Secondary component – join in the livestream event.

Canada Day! And 4 July. And three people’s birthdays in the room too.

Thanks to – organizing committee, Addy Dawes, program committee, LASI-Local events! Not just academics, but industry and others. Takes time to pull together. Thanks to Simon Buckingham Shum re LASI-Local and doing things online. Sponsors, supporters, and you.

Stories

The stories around a field reflect who we are, and the kind of institutions that we develop. They’re how we make sense in complex socio-technical environments. It’s not just in our heads, but in networks, and with technological agents in a network. The cognitive attributes are distributed and need to be considered through a sense-making lens.

So here’s the story he wants us to take away.

It’s a welcoming space. Significant number of applications to be here, couldn’t accept everyone. Balance in terms of gender, international representation, fields and disciplines – that’s part of what we want. Collaborative dimension very important. There’s a spirit of collaboration. Not about positioning and parcelling off our domains – if it’s a good idea, do it, and people will join you. We are asking you to help build a field, not join one. Worst outcome would be if we had a hub and spoke view, or these are the people you must talk to. Distribute it!

RBFI – Really Big Freaking Ideas – what are we doing in this space? What do we want to do? What impact do we want to have?

It’s notable work. All of a sudden, learning became cool a couple of years ago – MOOCs etc – which is significant. It’s a good point to have a big impact.

Also tracking and improving how people collaborate and grow knowledge. Substantially transformative impact on society, organisations, individuals.

Essential because our society is facing complex problems that are more complex than we’ve faced before – climate, social changes, diseases – they won’t be solved in the minds of a small number of individuals, but by global connected cognitive systems. Continual complex problems.

Want a balance between social and technical sensemaking systems – where do we still need to rely on social/discursive inputs?

Learning analytics as a means of deciding if we are making the right decisions as a system. So the most important thing in the school system – it’s a feedback loop that tells you whether any system will get a better or different understanding. It’s like the inclusion of milk in the school system, said someone else.

Conference series – Learning Analytics and Knowledge. Goal was to bring technical, pedagogical and social domains in to dialogue to create systems that don’t duplicate expertise but connect to it.

To advance, four elements:

  1. new tools, techniques, people – aim here to bring in new people, build awareness
  2. data – openness, ethics and scope – access for training, Gates Foundation grant around MOOCs. Like the Pittsburgh DataShop. Lots is textual or clickstream, not capturing contextual factors – we’d need to tag everyone with RFID tags and monitor everything (shoutout to the NSA).
  3. target of analytics activity – not just IDing at-risk learners, but optimising the entire learning experience.
  4. connections to related fields and practitioners – don’t duplicate expertise elsewhere. If we’re going to advance as a field, we can’t be third-tier machine learning specialists. Not duplicating it, but connecting and growing.

SoLAR

Founding organisations – UBC, Athabasca, U Queensland, U New England, Marist, U Texas Arlington, U South Australia, OU, Stanford.

Things we do, all details on the web: LASI (this!); LAK conference; LAK data challenge; Flare – local meetups; Storm – PhD training, monthly distributed lab chat; OLA (Open Learning Analytics) – gaining new traction with e.g. edX recently.

Questions

Shane: Can you mention the journal?

There is a Journal of Learning Analytics – Dragan, Shane, Phil and John Campbell are editors.

Someone: SoLAR is an integrated space – technological, social etc – very different from a singular focus. Can you talk about the other societies and conference series you see as tied in or converging?

There are other folks who can comment better on their goals. We’ve had good working relationships with IEDMS; the ideals are very similar. Ryan and I did a paper on that at LAK. It’s a shifting field. From SoLAR, we felt we couldn’t understand learning through an exclusively algorithmic lens. In that context, you don’t necessarily have the understanding of what’s going on in all those dimensions. It’s difficult to do – technical people think presentations aren’t technical enough, and others think there’s not enough learning in there. A challenge. Good conferences bring together the technical with the social and pedagogical. The publications from our first 3 years of proceedings reflect that. Some workshops and panels reflect that.

Caroline: People from other organisations, industry, would they like to comment?

Ryan Baker: from the EDM society. Lot of convergence. Great to have multiple venues. Get different emergent values and interests. That talk George mentioned – interesting differences there. Linking to practice, which SoLAR and LAK do, and to educational theory (an interest in the EDM world) is useful for making a field that makes a difference to education. Big step in this meeting. Neither conference has brought everyone together.

Someone: Who’s missing?

Ryan: Taylor Martin, Roy Pea weren’t at the first ones. There are core learning scientists. Not many core machine learning specialists. Some strong representation from psychometricians.

Taylor: What we hope to bring is like SXSW Edu, which started as 2-3 sessions and this year had 5000 people. The ed tech startup community is growing and exploding; they’re doing similar stuff. Hope there are people here from that group too. We as academic organisations – how do we make those partnerships to get learning science developed, get great tools, get great data, find out a lot about learning?

Stephanie: Other disciplines: HCI, data visualisation people. My goal goes back to George’s comment about us not being techie enough for technical people etc – bring in people who are doing things that intersect with what we bring. I’m in an interdisciplinary department. People bring in key ideas and methodologies. Don’t think you have to think about stuff you don’t want; we can partner with those people and put the learning and teaching stuff in. Let’s bring the best from those fields into our space.

We want to connect expertise in to the community rather than duplicate a lower-quality activity. Create a space where there’s respect – technical experts are respected by pedagogical people and vice versa.

Someone2: People in education don’t know what’s possible. I’ve been working on a grant with other faculty, trying to talk to an NSF officer about text mining – he doesn’t even understand what we’re trying to do. Build that awareness – there’s a lot of work there.

You’re 100% right. We hope to produce a report on how we learn, to serve university leaders, granters etc. Also a Handbook of Learning Analytics, a collaboration between IEDMS and LAK. Hope for it by LAK14.

Caroline: Director of an iSchool, at the intersection; learning is a natural fit.

Stephanie: They just don’t know it yet!

Caroline: They’re good on analytics, big data, so are natural partners.

Nicole: I’m in a Business School. Working with Taylor, there are other fields – management information systems. Lots of things looking at adoption and diffusion of technology. Influencing use among students, teachers etc. Use these techniques to make a big impact across. Some of my background is in businesses, situating these things in organisations – academic environments are just another organisation.

Taylor: Hope we expand and build our vocabulary.

Andrew: Robust studies on use of data in schools from a policy angle. 20y. Phil’s a good resource to connect to that.

Shirley: I’m wondering whether the big ideas are big enough given the climate we’re facing. Could we use this week to come up with even bigger ideas and pose them as questions, taking a trans-disciplinary approach? E.g. preparing for a climate of continual complex problems. For me, it’s how we can do more and more with fewer and fewer resources – how can I use the data to make better decisions about which areas to fund? Post those later this week?

We have a wrap-up panel on Friday, get them to target those. Here, it’s not just a small research area, it can have a dramatic impact on society.

Taylor: Let’s get a Google doc going.

George: Don’t ask for permission, do things.

John

About people being chosen to come. As a program committee, also tried to increase the dimensionality of the group – folks who weren’t the usual suspects, to get in to the corner cases. This is a very high dimensional space, lots of ways to think about how to move it forward. Many epistemological frames to bring to bear on that. We’re starting in a place, but you’re here to bring that place forward.

The sessions are structured purposefully. In the morning, plenaries that are simulcast around the world. All in this room. Afterwards, breakout time. Two purposes: birds-of-a-feather time, for people who want to work on common goals; also meeting with the presenters from the morning. Not a conference, where we all receive information – more an institute where we have responsibility to contribute. It’s not about us getting our heads filled – or having a vacation! We’d have had it on Maui if that was the plan. Then lunch, also time for conversation. Then the afternoon is parallel tracks – you go to just one of them. Extended break for time to discuss and move things on.

Also the online environments – the LASI-Locals that make it LASI Global.

There’s a reception or dinner every night. Highly scientific approach to collaboration-building.

Question: Workshops in the afternoon – are there other summaries to describe what they’re about?

Taylor: We’re working on it. If you’re running a workshop, let us have the summary!

Taylor: Quote from Willy Wonka : “Invention, my dear friends, is 93% perspiration, 6% electricity, 4% evaporation, and 2% butterscotch ripple” – apposite because we have no aircon today.

Panel: Why Big Data & Analytics could transform the Learning Sciences and Education

Chair: Dragan Gasevic
Panelists: George Siemens, Dan Suthers, Phil Winne

Dragan

We have three panellists. George. Dan, from University of Hawaii. Phil Winne, Simon Fraser University.

Revisit the definition: ‘Learning analytics is the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs.’ Came from the people arranging the first conference. Both parts are important. Like medicine – an evidence-based discipline. We are trying to build one – are we? I’m married to a public health researcher, came into this.

Do we have disconnected learning, where from a practical, real-world perspective it’s separate from the world of research? Very few enthusiasts, mainly from Schools of Education, who are trying to make a difference and follow evidence-based processes.

What values should we promote? Targeting ‘learners on average’ – tends to happen with stats using mean values. On a long flight, the bed was a bit short for me at 6’1″. There was a very tall guy for whom the bed didn’t fit at all. That’s a potential problem. Lots of fuss around dashboards – they’re mainly reporting. What happens with specific individual learners? Focus on courses that deal with knowledge transmission? Or do we prefer to promote e.g. individualisation, contextualisation, and socialisation? Tools and techniques in one course may not fit other courses – even later presentations of the same thing. Are we really after information transmission, or more critical thinking, etc?

How do we advance the science? We can move beyond self-reports! We’re using surveys here – a useful instrument. What do we need to do? Are we sure that trace data collected today can measure important constructs? Are our methods sufficient?

Societal impact – additional responsibilities to be careful about.

Extending existing practices, or a new paradigm? Self-reports, laboratory, intrusive, causality – we want to intervene/instrument. GIGO (garbage in, garbage out).

George: How will big data impact learning sciences?

Big data is not very clearly defined, and learning sciences differs too. Arthur Graesser, Editor, J Ed Psych – “escalating the speed of research on many problems in education. Not only can you look at unique learning trajectories of individuals, but the sophistication of the models of learning goes up enormously.” MOOC research call. Short timeline for grants. One person said it’s impossible to do something in 6 months – create hypothesis, design experiment, write up, etc. We’re saying, if you’ve run a MOOC, or you have the data – it’s there. It’s about asking provocative questions of that data. The traditional rigorous approach is important, but I’m emphasising there’s a lot of unanalysed data that’s sitting there. Stop gathering, start interpreting.

Baker and Yacef 2009 (EDM) – five primary areas of analysis: prediction, clustering, relationship mining, distillation of data, discovery with models. Bienkowski, Feng and Means (2012) – five areas of LA/EDM application: modelling, profiles of users, modelling domains, trends, personalisation/adaptation.

Combined: two principles. First, techniques – a large field. HCI; Rich on personalisation/adaptivity back to the 70s. More recently cognitive models in the 70s/80s, and on. E.g. citation analysis, best uses of knowledge mining, e.g. Google. Social network analysis is really important – who talks to whom about what. Knowledge domain modelling crosses over to application. Applications include adaptive hypermedia, BI, trend analysis & prediction, personalisation and adaptation.

What will LA do for education? It’ll add a new research layer, calling into question the existing approaches, asking how we can do it faster, more iteratively, relying on existing data – reasonably authentic research settings. Personalisation – the holy grail, and has been for a long time. (Me: I reckon it’s like fusion – 10y in the future and has been 10y in the future for a while.) In domains where there’s a right answer – STEM. New knowledge creation, artistic creation, etc are much harder. Optimisation – often about identifying students at risk. But there’s a big impact in optimising across the entire spectrum – move from a negative orientation. Organisational insight – budgets, pressure, accountability. Improved decision-making. New models of learning, e.g. competency-based. Fragmentation, unbundling of HE drives that. Granularisation of the learning experience itself – more and more. In threaded discussion forums 7-8y ago, they’d post responses to things that week. But now learners have access to a range of resources – so there’s more of a fragmentation/pushing-out: learners share resources. Network transition in education. It’s not the students talking just about what the teacher says; it’s moving beyond.
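(Me again: as a concrete, hedged illustration of the ‘identify students at risk’ use – not from George’s talk, and with entirely made-up features and data – a toy at-risk classifier might look something like this:)

```python
# Toy sketch only: hypothetical engagement features, synthetic data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical features per student: logins/week, forum posts, quiz average
X = rng.uniform(low=[0, 0, 0.2], high=[20, 15, 1.0], size=(200, 3))
# Synthetic outcome: low engagement plus low quiz scores -> did not pass
y = ((X[:, 0] < 5) & (X[:, 2] < 0.5)).astype(int)

model = LogisticRegression().fit(X, y)

# Estimated risk for a new student: 2 logins/week, 1 post, 45% quiz average
print(model.predict_proba([[2, 1, 0.45]])[0, 1])
```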

Dan – Learning as a complex phenomenon: challenges for learning analytics

Move beyond narrowly-focused education. Learning in a socio-technical setting (from a paper he did in ijCSCL 2006). Theories differ about agency – who or what is the agent that learns? Individuals, small groups, networks (up to societies). They also differ on epistemologies – what is the process of learning? Acquisition, intersubjective meaning-making/group cognition, change in participation (the community reproduces by bringing in new learners). The correspondence is not strict (it’s not that individuals = acquisition).

Individual epistemologies – can be local (e.g. Piaget on accommodation) – epistemological gradients, from what the learner knows to what they can learn. Also network-level theories of learning – strength of weak ties, how information moves around.

Intersubjective epistemologies – can be local or network too (Scardamalia, cultural capital, etc).

Participatory epistemologies too. e.g. Communities of Practice and Legitimate Peripheral Participation.

Challenges:

Claim is learners participate in all those forms of learning simultaneously. With better tools, LA could put that back together.

The challenge is to rise above one-dimensional analytics – to how learning takes place through the interplay between individual and collective agency. This demands analyses that connect learning at specific times and places with larger socio-technical network contexts, and requires multiple methods and traditions. LA must become multivocal: a dialogue between disciplines.

Two projects that address this.

First: the Traces Analytics Hierarchy (Suthers HICSS 2011, Suthers & Rosen LAK 2011). Activity is distributed, fragmented. Logs may record activity in the wrong ontology. Behaviour is contingent on its setting. Abstract from log files our inferences about what the interaction really is – who is taking up what was offered by whom; also abstract mediated associations – who’s reading and writing what messages. SRI’s Tapped In, looking at mediated associations – can abstract to traditional SNA. Want to be able to go back down – from the networks to the people working together.
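(Me: a minimal sketch of that abstraction step – not Suthers’ actual pipeline, and with a made-up event log – going from ‘who replied to whom’ events to a network you can run standard SNA measures over:)

```python
# Toy sketch only: hypothetical reply events abstracted into a social network.
import networkx as nx

# Hypothetical log entries: (actor, action, author of the message acted on)
events = [
    ("ana", "reply", "ben"),
    ("ben", "reply", "ana"),
    ("cat", "reply", "ana"),
    ("cat", "reply", "ben"),
    ("dev", "reply", "cat"),
]

G = nx.DiGraph()
for actor, action, target in events:
    if action == "reply":
        # Treat each reply as a mediated association from replier to author
        G.add_edge(actor, target)

# A traditional SNA measure over the abstracted network
print(nx.in_degree_centrality(G))
```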

Second: to do this work, you have to bring different people together, who have different epistemologies behind them. Productive Multivocality Project – the learning sciences are diverse – how to bring multiple analytic ‘voices’ into productive dialogue to provide some coherence (Suthers et al CSCL 2011, book coming out). 5y project, 37 researchers. Shift from a technical focus to a social-dialogical focus – mix up the people working on data. Easier to figure out the dimensions of difference. Hard to dialogue between epistemologies. Ideas about what you’re trying to do – the teleological question – and how you go about knowing what you want to know. Not mixed methods – here you may have disharmony, but still productive; productive tensions as well as harmonious use; multiple voices/agency. Strategies for this – dialogue about the same data from different perspectives. But need a vague objective – e.g. ‘pivotal moments’ – so you have two boundary objects (shared social objects) – also bringing representations into alignment with each other and the original data – good tools help. Push the boundaries of traditions without betraying them. Reflect on practice – step out of the methodological view, become aware of methods as ways of producing evidence and sustaining arguments.

Learning in socio-technical settings involves multiple agencies and processes. Requires analysis across multiple levels, and coordination of diverse disciplinary traditions.

Learning analytics can help us understand and manage learning in its full complexity.

Phil Winne – Will big data and learning analytics transform learning science and education?

I’m an iconoclastic, focus-on-the-individual person. I want to approach this optimistically.

Roediger’s lament – we aim to get general laws, but it’s hard to make generalisations … WHY? How can you answer a question for a person without adding ‘it depends’?

“laws” of learning are slippery.

Obstacles to transforming learning science – many multiple factors have an impact, and the question is whether they interact. That won’t work – it requires N of about 12,000. Our experiments are too small.

There are others beyond that. When students ask ‘why did I do poorly?’, I ask how they studied; they say ‘here’s how’; and when I suggest an alternative they say ‘if you’d said that, I’d have done it’. Papers often make things unclear in the operationalisation. You make things that fit so they meet your goals, not experimental goals.

Students would sit highlighting everything in their books. He’d ask why they’re highlighting; they’d say they’re trying to learn: “I’m condensing the text I have to read before the exam”. Multiple colours – some would say ‘I happened to pick up this one’; some had systems – e.g. blue for things to ask about, yellow for things mentioned in the lecture so likely on the exam. So you can get traces by asking them, or by providing an interface where you share tags. If we can develop traces like that, we’d be in a better position to understand what’s going on.

Transforming education. Data are typically incomplete, imprecise, impermanent (the I-cubed, I³, problem). Many people have different goals. Some just want to get through. Some are desperately interested in one small segment of the course. Others want to get into law school or med school and need the highest possible score in the exam regardless. People are afraid of being evaluated, yet it’s fundamental to how people engage with the world. Learners engaged in metacognitive activity are evaluating themselves. Contexts matter. Experiments are too small – if there are multiplicative effects – does it matter? It used to be that you made your name by creating an instrument – e.g. an inventory. Are the data commensurate? Representing the same constructs? We have to solve this problem. We almost never get the data we need, they’re imprecise, they’re very fleeting. In 38y at SFU, I’ve never seen a student keep a log in a way an experimenter in a lab would. If I’m a learner trying to improve, how would I know?

A simple question – e.g. do you like ice cream? – until you discover there are some contexts where you don’t like it. Do you remember those when asked, or when you’re a learner trying to improve your learning? No. All those papers they fill out – the I³ problem is pervasive.

The goals that we have are not easy to work with; there is much resistance to gathering data. Many issues related to the I³ problem. The imprecision of our data – we pay too little attention to scaling. Precise enough that we don’t have to worry about error bars for individuals – e.g. cutting a piece of wood can be off by a tenth of a mm but not 2mm – what’s the cutoff for us here?

Big data might help us get over the multiplicative problem. Gathering traces – group learners post hoc, look at a host of outcomes, and update the model continuously in real time.
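(Me: a hedged sketch of what ‘group learners post hoc from their traces’ could look like in practice – nothing here is Winne’s method; the trace features and data are invented:)

```python
# Toy sketch only: cluster learners on synthetic study-trace features.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
# Hypothetical traces per learner: highlights made, notes written, re-reads
traces = rng.poisson(lam=[30, 5, 2], size=(150, 3))

X = StandardScaler().fit_transform(traces)
labels = KMeans(n_clusters=3, n_init=10, random_state=1).fit_predict(X)

# Mean trace profile of each post-hoc group
for k in range(3):
    print(k, traces[labels == k].mean(axis=0).round(1))
```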

Someone: Thanks Phil. Critical issues to discuss. One, what are the goals of LA? Two, what are the big potential dividends? What’s special about LA? Another is, we need multiple metrics, tracking high-quality data, not just big data. Is it well structured, interpretable, and on and on.

Emily, Stanford: For Dan – with a lot of disciplines brought together, is it necessary to be grounded in a particular discipline and come to a trans-disciplinary space, or to train people to be trans-disciplinary? Hard to navigate.

Dan: Great question. We’ve talked about this a lot. Being grounded at first is good. Grad students have a lot to deal with; the adviser is working in one tradition. We do need more to trade that off with the awareness to look at other things. Caroline has experimented with her students – it will be in our book – tried to get them to come together from different disciplines.

Jim: First set of slides, the definition you presented: if anything we measure about learning is LA, that’s overwhelming. Especially with Phil’s I³. How do we get more specific about the space of LA? When we say it’s everything, we tend to not do anything.

Dragan: We’re identifying these issues. Phil and Dan pointed to important issues and phenomena we need to study. Unless we start measuring – instrumenting to be able to measure constructs that matter, that are well established – rather than just counting the pageviews, we’ll get nowhere. A paper reviewed the five best information-seeking models: for decades, library sciences tried to count everything, e.g. number of visits, books – then realised that just counting wasn’t useful, couldn’t build research and practice on it – once they included more qualitative measures, things got better.

George: Generalisability is hard to address. Blended, online, physical settings, different pedagogical approaches. Complexity of analytics. The education system addresses a normative model of a particular learner, and targets that – too bad for the outliers. The potential of LA is to move from that; it’s for individuals. A different way of thinking. The n is too small, but the impact is reduced to just one person.

Dragan: Economics matter too.

Caroline: How does that feed back out to the learners? How is it going to transform learning? ‘It’s going to escalate the speed of research – it’s going to be transformative’ – it’s like saying the Internet increased the speed of information transmission. Yes, but. Provide information so they can do something about it.

Phil: The acceleration comes from the fact that every learner is an experimenter in an ecology of experimentation. Trace adaptations, how they unfold, looking at things from different angles. Get many many experiments. Also get longitudinal development. That’ll provide acceleration.

Someone3: Circling back to the definitions question. What is not learning analytics? If you believe what’s written, everyone’s doing analytics now. What was ‘we’re looking at data’ is now ‘we’re doing learning analytics’ – but not everyone is. There’s an edge we’ve not marked yet. What doesn’t fall into the bucket?

George: The LAK11 definition – not horribly eager to define things. There’s a sense in which it becomes exclusionary. Wanted to be as broad as possible. Don’t want to say ‘that’s not LA’. Went for a broad, vague definition to allow people to see themselves in that space. I think it’s one that’ll become precise after a while. Adam Cooper’s JISC papers defining it, e.g. focus on practical activity. Multiple definitions; the community defines it based on activity. LA has to include technology to apprehend a dataset that can’t be cognitively comprehended – the dataset exceeds cognitive comprehensibility. So we have to talk about what technologies are required. So that shows some things that fit outside.

This work by Doug Clow is copyright but licenced under a Creative Commons BY Licence.
No further permission needed to reuse or remix (with attribution), but it’s nice to be notified if you do use it.


