LAK13: Thursday (6) Keynote

More liveblogging from LAK13 conference – Thursday morning, keynote.

(Roy Pea is unable to attend, so Abelardo Pardo’s keynote has been moved here from Friday.)

Erik Duval welcomes everyone again. Dan Suthers gives an overview of the program. Best papers will be announced later.

Simon Buckingham Shum – in the 80s, neuroscience was forming as a discipline; John Bruer ran annual brain camps to help form the field. The Learning Analytics Summer Institute is aimed at taking that role. First week of July at Stanford, but also a network of satellite events – LASI-Local events, running in parallel – five so far, across the world. You can create your own too!

Dragan Gasevic – new journal: the Journal of Learning Analytics. Education-Computation-Sensemaking. Research papers and reflective practitioner papers. Any discipline, any focus. Editors – Shane Dawson, Dragan Gasevic, Phillip Long – and John Campbell for the reflective practitioner section. Peer reviewed, three issues a year. Open access, CC BY-NC-ND. Submissions for the first issue due April 30, 2013. Call for special issue proposals. @jla_editorial

Two special issues coming out, including one in American Behavioral Scientist. SoLAR has its AGM at lunchtime.

SpeakUp tool – a mobile app, on Google Play and the Apple App Store – can be used to raise questions and to peer-review (thumbs-up/down) other people’s questions.

[Image: Belgian chocolates – (cc) buttersweet on Flickr]

Keynote: Abelardo Pardo – Bridging the Middle Space with Learning Analytics

Xavier Ochoa introduces Abelardo.

Slides will be at http://www.slideshare.net/abelardo_pardo

Focus on the ‘middle space’, telling the story of his journey into the middle space. He’s an engineer, a computer scientist, all about technology.

Two islands: one is educational theory and practice – the practitioners live there. The other is tools. Learning analytics is the middle space between the two. It’s building a bridge, which arises from two forces. From technology, learning analytics is about how to apply tools to a real domain – there’s increasing pressure for technologies to make an impact on real scenarios. From the [course?] designers, we have a similar force, aiming to gain as much insight into an educational environment as possible. The bridge arises from both of these pressures. It connects tools, technology and algorithms with educational design and pedagogical strategies. Need agile feedback between the two – a tight feedback loop.

I was an engineer; I know about algorithms, Intelligent Tutoring Systems, IMS Simple Sequencing, IMS Learning Design. How did I discover that other island? Born a ‘tooler’, but captivated by practice. He also has to teach, and had to find a balance between technology for learning and his own teaching. The connection wasn’t there 10, 15 years ago. Ten years ago, the view of learning technology was very complex and sophisticated, with many knobs and dials. That view was wrong – countless difficulties convincing practitioners. His view of teaching 10 years ago was focused on the subject, the best examples – always around the topic. He was topic-intoxicated. It seemed perfectly OK. Then he realised things were not working as expected. Made a classic mistake – blaming the students: I put so much effort into the topic, increased the quality and quantity of the material, but they didn’t learn. Was very struck by a publication from Kugel (1993) about the transition to university teaching. When you start teaching at university, your focus is on yourself, how you’ll be perceived. Then you build confidence and make a transition to the next stage: focus on the course topic. Many remain in that stage. Some people make another transition, when they realise the students need to be receptive. Students are then at the centre. Two more steps – next is the student being active. (Self, Course Topic, Student as receptive, Student as active, Student as independent.) Some people make some transitions, some are stuck – but an event may move them.

Then he had the responsibility to design a course from scratch, so he went to the designers’ island. It was a C programming course. He was going to write the course agenda, the topic list. Then he tried to do it. Valero 2003 – engineers tend to go for a problem quickly. Jumping into the designer space, there’s lots of theoretical debate; he wanted something to get to the solution quickly. Basic steps from an engineer: the course agenda is the last thing to write. Focus on the programme objectives first. Review what skills the institution claims the graduates will have when they’ve finished the degree. Then translate into more specific objectives – Bloom’s taxonomy. Then make a choice about how to address those. Then design activities.

Design stages: Programme objectives, Specific objectives, Project on course, Design activities, Course agenda.

Conceiving the activities as elements to be aligned, to give a path for students to participate and achieve certain objectives. A very different experience. Then writing the course agenda was a bit arbitrary. Most academics laugh about learning objectives and see them as a waste of time – because they see this process upside down, and in that direction it makes no sense: you’re just making them up. Connect to the objectives explicitly. Still feels a bit upside down.

Focus mainly on the activities. They appear as a collection of bricks to make a structure that achieves the objectives. The course becomes like a bridge: making sure the pieces are located right, fostering participation, all connected to help the transition. Then, crucially, you need to observe students while they work. Need to intervene and adapt, make changes on the fly – if it isn’t going to work, need to change it; react quickly to what happens. Once you have the information about what’s happening, offer it back to students so they can reflect. One activity – first day of class, discuss the objectives, what they mean, and how they see themselves. Self-reflection.

Finally: learning analytics is an area that needs sophisticated algorithms, but it fits within this view.

Research projects and how they’re shaped by this approach.

If creating a technology, apply it to your own courses. Use your own tools – ‘dogfooding’. We should seriously consider that. At Google, the people working on Android run the version of the OS that’s about to be released on their own phones. So if it freezes, it does so first in the hands of the people who are building it. You also have to maximise the impact, more widely than your own classes.

If we focus on these activities – most of them are outside the LMS. The LMS is a meeting place, but the activities from the course (which is highly practical) were outside it. One objective was for students to be proficient users of industrial tools. We need to observe them outside the LMS, while they’re using those tools. Need events capturing those activities. This is the main driver behind the research effort.

The course: 250 students, second-year engineering, highly practical. They meet twice a week, for lecture and lab, and also work in groups of 4-5. We provide them with a virtual machine (VM). A VM is a 3-4GB file; it’s a new desktop – if you maximise it, you have another computer running on top of your computer. We pre-install everything there. In exchange, we collect a lot of information about what students do.

Certain tools we thought were important are pre-installed in the VM. We instrumented each and every one – they record the way they’re being used, with a timestamp. Our students had to submit certain documents back to the servers – to a tutor or team mates. An activity says: read this doc, open this file, do this procedure, send a file back to your team mates. All the events travel with the file when they make the transfer – so it’s not quite real time.
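(He didn’t show implementation details; as a rough sketch of what such instrumentation might look like, each wrapped tool could append timestamped records to a local log inside the VM. Everything here – file location, field names – is an invented illustration, not Pardo’s actual code.)

```python
import json
import time
from pathlib import Path

# Hypothetical local event log inside the VM; the path is invented.
EVENT_LOG = Path.home() / ".analytics" / "events.jsonl"

def record_event(tool: str, action: str, detail: str = "") -> None:
    """Append one timestamped usage event, one JSON object per line."""
    event = {
        "timestamp": time.time(),   # seconds since epoch
        "tool": tool,               # e.g. "gcc", "gdb", "editor"
        "action": action,           # e.g. "open", "compile", "run"
        "detail": detail,
    }
    EVENT_LOG.parent.mkdir(parents=True, exist_ok=True)
    with EVENT_LOG.open("a") as log:
        log.write(json.dumps(event) + "\n")

# A wrapper around each instrumented tool would call something like:
record_event("gcc", "compile", "exit_status=0")
```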

What came out? They were supposed to use several tools, but not all were mandatory – the focus was on solving a problem, with tools you could use. So they analysed the pattern of tool use and choice. The use of certain tools correlated highly with academic performance – if they use one or the other, that’s a worthwhile pattern. We could derive a predictive model: if they barely use these tools, there’s a high chance of problems or low performance.
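(The talk didn’t specify what form the predictive model took; as a hedged sketch, a simple classifier over per-student tool-usage counts could look like the following, using scikit-learn. The data and feature columns are invented toy values.)

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Rows: students; columns: per-tool usage counts (e.g. compiler,
# debugger, version control). All values are toy data for illustration.
X = np.array([
    [120, 40, 15],   # heavy tool use
    [ 90, 35, 10],
    [  5,  0,  1],   # barely uses the tools
    [ 10,  2,  0],
])
y = np.array([1, 1, 0, 0])  # 1 = passed, 0 = low performance

model = LogisticRegression().fit(X, y)

# Flag students whose predicted probability of passing is low.
new_student = np.array([[8, 1, 0]])
p_pass = model.predict_proba(new_student)[0, 1]
if p_pass < 0.5:
    print(f"At risk: predicted pass probability {p_pass:.2f}")
```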

Second idea – trying to connect to real use. There was a browser in the VM. Collect the URLs the students visited, compare with the course notes, detect new resources. They’d put effort into improving the course material, designed for the activities rather than as a textbook – he thought it was very high quality. But students can find this kind of material in 15 seconds on the web. So they analysed the searches. Some pages are behind credentials/authentication, some are irrelevant, some are interesting. Analysed the text, captured the most frequent terms, and so had a representation of those documents. Then did the same for the course notes. A computationally expensive procedure, but it found the most relevant documents the students browsed while working on the course. Took those back to instructors, who said they were very relevant. Could surface these as ‘related resources’.
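(No implementation details were given; one standard way to do this kind of comparison is TF-IDF vectors plus cosine similarity, sketched below with scikit-learn – the actual procedure may well have differed, and the documents here are toy stand-ins.)

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Toy stand-ins: the course notes plus the pages students browsed.
course_notes = "pointers arrays memory allocation in the C programming language"
browsed_pages = [
    "tutorial on C pointers and dynamic memory allocation",
    "celebrity gossip and football results",        # irrelevant
    "reference for malloc free and memory leaks in C",
]

# Build one vocabulary over all documents, then compare each browsed
# page against the course notes.
vectorizer = TfidfVectorizer()
vectors = vectorizer.fit_transform([course_notes] + browsed_pages)
similarities = cosine_similarity(vectors[0], vectors[1:])[0]

# Rank browsed pages by relevance to the course material.
for score, page in sorted(zip(similarities, browsed_pages), reverse=True):
    print(f"{score:.2f}  {page}")
```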

Another idea – there are many events, but we want to connect a sequence of events to an activity. So: group activity steps, detect the most frequent, relate them to activities. Fed back to teachers, who can see when it’s not working as expected and intervene – e.g. post in the forum redirecting attention. Useful – it was information not otherwise visible.

Refined it, computing n-grams over the event sequences. Found a correlation between key actions and academic performance: the higher-performing students have certain key actions that characterise their performance, so you can intervene for those who don’t show them. Instructors found it useful and unexpected.
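(A minimal sketch of the n-gram step, assuming the events form an ordered list of actions per student – the details here are mine, not from the talk:)

```python
from collections import Counter

def ngrams(actions, n=3):
    """Yield successive n-grams (tuples) from a student's action sequence."""
    for i in range(len(actions) - n + 1):
        yield tuple(actions[i:i + n])

# Toy event sequence for one student: ordered tool actions from the VM logs.
actions = ["edit", "compile", "run", "edit", "compile", "run", "debug"]

counts = Counter(ngrams(actions, n=3))
for gram, freq in counts.most_common(3):
    print(freq, " -> ".join(gram))

# Per-student frequencies of such 'key action' n-grams can then be
# correlated with grades (e.g. via a rank correlation) to find the
# patterns that characterise higher-performing students.
```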

Worked on visualisation. One identifies for the tutor the 5 highest and 5 lowest performing students. Another is a dashboard on activity, shown to instructors.

If the course notes are in HTML, you can embed multiple-choice questions. Students don’t go to an assessment centre and all that – it’s right there in the page. A finite state machine: they can check their answer and get feedback; the buttons change. Can measure the level of engagement with the document. Relate it back to instructors as fast as possible – just-in-time teaching. They get a report on how students are engaged. Can do similar with e.g. videos in HTML5. Observing students while they work, relating it back to the instructors. Detect misconceptions quickly.
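(In the actual pages this would run as JavaScript; to keep all the sketches in one language, here’s the finite-state logic of such a question widget in Python. The states and behaviour are assumptions for illustration, not Pardo’s design.)

```python
# Minimal finite state machine for an embedded multiple-choice question.
# The state determines what the buttons show; every transition is an
# engagement event that could be logged and reported back to instructors.
class QuestionWidget:
    def __init__(self, correct_option: str):
        self.correct = correct_option
        self.state = "unanswered"
        self.attempts = 0

    def check(self, chosen: str) -> str:
        self.attempts += 1
        if chosen == self.correct:
            self.state = "correct"      # e.g. button turns green and locks
            return "Correct - feedback shown."
        self.state = "incorrect"        # e.g. button offers 'try again'
        return "Not quite - try again."

q = QuestionWidget(correct_option="b")
print(q.check("a"))          # Not quite - try again.
print(q.check("b"))          # Correct - feedback shown.
print(q.state, q.attempts)   # correct 2
```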

Information related back to tutors easily – like SNAPP. It’s hard to persuade central IT people to modify pages on the LMS, but you can give tutors an add-in for their browser that’s sensitive to certain URLs. When a matching page comes up, it modifies the page, embedding data from the servers next to each student. Can see different profiles – who’s engaged, who’s not. You have to work on the technology; but then comes the least fun part, facilitating the use of the technology by the designers.

We realised the 250 students broke down into sessions of 35-40, with 6 or 7 tutors teaching different groups. So you can see the group differences – instructor analytics. Not a Big Brother thing: instructors are open to it, they see the numbers – e.g. another instructor’s group was better, so they can improve. Increase motivation, increase engagement. This was a lot of fun. Designed activities based on the fact that we could observe.

Bi-directional feedback: the course design was affected because they knew they could observe. They can detect whether engagement is appropriate and, if not, prepare different versions.

Three more ideas, from University of Sydney.

Detecting events from students while writing. Writing is an interesting activity; it uses a lot of cognitive skills. The tool is being deployed university-wide. (Calvo et al 2011, Liu et al 2013.) People are open to this, but take care of the adoption barrier.

Automatic question generation. Get students to write a scientific document. Can analyse the text and detect patterns, then send them three or four automatically generated questions – e.g. is the connection between A and B clear? It’s fairly impressive. (Liu et al 2010.)

Facial expression recognition – not that far away. Microsoft Kinect applications; algorithms becoming more efficient. Students watch a video, you capture video of them, and can get good attention data. (Monkaresi et al 2012.)

Middle space. There’s value in remaining there.

The Matrix scene – red and blue pill. Neo talking to Morpheus, faced with a choice of two pills. The blue pill offers the possibility of continuing to believe his life of deceit – nothing matters, life goes on. The red pill allows him to face reality and grasp the complexity of the environment.

Toolers: let the needs of practitioners drive the design and adoption of the tools. Take the red pill and face reality. Deploy the tools in environments where they’re used by many students and institutions.

Practitioners: LA is here to stay, it has big potential – make it work for you! You have to embrace it and use it, make the most of this confluence of ideas. It’s an exciting time. It can give you tools that improve your learning experiences.

Questions

Q What problems did you have with the virtual machine? Was it just wonderful? Does it connect pretty well with the hardware?

To tell the truth, we had a challenge putting everything together. We’ve worked on it for 2-3 years, and the technology is maturing really fast. VMs are used by IT already. It was a headache to configure on students’ machines, but there’s been a significant decline in those problems. Last year was fairly straightforward – they just download and install. You need a certain amount of computational power, but the tendency is in that direction. Most of the wrinkles disappear, but not all – you need tech support.

Sheila MacNeill: The middle space is teachers and developers – where do students sit? How do they affect practitioners and tool developers? Are they effective agents?

The students are the reason to be in that middle space. They’re more like the end result. The presence of this collective is more obvious when you design the activities. We also used methods to collect feedback quickly – 9 out of 10 professors over-program activities. We put the estimated time in the HTML page and asked students how long it took. Raise a flag quickly if 90% take 1.5h versus the 0.5h predicted. Have to communicate quickly, and adapt.

Sheila: I was wondering if you notice students getting used to this feedback, are they driving more instructors to look at these tools? We could use this to promote the idea of LA.

This fluid channel with the students – they are not used to it. My perception is there is a lot of potential to exploit that channel. Students are ready. As soon as you provide the information, you establish the dialogue.

Q (?Linda Baer) Scaling of these tools across a department or university – what about adoption among faculty?

From the technology side, scalability is not something to worry about too much. Some bumps, but they can be solved with appropriate techniques; that can be taken care of. At a department or an institution, it’s not a problem. Adoption by teaching staff is a different story. We need to engage teaching staff at the point of creating material. It’s not enough to say ‘give me your PowerPoint and Word files and we’ll create analytics’. You can’t go to the moon on a bicycle. At UTS Sydney, he proposed questions in the HTML; instructors wanted someone else to put them in, but there are people already there to help them. The same applies for videos – not everyone’s ready.

Q Is there a strategy for adoption – so that by a given time, so many will have adopted?

I wish! Try to disseminate the ideas, connect to the real needs, the real problems they have.

Q On the VM, how aware are the students that they’re being tracked – e.g. do they do their personal stuff in another browser or machine?

We try to adopt the most transparent paradigm. You can program the VM to start with the browser showing a document where we explain in simple terms what we’re tracking, why, and how to disable it. If they open the VM 15 times, that appears 15 times. This is going to be sensitive. Students are independent; they’ll make their own decisions. If half the audience disable it, we have to adapt to that. They always have the right to disable it. Have to be clear what we’re tracking, why, and how to disable it. The events live on the hard drive of the VM, and we tell them where they are. We also give them an email address to have their data removed. An attempt to comply with legislation, or with the direction the legislation is heading. Playing it really honest.

This work by Doug Clow is copyright but licensed under a Creative Commons BY Licence.
No further permission needed to reuse or remix (with attribution), but it’s nice to be notified if you do use it.
