Learning by tinkering

Liveblog notes from “Learning by tinkering”, an IET Technology Coffee Morning given by Marian Petre on 7 September 2011.

The talk is a mash-up of three related research strands:

  • The first strand is work with children in robotics, with Jeff Johnson – RoboCup Junior, Robofest – a rich research base.
  • The second is a long-term observational study of children as end-users at home, pursuing their own objectives – children as unwitting programmers.
  • The third is her primary work – empirical studies of expert software developers.

Continue reading “Learning by tinkering”

Digital Scholarship debate

A structured debate, held at 2pm on Mon 20 June in the Jennie Lee Building, The Open University, on the following motion:

“In the next decade, digital scholarship (in open journals, blogs, and social media) will achieve the same status in academic settings as traditional scholarship.”

Martin Weller is presenting the pro argument for 5-10 minutes, followed by Rob Farrow presenting the con argument. Then 5 minutes’ response, open to the floor for 30 minutes, and then the vote. Jude Fransman is chairing.

This is a dress rehearsal for a similar debate planned for ED-MEDIA. There are a little over a dozen people in the audience. There's a SurveyMonkey poll for voting on the motion.

These are my liveblog notes.
Continue reading “Digital Scholarship debate”

LAK11 – Tuesday morning

Liveblog notes from the morning session on Tuesday 1 March, the second full day of the Learning Analytics and Knowledge ’11 (LAK11) conference in Banff, Canada.

(Previously: The Learning Analytics Cycle, liveblog notes from Pre-Conference Workshop morning and afternoon, and from Monday morning and afternoon.)

It’s fearsomely cold here – about -30°C this morning – but stunningly beautiful. This is a quick snap out of the window of the restaurant at the Banff Centre where we had breakfast. There are views of the Rockies all around. If I’d brought a better camera – and were a better photographer – you’d get some staggering pictures.

Continue reading “LAK11 – Tuesday morning”

LAK11 – Monday afternoon

Liveblog notes from the afternoon session on Monday 28 February, the first full day of the Learning Analytics and Knowledge ’11 (LAK11) conference in Banff, Canada.

(Previously: The Learning Analytics Cycle, liveblog notes from Pre-Conference Workshop morning and afternoon, and from Monday morning.)

Introduction of Xavier Ochoa: Principal Professor at ESPOL, Ecuador. Works on learning objects; inventor of ‘learnometrics’.

Continue reading “LAK11 – Monday afternoon”

LAK11: Monday morning

Liveblog notes from the morning of Monday 28 February, the first full day of the Learning Analytics and Knowledge ’11 (LAK11) conference in Banff, Canada.

(Previously: The Learning Analytics Cycle, liveblog notes from Pre-Conference Workshop morning and afternoon.)

George Siemens – Welcome

Thanks to TEKRI at Athabasca; the Centre of Educational Innovation and Technology, U Queensland; and EDUCAUSE. Thanks to four platinum sponsors – Kaplan Venture, Alberta Innovates, Desire2Learn, and the Bill and Melinda Gates Foundation. Thanks to the steering committee, the program chairs and the program committee. Thanks to Gill, who helps organise and provides administrative support; Chris too; and Blaze, who’s working the video on Ustream – it will be recorded and made available later. Backchannel on #LAK11.

Why learning analytics? We’re in a knowledge economy, and its foundation is data. We need to better understand the data exhaust being produced by students, and by teachers, as they interact and work online. There are various initiatives in Canada to make it a leader in the knowledge economy. There’s been a shift from physical- to knowledge-based economies, and we haven’t had as much time to work out how to create value in the knowledge economy. That’s where learning analytics becomes important.

Continue reading “LAK11: Monday morning”

The Learning Analytics Cycle

What is learning analytics? Well, the LAK11 open course has generated a useful definition:

Learning analytics is the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs.

Reflecting on this and everything else from the pre-conference workshop at the Learning Analytics and Knowledge conference today (see my liveblog notes from the morning and the afternoon), I realised that my underlying model here is of a Learning Analytics Cycle.

It looks something like this: learners → data → metrics/analytics → intervention → and back to the learners again.

(If this were a proper academic article, I’d be linking all this explicitly to Schön and Kolb and all that at this point, and quite possibly Engeström and Wenger.)

The process starts with learners. (Always a good bet pedagogically.) These learners could be students enrolled at a university, or informal learners, or participants at a research conference.

Data is generated by or about the learners. It could be demographic information, a clickstream, login times, the full text of their contributions to an online discussion, their assessment scores – anything. This data can be generated automatically, or with great and strenuous effort by a large multidisciplinary team – or anything in between. There’s also data about learners who are like the learners in question in some way or other. One of the drivers for the interest in learning analytics as a field is that, for all sorts of reasons, we have far, far more of this information than we used to.

Next comes the bit of the process that perhaps most springs to mind when you think of doing ‘learning analytics’: someone takes that data and generates some metrics or analytics from it, which give some insight into what’s going on. This could be a visualisation, a dashboard, a list of ‘at risk’ students, a comparison of some outcome measure with a benchmark, or whatever. Some of these are easily generated because they’re built into the tools at hand; others take huge amounts of time and effort to research, develop and validate.

Then those distillations of the data are used to drive some sort of intervention, which brings about some change in the learners. This could be a tutor making personal contact with a student that a model has identified as at very high risk of dropping out. Or it could be a dashboard for learners showing how their activity compares to others’. Or it could be a Government looking at graduation rates across the university sector and deciding that sector-wide change is needed. And right down at a simple level, a teacher reviewing the final grade scores for a course and considering what to do next time they teach it is an example of the cycle in action.
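To make the shape of the loop concrete, here’s a minimal, hypothetical sketch of one pass around the cycle in Python: raw data about learners is distilled into a simple ‘at risk’ analytic, which in turn drives an intervention list for tutors. The field names, the metric and the threshold are all invented for illustration – this isn’t any real system’s code or model.

```python
# A minimal, hypothetical pass around the Learning Analytics Cycle.
# Field names, the metric and the threshold are invented for illustration.
from dataclasses import dataclass

@dataclass
class Learner:
    learner_id: str
    logins_last_week: int          # data generated by/about the learner
    forum_posts: int
    last_assessment_score: float   # percentage, 0-100

def at_risk_metric(learner: Learner) -> float:
    """Distil the raw data into a single 'risk' analytic (higher = more at risk)."""
    activity = learner.logins_last_week + learner.forum_posts
    low_activity = max(0.0, 1.0 - activity / 10.0)
    low_attainment = 1.0 - learner.last_assessment_score / 100.0
    return low_activity * low_attainment

def intervention_list(learners: list[Learner], threshold: float = 0.3) -> list[str]:
    """Close the loop: name the learners a tutor should contact this week."""
    return [l.learner_id for l in learners if at_risk_metric(l) > threshold]

cohort = [
    Learner("s001", logins_last_week=1, forum_posts=0, last_assessment_score=40.0),
    Learner("s002", logins_last_week=6, forum_posts=4, last_assessment_score=85.0),
]
print(intervention_list(cohort))   # ['s001'] - the result feeds back to the learners
```

The point of the sketch is the shape, not the metric: whatever replaces at_risk_metric, its output has to reach someone who can act on it, or the loop never closes.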

Accountability comes in here too. There was much talk of ‘analytics gone awry’ today, when they’re used inappropriately as an unhelpful stick to bash people with. This cycle would show that pathological process spinning off and not closing the loop effectively. A process that doesn’t result in an improvement for learners is not actually helping learning. (Though it may, of course, have some effect politically.)

I don’t think you have to have all the elements present to be able to claim to be doing learning analytics. The bit at the bottom, where you create some representation of data about learners, should count on its own. But to make effective use of learning analytics – and unlock the huge potential value that’s there – you do need to close the loop in a cycle like this. There are lots of places where the cycle can break: the data may not accurately reflect the learners; the metrics or analytics developed may be flawed; or the intervention may not happen at all, or may be inappropriate or ineffective and not affect the learners.


This work by Doug Clow is copyright but licensed under a Creative Commons BY Licence.
No further permission needed to reuse or remix (with attribution), but it’s nice to be notified if you do use it.

LAK11 – Sunday afternoon

Liveblog notes from the afternoon of the pre-conference workshop (Sunday 27 Feb 2011) of the Learning Analytics and Knowledge (LAK11) conference.

There was an excellent discussion over lunch about the power of simple measures to infer great – and privacy-threatening – amounts of information. For instance, you can make a good guess at what appliances someone has in their house simply by looking at their electricity usage over a 24-hour period from a smart meter. That raised the possibility of identifying an appliance that was about to break down simply from a change in its electricity use over time – which would be gold dust for advertisers, since they could potentially target you with an advert for a new washing machine before you even know that you’re going to need to buy one in the next few weeks, because they know yours is about to break down.
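As a toy illustration of the kind of inference we were talking about – the load ‘signature’, the readings and the tolerance below are all made up, not from any real meter or appliance – a crude approach is to slide a known appliance profile over a day’s readings and look for a close match:

```python
# Toy illustration only: infer whether a particular appliance ran during the day
# by sliding its (invented) half-hourly load signature over the day's readings.
washing_machine_kw = [2.0, 2.0, 0.5, 0.5, 2.0, 2.0]   # made-up wash-cycle profile

def appliance_ran(readings, signature, tolerance=0.3):
    """Return True if some window of the readings closely matches the signature."""
    n = len(signature)
    for start in range(len(readings) - n + 1):
        window = readings[start:start + n]
        if all(abs(w - s) <= tolerance for w, s in zip(window, signature)):
            return True
    return False

# 48 half-hourly smart-meter readings: baseline load plus one wash cycle mid-morning
day = [0.3] * 20 + [2.1, 2.0, 0.6, 0.5, 1.9, 2.2] + [0.3] * 22
print(appliance_ran(day, washing_machine_kw))   # True
```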

Continue reading “LAK11 – Sunday afternoon”

LAK11: Learning Analytics And Knowledge, Banff

I’m at the 1st Learning Analytics and Knowledge conference, at the Banff Centre in Banff, Canada.

Today – Sunday 27th February 2011 – is the day of the Pre-Conference Workshops. It’s being streamed live. The hashtag is #LAK11, which is also the tag for the pre-conference open course on learning analytics.

We are right in the middle of the Rocky Mountains, and the views all around are stunning.

This is the first of (probably) six or more liveblog notes – my plan is one per half day.

Continue reading “LAK11: Learning Analytics And Knowledge, Banff”

New project to help people learn about nature

I’ve just heard that iSpot Local has been recommended for funding by JISC! Having just seen a post arguing that JISC projects should blog, I thought I’d better blog about it.

iSpot Local is a project I submitted with a team of people from the OU and a range of partners, led by Ambios, to JISC’s eContent Programme last December. As you might guess, it’s an extension of the existing iSpot project. In case you don’t know, iSpot is a website where you can upload your observations of nature and get help from a social network of expert and casual naturalists in identifying what you’ve seen. It’s fantastic. (I may be a little biased.)

The best way to explain the project is to paste in some of the bid:

Aims

  • This project aims to explore the potential of an exciting technology-enhanced learning practice (Bioblitzes) to serve as the key mediating event for the co-creation and crowd-sourcing of digital content related to field biology, by extending an existing web tool (iSpot) and building a strong community partnership network.

Objectives

The project will:

  • Rapidly deploy a community website (iSpot Local) to facilitate, coordinate and mediate activity.
  • Deliver pre-event learning opportunities to engage and empower potential community users.
  • Deliver six Bioblitz events, generating significant crowd-sourced digital content related to field observations of nature.
  • Develop iSpot Local to integrate community-specific content with observational data.
  • Co-create, crowd-source and identify existing learning resources to meet the needs and aspirations of participants.

So what’s a Bioblitz? In a nutshell, it’s a bunch of people who go to a nature site for a day or so, and record and identify as much of the wildlife as they can see, usually with a bit of help from some experts. Or, more formally, “a Bioblitz is a time-limited wildlife survey of a particular site – usually a single day – during which all organisms encountered are identified and recorded by the public, working with a team of experts.” Bioblitzes are great – they generate real scientific data, and get people interested in the particular place, in the plants, bugs and anything else that they can identify, and in the scientific process. You don’t need any technology beyond a piece of paper and a pencil, but it’s a lot easier and more worthwhile to do it with a bit of helpful technology.

The idea is that we set up a website quickly (iSpot Local!), run some learning activities with local people in the places where we’re doing the Bioblitzes, and then run the Bioblitzes themselves. People will take photos of what they see, stick them up on iSpot, and log where it was seen, who saw it, and what species it is. Experts from our network of partners will be on hand to help on the day itself, and experts and enthusiastic amateurs from the existing iSpot community will be able to help online.
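For a sense of what each crowd-sourced record involves, here’s a hypothetical sketch of the fields described above – this is not the actual iSpot data model, just an illustration with invented names and values:

```python
# A hypothetical sketch of a single crowd-sourced observation record --
# not the actual iSpot data model, just the fields described above.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Observation:
    observer: str                      # who saw it
    species_guess: str                 # what the observer thinks it is
    lat: float                         # where it was seen
    lon: float
    seen_at: datetime
    photo_url: str
    confirmed_by_expert: bool = False  # set once someone on the network agrees with the ID

obs = Observation(
    observer="local_volunteer_01",
    species_guess="Common frog (Rana temporaria)",
    lat=50.43, lon=-3.68,
    seen_at=datetime(2011, 6, 4, 14, 30),
    photo_url="https://example.org/photos/frog.jpg",
)
print(obs.species_guess)
```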

Then we’ll help the people who’ve got interested in the places and nature they’ve seen to learn more about them, and share more information about their local place, the local biota, and what’s going on to help look after the site.

Early days yet! The project starts on 1 March, which is going to be a real challenge given that we’ve only just heard we are (likely to be!) funded. And it’s not a long project (only seven months), and we have a lot to deliver.

But it’s a very exciting project to be part of, and we have some excellent people working on it, and I’m looking forward to it.

This work by Doug Clow is copyright but licensed under a Creative Commons BY Licence.
No further permission needed to reuse or remix (with attribution), but it’s nice to be notified if you do use it.