OU on iTunes U – education 2.0 business models

The Open University is on iTunes U! As Denise Kirkpatrick, Pro-VC for Teaching and Learning says in the press release,

Making available selected video and audio items from among the University’s highly-rated course materials via iTunes U to audiences worldwide offers a new channel for the University. We can open up free access to educational resources as well as a window for our potential students.

John Naughton says “At last we have a proper global distribution channel for our stuff”; Martin Weller says “This is for the proper quality stuff, and provides a good outlet for OU material”.

I think they’re right: this is the high-quality stuff that the OU has a well-earned reputation for, and the content up there is good.

[Image: the OU on iTunes U]

(I note that the OU is taking a sideways look at the world, and placing the Arabian Peninsula at the heart of its activity – visually at least.)

It’ll be interesting to see how this fits into the emerging ecology of online educational material. There’s been a lot of debate in the last week or so around new business models for education – kicked off by Tony Hirst, then followed on by Stephen Downes, Martin Weller, Gary Lewis, and others. It’s great to have good stuff available for free. But how we make that sustainable – particularly the high-quality stuff that costs a lot to produce – is a profound challenge that we don’t yet have tested answers to.

There was a good post by Mike Masnick on Techdirt yesterday, summing up a really interesting discussion on “The Economics of Free”, and pointing out that

The first thing to understand is that we’re never suggesting people just give away content and then hope and pray that some secondary market will grant them money. Giving stuff away for free needs to be part of a complete business model that recognizes the economic realities

Give-away-and-pray isn’t a business model. I don’t believe education is (or should be) a business, but in a world based on exchange (rather than a gift economy), there are bills, and to be sustainable, there needs to be some way of paying them. Educational resources – once produced – are infinite goods: the marginal costs of reproduction are zero, or very near to it. Mike Masnick points out that the price of such goods will tend towards zero, and suggests that to make a sustainable living in that environment, you need to link the free distribution of those infinite goods to scarce goods, so that the greater availability of the infinite goods makes the scarce ones more valuable. The canonical example is the music industry, with the give-the-music-away, charge-for-the-gig (and other stuff) model. But I think it’s very applicable in education as well.

The infinite goods are obvious. If we’re not already in a world where good-enough zero-cost educational resources are widely available, we’re very close to it. The OU’s offerings on iTunes U are just the latest goodie in a great and growing sack of wonderfulness (!).

The linked finite goods are less well articulated, and I think the discussion about ‘business models’ for education 2.0 could be improved with a focus here. Martin asks whether it’s acceptable to provide free resources and tools, but charge for support and accreditation. I think that’s exactly the sort of model we should be exploring. Learner support, guidance and accreditation are scarce goods: they depend on individual attention. The other thing that’s an obvious scarce good in education is bespoke production of learning materials. As with the open source software community, companies (and even some Universities) are prepared to pay programmers to develop specific bits of software as part of open source projects, to ensure that their particular needs are met. I’m sure this is also true in education. The employer engagement agenda is one aspect of this, and one we should be trying to link in with all this education 2.0/OERs stuff – I suspect that will make us a much more attractive proposition to businesses.

Swinging back to the OU’s offerings on iTunes U, I love our tagline “Warning! Content may transform your life” (as does Martin). It’s a lofty goal, but one well worth striving for. With all this unseemly grubbing around for money, it’s well worth keeping those noble purposes in mind.

Twittering into the sand?

I’ve been twittering away for nearly a month now, and really enjoying it for the sense of tight community it gives – even when I was off work with the flu for a week and only managed sad, whiny tweets.

One odd side effect is that it’s dragged me back to Facebook.  I’d more-or-less abandoned Facebook until I wired my Twitter feed into Facebook updates.  All of a sudden, people on Facebook that I’ve not been in touch with for ages started responding to me there.

My colleague and noted Twitter enthusiast Martin is worried about Twitter’s ongoing technical issues, which are annoying, and sensibly points out:

there’s nothing really in the design of Twitter over Jaiku, Friendfeed, Pownce, etc that makes me use it – it’s just that it’s where my network is, and I can’t migrate without them. But if they started to go, the infamous tipping point might be reached very quickly.

Andrew Chen observed that Metcalfe’s Law – that the value of a network grows with the square of the number of nodes – can work against you.  He posits a reverse law – Eflactem’s Law,

As you lose users, the value of your network decreases exponentially (doh!)
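To put rough numbers on that square law – a quick back-of-the-envelope sketch of my own, with made-up user counts rather than anything from Chen’s post:

```python
# Metcalfe-style value: proportional to the square of the user count.
# The figures below are purely illustrative.

def network_value(users):
    return users ** 2

before = network_value(1_000_000)
after = network_value(700_000)        # lose 30% of the users...
print(f"{1 - after / before:.0%}")    # ...and about 51% of the value is gone
```

Losing 30% of the users takes roughly half the value with it, which is why a drift away from a network can snowball so quickly.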

Chen has Facebook in his sights.  And I think he’s right, especially given Facebook’s determination to keep the walled garden thing going – in a networked world, that’s only ever going to work as a short or medium term strategy, and ‘short or medium term’ in Internet years can be not very long at all.  But I think Twitter is far more vulnerable.

The big danger – and big win – for Twitter is that their userbase is small (compared to, say, Facebook or MySpace) but highly skewed towards techie opinion-formers.  Those are precisely the sort of people who will find migrating to a new service very little hassle.

On the other hand, I think Twitter is likely to be more robust to short outages than a lot of online services.  The great thing (for my money) about Twitter versus one great big IM clusterparty or IRC (does anybody use IRC these days?) is that you feel quite safe ignoring it for a while if you want to do something else.  So if you feel like Tweeting, but can’t, it’s no big deal to get on with Actual Work instead.

It’s all a bit fluid, and who knows what will happen?  As Martin concludes, “that’s the fun of it – we get to see the new paradigms being created”.

Laptops and internet bans

After the (very mild and minor) fuss about me using a laptop at last week’s Making Connections, I said we have a mountain to climb in embedding technological change across the university.  It’s reassuring in some ways to see that our mountain is not, perhaps, quite as huge and daunting as some other people’s.  The University of Chicago Law School has removed Internet access in most classrooms, and some professors even ban laptops.  According to the Dean, Saul Levmore, the problem is that students

may overestimate their ability to multi-task during class and that some students have expressed distraction due to their peers’ use of computers during class time

The latter is a very reasonable concern, and I think it can and should be addressed through policies about acceptable usage of computers during f2f teaching sessions (which is apparently what Stanford have done).  But the former is more contentious. Levmore sums it up by saying the question is “How do you best learn? That’s for the faculty to decide.”

Prawfsblog spots a certain amount of paternalism in the announcement, and urges them:

Be honest, and admit that you’re banning wireless access because the plugged-in student is usually a disengaged one and has sucked the fun out of the classroom experience.   Students are more likely to accept a top-down policy change if it’s justified based on faculty morale than student learning.

B2fxx goes a bit further (and links back to me, spurring this post) and says:

Banning laptops in class is a bit like the education sector’s equivalent of the entertainment industry wishing the Web had never happened.

That’s more what I’m thinking. It does seem like a panicky over-reaction to an irreversible technological change, which will harm both the legislators and the punters.

I do buy Levmore’s argument that the question is “How do you best learn?”.  But I completely reject the idea that the faculty (or teachers or whatever) know the best answer.  Particularly if they think the answer is traditional lectures, which we’ve known since 1972 (Donald Bligh’s What’s The Use Of Lectures) are no better than other methods for information transmission, and almost entirely useless for getting learners to think.

Almost any teacher can help a student learn more effectively than the student can alone, and a good teacher will help the learner understand and improve their own learning processes.  But the idea that the teacher knows how best their learners learn is … wrong.  How can you possibly know that?  You can have a lot of good ideas about how your learners might learn, and if you teach the same topic over and over again, you can accrete a comprehensive toolbox of ways of helping learners learn those particular subjects and a lot of experience in judging which are likely to help which learners with which aspects.  But that’s a very long way from what Levmore is saying.  And all of that presupposes that what you’re teaching (or should be teaching) hasn’t changed profoundly as a result of new technologies – and there are few if any courses where that’s true.

Stephen Heppell has long lamented that most (British) kids have great access to some of the most extraordinarily powerful learning tools (e.g. mobile phones, word processors), but are banned from using them in (many parts of) the formal school system.  It’s a bit depressing if (parts of) higher education are heading down that same “Teacher knows best” route.  At just the time when teacher is freed from having to know best!

Twitter away

For those of you reading this via RSS feed, you won’t have noticed my spiffy new site look.  You’re not missing much in graphic design terms, if I’m honest, but you may have missed that I’ve added a Twitter feed for me in the sidebar.  As instructed by my boss (Patrick), and in fulfilment of one of my objectives from my annual appraisal, I’m trying to Twitter properly for at least a week.  We have this semi-formed idea to try to do some more Web 2.0-style management, and Twitter seems like it could be part of that.  At the very least it means we each have slightly more idea of what the others are up to on a daily basis, which is a good thing in and of itself.

(I note sadly that my neologism ‘twittorial’ has failed to gain traction – rating a grand total of six hits, including my original land-grab post and two from a Spanish site.)

Learning journeys

Had a good meeting today with some folk from the Natural History Museum about our work on OPAL – we’re mainly doing the Biodiversity Observatory; they’re doing the portal for the parent project, and some related work of their own on a ‘Bug Forum’, where there’s potentially a big overlap.  We had some good discussions about general principles and the beginnings of some more concrete and practical issues around things like OpenID and Google OpenSocial.  One of the things we agreed we needed to think about more together was the user journey through each of our sites and between them.

The NHM seems (to a visitor) like a great place to work – there are cubbyholes and rooms all over the place, in interstices of the public museum, and mazes of twisty passages.  I first visited the museum as a wide-eyed teenager from the sticks, and vividly remember losing all track of time in the minerals gallery.  That visit was part of the reason I ended up doing a chemistry degree.  So I was delighted when I had to wait in that very gallery while people fetched wallets and dumped bags before lunch.  The rows and rows of cabinets with interesting minerals were almost exactly as I remembered, and the gallery didn’t seem any smaller with the passage of the decades.  There was more by way of interpretation boards around the sides, there was a more obviously sexy bit up one end called The Vault with a (fascinating but showy) display of gems, and some of the signage had been updated, but the serried rows of cases and specimens seemed entirely unchanged by the new museum revolution that started in the 1980s.  It was still enthralling, and I had to be dragged away to lunch – a nice reprise of the early parts of my personal learning journey that has ranged far and wide in between.

And a good reminder that traditional forms of learning aren’t always surpassed by newer technologies.  Apparently, these are the original oak cases from 1881, when the museum opened.

iPlayer on Wii

I was excited last month about the BBC’s iPlayer service being available on the iPhone and iPod touch.  Today I’m excited about it being available for the Nintendo Wii.  Internet TV … on the TV!

It’s pretty easy – you just need the Internet Channel (Nintendo’s silly name for Opera for the Wii, and an excuse to charge you £3.50 for a browser that’s available free on pretty much every other platform ever) and then … just go to the iPlayer site and off you go.

(Incidentally, the Internet Channel on the Wii is a fantastic idea, but really brings home to me a) how poor a TV set is as a computer display and b) just how desperately poor the text-entry system is on the Wii.  Watching YouTube is workable and more fun than on a computer.  Very little else is.)

Of course, it’s been possible to stream videos from your PC to your Wii via the Internet Channel for a while, through various bits of software.  And it is also entirely possible – if somewhat dubious – to strip the DRM from iPlayer downloads so you can stream them.  (Or indeed blow them to DVD and walk them through from the PC to the living room.  Never underestimate the bandwidth of sneakernet!) So this has been possible in principle for some time, but a lot more technical faff than most people can be doing with.  iPlayer is about bringing P2P to the masses, rather than the geeky copyfighting few.

As another aside, I’m amused at the ISP industry taking against iPlayer.  (See, e.g., El Reg’s piece on the recent spat between Tiscali and the BBC.)  Parts of the IT industry often seem to want to defy ordinary economic gravity – I’m reminded of the dot-com nonsense (“How could that possibly make money?”  “If you read our business plan you’d see that we will develop a monetisation strategy in Q6”).  In what other industry would businesses get terribly unhappy if the demand for their product increased beyond what they had ever anticipated?  Madness.

To be fair, it’s more as if they’ve worked out one way of making money and don’t want technical development and change to stop that working.  Rather like mobile phone ringtone vendors, traditional record industry executives, blacksmiths and indeed the original Luddites, then.  For technology companies to take that position is particularly odd.  And life-limiting.  I’d advise against buying Tiscali stock.

Anyway!  Enough asides.  What about iPlayer on the Wii?  Is it any good?

Alas, no.  At least, not for me tonight.  The resolution is great.  There seems to be far more content available than when the iPhone/iPod touch version was launched.  But the bandwidth is so rubbish as to render it unwatchable, with stutters and stops.  I don’t know for sure what that’s down to – there’s no problem at all on my desktop PC over Ethernet to my router, or on my iPod touch using the same WiFi network as the Wii.  Anthony Rose from the BBC mentions in his announcement that they’ve had to up the bitrate for the Wii from 500 to 820 Kbps because they need to use a less-efficient codec to work with the ancient version of Flash the Wii uses.  That could be it.
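If the Wii’s WiFi link is the culprit, a toy buffer model shows why it only takes intermittent dips to wreck playback. Only the 500 and 820 Kbps figures come from the BBC announcement; the per-second link speeds below are invented for illustration:

```python
# Toy model: will a flaky WiFi link sustain an 820 Kbps stream?
# Stream bitrates are from the BBC announcement; the throughput trace is made up.

STREAM_KBPS = 820        # new Wii bitrate (up from 500 Kbps)
BUFFER_SECONDS = 5       # assumed client-side buffer target

def count_stalls(per_second_throughput_kbps):
    """Count the seconds in which playback stalls because the buffer runs dry."""
    cap = BUFFER_SECONDS * STREAM_KBPS
    buffered_kbits = cap
    stalls = 0
    for kbps in per_second_throughput_kbps:
        buffered_kbits = min(buffered_kbits + kbps - STREAM_KBPS, cap)
        if buffered_kbits < 0:
            stalls += 1            # buffer empty: a stutter or stop
            buffered_kbits = 0
    return stalls

# A link that *averages* 900 Kbps but spends 30-second stretches at 400 Kbps
# still stalls constantly – the average isn't the problem, the dips are.
flaky_minute = [400] * 30 + [1400] * 30
print(count_stalls(flaky_minute * 2))   # prints 42: roughly 20 stalls a minute
```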

Still – maybe the BBC’s wizards will fix this with tweaking.  Maybe I’ll think of some way round the physical barriers to running a hard connection from my router to the Wii.  Maybe something even better will turn up next month!

And while I’m waiting, my copy of Dance Dance Revolution Hottest Party for the Wii should arrive very soon …

JIME: The Once and Future Future of Academic Publishing

I’ve just been to a meeting about the Journal of Interactive Media in Education, or JIME.

I’d been vaguely unsure whether or not I was on its Editorial Board, but the meeting has usefully confirmed that, as of this afternoon, I am actually one of the three core Editors, along with my colleagues Patrick and Will.

Back in 1996, JIME was a very exciting new development in academic publishing – it aimed, inter alia:

Through its innovative use of interactive Net-based media, to be an action research project which explores the changing face of journals, and more broadly, scholarly practice in the age of digital publishing and communication.

(Ouch. The site uses frames, which makes linking to that aims page hard. Oh dear.)

It had a cool new idea about being a proper journal but freely available online, and about the reviewing happening in the open. After an initial quick ‘threshold’ review, the article appears, the reviewers make comments, and the authors respond. All in the open.

Alas, the current technical system to support all that is Broken. And not fixable for boring reasons, on top of the reason that fixing an out-of-date kludgy system that you didn’t build is a deeply boring task. Things have rather moved on from 1996.

So we need to do Something. We had some fun (and despair) thinking about what. I think we have two main principles for the journal:

  • Firstly, we definitely want a Proper Academic Journal. That clearly still has value, and is part of what JIME always was and could be. So that means a proper Editorial Board, and proper reviewing. And – note to self – proper indexing in major citation indices, which indirectly probably means a regular publishing schedule, which is a serious – but not insurmountable – tension for a very-rapid publication model. (e.g. it might be possible to come up with a hybrid where things appear as ‘accepted’ as soon as they are, and then every four months we create an ‘issue’ which formally moves any and all currently accepted articles to ‘published’ – there’s a rough sketch of this just after the list.)
  • Secondly, we want to continue to explore new, more open ways of doing that – being open access is a minimum. So teaming up with a publisher (which would get our hands on their lovely money to support the process) isn’t likely to be an obvious big win (since it would be extorted from academic libraries by means that would prevent us being openly available). I don’t know what our current licensing agreement with authors is – implicitly it must involve permission for open access – but that might want upgrading to a CC licence.
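To make that hybrid idea a bit more concrete, here’s a very rough sketch. The class and field names are invented purely for illustration (and it’s in Python for brevity – it has nothing to do with how OJS actually models things):

```python
# Sketch of the hybrid model: articles appear online as 'accepted' the moment
# review completes, and every four months a roll-up turns everything currently
# accepted into a formally 'published' issue. All names here are hypothetical.

from dataclasses import dataclass, field
from datetime import date
from typing import List, Optional

@dataclass
class Article:
    title: str
    status: str = "accepted"        # "accepted" -> "published"
    issue: Optional[str] = None

@dataclass
class Journal:
    articles: List[Article] = field(default_factory=list)
    issues_so_far: int = 0

    def accept(self, title: str) -> Article:
        """An article goes online immediately on acceptance."""
        article = Article(title)
        self.articles.append(article)
        return article

    def close_issue(self, today: date) -> str:
        """The four-monthly roll-up: gather accepted articles into an issue."""
        self.issues_so_far += 1
        label = f"{today.year}/{self.issues_so_far}"
        for article in self.articles:
            if article.status == "accepted":
                article.status = "published"
                article.issue = label
        return label

journal = Journal()
journal.accept("Open peer review in practice")
journal.close_issue(date(2008, 4, 1))   # article now 'published' in issue 2008/1
```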

The current sketchy idea is to use Open Journal Systems (OJS) for the nuts-and-bolts of the threshold review and the publication process, then some cunning system where the reviewers post their review in their own blogs (opening up that process much more widely in a very interesting new way), and JIME picks that up via trackbacks. Ideally we’d do something clever where the initial submission appears in the author(s)’s blog(s) too, as well as their response/revised final article. Should be pretty easily do-able … just needs the time fiddling with OJS.
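For the trackback leg of that, here’s a minimal sketch, assuming the standard Trackback protocol (a form-encoded POST of url, title, excerpt and blog_name to the article’s trackback URL, answered with a small XML response). Again it’s Python just for brevity – the real thing would live inside OJS – and the URLs are made up:

```python
# Minimal Trackback ping: tell the journal's article page that a review of it
# has been posted on a reviewer's blog. URLs below are hypothetical.

from urllib.parse import urlencode
from urllib.request import Request, urlopen

def send_trackback(trackback_url, post_url, title, excerpt, blog_name):
    """POST the standard Trackback fields; the server replies with XML."""
    data = urlencode({
        "url": post_url,            # the reviewer's blog post (the open review)
        "title": title,
        "excerpt": excerpt,
        "blog_name": blog_name,
    }).encode("utf-8")
    request = Request(trackback_url, data=data, headers={
        "Content-Type": "application/x-www-form-urlencoded; charset=utf-8",
    })
    with urlopen(request) as response:
        return response.read().decode("utf-8")   # e.g. <response><error>0</error></response>

# send_trackback("https://jime.example.org/article/42/trackback",
#                "https://reviewer.example.net/2008/03/review-of-article-42",
#                "Open review of article 42",
#                "The argument is sound, but the evaluation section ...",
#                "A reviewer's blog")
```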

There’s much more we could do; it’s just a matter of thrashing out what it should be – and convincing people that it has value, and then getting the resource to do it (!). We talked about deeper issues – in particular, Patrick’s fine theory that Word has set academic publishing back decades by inhibiting structured authoring and referencing, which were solved problems by the late 80s.

But following the Web 2.0 philosophy, we should do the least that would work as soon as we can, and build towards the bigger vision, rather than waiting to build the whole thing in one huge leap.

It is fun being a (small!) part of the revolution in academic publishing. We’re also looking to refresh the Editorial Board as part of this revamp – anyone interested in joining us?

Biodiversity Observatory – social networking for natural history

I’m heavily involved in the Biodiversity Observatory project. It’s part of Open Air Laboratories (OPAL), a huge (£12m) project funded by the Big Lottery Fund with ambitious goals to develop and engage a whole new generation of environmentalists. We’re not supposed to create a big media splash about OPAL yet – not until our media team are in place and we can make a very big deal of it – but I’m sure a little blog post here and there is worth it. The project includes the Natural History Museum, the National Biodiversity Network, the Field Studies Council, Imperial College and many others. There’ll be a network of regional activities to draw people in, led by regional universities.

I’m working with two of my colleagues – Will and Richard – who have blogged about this already. Will recently discussed the reputation management system, as did Richard. Richard’s entire blog is about the development of the site, plus the Evolution Megalab which is strongly linked.

Our job is to specify, develop and test the Biodiversity Observatory site. The tagline is that it’s a social networking site for Natural History. The aim is to draw in people who are interested in, say, the butterflies in their garden, and connect them up with a learning community, some appropriate resources, and the expert society for the area.

It’s a really interesting project. It’ll be big, too. With my professional hat on, the interesting bits are the social networking side and the intersection of informal and formal learning. The mass audience will be very much learning-for-fun people, but there will be two OU courses to link up with it: a short level 1 course on Neighbourhood Nature, and a bigger second-level course on biodiversity. And there’s the chance to play a small, indirect role in helping to save the planet, which is always a nice thing to have on a project.

Today I was talking to Richard about what we should use as the platform for the site. We’ve ruled out building it from scratch (the timetable is too short), and even building it in a framework like Symfony would be a huge job. We want something open, and we strongly prefer PHP. We are probably going to go for a more full-featured open source project that does most of what we want out of the box, and as Richard says, we are looking at Elgg very closely at the moment. We’re also considering things like Drupal or WordPress, or CMSes like e107 and Joomla! (which! reminds! me! of! Yahoo! coverage! in! The! Register!). And even Moodle, but that’s looking very unlikely.

So it’s interesting that today saw the announcement of two new offerings that could be relevant.

The first is Google Sites, which seems at a glance to be yet-another site-building-site, but has Google momentum and some of the Google Nature so isn’t to be dismissed entirely out of hand.

The second is that Wikia has released some of its social networking tools, which sit inside MediaWiki, the wiki software behind Wikipedia and a host of other community-edited sites.

Research directions

I’m lining up some serious research time in the new year.  It should give me space to pull together a lot of interesting strands.  But I do need to focus on what I’m ‘interested’ in, in that funny academic sense of ‘interested’.  I’m actually interested in all sorts of things – I find pretty much all human endeavour endlessly fascinating.  But that’s not going to get papers out and bids in so I need to narrow down.

Fundamentally, as I mention on my ‘about’ page, I’m interested in new technology in teaching in Higher Education, although I’m increasingly minded to widen ‘Higher Education’ to ‘post-compulsory education’.

Within that, the two main areas I’m most interested in are the ongoing transformation of the online world (with a strong link to OpenLearn) and the ongoing transformation of the physical world (with a strong link to the new labs).  The U3A stuff I blogged about before definitely fits in here.

For various reasons, I want a reasonably strong theoretical take.  The theory du jour when I did my PhD was constructivism, but that has various shortcomings, rehearsed over the intervening decade, and has been eclipsed in the area by Activity Theory.  The only trouble with AT is that I’m not mad keen on it.  Either I don’t fully understand it, or I don’t agree with it, or quite possibly both.  (Exploring that is something else I’d like to do with a bit of research time.)

What’s more promising to my mind is Theory, as in the literary-criticism/media studies idea of Theory, expounded on David Gauntlett’s Theory.org.uk.  (I knew Dave quite well as an undergraduate but lost touch since, although I’ve long been a fan of his web presence.)  I see that he’s recently published a book called Media Studies 2.0, which sounds just the job as a starting point.  I’ve no taste at all for the obscurantist tradition that often comes with the lit-crit po-mo world, but there are some extremely valuable and interesting ideas there, and I’ve always found David’s writing extremely lucid.

This feels like a great idea since I already have a grounding in that area from personal interest, so pulling that in to the day job should pay off well.  (Although it’s a different matter to decide to give up struggling with unmediated Judith Butler when you need it for urgent revisions to a paper, rather than because it seemed interesting when you started.)  Pulling my personal interest in computers and new technology in to the day job worked out well in the past, so I have a happy precedent.

Not sure where this will end up – working this all out is a project for the research time itself – but I am getting quite excited about it.

Third Age

Had a really interesting meeting a couple of weeks ago with Jean Goodeve from the Third Age Trust, the national body that supports local University of the Third Age (U3A) groups. The OU has a Memorandum of Understanding with the Third Age Trust, and there seems to be lots of potential for collaborative work. I’d have been keen to meet them anyway, but I had a double motivation since Jean’s son happens to be a very good friend of mine.

U3A is very much about learning for fun. The OU is about learning for accreditation … which can also be fun (I like to imagine). The boundary between those two is increasingly blurring, and there’s a lot of potential for us to explore that new space between us. We’re both organisations with the stamp of the marvellous Michael Young on them. We share a fundamental belief that our learners are experienced, smart people who can help themselves to learn, particularly if appropriately supported.

(I note in passing that U3A’s commitment to learners being teachers and vice versa is explicitly stated in their founding principles; at the OU it’s more diffuse and part of best practice … and probably a lot patchier as a belief if I’m honest.)

They’re obviously very interested in OpenLearn, and we’re very interested in what they make of it. In conversation with Patrick McAndrew (my colleague currently leading the research and evaluation of OpenLearn) this week, we realised that U3A provides an organisational layer that’s exactly what we’d like to be able to provide with the tools around the OpenLearn content. Groups of people who want to learn something come together, find what resources they need, and support each other as they try to understand the topic. At the moment U3A operates more along geographic lines, but that’s changing, and the potential that online tools offer for forming interest groups for learning nationally or internationally seems pretty huge. A bit like Martin Weller‘s ideas about very-niche learning. (I’m sure he’s said something about this but can’t put my hand on the post quickly.)

One of the ideas that came out of the meeting with Jean that I’m very keen on is using U3A people as co-researchers: they get to learn about the research process and are partners, rather than subjects; we paid research types get access to a network of active, intelligent co-researchers who can snowball out to an even larger sample. Everyone benefits from each other’s expertise.

As well as the obvious link to OpenLearn, there’s the potential for getting U3A folk involved in some of the more close-up work in the new lab we’re building in the new Jennie Lee Building (which I’ve not blogged about here much yet). The idea there is to explore new and near-future prototypes of ambient and ubiquitous technologies (previously mobile devices were hot, now it’s multi-touch interfaces and there’ll be others) with learners to see what the potential is for expanding how people learn.

I think U3A people would be great as groups to bring in for this. They’re motivated, smart, used to learning, and represent a sector of the population who are growing in both numbers and influence. If this is starting to sound like a pitch for funding … that’s the plan! Although I think there’s a lot we can do with our own resources.