Content Battles

Martin has another interesting post, arguing that “Digital content wants to be free, and will seek the path to maximum access.”

He makes a good case based on some examples from photos, broadcast and music. I’ve two points of departure.

Firstly, I think ‘photos, broadcast and music’ are old-media concepts that don’t have a guaranteed right to exist in the new-media world. Online, these map – in a complex way – onto images, audio, video and combinations of those. (FWIW I don’t think ‘streaming audio/video’ is a stable, separate category into the future either – it’s a workaround for limited bandwidth.) It’s a tribute to how embedded that way of thinking is that even an analyst of Martin’s stature and experience paints the world in those terms.

Secondly, the analysis is incomplete without acknowledging that digital content also wants to be expensive. The original information-wants-to-be-free quote was from Stewart Brand back in 1984, and is worth restating in full:

On the one hand information wants to be expensive, because it’s so valuable. The right information in the right place just changes your life. On the other hand, information wants to be free, because the cost of getting it out is getting lower and lower all the time. So you have these two fighting against each other.

And that’s what’s been going on with the music/audio industry.

And that’s what’s just starting to go on in the video industries. We’ve got YouTube playing the Napster role and any number of consumer-hostile walled-garden DRM solutions from bone-headed unimaginative existing market incumbents.

These include, alas, a lot of people who should be in a position to Do The Right Thing, but sadly aren’t, such as the BBC and Google. The BBC have made what I reckon is their worst decision of recent years by going for a DRM-ed offering (tied to Microsoft), despite overwhelming public opposition. For stuff the licence-fee payers have already paid for! And Google Video is another disaster: Google is shutting down its video service, and punters who signed up in good faith and bought DRMed video from them now face being unable to play those videos.

The battle in audio is far from over. The battle in video hasn’t really got started.

Leitch Review of Skills

Have just been to a meeting where we discussed, inter alia, the Leitch Review of Skills and its potential impact on the OU.

For those of you who’ve not had a chance to read it cover to cover, the general gist is – surprise, surprise – that the UK needs a lot more skills.  At all levels.

How this is to be achieved varies by level.  The Review urges shifting much more Government resource into basic and intermediate level skills.  It also says there should be far more at degree level and above, but says that the expansion here should be funded by employers and individuals.  The Review also says that offerings from HE providers must be much more “demand-led”.

The OU’s Council – our ultimate governing body – looked at all this, and I’ve seen the briefing paper they had and indirectly heard their response.  It seems pretty smart.  As I understand it, it goes:

a) The OU is pretty well connected with employers already – though of course we can do better;
b) Don’t for one moment assume that there will be a sudden huge flood of new money into HE from employers – there won’t; and
c) Note that the Government has yet to set out a timetable for implementing the Review – assuming it decides to do so.

There is a lot of potential for exciting stuff post-Leitch, but there are a lot of problems too.  (I’m particularly sceptical of the role they envisage for Sector Skills Councils, for one thing, although at least it’s not recommending a whole new machinery for doing that job.)  I think we’ll need to wait and see before anything dramatic arrives.

Joining things up in my head, I think that the Leitch push to be more demand-led, more bespoke, and more cost-effective (all at the same time!) cries out for a Web 2.0-style mass customisation operation.  How we do that at scale, though, is a huge challenge.

Principles 2.0

As part of the ‘BBC 2.0’ project, the BBC have come up with Fifteen Web Principles, according to Tom Loosemore’s blog. John Naughton observes that “Like all great ideas, they’re pretty obvious — once someone else has thought of them.”

They’re good stuff, and fairly obvious, but not, of course, out of the blue – it looks like a hybrid of previously-stated usability principles and Web 2.0 ones. Which is of course what you’d want.

There’s one I don’t entirely buy:

7. Any website is only as good as its worst page: Ensure best practice editorial processes are adopted and adhered to.

That looks much more old-school BBC than Web 2.0 to me. This is, of course, entirely true of content that is centrally managed and presented. But when you’re going for something more open and user-created, you have to live with the fact that some of the stuff there is going to be, frankly, rubbish. Indeed, I’d bet that you’ll have a power-law distribution of quality: a small number of outstanding items, and a very long tail of what looks like dross. Except, of course, a small number of people might find some value in a given individual item … and if your infrastructure works right, you multiply this small value by the much larger number of items and get greater value there than at the other end of the distribution.
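To make the long-tail arithmetic concrete, here’s a toy sketch (mine, not from the BBC principles or this post – the 1/rank value distribution, the 100-item ‘head’ and the million-item catalogue are all illustrative assumptions):

```python
# Toy model of the long-tail argument: assume (hypothetically) that the
# value of the item ranked k follows a Zipf distribution, 1/k, and that
# the "head" is the top 100 outstanding items out of a million.

def zipf_values(n_items: int) -> list[float]:
    """Per-item value under an assumed 1/rank (Zipf) distribution."""
    return [1.0 / k for k in range(1, n_items + 1)]

values = zipf_values(1_000_000)
head = sum(values[:100])    # the few outstanding items
tail = sum(values[100:])    # the long tail of apparent dross

print(f"head value (top 100): {head:.1f}")   # ~5.2
print(f"tail value (rest):    {tail:.1f}")   # ~9.2
```

Under these assumptions the aggregate value of the ‘rubbish’ tail comfortably exceeds that of the outstanding head – which is the whole point of building infrastructure that can serve the tail cheaply.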

One of my current projects is revamping the OU’s Knowledge Network. This is a knowledge-sharing site for expertise about teaching and learning. We invented it in 1999-2000, and if we’d done it later we’d just have picked up a lot of web 2.0 stuff. We pretty much re-invented social tagging before it was widespread, for instance, but our implementation was too crufty to catch on (and we didn’t start right). The reason we haven’t just dumped it in favour of open tools is that the ability to give fine-grained control over access to the information – integrated with the OU’s existing security infrastructure – is a key feature.

We’re going live with a facelifted version today, and then we get our teeth stuck into a major change to produce ‘KN 2.0’. We have our own principles for this, which look similar:

  1. As open as possible
    The Web 2.0 philosophy of radical user empowerment is very much in tune with the direction of the original KN. Anyone can publish, and there are no gatekeepers.
  2. As secure as necessary
    Radical openness is not appropriate for some of the information in the KN. Fine control over access enables free discussion of information that needs to remain confidential.
  3. Expertise exists in people, not computers
    Try to enhance existing person-to-person links for knowledge exchange, not replace them.
  4. Don’t duplicate effort
    Take advantage of other systems wherever possible. Don’t try to do what other services do better (e.g. quality-assured document repositories and gold-standard archival).
  5. Play well with others
    Make it easy for the KN to work with other systems and processes, e.g. by open-sourcing (entire system or new components), creating/using open APIs and standards.

It’ll be obvious to anyone that principle 2 is the ringer in Web 2.0 terms. Just like the BBC, we’ve got our own departure from the Web 2.0 philosophy. That doesn’t worry me. In fact, if we had no difference, we really should be just using what’s already out there – or, of course, joining in with developing it, like we’re doing with Moodle.