Cory Doctorow – a little bit pregnant

Cory Doctorow, the geek Dad, digital rights activist, writer, sci-fi author, and famous cape-and-goggles-wearing blogger from Boing Boing, visited the OU on 18 May 2011 to give a talk. He’s also a visiting lecturer in the Computing Department at the OU.

Photo by Paula Mariel Salischiker, pausal.co.uk, CC-BY

These are my liveblog notes.

Cory last came to the OU on 26th February 2009, to give a Computing Research Centre Distinguished Lecture titled “Freedom and technology: who’s the master?”. I was off sick at the time and followed it from my bed on the web stream; unaccountably I didn’t manage to take notes. From memory, it was pretty good. The video is archived – you’ll need to scroll through the ‘Past Events’ dates to find it. (At the time of writing, it’s on page 11.)

‘A little bit pregnant’ – regulation and technology

There are many regulatory demands to outlaw the modification of tools towards undesired ends. Many examples – regulating general-purpose computers and networks.

Background first. Two modes of regulation. We regulate to achieve some end – e.g. we prefer no physical harm to people. First mode – prior restraint: we say these things are not allowed, or are constrained in some way. E.g. you can’t own a gun except in some circumstances, and some guns not at all. But we do not regulate every object that might harm someone: a) the cost of regulating would be too high, and b) it wouldn’t work. Second mode – regulate after the fact. You can own some things – e.g. saws, cricket bats – but if you do bad things with them, you get punished. General purpose tools are bad candidates for prior restraint.

So, to general purpose computers. Traditionally we thought of computers as more of a single-purpose thing – we think of complicated things as specialised. Also expensive – and it’s easy to regulate expensive things. But the expense and bulk of computers is ending. And the special-purpose character was a blip too. Now we throw the same commodity hardware at whatever problem comes up. And many more of our problems are computational – more so as computation power increases.

Universal character is a feature, not a bug.

Networks used to be very specialised too – faxes, phones, each with its own network. Faxes are interesting: we once had an entire parallel infrastructure for moving low-res bitmaps around. It seems bizarre to be so narrow. But the Internet treats them all as mere applications, layers on top of the general purpose network.

Nearly everyone wants it to get faster, better, cheaper – it improves everyone’s life, so lots of resource goes in to improving it.

Reflection
Cory is from the Internet, where there are many photos of cats, often with captions that are funnier than this one, and only marginally less self-referential.

This has funny consequences for regulation. Some specialised, esoteric things are just computers running particular software. Think radios – they used to be specific hardware, but now software radios can change their characteristics based on software – the same device could be a 3G radio, wifi, TV tuner, FM tuner, and so on. This matters because radio regulation assigns certain frequencies to certain purposes, and insists radio makers confine their devices to those bands for those purposes. So baby monitors are hard-wired not to interfere with airliners.

Copyright wars. Software restrictions on copying are longstanding and varied – but none of it actually works. BBC iPlayer has measures to try to stop you storing programmes beyond 30 days, but they can be routed round easily. They are transmitting the same file in the clear, nationwide, at broadcast time. Or you can pick up a cracked version that someone else has taken from iPlayer. Even the most technically unsophisticated can work out how to do it.

General purpose networks are under attack – measures to stop the wrong bits. Censorship, anti-terrorism, and rights enforcement are being required of ISPs – and it’s easy to get round the blocks. Example of a child porn blocking company in Australia: the contractor said the blocking doesn’t stop determined people from getting at the material, it just stops people accidentally seeing it – and it can have many bad consequences. E.g. 98% of sites on a child porn block list were not that at all.

Using a proxy, tunnelling, or whatever are built-in capabilities of general purpose networks.
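As a minimal sketch of how little machinery this takes (mine, not Cory’s; the proxy address is hypothetical): in Python, using only the standard library, routing your traffic via a proxy is a few lines.

    # Minimal sketch: send ordinary HTTP requests via a proxy using only the
    # Python standard library. The proxy address is hypothetical.
    import urllib.request

    proxy = urllib.request.ProxyHandler({
        "http": "http://proxy.example.net:8080",   # hypothetical proxy
        "https": "http://proxy.example.net:8080",
    })
    opener = urllib.request.build_opener(proxy)

    # Requests made through this opener go via the proxy, so a block applied
    # on the direct route never gets consulted.
    with opener.open("http://example.com/") as response:
        print(response.status, "-", len(response.read()), "bytes fetched via the proxy")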

There are also attempts to interdict the hosting – the Viacom/YouTube suit. YouTube’s volume was 29 hours uploaded every minute a year ago – it’s now BIGNUM. More hours than there are lawyer-hours in the world to check it. Viacom want to establish that the intermediary has to make best efforts to block copyright violations – and when you chuck in all the content created, you have no chance; you’d need expert systems running one per hydrogen atom to get to that scale.

You can try to design computers that aren’t general purpose. So the FCC, on software radios, proposed trusted-platform-only devices – but that would cover any PC. What seems like a small piece of regulation – radios must control which programs they can run – would cover every general purpose computer on the planet. It also wouldn’t stop bad guys who want to do bad things. These things matter – e.g. radio interference with airliners.

In some sense there’s no such thing as a stream versus a download – you’re still sending someone a stream of bits. Sending a ‘stream’ means “I hope your software doesn’t save this”. There is simply no way to show a picture, or play a song over the internet, without giving them a copy of it.
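To make that concrete (my illustration, not Cory’s; the URL is hypothetical): anything a player receives as a ‘stream’ is just bytes arriving over the network, and the receiving software can save them as easily as play them.

    # Minimal sketch: a "stream" is just bytes arriving over the network;
    # whether to play them or save them is the receiver's choice.
    # The URL is hypothetical.
    import urllib.request

    STREAM_URL = "http://media.example.com/song"

    with urllib.request.urlopen(STREAM_URL) as stream, open("song.saved", "wb") as out:
        while True:
            chunk = stream.read(64 * 1024)   # read the incoming stream a chunk at a time
            if not chunk:
                break
            out.write(chunk)                 # ...and write it straight to disk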

There are incentives to create software antithetical to users’ interests. Games console companies charge monopoly rents to developers – they sell consoles cheap, and get money from game developers for the keys so that code will run on them. Mobile companies want to lock your phone to a contract. In the UK it’s legal to unlock, but sometimes technically difficult; in other jurisdictions it’s illegal and/or very hard. Apple play all sides of the lockdown game – they want to prevent apps bypassing their App Store, and to prevent bypassing the iTunes Store for audio/video; there are legal and technical restrictions to stop you moving it.
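To make the ‘keys so that code will run on it’ point concrete, here’s a generic sketch (mine, not any real console’s actual scheme), in Python with the third-party cryptography library: the device ships with the vendor’s public key baked in, and refuses anything not signed with the vendor’s private key – which is what developers pay for access to.

    # Generic sketch of signed-code enforcement – not any real console's scheme.
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric import ed25519

    # In reality the private key never leaves the vendor; it's generated here
    # only so the sketch is self-contained.
    vendor_private = ed25519.Ed25519PrivateKey.generate()
    vendor_public = vendor_private.public_key()   # this is what ships inside the device

    def device_will_run(code: bytes, signature: bytes) -> bool:
        """The locked-down device: only code signed by the vendor gets executed."""
        try:
            vendor_public.verify(signature, code)
            return True        # vendor-approved, so it runs
        except InvalidSignature:
            return False       # homebrew or unapproved code is refused

    licensed_game = b"print('licensed game running')"
    paid_for_signature = vendor_private.sign(licensed_game)       # the thing developers pay for
    print(device_will_run(licensed_game, paid_for_signature))     # True
    print(device_will_run(b"homebrew code", paid_for_signature))  # False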

But this doesn’t work.

It’s not without consequences: it’s designing computers to betray their owners’ interests.

Applying prior restraint to general purpose PCs means they are designed in this way. The Free Software Foundation raised the alarm about the Nintendo 3DS this week – it does many interesting things: it tries to connect to the Internet, downloads software updates, runs a check on whether you’ve done something they don’t like, and bricks itself if it thinks you have – and you can’t stop any of that. It also comes with a legal agreement that they own all the photos, videos and so on that you create with it.

This is happening more, and harder. There’s a whole realm of regulation here. None of it is particularly effective at stopping bad things from happening, but the tools and legislation can themselves be used for bad purposes. These systems empower powerful and unscrupulous people at the expense of weaker and more scrupulous people.

This problem isn’t going away, and isn’t unique to this moment. The more tech we have, the bigger it becomes.

3D printers bring in a whole load of other people who want to outlaw things too. The fact that we call them ‘printers’ shows we don’t know what they’re for yet. His favourite 3D-printed object: he had a hip problem; his wife smuggled out his MRIs, and had his femur printed in stainless steel at a smaller scale, which he uses as a key fob. Problem: in a jurisdiction where sex toys are illegal, combine the 3D files and the 3D printers and you have a legal issue. Many people could be concerned – indigenous people wanting to prevent duplication of sacred objects, Mattel worrying about interchangeable anatomically-correct Barbie replacements, gun regulators who don’t want you to be able to convert your gun from one sort to another, and so on. And bioprinters are on the way – biohacking.

Photo by NK Guy, nkguy.com

The general purpose computer is everywhere – your car, your phone – and will be under your skin and elsewhere. It will be even better than the wetware it replaces. Cory expects to have a hearing aid in time, and expects it will have a buffer, input from devices he’s carrying, and so on. What if the hearing aid is designed so you can’t control the software, can’t open it up and see what it does?

Anti-copying software that hides itself … is a great place to hide a virus. Once we introduce these measures, we build tools for people to spy on us and control us. Just ‘a bit locked down’ is like ‘a little bit pregnant’. When it can be done for a legitimate purpose, it can be done for illegitimate purposes.

There are many genuinely bad things you can do with general purpose stuff. We genuinely want to prevent them. But locking down these commodity networks and computers won’t stop it – it just means only the bad guys can do those things.

Regulating general purpose computers and networks is not like regulating cricket bats – it’s like regulating wheels. We all use them.

We need to get this right. First step is to stop thinking we can be ‘a little bit pregnant’.

Example from software radio engineers: if we all have software radios, and some person’s radio clobbers everyone else’s, we’re in a position to locate them, identify them, and ask them or make them stop. The collapse of the prior restraint model is also the collapse of the need for it. Finding the naughty radios in the field is obvious and easy (at the gross level, at least). Regulation after the fact – and that works. German sex abuse survivors came out against a firewall against child porn: when this material disappears from view on the Internet, it makes it easy to forget this abuse exists – better to divert resources to tracking it down and stopping it at source, rather than regulating it to be out of sight, out of mind.

This is the major regulatory battle for the next 30 years.

I don’t have the right answers – but I know what won’t work.

Meat

Questions

Mike Richards: Devil’s advocate. Many people buy Xboxes and iPads because of the consistent user experience – which is achieved by locking down. Perhaps most people don’t care, or actively like these devices.

Cory: I don’t buy that if people like it, you therefore need to make a countermeasure to stop it happening. Even if there’s market demand, that doesn’t avoid e.g. the situation where a device that holds your credit card details runs software you don’t approve of – that will bite you on the bum. Curatorship – the only way to navigate the wealth of information is human and algorithmic curation. It’s fine to choose curation. Ubuntu is a curated version of GNU/Linux. There’s a confusion of ways of solving problems in there, but Ubuntu makes a linear experience for you – here’s the word processor, the desktop photo, and so on. And if you disagree with any of it, you can add anything else instead. So you get a fully curated experience – it works out of the box. But when you hit something they didn’t think of, or that you disagree with, you can override it. And you can step back to the curated version if it all goes wrong. It’s one thing to be curated, another to be locked in. The Seattle public library is strongly designed: they said only certain fonts can be used in the signs, which they have to pay to print – the toilets are hidden and they can’t afford to make signs for them. As much as you like your Xbox, that doesn’t mean you should stop other people from changing theirs.

John Naughton: This fills out gaps in more general narratives, e.g. the absurdity of regulating generative computers, as in Jonathan Zittrain’s account of generative systems. It’s hard to explain to non-tech people why open, general purpose stuff is important; your examples are a good way of putting flesh on the idea. On the question Mike asked about Apple: the general narrative is like Tim Wu‘s argument – you start with chaotic, optimistic, unfettered tech, then after time it gets locked down. In the case of these industries, it’s people offering the ordinary consumer a better product that they really like. E.g. AT&T’s offer was: you pick up the phone and you get a dial tone. Steve Jobs is our contemporary equivalent of the movie moguls of the C20th. They do it by offering to consumers who don’t like tinkering, who don’t like stuff that doesn’t work. The challenge is to find ways to explain why that’s important. You may love your iPhone, but it’s a Faustian bargain. But when I tried an Android phone, none of the apps would work.

Cory: My Android phone works today! But this is important. The account of why Apple software works blends two different control systems and treats them as the same: the initial curation, and the lockout. AT&T made this argument – we provide the dial tone; if you plug anything in, you endanger the dial tone – the Hush-a-Phone case. If you add a bit to an iPhone to let you add stuff Steve Jobs hasn’t approved, it doesn’t make all the beauty drain out of it. It’s not that people make good decisions – often they make bad ones. It’s fine to ask you several times if it’s OK before it does something dangerous. Microsoft rolled out Information Rights Management – DRM added to Word documents. You could send someone a Word doc they could read for a day, or couldn’t print, or couldn’t copy-and-paste out of. The line was that you wouldn’t want someone at a medical office to leak the fact that you were HIV positive. But – if you have someone at the medical office who is malicious enough to leak that fact, and so dim as to only be able to do it by copying and pasting from a file, you have an HR problem, not an IT one. But this would also block the making of OpenOffice and Google Docs. It was proposed as a normative method to encourage people to do the right thing – but wouldn’t a dialog box do the job? It’s the deliberate blending of two causes for control – it’s one thing to say this Store has apps with guaranteed quality; it’s another to say no other store can exist.

Arosha Bandara: Is your position that there’s no place for prior-restraint based regulation in the tech space, and that post-hoc regulation is sufficient? What happens when there are people who don’t care about the consequences?

Cory: It is much simpler to create a computer that can run all programs than to create one that can run them all except some you don’t like. I’m sceptical that prior restraint can accomplish its goals. It’s also technically very challenging, so the likelihood of e.g. vulnerabilities increases. It’s not that I say it shouldn’t be done, it’s that it won’t work, and it’ll make bad things happen. If there were a formal proof of a computer that can run all software bar one, that’d be OK, but I’m not aware of that as a theoretical possibility. There are opportunities for malicious code – virus writers infect your computer all the time; they exist in this complexity of conflicting security goals. Adding in a mechanism to run software without user control creates a whole host of extra vulnerabilities. The response is – we’ll fix it, we’ll iterate. So for people who don’t care about the consequences, get consensus among computer scientists that it doesn’t work: DRM is just a speed bump. And it’s not a very good speed bump. A P2P distribution audit company looked at the mean time between a song being released only on iTunes with DRM (back when it had it) and it showing up cracked on pirate sites – the mean time was 180 seconds. This solution doesn’t work. We have increasing scientific consensus, as on climate change. There are people who’d like it if homeopathy worked, and who would like it to be possible to prevent the duplication of certain bits. But we don’t live in that world. You need to approach this by looking at solutions that work, not what vendors will sell you. If you put out an RFP for magic beans, you’ll get bids, but the beans will not be magic. There’s a narrative side to it – when we imagine the worst consequences of DRM, it’s that I can’t install an app, or my Xbox is locked out of the network. But e.g. allowing a privacy flag, or a hearing aid that puts stuff in your ear that isn’t happening, or sends it to someone else – we’ll see people reacting to that. It’s not getting a Lady Gaga single without paying for it – it’s your VoIP connection being disconnected erroneously so you can’t dial 999. I have a book coming out called Pirate Cinema, which deals with three-strikes rules. Increasingly, our lives are grounded in the network – you need access to be a fully functional member of society. In the near future, costing your parents their internet access ruins your family – they can’t work or get benefits, your sister can’t study, and so on.

David Bowers: Very cogent arguments for why this is a difficult problem. How do you rate the chances of dealing with regulation when it’s done by politicians who are not as well versed in the technicalities as computer scientists might be?

Cory: Hard question. I have dealt with politicians. I think the younger politicians tend to be a little better. But they’re susceptible to the easiest answer, the inoffensive middle: some say everything should be free, some say the PC should be tied to your neck, so the sensible middle is to have some lockdown. A better framing of the question is: should your private property rights be trumped by IP rights? That can shift the argument. We need to coordinate arguments. There’s no inevitability to us winning. You have to believe you might win, but not that the internet just solves it. We need to make good tech, good arguments, good political interventions.

Richard Leggett: Ownership of your own files. I have seen horrifying things happening with close friends and family who are non-technical. Last week I was at Google I/O; I’m a fan of disruptive tech. There was also the final release of Chrome OS. It seems like a sure thing – it’s likely the future of operating systems, on TVs, on phones. How much control will you have when your stuff isn’t even stored on your own device? People are already giving up so much privacy, e.g. on Facebook; this completely takes control away.

Cory: Two separate questions – giving away privacy, and cloud computing. Google is full of smart people, some of whom are my friends. Google suffers from wishful thinking with ChromeOS. With wireless devices, what are the chances you have enough uncontested bandwidth for even general computing – e.g. a bus full of commuters all listening to their music? We only have one RF spectrum! For Google, an intimate relationship with how you compute is good from a business perspective. But it means taking hyperabundant local storage and replacing it with tight, fraught, limited spectrum and networks. I just bought a 600GB SSD and paid the same amount, $1,500, that I paid for the first 16K of RAM I bought. That screaming hockey stick of progress, mapped against bandwidth availability – they don’t even come close; it’s the tortoise and the hare, only the tortoise isn’t going to win. (A rough back-of-envelope on that comparison is sketched after this answer.) There are some potential improvements, but there’s only so much energy you can put in the sky before birds start falling out of it. Privacy is a different question – danah boyd released a draft paper a few weeks ago. There are lots of ways to establish privacy: one is normative, others are regulatory and technical control. And teenagers only have the normative one available. They have normative codes, and to say they don’t care about privacy is untrue; what they have is limited technical ability to enforce compliance. They may not be good at privacy, but they care about it very much and think about it a lot. I do worry about privacy and disclosure. If we want to raise young people to care about privacy, don’t install spyware on everything they do – e.g. smartphone apps to locate kids. There are scenarios where that’s the right thing to do. But to say that privacy is important, that you can’t get it back when it’s gone… and also to say that everything you do and send, I’m going to listen in on – those two messages are in conflict. If we instead say: creeps and spies are trying to track you, and my job is to tell you how to be safe from that – that instils a belief that privacy matters.
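A rough back-of-envelope (mine, using Cory’s figures as quoted above, and ignoring inflation and the fact that RAM and SSD aren’t like-for-like) on how steeply storage per dollar has improved:

    # Rough back-of-envelope on the storage comparison above:
    # a 600 GB SSD for $1,500 now vs 16 KB of RAM for $1,500 then.
    ssd_bytes = 600 * 10**9        # 600 GB
    ram_bytes = 16 * 1024          # 16 KB
    price_dollars = 1500           # the same nominal price in both cases

    improvement = (ssd_bytes / price_dollars) / (ram_bytes / price_dollars)
    print(f"Bytes per dollar improved by a factor of roughly {improvement:,.0f}")
    # ~37 million-fold - the 'screaming hockey stick' that bandwidth growth hasn't matched.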


This work by Doug Clow is copyright but licensed under a Creative Commons BY Licence.
No further permission needed to reuse or remix (with attribution), but it’s nice to be notified if you do use it.


One thought on “Cory Doctorow – a little bit pregnant”

  1. Oooooh thanks for writing this up.

    I was just talking to someone yesterday about risk management in child protection, and the difficulty of explaining it to that field’s equivalent of “non-tech people”. The arguments in favour of intrusive blanket monitoring can seem at first glance so apparently simple, whereas the arguments against it take actual thought!

    Co-signing the point about narrative – that is so true. Stories help people to imagine what we’re talking about.

    And this is v cool and widely applicable:
    It’s not that I say it shouldn’t be done, it’s that it won’t work, and it’ll make bad things happen.

