Suppose someone designed a television that monitors the programs you watch and blocks access if it detects you watching an objectionable program. Or an mp3 player that shuts down if you start listening to music from an illegal file-sharing website. Or an e-book reader that prevents you from reading anything on the Vatican’s Index of Prohibited Books. Would you buy it?
The answer is probably a resounding No. Yet corporations like Apple and Viacom intend to design computers that are capable of betraying their users.
As an expert on this terrifying subject, Canadian blogger and journalist Cory Doctorow gave a keynote address on March 5 at the third annual University of Toronto iSchool student conference, “Boundaries, Frontiers & Gatekeepers.” Doctorow is a noted science fiction author and technology activist from Toronto who currently resides in London, England. His first novel, Down and Out in the Magic Kingdom, was the first book released online under a Creative Commons licence so that readers could copy and share the work.
His other novels, Little Brother, Makers, and For the Win, have garnered commercial success and critical acclaim. Much of Doctorow’s fiction addresses contemporary issues in a near-future setting.
Doctorow regularly lectures on issues such as copyright, digital rights management, and technology regulation. His talk, A Little Bit Pregnant: Why it’s a Bad Idea to Regulate Computers the Way We Regulate Radios, Guns, Uranium and Other Special-Purpose Tools, explored attempts to regulate technology to prevent copyright infringement. According to Doctorow, “Building a general-purpose PC [a desktop or laptop computer] that is just a little bit locked down is like finding a woman who is just a little bit pregnant. Once the facility can be used for a legitimate purpose, it can also be used for an illegitimate purpose.”
He began his talk by listing a series of scenarios and asking what they had in common. These included Viacom demanding that Google create an artificial super-intelligence that can instantly delete copyrighted videos, and mobile network providers locking their phones so customers cannot take them to rival carriers. The commonality was that these were all examples of regulating the general-purpose computer.
“Historically, we’ve thought of computers as a special purpose object,” said Doctorow. “It seems the expense and bulk of computers was an extremely temporary condition; and that every year we’ve seen an accelerating trend of computers that become cheaper, smaller, more powerful.”
Computers are now becoming less and less specialized. They do everything: radio, Internet, videogames, word processing, self-publishing, graphic design, and more. Anything can be accessed, hacked, rebuilt, and broken down into new forms. This is exactly the “maker” culture Doctorow addressed in his book of the same name.
To illustrate this issue during his lecture, he brought up the example of the BBC’s iPlayer, a service that allows users to download and view BBC programs. Despite its aim to make these programs more accessible, it featured “strange characteristics.”
“For some reason or another, they only want you to be able to look at those files for thirty days and only be able to download it for seven days after the program airs, giving those files on your computer a maximum life of thirty-seven days,” he noted.
This seems an absurd and heavy-handed approach to policing what people watch on their computers, and the response from users has been predictable: the software meant to delete these files has been easily hacked, with the result that every program from the iPlayer ends up available for free download on the Internet.
The response of the corporations that consider themselves copyright holders has not been to look to user-based solutions, or even to give up their attempts at enforcement. Instead, they are trying to find more ways to lock down computers and networks.
“Internet service providers are being told all over the world that they should act as copyright police and they block certain websites,” he said. As if this violation of net neutrality weren’t bad enough, there is now the attempt to embed these kinds of copyright protections in hardware itself.
“They want to design computers from the ground up that have this ability to run programs that their users can’t control or inspect, or [are] designed to work even if the user doesn’t want them to,” Doctorow explained. “They could run even if the owner of the computer does not believe it could be in their best interest for them to be running.” Locked phones and Apple products that only allow you to use apps that have been approved by Apple (apps that give Apple a thirty per cent cut) are early examples of this type of technology.
“Indeed once you start calling this what it is, a computer that is designed to betray its owner’s interest, it becomes immediately obvious why we shouldn’t do it,” argued Doctorow. “We are adding the legal and technological infrastructure to arbitrarily prevent code from running on computers, or to covertly run software on a computer to eavesdrop on all network communications, to block certain websites and services, and to have websites remove content on an even greater set of nebulously defined pretences with greater penalties for failure to act.”
Frightening. Corporations are creating a regulatory framework that would seek to put Big Brother on your PC and punish the people who use their products. Imagine accidentally visiting a website deemed “copyright infringing” or even illegal, and having the government or a corporation send you a friendly email explaining that you will be fined or arrested.
But why are corporations hitting the panic button now? The thinking seems to be: “Well that didn’t work the first time, so let’s try something harsher and more draconian.”
To put Cory Doctorow’s talk into context, let’s consider a discovery made by designers of artificial intelligence that helps explain some reactions to technology. This is the “frame problem”: the inability to understand or foresee all the consequences of a particular object, because one cannot properly see outside one’s own “frame” of reference.
For example, no one was able to foresee that the car would have uses or implications beyond moving a person from point A to point B. Cars not only revolutionized the structure of cities, they also became the most efficient emitters of harmful pollutants into the atmosphere. Communications theorist Marshall McLuhan referred to it as “rearview mirror thinking” — the attempt to apply the concepts and approaches of the old technology to the new. Think of “horseless carriages” and you get the idea.
In the case of the computer, no one foresaw a device that would not only connect people, but would also allow users to quickly and efficiently share music with one another, or even create whole new areas of “cybercrime.” Having failed with the traditional “top-down” regulatory framework, corporations are now looking to bottom-up hardware solutions that do nothing to solve the problem and will inevitably punish the rest of us.
“These rules and systems have the effect of magnifying the advantage of the powerful and unscrupulous at the expense of the scrupulous and the honest,” explained Doctorow. Attempts at blanket solutions for computers and networks only aid and abet such groups as medical quacks who use, in Doctorow’s example, Britain’s strict libel laws “to pursue science writers who make such outrageous claims as ‘AIDS cannot be cured by vitamins’ or ‘chiropractors won’t cure your cancer by holding their hands over you.’”
Doctorow was himself a victim of a blanket solution to copyright problems on the Internet. Copies of Doctorow’s book Down and Out in the Magic Kingdom, which was published online under a Creative Commons licence, were removed from Scribd, a site that allows users to share text files with each other. The Science Fiction and Fantasy Writers of America, using the Digital Millennium Copyright Act as justification, argued that a number of works on Scribd infringed the copyright of SF writers Isaac Asimov and Robert Silverberg. However, the list included only their last names, and so pulled up every work on Scribd that mentioned Asimov or Silverberg, even works they did not write. Included in the takedown were a high school teacher’s SF bibliography and the back issues of an out-of-print publication called Ray Gun Revival. The works, including Doctorow’s, were restored when the SFWA realized its mistake.
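The failure mode behind that takedown, flagging any document that merely mentions an author's last name, is easy to reproduce. The following Python sketch is purely illustrative (the titles, the matching function, and the keyword list are invented here, not drawn from SFWA's actual tool):

```python
def naive_takedown_matches(documents, keywords):
    """Flag any document whose text mentions a keyword at all.

    This is the over-broad approach described above: it cannot tell
    a work *by* an author from a work that merely *mentions* them.
    """
    return [
        doc for doc in documents
        if any(keyword.lower() in doc.lower() for keyword in keywords)
    ]


# A hypothetical library of user-uploaded texts.
library = [
    "Foundation, a novel by Isaac Asimov",                      # actual work
    "A high school reading list: Asimov, Silverberg, Le Guin",  # innocent mention
    "An essay on beekeeping",                                   # unrelated
]

flagged = naive_takedown_matches(library, ["Asimov", "Silverberg"])
# The reading list is swept up alongside the actual novel: a false positive
# of exactly the kind that hit the teacher's bibliography on Scribd.
```

Running this flags both the novel and the innocent reading list, while only the beekeeping essay escapes; matching on names alone simply cannot distinguish infringement from reference.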
“Designing general purpose computers that sneak around their owners’ backs is a terrible idea. We’ve already seen what happens when you add just a little bit of control to networks and computers — most recently we saw Iran’s and Egypt’s secret police mining Facebook to figure out whom to arrest,” he explained.
Furthermore, imagine that you are a virus writer, an identity thief, or a hacker. Hidden programs that users cannot control or access would be the perfect ones to break into, since through them you could disrupt the computer’s normal functioning or spy on its owner without their knowledge. Locked-down computers are the bad guys’ paradise.
Doctorow’s conclusion can only be described as a stirring call to action not only for librarians, archivists, and information studies students, but for anyone who regularly uses technology.
“This fight is the leading edge of a series of regulatory battles that are going to take us through this century and have at stake whether the infrastructure of the information society is going to have embedded in it control surveillance technology that will do nothing to mitigate harm, but put us all in harm’s way,” he concluded.
Computing technology stands at the precipice of an exciting frontier in which all of us can become “makers” and enjoy a world in which we can innovate and recreate old technologies to make something better and new that can be easily shared across free networks. Or it might succumb to a dark future of locked-down computers and monitored networks that will be used by the unscrupulous, the dishonest, and the distrustful to control us all.