Towards a new dot.commons
I wrote this in 2003. It is painfully out of date and wrong in many respects. It is also accurate, though it seems that instead of the trusted computing architecture that I believed would underpin a sanitised, corporately controlled and regulated online environment, our gaolers of choice are Facebook and the App Store.
I leave it here for your delectation and delight.
“After the correct political line has been laid down, organizational work decides everything, including the fate of the political line itself, its success or failure.”
from the report of Joseph Stalin to the 17th Congress of the Communist Party of the Soviet Union, 1934.
The Death of the Liberal Internet
The Internet that we know today is dying, its values and freedoms choked by oppressive and intrusive governments, its standards broken by rapacious corporations seeking to trap their customers, its common land abused by spammers and virus writers who exploit openness to spread misery.
Unable to sustain itself against this onslaught, the Net is changing, turning from an open, enabling and profoundly public space into a communications system which can be regulated, controlled, monitored and — where necessary — curtailed.
This is not inevitable. A regulated Internet does not have to be a closed Internet. However the trends today are towards increased control and towards the loss of the freedoms which the Net has provided thus far. We must understand how this is happening before we can identify ways to resist it.
This essay explores the way forward. First, we will look at the problems that face today’s Internet. We will then explore one possible direction for its development, a direction that offers the most hope of eventual salvation for the most important aspect of net culture: the public online space called the dot.commons. Finally, we will see that once the political line has been agreed, and once we decide that the dot.commons is worth preserving and extending, then it will be organisational work and programming that decide everything.
Too Open to Last?
Today’s Internet has a technical architecture which expresses certain liberal values, largely concerned with fair access to the net’s resources, lack of centralised control, support for freedom of speech, openness to innovation, and resistance to monopoly — either cultural, economic or technological.
These values are implicit in the way that it links computers and networks together and moves data around, because they are a consequence of the way that every computer on the Net communicates with other computers. They are embedded in the network’s protocols, the standards which determine how connections are made and how data is moved.
One important consequence of this is that anyone can write an application that uses the Internet to create a communications channel between any two co-operating computers, and the network has no reliable way of knowing what the data being transmitted means or how it is being used.
This makes censorship, monitoring and control remarkably difficult. They are not impossible, but the network tends towards liberal values just as a flower turns toward the sun.
The idea that the network just moves bits around and does not concern itself with the meaning of the data is generally called the end-to-end principle[1]. Unlike a political ideology, the end-to-end principle is not an abstract philosophical issue but a statement of the technical capabilities of the network. It tells us what facilities are available to those who write programs that use the network, and is therefore a much stronger determinant of behaviour than a belief in social justice, free markets or even a god.
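The end-to-end principle can be made concrete with a minimal sketch. The following toy program, which assumes nothing beyond the Python standard library, opens a TCP connection between two cooperating endpoints on the local machine; the transport layer simply delivers opaque bytes, and only the applications at each end decide what those bytes mean.

```python
# Toy illustration of the end-to-end principle: the network moves
# opaque bytes between two cooperating endpoints; any meaning those
# bytes carry lives only in the applications at the ends.
import socket
import threading

def serve_one(listener: socket.socket) -> bytes:
    """Accept one connection and return everything it sends."""
    conn, _ = listener.accept()
    with conn:
        chunks = []
        while True:
            data = conn.recv(1024)  # the network delivers bytes, nothing more
            if not data:
                break
            chunks.append(data)
        return b"".join(chunks)

# Listen on an ephemeral localhost port.
listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
listener.bind(("127.0.0.1", 0))
listener.listen(1)
port = listener.getsockname()[1]

received = []
t = threading.Thread(target=lambda: received.append(serve_one(listener)))
t.start()

# Whatever application-level "protocol" we invent is invisible to the
# network: the payload here happens to be text, but it could be anything.
with socket.create_connection(("127.0.0.1", port)) as client:
    client.sendall(b"any bytes, any meaning")

t.join()
listener.close()
```

Nothing in the transport layer inspected, filtered or interpreted the payload — which is exactly why censorship and monitoring must be bolted on elsewhere.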
Unfortunately for those who espouse these values, the fact that the protocols embody liberal values does not guarantee that the network itself will be a force for social good. The freedom that they provide is available to all, even those who pursue an illiberal agenda, and the tools created on top of the network, programs like a Web browser or an email client, do not have to demonstrate the values that underpin the network as a communications medium. It is as easy to write the CyberPatrol internet filtering program as it is to write the KaZaA peer-to-peer file sharer. It is also as easy for an oppressive, illiberal and authoritarian government to make use of the network as it is for a liberal social democratic administration — as we see in China, Singapore and, more recently, the United States of America.
The greatest problem is that the net’s lack of any mechanisms of control or regulation has also left it remarkably porous to ideology and subject to colonisation. Like the Labour Party under Tony Blair, the Net has no core beliefs of its own and can be hijacked by any sufficiently aggressive ideology.
In the early days the dominant ideology of the Internet was that of the academy, with sharing of information, freedom of speech and a commitment to the non-commercial use of the network. This changed from 1990 onwards, as the free market/free speech values of the US asserted themselves online. The result is that the dominant ideology on today’s Internet is that of the United States of America, and we see a blind adherence to what Richard Barbrook and Andy Cameron have termed the ‘Californian ideology’ in almost any discussion of network regulation or politics.
In fact US values and US interests dominate so comprehensively that it is reasonable to view this as hegemony and to see the Internet today as expressing a US world view. That world view is largely supportive of free markets and the interests of companies operating in those markets, rather than explicitly based on the values embodied in the Constitution of the United States.
Now, however, a far more worrying situation has arisen. The Net is in trouble, not because US culture and US values are dominating an essentially open network, but because governments and corporations around the world are making a concerted effort to dismantle this open Internet and replace it with a regulated and regulable one, one built to impose an ‘architecture of control.’
We have lived with US hegemony in the wider world for many years, and on the open Internet it has never been absolute: US values may be dominant but they are not exclusive, because today’s Internet cannot be completely controlled. On the Net a thousand flowers can bloom, and all points of view can find expression. Thus even those who object to the values which the US seeks to promote online have been able to live with them.
If a closed network is built then the losers will be those who want to use the Net freely, to share information across borders, to explore ideas or challenge institutions. With no space for resistance or revolution, the shared social space provided by today’s Internet will vanish, and the potential for play, exploration, discovery and innovation will vanish with it. The dot.commons will disappear, and only those activities approved by government or business will survive.
The Need for Engagement
The technologically astute within the neo-conservative cabal that currently dominates US policy making and drives the Bush administration’s trade agenda realised some time ago that the operation of the Internet Protocol — usually abbreviated to IP by those who built and manage the Net — was potentially very damaging to their preferred IP, intellectual property.
The reason is simple. The Internet provides for simple and easy copying of digital data, and it expanded its reach enormously just at the time when the content industries — publishing, music and film — were all moving from analogue to digital means of production and distribution.
This means that the Net can be used to share — or steal, if you prefer — music files, ebooks, the latest movies and any other form of digital content. Not only that, but the current version of the Internet Protocol does not even provide for authentication, user identification or efficient surveillance of data transferred over the network. Any remotely competent programmer can sit down and write software like Napster, Gnutella or Freenet[1], and with it challenge the distribution model that has generated such generous profits for the music and movie industries over the years.
This is not a situation which the executives of those industries find acceptable, and as a result they have put pressure on the politicians whose election they bankrolled to force through massive changes in the way the Internet works.
The Net is being redesigned to provide greater control to those who would regulate and legislate its operation and far greater technical and legal support to those who have commercial interests in the information transmitted across it. Technical innovations such as trusted computers, signed content, digital rights management and protected systems find a legislative echo in the Digital Millennium Copyright Act, the European Union Copyright Directive and the European Cybercrime Treaty.
This campaign has two main goals. First, it is being promoted as a way to manage the online use of intellectual property of all kinds, as discussed. However there is a wider agenda. Its proponents believe that the new network technologies will make it impossible to resist the imposition of the US world view, so that hegemony will become imperialism. Eventually US economic, political and cultural values will dominate the network just as US interests dominate the World Trade Organisation, the World Bank and the International Monetary Fund.
However there are likely to be unanticipated consequences of this, not least because the closed network will give each country the ability to assert its own values on those parts of the global Internet that fall within its jurisdiction. On the closed network nation states will be sovereign, as they are in their physical territories. And this will mean that those states who do not share US values when it comes to freedom of speech, the publication or sharing of information, openness to critical comment or unregulated markets will be able to do what they want, using the tools provided.
Whether the US will accept this remains to be seen. Recent proposals in the US Senate to fund ‘anticensorware’ and to support projects which undermine restrictions on the use of the Internet imposed by other states[2] would lead one to think that they will resist it, just as they resisted the spread of cryptography for many years.
Trusted Systems
There is already a growing sense among politicians of all parties, in all countries, that this is the right time to regulate the Internet. The Internet is widely used and plays a key part in Western economies, and its ownership is now sufficiently concentrated in large corporations to permit governments to feel that laws will be obeyed, since instead of tens of thousands of recalcitrant programmers and Web publishers it is only necessary to frighten a few tens of company chief executives into complying with new rules. This in turn means that active resistance to new laws and more restrictive technology is likely to be much harder than it was in the past.
Proposals for ‘trusted’ computers lie at the centre of this nexus, and have become the flashpoint for the battle between those who want to retain an Internet much like today’s and those who want to see a network which is amenable to regulation and control.
A trusted system is one made up of processors, networks and code which can be relied upon to operate in specified ways and which, because of this, can be made to follow externally imposed rules governing their operation.
On a regulated network power consists in having the ability to certify programs and files. This is done by using a specific, recognised digital key[3] to create an electronic ‘signature’ for the binary data which makes up every computer file, whether it is a written document, an image or even a program which can be run.
Instead of writing a program like Napster, releasing it onto the Internet and watching as millions of people use it freely, any programs that will run on trusted computers will have to be digitally signed by an appropriate authority before they can be used. Dangerous, subversive or inappropriate programs will not be signed, and so will not run.
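The gatekeeping this describes can be sketched in a few lines. The sketch below is a toy: real trusted-computing schemes use asymmetric (public-key) signatures checked in hardware, whereas here the standard library’s `hmac` stands in as a symmetric substitute, and the ‘authority key’ and program strings are invented for illustration.

```python
# Toy sketch of a signed-code gate: a trusted loader refuses to run
# any program whose signature does not verify against the signing
# authority's key. HMAC is a symmetric stand-in for the public-key
# signatures a real trusted platform would use.
import hashlib
import hmac

AUTHORITY_KEY = b"authority-secret"  # hypothetical signing authority's key

def sign(program: bytes, key: bytes = AUTHORITY_KEY) -> bytes:
    """Produce the authority's signature over the program's bytes."""
    return hmac.new(key, program, hashlib.sha256).digest()

def run_if_signed(program: bytes, signature: bytes) -> str:
    """Run the program only if the signature verifies."""
    if not hmac.compare_digest(sign(program), signature):
        return "REFUSED: unsigned or tampered code"
    return "running: " + program.decode()

approved = b"print('office software')"
subversive = b"print('napster clone')"

ok = run_if_signed(approved, sign(approved))        # the authority signed this
blocked = run_if_signed(subversive, b"\x00" * 32)   # no authority will sign it
```

The political point is in the last line: the subversive program is not broken, merely unsigned — and on a trusted platform that alone is enough to stop it running.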
Current plans are to make the use of these ‘trusted’ components optional, so that unsigned files and programs will still be usable, but it seems inevitable that use of the open network will decline and may even be outlawed in some countries. It is reasonable to assume that the regulated, trusted network will be dominant within five years of its general availability — by 2010.
The regulated network that is currently being designed and built by the large companies who are members of the Trusted Computer Platform Alliance checks the digital signatures of a file or program against a comprehensive database and will only read, edit or run those which are approved.
The most likely technical architecture for this signing system is a hierarchical scheme in which one digital key, the ‘root’ key, is used to sign other keys, which are then in turn used to sign keys and files. The authority of any particular key then derives from the root and can be traced to it.
This system is very similar to that used at present for certifying secure Websites, where a central signing authority issues keys to online shops, and they use these keys to verify their identity and to make possible the encryption of traffic between their server and a customer’s computer.
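The chain of authority in such a hierarchy can be sketched directly. As in the earlier sketch, this is a toy under stated assumptions: HMAC stands in for real public-key certificates, and the root key, shop key and document are invented names. The point is the shape of the verification — every link must trace back to the root.

```python
# Toy sketch of the hierarchical signing scheme: the root key certifies
# intermediate keys, which in turn certify files. A verifier accepts a
# file only if every link in the chain traces back to the root.
import hashlib
import hmac

def certify(parent_key: bytes, subject: bytes) -> bytes:
    """Parent key vouches for the subject's bytes."""
    return hmac.new(parent_key, subject, hashlib.sha256).digest()

ROOT_KEY = b"root"                       # hypothetical root of the hierarchy
shop_key = b"online-shop"                # an intermediate key
shop_cert = certify(ROOT_KEY, shop_key)  # root vouches for the shop's key

document = b"receipt for one CD"
doc_sig = certify(shop_key, document)    # the shop signs the file

def verify_chain(root: bytes, key: bytes, key_cert: bytes,
                 data: bytes, data_sig: bytes) -> bool:
    """Authority derives from the root: check each link in turn."""
    root_ok = hmac.compare_digest(certify(root, key), key_cert)
    data_ok = hmac.compare_digest(certify(key, data), data_sig)
    return root_ok and data_ok

trusted = verify_chain(ROOT_KEY, shop_key, shop_cert, document, doc_sig)
forged = verify_chain(ROOT_KEY, b"impostor", shop_cert, document, doc_sig)
```

Whoever holds the root key sits at the top of this chain — which is why the question of who holds it, a private company or a government, matters so much.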
That system is entirely outside government control: the root key is created by and held by a private company; the technology is developed by and managed by the server and browser developers; and the decision as to whether or not to certify a particular Website is entirely a commercial one.
It is therefore clear that the fundamental decision to be made when considering the development of the regulated Internet is whether the signing process should be in the hands of private companies or whether it should be carried out by government and controlled by statute.
Resistance is Useless
It is not enough simply to resist or seek to derail moves to develop and introduce this new technology. The corporations who favour trusted systems already exert a great deal of influence over the development of the technology through interest groups and industry consortia, and they have started to construct the legislative wrapping that must go around the technology to ensure that its use serves their interests over those of other groups, like individual network users or governments who wish to promote social objectives.
The Digital Millennium Copyright Act and the European Union Copyright Directive both provide strong legal protection for digital rights management systems, a key component of a future trusted network, but they do not provide guarantees that fair dealing or moral rights will be protected. These two laws, one in Europe and one in the USA, will be followed by many others, promoted by politicians who are acting according to the wishes of their corporate sponsors or by those who are simply unaware of the other sides of the argument.
The actions of the Recording Industry Association of America in suing individual users of file sharing networks, attacking providers of such services in the courts and threatening ISPs, companies and universities with legal action, are part of a clampdown on the free use of the Internet, begun at the behest of large entertainment companies like Disney, AOL Time Warner, Sony and BMG.
The freedom of expression which was once available to users of the Internet Protocol is being stripped away, and the public space of the Internet, that unstructured, unregulated zone of innovation that I call the dot.commons, is being destroyed. Our freedom to play, experiment, share and seek inspiration from the creative works of others is increasingly restricted so that large companies can lock our culture down for their own profit.
It is also important to realise that governments from all sides of the political spectrum, from open and closed societies, from Denmark to Saudi Arabia, have realised that effective legislative control over the network and online activity requires a technical infrastructure to support surveillance, monitoring and sanctions.
This is not being done in order to make the Net safer or more secure for its users, or for the five and a half billion people who have yet to use it. It is an attempt to assert complete control over an increasingly important area of people’s daily lives so that companies and owners of intellectual property can maximise their profits and ensure that they alone benefit from the creative use of the online space. In the end we will see the complete destruction of the public spaces defined by the Net and we will have lost the freedom which made cultural, social and technical innovation possible online.
This is a tragedy for those of us who operate in the Net’s public spaces. It is the equivalent of the enclosure of common land in the 18th century, depriving the people of space to graze their animals and grow food crops. It is, however, a tragedy which can be avoided, though only through concerted political action. The solution does not lie solely in programming but in activism, organisation and political will. Instead of rejecting trusted systems and hoping that our protests and activism can derail their development and confound the knavish tricks of their proponents, we should embrace them, and ensure that they are developed and implemented in the interests of the people.
There are good reasons for building trusted systems but there are also bad ones. An architecture of control is only as good as those who implement it, and it seems that the people who want to be in charge of tomorrow’s network are interested not in freedom, truth or justice but only in power.
Anyone who is concerned about the future development of the Internet, and who cares about the dot.commons, should work to ensure that laws passed to allow for effective regulation of the Net do not have the side-effect, whether deliberate or inadvertent, of destroying the Net’s public spaces.
Getting Real About the Net
Before we can begin to formulate a plan for maintaining online freedom over a regulated network we need to clear our thinking of two illusions which have distorted so many analyses of the Net’s future.
The first is the temptation to see the Internet as a new world, one in which old ways do not apply and real-world laws, politics and forms of engagement are irrelevant. There is a great temptation to take the idea of ‘cyberspace’ – William Gibson’s word for the space defined by all the connected computers in the world – and treat it as if it were really there.
It is, it must be admitted, an easy trap to fall into, but the reality is that talking about the Net as if it were a place is merely an abstraction, a useful way of encapsulating the many ways we interact with each other over the Network. We do not need to give up this concise and valuable way of thinking about the whole range of activities which we carry out online, but nor should we allow it to lead us to treat online interactions as if they happen away from the real world.
Treating the Internet as a place – as Clay Shirky[1] argues we should – rather than just a communications channel is a useful way of conceptualising it. Acting as if it is a separate universe is just foolish: the space defined by the Net is an extension of our real world, and it inherits many characteristics from that real world.
It is also dangerous to fall for the idea that the Net has any essential qualities which we cannot alter. Essentialism is attractive – we all think we know what makes something a table, or what counts as human. But once you look closely you find that none of the supposedly ‘essential’ qualities is really necessary: a comatose victim of a road accident lacks consciousness (and perhaps even a cortex) but we do not therefore treat them as we do other animals; a step can be a table for someone picnicking in town.
Lawrence Lessig and I agree that networks are entirely human creations, and that the Internet we see today can be reshaped in whatever ways we choose. The direct consequence of this is that we must not allow our political strategy in this area to be determined by a reliance on a particular technical aspect of the Net, because technology will be at the service of political will, not the other way around.
A New Dot.Commons
At the moment most of us think the absence of regulation online is a good thing, because it does not allow corporate control, government interference or excessive regulation. Anonymity is possible, even if it is hard to achieve. Obfuscation is easy.
I think we must accept that this unregulated network is going to disappear over the next five years. It will be killed by business, by government and by the freely made choices of millions of people who will select ‘trustworthy’ systems over promiscuous ones, regulated ISPs over libertarian ones and ‘safe’ applications over ones that can be compromised.
Parents will choose to use an Internet connection that gives them a means of limiting their children’s exposure to adult content, pornographic emails or anonymous chat partners. Governments will choose to make e-government services available only to those whose identity can be authenticated. Film studios and music companies and broadcasters will provide content only over the secure network to trusted systems.
Today’s technologies simply do not allow the degree of regulation and control that the Net requires if it is to become embedded in our lives. They will be replaced by computers that do.
There will always be ways to break the security of even the most secure processor, or get unlicensed code running on your secure processor, just as there will always be people who play with technology and do stuff that is unethical, illegal and cool.
But most of the people, most of the time will be using systems that are secured, signed and regulated. And they will be happy to do so because the benefits will be great: they will have online access to government services, banks, shops, schools and other facilities. They will take advantage of improved ways to block spam, viruses and content they find disagreeable. They will feel safe letting their children surf.
Even if we accept that trusted processors, Palladium-style operating systems, signed code and authorised content will define the online experience for most people, most of the time – and that they will accept and even benefit from that – there needs to be more.
If the Net is a city then let it have its office blocks, children’s parks, schools, tourist areas and suburbia. But we should also have seedy dives, places to buy recreational drugs, smoky meeting rooms in which to plot the overthrow of the state, and hotels that rent rooms by the hour too.
We should not be so arrogant as to dismiss these many benefits of the regulated Net or to despise those who don’t care about running their own code, having secret correspondence or changing the world. It is the mistake that revolutionaries on the left have made for generations – it is not one that we should make now.
Accepting the inevitability of the trusted network is not the same as accepting how it will be regulated and controlled. The fundamental question, and one that will be answered within the next two years, is whether the controls on the new network will be in the hands of governments or corporations, and the answer to that question will largely determine the quality of our online experience.
ENDS
This paper began life as the New Media Knowledge/Cybersalon Christmas Lecture, “Is big business destroying the Internet?”, which was delivered at the Institute of Contemporary Arts in London on 9 December 2002.
Since then it has been through many changes, not least as the result of extended argument, discussion and chat with James Crabtree of the Work Foundation’s iSociety project, Dr Richard Barbrook from the Hypermedia Research Centre at the University of Westminster, and Tom Steinberg, currently ‘somewhere in the administration’.
Find out more about Cybersalon at www.cybersalon.org and about iSociety at www.theisociety.net. Find out more about me at www.andfinally.com or get in touch at bill@andfinally.com.
This paper was written in Cambridge, London and Venice. And on trains, planes and boats in between.