I gave this talk on 27 May 2009, in Lecture Theatre 1, at the Computer Laboratory, University of Cambridge to an audience that included Maurice Wilkes. It was, as they say, challenging.
The world has moved on since 2009 but I stand by what I said: computation has been loosed upon the world, and we must adapt.
This is how my lecture was advertised:
In May 1959 CP Snow used the Rede Lecture in Cambridge to explore the notion that British society, its education system and its intellectual life were characterised by a split between two cultures, the humanities and the sciences.
Today the divide that matters is that between those who can count in binary and those who can’t, between the culture of the technologists and coders and that of the users. The division is similar to, but not co-extensive with, that identified by Snow simply because most scientists are, thanks to the technological basis of their research, computer-literate, while many of those in the arts, humanities and politics will be wondering what happened to the other eight cultures referred to in the title of this lecture.
Writer and journalist Bill Thompson took the Diploma in Computer Science in 1983 and now describes himself as a ‘technology critic’, straddling the two worlds in his work for the BBC, Arts Council England and others.
In this lecture he will consider what level of understanding of computer science is needed in order to be an effective and engaged member of modern society. Is there a technological equivalent of Snow’s complaint that the literary people of his acquaintance did not know of the Second Law of Thermodynamics?
Should everyone code, or is it enough to understand what Roger Needham meant when he claimed — as he so often did — that every problem in computing can be solved with another level of indirection?
Fifty Years After Snow
Fifty years ago CP Snow stood up in the Senate House, the imposing building on King’s Parade which some of you will have walked past this morning on our Computing ‘treasure trail’, to give his Rede lecture on ‘The Two Cultures’, a lecture that has echoed through the years since and been the subject of a great deal of debate in this, its anniversary year.
Snow spoke at an institution that was then 750 years old, one that could look back on three-quarters of a millennium during which it had worked hard to regenerate our view of the world and our understanding of it.
Fifty years on, and in the midst of the University’s 800th anniversary celebrations, we sometimes wonder what academia is for, especially in a world that seems to value the contribution of scholars so little.
I think the Computer Lab’s own Ross Anderson got it right in his recent essay ‘Cambridge University — Unauthorised History’ when he said:
Just as fire regenerates the forest, so a great university regenerates human culture — our view of the world and our understanding of it. We incinerate the rubbish. And Cambridge has long been the hottest flamethrower; we’re the most creatively destructive institution in all of human history. And big new things come from that. The ground we cleared made us the cradle of evangelical Christianity in the sixteenth and seventeenth centuries, of science in the seventeenth and eighteenth, of atheism in the nineteenth, and of all sorts of cool new stuff since — including the emerging sciences of life and information.
Part of that process is a constant questioning of the assumptions that are so embedded in ordinary discourse that they disappear from view and become part of the ‘common sense’ that has so often misled us in the past.
Another part is the creative imagination that engenders new hypotheses, new explanations, new models and, of course, new technologies. And it is one of those technologies, one that owes a great deal to work done at Cambridge, that has prompted this attempt to recast Snow’s two cultures.
Because when Snow stood up to speak at the Senate House, ‘a bulky, shambling figure’ according to Stefan Collini in his excellent and insightful introduction to the Canto edition of ‘The Two Cultures’, an invention that was to change the world was celebrating its tenth anniversary just a few hundred yards away in the Mathematical Laboratory.
EDSAC, the Electronic Delay Storage Automatic Calculator, ran its first program in May 1949 and for a decade it had been offering data processing services to mathematicians, economists, biologists and anyone else who could think of a use for the radical new technology of the general-purpose stored-program computer. EDSAC 2, which was to replace it, had already been running for a year, foreshadowing the Titan, CAP and many other Cambridge computers.
EDSAC was not alone, with the Mark 1 in Manchester only the most notable of the other computers that were being built around the world. These early computers were the first signs of a revolution that has proven to be just as significant as the industrial revolution that so preoccupied CP Snow, a revolution which we are still living through and the outcome of which is far from certain.
Snow, perceptive as ever, glimpsed what was going on. In The Two Cultures he differentiates between the industrial revolution, which he sees as essentially complete, and a ‘scientific revolution’ which is in its early stages, saying:
I believe the industrial society of electronics, atomic energy, automation, is in cardinal respects different in kind from any that has gone before, and will change the world much more. 
He was right, and we now live in a world that is shaped by the science-based technologies which he identified. However, the sheer scale of that change has also invalidated much of the argument that animated The Two Cultures, and in particular it has stripped away any last vestiges of credibility from his belief that rapid industrialisation was the way forward, as I hope to demonstrate.
What was Snow saying, and what was he trying to say?
Snow’s argument in The Two Cultures, and in his later work on the same topic, most notably the 1963 essay on The Two Cultures: A Second Look, has become such a common trope that it is almost impossible to retain the complexity of the original argument. Like a soft-bodied creature flattened and fossilised in the Burgess Shale we find it hard to reconstruct the internal organs of a discourse that was not only of its time but also shaped by the particular history of one man in post-imperial England.
As I read it, his concern about the breakdown of communication between literary intellectuals and scientists was not driven by a general desire to see harmony and shared thinking in the halls of academe but came from his observations as a former practising scientist, as a writer of some note and as technical director of the Ministry of Labour from 1940 to 1944 and a civil service commissioner from 1945 onwards.
Snow believed that science-based technologies could be applied to solve the world’s problems as he saw them, and that industrialisation was capable of removing the division between rich world and poor world. The scientific ignorance of the chattering classes mattered to Snow because it had real practical consequences, and the denial of culture by scientists was a cause for concern since it limited their imaginative creativity when it came to finding applications for their discoveries.
He also thought that things would inevitably get better:
Life for the overwhelming majority of mankind has always been nasty, brutish and short. It is so in the poor countries still.
This disparity between the rich and poor has been noticed. It has been noticed, most acutely and not unnaturally, by the poor. Just because they have noticed it, it won’t last long. Whatever else in the world we know survives to the year 2000, that won’t. Once the trick of getting rich is known, as it now is, the world can’t survive half rich and half poor. It’s just not on.
This belief was not simple naivety.
In 1959 it was difficult to see just how hard the affluent industrialised countries would fight to preserve their privileges, how the geopolitics of the Cold War required keeping many countries in poverty in order to achieve perceived advantage, and how the realisation that human impact on the biosphere was significant enough to threaten the survival of the species would make rapid and untrammelled carbon-based industrialisation no longer acceptable.
It was also impossible to predict just how much the electronic technologies to which he referred would change the rules of the game. In 1959 Snow could write:
One truth is straightforward. Industrialisation is the only hope of the poor.
This may have seemed the case at the birth of the digital age, but it is no longer our core belief, partly because we are becoming aware of the negative consequences of the industrial age but also because we see that another way is possible. We are living through a digital revolution, and the use of computers is having an impact on all aspects of our lives and on the societal structures that are being built in all the countries of the world.
Twenty-five years ago, half way between Snow’s lecture and today, I was completing the Diploma in Computer Science and being inducted into the modern freemasonry of programmers. I learned Pascal from Frank King and BCPL from its creator, Martin Richards, and went off to work for a software company on King’s Parade — in Sinclair and Acorn’s old offices — coding in C on a UNIX system.
I have watched the industrial world become the networked world, a world that depends on electronics and digital processing just as that of Snow, like that of Orwell, depended on steam to power the ships, trains and machines in factories.
Of course underneath it all we still depend on coal — and oil and gas — to generate the voltage differences that drive the electrons through our circuits and make the processing possible, but the fact that the electricity generated no longer animates copper coils in magnetic fields to move machines but instead opens and closes gates in transistors etched on silicon is what matters most.
A Digitised World
The modern world is bit-driven in the way that the nineteenth century was steam-driven. Those with access to digital technology are able to dominate those without, not by creating expansionist empires as in the first industrial revolution, but by creating the structures of the global economy around their perceived interests. The gunships and civil servants may have been replaced with copyright treaties and WTO sanctions, but the effect is much the same.
It is over a hundred and fifty years since Karl Marx and Friedrich Engels outlined a theory of history which claimed that the economic base, or infrastructure, of a society shaped what the rest of society would be like. They believed that politics, culture, family structure, the mass media and everything else — what they termed the ‘superstructure’ — depended on the way the economy works, so that if the economic base changed then daily life would also change.
We need not accept the political philosophy which Marx built around his economic model to realise that this is happening now, that daily life and cultural structures are changing as the underlying economy transforms from post-industrial to digital capitalism.
The network, built and designed to permit fast exchange of information between companies, built to facilitate financial transactions both wholesale and retail, has become a conduit for individual self-expression and the result, at least in the developed West where access is becoming universal, is that political processes, media models and the assumptions of everyday life are being changed.
In 1999 I wrote a comment piece for trAce, the online writing community at Nottingham Trent University, in which I argued that the really important writing of the latter part of the twentieth century was not prose but code.
The literature that matters is not the work of Joyce or Piercy or DeLillo or Weldon or any of the authors we celebrate. The really important literature, from the 1950s onwards, was written by tens of thousands of programmers writing software for the computer systems that now underpin our world. It was lines of COBOL (for the business users), Fortran (for the scientists, mathematicians and engineers) and C (for everyone else) that changed the world.
The programs written in these languages enabled us to fly to the Moon and back, allowed global capitalism to triumph over communism — whether you think this good or bad is irrelevant — and allowed the security services to listen in to our phone calls. They made the Internet possible, so that the few thousand lines of code written by British physicist Tim Berners-Lee in 1989 to implement his ‘World Wide Web’ have changed the world far more than any novelist or poet could.
At the time I was just trying to stir things up, but in the ten years since it was published I think my argument has been strengthened by the massive growth in internet use, the widespread adoption of mobile technologies like smartphones and of course the complete and comprehensive rewiring of the world’s financial and regulatory systems around the affordances of information and communications technologies.
The point of literature is to reflect, analyse and endorse universal or shared experience, while the point of code is to make or enable a process, and I don’t want to conflate the two, just to note that code has an importance that is often overlooked.
The serious issue is that much of this important writing is closed off to the majority of those whose lives are shaped by it. Anyone with a basic level of literacy can read a poem or a novel, but the lines of code that shape our existence are meaningless to most.
While the key writing of the twentieth century may have been the works of Joyce and Beckett, the key writing of the current era is exemplified by Apache — which I can show you — and Windows 7 — which I can’t. If one of the triumphs of the industrial revolution was the move to almost universal literacy then the current failure of the digital revolution is surely that most of humanity are functionally illiterate when it comes to code, and that many do not even realise that there is a language there to be understood. It is time that we changed this.
It is probably clear by now that there are only two groups here, not ten. The 10 in my title is the sort of geeky joke that appeals to those on the digital side of the dividing line, marking a separation from those who still inhabit an analogue and profoundly decimal world.
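For those on the decimal side of the line, the joke is quickly unpacked: in binary the numeral ‘10’ denotes two. A couple of lines of Python, offered here purely as an illustrative sketch, will confirm it:

```python
# The title's joke: in binary, the numeral "10" means two.
print(int("10", 2))        # parse "10" as a base-2 numeral: prints 2

# Counting from zero to four, decimal alongside binary:
for n in range(5):
    print(n, format(n, "b"))
```

Run it and the counter reads 0, 1, 10, 11, 100 — the whole of my title’s arithmetic in five lines of output.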
Knowing how to count in binary, useful though it may be, is not enough. Today the key skill, the one that transcends all others once language, reading and numeracy are acquired, is what can best be termed computer literacy. Not what is called ICT in schools. Not the ability to turn on a computer, lay out a spreadsheet or enter data into a form, but an appreciation of how computers and networks work and how the basic principles of computer operation result in rich, complex, interlocking systems.
For Snow the division between artists and scientists in British culture was important because it meant that those who were charged with managing and planning the economy were poorly equipped to do so. The division between those who understand computing and those who do not seems to me to matter more, and could have more impact on the way modern society develops than that between Snow’s parodic literary intellectuals and scientists. It is a division that grants power to those who understand how the network society is built, who appreciate the deep meaning of Lawrence Lessig’s mantra that ‘code is law’, and who see that the choices we make in protocols, interfaces and implementations directly determine the capabilities of the tools we put into the hands of others. There are no guarantees that those granted power under the new dispensation will use it wisely.
The point is not that everyone should be a programmer, but that everyone should understand what it is that programmers do and how their work is embedded in a broader way of thinking about the world — what can best be called ‘systems thinking’.
In addition, just as anyone who is literate can read Shakespeare, even if they do not understand all of it, so anyone who is systems literate should be able to read code, to appreciate its structure and the algorithms and behaviours expressed there. After all, if code is law then an inability to read the code implies ignorance of the law, with all the risks that entails.
My two cultures are not the same as those identified by Snow, and the issues that I am trying to address are also different, but we are both concerned with the exercise of power in society.
Snow believed that the separation that concerned him emerged in the twentieth century from the growth of science and a consequent scientific careerism that required specialisation from an early age and an almost wilful ignorance of all things outside, especially high culture. He thought the inability to engage in culture came partly because scientists saw no utility in it, for their careers, for their lives or for the wider society.
Overspecialisation in the British — or rather the English — education system was allowed to run unchecked because it served the interests of the universities: those destined for power were offered a classical/humanities education, while those destined for science or engineering were pushed in the other direction. Of course he also noted that the system was different in the US and USSR. China and India did not feature, as The Two Cultures is very much an essay of its time.
The world has changed since 1959, and the context within which Snow was speaking has vanished, most notably in the period since 1989 when the Berlin Wall came down and the Soviet Union imploded.
Yet since 1983, when the Internet was created, and 1991, when the World Wide Web was invented, a new world order has emerged, one based around digital networks and one that creates an entirely new set of issues.
In the period since I was studying for the Diploma the digital world has advanced so rapidly that anyone who did not fully engage has been left behind. And the pace shows no sign of letting up.
For me the key division is between those who know what coders do and those who do not, and the question to ask someone is not ‘what is the second law of thermodynamics?’ but ‘what is a recursive function?’, but others see the dividing line in a different area.
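For those who have never had the question put to them, a recursive function is simply one defined in terms of itself, with a base case to stop the self-reference. A minimal sketch in Python, using the textbook factorial example:

```python
def factorial(n: int) -> int:
    """A function defined in terms of itself: the hallmark of recursion."""
    if n <= 1:                       # base case: stop the self-reference
        return 1
    return n * factorial(n - 1)      # recursive case: a smaller instance of the same problem

print(factorial(5))  # 5 * 4 * 3 * 2 * 1 = 120
```

Being able to read those five lines, and to see why the base case matters, is the sort of literacy I have in mind.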
John Naughton, fellow of Wolfson College here and Professor of the Public Understanding of Technology at the Open University, believes that the real ideological division is between those who welcome open networks and open standards and those whose thinking is closed. He says:
the gap is not between the humanities and the sciences but those who are obsessed with lock-down and control, on the one hand, and those who celebrate openness and unfettered creativity on the other.
And goes on to note that
The odd thing is that one finds arts and scientific types on both sides of this divide.
John refers to James Boyle’s description of the divide between those who are ‘culturally agoraphobic’ and those who are not. Boyle, the William Neal Reynolds Professor of Law and co-founder of the Center for the Study of the Public Domain at Duke Law School, contrasts those who are willing to embrace the creative potential of the Internet with those who favour planning and management.
In one thought experiment he asks which of us would have chosen an open source, user-edited collection of the world’s knowledge over the profitable but rigidly controlled Encyclopaedia Britannica in the early days of the Internet.
For Jonathan Zittrain the division is inherent in the technology, and lies between the closed and tethered systems of Apple’s iPhone and games consoles and the open, generative potential of Linux running on a generic PC.
All of these are real and important distinctions, but choosing intelligently between an open and a closed network or a generative versus a tethered system is surely only possible if we understand enough about computers and how they work to be able to make sensible decisions. Otherwise we are relying on the positive connotations of the word ‘open’ to distinguish it from ‘closed’, and as any proponent of closed networks will tell you ‘open’ also implies open to abuse, to viruses, to phishing and to content that could anger or offend.
We cannot evaluate the work of the Internet Watch Foundation, the voluntary body that censors the UK internet, or decide whether Phorm should be allowed to inspect our packets or Virgin Media throttle our connections or the French HADOPI terminate the internet connections of customers accused of copyright infringement unless we have some level of understanding of what the network is, about the different layers that make up the TCP/IP stack, about protocols and data structures and computation and programs. And about programmers.
Why it matters
I trust that most of you will accept that the distinction I’m describing is real, that there is a gulf between the majority of those who use computers here in the UK and other developed countries and those who can write or even read code.
However you may think that the distinction, even if it exists, does not matter that much and certainly does not merit any real attention. I hope that I can convince you otherwise.
I drive a car but I don’t understand it. When the clutch needed replacing I could not tell if a repair was possible, if the work was needed or even if it was well done. When the brakes ‘broke’ I had to trust in KwikFit, who told me that the nut on one of my ‘calipers’ was seized and charged me £300 for the privilege of having them break the relevant part so they could replace it.
Those who do not understand computers are in the same position as I was with respect to my car, and when they have problems with their home or office computers they too may find themselves unable to determine what is reasonable.
But the importance of computers and computerised systems in so many areas of modern society means that they are also exposed in other ways. If they cannot understand the systems that are used to record, monitor and control so many aspects of their lives then how can they play a full part in the debate about the development and deployment of those systems?
Without this understanding people are in danger of giving credence to absurd and inflated claims about the capabilities or dangers of new technologies, falling for media scares like Susan Greenfield and Aric Sigman’s nonsensical and self-serving speculations or indulging in moral panics over new tools, services and technologies and the ways they threaten us. They are likely to acquiesce to the procurement of vastly oversold IT systems like NHS Connecting for Health or the national identity database, to accept that ‘the computer says no’ when faced with poor customer service, and to miss the many opportunities that computerisation offers to make the world a better place.
It is time to do something about this.
Similar but Different
As an aside, it is worth noting that the two cultures I’m identifying do overlap quite significantly with Snow’s two cultures. A couple of weeks ago I had supper with the writer Martin Amis after he’d given a talk at an event organised by Writers’ Centre Norwich, where I sit on the board. During his talk one of the audience had asked a question in which they mentioned something Amis had posted on Twitter, so I checked and found that there was indeed a Twitter user called ‘MartinAmis’.
Over supper I enquired whether it was really him, and was told that not only did he not use Twitter, he didn’t know how he might use it and was only just capable of reading his emails. The similarity to Snow’s tales of literary intellectuals who boasted of their ignorance of basic physics was not lost on me.
I doubt that many scientists could cope without email, while most hard science requires programming skill, and all of it requires more than user-level familiarity with hardware and software. SPSS, Matlab and Octave are all really just high-level development environments, while the internet came out of the scientific research community, and programmers and software engineers are in many respects the intellectual children of the physicists, mathematicians and engineers of Snow’s day.
However co-extensiveness does not imply identity, and the distinction to which I am hoping to draw attention is not the same as that identified by Snow, even if the relevant populations are similar.
Snow’s division between physical scientists and literary intellectuals has been replaced by the gulf between the geeks and their managers, between the coders and the policy-makers, between those who pursue technology-based scientific research and those enacting laws that fail to account for the transformations brought about by the resulting science-based technologies.
Like the engineers of the Industrial Revolution, who created the world over which the classically trained managerial classes held sway, the programmers have given us a digitised, networked and connected world full of wonder and opportunity.
Anyone can use a modern computer with a modest amount of training and practice, but genuine digital literacy is not about using. It is about understanding, exploiting, changing and specifying. This is why open source is important and free software is vital if all citizens are to appreciate the affordances and the limitations of digital technologies and exert control over them.
The Programmers Are Revolting
It’s important to engage because the revolution is all around us. At the event I mentioned earlier Martin Amis read from his new novel, to be published next year, called The Pregnant Widow. The title comes from a rather patriarchal observation by Alexander Herzen that countries in revolution are like those where the king dies leaving a pregnant widow — the heir is apparent, but not yet present.
In a recent interview in The Independent Amis offered the following explanation:
Alexander Herzen said that after a revolution we should, on the whole, be braced by the fact that one order has given way to another; but what we are left with, he added, is not a birth, not a newborn child but a pregnant widow — and there will be much grief and tribulation before we hear the baby’s cries.
Clay Shirky said something similar in his essay Newspapers and Thinking the Unthinkable when he pointed out that during revolutions old ways of doing things get broken faster than the new ones can replace them. We see this in the flailing stupidity of the music and film business, in the ridiculous news that newspapers are to be offered tax breaks that are denied blogs, in the incoherent ramblings of Rupert Murdoch, a man who understands print and broadcast but now froths at the mouth like Lear on the blasted heath, railing against the new media.
History doesn’t end, but eras do. Between 1992 and 2002 we saw the end of the post-war era and the first flowering of the network world, and Web 2.0 technologies are now taking us into a new phase of social and economic development in which the network is both the cause and the product of globalisation and a key driver of what the economist Joseph Schumpeter called ‘creative destruction’.
This has significant implications for all of us. Sometimes areas of business and even whole areas of the economy simply cease being useful or necessary, and their practitioners must look elsewhere for inspiration or employment or profit, because no form of business, not even the mightiest company or the most philanthropic publicly funded corporation, has any right to existence.
Today we are seeing massive and continuous innovation both online and offline thanks to the network and the technologies that it supports and sustains, and those who do not understand and appreciate how those technologies work will be left at a significant disadvantage.
Those who were granted power under the old dispensation will either lose it or find themselves unable to exercise it in the ways they previously could, while those who understand and control the technologies may find themselves granted influence that they do not merit.
Let us return to Cambridge, the source of so much destruction but also the root of creativity and inspiration. Ross Anderson sees the university as a machine for inflicting creative destruction on industries, ideologies and academic disciplines. If the primary purpose of this institution is to break things, then it is clear that digital technologies are the next big toolset for breaking the established order.
But Ross believes that the destruction is a way of ‘clearing the ground’ for what will come afterwards. Can Cambridge be a source of renewal and help to bridge the new cultural divide?
Britain is a small offshore island in the large European Union, and compared to the power of the United States — even during the current economic unpleasantness — and the growing importance of the countries like China, Brazil and India what happens here would seem to count for little. We comfort ourselves with talk of ‘creative industries’ and ‘knowledge economies’ in order to ease the pain that would come if we acknowledged our loss of manufacturing capability and the parlous state of our financial services ‘industry’.
But we are and remain a significant node in the emerging network world, and the influence of our culture and scholarship is still strong, so perhaps the way we do things here can have an influence over the wider world. Perhaps we can find a way to reconcile the different worldviews that seem to animate the programmer and the non-programmer.
A Programme for Change
I went to university in the late 1970s at a time when the recommendations of the 1963 Robbins Report had been put into effect, so that the state offered an advanced education to anyone who could benefit from it. I was lucky enough to make it from my comprehensive school in a dying steel town to Cambridge University in that brief period when the UK believed that educating those who could benefit from it was advantageous for the wider society, just before Thatcher and Blair between them tore the system of student support apart.
My first degree was in Philosophy and Experimental Psychology, and the first time I touched a computer was on the top floor of the Psychology Lab where I used Acorn Atoms connected to one of the earliest Econet networks to control an experimental rig. I programmed in a real-time variant of BASIC called, I think, ONLI-BASIC. I did it because nobody told me it was hard.
Two years later the Diploma taught me how computing works all the way down to the silicon, and over the years I have realised that this is what has enabled me to keep up. The knowledge I gained twenty-five years ago has supported me throughout my career in computing, journalism, new media and policy-making. It is the difference that made the difference for me, between being just another ill-informed hack and someone who understands and can contextualise.
We live in a world where science-based technologies have led to technology-based science and the positive feedback that results allows us an unparalleled ability to manipulate the natural world, to transcend our evolutionary history and shape the small area of the universe that we currently inhabit to meet our emerging needs.
We are making key decisions about how we organise the world, what we will do about the shifts in the biosphere caused by the runaway industrialisation which Snow saw as the solution to the world’s problems, and how we can remove want and suffering in a world that may soon hold ten billion humans.
We increasingly rely on digital technologies to help us, and projects like Andy Hopper’s ‘Computing for the Future of the Planet’ offer an insight into the many different ways computers will be used to solve problems, avoid difficulties and aid progress.
Policy, practice and spending decisions rely on understanding the options, so we cannot have politicians, businesses or citizens operating in ignorance of what these things are, how they work or what they can or can’t do. We need to ensure that those in power now are properly educated, and we also need to give young people the tools and awareness needed to shape the world they will inherit from us.
We also need the political system to respect and ingest geek wisdom, and not just in Tom Watson’s enclave in the Cabinet Office where the Power of Information Task Force beavers away.
This is vitally important, but bootstrapping a new world in which an understanding of computing is seen as part of basic literacy is an immense task, not least because relatively few people even know what bootstrapping is.
Perhaps we need a set of initial orders instead, on the lines of David Wheeler’s originals from 1949, something that will take an awareness of the central principles of computer operation and load it into the popular consciousness so that more detail can be added.
Infecting Popular Culture
There are a number of things that could be done to dispel the mythology, reduce the degree of mutual ignorance and suspicion and encourage new forms of literacy. One option would be a campaign of popular education.
We already have SysAdmin Day on 31 July, so perhaps we can also have Hug a Geek Day, or a day to encourage everyone to Embrace your Inner Coder?
Perhaps those of us here who do know why memcpy() is so dangerous should talk coding practices at supper parties just as one would dissect the use of an unreliable narrator in a popular novel, or explain to our friends why understanding inheritance in C++ is as important as it is when reading Bleak House.
We could also work harder to build an awareness of computing and computer science in popular culture. Why is it that not one of the youngsters of Albert Square is heading to university to read Computer Science? Jennifer Archer runs the Ambridge website, though clearly using a simple template-driven content management system, and who remembers that Robert Snell originally ran a software house?
Is it time for Buffy the Vampire Slayer to become Buffy the Buffer Overflow Slayer, or even Buffy the Bounds Checker? Should Kevin Smith’s remake of Clerks feature Jay and Silent Return?
Engineers have the Enterprise’s Scotty and Firefly’s Kaylee, but coders have nobody in popular culture. Yet Andy Hopper could be a cult hero all by himself — flying in to squash bugs, computing to save the planet…
Or perhaps not.
Whatever it is, we need it soon, because a digital revolution is sweeping over us.
Things fall apart, the centre cannot hold, and computation is loosed upon the world. The division between the scientists and the literary intellectuals mattered to Snow because the elite who he believed would run the world were drawn from the scientifically ignorant and therefore likely to fail. Today we look for a fairer world, so the task is not to educate the oligarchs and central planners but to ensure that every single one of us understands how this remarkable, transformative technology works. Only then can each of us be a full participant in the debate over how we change the world.
I don’t have a programme, like Snow, to industrialise the world as a solution to our many problems, and I don’t choose to impose the will of the western managerial classes on the ‘poor’ of the ‘developing’ world. Instead I believe that if people are to make their own choices they need to understand computers, because our world is being remade by the capabilities and affordances of digital technologies. Just as a farmer needs to know how to fix the tractor used to plough the fields, so anyone who wants to shape the world rather than be controlled by others needs to know what it is that coders are up to.
If that is granted, much else will follow.
The Two Cultures is the title of an influential 1959 Rede Lecture by British scientist and novelist C. P. Snow. Its thesis was that the breakdown of communication between the “two cultures” of modern society — the sciences and the humanities — was a major hindrance to solving the world’s problems. As a trained scientist who was also a successful novelist, Snow was well placed to articulate this thesis.
The talk was delivered on 7 May 1959 in the Senate House, Cambridge, and subsequently published as The Two Cultures and the Scientific Revolution. The lecture and book expanded upon an article by Snow published in the New Statesman of 6 October 1956, also entitled The Two Cultures. Published in book form, Snow’s lecture was widely read and discussed on both sides of the Atlantic, leading him to write a 1964 follow-up, The Two Cultures: And a Second Look: An Expanded Version of The Two Cultures and the Scientific Revolution.
Snow’s ideas were not without critics, however. For example, in an essay published in The Spectator, the literary critic F. R. Leavis dismissed Snow as a “public relations man” for the scientific establishment. On the other hand, The Times Literary Supplement included The Two Cultures and the Scientific Revolution in its list of the 100 books that most influenced Western public discourse since the Second World War.
 The Two Cultures, Cambridge University Press, Canto edition, 1993. ISBN-13 978–0–521–45730–9, paperback.
 p. 30, Canto edition of The Two Cultures.
 p. 42, Canto edition.
 p. 25, Canto edition.
 See for example Karl Marx’s Theory of History: A Defence, Gerald Allan Cohen, 1978, Oxford University Press.
 See his recent Observer article at http://www.guardian.co.uk/media/2009/may/03/digital-media-john-naughton, or his earlier comments from his 2002 address at University College Cork, at http://www.ucc.ie/opa/naughton.htm
 According to Schumpeter, creative destruction is a ‘process of industrial mutation … that incessantly revolutionises the economic structure from within, incessantly destroying the old one, incessantly creating a new one’.