Talking 'bout a revolution

The ancient city of Sravasti, nestled at the edge of the Himalayan mountains in northern India, was a favourite haunt of Nepalese-born Siddhārtha Gautama Buddha. For two decades he came to a monastery just outside the city for his summer retreat; at one side of the monastery was a grove, where he would, on occasion, preach to the other monks present. Sometimes over a thousand would gather to listen to his words.

On one particular occasion he chose to join the other monks in gathering alms, going into the city with a bowl to do so before returning to the cushioned areas of the grove. On his return one of his more revered disciples, known as the venerable Subhuti1, asked him a series of questions. “World-Honored One, if sons and daughters of good families want to give rise to the highest, most fulfilled, awakened mind, what should they rely on and what should they do to master their thinking?” asked Subhuti. As the Buddha gave his answers, they were documented in a short book, for the benefit of all those who could not be present. The title of the book compared the words of wisdom with the perfection and accuracy of a diamond. For centuries the Diamond text, or Sutra2, was painstakingly copied by hand.

Until, that is, some bright spark had the idea of carving the words into blocks of wood, which could then be coated with a layer of ink and pressed against a sheet of material. In doing so, of course, he or she unwittingly changed the world. As we saw with Darius the Great, the history of human conquest has been limited by the ability of those in power not only to get their point across, but also to distribute it over long distances. It is unknown who first ‘invented’ the woodcut, but the practice of making such printing blocks developed at similar times in both Egypt and China, in the first few centuries AD. The earliest printed books we know of date from the seventh and eighth centuries AD, such as the Diamond Sutra itself. Without it, we might have known some vague tales about a historical figure who was renowned for his wisdom. With it, we know what he had for breakfast.

Woodcuts signalled the arrival of mass production, and the beginnings of literacy. Before them, the only way to get a message to the masses was to copy it wholesale, then distribute it as widely as possible via military staff, priests and town criers, all of whom required the ability to read. While woodcuts may have become prevalent in what the West knows as the East, they took a long time to reach Europe. In 1215 for example, when the Magna Carta was signed by the King of England and two dozen of his barons, a team of scribes then painstakingly copied out the document so it could be sent to the key population centres of the nation. Even once distributed, however, messages would not always reach an able audience: the scarcity of scribes went hand in hand with the limited ability of the general populace to read, both of which may have been deliberately maintained by religious and secular authorities who wanted to keep such capabilities to themselves.

In consequence the two notions, supply of written materials and demand through general literacy, went hand in hand. For writing to reach the broader population required a significantly more efficient mechanism of mass production. The principle of arranging blocks of letters and characters in a frame was not that massive an extrapolation from carving out entire pages (indeed it fits with the first two of what Perl creator Larry Wall called the three attributes of a programmer: “laziness, impatience and hubris”). All the same, the idea of moveable type took many more years to develop. It was first identified as an idea in China and Korea, but it was not until the Middle Ages, when Gutenberg developed the printing press, that it really ‘hit the mainstream’ in modern parlance. Gutenberg’s story was like that of the world’s first technology startup: he was short of money and his ideas were challenged at every turn.

But persevere he did, in doing so assuring that the world would never be the same again. Gutenberg’s press immediately solved the problem of having one painstaking process causing a bottleneck to many others. While Roman historian Tacitus’s writings could only3 reach a limited audience, diarist Samuel Pepys enjoyed the advantage of printing to spread his message. Quite suddenly, there was no limitation on spreading the word, a fact jumped upon by individuals who felt they had something to say. Names we may never otherwise have heard of, such as Witte de With4, gained notoriety as pamphleteers — they could express their views to a previously inaccessible audience at relatively low cost, their paper-based preachings becoming a catalyst for establishments set up for discussion and debate (and good coffee) such as Lloyd’s of London, spawning whole industries (in this case, insurance) in the process.

With Gutenberg, the genie of influence was out of the bottle. It is no coincidence that we see the development of mass literacy in the period that followed, as the written word found its place as a fundamental tool of communication among the masses. With pamphlets such as Shelley’s5 “The Necessity of Atheism” in circulation, one can see why the established Church sought to suppress such a devilish scheme. Once the telegraph also came onto the scene, either messages could be printed in bulk and passed out locally, or a single message could be transported to a distant place where it could be typeset and printed (incidentally, a precursor to today’s computer-based mechanisms that balance the constraints of the network with those of local storage). And the French and American revolutions were built on a trail of paper flyers from the likes of Maximilien Robespierre, Thomas Paine6 and Paul Revere. Indeed, alongside demonstrations and battles was fought a ‘pamphlet war’ in which opposing sides presented their views on paper. Biographer Thomas Carlyle called7 the period 1789–95 ‘the Age of Pamphlets’, as authoritarian censorship was washed over by a wave of considered writing.

Once the telegraph did away with the challenges of distance, it was only a matter of time before the considered thoughts of anyone in the world could have an influence on anyone else. Online bulletin boards and Usenet News sites had their audiences, but these were largely closed: even the arrival of broader-market services such as America Online tended to hide debate away in hierarchies of conversation. The advent of the Web started to change things of course, as suddenly the notion of pamphleteering could extend online. According to Wikipedia, one of the first8 ‘socially minded’ web sites was that of the human rights organisation Amnesty International, created in 1994. All the same, however, both cost and technological literacy meant the threshold remained too high for all but a handful of individual voices.

One of which was student Justin Hall, who was inspired by New York Times journalist John Markoff’s article9 on the Mosaic Web browser. “Think of it as a map to the buried treasures of the Information Age,” Markoff had written. Duly inspired, Hall had built a web server for himself, and created his first web page — “Howdy, this is twenty-first century computing…” he wrote10. “(Is it worth our patience?) I'm publishing this, and I guess you're readin' this, in part to figure that out, huh?” Hall was by no means alone, but still, the ability to create HTML pages containing words and hyperlinks remained in the hands of a more technically minded few. Not least Jorn Barger, a prolific Usenet poster and creator of the Robot Wisdom web site, upon which he created a long list of links to content that he found interesting. Barger coined11 the “fundamental” principle of signal-to-noise in posting online: “The more interesting your life is, the less you post… and vice versa.” While this doesn’t say much for his private life, he would not have been particularly uncomfortable among the coffee drinkers of old. He also, as it happened, invented the term ‘weblog’.

Over time, an increasing12 number of individuals and companies started to maintain online journals, posting in them anything that took their fancy. By 1999, when the term ‘weblog’ was shortened to ‘blog’ by programmer Peter Merholz13, they were becoming a phenomenon. Finally individuals, writing about anything they liked, were connecting directly with other individuals on a global basis. Blogs may only have been text- and graphics-based, but they gave everybody everywhere the power and reach of broadcast media. From the Web’s perspective, it was the equivalent of blowing the doors off, resulting in a generation of previously unheard voices suddenly having a say. Unlikely heroes emerged, such as Microsoft engineer Robert Scoble, or journalist and writer David ‘Doc’ Searls. In a strange echo of the established church in Gutenberg’s time, the established community of influencers (including the mainstream press) was none too happy about the rise of ‘unqualified’ opinion. Still, it wasn’t going anywhere — exploiting the sudden release of potential, web sites such as TechCrunch and Gawker came out of nowhere and became billion-dollar businesses almost overnight.

The blogging model has diversified. Today, there are sites for all persuasions and models — for example, the question-and-answer site Quora, or the long-form content site Medium, not to mention YouTube, SoundCloud, Periscope and the rest. There really is something for everyone who wants to get a message out: so many options, in fact, that it doesn’t make sense to consider any one in isolation. And indeed, mechanisms to create and broadcast opinion have become simpler and more powerful, with Twitter’s 140-character limit proving more of a challenge than a curse.

The downside for today’s Samuel Pepyses, of course, is that while they may feel they have a global voice, they are having to compete against millions of others to be heard. In this click-and-share world we now occupy, sometimes a message will grab the interest of others in such a way that the whole world really does appear to take it on: so-called ‘going viral’. In 2014 for example, the little-known Amyotrophic Lateral Sclerosis (ALS) Association invited supporters to have a bucket of iced water poured over their heads in the name of the charity. The campaign proved unexpectedly, massively popular, raising14 some $115m for the charity and resulting15 in sales of ice cubes increasing by 63% in the UK alone. And indeed, as in previous centuries, blogs and social tools are being used to harness public opinion, through sites like Avaaz and PetitionSite.

Even with examples such as UK retailer Sainsbury’s ‘tiger bread’ loaves, which were renamed following a letter from a child (the letter was shared millions of times online, with obvious benefit to the company’s reputation), it is easy to be sceptical about such efforts. The potential for abuse is real: Invisible Children’s Kony 2012 video was watched 100 million times on YouTube and Vimeo in the first month after it was uploaded, only to be widely criticised later for misrepresenting the situation it described. And for every success story, a thousand more fail to grab the imagination. Today’s caring populace has too many pulls on its valuable time, which means campaign sites are looking for more certainty before they launch. Avaaz, for example, adopted a process of peer review, followed by testing suggestions on a smaller poll group, before launching campaigns on a wider scale. Good ideas percolate through, hitting targets of relevance, currency and emotional engagement before making the big time. And if they do hit the big time? Then the cloud can ensure that the sky’s the limit.

The nature of influence extends beyond such examples, as the virality of online social behaviour can affect whole countries, or even regions. In the UK the public backlash against the News of the World newspaper took place online, perfectly illustrating how the relationship between printed and online media has changed beyond recognition. And the harnessing of popular opinion in the Arab Spring might not have been possible without social networks. Of the ousting of President Mubarak in Egypt, Ahmed Shihab-Eldin, a producer for the Al-Jazeera English news network, was quoted as saying, “Social media didn't cause this revolution. It amplified it; it accelerated it.”

Each time that a snowball effect occurs, millions of individuals contribute one tiny part but the overall effect is far-reaching. On the positive side, online influence enables power to the people and new opportunities for democracy: a good example of a referendum-driven society is Iceland, whose draft constitution was devised and tested online. At the same time, online debate is instrumental16 in broadening views on gender, race and other issues. The outcome may not always be so positive, however: the incitement to riot and loot during the 2011 UK riots was partly blamed on social media and the encrypted BlackBerry Messenger tool.

The even darker side of such minuscule moments of glory is that our actions may not always be entirely our own. Psychologists are familiar with how people follow crowds, and there is no doubt that online tools are enabling group behaviour as never before. Not least, we leave ourselves open to the potential for a kind of hive mind. A million tiny voices can drive good decisions, but can also yield less savoury kinds of group-think, which is as exploitable in reality TV as in incitement to riot. The mob should not always rule, as shown in the later stages of the French Revolution.

One technique in particular – the information cascade17 – can lead us to respond in ways we wouldn’t if we actually thought about it. In 2014, Facebook was revealed to have run a controversial experiment to raise the importance of ‘good’ messages in someone’s news feed. What the company found was that people were more likely to click on topics that reflected their own opinion, meaning that the site demonstrated a clear echo-chamber effect. Could this, even now, be used to influence our thinking on social issues?

The notion of influence is also highly open to exploitation. As a technique, propaganda is nearly as old as pamphleteering, so how do we know governments are not influencing social media right now? Obama’s election was massively helped by the careful use of social media: could this be put down to manipulation, or should we simply see it as “he who uses the tools best, wins”? Obama had the loudest megaphone with the best reach, for sure, but was it right that he won on the strength of being better at technology, rather than on the strength of a better manifesto? Social media is already being treated as an important tool in conflict situations such as Gaza18.

One group that is not waiting for the final answer is the media industry, which has its own journalistic roots in pamphleteering. The media only exists by nature of the tools available; indeed, the name itself refers to the different ways we have of transmitting information: each one a medium, more than one media. And social networking is just the latest in a series of experiments around tools of influence. As newspaper magnate Lord Northcliffe famously remarked, “News is what somebody somewhere wants to suppress; all the rest is advertising.” Right now, media researchers are identifying how to accelerate and amplify for themselves, applying such techniques to achieve their own goals and those of their clients. An ever-diminishing boundary exists between such efforts and the manipulation of audiences, customers and citizens, and history suggests that if the opportunity for exploitation exists, some will take it.

It would be a mistake, however, to see corporations as faceless. The changes that are going on within them are as profound as those outside, as technology challenges not only how we live, but how we work.