The world at your fingertips

When Darius the Great became king of the Persian empire in 522 BC, he recognised how geography limited his ability to communicate his wishes. As historian Robin Lane Fox notes, “Centralised rule is the victim of time and distance1.” One of Darius’ most significant achievements, as a consequence, was to construct what became known as the Royal Road, stretching from his first capital, Susa, to the distant city of Sardis, some 1,677 miles away. With regular stations about 15 miles apart along the route, a relay of couriers on horseback could cover the entire distance in seven days. “There is nothing mortal which accomplishes a journey with more speed than these messengers, so skilfully has this been invented by the Persians,” gushed2 Greek historian Herodotus. “Neither snow, nor rain, nor heat, nor darkness of night prevents them from accomplishing the task proposed to them with the utmost speed.”

The road thus became the earliest established postal service. Economic growth across the region increased as a consequence, noted3 historian Craig Lockard, helping the Persian Empire become the largest the world had seen. But just as swords can cut two ways, so the road was also a significant factor in the demise of the Empire some 200 years later, when Alexander, also known as the Great, took advantage of its well-maintained state to pursue his arch-enemy, King Darius III of Persia. To add insult to injury, Alexander used the very same courier system to co-ordinate his own generals.

Other systems of communication existed — such as so-called fryktories (signal fires) and even a complex system of semaphore, as documented by Polybius in the second century BC:4

“We take the alphabet and divide it into five parts, each consisting of five letters. There is one letter less in the last division, but it makes no practical difference. … The man who is going to signal is in the first place to raise two torches and wait until the other replies by doing the same. These torches having been lowered, the dispatcher of the message will now raise the first set of torches on the left side indicating which tablet is to be consulted, i.e., one torch if it is the first, two if it is the second, and so on. Next he will raise the second set on the right on the same principle to indicate what letter of the tablet the receiver should write down.”
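
The scheme Polybius describes maps directly onto what is now known as the Polybius square: five tablets of five letters, with each letter signalled as a pair of torch counts. A minimal sketch of the encoding, assuming for readability the 25-letter Latin alphabet (I and J merged) in place of the Greek alphabet of the original:

```python
# Sketch of the torch-signalling scheme Polybius describes: the alphabet is
# split into five tablets of five letters, and each letter is sent as a pair
# of torch counts (left torches = which tablet, right torches = which letter
# on that tablet). The Latin alphabet with I/J merged stands in for the Greek.

ALPHABET = "ABCDEFGHIKLMNOPQRSTUVWXYZ"  # 25 letters, J folded into I

def to_torches(message: str) -> list[tuple[int, int]]:
    """Encode a message as (left-torch, right-torch) count pairs."""
    signals = []
    for ch in message.upper().replace("J", "I"):
        if ch not in ALPHABET:
            continue  # the scheme carries no spaces or punctuation
        idx = ALPHABET.index(ch)
        tablet, position = divmod(idx, 5)
        signals.append((tablet + 1, position + 1))  # counts start at one
    return signals

print(to_torches("POLYBIUS"))
```

Each pair tells the receiver which tablet to consult and which letter on it to write down, exactly as the quoted passage describes.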

However, it was ultimately via well-kept roads that empires grew and could be sustained. Whereas fires and flags served their singular purposes well, the roads provided extra versatility and, above all, reliability — particularly important messages could be sent with additional security, for example. Nobody knew this better, nor used such systems more effectively, than the Romans, whose famously straight roads came about primarily to aid the expansion of empire. Indeed, Emperor Diocletian created a new legion5 in about AD 300 specifically to protect the Royal Road at the point where it crossed the Tigris river at Amida in eastern Turkey, the modern-day city of Diyarbakır.

The road-based system used throughout ancient times did have its limitations, however. Either long messages (such as whole manuscripts, carefully copied by hand) could be carried over long distances to small numbers of people, or short messages such as edicts could be sent more quickly and broadly, relying on their distribution at the other end of the line. The whole thing was a trade-off — either say a lot to people who were close, or a little to people who were further away. For all their constraints, such models of tele-communications (tele- from the Greek, meaning “at a distance”) lasted some two millennia, with even the invention of the printing press (which we shall look at later) doing little to improve matters.

The real breakthrough came at the turn of the nineteenth century, when Alessandro Volta6 stumbled upon the creation of the electric circuit. It is hard to imagine a discovery more profound, nor more strangely circuitous. At the time electricity was all the rage among the intellectual classes across Europe and America, and theories about how it worked were legion. Volta’s first breakthrough came in 1778, when he worked out that static electricity could be stored in a device he called a condenser (today we’d call it a capacitor). Then, in 1792, he set out to disprove7 another theory, from Luigi Galvani, who reckoned that animals possessed a unique property called animal electricity — Galvani had been able to demonstrate, in an experiment typical of its time, that frogs’ legs could still move when they had been separated from the frog.

Over a period of months Volta conducted a wide variety of experiments, many at random, before achieving the singular breakthrough which would mark the inception of the modern world. Having no doubt worked through a fair number of amphibians, he discovered that a frog’s leg would indeed move if strips of two different metals (copper and zinc) were applied to it. Having surmised that the metal strips were creating the electricity in some way, rather than the tissue of the frog, Volta only required a short hop (sorry) before discovering that a more reliable source of electric current could be created by replacing the disconnected legs with salt solution.

The resulting explosion of discoveries included the invention of the electromagnet by Englishman William Sturgeon in 1825, thirteen years after which Samuel Morse demonstrated the first workable telegraph8. It took a further five years for the public authorities to commission a longer-distance telegraph line, at a cost of $30,000. Using a coded system of dots and dashes that would forever more be known as ‘Morse Code’, the first message to be sent across the wire, on 24th May 1844 — chosen by a young woman called Annie Ellsworth — was the biblical9 (and quite prescient) “What hath God wrought?”
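
Morse’s system of dots and dashes is simple enough to sketch in a few lines. The table below covers just the letters needed for that famous first message; by convention here, letters are separated by spaces and words by ‘ / ’:

```python
# A minimal sketch of Morse encoding, limited to the letters needed for the
# first telegraph message. Dots and dashes are written as '.' and '-'.

MORSE = {
    "A": ".-", "D": "-..", "G": "--.", "H": "....", "O": "---",
    "R": ".-.", "T": "-", "U": "..-", "W": ".--",
}

def to_morse(text: str) -> str:
    """Encode text: letters separated by spaces, words by ' / '."""
    words = text.upper().split()
    return " / ".join(" ".join(MORSE[ch] for ch in word) for word in words)

print(to_morse("What hath God wrought"))
# .-- .... .- - / .... .- - .... / --. --- -.. / .-- .-. --- ..- --. .... -
```

The genius of the code lay in giving the commonest letters the shortest symbols, so an experienced operator could send messages remarkably quickly.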

The still-nascent USA was an ideal starting point for the telegraph, being such a vast, sparsely populated country. It still took a goodly while for the mechanism to achieve any level of ‘mass’ adoption, however — while demand was high, the distances that cables needed to cover in order to be practical were simply too great. In August 1858 a message was sent from the UK to the US (once again with a biblical theme — “Glory to God in the highest; on earth, peace and good will toward men”). In the meantime, some in the US felt it viable to establish a trans-continental Pony Express service — this came in 1860 and had a route to rival that of ancient Persia, going from St Joseph, Missouri to Sacramento, California, some 1,900 miles in total. Once again, stations were set along the way, between five and 30 miles apart, a distance set by the practicalities of horse riding. The first letter, addressed to a Fred Bellings Esq. and stamped April 3rd, took 10 days to reach Sacramento.

When Abraham Lincoln made his inaugural address10 as president on the eve of the American Civil War, it was transmitted via a hybrid of old and new — it was first telegraphed from New York to Nebraska, then carried by the just-founded Pony Express to California, where it was telegraphed on to Sacramento. While the civil war may have been involved in the demise of the Pony Express only two years after its launch, the telegraph was already being reeled out across the continent — indeed, just two days after the transcontinental telegraph was finished, on 24th October 1861, the courier service folded. “The transcontinental telegraph put the Pony Express out of business in the literal click of a telegrapher's key. That's not an exaggeration,” commented11 author Christopher Corbett.

Quite quickly the telegraph extended across North America, even as it did the same across continental Europe and beyond. The notion of a simple wire and an electromagnet led, as we know, to innovations such as the telephone patented by Alexander Graham Bell in 1876, followed shortly after by the invention of radio. Alongside other inventions — the LED, the fibre-optic cable and so on — humanity now had everything it needed to transmit larger quantities of information, both digital data and content (text, graphics, audio and video), across the globe.

The proliferation of mechanisms of information transfer has also led to an explosion of approaches12, protocols and standards. One of the clever bits about data — it being ultimately a set of zeroes and ones with some topping and tailing — is that any transmission standard can be interfaced with, or integrated with, any other. As a consequence, in the 1990s, a familiar sight for data networking engineers was a wall chart covered with boxes and arrows, showing how different standards interfaced.

Until relatively recently, the real contention was between standards that sent a constant stream of data between two points, and standards that chopped data into packets and sent them off, reassembling them when they reached their destination. The latter model had several advantages, not least that packet-based networks operated like a mesh, so packets could be routed from any node to any other — if a node wasn’t working, the packets could find another route through. Whereas with a point-to-point network, if the route failed for whatever reason, the connection would be lost — something we have all experienced when a telephone call gets dropped. Trouble was, packet-based networks couldn’t work fast enough to send something like an audio or video stream. If packets did get lost they had to be sent again, adding to the delay. The trade-off was between a faster connection that could fail outright, and a slower, choppier connection that was far less likely to fail.
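
The packet model described above can be sketched in a few lines: chop the message into numbered packets, let the network deliver them in any order, and reassemble by sequence number at the far end. This is an illustration of the basic idea only; real protocols add headers, checksums and retransmission on top:

```python
# Illustrative sketch of packet switching: a message is split into numbered
# packets, which may arrive in any order (simulated here with a shuffle),
# then reassembled by sequence number at the destination.
import random

def to_packets(message: bytes, size: int) -> list[tuple[int, bytes]]:
    """Split a message into (sequence-number, payload) packets."""
    return [(seq, message[i:i + size])
            for seq, i in enumerate(range(0, len(message), size))]

def reassemble(packets: list[tuple[int, bytes]]) -> bytes:
    """Rebuild the original message regardless of arrival order."""
    return b"".join(payload for _, payload in sorted(packets))

packets = to_packets(b"What hath God wrought?", size=4)
random.shuffle(packets)      # the network may deliver packets in any order
print(reassemble(packets))   # b'What hath God wrought?'
```

The sequence numbers are what make the mesh routing possible: since order is restored at the destination, each packet is free to take whatever route is available.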

As networks gradually became faster, it was only a matter of time before packet-based protocols ruled the roost. Or, should we say, a pair of protocols in particular, known as Transmission Control Protocol (TCP) to handle the connection, and Internet Protocol (IP) to manage the route to the destination. TCP/IP was already some two decades old by this point — the standard was defined in 197513 by US engineers Vint Cerf and Bob Kahn, and had already become well-established as the de facto means of data transfer between computers.
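
The division of labour between the two protocols can be seen in even the smallest program: an IP address names the destination, while TCP provides the reliable, ordered byte stream between the endpoints. A minimal sketch using the loopback address, so it runs entirely on one machine:

```python
# A minimal sketch of TCP/IP in action: IP (here, the loopback address
# 127.0.0.1) identifies the destination machine, while TCP provides a
# reliable, ordered byte stream between the two endpoints.
import socket

# A listening endpoint, bound to an ephemeral port on the loopback interface.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))
server.listen(1)
host, port = server.getsockname()

# A client connects to that address and sends a message down the stream.
client = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
client.connect((host, port))
client.sendall(b"What hath God wrought?")
client.close()

# The server accepts the connection and reads the bytes, in order, intact.
conn, _ = server.accept()
data = conn.recv(1024)
print(data)  # b'What hath God wrought?'
conn.close()
server.close()
```

All the retransmission and reordering described earlier happens invisibly beneath those few calls, which is precisely why the pair became the de facto standard.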

With Tim Berners-Lee's 1989 creation of a point-and-click capability he termed the World Wide Web, using Hypertext Markup Language (HTML), built on the proven[^14] idea of hypertext, to describe information, and Hypertext Transfer Protocol (HTTP) to transfer text and images across the Internet, the future of TCP/IP appeared to be assured. The network itself, however, was still not adequate for the transmission of more complex data sets such as streamed audio and video; these required increases in networking speeds and volumes, based on laying huge quantities of fibre between our major cities, and from there to smaller locations.
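
Part of HTTP's appeal was its sheer simplicity: a request is just a few lines of plain text sent down a TCP connection, naming the resource and the host. A sketch of the minimal request a browser might send (the host name and path are illustrative only):

```python
# HTTP is a plain-text protocol layered on top of TCP: a request is a handful
# of text lines, terminated by a blank line. The host and path below are
# illustrative placeholders.

def build_get_request(host: str, path: str) -> bytes:
    """Build a minimal HTTP/1.1 GET request, ready to send down a TCP socket."""
    lines = [
        f"GET {path} HTTP/1.1",   # method, resource, protocol version
        f"Host: {host}",          # required header in HTTP/1.1
        "Connection: close",      # ask the server to close after responding
    ]
    return ("\r\n".join(lines) + "\r\n\r\n").encode("ascii")

print(build_get_request("example.com", "/index.html").decode())
```

The server's response comes back as plain text too, with the HTML document in its body, which is what made the protocol so easy to implement and extend.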

The final piece in the networking puzzle came with the creation of Asymmetric Digital Subscriber Line (ADSL) technology, better known by the name 'broadband'. Experiments to use twisted pair cable — the type used in telephone lines — for data transmission had started in the Sixties, but it wasn't until 1988 that Joseph Lechleider, a signal processing engineer at Bell Labs, had a breakthrough idea14. "What if the download speed is set to be much faster than the upload speed," he thought to himself. Not only did the model reduce interference on the line (allowing for greater transport speeds overall), but it also fitted closely with most consumer use of the Web — in general, we want to receive far more data than we send.
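
The logic of the asymmetry is easy to see with some back-of-the-envelope arithmetic. The figures below are indicative only (an 8:1 split was a common early ADSL profile, but rates varied widely):

```python
# Illustrative arithmetic for ADSL's asymmetry, using an indicative profile
# of 8 Mbit/s downstream and 1 Mbit/s upstream. The point: a typical Web
# exchange sends a tiny request up and pulls a large page down, so giving
# most of the line's capacity to the downstream direction wastes very little.
down_bps = 8_000_000               # downstream rate, bits per second
up_bps = 1_000_000                 # upstream rate, bits per second

page_bits = 2 * 1_000_000 * 8      # fetching a 2 MB page (downstream)
request_bits = 2 * 1_000 * 8       # the 2 KB request that asked for it (upstream)

print(f"download: {page_bits / down_bps:.1f} s")   # 2.0 s
print(f"upload:   {request_bits / up_bps:.3f} s")  # 0.016 s
```

Even with only an eighth of the capacity, the upstream direction finishes its work in a fraction of the time, because consumers receive so much more than they send.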

ADSL broadband achieved mass adoption in the Far East, with inertia in the West largely down to incumbent telephone companies, fearful of cannibalising their own businesses. It took waves of deregulation and no small amount of infrastructure upgrades for broadband services to be viable for the mass market, a situation which is still unfolding for many.

All the same, networking capabilities have kept pace with computer processing, with Nielsen's Law showing15 that bandwidth increased 50% per year between 1984 and 2014. Nielsen’s Law says we should all have over 50Mbps by today, which of course we don’t, for all kinds of reasons, not least that we rely on others to put the infrastructure in place to make it possible.
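
The 50Mbps figure falls straight out of the compounding. Nielsen's own baseline was a high-end user's 300 bit/s modem in 1984; the rest is arithmetic:

```python
# Nielsen's Law as arithmetic: 50% compound growth per year over the 30-year
# span the text describes, starting from Nielsen's own 1984 baseline of a
# 300 bit/s modem for a high-end user.
baseline_bps = 300
years = 2014 - 1984

bandwidth_bps = baseline_bps * 1.5 ** years
print(f"{bandwidth_bps / 1e6:.1f} Mbit/s")  # 57.5 Mbit/s - 'over 50Mbps'
```

Thirty years of 50% growth multiplies the starting figure by nearly 200,000, which is what makes such laws so counter-intuitive and so powerful.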

A further piece of the puzzle fell into place when the Internet first started to be carried across radio waves. Two mechanisms are generally accepted today, namely Wireless Fidelity (WiFi) and data transmission via mobile phone technology, each of which is extending its reach in its own way. According16 to the GSM association, mobile coverage now reaches “6 billion subscribers across 1,000 mobile networks in over 200 countries and regions.” Areas that do not have a mobile mast can connect via satellite, further extending the reach to the furthest nooks and crannies on the planet. Significant advances will come from mobile; not least, we can expect 4G LTE to make a real difference by operating at double WiFi speeds, reducing latency with minimal ‘hops’.

Mobility is what the word implies — technology can follow our needs, rather than us being fixed to its location. “Before long it will be possible for every person on this planet to connect to every other. We are only just starting to get our heads around what that might mean,” said musician and innovator Peter Gabriel, when introducing Biko, a song about a person on the wrong side of historical prejudice.

At the same time, technology offers huge potential to level the global playing field. Consider education, which is already being delivered this way in developing countries, according to Alex Laverty at The African File. Clearly, the more that educational materials can be delivered onto mobile devices, the better. While the current handset 'stock' might not be all that suitable for interactive applications, it's worth keeping an eye on initiatives such as India's $35 tablet programme, and note that OLPC also has a tablet device in the pipeline.

Gabriel’s views echo those17 of Nikola Tesla in 1926. “From the inception of the wireless system, I saw that this new art of applied electricity would be of greater benefit to the human race than any other scientific discovery, for it virtually eliminates distance.” From its lowly beginnings, we now have a communications infrastructure, suitable for both computer data and more complex 'content', which spans the globe. And it hasn’t stopped growing — so-called dark fibre (that is, cable which has been laid but not yet lit for use) continues to be laid between cities and nations. Over half a billion pounds was spent18 on fibre-optic cable in 2014, and the market shows no sign of slowing as nation after nation looks to extend its networking reach.

Even as a significant debate continues around net neutrality, the last two decades have seen an explosion of activity in consequence, from user-generated blogs, videos and other content to social networking, mobile gaming and beyond. What this really gives us, however, is a comprehensive platform of processing and communications upon which we can build a diverse and complex range of services. Enter: the cloud.












  12. Hedy Lamarr - spread-spectrum wireless pioneer? 

    [^14] “Ted Nelson had the idea of hypertext, but he couldn't implement it.” 



  16. Need a reference here - but here’s a map