A Quick History of Digital Communication Before the Internet

At 12:30 am on April 4th, 1841, President William Henry Harrison died of pneumonia just a month after taking office. The Richmond Enquirer published the news of his death two days later on April 6th. The North-Carolina Standard published it on April 14th. His death wasn’t known in Los Angeles until July 23rd, 110 days after it had occurred.

Isochron map
An isochron map shows the travel time to various destinations from a specific starting point. In this case, from London in 1914 to the rest of the world.

The Internet is a platform for universal communication. You can plug a device into any Internet connection, and provided it speaks TCP/IP, you can communicate with any other point on earth. But before the Internet, even before ARPANET, people and devices needed to talk. I thought I would take a look at some of the technologies which enabled people to communicate in a pre-Internet world.

The Entropy of a Horse

Pony Express rider waving at the telegraph

The Pony Express was a system of horses, riders and relief stations which stretched from the western end of the telegraph system in Nebraska to Sacramento, California. The system allowed information to be sent across the country in just ten days.

While this may not seem like an achievement by today’s standards, it was considered neigh-on impossible at the time. It required 157 stations, and riders who couldn’t weigh more than 125 pounds, to allow the small horses (hence the ‘pony’) to carry them as quickly as possible.

If the average rider carried 1280 1/4-ounce messages (the billable weight at the time), each letter being roughly 100 words, it equates to about 640 kilobytes of information. Over ten days, this adds up to a data transmission rate of about 6 bits per second, albeit with latency numbers which would strike fear into the heart of the stoutest of multiplayer gamers.
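That back-of-envelope figure is easy to check. A quick sketch, assuming 100 words per letter and roughly 5 bytes (characters) per word:

```python
# Back-of-envelope check of the Pony Express "bandwidth" figure.
# Assumptions: 100 words per letter, ~5 bytes per word.
messages = 1280                                   # quarter-ounce letters per rider
bytes_per_message = 100 * 5
payload_bytes = messages * bytes_per_message      # 640,000 bytes ≈ 640 KB
seconds = 10 * 24 * 60 * 60                       # the ten-day crossing
bps = payload_bytes * 8 / seconds
print(f"{payload_bytes / 1000:.0f} KB over 10 days ≈ {bps:.1f} bits/second")
# 640 KB over 10 days ≈ 5.9 bits/second
```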

Somewhat tragically, the Pony Express was founded just sixteen months before the completion of the telegraph, and was discontinued just two days after the telegraph opened. It was notable, however, for delivering news of Abraham Lincoln’s inauguration in 1861 in just seven days.

The Five-Needle Telegraph

A MURDER HAS GUST BEEN COMMITTED AT SALT HILL AND THE SUSPECTED MURDERER WAS SEEN TO TAKE A FIRST CLASS TICKET TO LONDON BY THE TRAIN WHICH LEFT SLOUGH AT 742 PM HE IS IN THE GARB OF A KWAKER WITH A GREAT COAT ON WHICH REACHES NEARLY DOWN TO HIS FEET HE IS IN THE LAST COMPARTMENT OF THE SECOND CLASS COMPARTMENT

— Police bulletin sent via needle telegraph, 1845

The printing telegraph is wonderful in its simplicity: it requires only a single pair of wires. Before its invention, and even before Morse invented his code, alternative telegraph systems were being built all around the world.

William Cooke and Charles Wheatstone (of Wheatstone bridge fame) built a system which used analog signals and pointers.

Five-Needle Telegraph

Their system used six separate wires. One was used as a common ground, and the other five each drove a pointer. You would vary the voltage on the wires powering two of the needles such that they pointed to the letter you wished to send. The system wasn’t particularly fast, however, and it required six separate wires. Wiring was the most expensive part of any telegraph operation of the era, making the system impractical once single-wire systems (using the earth as the ground) became available.

You’ll also notice that this system can only transmit 20 unique codes, meaning messages couldn’t contain the letters C, J, Q, U, X or Z.
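That 20-code limit falls straight out of the design: two of the five needles deflect for each letter, and each deflected pair can converge on one of two positions on the diamond-shaped board. A quick combinatorial check of that description:

```python
from itertools import combinations

# Two of the five needles deflect for each letter; each deflected pair
# can converge on one of two positions on the diamond-shaped board.
pairs = list(combinations(range(5), 2))   # which two needles deflect
codes = len(pairs) * 2
print(codes)   # 20 — hence no C, J, Q, U, X or Z
```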

The Telegraph

With it disappeared the feeling of isolation the inhabitants of the Pacific Coast had labored under. San Francisco was in instant communication with New York, and the other great cities of the Atlantic seaboard. The change was a great one, but it was one to which the people readily adapted themselves to, having wished and waited so long for it. In that moment California was brought within the circle of the sisterhood of States. No longer as one beyond the pale of civilization, but, with renewed assurances of peace and prosperity, she was linked in electrical bonds to the great national family union.

— James Gamble

The needle telegraph and the semaphore both worked, but it was the Morse telegraph which would ultimately link the world.

Morse’s system allowed an operator to press a key to send a signal over the telegraph wire. A long press was interpreted as a ‘dash’, a short press a ‘dot’. In an early form of binary encoding, these dots and dashes were translated into the alphabet. The system was simple, fast with a skilled operator, and required the absolute minimum amount of infrastructure investment.

Morse Code

The first long-distance telegraph message ever sent was a melodramatic “WHAT HATH GOD WROUGHT!”. It was sent in 1844 from the Supreme Court Chamber in Washington DC to a train station in Baltimore by Samuel Morse himself.
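As an illustration, here’s a minimal encoder. Note that this table is modern International Morse; Morse’s original 1844 ‘American’ code differed in several letters:

```python
# Modern International Morse (not Morse's original 1844 code).
MORSE = {
    'A': '.-',   'B': '-...', 'C': '-.-.', 'D': '-..',  'E': '.',
    'F': '..-.', 'G': '--.',  'H': '....', 'I': '..',   'J': '.---',
    'K': '-.-',  'L': '.-..', 'M': '--',   'N': '-.',   'O': '---',
    'P': '.--.', 'Q': '--.-', 'R': '.-.',  'S': '...',  'T': '-',
    'U': '..-',  'V': '...-', 'W': '.--',  'X': '-..-', 'Y': '-.--',
    'Z': '--..',
}

def encode(text: str) -> str:
    # Letters separated by spaces, words by ' / '.
    return ' / '.join(
        ' '.join(MORSE[c] for c in word)
        for word in text.upper().split()
    )

print(encode("WHAT HATH GOD WROUGHT"))
# .-- .... .- - / .... .- - .... / --. --- -.. / .-- .-. --- ..- --. .... -
```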

The first attempts to make a cable which could be run underwater were made before rubber was discovered, using pitch and rope instead. These cables were, unfortunately, both brittle and far from waterproof. Existing waterproofing materials were simply not flexible enough to allow the cable to be laid.

In 1842, however, a Scottish surgeon imported a sap from India for use in medical equipment. This sap was known as gutta-percha, a natural latex. This flexible, waterproof material became a critical component of the first successful undersea cables. When combined with an outer layer of steel wires to give the cable strength, this insulated cable became the first type to successfully cross the Atlantic.

Cable Cross Section
A cross-section of an early undersea cable. The outer metallic wires are for strength. The black material inside them is Gutta-Percha insulation. The actual signals are sent through the inner-most wires.

While rubber had been discovered, the principles of impedance matching had not, resulting in such horrific echoes and interference in the first cable that it took no less than 17 hours to transmit the first message. Imagine the frustration of trying to send, and to hear, a message for the better part of a day.

Using a problem-solving method we have all been guilty of at one point or another, one of the ‘engineers’, Wildman Whitehouse, delivered a 2,000-volt shock to the cable in a 19th-century attempt to ‘unplug the router and plug it in again’. This promptly destroyed the cable.

Whitehouse

The wires themselves had been tested with what one account describes as “a very crude form of galvanometer.” This ‘galvanometer’ was later explained to be the tongue of one of the engineers, used in the manner of generations of tinkerers testing nine-volt batteries.

Whitehouse’s inclusion on the cable team was itself a classic case of “hire the guy who is the most confident this is a good idea.” He had no training as either a scientist or an engineer. He didn’t actually believe in voltage or current, and refused to use the state-of-the-art mirror galvanometer of the day. He preferred a device of his own design which involved applying large amounts of current to a detection device which looked like nothing so much as the scale used at doctors’ offices. The system required five-foot-tall inductors to power it, and was therefore matched in its uselessness only by the danger it presented to everyone involved.

When the man who would become Lord Kelvin stepped in and tried a mirror galvanometer, they were ultimately able to cut the seventeen-hour time down to about one hour, but at that point the cable had been too badly damaged to last. A transatlantic cable wouldn’t be attempted again for almost six years, and fortunately Mr. Whitehouse did not ‘assist’ the second time around.

In 2016, a transatlantic fiber optic cable can transmit 40 terabits per second. Thus, in the last hundred and fifty years, our ability to transmit information has improved roughly 13,000,000,000,000×.

Baud

The telegraph worked well, but it required highly trained operators who were limited in their speed by human perception. Even at its best, Morse code comprehension rarely exceeds 40 words per minute (the record is 75.2). And while it might require fewer wires than the needle telegraph, you can still only send one message at a time over any given wire.

The solution was some sort of digital code which mapped the letters to a smaller number of bits which could then be sent more quickly over the wire. The first popular code was published by Émile Baudot in 1870. It mapped the letters to five bits which were typed using a special five-button keyboard:

Baudot Keyboard

The distributor unit would read the currently pressed character from the keyboard several times a second. It was up to the operator to have the character ready to be read, forcing them to type at a consistent cadence, usually of about 30 words per minute. To this day, the speed of signals over a line is referred to as ‘baud’, after Baudot.
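Five bits gives 2⁵ = 32 distinct codes, enough for the alphabet plus a few controls (with ‘shift’ codes switching between letters and figures). The real Baudot and ITA2 tables were arranged for mechanical convenience; purely as an illustration, here is an alphabetical assignment:

```python
# Illustrative only: assign each letter a 5-bit code by alphabet position.
# (The real Baudot/ITA2 tables were laid out quite differently.)
def code(letter: str) -> str:
    return format(ord(letter.upper()) - ord('A') + 1, '05b')

print(code('A'), code('Z'))   # 00001 11010
print(2 ** 5)                 # 32 codes in total
```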

Transmission was greatly accelerated when Donald Murray figured out how to write and read Baudot code from paper tape in 1901. That allowed messages to be typed out in advance and transmitted by machine, significantly faster than an operator could type. Murray also introduced the Carriage Return (CR) and Line Feed (LF) control characters which still cause us all manner of trouble to this day.

A variant of Baudot called ITA2 (International Telegraph Alphabet No. 2) was used for virtually all telegrams and teletype machines until ASCII was invented in 1963. Even then, ASCII lacked support for many international and special characters, meaning special-purpose character sets remained in use throughout the life of the teletype.

Character Sets

To read these digital codes, the printer would wait for the current on the line to disappear (power on the line was the ‘off’ state, like a dial tone). When it disappeared, it was time for a character to be transmitted. Each bit would move the print head by a prescribed amount based on where in the five-bit sequence it was. When the sequence was completed, the print head was pulled onto the paper, marking the character. Generally there would be two bits of current flowing at the end (the ‘stop’ bits) to ensure that characters were separated properly.
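The framing described above can be sketched in a few lines, with ‘1’ standing for current on the line:

```python
# A sketch of the asynchronous framing described above: an idle line
# carries current ('1'), a start bit drops it ('0'), five data bits
# follow, and two stop bits of current separate the characters.
def frame(bits5: str) -> str:
    assert len(bits5) == 5
    return '0' + bits5 + '11'   # start bit, data, stop bits

print(frame('10110'))   # 01011011
```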

The fixed timing of a Baudot code meant that you could ‘multiplex’ many typists and printers over the same telegraph lines, a huge technological improvement. A physical distributor unit would spin, checking and sending the current character from each keyboard in turn. On the other end a similarly spinning distributor would send each character to the appropriate printer.

Baudot Multiplexing
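The spinning distributor is, in modern terms, time-division multiplexing. A toy sketch with four hypothetical channels of text:

```python
# A sketch of the rotating distributor: it samples one character from
# each keyboard in turn, interleaving the channels onto a single wire,
# and the receiving distributor peels them apart again.
channels = ["HELLO", "WORLD", "BAUDO", "TAPES"]   # four typists (made-up text)

# Multiplex: take one character from each channel per 'revolution'.
line = ''.join(c for revolution in zip(*channels) for c in revolution)
print(line)      # HWBTEOAALRUPLLDEODOS

# Demultiplex: every fourth character belongs to the same printer.
demuxed = [line[i::len(channels)] for i in range(len(channels))]
print(demuxed)   # ['HELLO', 'WORLD', 'BAUDO', 'TAPES']
```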

Hush-a-Phone & Carterfone

The Internet has been, in my opinion, the greatest contribution to mankind’s education, information, and problem solving ability in the history of our species. Greater than the Library of Alexandria, Egypt, in the third century B.C. Greater than the impact of Thomas Jefferson’s vision of a self-governing democracy’s dependence upon a foundation of public education, public libraries, reduced postal rates for printed matter, and the impact of free speech they make possible.

— The FCC Commissioner who wrote the opinion in the Carterfone case, 41 years later

Here lies a stubborn Texan.

— The gravestone of Tom Carter, Texas

You may have seen a picture of a phone cradled in a modem before, perhaps in a movie like WarGames:

Acoustic Coupler

This device is called an ‘Acoustic Coupler’, and it exists for reasons legal, not technical. Telephone companies didn’t particularly want people sending data over their lines. This was for a multitude of reasons, not least of which was that they leased out very expensive Dataphone units and were not particularly excited about you being able to buy a modem for $399 at RadioShack. They did not have the legal grounds to prevent it, though, for two reasons. One, the phone lines were regulated by the American government as a ‘natural monopoly’. The theory is that you can’t run a dozen separate phone lines to every home. The very act of building phone lines somewhat prohibits someone else from running lines into those same homes, granting you a monopoly. In the interest of competition, the government elected to regulate the system to prevent that monopoly from hurting consumers.

The phone company’s next play was to block modems on the theory that using a modem somehow put undue stress on the phone system. They lost that ability, though, because of the Hush-a-Phone and Carterfone decisions.

Hush-a-Phone

The Hush-a-Phone was a device originally invented in the 1920s to allow people to have more private conversations over the phone. It’s essentially just a tin box which goes over the phone’s mouthpiece. When you speak into it no one else in the room can hear you, but the party at the other end of the phone line can. AT&T sued the company on the somewhat dubious grounds that the device reduced the quality of the call for the recipient, which damaged their business and the phone system as a whole. The Court of Appeals decided that it was ridiculous to say that a user “may produce the result in question by cupping his hand... but may not do so by using a device.” They ruled that banning such a device was an “unwarranted interference with the telephone subscriber’s right reasonably to use his telephone in ways which are privately beneficial without being publicly detrimental.”

That ruling formed the basis for the idea that you could do essentially anything you wanted with a phone, as long as you weren’t damaging the phone system itself. There was one thing you couldn’t do, however: actually connect to the phone wires themselves, which is why those first modems needed an ‘acoustic coupler’.

This requirement was also ultimately overturned, thanks to the success of the ‘Carterfone’. The Carterfone was a device (invented, obviously, by a guy named Tom Carter) which allowed you to connect a phone line to a radio. You could use this to, for example, patch a phone call into the 2-way radio in a police car. Carter’s original purpose was to allow phone calls to reach him while he was working on his ranch in Texas.

Carter ultimately sued AT&T, in his words: “I just didn’t believe anyone I wasn’t harming had the right to tell me I couldn’t be in business.” By the time the case was ruled on he had lost one hundred employees and had had to sell his ranch and home to fund the case. Eventually the FCC ruled that as long as you didn’t damage the phone system the phone company couldn’t stop you from connecting devices directly to phone lines.

It may seem strange that the phone company would go to such lengths to prevent people from using their services. It’s valuable to remember, however, that they were a true monopoly. Their telephone poles were scheduled for replacement on an eighty-year cycle, and their bonds were second only to those of the US government in terms of perceived reliability. They had no innovation from competitors to fear, as they controlled everything from the wires in the sky to the physical phones (which were only available in black). This made the loss of control the biggest possible threat.

Dataphone

If you lived through the 56k days, you are familiar with modems. A modem takes the digital data your computer spits out and encodes it in a format which can travel over a ‘POTS’ (Plain Old Telephone Service) line; in other words, a line which would normally carry a telephone call.

A POTS line is not only relatively simple in its construction when compared to more modern signaling cables, it’s restricted to transmitting only the narrow frequency band of human speech, from around 300 to 3300 Hz. This dramatically limits the amount of bandwidth you can get through it.

The Dataphone was the first commercially available modem, moving data over the phone system at 110 bits per second. It was also notable as the first commercial use of the later-ubiquitous ASCII character encoding.

First dataphone
The first Dataphone modem, originally part of the SAGE system.

It descended from the ‘Digital Subset’, a component of the SAGE system. The SAGE system was a cold war era networked computer system which was used to unite radar data into a single picture of an enemy attack and direct defenses to their locations.

The SAGE system was composed of 56 separate computers, any one of which would have been the largest computer ever built. Each computer weighed 250 tons, was composed of 60,000 vacuum tubes, and performed 75,000 operations per second. The project cost more than the Manhattan Project which developed the first nuclear weapons.

SAGE
One of the SAGE mainframes

Incidentally, the codename for SAGE, ‘Project Lincoln’, gave its name to the famous MIT Lincoln Labs which still produces military technology today.

Modems of the era were rather simple. The first station uses a tone of 1,270 Hz to mark a 1, and 1,070 Hz to mark a 0. The second station uses 2,225 Hz and 2,025 Hz to send its values. Only a single connection can be made over a phone line at a time, and the data moves at 110 bps.
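This scheme is frequency-shift keying: each bit value is simply a tone. A sketch of generating one bit’s worth of signal, assuming an 8 kHz sample rate purely for illustration:

```python
import math

# FSK as described above: each side keys between two tones, one per bit.
ORIGINATE = {1: 1270, 0: 1070}   # Hz
ANSWER    = {1: 2225, 0: 2025}   # Hz

def tone(bit: int, station: dict, rate: int = 8000, baud: int = 110) -> list:
    """One bit's worth of sine-wave samples at 110 bps."""
    freq = station[bit]
    n = rate // baud   # samples per bit
    return [math.sin(2 * math.pi * freq * t / rate) for t in range(n)]

samples = tone(1, ORIGINATE)
print(len(samples))   # 72 samples per bit at 8 kHz / 110 baud
```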

Ever more impressive encoding and the transition to digital phone lines would allow telephone modem speeds to ultimately reach 56 kbits per second. As the system was originally intended to send phone calls digitally, it’s based around each channel encoding one phone call as 8,000 8-bit samples per second, for a total of 64 kbps. Every sixth sample, however, includes a single control bit used by the phone system (‘bit robbing’), leaving us with just 56 kbps with which to send our data.
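The arithmetic works out like this:

```python
samples_per_sec = 8000
bits_per_sample = 8
channel_bps = samples_per_sec * bits_per_sample   # 64,000 bps per voice channel

# Bit-robbing: the phone network steals the least-significant bit of every
# sixth sample for signalling, so a modem can only rely on 7 bits per sample.
usable_bps = samples_per_sec * (bits_per_sample - 1)
print(channel_bps, usable_bps)   # 64000 56000
```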

ALOHAnet

Interesting solutions are often born of interesting constraints. In this case, the constraint was the need to allow computer users at the University of Hawaii to share a single time-sharing computer in Oahu. As you most likely know, Hawaii is composed of a chain of islands, ruling out any wire-based communication standard. Instead, University of Hawaii professors and students built a system based on UHF radios.

Rather than trying to give every terminal its own set of communication frequencies, they decided to find a way for every terminal to share a single frequency. In other words, they would all be talking over the same ‘wire’, potentially at the same time. To prevent multiple messages from colliding, what was perhaps the first collision management protocol was invented.

If, while you were transmitting data, you received data from another station, it meant multiple stations had transmitted at the same time. As the data would be garbled, those messages would be automatically resent after a random delay to limit the likelihood of another collision. ALOHA would ultimately become the standard first used to add text messaging to cell phones. Even more importantly, though, it would turn out that this concept wasn’t just valuable for wireless networks.
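The core ALOHA idea, resend after a random delay, can be sketched with a toy two-station simulation:

```python
import random

# A toy simulation of the ALOHA idea: stations that collide re-send
# after a random delay, so repeat collisions quickly become unlikely.
def retry_slot(now: int) -> int:
    return now + 1 + random.randrange(8)   # back off 1-8 time slots

random.seed(1)
a, b = 0, 0                # both stations transmit in slot 0: collision
while a == b:              # keep backing off until they pick different slots
    a, b = retry_slot(a), retry_slot(b)
print(a != b)   # True — the stations eventually separate
```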

Ethernet

The whole concept of an omnipresent, completely passive-medium for the propagation of magnetic waves didn’t exist. It was fictional, but when David and I were building this thing at PARC, we planned to run a cable up and down every corridor to actually create an omnipresent, completely-passive medium for the propagation of electromagnetic waves. In this case, data packets.

— Bob Metcalfe, Creator of Ethernet

The Ether Drawing

Xerox PARC was a magical place. It was a research center which, by virtue of being 3,000 miles away from Xerox HQ, was given much more freedom than was reasonable. This freedom allowed its scientists to invent a huge range of the technology we use today, including the graphical user interface you’re reading this on, the WYSIWYG text editor concept I’m writing this on, the concepts powering the bitmap graphics in this post, and the model-view-controller architecture the JavaScript on this page uses to render it. If you’re scrolling this page with a mouse, or printing this on a laser printer, that was them too.

The desire to land men on the Moon in the 60s created many, many problems which engineers needed to solve. Those solutions ended up becoming the ‘technology’ we would be adopting outside space travel for decades to come. Similarly, when PARC began to invent all sorts of fantastic devices like the personal computer, it created many problems which enterprising engineers could attempt to solve.

Robert Metcalfe elected to target computer networking. Computer networking is somewhat simple when all you want to do is connect two devices to each other, and they are close enough together that you can trust a signal from one will get to the other. It gets much more complicated when you want to network a dozen, or a hundred, or a million, computers across thousands of miles.

Existing solutions included the Token Ring system, which connected many computers in a daisy-chain fashion. Each computer would take turns holding the ‘token’; while it held the token it could communicate, and while waiting for the token it had to stay silent. This system was problematic both because it was only efficient if everyone wanted to talk a roughly equal amount, and because if a single link in the chain of computers was broken, the whole system would collapse.

Metcalfe had, however, studied ALOHAnet while in college and saw the opportunity to build a system where every computer could be connected to a common ‘bus’, using that same collision handling to ensure messages wouldn’t conflict. He called it ‘Ethernet’, based on the idea that this same networking standard could be used to transmit data over any medium, including the ‘aether’ once believed to fill the universe.

Ethernet has some wonderful characteristics, including the ability to plug additional computers into the network wherever necessary without difficult reconfiguration, and the ability for nodes to drop off the network without causing other, unrelated failures. In practical situations Ethernet manages to achieve nearly 98% throughput, much higher than Token Ring could achieve. Even more importantly, the reliability of Ethernet networks, and an agreement by DEC, Intel and Xerox to standardize around Ethernet, made it the dominant wired networking standard by the 90s, as it remains today.

As computing hardware has become cheaper, Ethernet’s collision detection has actually become less and less important. In most modern networks, each computer has its own connection to a ‘switch’. The switch makes intelligent decisions about where each packet needs to be forwarded, making the network collision-free.
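Before switches made it mostly moot, Ethernet’s classic collision handling refined ALOHA’s random delay into binary exponential backoff. A sketch of the rule:

```python
import random

# Ethernet's binary exponential backoff: after the n-th collision,
# wait a random number of slot times drawn from [0, 2^min(n, 10) - 1],
# so the average wait doubles with each successive collision.
def backoff_slots(collisions: int) -> int:
    exponent = min(collisions, 10)
    return random.randrange(2 ** exponent)

print(0 <= backoff_slots(3) <= 7)   # True: after 3 collisions, 0-7 slots
```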

Wi-Fi

Wi-Fi was invented for cash registers by the NCR (National Cash Register) corporation (which is today worth over four billion dollars). NCR is similar to IBM: a very old company (founded in 1884) which managed to build itself a powerful research team that wasn’t afraid to innovate beyond the bounds of cash registers. The wireless solution they built was much more than a cash register communication protocol. Their implementation used the same chipset as Ethernet itself, and actually presented itself to the operating system as an Ethernet card, making it compatible with virtually every computer on the market.

The original system, known as ‘WaveLAN’, was released in 1990 by Victor Hayes who would go on to lead the team creating the 802.11 specification we still use (in an expanded form) today.

The Wi-Fi system was partially made possible by a historic government deregulation. In 1985 the FCC released the now-famous 900 MHz, 2.4 GHz and 5.8 GHz frequency bands for use without a government license. (The actual reason they were released was that those frequencies were used for industrial and scientific purposes like microwave ovens; it was believed they would be useless for communication.) This made it possible for innovators to build all of the ad-hoc wireless technologies we consider ubiquitous today, like Bluetooth, NFC and cordless phones. By 2005 over 100 million Wi-Fi chipsets would be shipped annually.

The Rest

And the rest was history. If you liked this, take a look at our post on Telstar, the first communications satellite, and the nuclear blast that destroyed it.
