Flexible friend: Data's Big digital journey online
The analogue shift disguised its original purpose
Big Data and All That

The media appear to suffer from a congenital compulsion to simplify everything down to a level they can grasp. Big data is one of those simplifications: something that can be shoved down a Fat Pipe. Enid Blyton passed away before the IT explosion and only got as far as Big Ears.
I don’t get any pleasure from nursery language and I cringe when seeming adults order their eggs “sunny side up”.
Even if I were to attempt a definition of big data, others would disagree. It is best classified by Humpty Dumpty in Through the Looking-Glass: “When I use a word, it means just what I choose it to mean”. Appropriately too, as Lewis Carroll was actually a mathematician who laid some of the foundation stones of symbolic logic.
Mankind has a habit of recording stuff, and this invariably involves making marks on some medium that are more or less analogous to the original information. When the marks could be made by hand and became standardised so that many could understand them, we had writing. As technology advanced, the marks could get smaller, so more stuff would fit in a reasonable space.
Marks that small would need a machine to write them. Printing produced a form of writing that could be read by humans and irrevocably changed the dissemination of words – truth, fiction and propaganda alike.
If a machine was also available to read marks, they would not need to be visible and could be smaller still. Marks could then be magnetic fields, electrical charges, presence or absence of matter and so on, and they could represent text, musical notation, operating sequences of automated machinery, sound, images and, later, moving images. The media were all mutually incompatible and generally source-specific; each medium was tailored to the type of information it stored.
You couldn’t readily store knitting patterns on an audio disc even though it was played with a needle. In many of these media the parameter that was stored was infinitely variable and more or less proportional to the original information. These could be categorised as direct linear media. For example, the transverse velocity of the groove in an audio disc would be an analogue of the velocity of a microphone diaphragm, and the distance along the groove would be an analogue of time.
The density of a photograph reflected the original image. Inevitably some attribute or imperfection of the medium would superimpose itself on the message. Photographs would be grainy, sound recordings would crackle, tapes would hiss and drop out. If the speed of the disc during cutting or reproduction wasn’t sufficiently stable, then pitch variations would be superimposed on the sound. Sometimes speed error was deliberate: the Keystone Cops were speeded up and frames were missed out to make the movement funnier.
It is a fundamental characteristic of direct linear media that the attributes of the medium could not be separated from the reproduced signal.
If such a recording was copied to a second medium or delivered down a wire, the imperfections of each would cascade, producing what came to be known as generation loss. A distinction then arose between professional and consumer devices: the professional device would be designed to minimise generation loss, whereas the consumer device would be designed to be small and have a long playing time. Recordings that wore out, and generation loss in consumer recording, were, of course, just what record companies wanted.
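The cascading of imperfections can be illustrated with a toy model: if each copy adds its own independent noise, the noise powers sum while the signal stays the same, so the signal-to-noise ratio falls with every generation. The sketch below is plain Python with a test tone and a noise level chosen purely for illustration; it is not a model of any real medium, just the arithmetic of generation loss.

```python
import math
import random

def copy_generation(signal, noise_std, rng):
    """Copy a signal onto a new medium, adding that medium's own noise."""
    return [s + rng.gauss(0.0, noise_std) for s in signal]

def snr_db(original, degraded):
    """Signal-to-noise ratio of a degraded copy, in decibels."""
    sig_power = sum(s * s for s in original) / len(original)
    noise_power = sum((d - s) ** 2 for d, s in zip(degraded, original)) / len(original)
    return 10 * math.log10(sig_power / noise_power)

rng = random.Random(42)
# A unit-amplitude test tone, 1000 whole cycles.
original = [math.sin(2 * math.pi * k / 48) for k in range(48000)]

copy = original
snrs = []
for generation in range(5):
    copy = copy_generation(copy, noise_std=0.01, rng=rng)
    snrs.append(snr_db(original, copy))

print([round(s, 1) for s in snrs])  # SNR falls with every generation
```

With equal-quality copies, each generation's noise adds to the last, so after n generations the SNR is roughly 10·log10(n) dB worse than the first copy: five generations cost about 7 dB.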
Automatic for the people
Invariably human intervention was required to select the source-specific medium the machine was to read, to insert it in the reproducer and then to locate the desired part. In order to enjoy the replayed information you had to be there. Efforts were then made to automate the process. The juke box was an early random access device that automated the replay of audio discs.
Alongside the efforts to store or record, developments took place in communication. Roman roads were the Internet of their time and were optimised for speed. It was found that a horse pulling a lightweight carriage on smooth stones could go faster and further than one supporting a rider on bare earth. Obviously the shortest journey was in a straight line, and this applied also to the carrier pigeon. But did a written message truly originate from the sender? This led to the development of the seal, a unique imprint in wax that was hard to forge, and signet rings or other devices to make the imprint. Data security has not improved much since then.
Female signal corps with semaphore flags. Photo from Everett Collection via Shutterstock
The speed of light helped semaphore messages travel quickly in good weather, but only over line-of-sight paths, and there was the danger of errors propagating as many short links were cascaded. Not far from where this is being written is Beacon Hill, one of many hills so named for the signal fires once lit there.
The first electrical circuits contained switches to turn the current on or off, and a device such as a solenoid could respond to the presence of current with mechanical movement. The first communication systems were no more than an electrical circuit of that kind with the wires extended. The receiving solenoid could, for example, print onto ticker tape.
Semaphore and the telegraph had in common that the number of symbols that could be transmitted was limited. These systems were not linear but discrete. Semaphore could manage the alphabet and, with the equivalent of a shift key, decimal numbers, whereas the telegraph was strictly on-off, or binary.
The binary limitation was overcome by changing the length of the symbols. The famous dots and dashes of Morse code are an early application of modulation, where you modify the source information into a form the channel can manage. Later on, ASCII would allow text to be transmitted between teletype machines. Traditionally, time had been obtained from sundials, which were later used to synchronise pendulum clocks, so the world ran on local solar time. Telegraph wires were rapidly erected alongside railways and soon led to the creation of time zones; time in the new zones was initially called railway time. Accurate synchronisation is one of the enabling technologies of networks.
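Morse's variable-length trick is easy to sketch: a strictly on-off channel carries symbols of differing lengths, and the commonest letters get the shortest codes – E is a single dot, while rarer letters run to four elements. A minimal encoder in Python, using the standard international Morse alphabet:

```python
# International Morse code: variable-length symbols over a binary channel.
# Common letters (E, T) get short codes; rare ones (Q, Y) get long ones.
MORSE = {
    'A': '.-',   'B': '-...', 'C': '-.-.', 'D': '-..',  'E': '.',
    'F': '..-.', 'G': '--.',  'H': '....', 'I': '..',   'J': '.---',
    'K': '-.-',  'L': '.-..', 'M': '--',   'N': '-.',   'O': '---',
    'P': '.--.', 'Q': '--.-', 'R': '.-.',  'S': '...',  'T': '-',
    'U': '..-',  'V': '...-', 'W': '.--',  'X': '-..-', 'Y': '-.--',
    'Z': '--..',
}

def to_morse(text):
    """Modulate text into dots and dashes, skipping unknown characters."""
    return ' '.join(MORSE[c] for c in text.upper() if c in MORSE)

print(to_morse('SOS'))  # prints: ... --- ...
```

The same idea of matching code length to symbol frequency resurfaces a century later in Huffman coding.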
When radio was developed, the initial transmissions used Morse code because linear modulation had yet to be developed. Ultimately radio signals were linearly modulated, first by band-limited audio as amplitude modulation, later by full-bandwidth audio as frequency modulation. But the ability of wires and radio alike to transmit binary was not neglected and the bit rate of such links rose steadily, to be joined by optical fibres when these were developed.