Science in Our Digital Age

The next time you switch on your computer, you probably won’t ‘compute’. You might look up something, email your friends, or check the latest football score. But computers were originally machines that could only compute – calculate – things faster or more accurately than our brains can.

We think of computers as cutting-edge technology, but the idea of the computer is very old. In the nineteenth century, a British mathematician, Charles Babbage (1791–1871), devised a calculating machine that could be ‘programmed’ to do tricks. For instance, he could set it up to count by single numbers to 1,000,000, and then when it got there, skip to 1,000,002. Anyone who had patiently watched it count to 1,000,000 would have been surprised by the missing number. Babbage’s point was that his machine could do things that we wouldn’t expect in the normal run of nature.
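For readers who like to tinker, here is a minimal sketch in modern Python of the kind of programmed counting trick described above. It is only an illustration of the idea, not a model of how Babbage’s machinery actually worked, and the particular numbers simply follow the example in the text.

    def babbage_count(stop):
        # Count by single numbers, but skip 1,000,001 once 1,000,000 is reached.
        n = 1
        while n <= stop:
            yield n
            n += 1
            if n == 1_000_001:  # the 'programmed' surprise
                n = 1_000_002

    # The sequence ends ... 999,999, 1,000,000, 1,000,002, 1,000,003
    print(list(babbage_count(1_000_003))[-4:])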

In the late 1800s, the American mathematician Herman Hollerith (1860–1929) invented an electric machine that used punched cards to analyse lots of data. If the cards were punched correctly and fed into the machine, it could ‘read’ them and process the information.

The Hollerith Machine was very useful in analysing the information that people put down on their census forms, gathered to help the government understand more about the population. Very quickly, it could compute basic data such as how much people earned, how many people lived in each household, and their ages and sexes. The punch card remained the way most computers worked until the Second World War.

During that war, computers came into their own for military purposes. They could calculate how far shells would travel, and they served a more dramatic role in the top-secret attempts to decode enemy messages. The Germans, British and Americans all developed computers to aid wartime security. Here is a wonderful irony: the modern computer has opened up everyone’s world, but it began as something that only a very few people, with the highest security clearance, had access to.

The British and Americans used computers to analyse German coded messages. The heart of the British effort to break the German codes was an old country house called Bletchley Park, in Buckinghamshire. The Germans used two code-making (cipher) machines, Enigma and Lorenz. Each day the codes were changed, which demanded great adaptability from the decoding machines.

The British designed two code-breaking machines, the Bombe and the Colossus. The Colossus was well named, for these computers were enormous machines, filling entire rooms and consuming large amounts of electricity. The computers used a series of vacuum tubes to switch the electrical signals. These tubes generated a great deal of heat and were constantly failing. Wide aisles separated the rows of tubes so that the technicians could easily replace the burnt-out filaments. In those days, ‘debugging’ didn’t mean running a software program; it meant reaching in and clearing out the bugs – moths or flies – that had flown into the hot glass tubes and shorted out the system. The code-breakers shortened the duration of the war and undoubtedly helped the Allies to win it.

A remarkable mathematician worked at Bletchley Park: Alan Turing (1912–54). He was educated at my old college in Cambridge, King’s College, where his brilliance was recognised while he was still a student in the early 1930s. He published important ideas on the mathematics of computing, and his work at Bletchley Park was outstanding. After the war he continued to push his ideas. He had great insights into the relationship between the way computers work and the way our brains work, into ‘artificial intelligence’ (AI), and even into developing a machine that could play chess. Computers can now beat even chess grandmasters, and they keep getting better at finding the best move. Turing designed an early electronic computer called ACE at the National Physical Laboratory in Teddington, London, with much greater computing capacity than the wartime machines. His life had a tragic end. He was gay at a time when homosexual activity was illegal in Britain. After being arrested and convicted, he underwent treatment with sex hormones intended to ‘cure’ his sexual orientation. He almost certainly committed suicide by eating an apple laced with cyanide. His life and death are reminders that outstanding scientists can be of any race, gender, religion or sexual orientation.

The enormous machines built during the war were valuable, but they were limited by those overheating vacuum tubes. Next came an invention that has changed the computer and much else: the transistor.

Developed from late 1947 by John Bardeen (1908–91), Walter Brattain (1902–87) and William Shockley (1910–89), this device can amplify and switch electronic signals. Transistors were much smaller than vacuum tubes and generated much less heat. They have made all kinds of electrical appliances, such as transistor radios, much smaller and more efficient. The three men shared the Nobel Prize in Physics for their work on ‘semi-conductors’, the materials that make transistors and modern circuits possible, and Bardeen went on to win a second one for his research on superconductivity.

The military continued to develop computing during the Cold War of 1945 to 1991. The two great superpowers, the USA and the USSR, distrusted each other, despite having been allies during the Second World War. Computers were used to analyse the data each country collected about the other’s activities. But increasingly powerful number-crunching computers were a great help to scientists, too. Physicists made the greatest use of these new and improving machines during the 1960s. High-energy particle accelerators created so much data that it would have been impossible for an army of people with pencils and paper to make sense of it all.

More and more, computer scientists became members of a range of scientific teams, and research budgets included their salaries and equipment. So it made a lot of sense if one team could speak to another not just person to person, but computer to computer.

After all, the telephone had been around for a century, and sending messages by telegraph wires was even older. Then, in the early 1960s, ‘packet switching’ was invented. A digital message could be broken up into smaller units, or packets; each packet could travel by whatever route was available, and the packets would be reassembled into the original message at their destination, the receiving computer. When you are talking on a landline, you’re in ‘real time’, and no one else can call you. But you can send or receive a message on a computer – an email or a post on a website – and it will be available whenever someone wants to read it.
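A rough sketch of the idea in Python may help make it concrete. The packet format, the four-character packet size and the random shuffle that stands in for packets taking different routes are all illustrative assumptions here, not any real network protocol.

    import random

    def send(message, packet_size=4):
        # Break the message into numbered packets.
        packets = [(i, message[i:i + packet_size])
                   for i in range(0, len(message), packet_size)]
        random.shuffle(packets)  # packets may travel different routes and arrive out of order
        return packets

    def receive(packets):
        # Reassemble the original message using the sequence numbers.
        return ''.join(chunk for _, chunk in sorted(packets))

    print(receive(send('Meet me at Bletchley Park at noon')))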

Packet switching was developed simultaneously in the USA and the UK. It mattered for national security: it allowed military or political leaders to communicate with each other, and it would keep working even if some of the communication facilities had been destroyed.

Packet switching made it easier to connect groups of computers: networking them. Again, the earliest non-military groups to network were the scientists. So much modern science benefits from collaboration. Academic communities were the main beneficiaries of the ever-smaller and ever-faster computers of the 1960s.

They were extremely large, extremely slow and extremely expensive, compared with what we use today. But you will be relieved to know it was possible to play computer games even then, so the fun started early. The pace of change in computing accelerated in the 1970s. Computers – or microcomputers, as they were called – with a screen and keyboard became small enough to fit onto a desk. As the microprocessor chips they contained became more powerful, the personal computer revolution began. Much of the research was done in Silicon Valley, in California, in the USA.

Computers continued to change the way academic communities worked and communicated with each other. One of the largest collections of physicists in the world worked at the European Organisation for Nuclear Research (CERN), which houses the Large Hadron Collider, the world’s largest and most powerful particle accelerator. Computer specialists at CERN took networking and data analysis to new heights in the 1980s and 1990s. One expert was Tim Berners-Lee (b. 1955). Berners-Lee was always fascinated by computers. He grew up with them, as both his parents were early computer pioneers. Berners-Lee studied physics at Oxford and then went to work at CERN. In 1989, he asked for some research funds for ‘Information Management’.

His bosses at CERN gave him only limited help, but he persisted with his idea of making the growing amount of information on the Internet easily accessible to anyone with a computer and a telephone line. Along with his colleague Robert Cailliau (b. 1947), he invented the World Wide Web. At first it was used just at CERN and one or two other physics laboratories. Then, in 1993, it went public. This coincided with the massive growth of personal computers, not just at work but in the home. People who led the personal computer revolution, like Microsoft’s Bill Gates (b. 1955) and Apple’s Steve Jobs (1955–2011), are modern scientific heroes (and became very rich). So 1955 turned out to be a good year for computers: Berners-Lee, Gates and Jobs were all born then.

The speed of computer development from the 1970s matched the rate of invention of methods for sequencing the genome. It’s no coincidence that the two advanced together: reading a genome produces enormous amounts of data, and only computers can store and analyse it.

Modern science is unthinkable without the modern computer. Many fundamental scientific problems, from designing new drugs to modelling climate change, depend on these machines. At home, we use them for doing homework, booking holiday tickets, playing computer games. Embedded computer systems fly our aeroplanes, assist medical imaging, and wash our clothes. Like modern science, modern life is computer-based.

We shouldn’t be surprised at this. One of the things I have tried to show in this little book is that, at any moment in history, science has been a product of that particular moment. Hippocrates’ moment was different from Galileo’s, or Lavoisier’s. They dressed, ate and thought like other people at the time. The people in this book thought more sharply than most of their contemporaries, and were able to communicate their ideas. That is why what they thought and wrote is worth remembering. Yet the science of our day is more powerful than ever before.

Computers are good for criminals and hackers as well as scientists and students. Science and technology can be abused as easily as they can be used for our common good. We need good scientists, but we also need good citizens who will ensure that our science will make the world a better place for us all to live in.