A Little History Of Science: Radioactivity
Have you ever broken a bone, or swallowed something by mistake? If so, the chances are you had an X-ray so a doctor could see inside your body without having to open it up. X-rays are routine today. At the end of the nineteenth century, they were a sensation. X-rays were the first kind of radiation to be harnessed, even before the meaning of radiation was properly understood.
Radioactivity and atomic bombs came later. In Germany, X-rays are still sometimes called ‘Röntgen rays’, after Wilhelm Röntgen (1845–1923). He was not the first to have seen their power, but he was the first to realise what he had seen.
Science is often like that: it is not enough simply to see – you must understand what you are looking at. In the 1890s, Röntgen, along with many other physicists (remember J.J. Thomson?), was working with the cathode ray tube. On 8 November 1895, he noticed that a photographic plate, some distance from his cathode ray tube, had mysteriously been exposed.
It was covered with black paper, and at that time scientists assumed that cathode rays had no effect that far away. He spent the next six weeks working out what was happening. Other scientists had observed the same thing but had not done anything about it.
Röntgen discovered that these new rays went in a straight line, and were not affected by magnetic fields. Unlike light, they could be neither reflected nor bent by a glass lens. But they could penetrate solid material, including his wife’s hand! She posed for the first X-ray picture, her wedding ring clearly visible along with the bones of her fingers. Not knowing exactly what these rays were, he simply called them ‘X-rays’. After the six weeks of hard work, he told the world.
X-rays became an immediate hit. Their medical uses were instantly recognised, in diagnosing broken bones or locating bullets or other things that shouldn’t be lodged inside a body. Few things have ever been so instantly taken up by the general public. ‘X-ray resistant’ underwear was quickly for sale. Physicists debated what exactly X-rays were. After more than a decade of further research, X-rays were shown to be electromagnetic radiation with a very short wavelength and correspondingly high energy. Early on, laboratory workers noticed that X-rays could damage human flesh, causing burns to appear, so they were used to try to kill cancer cells as early as 1896.
It took a while longer for people to realise just how dangerous they were, and several of the early researchers died of radiation poisoning, or of a blood cancer called leukaemia. X-rays could cause as well as fight cancer.
While Röntgen worked with X-rays, another form of radiation – radioactivity – was discovered, this time in France. Henri Becquerel (1852–1908) was studying phosphorescence, the way in which some substances go on glowing after they have been exposed to light. He was using a compound of uranium that did just that. When he discovered that this compound affected a photographic plate, just as Röntgen’s X-rays had done, he assumed that he had discovered another source of these mysterious rays. But Becquerel found in 1896 that his rays did not behave like Röntgen’s. They were a different kind of radiation, without the obvious dramatic effects of the X-rays that could ‘see’ through clothes or skin, but still worth another look.
In Paris, this challenge was taken up by the famous husband and wife physicists, Pierre and Marie Curie (1859–1906; 1867–1934). In 1898, the Curies obtained a tonne of pitchblende, crude tar-like stuff that contains some uranium. As they were extracting their relatively pure uranium, radioactivity burned their hands. They also discovered two new radioactive elements, which they named polonium, after Marie’s native Poland, and radium. As these elements had properties similar to uranium’s, scientists around the world pressed to find out more about their powerful rays. These were the beta rays (streams of electrons); the alpha rays (later shown by Rutherford to be helium atoms stripped of their electrons, and so positively charged); and gamma rays (without charge, but later shown to be electromagnetic radiation similar to X-rays).
The Curies were truly heroic in their dedication to science. After Pierre was killed in a street accident, Marie continued their work, despite having their two young children to look after.
The ancient promise of alchemy, to see one element change into another, was almost fulfilled by the discovery of radioactivity. Almost, because the alchemists’ dream had been to change lead or some other base metal into gold; what radioactivity did was transmute uranium into lead, a valuable metal into a base one! Still.
Nature could do what the alchemists had merely dreamed of. Like X-rays, radioactivity had important medical uses. Radium, the second of the Curies’ new elements, was especially valued. Its rays could kill cancer cells. But, like X-rays, radioactivity also causes cancer if the dose is too high. Many early workers, including Marie Curie, died from the effects of radiation, before proper safety guidelines were worked out. Her daughter, Irène, won her own Nobel Prize for work in the same field, and also died early, of leukaemia brought on by years of exposure.
Uranium, thorium, polonium and radium are naturally radioactive. What does this mean? These radioactive elements are what physicists call ‘heavy’. Their nucleus is very tightly packed and this makes it unstable. It is this instability that we detect as the radioactive rays. It was called ‘radioactive decay’ because when particles were lost, the element did literally decay, becoming a different element and taking up a different place in the periodic table. Studying this decay carefully continued that vital work of filling in the knowledge-gaps in the periodic table.
It also provided a valuable way of dating events in the earth’s history, a process called ‘radiometric dating’. Ernest Rutherford was a pioneer in this development, too, suggesting in 1905 that the technique could help to establish the age of the earth. Physicists calculated how long it would take for half of the atoms in a naturally radioactive element (uranium, for example) to decay away into its stable end product (lead, in this case). This period of time was called the element’s half-life.
Elements’ half-lives can vary from fractions of a second to billions of years. Once they knew an element’s half-life, scientists could date an event by looking in a fossil or a rock (any naturally occurring sample) to see how much there was of the original element and how much of the decayed one. The ratio between the two would tell them the age of the sample. One unusual form of carbon (carbon-14) is naturally radioactive, and its half-life can be used to date the remains of once-living animals and plants up to about fifty thousand years old. All living things take up carbon through their lifetime. When they die this stops. So measuring the amount of radioactive carbon left in such remains provides a date for their formation. Radiometric dating uses the same principle to date rocks, which gives a much longer time frame. The technique has transformed the study of fossils, because they are no longer just older or younger than each other – we know their approximate age.
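To sketch the arithmetic involved (in modern notation, not as the early physicists would have written it): if a sample starts with $N_0$ atoms of the parent element and the half-life is $T$, then the number remaining after a time $t$, and the age worked back from the measured ratio, are

$$N(t) = N_0 \left(\tfrac{1}{2}\right)^{t/T}, \qquad t = T\,\frac{\ln\big(N_0/N(t)\big)}{\ln 2}.$$

Radioactive carbon, for instance, has a half-life of about 5,730 years, so a sample in which only a quarter of the original radioactive carbon remains has lived through two half-lives and is roughly 11,500 years old.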
Physicists quickly saw that enormous amounts of energy were involved in radioactive emissions. Naturally radioactive elements like uranium, and the radioactive forms of common elements like carbon, are scarce. But by bombarding atoms with alpha particles or neutrons, physicists found they could make many other elements artificially radioactive, so that they too gave off this energy. This showed how much energy is packed into the atom’s nucleus. Finding out how to make use of this potential has driven many physicists for the past hundred years.
When a heavy atom such as uranium absorbs a neutron, its nucleus can split into two lighter nuclei, making different elements and throwing out more neutrons as it does so. This is nuclear fission.
The alternative, nuclear fusion, occurs when two light nuclei join together to make a heavier element, further along the periodic table. Both fission and fusion release energy. The fission of uranium was demonstrated in the late 1930s by German and Austrian scientists, among them the physicist Lise Meitner (1878–1968), who explained what was happening. Born a Jew, Meitner had converted to Christianity, but she still had to flee Nazi Germany in 1938. Fusion was already known from the heavens: the joining of hydrogen atoms to form helium, the next element on the periodic table, was shown to be the main source of the energy of the sun and other stars. (Helium was discovered in the sun before it was found on earth: its atoms show characteristic wavelengths when examined with an instrument called the spectroscope.) Fusion needs very high temperatures, and in the 1930s it could not be achieved in the laboratory. But in theory, you could make a hydrogen bomb (a fusion bomb) that would release a vast amount of energy when it exploded.
In the 1930s, the alternative – the atomic or fission bomb – was more do-able. As the Nazis continued their aggression in Europe, war seemed increasingly likely. Scientists in several countries, including Germany, worked secretly towards preparing such devastating weapons. Crucial in this horrifying dance towards total war was the work of the Italian physicist Enrico Fermi (1901–54).
Fermi and his group showed that bombarding atoms with ‘slow’ neutrons would cause the desired nuclear fission. Slow neutrons were passed through paraffin (or a similar substance) on the way to their target atom. At this reduced speed they were more likely to lodge in the nucleus, causing it to split. Fermi left Italy in 1938 to escape its Fascist regime, which was sympathetic to the Nazis. He went to the United States, as did so many of the most creative scientists (and writers, artists and thinkers) in the period. Today we sometimes speak of the ‘brain drain’, meaning that the best ‘brains’ leave their homes for better working conditions in other countries: more money, a bigger lab, a better chance to live their lives as they wish. People in the late 1930s and early 1940s fled because they had been sacked from their jobs and feared for their lives. The Nazis and Fascists did many horrific things. They also changed the face of science, and Britain and the United States gained most from this enforced brain drain.
In the USA, many of the refugees would join the top-secret ‘Manhattan Project’. This was one of the most expensive scientific projects ever undertaken, but these were increasingly desperate times. By the late 1930s, the dramatic improvements in understanding the radioactive elements convinced many physicists that they could create a nuclear explosion. The difficulty was in controlling it. Some thought it would be too dangerous: the resulting chain reaction would simply blow up the whole planet. When war was declared in 1939, physicists in Britain and the USA believed that scientists in Germany and Japan would continue to work towards an atomic bomb and that the Allies must do the same. A number of scientists wrote to the American President, Franklin Roosevelt, urging him to authorise an Allied response. Among them was Albert Einstein, the world’s most famous scientist and also a refugee from Nazi Germany.
Roosevelt agreed. At sites in Tennessee, Chicago and New Mexico, the many components of the fateful step were coordinated. The Manhattan Project was run along military lines. Scientists stopped publishing their findings. They put aside science’s core value of openness and sharing information. War changes human values. The secret was not even shared with Communist Russia, a key ally of the USA and Britain but one that was not trusted when it came to top-secret bombs. By 1945, German, Japanese and Russian efforts to build atomic bombs had still not got very far, even though one of the scientists in the US secretly fed the Russians information. But the Manhattan Project had produced two bombs.
One used uranium, the other plutonium, a man-made radioactive element. A smaller test bomb was exploded in the American desert. It worked. The bombs were ready for use. Germany surrendered on 8 May 1945, so no bomb was dropped in Europe. Japan continued its aggression in the Pacific. The new US President, Harry Truman, ordered the uranium bomb to be dropped on the Japanese city of Hiroshima on 6 August. It was detonated by firing one piece of uranium into another. The Japanese still did not surrender. Truman ordered the plutonium bomb to be dropped on a second Japanese city, Nagasaki, three days later. That finally ended the war: the bombs had killed about 300,000 people, mostly civilians, and Japan surrendered.
Everyone now saw the astounding power of nuclear energy. Our world was changed forever. Many of the scientists who had made these weapons of mass destruction knew their achievements had ended a terrible war, but worried about what they had created.
The incredible power of atomic energy continues to be important in our world. So, also, do its dangers. Mistrust between Russia and the USA continued after the Second World War, developing into the ‘Cold War’. Both countries built up vast stores of atomic or nuclear weapons. Fortunately, they have not yet been used in anger, and although the stockpiles have been reduced over the years, through agreement, the number of nations that have nuclear weapons has grown.
The physics that was learned during the Manhattan Project has also been used to produce a more controlled release of energy. Nuclear power can generate electricity with only a fraction of the greenhouse gases released by burning coal and other fossil fuels.
France generates almost three-quarters of its electricity by nuclear power, for instance. But the dangers of accidents and risks from terrorism have made many fearful of nuclear power, despite its benefits. Few things in modern science and technology better illustrate the mix of politics and social values than does the question: What should we do with our knowledge of nuclear energy?