Nuclear technology is technology that involves the reactions of atomic nuclei. Among the notable nuclear technologies are nuclear reactors, nuclear medicine, and nuclear weapons. It is also used, among other things, in smoke detectors and gun sights.
The vast majority of common, natural phenomena on Earth only involve gravity and electromagnetism, and not nuclear reactions. This is because atomic nuclei are generally kept apart because they contain positive electrical charges and therefore repel each other.
In 1896, Henri Becquerel was investigating phosphorescence in uranium salts when he discovered a new phenomenon which came to be called radioactivity.[1] He, Pierre Curie and Marie Curie began investigating the phenomenon. In the process, they isolated the element radium, which is highly radioactive. They discovered that radioactive materials produce intense, penetrating rays of three distinct sorts, which they labeled alpha, beta, and gamma after the first three Greek letters. Some of these kinds of radiation could pass through ordinary matter, and all of them could be harmful in large amounts. All of the early researchers received various radiation burns, much like sunburn, and thought little of it.
The new phenomenon of radioactivity was seized upon by the manufacturers of quack medicine (as had the discoveries of electricity and magnetism, earlier), and a number of patent medicines and treatments involving radioactivity were put forward.
Gradually it was realized that the radiation produced by radioactive decay was ionizing radiation, and that even quantities too small to burn could pose a severe long-term hazard. Many of the scientists working on radioactivity died of cancer as a result of their exposure. Radioactive patent medicines mostly disappeared, but other applications of radioactive materials persisted, such as the use of radium salts to produce glowing dials on meters.
As the atom came to be better understood, the nature of radioactivity became clearer. Some larger atomic nuclei are unstable, and so decay (release matter or energy) after a random interval. The three forms of radiation that Becquerel and the Curies discovered are also more fully understood. Alpha decay is when a nucleus releases an alpha particle, which is two protons and two neutrons, equivalent to a helium nucleus. Beta decay is the release of a beta particle, a high-energy electron. Gamma decay releases gamma rays, which unlike alpha and beta radiation are not matter but electromagnetic radiation of very high frequency, and therefore very high energy. This type of radiation is the most dangerous and the most difficult to block. All three types of radiation occur naturally in certain elements.
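The "random interval" before an unstable nucleus decays is characterized statistically by the isotope's half-life: after each half-life, on average half of the remaining unstable nuclei have decayed. A minimal sketch of this decay law (the time values are illustrative, not tied to any particular isotope):

```python
# Exponential decay: the fraction of unstable nuclei remaining after
# time t is 0.5 ** (t / half_life). Time units are illustrative.

def fraction_remaining(t: float, half_life: float) -> float:
    """Expected fraction of the original nuclei still undecayed at time t."""
    return 0.5 ** (t / half_life)

# After one half-life, half remains; after three half-lives, one eighth.
print(fraction_remaining(1.0, 1.0))  # → 0.5
print(fraction_remaining(3.0, 1.0))  # → 0.125
```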
It has also become clear that the ultimate source of most terrestrial energy is nuclear, either through radiation from the Sun caused by stellar thermonuclear reactions or by radioactive decay of uranium within the Earth, the principal source of geothermal energy.
In natural nuclear radiation, the byproducts are very small compared to the nuclei from which they originate. Nuclear fission is the process of splitting a nucleus into roughly equal parts, releasing energy and neutrons in the process. If these neutrons are captured by other fissionable nuclei, those nuclei can fission as well, leading to a chain reaction. The average number of neutrons released per nucleus that go on to fission another nucleus is referred to as k. Values of k larger than 1 mean that the fission reaction is releasing more neutrons than it absorbs, and the reaction is therefore referred to as a self-sustaining chain reaction. A mass of fissile material large enough (and in a suitable configuration) to induce a self-sustaining chain reaction is called a critical mass.
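The role of k can be shown with a toy calculation: starting from a single fission, generation n contains roughly k^n fissions, so the chain dies out, holds steady, or grows exponentially depending on whether k is below, at, or above 1. A minimal sketch (the k values are illustrative):

```python
# Toy illustration of the multiplication factor k: each neutron
# generation contains k times as many fissions as the previous one.

def fissions_in_generation(k: float, generation: int, start: float = 1.0) -> float:
    """Expected number of fissions in the given generation."""
    return start * k ** generation

# Subcritical (k < 1): the chain reaction dies out.
print(round(fissions_in_generation(0.9, 50), 4))  # → 0.0052
# Critical (k = 1): the reaction is self-sustaining at a constant rate.
print(fissions_in_generation(1.0, 50))            # → 1.0
# Supercritical (k > 1): the reaction grows exponentially.
print(round(fissions_in_generation(1.1, 50)))     # → 117
```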
When a neutron is captured by a suitable nucleus, fission may occur immediately, or the nucleus may persist in an unstable state for a short time. If there are enough immediate decays to carry on the chain reaction, the mass is said to be prompt critical, and the energy release will grow rapidly and uncontrollably, usually leading to an explosion.
When discovered on the eve of World War II, this insight led multiple countries to begin programs investigating the possibility of constructing an atomic bomb — a weapon which utilized fission reactions to generate far more energy than could be created with chemical explosives. The Manhattan Project, run by the United States with the help of the United Kingdom and Canada, developed multiple fission weapons which were used against Japan in 1945 at Hiroshima and Nagasaki. During the project, the first fission reactors were developed as well, though they were primarily for weapons manufacture and did not generate electricity.
In 1951, a nuclear reactor produced electricity for the first time, at the Experimental Breeder Reactor No. 1 (EBR-1) in Arco, Idaho, ushering in the "Atomic Age" of more intensive human energy use.[2]
However, if the mass is critical only when the delayed neutrons are included, then the reaction can be controlled, for example by the introduction or removal of neutron absorbers. This is what allows nuclear reactors to be built. Fast neutrons are not easily captured by nuclei; they must be slowed (becoming slow neutrons), generally by collision with the nuclei of a neutron moderator, before they can be easily captured. Today, this type of fission is commonly used to generate electricity.
If nuclei are forced to collide, they can undergo nuclear fusion. This process may release or absorb energy. When the resulting nucleus is lighter than that of iron, energy is normally released; when the nucleus is heavier than that of iron, energy is generally absorbed. This process of fusion occurs in stars, which derive their energy from fusing hydrogen into helium. Through stellar nucleosynthesis, stars form the light elements (lithium to calcium) as well as some of the heavy elements (beyond iron and nickel, via the s-process). The remaining abundance of heavy elements, from nickel to uranium and beyond, is due to supernova nucleosynthesis, the r-process.
Of course, these natural processes of astrophysics are not examples of nuclear "technology". Because of the very strong repulsion of nuclei, fusion is difficult to achieve in a controlled fashion. Hydrogen bombs obtain their enormous destructive power from fusion, but their energy cannot be controlled. Controlled fusion is achieved in particle accelerators; this is how many synthetic elements are produced. A fusor can also produce controlled fusion and is a useful neutron source. However, both of these devices operate at a net energy loss. Controlled, viable fusion power has proven elusive, despite the occasional hoax. Technical and theoretical difficulties have hindered the development of working civilian fusion technology, though research continues to this day around the world.
Nuclear fusion was initially pursued only in theoretical stages during World War II, when scientists on the Manhattan Project (led by Edward Teller) investigated it as a method to build a bomb. The project abandoned fusion after concluding that it would require a fission reaction to detonate. It took until 1952 for the first full hydrogen bomb to be detonated, so called because it used fusion reactions between deuterium and tritium, isotopes of hydrogen. Fusion reactions are much more energetic per unit mass of fuel than fission reactions, but starting the fusion chain reaction is much more difficult.
A nuclear weapon is an explosive device that derives its destructive force from nuclear reactions, either fission or a combination of fission and fusion. Both reactions release vast quantities of energy from relatively small amounts of matter. Even small nuclear devices can devastate a city by blast, fire and radiation. Nuclear weapons are considered weapons of mass destruction, and their use and control has been a major aspect of international policy since their debut.
The design of a nuclear weapon is more complicated than it might seem. Such a weapon must hold one or more subcritical fissile masses stable for deployment, then induce criticality (create a critical mass) for detonation. It also is quite difficult to ensure that such a chain reaction consumes a significant fraction of the fuel before the device flies apart. The procurement of a nuclear fuel is also more difficult than it might seem, since sufficiently unstable substances for this process do not currently occur naturally on Earth in suitable amounts.
One isotope of uranium, namely uranium-235, is naturally occurring and sufficiently unstable, but it is always found mixed with the more stable isotope uranium-238. The latter accounts for more than 99% of the weight of natural uranium. Therefore, some method of isotope separation based on the small difference in weight (three neutrons) must be performed to enrich (isolate) uranium-235.
Alternatively, the element plutonium possesses an isotope that is sufficiently unstable for this process to be usable. Terrestrial plutonium does not currently occur naturally in sufficient quantities for such use,[3] so it must be manufactured in a nuclear reactor.
Ultimately, the Manhattan Project manufactured nuclear weapons based on each of these elements. They detonated the first nuclear weapon in a test code-named "Trinity", near Alamogordo, New Mexico, on July 16, 1945. The test was conducted to ensure that the implosion method of detonation would work, which it did. A uranium bomb, Little Boy, was dropped on the Japanese city of Hiroshima on August 6, 1945, followed three days later by the plutonium-based Fat Man on Nagasaki. In the wake of unprecedented devastation and casualties from a single weapon, the Japanese government soon surrendered, ending World War II.
Since these bombings, no nuclear weapons have been deployed offensively. Nevertheless, they prompted an arms race to develop increasingly destructive bombs to provide a nuclear deterrent. Just over four years later, on August 29, 1949, the Soviet Union detonated its first fission weapon. The United Kingdom followed on October 2, 1952; France, on February 13, 1960; and China, on October 16, 1964. A radiological weapon is a type of nuclear weapon designed to distribute hazardous nuclear material in enemy areas. Such a weapon would not have the explosive capability of a fission or fusion bomb, but would kill many people and contaminate a large area. A radiological weapon has never been deployed. While considered useless by a conventional military, such a weapon raises concerns over nuclear terrorism.
There have been over 2,000 nuclear tests conducted since 1945. In 1963, all nuclear and many non-nuclear states signed the Limited Test Ban Treaty, pledging to refrain from testing nuclear weapons in the atmosphere, underwater, or in outer space. The treaty permitted underground nuclear testing. France continued atmospheric testing until 1974, while China continued until 1980. The last underground test by the United States was in 1992, by the Soviet Union in 1990, and by the United Kingdom in 1991; France and China continued testing until 1996. After signing the Comprehensive Test Ban Treaty in 1996 (which as of 2011 had not entered into force), all of these states pledged to discontinue all nuclear testing. Non-signatories India and Pakistan last tested nuclear weapons in 1998.
Nuclear weapons are the most destructive weapons known - the archetypal weapons of mass destruction. Throughout the Cold War, the opposing powers had huge nuclear arsenals, sufficient to kill hundreds of millions of people. Generations of people grew up under the shadow of nuclear devastation, portrayed in films such as Dr. Strangelove and The Atomic Cafe.
However, the tremendous energy release in the detonation of a nuclear weapon also suggested the possibility of a new energy source.
Nuclear power is a type of nuclear technology involving the controlled use of nuclear fission to release energy for work including propulsion, heat, and the generation of electricity. Nuclear energy is produced by a controlled nuclear chain reaction which creates heat—and which is used to boil water, produce steam, and drive a steam turbine. The turbine is used to generate electricity and/or to do mechanical work.
Nuclear power provided approximately 15.7% of the world's electricity in 2004 and is used to propel aircraft carriers, icebreakers and submarines (so far, economics and fears in some ports have prevented the use of nuclear power in transport ships).[6] All nuclear power plants use fission. No man-made fusion reaction has resulted in a viable source of electricity.
The medical applications of nuclear technology are divided into diagnostics and radiation treatment.
Imaging - The largest use of ionizing radiation in medicine is in medical radiography to make images of the inside of the human body using x-rays. This is the largest artificial source of radiation exposure for humans. Medical and dental x-ray imagers use cobalt-60 or other x-ray sources. A number of radiopharmaceuticals are used, sometimes attached to organic molecules, to act as radioactive tracers or contrast agents in the human body. Positron-emitting nuclides are used for high-resolution, short-time-span imaging in applications known as positron emission tomography.
Radiation is also used to treat diseases in radiation therapy.
Since some ionizing radiation can penetrate matter, they are used for a variety of measuring methods. X-rays and gamma rays are used in industrial radiography to make images of the inside of solid products, as a means of nondestructive testing and inspection. The piece to be radiographed is placed between the source and a photographic film in a cassette. After a certain exposure time, the film is developed and it shows any internal defects of the material.
Gauges - Thickness, density, and level gauges use the exponential absorption law of gamma rays: the fraction of radiation transmitted through a material decreases exponentially with the thickness and density of the material between source and detector.
Electrostatic control - To avoid the build-up of static electricity in production of paper, plastics, synthetic textiles, etc., a ribbon-shaped source of the alpha emitter 241Am can be placed close to the material at the end of the production line. The source ionizes the air to remove electric charges on the material.
Radioactive tracers - Since radioactive isotopes behave, chemically, mostly like the inactive element, the behavior of a certain chemical substance can be followed by tracing the radioactivity. Examples:
Oil and Gas Exploration - Nuclear well logging is used to help predict the commercial viability of new or existing wells. The technology involves the use of a neutron or gamma-ray source and a radiation detector which are lowered into boreholes to determine the properties of the surrounding rock such as porosity and lithology.[1]
Road Construction - Nuclear moisture/density gauges are used to determine the density of soils, asphalt, and concrete. Typically a cesium-137 source is used.
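The gauge entry above relies on the exponential absorption law, I = I0·exp(-μx): a detector on the far side of the material infers thickness (or density) from how much radiation gets through. A minimal sketch, with a hypothetical attenuation coefficient (real values depend on the material and the gamma energy):

```python
import math

# Sketch of a thickness gauge based on I = I0 * exp(-mu * x).
# MU is a hypothetical linear attenuation coefficient for illustration.
MU = 0.5  # 1/cm (assumed)

def transmitted_fraction(thickness_cm: float) -> float:
    """Fraction of gamma intensity passing through the given thickness."""
    return math.exp(-MU * thickness_cm)

def thickness_from_reading(i_over_i0: float) -> float:
    """Invert the absorption law to recover thickness from a detector reading."""
    return -math.log(i_over_i0) / MU

reading = transmitted_fraction(2.0)               # signal through a 2 cm sheet
print(round(thickness_from_reading(reading), 6))  # → 2.0 (cm recovered)
```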
In biology and agriculture, radiation is used to induce mutations to produce new or improved species, such as in atomic gardening. Another use in insect control is the sterile insect technique, where male insects are sterilized by radiation and released, so they have no offspring, to reduce the population.
In industrial and food applications, radiation is used for sterilization of tools and equipment. An advantage is that the object may be sealed in plastic before sterilization. An emerging use in food production is the sterilization of food using food irradiation.
Food irradiation[8] is the process of exposing food to ionizing radiation in order to destroy microorganisms, bacteria, viruses, or insects that might be present in the food. The radiation sources used include radioisotope gamma ray sources, X-ray generators and electron accelerators. Further applications include sprout inhibition, delay of ripening, increase of juice yield, and improvement of re-hydration. Irradiation is a more general term for the deliberate exposure of materials to radiation to achieve a technical goal (in this context 'ionizing radiation' is implied). As such it is also used on non-food items, such as medical hardware, plastics, tubes for gas pipelines, hoses for floor heating, shrink foils for food packaging, automobile parts, wires and cables (insulation), tires, and even gemstones. Compared to the amount of food irradiated, the volume of those everyday applications is huge but goes unnoticed by the consumer.
The genuine effect of processing food by ionizing radiation is damage to DNA, the basic genetic information for life. Microorganisms can no longer proliferate and continue their pathogenic or spoilage activities. Insects do not survive, or become incapable of procreation. Plants cannot continue the natural ripening or aging process. All these effects are beneficial to the consumer and the food industry alike.[8]
The amount of energy imparted for effective food irradiation is low compared to cooking the same food; even at a typical dose of 10 kGy, most food, which with regard to warming is physically equivalent to water, would warm by only about 2.5 °C (4.5 °F).
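The warming figure follows from the definitions: a gray (Gy) is one joule of absorbed energy per kilogram, and water takes roughly 4186 J to warm one kilogram by 1 °C, so a 10 kGy dose corresponds to a temperature rise of only a couple of degrees:

```python
# Arithmetic behind the warming claim: 1 Gy = 1 J/kg of absorbed dose.
dose_gy = 10_000      # 10 kGy, a typical food irradiation dose, in J/kg
specific_heat = 4186  # J/(kg*°C) for water, the thermal stand-in for food

delta_t = dose_gy / specific_heat
print(round(delta_t, 2))  # → 2.39, i.e. roughly the "about 2.5 °C" quoted above
```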
What distinguishes processing food by ionizing radiation is that the energy deposited per atomic transition is very high; it can cleave molecules and induce ionization (hence the name), which cannot be achieved by mere heating. This is the reason for new beneficial effects, but at the same time for new concerns. The treatment of solid food by ionizing radiation can provide an effect similar to heat pasteurization of liquids, such as milk. However, the use of the term cold pasteurization to describe irradiated foods is controversial, because pasteurization and irradiation are fundamentally different processes, although the intended end results can in some cases be similar.
Detractors of food irradiation have concerns about the health hazards of induced radioactivity.[citation needed] A report for the industry advocacy group American Council on Science and Health entitled "Irradiated Foods" states: "The types of radiation sources approved for the treatment of foods have specific energy levels well below that which would cause any element in food to become radioactive. Food undergoing irradiation does not become any more radioactive than luggage passing through an airport X-ray scanner or teeth that have been X-rayed."[9]
Food irradiation is currently permitted by over 40 countries and volumes are estimated to exceed 500,000 metric tons (490,000 long tons; 550,000 short tons) annually worldwide.[10][11][12]
Food irradiation is essentially a non-nuclear technology; it relies on ionizing radiation, which may be generated by electron accelerators and conversion into bremsstrahlung, but which may also use gamma rays from nuclear decay. There is a worldwide industry for processing by ionizing radiation, the majority of it, by number of facilities and by processing power, using accelerators. Food irradiation is only a niche application compared to medical supplies, plastic materials, raw materials, gemstones, cables and wires, etc.
Nuclear accidents, because of the powerful forces involved, are often very dangerous. Historically, the first incidents involved fatal radiation exposure. Marie Curie died from aplastic anemia, which resulted from her high levels of exposure. Two scientists, the American Harry Daghlian and the Canadian Louis Slotin, died after mishandling the same mass of plutonium. Unlike conventional weapons, intense light, heat, and explosive force are not the only deadly components of a nuclear weapon. Approximately half of those who died at Hiroshima and Nagasaki died two to five years afterward from radiation exposure.[4][5]
Civilian nuclear and radiological accidents primarily involve nuclear power plants. Most common are nuclear leaks that expose workers to hazardous material. A nuclear meltdown refers to the more serious hazard of releasing nuclear material into the surrounding environment. The most significant meltdowns occurred at Three Mile Island in Pennsylvania and Chernobyl in the Soviet Ukraine. The earthquake and tsunami on March 11, 2011 caused serious damage to three nuclear reactors and a spent fuel storage pond at the Fukushima Daiichi nuclear power plant in Japan. Military reactors that experienced similar accidents were Windscale in the United Kingdom and SL-1 in the United States.
Military accidents usually involve the loss or unexpected detonation of nuclear weapons. The Castle Bravo test in 1954 produced a larger yield than expected, which contaminated nearby islands and a Japanese fishing boat (with one fatality), and raised concerns about contaminated fish in Japan. From the 1950s through the 1970s, several nuclear bombs were lost from submarines and aircraft, some of which have never been recovered. The last twenty years[when?] have seen a marked decline in such accidents.
Proponents of nuclear energy note that nuclear-generated electricity annually avoids 470 million metric tons of carbon dioxide emissions that would otherwise come from fossil fuels.[13] Additionally, the comparatively small amount of waste that nuclear energy does create is safely disposed of by large-scale nuclear energy production facilities, or is repurposed or recycled for other energy uses.[14] Proponents of nuclear energy also call attention to the opportunity cost of utilizing other forms of electricity. For example, the Environmental Protection Agency estimates that coal kills 30,000 people a year as a result of its environmental impact,[15] while 60 people died in the Chernobyl disaster.[16] A real-world example of impact cited by proponents of nuclear energy is the 650,000-ton increase in carbon emissions in the two months following the closure of the Vermont Yankee nuclear plant.[17]
Original source: https://en.wikipedia.org/wiki/Nuclear technology.