This is a history of nuclear power, from the first artificial fission of atoms, through the Manhattan Project, to the eventual use of nuclear fission to generate electricity.
In 1932, physicists John Cockcroft, Ernest Walton, and Ernest Rutherford discovered that when lithium atoms were "split" by protons from a proton accelerator, immense amounts of energy were released in accordance with the principle of mass–energy equivalence.[1] However, they and other nuclear physics pioneers Niels Bohr and Albert Einstein believed harnessing the power of the atom for practical purposes anytime in the near future was unlikely.[2] The same year, Rutherford's doctoral student James Chadwick discovered the neutron.[3] Experiments bombarding materials with neutrons led Frédéric and Irène Joliot-Curie to discover induced radioactivity in 1934, which allowed the creation of radium-like elements.[4] Further work by Enrico Fermi in the 1930s focused on using slow neutrons to increase the effectiveness of induced radioactivity. Experiments bombarding uranium with neutrons led Fermi to believe he had created a new transuranic element, which was dubbed hesperium.[5]
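The energy release Cockcroft and Walton observed can be checked directly from mass–energy equivalence. As an illustrative sketch (the atomic masses below are standard reference values, not figures from this article), splitting lithium-7 with a proton yields two helium-4 nuclei, and the mass defect corresponds to roughly 17 MeV per reaction:

```python
# Sketch: energy released when a proton splits lithium-7 into two alpha
# particles, via E = delta_m * c^2. Atomic masses in unified mass units (u);
# values are standard reference figures, not taken from this article.
M_LI7 = 7.016003    # lithium-7
M_H1 = 1.007825     # hydrogen-1 (the incoming proton, as neutral H)
M_HE4 = 4.002602    # helium-4 (alpha particle)
U_TO_MEV = 931.494  # energy equivalent of 1 u, in MeV

mass_defect = (M_LI7 + M_H1) - 2 * M_HE4  # mass that "disappears"
energy_mev = mass_defect * U_TO_MEV       # released as kinetic energy

print(f"mass defect: {mass_defect:.6f} u")
print(f"energy released: {energy_mev:.1f} MeV per reaction")
```

Per reaction this is millions of times the energy of a chemical bond, which is why the result was so striking, even though the accelerator consumed far more energy overall than the rare reactions released.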
In 1938, German chemists Otto Hahn[6] and Fritz Strassmann, along with Austrian physicist Lise Meitner[7] and Meitner's nephew, Otto Robert Frisch,[8] conducted experiments with the products of neutron-bombarded uranium, as a means of further investigating Fermi's claims. They determined that the relatively tiny neutron split the nucleus of the massive uranium atoms into two roughly equal pieces, contradicting Fermi.[5] This was an extremely surprising result; all other forms of nuclear decay involved only small changes to the mass of the nucleus, whereas this process, dubbed "fission" as a reference to biology, involved a complete rupture of the nucleus. Numerous scientists, Leó Szilárd among the first, recognized that if fission reactions released additional neutrons, a self-sustaining nuclear chain reaction could result.[9][10] Once this was experimentally confirmed and announced by Frédéric Joliot-Curie in 1939, scientists in many countries (including the United States, the United Kingdom, France, Germany, and the Soviet Union), on the cusp of World War II, petitioned their governments to support nuclear fission research for the development of a nuclear weapon.[11]
In the United States, where Fermi and Szilárd had both emigrated, the discovery of the nuclear chain reaction led to the creation of the first man-made reactor, the research reactor known as Chicago Pile-1, which achieved criticality on December 2, 1942. The reactor's development was part of the Manhattan Project, the Allied effort to create atomic bombs during World War II. It led to the building of larger single-purpose production reactors, such as the X-10 Pile, for the production of weapons-grade plutonium for use in the first nuclear weapons. The United States tested the first nuclear weapon in July 1945, the Trinity test, with the atomic bombings of Hiroshima and Nagasaki taking place one month later.
In August 1945, the first widely distributed account of nuclear energy, the pocketbook The Atomic Age,[14] was released. It discussed the peaceful future uses of nuclear energy and depicted a future where fossil fuels would go unused. Nobel laureate Glenn Seaborg, who later chaired the United States Atomic Energy Commission, is quoted as saying "there will be nuclear powered earth-to-moon shuttles, nuclear powered artificial hearts, plutonium heated swimming pools for SCUBA divers, and much more".[15]
In the same month, with the end of the war, Seaborg and others filed hundreds of initially classified patents,[10] most notably Eugene Wigner and Alvin Weinberg's Patent #2,736,696 on a conceptual light water reactor (LWR), which would become the United States' primary reactor for naval propulsion and later take the greatest share of the commercial fission-electric landscape.[16] The United Kingdom, Canada,[17] and the USSR proceeded to research and develop nuclear energy over the course of the late 1940s and early 1950s.
Electricity was generated for the first time by a nuclear reactor on December 20, 1951, at the EBR-I experimental station near Arco, Idaho, which initially produced about 100 kW.[18][19] In 1953, American President Dwight Eisenhower gave his "Atoms for Peace" speech at the United Nations, emphasizing the need to develop "peaceful" uses of nuclear power quickly. This was followed by the Atomic Energy Act of 1954 which allowed rapid declassification of U.S. reactor technology and encouraged development by the private sector.
The F-1 (from "First Physical Reactor") was a research reactor operated by the Kurchatov Institute in Moscow. When started on December 25, 1946, it became the first nuclear reactor in Europe to achieve a self-sustaining nuclear chain reaction.[20]
The first organization to develop nuclear power was the U.S. Navy, with the S1W reactor built for the purpose of propelling submarines and aircraft carriers. The first nuclear-powered submarine, USS Nautilus, was put to sea in January 1954.[22][23] The S1W was a pressurized water reactor (PWR). This design was chosen because it was simpler, more compact, and easier to operate than the alternatives, and thus more suitable for use in submarines. The decision would make the PWR the reactor of choice for power generation as well, giving it a lasting impact on the civilian electricity market in the years to come.[24] The United States Navy's nuclear propulsion design and operation community, shaped by Admiral Hyman Rickover's exacting leadership, developed a strong culture of reactor safety.
On June 27, 1954, the Obninsk Nuclear Power Plant in the USSR became the world's first nuclear power plant to generate electricity for a power grid, producing around 5 megawatts of electric power.[25] The world's first commercial nuclear power station, Calder Hall at Windscale, England, was connected to the national power grid on 27 August 1956. In common with a number of other Generation I reactors, the plant had the dual purpose of producing electricity and plutonium-239, the latter for the nascent nuclear weapons program in Britain.[26] With an initial capacity of 50 MW per reactor (200 MW total),[27][28] it was the first of a fleet of dual-purpose Magnox reactors.[29]
The U.S. Army Nuclear Power Program formally commenced in 1954. Under its management, the 2 megawatt SM-1 at Fort Belvoir, Virginia, was the first reactor in the United States to supply electricity in an industrial capacity to the commercial grid (VEPCO), in April 1957.[30] The first commercial nuclear station to become operational in the United States was the 60 MW Shippingport Reactor in Pennsylvania, in December 1957.[31] Originating from a cancelled nuclear-powered aircraft carrier contract, the plant used a PWR design.[32] Its early adoption, technological lock-in,[33] and familiarity among retired naval personnel established the PWR as the predominant civilian reactor design, a position it still retains in the United States today.
In 1957 EURATOM was launched alongside the European Economic Community (the latter is now the European Union). The same year also saw the launch of the International Atomic Energy Agency (IAEA).
The first major accident at a nuclear reactor occurred at the 3 MW SL-1, a U.S. Army experimental nuclear power reactor at the National Reactor Testing Station, Idaho National Laboratory. Derived from the BORAX boiling water reactor (BWR) design, it first achieved operational criticality and connection to the grid in 1958. For reasons unknown, in 1961 a technician removed a control rod about 22 inches farther than the prescribed 4 inches. This resulted in a steam explosion which killed the three crew members and caused a meltdown.[34][35] The event was eventually rated at 4 on the seven-level INES scale. On September 16, 1967, the Pathfinder Nuclear Station near Sioux Falls, South Dakota, suffered an accident when an operator opened a valve too fast, causing the year-old plant to be retired permanently. Another serious accident happened in 1968, when one of the two liquid-metal-cooled reactors on board the Soviet submarine K-27 underwent a fuel element failure, emitting gaseous fission products into the surrounding air. This resulted in 9 crew fatalities and 83 injuries.[36]
The total global installed nuclear capacity rose relatively quickly at first, from less than 1 gigawatt (GW) in 1960 to 100 GW in the late 1970s and 300 GW in the late 1980s. Since the late 1980s worldwide capacity has risen much more slowly, reaching 366 GW in 2005. Between around 1970 and 1990, more than 50 GW of capacity was under construction (peaking at over 150 GW in the late 1970s and early 1980s); in 2005, around 25 GW of new capacity was planned. More than two-thirds of all nuclear plants ordered after January 1970 were eventually cancelled.[22] A total of 63 nuclear units were canceled in the United States between 1975 and 1980.[38]
In 1972 Alvin Weinberg, co-inventor of the light water reactor design (the most common nuclear reactor today), was fired from his job at Oak Ridge National Laboratory by the Nixon administration, "at least in part" over his concerns about the safety and wisdom of scaling the design up to ever larger sizes, especially above a power rating of ~500 MWe. In a loss-of-coolant accident scenario, the decay heat generated by such large, compact solid-fuel cores was thought to exceed what passive natural-convection cooling could remove, risking a rapid fuel-rod meltdown and potentially far-reaching fission-product plumes. While Weinberg considered the LWR well suited to the submarine and naval fleet at sea, he did not fully support its use by utilities on land at the power outputs they wanted for supply-scale reasons, and he requested a greater share of AEC research funding to evolve his team's demonstrated Molten-Salt Reactor Experiment,[39] a design with greater inherent safety in this scenario and, as he envisioned it, greater economic growth potential in the market of large-scale civilian electricity generation.[40][41][42]
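The decay-heat concern can be made concrete with the Way–Wigner approximation, a standard empirical formula (not from this article) for fission-product decay power after shutdown. A minimal sketch, assuming a hypothetical core of roughly 1,500 MW thermal (about 500 MWe) that has run at full power for a year:

```python
# Sketch: fission-product decay heat after reactor shutdown, using the
# empirical Way-Wigner approximation. The 1500 MW(thermal) core size
# (~500 MWe) and one year of operation are illustrative assumptions,
# not figures from this article.

def decay_heat_fraction(t_s: float, t_op_s: float) -> float:
    """Fraction of pre-shutdown thermal power, t_s seconds after shutdown,
    for a core operated t_op_s seconds (valid roughly 10 s to ~100 days)."""
    return 0.0622 * (t_s ** -0.2 - (t_op_s + t_s) ** -0.2)

P0 = 1500e6                   # W thermal; assumed ~500 MWe core
ONE_YEAR = 365 * 24 * 3600.0  # assumed prior full-power operating time

for t, label in [(60, "1 minute"), (3600, "1 hour"), (86400, "1 day")]:
    frac = decay_heat_fraction(t, ONE_YEAR)
    print(f"{label:>8} after shutdown: {frac*100:4.1f}% -> {P0*frac/1e6:5.1f} MW")
```

Even a day after shutdown this hypothetical core still produces several megawatts of heat with the chain reaction fully stopped; the question Weinberg raised was whether natural convection alone could reliably remove heat on that scale.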
Similar to the earlier BORAX reactor safety experiments conducted by Argonne National Laboratory,[43] in 1976 Idaho National Laboratory began a test program focused on LWRs under various accident scenarios, with the aim of understanding the event progression and the mitigating steps necessary to respond to a failure of one or more of the disparate systems; much of the redundant back-up safety equipment and many nuclear regulations draw on this series of destructive testing investigations.[44]
During the 1970s and 1980s rising economic costs (related to extended construction times largely due to regulatory changes and pressure-group litigation)[45] and falling fossil fuel prices made nuclear power plants then under construction less attractive. In the 1980s in the U.S. and 1990s in Europe, the flat electric grid growth and electricity liberalization also made the addition of large new baseload energy generators economically unattractive.
The 1973 oil crisis prompted countries that had relied heavily on oil for electric generation, such as France (39%)[46] and Japan (73%), to invest in nuclear power.[47] The French plan, known as the Messmer plan, aimed for complete independence from oil, envisaging the construction of 80 reactors by 1985 and 170 by 2000.[48] France would construct 25 fission-electric stations, installing 56 mostly PWR-design reactors over the next 15 years, though it forwent the 100 reactors initially charted in 1973 for the 1990s.[49][50] In 2019, 71% of French electricity was generated by 58 reactors, the highest percentage of any nation in the world.[51]
Some local opposition to nuclear power emerged in the U.S. in the early 1960s, beginning with the station proposed in 1958 at Bodega Bay, California, which produced conflict with local citizens; by 1964 the concept was abandoned.[52] In the late 1960s some members of the scientific community began to express pointed concerns.[53] These anti-nuclear concerns related to nuclear accidents, nuclear proliferation, nuclear terrorism and radioactive waste disposal.[54] In the early 1970s, there were large protests about a proposed nuclear power plant in Wyhl, Germany. The project was cancelled in 1975, and the anti-nuclear success at Wyhl inspired opposition to nuclear power in other parts of Europe and North America.[55][56] By the mid-1970s anti-nuclear activism had gained a wider appeal and influence, and nuclear power began to become an issue of major public protest.[57][58] In some countries, the nuclear power conflict "reached an intensity unprecedented in the history of technology controversies".[59][60] In May 1979, an estimated 70,000 people, including then governor of California Jerry Brown, attended a march against nuclear power in Washington, D.C.[61] Anti-nuclear power groups emerged in every country that had a nuclear power programme.
Globally during the 1980s one new nuclear reactor started up every 17 days on average.[62]
In the early 1970s, increased public hostility to nuclear power in the United States led the United States Atomic Energy Commission, and later the Nuclear Regulatory Commission, to lengthen the license procurement process, tighten engineering regulations and increase the requirements for safety equipment.[63][64] Together with relatively minor percentage increases in the total quantity of steel, piping, cabling and concrete per unit of installed nameplate capacity, the more notable changes to the regulatory open public hearing-response cycle for the granting of construction licenses stretched the time from project initiation to the pouring of first concrete from 16 months in 1967 to 32 months in 1972 and, finally, 54 months in 1980, which ultimately quadrupled the price of power reactors.[65][66]
Utility proposals in the U.S. for nuclear generating stations peaked at 52 in 1974, fell to 12 in 1976 and have never recovered,[67] in large part due to the pressure-group litigation strategy of launching lawsuits against each proposed U.S. construction project, keeping private utilities tied up in court for years; one such case reached the Supreme Court in 1978 (see Vermont Yankee Nuclear Power Corp. v. Natural Resources Defense Council, Inc.).[68] With permission to build a nuclear station in the U.S. eventually taking longer than in any other industrial country, the spectre of having to pay interest on large construction loans while the anti-nuclear movement used the legal system to produce delays made the viability of financing construction increasingly uncertain.[67] By the close of the 1970s it became clear that nuclear power would not grow nearly as dramatically as once believed.
Over 120 reactor proposals in the United States were ultimately cancelled[69] and the construction of new reactors ground to a halt. A cover story in the February 11, 1985, issue of Forbes magazine commented on the overall failure of the U.S. nuclear power program, saying it "ranks as the largest managerial disaster in business history".[70]
According to some commentators, the 1979 accident at Three Mile Island played a major part in the reduction in the number of new plant constructions in many other countries.[53] According to the Nuclear Regulatory Commission (NRC), the Three Mile Island accident was the most serious accident in "U.S. commercial nuclear power plant operating history, even though it led to no deaths or injuries to plant workers or members of the nearby community."[71] The regulatory uncertainty and delays eventually resulted in an escalation of construction-related debt that led to the bankruptcy of Seabrook's major utility owner, Public Service Company of New Hampshire,[72] at the time the fourth-largest bankruptcy in United States corporate history.[73]
Among American engineers, the cost increases from implementing the regulatory changes that resulted from the TMI accident were, when eventually finalized, only a few percent of total construction costs for new reactors, primarily relating to preventing safety systems from being turned off. The most significant engineering result of the TMI accident was the recognition that better operator training was needed and that the existing emergency core cooling system of PWRs worked better in a real-world emergency than members of the anti-nuclear movement had routinely claimed.[63][74]
The already slowing rate of new construction, along with the 1980s shutdown of two existing demonstration nuclear power stations in the Tennessee Valley, United States, when they could not economically meet the NRC's new, tighter standards, shifted electricity generation to coal-fired power plants.[75] In 1977, following the first oil shock, U.S. President Jimmy Carter made a speech calling the energy crisis the "moral equivalent of war" and prominently supporting nuclear power. However, nuclear power could not compete with cheap oil and gas, particularly after public opposition and regulatory hurdles made new nuclear prohibitively expensive.[76]
In 1982, against a backdrop of ongoing protests directed at the construction of the first commercial-scale breeder reactor in France, a later member of the Swiss Green Party fired five RPG-7 rocket-propelled grenades at the still-under-construction containment building of the Superphenix reactor. Two grenades hit and caused minor damage to the reinforced concrete outer shell. It was the first time protests had reached such heights. After examination of the superficial damage, the prototype fast breeder reactor started up and operated for over a decade.[77]
The Chernobyl disaster occurred on Saturday 26 April 1986, at the No. 4 reactor in the Chernobyl Nuclear Power Plant, near the city of Pripyat in the north of the Ukrainian SSR.[78] It is considered as the worst nuclear disaster in history both in terms of cost and casualties.[79] The initial emergency response, together with later decontamination of the environment, ultimately involved more than 500,000 personnel and cost an estimated 18 billion Soviet rubles—roughly US$68 billion in 2019, adjusted for inflation.[80][81]
According to some commentators, the Chernobyl disaster played a major part in the reduction in the number of new plant constructions in many other countries.[53] Unlike the Three Mile Island accident, the much more serious Chernobyl accident did not prompt new regulations or engineering changes affecting Western reactors, because the RBMK design, which lacks safety features such as "robust" containment buildings, was only used in the Soviet Union.[82] Over 10 RBMK reactors are still in use today. However, changes were made both to the RBMK reactors themselves (use of a safer enrichment of uranium) and to the control system (preventing safety systems from being disabled), among other things, to reduce the possibility of a similar accident.[83] Russia now largely relies upon, builds and exports a variant of the PWR, the VVER, with over 20 in use today.
The World Association of Nuclear Operators (WANO), an international organization to promote safety awareness and the professional development of operators in nuclear facilities, was created as a direct outcome of the 1986 Chernobyl accident. Its intent was to share and grow the adoption of nuclear safety culture, technology and community, where before there had been an atmosphere of Cold War secrecy.
Numerous countries, including Austria (1978), Sweden (1980) and Italy (1987) (influenced by Chernobyl) have voted in referendums to oppose or phase out nuclear power.
In the early 2000s, the nuclear industry was expecting a nuclear renaissance, an increase in the construction of new reactors, due to concerns about carbon dioxide emissions.[85] However, in 2009, Petteri Tiippana, the director of nuclear power plant division in the Finnish Radiation and Nuclear Safety Authority, told the BBC that it was difficult to deliver a Generation III reactor project on schedule because builders were not used to working to the exacting standards required on nuclear construction sites, since so few new reactors had been built in recent years.[86]
Olkiluoto 3 was the first EPR, a modernized PWR design, to start construction. Problems with workmanship and supervision created costly delays; the reactor is expected to cost three times the initial estimate and to be delivered more than 10 years behind schedule.[87]
In 2018 the MIT Energy Initiative study on the future of nuclear energy concluded that, for a worldwide renaissance to commence, a global standardization of regulations needs to take place, with a move towards the serial manufacturing of standardized units akin to aircraft and aviation, another complex engineering field; it also strongly suggested that governments financially support the development and demonstration of new Generation IV nuclear technologies. At present it is common for each country to demand bespoke changes to a design to satisfy its national regulatory body, often to the benefit of domestic engineering supply firms. The report goes on to note that the most cost-effective projects have been built with multiple (up to six) reactors per site using a standardized design, with the same component suppliers and construction crews working on each unit in a continuous work flow.[88]
Following the Tōhoku earthquake on 11 March 2011, one of the largest earthquakes ever recorded, and the subsequent tsunami off the coast of Japan, the Fukushima Daiichi Nuclear Power Plant suffered three core meltdowns after the loss of electricity supply caused the emergency cooling system to fail. This resulted in the most serious nuclear accident since the Chernobyl disaster.
The Fukushima Daiichi nuclear accident prompted a re-examination of nuclear safety and nuclear energy policy in many countries[89] and raised questions among some commentators over the future of the renaissance.[90][85] Germany approved plans to close all its reactors by 2022. (Following the energy crisis caused by the Russian invasion of Ukraine, Germany now plans to keep reactors running until April 2023.[91]) Italian nuclear energy plans[92] ended when Italy banned the generation, but not consumption, of nuclear electricity in a June 2011 referendum.[93][89] China, Switzerland, Israel, Malaysia, Thailand, the United Kingdom, and the Philippines reviewed their nuclear power programs.[94][95][96][97]
In 2011 the International Energy Agency halved its prior estimate of new generating capacity to be built by 2035.[98][99] Nuclear power generation had the biggest ever fall year-on-year in 2012, with nuclear power plants globally producing 2,346 TWh of electricity, a drop of 7% from 2011. This was caused primarily by the majority of Japanese reactors remaining offline that year and the permanent closure of eight reactors in Germany.[100]
The Associated Press and Reuters reported in 2011 suggestions that the safety and survival of the younger Onagawa Nuclear Power Plant, the reactor facility closest to the epicenter and on the coast, demonstrated that nuclear facilities can withstand even the greatest natural disasters. The Onagawa plant was also said to show that nuclear power can retain public trust, with surviving residents of the town of Onagawa taking refuge in the gymnasium of the nuclear facility following the destruction of their town.[101][102]
In February 2012, the U.S. NRC approved the construction of two reactors at the Vogtle Electric Generating Plant, the first such approval in 30 years.[103][104]
In August 2015, following four years of near-zero fission-electricity generation, Japan began restarting its nuclear reactors after safety upgrades were completed, beginning with the Sendai Nuclear Power Plant.[105]
By 2015, the IAEA's outlook for nuclear energy had become more promising. "Nuclear power is a critical element in limiting greenhouse gas emissions," the agency noted, and "the prospects for nuclear energy remain positive in the medium to long term despite a negative impact in some countries in the aftermath of the [Fukushima-Daiichi] accident...it is still the second-largest source worldwide of low-carbon electricity. And the 72 reactors under construction at the start of last year were the most in 25 years."[106] As of 2015, the global trend was for new nuclear power stations coming online to be balanced by the number of old plants being retired.[107] Eight new grid connections were completed by China in 2015.[108][109]
In 2016, the BN-800 sodium-cooled fast reactor in Russia began commercial electricity generation. While plans for a BN-1200 were initially conceived, the future of the fast reactor program in Russia awaits the results from MBIR, a multi-loop research facility under construction for testing the chemically more inert lead, lead-bismuth and gas coolants; it will similarly run on recycled MOX (mixed uranium and plutonium oxide) fuel. An on-site pyrochemical-processing, closed fuel-cycle facility is planned to recycle the spent fuel/"waste" and reduce the necessity for growth in uranium mining and exploration. In 2017 the manufacturing program for the reactor commenced, with the facility open to collaboration under the "International Project on Innovative Nuclear Reactors and Fuel Cycle"; its construction schedule includes an operational start in 2020. As planned, it will be the world's most powerful research reactor.[110]
In 2015, the Japanese government committed to the aim of restarting its fleet of 40 reactors by 2030 after safety upgrades, and to finish the construction of the Generation III Ōma Nuclear Power Plant.[111] This would mean that approximately 20% of electricity would come from nuclear power by 2030. As of 2018, some reactors have restarted commercial operation following inspections and upgrades with new regulations.[112] While South Korea has a large nuclear power industry, the new government in 2017, influenced by a vocal anti-nuclear movement,[113] committed to halting nuclear development after the completion of the facilities presently under construction.[114][115][116]
The bankruptcy of Westinghouse in March 2017, due to US$9 billion of losses from the halting of construction at the Virgil C. Summer Nuclear Generating Station in the U.S., is considered an advantage for eastern companies in the future export and design of nuclear fuel and reactors.[117]
In 2016, the U.S. Energy Information Administration projected for its "base case" that world nuclear power generation would increase from 2,344 terawatt hours (TWh) in 2012 to 4,500 TWh in 2040. Most of the predicted increase was expected to be in Asia.[118] As of 2018, there are over 150 nuclear reactors planned including 50 under construction.[119] In January 2019, China had 45 reactors in operation, 13 under construction, and plans to build 43 more, which would make it the world's largest generator of nuclear electricity.[120]
Zero-emission nuclear power is an important part of the climate change mitigation effort. Under the IEA Sustainable Development Scenario, by 2030 nuclear power and CCUS would generate 3,900 TWh globally while wind and solar generate 8,100 TWh, with the ambition of achieving net-zero CO2 emissions by 2070.[122] To achieve this goal, on average 15 GWe of nuclear power would need to be added annually.[123] As of 2019 over 60 GW of new nuclear power plants was under construction, mostly in China, Russia, Korea, India and the UAE.[123] Many countries are considering small modular reactors, with one in Russia connected to the grid in 2020.
Countries with at least one nuclear power plant in planning phase include Argentina, Brazil, Bulgaria, the Czech Republic, Egypt, Finland, Hungary, India, Kazakhstan, Poland, Saudi Arabia and Uzbekistan.[123]
The future of nuclear power varies greatly between countries, depending on government policies. Some countries, most notably Germany, have adopted policies of nuclear power phase-out. At the same time, some Asian countries, such as China[120] and India,[124] have committed to rapid expansion of nuclear power. In other countries, such as the United Kingdom[125] and the United States, nuclear power is planned to be part of the energy mix together with renewable energy.
Nuclear energy may be one solution to providing clean power while also reversing the impact fossil fuels have had on the climate.[126] Plants that pair nuclear generation with carbon dioxide capture would create a zero-emission energy source and make the process carbon-negative. Researchers estimate that 1.8 million lives have already been saved by replacing fossil fuel sources with nuclear power.[127]
As of 2019, the cost of extending plant lifetimes is competitive with other electricity generation technologies, including new solar and wind projects.[128] In the United States, the licenses of almost half of the operating nuclear reactors have been extended to 60 years.[129] The U.S. NRC and the U.S. Department of Energy have initiated research into light water reactor sustainability, which it is hoped will allow extensions of reactor licenses beyond 60 years, provided that safety can be maintained, to increase energy security and preserve low-carbon generation sources. Research into nuclear reactors that can last 100 years, known as Centurion Reactors, is being conducted.[130] As of 2020, a number of US nuclear power plants had been cleared by the Nuclear Regulatory Commission for operation up to 80 years.[131]
Following the 2022 Russian invasion of Ukraine, the situation changed. In the Versailles declaration of March 2022, the EU leaders of the 27 member states agreed to phase out the EU's dependence on Russian fossil fuels as soon as possible.[132] The World Economic Forum has published energy policy changes made following the Russian invasion.[133] Korea is planning to "increase renewables in electricity [...] [and] nuclear power to over 30%".[133] Japan has decided to "restart nuclear power plants aligned with the 6th Strategic Energy Plan [...]".[133] Germany decided to postpone the shutdown of its three remaining nuclear power plants until April 2023.[134]
Original source: https://en.wikipedia.org/wiki/History_of_nuclear_power