Electrification is the process of powering something with electricity and, in many contexts, the introduction of such power by switching over from an earlier power source. In the context of the history of technology and economic development, electrification refers to the build-out of electricity generation and electric power distribution systems. In the context of sustainable energy, it refers to the build-out of super grids with energy storage to accommodate the transition to renewable energy and the switch of end uses to electricity.
The electrification of particular sectors of the economy is referred to by terms such as factory electrification, household electrification, rural electrification and railway electrification. In the context of sustainable energy, terms such as transport electrification (referring to electric vehicles) or heating electrification (referring to heat pumps) are used. Electrification may also refer to converting industrial processes such as smelting, melting, separating or refining from coal or coke heating, or converting chemical processes, to some type of electric process such as the electric arc furnace, electric induction or resistance heating, or electrolysis and electrolytic separation.
Electrification was called "the greatest engineering achievement of the 20th Century" by the National Academy of Engineering,[1] and it continues in both rich and poor countries.[2][3]
Electric lighting is highly desirable. The light is much brighter than oil or gas lamps, and there is no soot. Although early electricity was very expensive compared to today, it was far cheaper and more convenient than oil or gas lighting. Electric lighting was so much safer than oil or gas that some companies were able to pay for the electricity with the insurance savings.[4]
In 1851, Charles Babbage stated:
One of the inventions most important to a class of highly skilled workers (engineers) would be a small motive power - ranging perhaps from the force of from half a man to that of two horses, which might commence as well as cease its action at a moment's notice, require no expense of time for its management and be of modest cost both in original cost and in daily expense.[5]
To be efficient, steam engines needed to be rated at several hundred horsepower. Steam engines and boilers also required operators and maintenance. For these reasons, the smallest commercial steam engines were about 2 horsepower, which was more power than many small shops needed. Also, a small steam engine and boiler cost about $7,000, while an old blind horse that could develop 1/2 horsepower cost $20 or less.[6] Machinery to use horses for power cost $300 or less.[7]
Many power requirements were less than that of a horse. Shop machines, such as woodworking lathes, were often powered with a one- or two-man crank. Household sewing machines were powered with a foot treadle; however, factory sewing machines were steam-powered from a line shaft. Dogs were sometimes used on machines such as a treadmill, which could be adapted to churn butter.[8]
In the late 19th century, specially designed power buildings leased space to small shops. These buildings supplied power to the tenants from a steam engine through line shafts.[8]
Electric motors were several times more efficient than small steam engines, both because central station generation was more efficient than small-scale steam power and because line shafts and belts had high friction losses.[9][8]
Electric motors were more efficient than human or animal power. The conversion efficiency for animal feed to work is between 4 and 5% compared to over 30% for electricity generated using coal.[10][11]
Electrification and economic growth are highly correlated.[12] In economics, the efficiency of electrical generation has been shown to correlate with technological progress.[10][12]
In the U.S. from 1870 to 1880, each man-hour was provided with 0.55 hp. By 1950, each man-hour was provided with 5 hp, an annual increase of about 2.8%, although the rate of growth declined to 1.5% between 1930 and 1950.[13] The period of electrification of factories and households, from 1900 to 1940, was one of high productivity and economic growth.
Most studies of electrification and electric grids have focused on the industrial core countries of Europe and the United States. Elsewhere, wired electricity was often carried on and through the circuits of colonial rule. Some historians and sociologists have considered the interplay of colonial politics and the development of electric grids: in India, Rao[14] showed that linguistics-based regional politics, not techno-geographical considerations, led to the creation of two separate grids; in colonial Zimbabwe (Rhodesia), Chikowero[15] showed that electrification was racially based, serving the white settler community while excluding Africans; and in Mandate Palestine, Shamir[16] argued that British electric concessions to a Zionist-owned company deepened the economic disparities between Arabs and Jews.
While electrification of cities and homes has existed since the late 19th century, about 840 million people (mostly in Africa) had no access to grid electricity in 2017, down from 1.2 billion in 2010.[18]
Vast gains in electrification were seen in the 1970s and 1980s—from 49% of the world's population in 1970 to 76% in 1990.[19][20] By the early 2010s, 81–83% of the world's population had access to electricity.[21]
Clean energy, such as renewable energy or nuclear power, is mostly generated in the form of electricity. Switching to these energy sources therefore requires that end uses, such as transport and heating, be electrified for the world's energy systems to be sustainable.
In the U.S. and Canada the use of heat pumps (HP) is economic if powered with solar photovoltaic (PV) devices to offset propane heating in rural areas[23] and natural gas heating in cities.[24] A 2023 study[25] investigated: (1) a residential natural gas-based heating system and grid electricity, (2) a residential natural gas-based heating system with PV to serve the electric load, (3) a residential HP system with grid electricity, and (4) a residential HP+PV system. It found that under typical inflation conditions, the lifecycle cost of natural gas and reversible, air-source heat pumps are nearly identical, which in part explains why heat pump sales have surpassed gas furnace sales in the U.S. for the first time during a period of high inflation.[26] With higher rates of inflation or lower PV capital costs, PV becomes a hedge against rising prices and encourages the adoption of heat pumps by also locking in both electricity and heating cost growth. The study[25] concludes: "The real internal rate of return for such prosumer technologies is 20x greater than a long-term certificate of deposit, which demonstrates the additional value PV and HP technologies offer prosumers over comparably secure investment vehicles while making substantive reductions in carbon emissions." This approach can be improved by integrating a thermal battery into the heat pump+solar energy heating system.[27][28]
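The economics described above can be illustrated with a simple lifecycle-cost comparison. The sketch below is not the cited study's model; all prices, efficiencies and price-growth rates are assumed round numbers chosen only to show how compounding fuel-price inflation can bring gas heating and heat-pump heating close to parity over a couple of decades.

```python
# Illustrative lifecycle-cost comparison of gas heating vs. an electric heat pump.
# All prices, efficiencies, and growth rates are assumptions, not figures from the study.

def lifecycle_cost(annual_heat_kwh, price_per_kwh_fuel, efficiency,
                   annual_price_growth, years=20):
    """Sum of yearly heating-energy costs with compounding fuel-price growth."""
    total, price = 0.0, price_per_kwh_fuel
    for _ in range(years):
        total += (annual_heat_kwh / efficiency) * price
        price *= 1 + annual_price_growth
    return total

ANNUAL_HEAT = 12_000                              # kWh of delivered heat per year (assumed)
gas = lifecycle_cost(ANNUAL_HEAT, price_per_kwh_fuel=0.04, efficiency=0.92,
                     annual_price_growth=0.05)    # condensing gas furnace (assumed)
heat_pump = lifecycle_cost(ANNUAL_HEAT, price_per_kwh_fuel=0.14, efficiency=3.0,
                           annual_price_growth=0.03)  # air-source heat pump, COP ~ 3 (assumed)

print(f"20-year gas heating cost:       ${gas:,.0f}")
print(f"20-year heat-pump heating cost: ${heat_pump:,.0f}")
```

With these assumed inputs the two totals land within a few percent of each other; raising the assumed gas-price growth rate tips the comparison toward the heat pump, which is the inflation-hedge effect the study describes.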
It is easier to sustainably produce electricity than it is to sustainably produce liquid fuels. Therefore, adoption of electric vehicles is a way to make transport more sustainable.[29] Hydrogen vehicles may be an option for larger vehicles which have not yet been widely electrified, such as long distance lorries.[30] While electric vehicle technology is relatively mature in road transport, electric shipping and aviation are still early in their development, hence sustainable liquid fuels may have a larger role to play in these sectors.[31]
A large fraction of the world population cannot afford sufficient cooling for their homes. In addition to air conditioning, which requires electrification and additional power demand, passive building design and urban planning will be needed to ensure cooling needs are met in a sustainable way.[32] Similarly, many households in the developing and developed world suffer from fuel poverty and cannot heat their houses enough.[33] Existing heating practices are often polluting.
A key sustainable solution to heating is electrification (heat pumps or, less efficiently, electric heaters). The IEA estimates that heat pumps currently provide only 5% of space and water heating requirements globally, but could provide over 90%.[34] Use of ground source heat pumps not only reduces total annual energy loads associated with heating and cooling, it also flattens the electric demand curve by eliminating the extreme summer peaks in electric supply requirements.[35] However, heat pumps and resistive heating alone will not be sufficient for the electrification of industrial heat, because several processes require higher temperatures than this equipment can deliver. For example, the production of ethylene via steam cracking requires temperatures as high as 900 °C, so fundamentally new processes are needed. Nevertheless, power-to-heat is expected to be the first step in the electrification of the chemical industry, with large-scale implementation expected by 2025.[36]
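The efficiency gap between heat pumps and resistive heating comes down to the coefficient of performance (COP): a heat pump moves heat rather than generating it, so it delivers several kilowatt-hours of heat per kilowatt-hour of electricity. A minimal sketch, assuming a typical air-source COP of about 3.5 and an invented annual heat demand:

```python
# Electricity needed to deliver the same heat with resistance heating vs. a heat pump.
# COP values and the heat demand are typical assumptions, not measured data.

heat_demand_kwh = 10_000      # annual space-heating demand (assumed)
cop_resistive = 1.0           # resistance heating: 1 kWh electricity -> 1 kWh heat
cop_heat_pump = 3.5           # typical air-source heat pump COP (assumed)

for name, cop in [("resistance heater", cop_resistive), ("heat pump", cop_heat_pump)]:
    electricity = heat_demand_kwh / cop
    print(f"{name:17s}: {electricity:,.0f} kWh electricity for {heat_demand_kwh:,} kWh of heat")
```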
Some cities in the United States have started prohibiting gas hookups for new houses, with state laws passed and under consideration to either require electrification or prohibit local requirements.[37] The UK government is experimenting with electrification for home heating to meet its climate goals.[38] Ceramic and induction heating for cooktops, as well as for industrial applications (for instance steam crackers), are examples of technologies that can be used to transition away from natural gas.[39]
Electricity is a "sticky" form of energy, in that it tends to stay in the continent or island where it is produced. It is also multi-sourced; if one source suffers a shortage, electricity can be produced from other sources, including renewable sources. As a result, in the long term it is a relatively resilient means of energy transmission.[40] In the short term, because electricity must be supplied at the same moment it is consumed, it is somewhat unstable, compared to fuels that can be delivered and stored on-site. However, that can be mitigated by grid energy storage and distributed generation.
Solar and wind are variable renewable energy sources that supply electricity intermittently depending on the weather and the time of day.[41][42] Most electrical grids were constructed for non-intermittent energy sources such as coal-fired power plants.[43] As larger amounts of solar and wind energy are integrated into the grid, changes have to be made to the energy system to ensure that the supply of electricity is matched to demand.[44] In 2019, these sources generated 8.5% of worldwide electricity, a share that has grown rapidly.[45]
There are various ways to make the electricity system more flexible. In many places, wind and solar production are complementary on daily and seasonal scales: there is more wind during the night and in winter, when solar energy production is low.[44] Linking distant geographical regions through long-distance transmission lines allows for further cancelling out of variability.[46] Energy demand can be shifted in time through energy demand management and the use of smart grids, matching consumption to the times when variable energy production is highest. With storage, energy produced in excess can be released when needed.[44] Building additional capacity for wind and solar generation can help to ensure that enough electricity is produced even during poor weather; during optimal weather, energy generation may have to be curtailed. The final mismatch may be covered by using dispatchable energy sources such as hydropower, bioenergy, or natural gas.[47]
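The interplay of storage, curtailment and dispatchable backup described above can be shown with a toy hourly balance. The demand and renewable-output profiles and the storage size below are invented for illustration; the sketch simply charges storage with any surplus, discharges it to cover deficits, curtails what the storage cannot absorb, and calls on dispatchable backup for the remainder.

```python
# Minimal hourly balancing sketch: variable renewable supply matched to demand
# using storage first, then dispatchable backup, with surplus curtailed.
# All profiles and capacities are invented for illustration.

demand     = [30, 28, 27, 35, 45, 50, 48, 40]   # MW per hour (assumed)
renewables = [20, 25, 40, 55, 60, 35, 20, 15]   # MW per hour (assumed)

storage, storage_cap = 0.0, 40.0                # MWh of storage (assumed)
for hour, (d, r) in enumerate(zip(demand, renewables)):
    surplus = r - d
    if surplus >= 0:
        charge = min(surplus, storage_cap - storage)   # store what fits
        storage += charge
        curtailed, backup = surplus - charge, 0.0
    else:
        discharge = min(-surplus, storage)             # cover deficit from storage
        storage -= discharge
        backup, curtailed = -surplus - discharge, 0.0  # rest from hydro/gas/bioenergy
    print(f"h{hour}: backup {backup:4.1f} MW, curtailed {curtailed:4.1f} MW, storage {storage:4.1f} MWh")
```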
Energy storage helps overcome barriers to intermittent renewable energy and is therefore an important aspect of a sustainable energy system.[48] The most commonly used storage method is pumped-storage hydroelectricity, which requires locations with large differences in height and access to water.[48] Batteries, and specifically lithium-ion batteries, are also deployed widely.[49] They contain cobalt, which is largely mined in the Democratic Republic of the Congo, a politically unstable region. More diverse geographical sourcing may ensure the stability of the supply chain, and environmental impacts can be reduced by downcycling and recycling.[50][51] Batteries typically store electricity for short periods; research is ongoing into technology with sufficient capacity to last through seasons.[52] Pumped hydro storage and power-to-gas with capacity for multi-month usage have been implemented in some locations.[53][54]
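The reason pumped-storage hydroelectricity needs a large height difference follows directly from the gravitational energy it stores. A back-of-the-envelope sketch, with an assumed reservoir volume, head and round-trip efficiency rather than data for any real plant:

```python
# Back-of-the-envelope energy capacity of a pumped-storage reservoir.
# Volume, head, and round-trip efficiency are assumed values, not a real site.

RHO, G = 1000, 9.81            # water density (kg/m^3), gravitational acceleration (m/s^2)
volume_m3 = 5_000_000          # usable reservoir volume (assumed)
head_m = 300                   # height difference between upper and lower reservoirs (assumed)
round_trip_eff = 0.78          # typical pumped-hydro round-trip efficiency (assumed)

energy_j = RHO * G * volume_m3 * head_m * round_trip_eff   # E = rho * g * V * h * eta
energy_mwh = energy_j / 3.6e9                              # joules -> megawatt-hours
print(f"Recoverable storage: {energy_mwh:,.0f} MWh")
```

With these assumptions the reservoir holds roughly 3,000 MWh; halving the head halves the stored energy, which is why mountainous sites dominate.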
As of 2018, thermal energy storage is typically not as convenient as burning fossil fuels. High upfront costs form a barrier for implementation. Seasonal thermal energy storage requires large capacity; it has been implemented in some high-latitude regions for household heat.[55]
The earliest commercial uses of electricity were electroplating and the telegraph.[56]
In the years 1831–1832, Michael Faraday discovered the operating principle of electromagnetic generators. The principle, later called Faraday's law, is based on an electromotive force generated in an electrical conductor that is subjected to a varying magnetic flux as, for example, a wire moving through a magnetic field. Faraday built the first electromagnetic generator, called the Faraday disk, a type of homopolar generator, using a copper disc rotating between the poles of a horseshoe magnet. Faraday's first electromagnetic generator produced a small DC voltage.
Around 1832, Hippolyte Pixii improved the magneto by using a wire-wound horseshoe, with the extra coils of conductor generating more current, although it was AC. André-Marie Ampère suggested a means of converting current from Pixii's magneto to DC using a rocking switch. Later, segmented commutators were used to produce direct current.[57]
Around 1838–40, William Fothergill Cooke and Charles Wheatstone developed a telegraph. In 1840 Wheatstone was using a magneto that he developed to power the telegraph. Wheatstone and Cooke made an important improvement in electrical generation by using a battery-powered electromagnet in place of a permanent magnet, which they patented in 1845.[58] The self-excited magnetic field dynamo did away with the battery used to power the electromagnets. This type of dynamo was made by several inventors in 1866.
The first practical generator, the Gramme machine, was made by Z.T. Gramme, who sold many of these machines in the 1870s. British engineer R.E.B. Crompton improved the generator to allow better air cooling and made other mechanical improvements. Compound winding, which gave more stable voltage with load, improved the operating characteristics of generators.[59]
The improvements in electrical generation technology in the 19th century increased its efficiency and reliability greatly. The first magnetos only converted a few percent of mechanical energy to electricity. By the end of the 19th century the highest efficiencies were over 90%.
Sir Humphry Davy invented the carbon arc lamp in 1802 upon discovering that electricity could produce a light arc with carbon electrodes. However, it was not used to any great extent until a practical means of generating electricity was developed.
Carbon arc lamps were started by making contact between two carbon electrodes, which were then separated to within a narrow gap. Because the carbon burned away, the gap had to be constantly readjusted. Several mechanisms were developed to regulate the arc. A common approach was to feed a carbon electrode by gravity and maintain the gap with a pair of electromagnets, one of which retracted the upper carbon after the arc was started and the second controlled a brake on the gravity feed.[8]
Arc lamps of the time had very intense light output – in the range of 4,000 candlepower (candelas) – and released a lot of heat, and they were a fire hazard, all of which made them inappropriate for lighting homes.[57]
In the 1850s, many of these problems were solved by the arc lamp invented by William Petrie and William Staite. The lamp used a magneto-electric generator and had a self-regulating mechanism to control the gap between the two carbon rods. Their light was used to light up the National Gallery in London and was a great novelty at the time. These arc lamps and similar designs, powered by large magnetos, were first installed on English lighthouses in the mid-1850s, but the technology suffered power limitations.[60]
The first successful arc lamp (the Yablochkov candle) was developed by Russian engineer Pavel Yablochkov using the Gramme generator. Its advantage lay in the fact that it did not require the use of a mechanical regulator like its predecessors. It was first exhibited at the Paris Exposition of 1878 and was heavily promoted by Gramme.[61] The arc light was installed along the half mile length of Avenue de l'Opéra, Place du Theatre Francais and around the Place de l'Opéra in 1878.[62]
R. E. B. Crompton developed a more sophisticated design in 1878 which gave a much brighter and steadier light than the Yablochkov candle. In 1878, he formed Crompton & Co. and began to manufacture, sell and install the Crompton lamp. His concern was one of the first electrical engineering firms in the world.
Various forms of incandescent light bulbs had numerous inventors; however, the most successful early bulbs were those that used a carbon filament sealed in a high vacuum. These were invented by Joseph Swan in 1878 in Britain and by Thomas Edison in 1879 in the US. Edison’s lamp was more successful than Swan’s because Edison used a thinner filament, giving it higher resistance and thus conducting much less current. Edison began commercial production of carbon filament bulbs in 1880. Swan's light began commercial production in 1881.[63]
Swan's house, in Low Fell, Gateshead, was the world's first to have working light bulbs installed. The Lit & Phil Library in Newcastle was the first public room lit by electric light,[64][65] and the Savoy Theatre was the first public building in the world lit entirely by electricity.[66]
The first central station providing public power is believed to be one at Godalming, Surrey, UK, in autumn 1881. The system was proposed after the town failed to reach an agreement on the rate charged by the gas company, so the town council decided to use electricity instead. The hydroelectrically powered system lit arc lamps on the main streets and incandescent lamps on a few side streets. By 1882 between 8 and 10 households were connected, with a total of 57 lights. The system was not a commercial success, and the town reverted to gas.[67]
The first large scale central distribution supply plant was opened at Holborn Viaduct in London in 1882.[68] Equipped with 1000 incandescent lightbulbs that replaced the older gas lighting, the station lit up Holborn Circus including the offices of the General Post Office and the famous City Temple church. The supply was a direct current at 110 V; due to power loss in the copper wires, this amounted to 100 V for the customer.
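The roughly 10 V lost between the Holborn Viaduct station and its customers is simple ohmic drop, which is also why low-voltage DC could only serve customers close to the plant. A minimal sketch, using an assumed feeder current and conductor resistance rather than historical figures:

```python
# Sketch of ohmic voltage drop on a low-voltage DC feeder, illustrating how a
# 110 V supply can arrive at roughly 100 V. Current and resistance are assumed.

supply_v = 110.0
load_current_a = 100.0        # total feeder current (assumed)
line_resistance_ohm = 0.10    # out-and-back copper resistance (assumed)

drop_v = load_current_a * line_resistance_ohm        # V = I * R
print(f"Voltage at customer: {supply_v - drop_v:.0f} V ({drop_v:.0f} V lost in the wires)")
```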
Within weeks, a parliamentary committee recommended passage of the landmark 1882 Electric Lighting Act, which allowed the licensing of persons, companies or local authorities to supply electricity for any public or private purposes.
The first large scale central power station in America was Edison's Pearl Street Station in New York, which began operating in September 1882. The station had six 200 horsepower Edison dynamos, each powered by a separate steam engine. It was located in a business and commercial district and supplied 110 volt direct current to 85 customers with 400 lamps. By 1884 Pearl Street was supplying 508 customers with 10,164 lamps.[69]
By the mid-1880s, other electric companies were establishing central power stations and distributing electricity, including Crompton & Co. and the Swan Electric Light Company in the UK, Thomson-Houston Electric Company and Westinghouse in the US, and Siemens in Germany. By 1890 there were 1,000 central stations in operation.[8] The 1902 census listed 3,620 central stations. By 1925 half of all power was provided by central stations.[70]
One of the biggest problems facing the early power companies was the variation in demand over the course of the day. When lighting was practically the only use of electricity, demand peaked in the hours before the workday and in the evening.[71] As a consequence, most early electric companies did not provide daytime service, with two-thirds providing no daytime service in 1897.[72]
The ratio of the average load to the peak load of a central station is called the load factor.[71] For electric companies to increase profitability and lower rates, it was necessary to increase the load factor. The way this was eventually accomplished was through motor load.[71] Motors are used more during daytime and many run continuously. Electric street railways were ideal for load balancing. Many electric railways generated their own power and also sold power and operated distribution systems.[4]
The load factor improved toward the turn of the 20th century; at Pearl Street, it increased from 19.3% in 1884 to 29.4% in 1908. By 1929, load factors around the world were greater than 50%, mainly due to motor load.[73]
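The effect of motor load on the load factor can be illustrated with two invented daily profiles: one with only a morning and evening lighting peak, and one with an added block of daytime motor load. The numbers are purely illustrative, not historical data.

```python
# Load factor = average load / peak load. Hourly profiles are invented to show how
# daytime motor load raises the load factor of a lighting-only station.

def load_factor(hourly_load):
    return sum(hourly_load) / len(hourly_load) / max(hourly_load)

# 24 hourly loads in kW: night, morning lighting peak, low daytime, evening peak
lighting_only = [0]*6 + [300]*2 + [50]*9 + [400]*5 + [100]*2
# add 250 kW of motor load during working hours (assumed 9 hours)
with_motors = [h + m for h, m in zip(lighting_only, [0]*8 + [250]*9 + [0]*7)]

print(f"lighting only:   {load_factor(lighting_only):.0%}")
print(f"with motor load: {load_factor(with_motors):.0%}")
```

With these assumed profiles, adding daytime motor load raises the load factor from roughly a third to well over half, without increasing the evening peak the station must be built to serve.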
Before widespread power distribution from central stations, many factories, large hotels, apartment and office buildings had their own power generation. Often this was economically attractive because the exhaust steam could be used for building and industrial process heat, which today is known as cogeneration or combined heat and power (CHP). Most self-generated power became uneconomical as power prices fell. As late as the early 20th century, isolated power systems greatly outnumbered central stations.[8] Cogeneration is still commonly practiced in many industries that use large amounts of both steam and power, such as pulp and paper, chemicals and refining. The continued use of private electric generators is called microgeneration.
The first commutator DC electric motor capable of turning machinery was invented by the British scientist William Sturgeon in 1832.[74] The crucial advance that this represented over the motor demonstrated by Michael Faraday was the incorporation of a commutator. This allowed Sturgeon's motor to be the first capable of providing continuous rotary motion.[75]
Frank J. Sprague improved on the DC motor in 1884 by solving the problem of maintaining a constant speed with varying load and reducing sparking from the brushes. Sprague sold his motor through Edison Co.[76] It is easy to vary speed with DC motors, which made them suited for a number of applications such as electric street railways, machine tools and certain other industrial applications where speed control was desirable.[8]
Manufacturing transitioned from line shafts and belt drives powered by steam engines and water power to electric motor drive.[4][9]
Although the first power stations supplied direct current, the distribution of alternating current soon became the most favored option. The main advantages of AC were that it could be transformed to high voltage to reduce transmission losses and that AC motors could easily run at constant speeds.
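The transmission-loss advantage of high voltage follows from Ohm's law: for the same power delivered, raising the voltage lowers the current, and resistive loss scales with the square of the current. A minimal sketch with assumed round numbers for line resistance and delivered power:

```python
# Why high-voltage transmission cuts losses: P_loss = I^2 * R and I = P / V,
# so doubling V quarters the loss. Resistance and power are assumed round numbers.

power_w = 1_000_000           # power to deliver (assumed)
line_resistance_ohm = 1.0     # conductor resistance (assumed)

for voltage in (2_000, 10_000, 50_000):
    current = power_w / voltage                   # I = P / V
    loss = current**2 * line_resistance_ohm       # P_loss = I^2 * R
    print(f"{voltage:>6} V: loss {loss/1000:7.1f} kW ({loss/power_w:.1%} of delivered power)")
```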
Alternating current technology was rooted in Faraday's 1830–31 discovery that a changing magnetic field can induce an electric current in a circuit.[77]
The first person to conceive of a rotating magnetic field was Walter Baily, who gave a workable demonstration of his battery-operated polyphase motor, aided by a commutator, to the Physical Society of London on June 28, 1879.[78] In 1880, the French electrical engineer Marcel Deprez published a paper describing an apparatus nearly identical to Baily's and identifying the rotating magnetic field principle and a two-phase AC system of currents to produce it.[79] In 1886, the English engineer Elihu Thomson built an AC motor by expanding upon the induction-repulsion principle and his wattmeter.[80]
It was in the 1880s that the technology was commercially developed for large-scale electricity generation and transmission. In 1882 the British inventor and electrical engineer Sebastian de Ferranti, working for the company Siemens, collaborated with the distinguished physicist Lord Kelvin to pioneer AC power technology, including an early transformer.[81]
A power transformer developed by Lucien Gaulard and John Dixon Gibbs was demonstrated in London in 1881 and attracted the interest of Westinghouse. They also exhibited the invention in Turin in 1884, where it was adopted for an electric lighting system. Many of their designs were adapted to the particular laws governing electrical distribution in the UK.
Sebastian Ziani de Ferranti went into this business in 1882 when he set up a shop in London designing various electrical devices. Ferranti believed in the success of alternating current power distribution early on and was one of the few experts in this system in the UK. With the help of Lord Kelvin, Ferranti pioneered the first AC power generator and transformer in 1882.[82] John Hopkinson, a British physicist, invented the three-wire system for the distribution of electrical power, for which he was granted a patent in 1882.[83]
The Italian inventor Galileo Ferraris invented a polyphase AC induction motor in 1885. The idea was that two out-of-phase, but synchronized, currents might be used to produce two magnetic fields that could be combined to produce a rotating field without any need for switching or moving parts. Other inventors were the American engineers Charles S. Bradley and Nikola Tesla, and the German technician Friedrich August Haselwander.[84] They were able to overcome the problem of starting the AC motor by using a rotating magnetic field produced by a polyphase current.[85] Mikhail Dolivo-Dobrovolsky introduced the first three-phase induction motor in 1890, a much more capable design that became the prototype used in Europe and the U.S.[86] By 1895 GE and Westinghouse both had AC motors on the market.[87] With single-phase current, either a capacitor or a coil (creating inductance) can be used on part of the circuit inside the motor to create a rotating magnetic field.[88] Multi-speed AC motors with separately wired poles have long been available, the most common being two-speed. The speed of these motors is changed by switching sets of poles on or off, which is done with a special motor starter for larger motors or a simple multiple-speed switch for fractional-horsepower motors.
The first AC power station was built by the English electrical engineer Sebastian de Ferranti. In 1887 the London Electric Supply Corporation hired Ferranti to design their power station at Deptford. He designed the building, the generating plant and the distribution system. It was built at the Stowage, a site to the west of the mouth of Deptford Creek once used by the East India Company. Built on an unprecedented scale and pioneering the use of high-voltage (10,000 V) alternating current, it generated 800 kilowatts and supplied central London. On its completion in 1891 it was the first truly modern power station, supplying high-voltage AC power that was then "stepped down" with transformers for consumer use on each street. This basic system remains in use today around the world.
In the U.S., George Westinghouse, who had become interested in the power transformer developed by Gaulard and Gibbs, began to develop his AC lighting system, using a transmission system that stepped the voltage up 20:1 for transmission and stepped it back down for use. In 1890 Westinghouse and Stanley built a system to transmit power several miles to a mine in Colorado. A decision was taken to use AC for power transmission from the Niagara Power Project to Buffalo, New York. Proposals submitted by vendors in 1890 included DC and compressed air systems. A combined DC and compressed air system remained under consideration until late in the schedule. Despite the protestations of the Niagara commissioner William Thomson (Lord Kelvin), the decision was taken to build an AC system, which had been proposed by both Westinghouse and General Electric. In October 1893 Westinghouse was awarded the contract to provide the first three 5,000 hp, 250 rpm, 25 Hz, two-phase generators.[89] The hydro power plant went online in 1895,[90] and it was the largest one built up to that date.[91]
By the 1890s, single and poly-phase AC was undergoing rapid introduction.[92] In the U.S. by 1902, 61% of generating capacity was AC, increasing to 95% in 1917.[93] Despite the superiority of alternating current for most applications, a few existing DC systems continued to operate for several decades after AC became the standard for new systems.
The efficiency of steam prime movers in converting the heat energy of fuel into mechanical work was a critical factor in the economic operation of steam central generating stations. Early projects used reciprocating steam engines, operating at relatively low speeds. The introduction of the steam turbine fundamentally changed the economics of central station operations. Steam turbines could be made in larger ratings than reciprocating engines, and generally had higher efficiency. The speed of steam turbines did not fluctuate cyclically during each revolution. This made parallel operation of AC generators feasible, and improved the stability of rotary converters for production of direct current for traction and industrial uses. Steam turbines ran at higher speed than reciprocating engines, not being limited by the allowable speed of a piston in a cylinder. This made them more compatible with AC generators with only two or four poles; no gearbox or belted speed increaser was needed between the engine and the generator. It was costly and ultimately impossible to provide a belt-drive between a low-speed engine and a high-speed generator in the very large ratings required for central station service.
The modern steam turbine was invented in 1884 by British engineer Sir Charles Parsons, whose first model was connected to a dynamo that generated 7.5 kW (10 hp) of electricity.[94] The invention of Parsons's steam turbine made cheap and plentiful electricity possible. Parsons turbines were widely introduced in English central stations by 1894; the first electric supply company in the world to generate electricity using turbo generators was Parsons's own electricity supply company Newcastle and District Electric Lighting Company, set up in 1894.[95] Within Parsons's lifetime, the generating capacity of a unit was scaled up by about 10,000 times.[96]
The first U.S. turbines were two De Laval units at Edison Co. in New York in 1895. The first U.S. Parsons turbine was at Westinghouse Air Brake Co. near Pittsburgh.[97]
Steam turbines also had capital cost and operating advantages over reciprocating engines. The condensate from steam engines was contaminated with oil and could not be reused, while condensate from a turbine is clean and typically reused. Steam turbines were a fraction of the size and weight of comparably rated reciprocating steam engines. Steam turbines can operate for years with almost no wear, whereas reciprocating steam engines required high maintenance. Steam turbines can also be manufactured with capacities far larger than any steam engine ever made, giving important economies of scale.
Steam turbines could be built to operate on higher pressure and temperature steam. A fundamental principle of thermodynamics is that the higher the temperature of the steam entering an engine, the higher the efficiency. The introduction of steam turbines motivated a series of improvements in temperatures and pressures. The resulting increased conversion efficiency lowered electricity prices.[98]
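The thermodynamic principle at work here is the Carnot limit: the maximum possible efficiency of any heat engine rises with the temperature of the steam entering it relative to the temperature at which heat is rejected. A small worked sketch, using example steam temperatures rather than figures from any particular plant:

```python
# Why hotter steam means higher efficiency: the Carnot limit rises with the
# temperature of the steam entering the turbine. Temperatures are example values.

def carnot_efficiency(t_hot_c, t_cold_c=30.0):
    """Maximum theoretical efficiency between a hot source and a cold sink."""
    t_hot, t_cold = t_hot_c + 273.15, t_cold_c + 273.15   # convert to kelvin
    return 1 - t_cold / t_hot                              # eta = 1 - T_cold / T_hot

for steam_c in (180, 300, 450, 565):
    print(f"steam at {steam_c:3d} °C: Carnot limit {carnot_efficiency(steam_c):.0%}")
```

Real plants achieve only a fraction of the Carnot limit, but the trend is the same: each step up in steam temperature and pressure raised the ceiling on conversion efficiency and so lowered the fuel cost per kilowatt-hour.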
The power density of boilers was increased by using forced combustion air and by using compressed air to feed pulverized coal. Also, coal handling was mechanized and automated.[99]
With the realization of long distance power transmission it was possible to interconnect different central stations to balance loads and improve load factors. Interconnection became increasingly desirable as electrification grew rapidly in the early years of the 20th century.
Charles Merz, of the Merz & McLellan consulting partnership, built the Neptune Bank Power Station near Newcastle upon Tyne in 1901,[100] which by 1912 had developed into the largest integrated power system in Europe.[101] In 1905 he tried to influence Parliament to unify the variety of voltages and frequencies in the country's electricity supply industry, but it was not until World War I that Parliament began to take the idea seriously, appointing him head of a parliamentary committee to address the problem. In 1916 Merz pointed out that the UK could use its small size to its advantage, by creating a dense distribution grid to feed its industries efficiently. His findings led to the Williamson Report of 1918, which in turn created the Electricity Supply Bill of 1919. The bill was the first step towards an integrated electricity system in the UK.
The more significant Electricity (Supply) Act of 1926 led to the setting up of the National Grid.[102] The Central Electricity Board standardised the nation's electricity supply and established the first synchronised AC grid, running at 132 kilovolts and 50 hertz. This started operating as a national system, the National Grid, in 1938.
In the United States, consolidating electricity supply became a national objective after the power crisis of the summer of 1918, in the midst of World War I. In 1934 the Public Utility Holding Company Act recognized electric utilities as public goods of importance, along with gas, water, and telephone companies, and subjected them to defined restrictions and regulatory oversight of their operations.[103]
The electrification of households in Europe and North America began in the early 20th century in major cities and in areas served by electric railways, and increased rapidly until about 1930, when 70% of households in the U.S. were electrified.
Rural areas were electrified first in Europe; in the U.S., the Rural Electrification Administration, established in 1935, brought electrification to underserved rural areas.[104]
In the Soviet Union, as in the United States, rural electrification progressed more slowly than in urban areas. It was not until the Brezhnev era that electrification became widespread in rural regions, with the Soviet rural electrification drive largely completed by the early 1970s.[105]
In China, the turmoil of the Warlord Era, the Civil War and the Japanese invasion in the early 20th century delayed electrification for decades. It was only after the establishment of the People's Republic of China in 1949 that the country was positioned to pursue widespread electrification. During the Mao years, while electricity became commonplace in cities, rural areas were largely neglected.[106] At the time of Mao's death in 1976, 25% of Chinese households still lacked access to electricity.[107]
Deng Xiaoping, who became China's paramount leader in 1978, initiated a rural electrification drive as part of a broader modernization effort. By the late 1990s, electricity had become ubiquitous in rural areas.[108] The last remote villages in China were connected to the grid in 2015.[109]
Central station electric power generation provided power more efficiently and at lower cost than small generators; the capital and operating costs per unit of power were also lower with central stations.[9] The cost of electricity fell dramatically in the first decades of the twentieth century due to the introduction of steam turbines and the improved load factor that followed the introduction of AC motors. As electricity prices fell, usage increased dramatically and central stations were scaled up to enormous sizes, creating significant economies of scale.[110] For the historical cost see Ayres-Warr (2002) Fig. 7.[11]
The society's lecture theatre was the first public room to be lit by electric light, during a lecture by Sir Joseph Swan on October 20, 1880.