Wildfire suppression in the United States has had a long and varied history. For most of the 20th century, any form of wildland fire, whether naturally caused or otherwise, was quickly suppressed for fear of uncontrollable and destructive conflagrations such as the Peshtigo Fire in 1871 and the Great Fire of 1910. In the 1960s, policies governing wildfire suppression changed due to ecological studies that recognized fire as a natural process necessary for new growth. Today, policies advocating complete fire suppression have been replaced by those that encourage wildland fire use, that is, allowing fire to act as a tool, as in the case of controlled burns.
Native American use of fire in ecosystems is part of the environmental cycles and maintenance of wildlife habitats that sustain the cultures and economies of the Indigenous peoples of the Americas. Indigenous peoples have used burning practices to manage, protect, and relate to their surroundings since time immemorial. According to sociologist Kari Norgaard: "Indigenous peoples have long set low-intensity fires to manage eco-cultural resources and reduce the buildup of fuels – flammable trees, grasses and brush – that cause larger, hotter and more dangerous fires, like the ones that have burned across the West in recent years. Before fire suppression, forests in the West experienced a mix of low- to high-severity fires for millennia. Large, high-severity fires played an important ecological role, yet their spread was limited by low-severity fires set by indigenous peoples"[1]
However, "Fire suppression was mandated by the very first session of the California Legislature in 1850," and with the institution of the Weeks Act in 1911, "cultural uses of fire" were essentially made "illegal and for the many decades following, less and less burning occurred while more and more vegetation grew. Over a century of policies of fire suppression have created the conditions for the catastrophic, high-intensity wildfires we are seeing today" according to the Karuk Tribe of Northern California's Climate Adaptation Plan.[2][3] Because many Indigenous groups viewed fire as a tool for ecosystem management, education, and a way of life, such suppression would lead to decreased food availability and breakdown of social and familial structures.[4][5][6] It has been argued by numerous scholars that such suppression should be seen as a form of "Colonial Ecological Violence," "which results in particular risks and harms experienced by Native peoples and communities."[7][8]
Eventually, without the small-scale management fires set by Indigenous peoples, wildfires would grow in size and severity because of the buildup of vegetation on the forest floor in combination with climate change.[9] As the U.S. Forest Service and environmental scientists have come to understand the long-term damage that such suppression has done, Indigenous peoples have provided the Forest Service a better understanding of how traditional burning practices are necessary for forests and people.[10][11]
In the eastern United States, with its significant rainfall, wildfires are relatively small and have rarely posed great risk to life and property. As settlement moved farther west into drier areas, the first large-scale fires were encountered. Range fires on the Great Plains and forest fires in the Rocky Mountains were far larger and more destructive than any that had been seen in the East.
Yellowstone National Park was established in 1872 as the world's first national park. Administration of the park languished until 1886, when the U.S. Army was assigned the responsibility for its protection. Upon its arrival in the park, the Army found numerous fires burning in developed areas as well as in areas where it was not reasonable to control them. The commanding officer decided that human-caused fires along roads posed the biggest threat and that the Army would concentrate its suppression efforts on the control of those fires. There were not enough soldiers to fight all of the fires. Thus came the first conscious decision by a manager of federal land to allow some fires to burn while others were controlled. The policy of fire suppression was also applied to Sequoia, General Grant, and Yosemite national parks when they were established in 1890, and Army patrols were initiated to guard against fires, livestock trespass, and illegal logging.[12]
A number of catastrophic fire events over the years greatly influenced fire management policies. The worst loss of life in United States history due to a wildfire occurred in 1871, when the Peshtigo Fire swept through Wisconsin, killing more than 1,500 people.[13] The Santiago Canyon Fire of 1889 in California and especially the Great Fire of 1910 in Montana and Idaho contributed to the philosophy that fire was a danger that needed to be suppressed.[14] The Great Fire of 1910 burned 3,000,000 acres (12,000 km2), destroyed a number of communities, and killed 86 people; this event prompted various land management agencies to emphasize wildfire suppression and led to the passage of the 1911 Weeks Act. U.S. Government land agencies, including the National Park Service, generally followed the fire management policies established by the U.S. Forest Service, which oversees the majority of the nation's forestlands. (See The Big Burn, a 2014 episode of the PBS documentary series American Experience (season 27).)
Before the middle of the 20th century, most forest managers believed that fires should be suppressed at all times.[15] By 1935, the U.S. Forest Service's fire management policy stipulated that all wildfires were to be suppressed by 10 a.m. the morning after they were first spotted.[16][17] Firefighting crews were established throughout public lands and were generally staffed by young men during the fire season. By 1940, firefighters known as smokejumpers were parachuting out of airplanes to extinguish flames in remote locations. By the beginning of World War II, over 8,000 fire lookout towers had been constructed in the United States. Though many have been torn down due to increased use of airplanes for fire spotting, three are still used each year in Yellowstone.[18][19] Firefighting efforts were highly successful, with the area burned by wildfires reduced from an annual average of 30,000,000 acres (120,000 km2) during the 1930s to between 2,000,000 acres (8,100 km2) and 5,000,000 acres (20,000 km2) by the 1960s.[15] The need for lumber during World War II was high, and fires that destroyed timberland were deemed unacceptable. In 1944, the U.S. Forest Service developed an ad campaign to help educate the public that all fires were detrimental, using a cartoon black bear named Smokey Bear. This iconic firefighting bear can still be seen on posters with the catchphrase "Only you can prevent wildfires".[20][21] Early posters of Smokey Bear misled the public into believing that western wildfires were predominantly human-caused; in Yellowstone, human-caused fires average between 6 and 10 annually, while about 35 are ignited by lightning.[19][22]
Some researchers, as well as some timber companies and private citizens, understood that fire was a natural state of affairs in many ecosystems. Fire would help clean out the understory and dead plant matter, allowing economically important tree species to grow with less competition for nutrients. Native Americans would often burn woodlands to reduce overgrowth and increase grasslands for large prey animals such as bison and elk.[23]
When the U.S. Forest Service was established in 1905, its primary task became suppressing all fires on the forest reserves it administered. In 1916, the National Park Service was established and took over park management from the Army. Following the Forest Service's approach, fire suppression became the only fire policy in the national parks and remained so for the next five decades. Some foresters questioned the economic logic of such suppression efforts, but the extensive fires of 1910 solidified the Forest Service as the premier fire control organization, and fire suppression remained the only fire policy for all federal land management agencies until the late 1960s.[25]
Complete fire suppression was the objective, even though these early efforts were less than successful until the advent of vehicles, equipment, and roads (see Fire trail) during the 1940s.[25] As early as 1924, environmentalist Aldo Leopold argued that wildfires were beneficial to ecosystems and necessary for the natural propagation of numerous tree and plant species. Over the next 40 years, increasing numbers of foresters and ecologists concurred about the benefits of wildfire to ecosystems. Some managers allowed low-intensity fires to spread in remote areas unless they threatened valuable resources or facilities, but by 1935 a policy of extinguishing all fires by 10 a.m. of the next burning period was implemented.[12][26] This resulted in the buildup of fuels in some ecosystems, such as ponderosa pine and Douglas fir forests.[25]
The policy began to be questioned in the 1960s, when it was realized that no new giant sequoias had grown in the forests of California because fire is an essential part of their life cycle.[27][28] In 1962, Secretary of the Interior Stewart Udall assembled a Special Advisory Board on Wildlife Management to look into wildlife management problems in the national parks. The board's report, now referred to as the Leopold Report after its chair, zoologist and conservationist A. Starker Leopold, did not confine itself to wildlife but took the broader ecological view that parks should be managed as ecosystems. The passage of the 1964 Wilderness Act further encouraged allowing natural processes, including fire, to occur.[26] The National Park Service subsequently changed its policy in 1968 to recognize fire as an ecological process. Fires were to be allowed to run their courses as long as they could be contained within fire management units and accomplished approved management objectives. Several parks established fire use programs, and policies were gradually changed from fire control to fire management. The Forest Service enacted similar measures in 1974, changing its policy from fire control to fire management and allowing lightning fires to burn in wilderness areas. This included both naturally caused fire and intentional prescribed fire.[12] In 1978, the Forest Service abandoned the 10 a.m. policy in favor of a new policy that encouraged the use of wildland fire by prescription.[12][26]
Three events between 1978 and 1988 precipitated a major fire use policy review in 1989: the Ouzel Fire in Rocky Mountain National Park, the Yellowstone fires of 1988 in and around Yellowstone National Park, and the Canyon Creek fire in the Bob Marshall Wilderness on the Lewis and Clark National Forest. In all three cases, monitored fires burned until they threatened developed areas. While none of the Yellowstone fires of 1988 were caused by controlled burns, later investigations found that the fire use policy was appropriate, though it needed strengthening and improvement.[26]
The Secretaries of Agriculture and the Interior convened a fire policy review team to evaluate the National Park Service and Forest Service wilderness fire policies. The team reaffirmed the fundamental importance of fire's natural role but recommended that fire management plans be strengthened by establishing clear decision criteria and accountability, and that interagency cooperation be improved. Wildland fire use programs restarted slowly after the 1989 review. Eventually the Forest Service and National Park Service programs began to grow as the number of fires and area burned increased.[12]
Suppressive action taken during the South Canyon Fire, which was ignited by lightning in a fire exclusion zone on July 2, 1994, caused controversy after a blow-up killed 14 firefighters two days after the initial blaze. An interagency team was formed and issued its report in August, citing several direct and contributory causes of the fatalities, including fire behavior, personnel profiles, and incident management procedures. The South Canyon incident led to the first comprehensive review and update of federal wildland fire policy in decades. The report reiterated that the first priority of all federal wildland fire programs was firefighter and public safety. With regard to prescribed fires and prescribed natural fires, the report stated that "Wildland fire will be used to protect, maintain, and enhance resources and, as nearly as possible, be allowed to function in its natural ecological role." In 1998, a new procedures guide used the term "wildland fire use" to describe what had previously been prescribed natural fires. By the end of the decade, the 1995 policy had reinvigorated "wildland fire use" programs and given managers the support they needed to enable the programs to continue to grow and mature.[12][29][30]
Fire management benefits began to appear, as in the 2000 Hash Rock fire, which burned almost all of the Mill Creek Wilderness on the Ochoco National Forest in Oregon before it was suppressed. When the wildfire reached the area burned by the 1996 Mill Creek fire, which had been managed under the wildland fire use program, it went out. Use of fire currently varies among federal agencies, due in part to factors such as the proximity of their lands to urban areas.[12]
In response to growing threats of fire in California, Jackie Fielder, an Indigenous organizer and politician, proposed the creation of an Indigenous wildfire task force as part of her 2020 state senate campaign.[31][32] The plan draws on the Karuk Tribe's work on cultural burning and climate adaptation, and would create a path for more cultural burning to take place.[33][34] Bill Tripp, who works directly on policy for the Karuk Tribe, has noted that more education and growing awareness of Indigenous practices can lead to promising alternatives to modern-day fire suppression.[35] The Karuk Tribe has been a leader in restoring and expanding cultural burning in the American West, carrying out cultural burns in order to reduce wildfire risk and promote the growth of culturally important flora.[34][36] Fielder's proposal aims to reduce the threat of wildfire and return more oversight of land management to Indigenous people. It also has the potential to provide more job opportunities to Indigenous and rural Californians.[33]
Fielder's campaign gained traction at least in part due to bold proposals like this.[37] Scott Wiener, Fielder's opponent and the incumbent California State Senator for District 11, has stated that he is interested in supporting a task force similar to the one Fielder proposed.[38][33][39] This would represent a divergence from Wiener's past approach to wildfire management, which has largely focused on shifting new construction away from the wildland-urban interface.[37]
Roosevelt's Civilian Conservation Corps, created in 1933 as part of the New Deal, planted more than 3.5 billion trees and spent some 150 million hours fighting fires. In 1935, the Forest Service established the "10 a.m. policy," and suppression became a huge part of the agency's job. The science it relied on, however, created problems: before suppression techniques took hold, natural fires would typically burn 100-200 acres, not 200,000. Suppression from the 1930s onward also created stands of trees all of the same age, making them more susceptible to insect disturbances.