In the video game industry, crunch (or crunch culture) is compulsory overtime during the development of a game. Crunch is common in the industry and can lead to work weeks of 65–80 hours for extended periods of time, often uncompensated beyond the normal working hours.[1] It is often used as a way to cut the costs of game development, a labour-intensive endeavour. However, it leads to negative health impacts for game developers and a decrease in the quality of their work, which drives developers out of the industry temporarily or permanently. Critics of crunch note how it has become normalized within the game-development industry, with deleterious effects for all involved.[2] A lack of unionization on the part of game developers has often been suggested as the reason crunch exists.[1] Organizations such as Game Workers Unite aim to fight against crunch by forcing studios to honour game developers' labor rights.
"Crunch time" is the point at which a team is thought to be failing to achieve the milestones needed to launch a game on schedule. The complexity of workflow, reliance on third-party deliverables, and the intangibles of artistic and aesthetic demands in video-game creation make milestones difficult to predict.[3] The use of crunch time is also seen as exploitative of the younger, male-dominated workforce in video games, who have not had time to establish a family and are eager to advance within the industry by working long hours.[3][4] Naughty Dog co-president Evan Wells argued that the drive for crunch may come from the developers themselves: individual developers may want to work extra hours without a mandate, to ensure their product meets delivery milestones and is of high quality, which can influence other developers to also commit to extra hours or avoid taking time off so as not to appear to be slacking.[5]
Because crunch time tends to come from a combination of corporate practices and peer influence, the term "crunch culture" is often used to describe video game development settings where crunch time is seen as the norm rather than the exception.[6] It stems from an emphasis on getting work done well and done quickly over work-life balance or personal well-being.[7] Most of the problems with crunch result from the crunch culture that runs rampant and is widely accepted throughout the industry. In some cases, crunch culture is so ingrained in a company that leadership openly touts its teams' 100-hour work weeks as a sign of hard work and proof that the company is doing its best to release a game on time.[8]
Surveys of game developers in the 2000s showed the average working week was at least 46 hours for more than 60% of respondents; when crunch time occurs, workweeks of 60 to 80 hours, or in some cases 100 hours or more, have been reported.[3][9] This trend continues today, with developers still clocking 12–14-hour days, seven days a week, during crunch.[10] Some of the biggest titles in gaming, such as Fortnite and Red Dead Redemption 2, are the product of 70–100-hour work weeks.[8] Terms such as "stress casualties" were coined at BioWare, the development studio for the game Anthem; a "stress casualty" is an employee who disappears for months at a time as a result of the stress they are put under during crunch.[11] In the case of Telltale Games, one employee recounted working until 3 am the night before they and over 200 other employees were laid off.[8] The intense workload can partly be traced to a shift to a microtransaction model for games, in which the main game is free but add-ons and extra content, such as skins that change a character's appearance, can be bought for an extra price.[10] This model emphasizes constant updates to keep creating content so players stay attached, which leads to perpetual crunch.[10]
According to a 2019 survey from the International Game Developers Association, 40 percent of game developers reported experiencing crunch at least once over the past year. Only 8 percent reported receiving extra pay for their crunch hours.[12] At federal and state levels in the United States, computer professionals who earn above a set annual salary are exempt from overtime laws. This exemption permits companies to not pay developers for any extra hours in the office. The set annual salary varies from state to state.[12] An exception is California, where software developers must be paid a minimum hourly wage to be considered exempt, set at $36 an hour as of 2008, though this tends to be lower than the average game developer salary.[12][13][14] When games are created, strict contracts are signed between development studios and publishers that set budgets and deadlines for the project. Exemption from overtime laws enables studios to work developers more hours than usual without going over budget.[12]
Game studios also contract work out to cheaper contract workers. In these cases, pay and the amount of work are settled when the contract is signed, meaning the workers are not entitled to overtime pay either.[7] Contract workers accept this in hopes of a full-time job offer after the game is finished, or a bonus upon completion if the game performs well, but neither is guaranteed, and there is a good chance they are left with nothing once the contract is up.[7]
When crunch time does occur, the publisher or developer may encourage employees by offering "crunch meals" delivered to the offices.[15] Once a product is delivered and crunch is no longer required, some companies allow their employees to take paid time off in compensation for the overtime hours they put in, or may offer salary raises and bonuses for successful completion of the delivery milestone.[16]
According to a study done by Take This in 2019, 53% of game developers say that crunch time is an expected part of their employment.[17] Part of this can be attributed to workers having already been exposed to crunch in previous jobs.[17] Crunch culture has normalized crunch time to the point that Dan Houser, co-founder of Rockstar Games, willingly stated that employees worked 100-hour weeks to finish Red Dead Redemption 2.[8] The statement drew public criticism over the harsh working conditions it described.
The gaming community itself can also encourage crunch, sometimes inadvertently, through the hype that is created when a new game or sequel is announced. Alexey Izotov argues that some game-makers promote this culture of hype by overpromising features they cannot deliver, as well as through poor communication with gaming audiences in general.[18] The higher the expectations set for a game, the harder it becomes for developers to meet them within certain timelines, and the harder they push themselves to do so.[7] This can reach extremes: some fans sent death threats to Cyberpunk 2077 developers over a delayed release date.[19]
Many developers also undergo self-imposed crunch, even when it is not forced or required. This can stem from several factors. Perfectionism, or a desire to finish what they started, is a major driver of self-imposed crunch.[20] Camaraderie can also motivate developers to work extra hours when they notice co-workers staying late.[20] Crunch culture plays a large part here as well, as some developers cite past successes achieved through crunch and attempt to replicate them.[20]
Crunch has been described as a "necessary evil" by the management that encourages it. In a 2017 article in The New York Times, CD Projekt Red co-founder and CEO Marcin Iwiński is cited describing video game development as "hard-core work"[21] and explaining that keeping up with deadlines is difficult due to the time it takes to complete basic tasks.[21]
Crunch time is known to have been used in the industry since at least the 1980s, though it was rarely publicly discussed. It stemmed from a "box product mentality" that created stricter time constraints:[17] game developers had to get physical game discs ready for the holidays, so games had to be finished by August.[17] Video game developers were historically paid more than average salaries, and because of the insular nature of the industry, where one's reputation is critical, few developers would leave the industry over crunch. These factors made the acceptance of longer working hours the norm at some larger studios.[12] As the video game industry boomed, its developers were considered white-collar workers, exempt from overtime pay; this was particularly true in California (where much of the North American industry had been established), where those who made over double the current minimum wage were considered ineligible for overtime.[12]
In the 1980s, Atari, in a desire to release an Atari 2600 port of Pac-Man as quickly as possible, had programmer Tod Frye work 80-hour weeks over the course of six months before its March 1982 release date.[22] In 1996, another programmer, Rebecca Heineman, was given 10 weeks to develop a 3DO port of Doom for Art Data Interactive; according to Heineman, this short time span was due to the company significantly underestimating the amount of work needed to make a functional game.[23][24] In both cases, the end products were panned, in large part due to problems stemming from their rushed development.[25][26]
In 2000, California introduced a specific "computer-related" clause for overtime exemptions, raising the minimum salary threshold to be exempt to around $85,000 per year tied to the consumer price index, which exceeded the average game developer salary at that time of about $61,000. The U.S. federal government followed suit in creating a similar class for exemptions at the federal level.[12]
California's exemption changes stirred up debate within the industry among workers who thought they were being treated unfairly.[12] Two lawsuits emerged against Electronic Arts (EA) as employees recognized they should not have been categorized as exempt from overtime pay. One suit originated from artists who had worked on The Sims 2 and argued they had been forced to work overtime without compensation.[12] A second lawsuit originated from a social media post by Erin Hoffman, posting anonymously under the name "EA Spouse" in 2004, describing the working hours her husband had faced at EA and how crunch time, initially proposed early in development so as to get a head start on later stages, had been pushed as a long-term requirement throughout the development cycle.[12][27][3][28] Following Hoffman's blog, a 2004 survey by the International Game Developers Association (IGDA) found that less than 3% of respondents said they did not work any overtime, and of those that did, nearly half did not receive compensation for it.[9][29] EA ultimately settled both lawsuits, agreeing to pay $15 million back to the employees by 2006[12] and to reclassify some of its developers as hourly employees eligible for overtime, though at the cost of their stock options. The publicity around these suits nonetheless led to more discussion in the video game industry of crunch culture.[9]
California changed its labor laws in 2008 in an attempt to keep high-tech industries from moving out of state or country; this included reducing the minimum salary to be exempt for computer-related jobs from the then-current $100,000 to $75,000 per year, which at that time fell below the average salary for video game developers. The labor laws also included a number of exacting provisions on what types of job functions were considered exempt, which covered most game development responsibilities. As a result, employees found it difficult to challenge crunch time through legal recourse.[12]
More visibility of the industry's crunch conditions occurred in January 2010, when a collective group of "Rockstar Spouses", the spouses of developers at Rockstar San Diego, posted an open letter criticizing the management of the studio for deteriorating working conditions for their significant others since March 2009, which included excessive crunch time. This was followed by several former Rockstar employees posting similar complaints of their time there.[30][31] The IGDA considered that Rockstar's working conditions were exploitative and harmful.[32]
According to Virginia McArthur, a game executive at Endless Studios, as games shifted towards a digital format during the 2000s (which allowed studios to cut out production time of physical copies and focus more time on testing), there was a brief period of less crunch.[17] However, as free-to-play games entered the market, it was common to see more rushed products with limited timelines requiring intense crunch in order to beat competitors to market.[17]
Since the early 2010s, some companies in the industry have taken steps to eliminate crunch; however, some reporters also noted that, overall, little progress had been made in the decade after the "EA Spouse" controversy.[9][33] A 2014 IGDA survey found that, while the average number of hours worked had dropped since 2004, 81% of respondents said they had experienced crunch within the last two years, and around 50% said they felt it was "part of the job" and expected of them.[33] In 2004, 35% had said they worked between 65 and 80 hours per week; by 2014, 35% said they had worked from 50 to 65 hours.[9] A 2019 survey of developers by the Game Developers Conference found that nearly half still worked over 40-hour weeks on average; though only 12% said they worked an average of 50 hours a week, nearly 75% stated they had at least one period in which they had worked more than 40 hours in a single week.[34]
Continued stories of crunch time brought more public awareness that crunch remained an accepted practice in the game industry. In October 2018, families of Rockstar developers working on Red Dead Redemption 2 expressed concerns similar to the earlier "Rockstar Spouse" case.[35] Anonymous Epic Games employees speaking to Polygon described crunch periods of 70 to 100 hours a week for some staff ever since the release of Fortnite Battle Royale, which has drawn a player base of millions. While these employees were getting overtime pay, there remained health concerns and an inability to take time off without it reflecting negatively on their performance.[36]
The COVID-19 pandemic led to various disruptions to game development across the industry, but in most cases showed that development companies could still make games while employees worked remotely from home, raising questions about the need for crunch time. In October 2021, Eidos-Montréal and Eidos-Sherbrooke were among the first major studios to announce a shift to a four-day workweek to improve the quality of life of their developers.[37]
Crunch leads to burnout, which can have adverse effects on both a team and the individual. Burnout occurs when someone becomes exhausted or loses passion and desire for their work; it leads to loss of productivity and, in some cases, depression, anxiety, and panic attacks.[17] According to a 2016 study by Open Sourcing Mental Illness, 51% of tech workers have been diagnosed with a mental health condition by a professional. These can include, alone or in combination, a mood disorder, anxiety disorder, ADHD, PTSD, and OCD.[17] Of those 51%, 80% believe that their mental illness affects their work.[17] Other cases report that crunch has caused people to suffer memory loss or develop ulcers.[10][21]
A number of popular games developed under crunch conditions, such as Fortnite, Fallout 4, and Uncharted 4, all saw extreme commercial success and solid critical acclaim.[20] However, there are also many games that avoided crunch throughout the entire development process that still received commercial and critical success, such as Animal Crossing, Apex Legends, Don't Starve and Hades.[7][17][20][38]
A study by The Game Outcomes Project found that mandatory crunch led to less successful games, using Metacritic as a gauge of success. The group found that cultural factors such as focus, team cohesion, and a compelling direction were more important than pure hours of work in determining how good a game was. This led to the conclusion that in fact crunch might make games worse rather than better, and at the least, resulted in diminishing returns.[2]
Bugs seen at the launch of Fallout: New Vegas resulted from a short production schedule imposed on the team by publisher Bethesda and led to an infamous "crunch" period for the development team. The bugs and poor performance cost developer Obsidian Entertainment a $1 million bonus conditioned on critical reception, and insiders claim the short deadline combined with a bonus tied to a Metacritic score was essentially a way to avoid paying it.[39]
During the development of Overkill's The Walking Dead, crunch at Starbreeze Studios and Overkill Software was noted as having a negative effect on the game, resulting in a product that was critically panned for its quality and gameplay, and leaving both studios in financial distress.[40]
The term "crunch" has also been used by journalists to describe overtime labor in other entertainment industries, such as animation and visual effects.[41][42][43][44][45]
Original source: https://en.wikipedia.org/wiki/Crunch (video games).