First-party developers
In the video game industry, a first-party developer is part of a company that manufactures a video game console and develops exclusively for it. First-party developers may use the name of the company itself (such as Nintendo), have a specific division name (such as Sony's Polyphony Digital) or have been an independent studio before being acquired by the console manufacturer (such as Rare or Naughty Dog).[6] Whether by purchasing an independent studio or by founding a new team, the acquisition of a first-party developer involves a huge financial investment on the part of the console manufacturer, which is wasted if the developer fails to produce a hit game in a timely manner.[7] However, using first-party developers saves the cost of having to make royalty payments on a game's profits.[7] Current examples of first-party studios include PlayStation Studios for Sony and Xbox Game Studios for Microsoft.
Second-party developers
Second-party developer is a colloquial term often used by gaming enthusiasts and media to describe game studios that take development contracts from platform holders and develop games exclusive to that platform, i.e. a non-owned developer making games for a first-party company.[8] In exchange for not being able to release their games for other platforms, second-party developers are usually offered higher royalty rates than third-party developers.[7] These studios may have exclusive publishing agreements (or other business relationships) with the platform holder, but they maintain their independence, so upon completion or termination of their contracts they are able to continue developing games for other publishers if they choose to. For example, while HAL Laboratory initially began developing games on personal computers like the MSX, it became one of the earliest second-party developers for Nintendo, developing exclusively for Nintendo's consoles starting with the Famicom, though it would self-publish its mobile games.[9][10]
Third-party developers
A third-party developer may also publish games, or work for a video game publisher to develop a title. Both publisher and developer have considerable input in the game's design and content. However, the publisher's wishes generally override those of the developer.
The business arrangement between the developer and publisher is governed by a contract, which specifies a list of milestones intended to be delivered over a period of time. These milestones let the publisher verify that work is progressing quickly enough to meet its deadline, and allow it to direct the developer if the game is not meeting expectations. When each milestone is completed (and accepted), the publisher pays the developer an advance on royalties. Successful developers may maintain several teams working on different games for different publishers. Generally, however, third-party developers tend to be small, close-knit teams. Third-party game development is a volatile sector, since small developers may be dependent on income from a single publisher; one canceled game may be devastating to a small developer. Because of this, many small development companies are short-lived.
A common exit strategy for a successful video-game developer is to sell the company to a publisher, becoming an in-house developer. In-house development teams tend to have more freedom in the design and content of a game compared to third-party developers. One reason is that since the developers are employees of the publisher, their interests are aligned with those of the publisher; the publisher may spend less effort ensuring that the developer's decisions do not enrich the developer at the publisher's expense.
Activision, founded in 1979, became the first third-party video game developer. Four Atari, Inc. programmers left that company following its sale to Warner Communications, partly over the lack of respect the new management gave to programmers, and used their knowledge of how Atari VCS game cartridges were programmed to create their own games for the system, founding Activision to sell them. Atari took legal action to try to block the sale of these games, but the companies ultimately settled, with Activision agreeing to pay a portion of its sales as a license fee to Atari for developing for the console. This established licensing fees as a model for third-party development that persists into the present.[11][12] The licensing-fee approach was further entrenched by Nintendo when it decided to allow third-party developers to make games for the Famicom console, setting a 30% licensing fee that covered game cartridge manufacturing costs and development fees. The 30% fee has also persisted to the present as the de facto rate most digital storefronts charge third-party developers to offer their games on the platform.[13]
In recent years, larger publishers have acquired several third-party developers. While these development teams are now technically "in-house", they often continue to operate in an autonomous manner (with their own culture and work practices). For example, Activision acquired Raven (1997); Neversoft (1999), which merged with Infinity Ward in 2014; Z-Axis (2001); Treyarch (2001); Luxoflux (2002); Shaba (2002); Infinity Ward (2003) and Vicarious Visions (2005). All these developers continue operating much as they did before acquisition, the primary differences being exclusivity and financial details. Publishers tend to be more forgiving of their own development teams going over budget (or missing deadlines) than third-party developers.
A developer may not be the primary entity creating a piece of software, instead providing an external software tool that helps organize (or use) information for the primary software product. Such tools may be a database, Voice over IP, or add-in interface software; this is also known as middleware. Examples include SpeedTree and Havok.
Indie game developers
Independents are software developers which are not owned by (or dependent on) a single publisher. Some of these developers self-publish their games, relying on the Internet and word of mouth for publicity. Without the large marketing budgets of mainstream publishers, their products may receive less recognition than those of larger publishers such as Sony, Microsoft or Nintendo. With the advent of digital distribution of inexpensive games on game consoles, it is now possible for indie game developers to forge agreements with console manufacturers for broad distribution of their games.
Other indie game developers create game software for a number of video-game publishers on several gaming platforms.[citation needed] In recent years this model has been in decline; larger publishers, such as Electronic Arts and Activision, increasingly turn to internal studios (usually former independent developers acquired for their development needs).[14]
Quality of life
Video game development is usually conducted in a casual business environment, with T-shirts and sandals common work attire. Many workers find this type of environment rewarding and pleasant professionally and personally.[citation needed] However, the industry also requires long working hours from its employees (sometimes to an extent seen as unsustainable).[15] Employee burnout is not uncommon.[16]
An entry-level programmer can make, on average, over $66,000 annually, but only if they are successful in obtaining a position at a medium-to-large video game company.[17] An experienced game-development employee, depending on expertise and experience, averaged roughly $73,000 in 2007.[18] Indie game developers may earn only between $10,000 and $50,000 a year, depending on how financially successful their titles are.[19]
In addition to being part of the software industry,[citation needed] game development is also within the entertainment industry; most sectors of the entertainment industry (such as films and television) require long working hours and dedication from their employees, such as a willingness to relocate and/or to develop games that do not appeal to their personal taste. The creative rewards of work in the entertainment business attract labor to the industry, creating a competitive labor market that demands a high level of commitment and performance from employees. Industry communities, such as the International Game Developers Association (IGDA), are conducting increasing discussions about the problem; they are concerned that working conditions in the industry cause significant deterioration in its employees' quality of life.[20][21]
Crunch
Some video game developers and publishers have been accused of the excessive invocation of "crunch time".[22] "Crunch time" is the point at which the team is thought to be failing to achieve the milestones needed to launch a game on schedule. The complexity of work flow, reliance on third-party deliverables, and the intangibles of artistic and aesthetic demands in video-game creation make milestones difficult to predict.[23] The use of crunch time is also seen as exploitative of the younger, male-dominated workforce in video games, who have not yet had time to establish a family and who are eager to advance within the industry by working long hours.[23][24] Because crunch time tends to come from a combination of corporate practices and peer influence, the term "crunch culture" is often used to describe video game development settings where crunch time may be seen as the norm rather than the exception.[25]
The use of crunch time as a workplace standard first gained attention in 2004, when Erin Hoffman exposed the use of crunch time at Electronic Arts, a situation known as the "EA Spouses" case.[23] A similar "Rockstar Spouses" case gained further attention in 2010 over working conditions at Rockstar San Diego.[26][27] Since then, there has generally been a negative perception of crunch time from most of the industry as well as from its consumers and other media.[28]
Discrimination and harassment
Gender
The game development workforce has generally been predominantly male. In 1989, according to Variety, women constituted only 3% of the gaming industry,[29] while a 2017 IGDA survey found that the female demographic in game development had risen to about 20%. Given that a 2017 ESA survey found 41% of video game players were female, this represents a significant gender gap in game development.[30][31]
The male-dominated industry, most of whose members have grown up playing video games and are part of the video game culture, can create a culture of "toxic geek masculinity" within the workplace.[32][30] In addition, the conditions behind crunch time are far more discriminatory towards women, as crunch requires them to commit their time exclusively to the company rather than to personal activities like raising a family.[23][33] These factors have established conditions within some larger development studios where female developers have found themselves discriminated against in workplace hiring and promotion, as well as the targets of sexual harassment.[34] This can be compounded by similar harassment from external groups, such as during the 2014 Gamergate controversy.[35] Major investigations into allegations of sexual harassment and misconduct that went unchecked by management, as well as discrimination by employers, have been brought against Riot Games, Ubisoft and Activision Blizzard in the late 2010s and early 2020s, alongside smaller studios and individual developers. However, while other entertainment industries have had similar exposure through the Me Too movement and have tried to address the symptoms of these problems industry-wide, the video game industry had yet to have its Me Too moment, even as late as 2021.[33]
There also tends to be pay-related discrimination against women in the industry. According to Gamasutra's Game Developer Salary Survey 2014, women in the United States made 86 cents for every dollar men made. Women in game design came closest to parity, making 96 cents for every dollar men made in the same job, while women in audio positions had the largest gap, making 68% of what men in the same position made.[36]
Increasing the representation of women in the video game industry requires breaking a feedback loop between the apparent lack of female representation in the production of video games and in their content. Efforts have been made to provide a strong STEM (science, technology, engineering, and mathematics) background for women at the secondary education level, but there are issues with tertiary education, such as at colleges and universities, where game development programs tend to reflect the male-dominated demographics of the industry, a factor that may lead women with strong STEM backgrounds to choose other career goals.[37]
Racial
There is also a significant gap in racial minorities within the video game industry; a 2019 IGDA survey found that only 2% of developers considered themselves to be of African descent and 7% Hispanic, while 81% were Caucasian; in contrast, 2018 United States Census estimates put the U.S. population at 13% of African descent and 18% Hispanic.[38][39][40] In 2014 and 2015 surveys of job positions and salaries, the IGDA found that people of color were both underrepresented in senior management roles and underpaid in comparison to white developers.[41] Further, because video game developers typically draw from personal experience when building game characters, this diversity gap has led to few racial-minority characters being featured as main characters within video games.[42] Minority developers have also been harassed by external groups due to the toxic nature of the video game culture.[32]
This racial diversity issue has similar ties to the gender one, and similar methods to resolve both have been suggested, such as improving grade-school education, developing games that appeal beyond the white male gamer stereotype, and identifying toxic behavior in both video game workplaces and online communities that perpetuates discrimination based on gender and race.[43]
LGBT
With regard to LGBT and other gender and sexual identities, the video game industry shares roughly the same demographics as the larger population, based on a 2005 IGDA survey. LGBT developers generally do not report workplace issues related to their identity, though they work to improve the representation of LGBT themes within video games in the same manner as racial minorities.[44] However, like women and racial minorities, LGBT developers have also come under harassment from external groups due to the nature of the video game culture.[32]
Age
The industry is also recognized as having an ageism problem, discriminating against the hiring and retention of older developers. A 2016 IGDA survey found that only 3% of developers were over 50 years old, while at least two-thirds were between 20 and 34; these numbers indicate a far lower average age than the U.S. national average of about 41.9 that same year. While discrimination by age in hiring practices is generally illegal, companies often target their oldest workers first during layoffs or other periods of reduction. Experienced older developers may also find themselves overqualified for the types of positions that game development companies seek to fill, given the salaries and compensation offered.[45][46]
Contract workers
Some of the larger video game developers and publishers have also engaged contract workers through agencies to add manpower to game development, in part to alleviate crunch time for employees. Contractors are brought on for a fixed period and generally work hours similar to those of full-time staff members, assisting across all areas of video game development, but as contractors they do not get benefits such as paid time off or health care from the employer; for this reason they are also typically not credited on the games they work on. The practice itself is legal and common in other engineering and technology areas, and it is generally expected either to lead into a full-time position or to end with the contract. More recently, however, its use in the video game industry has been compared to Microsoft's past use of "permatemps", contract workers who were continually renewed and treated for all purposes as employees but received no benefits. While Microsoft has moved away from the practice, the video game industry has adopted it more frequently. Around 10% of the workforce in video games is estimated to be contract labor.[47][48]
Unionization
Similar to other tech industries, video game developers are typically not unionized. This is a result of the industry being driven more by creativity and innovation than by production, the lack of distinction between management and employees in the white-collar area, and the pace at which the industry moves, which makes union actions difficult to plan out.[49] However, when situations related to crunch time become prevalent in the news, there have typically been follow-up discussions about the potential to form a union.[49] A survey performed by the International Game Developers Association in 2014 found that more than half of the 2,200 developers surveyed favored unionization.[50] A similar survey of over 4,000 game developers run by the Game Developers Conference in early 2019 found that 47% of respondents felt the video game industry should unionize.[51]
In 2016, voice actors in the Screen Actors Guild-American Federation of Television and Radio Artists (SAG-AFTRA) union doing work for video games went on strike against several major publishers when their union's standard contract was up for renewal, demanding better royalty payments and provisions related to the safety of their vocal performances. The strike lasted for over 300 days into 2017 before a new deal was made between SAG-AFTRA and the publishers. While this had some effects on a few games within the industry, it brought to the forefront the question of whether video game developers should unionize.[49][52][53]
A grassroots movement, Game Workers Unite, was established around 2017 to discuss and debate issues related to the unionization of game developers. The group came to the forefront during the March 2018 Game Developers Conference by holding a roundtable discussion with the International Game Developers Association (IGDA), the professional association for developers. Statements made by the IGDA's then-executive director Jen MacLean relating to the IGDA's activities had been seen as anti-union, and Game Workers Unite wanted to start a conversation laying out the need for developers to unionize.[54] In the wake of the sudden near-closure of Telltale Games in September 2018, the movement again called for the industry to unionize. The movement argued that Telltale had not given any warning to the 250 employees it let go, having hired additional staff as recently as a week prior, and had left them without pensions or health-care options; it further argued that the studio considered this a closure rather than layoffs so as to get around the advance notification required by the Worker Adjustment and Retraining Notification Act of 1988 preceding layoffs.[55] The situation was argued to be "exploitive", as Telltale had been known to force its employees to frequently work under "crunch time" to deliver its games.[56] By the end of 2018, a United Kingdom trade union, Game Workers Unite UK, an affiliate of the Game Workers Unite movement, had been legally established.[57]
Following Activision Blizzard's financial report for the previous quarter in February 2019, the company said that it would be laying off around 775 employees (about 8% of its workforce) despite having record profits for that quarter. The news prompted further calls for unionization, including an open letter from the AFL-CIO to video game developers encouraging them to unionize.[58]
Game Workers Unite and the Communications Workers of America established a new campaign to push for unionization of video game developers, the Campaign to Organize Digital Employees (CODE), in January 2020. Initial efforts for CODE were aimed at determining which approach to unionization would be best suited to the video game industry. Whereas some video game employees believe they should follow the craft-based model used by SAG-AFTRA, which would unionize workers based on job function, others feel an industry-wide union, regardless of job position, would be better.[59]
Sweden presents a unique case in which nearly all parts of its labor force, including white-collar jobs such as video game development, may engage with labor unions under the Employment Protection Act, often through collective bargaining agreements. The developer DICE had reached its union agreements in 2004.[60] Paradox Interactive became one of the first major publishers to support unionization efforts in June 2020, with its own agreements covering its Swedish employees under two labor unions, Unionen and SACO.[61]
In Australia, video game developers could join other unions, but the first video game-specific union, Game Workers Unite Australia, was formed in December 2021 under Professionals Australia to become active in 2022.[62]
In Canada, video game workers in Edmonton voted unanimously to unionize for the first time in June 2022.[63]