Gamers vs. Anti-Consumerism

Keagan Hallock

Over the course of the last twenty-five years, business practices in the video game market have grown steadily less consumer friendly. In recent years, many developers, both small and large, have turned toward new trends in gaming. One trend, microtransactions, has become a major factor in how games are designed, and many games are now built with microtransactions in mind. Microtransactions are not inherently bad; they have simply left a bad taste in players’ mouths after a company or two took them too far (more on this later).

Microtransactions are a business model in which gamers pay real money for digital currency or for in-game items and cosmetics. They have been around for a while, dating back to 2004. The first known instance of microtransactions in a video game was in “MapleStory.”

“MapleStory” implemented microtransactions in the form of Gachapon tickets. Players could buy a ticket for 100 yen (about 0.91 USD) at booths scattered around the game’s online world and then redeem it for a random in-game item. The system was originally added only to the game’s Japanese servers, but once players began paying large sums of money to get a leg up on the competition, it was clear that the system worked. An RPG (role-playing game) from 2004 thus inspired many later games to adopt their own forms of the microtransaction system.

There are multiple uses for microtransactions in games. One of them is funding free-to-play games. Before 2004, free-to-play games were either nonexistent or of such low quality that nobody took them seriously. After the invention of microtransactions, it became possible to make a high-quality free-to-play game and recover the lost sales revenue through in-game purchases. Microtransactions have made high-quality games available to anyone. Some of the most popular games are free-to-play titles with microtransactions, such as “Counter-Strike: Global Offensive” and “Fortnite Battle Royale.”

Although microtransactions have funded some of the most popular games in the world, they are not always a good practice. When they became widespread, it became clear just how greedy big game developers could be. Developers rushed out sloppy loot-box systems designed to get people to spend more money. But everything changed when a little developer known as Electronic Arts, or EA, released “Star Wars Battlefront 2.”

Someone calculated that unlocking everything in the game without paying would take two and a half years of playing for ten hours a day. Many players wanted the game’s weapons and cosmetics but did not want to “grind” for two and a half years to earn them. This was one of the first games from a developer as big as EA that was pay-to-win: gamers had to pay or get lucky to gain a competitive advantage. The game wasn’t about who played better; it was about who had the deepest pockets. This is what spurred the widespread hatred of microtransactions.
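To put that figure in perspective, here is a rough back-of-the-envelope calculation, a minimal sketch based only on the numbers quoted above (the 365-day year is an assumption for illustration):

```python
# Rough estimate of the grind implied by "two and a half years of
# playing for ten hours a day" (figures taken from the paragraph above).
years = 2.5
hours_per_day = 10
days_per_year = 365  # assumed; ignores leap days

total_hours = years * days_per_year * hours_per_day
print(f"Estimated grind: {total_hours:,.0f} hours")  # roughly 9,125 hours
```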

Since then, most big game developers have backed away from putting microtransactions in games that already cost $60. Some still do, such as “Fallout 76,” but that game has been such a disaster, and has lost so much money to bad publicity and botched marketing, that its developer can be excused for trying to make whatever money it can. Whether they take the form of randomized loot boxes with pay-to-win items or purchases of in-game cosmetic items, microtransactions will remain for some time, but, thankfully, their use is dying out in popular games that already cost $60 just to play.