Maybe We’re Looking at Microtransactions All Wrong

Bungie announced plans to change Destiny 2's experience system after players discovered that it was rigged against them, angering many since it's connected to the real-money Eververse store.

One of the biggest stories discussed in the gaming community this year is the widespread outrage over microtransactions and loot boxes in several new, high-profile video games, including Call of Duty: WWII, Destiny 2, Forza Motorsport 7, Injustice 2, Middle-earth: Shadow of War, and Star Wars Battlefront II. Forbes contributor Paul Tassi writes, "There’s blood in the water, and it’s not enough to move millions of copies of $60 games, [game publishers] need each of those copies to produce X dollars in 'ongoing revenue,' which means long grinds and plenty of things to gamble your money away on."

Despite the vocal criticism from gamers, it doesn't seem like microtransaction models will disappear from the games industry in the future, as author Brandon Dortch describes below. According to a new report from market research firm SuperData Research, PC gamers will spend a whopping $22 billion on microtransactions in free-to-play games in 2017. This astronomical amount is double the number from 2012, and nearly three times the revenue generated from full game purchases on PC and consoles combined. SuperData's report suggests that, although players are "quick to complain" about game publishers monetizing extra content, they "continue to support service-based [models] with their wallets."

If you've played recent games with microtransactions, loot boxes, or other real-money systems, please share your thoughts or experiences with them in the comments.

This editorial originally appeared on the Loading Snacks blog, and is republished and edited for length here by permission from the author, Brandon Dortch.

-Taylor Reeh

Over the last couple of decades, I’ve seen video games grow exponentially both as an art form and as a business. So much so that the business’s current state makes me wonder whether it’s time to re-examine some of my own thoughts on the parallels between video games, movies, and music.

Part of this revelation comes from the current debate over microtransactions and loot boxes in video games. For those unfamiliar with the terms, microtransactions are purchases made in-game, usually in the $1-$10 range, but sometimes as high as the not-so-micro $100-$150 range. These purchases are generally for something cosmetic like a new outfit or skin for a game character, but can also be for new, more powerful weapons, vehicles, or even experience bonuses that allow you to progress faster than normal.

Not so recently (though seemingly a new point of contention in the console space), microtransactions have come to include loot boxes, which can contain any of the previously mentioned items but are more or less blind purchases, because their contents are often unknown prior to purchase. Think of them like buying a pack of collectible cards: there is a small chance of receiving something rare and a much higher chance of getting a duplicate you don’t really want or need. Many are complaining about the appearance of these transactions in video games, period, but especially in single-player games, and after having to purchase said games at the full $60 price point.

One of the early examples of console microtransactions came when Bethesda released DLC for The Elder Scrolls IV: Oblivion, which included in-game horse armor for $2.50.

Speaking of which, video games have largely stayed at the $60 price point for over a decade, going back to the launch of the Xbox 360 in 2005. However, given how much game production costs have risen, that pricing model is antiquated and games should be much more expensive. Or should they? When compared to other disc-based mediums like CD, DVD, and Blu-ray, that line of thought may seem a bit backward, because all of those formats have come down in price over time, at least until being replaced by a better medium or until digital took over, where pricing has remained static even longer (another conversation altogether). Comparatively speaking, the cost of producing an album or movie before it ever reaches one of those home consumer formats hasn’t grown by the same leaps as game production. However, those mediums have additional revenue streams that exist before, after, or alongside them, allowing companies to recoup production costs. A movie is most often produced with theater ticket sales in mind as the main way to recoup its budget, followed by the home video release, then cable and streaming releases, and finally network TV rebroadcasts, subsidized by advertising in the end.

That’s all part of the original budget plan. Traditionally, games don’t have a long-tailed revenue plan that could potentially last for years built into them. But they sort of do now. Games as a service, DLC, and microtransactions are the way most developers and publishers can not only offset the high cost of game development up front, but also the cost of sustaining servers and creating new content.

Gaming is the world’s largest and most lucrative form of entertainment, and it has been for years. Fan expectations are accordingly high, and the costs of game development have had to grow in tandem to meet them. As an example, the most recent release of Call of Duty: WWII from Activision made $500 million in its opening three days. That’s more than the opening weekends of Warner Bros.’ Wonder Woman and Disney’s Thor: Ragnarok combined, at $200 million and $152 million respectively. No wonder companies are looking for ways to lengthen the cash grab when video games have traditionally been one-and-done purchases, and the cost of producing those games can be higher than that of many big-budget films.


"I think we may have to pull ourselves away from the idea of video game purchases being the same as buying a DVD or a CD."

Now, I’ve read many complaints about microtransactions and weighed the pros and cons across some very valid points: addiction, gambling, and pay-to-win issues on one side, and on the other, the argument that most of the time these loot boxes or transactions aren’t forced on anyone. Either way, I think part of the solution lies in changing the way we rationalize the purchase of our beloved medium.

I think we may have to pull ourselves away from the idea of video game purchases being the same as buying a DVD or a CD. Gaming is an interactive medium and has grown to reflect that. Purchasing a game is more like buying a ticket to a movie theater or a theme park. It may cost a set amount to see a movie, but concessions still cost extra. You may pay $60-65 to get into a Six Flags, and you may even have a season pass, but you may still have to pay to play the carnival games in the park that offer chances to win extra prizes...sound familiar? You don’t have to play those games at all. Even as the kid with the microphone taunts and pressures you, swearing that he or she can guess your age, you can just ignore it altogether.

But why aren’t those games free when you already paid for admission? For the same reason that Tess Everis asks you to come buy shaders and other cosmetic unlocks from her booth in Destiny 2. If you are afraid of roller coasters or don’t like rides that spin, the price of entry doesn’t go down, just as there’s no price cut for players who choose not to touch Call of Duty multiplayer or the Crucible in Destiny. It’s there if you want to jump in, but choosing not to is up to you. The business of “experiential” entertainment has always been grounded in getting you to spend more money beyond the original barrier to entry.


"The business of 'experiential' entertainment has always been grounded in getting you to spend more money beyond the original barrier to entry."

There is next to no profit in movie ticket sales, and there hasn’t been for years. That’s why concessions are so important to theaters, and partially why they’re so expensive: there’s a need to keep the barrier to entry relatively low so that people still pay for admission and may just buy some popcorn. When you go to a major concert venue, there are stands selling T-shirts and programs for an event you’ve already paid to attend. Making additional purchases tied to experiential entertainment has been par for the course for as long as I can remember, and certainly longer than video games have been around.

Maybe it’s time to consider a game like Injustice 2 the concert event of the summer that costs $60 to enter, with its additional characters as the $100 VIP seating or backstage passes. The barrier to entry is the same for everyone, and those who want a deeper experience can pay for it. This isn’t to say that companies aren’t robbing us of some experiences in order to push microtransactions (I’m looking at you, Bungie!) or that we should simply roll over and accept these practices without question. It’s more a way to shift our focus: a decades-old business needs to change in order to continue to exist, and our view needs to change with it. I personally feel better about this changing landscape that I love so much when looking at it from this perspective. It helps make some sense of it all, and that makes it just a little less infuriating.

Follow the author Brandon Dortch and his blog Loading Snacks on Twitter: @mtshellz and @LoadingSnacks.