Gaming can be the Wild West when it comes to privacy, data collection and advertising – especially for young players.
Recently, Fortnite maker Epic Games made headlines after regulators fined it for violating the Children’s Online Privacy Protection Act (COPPA). The Federal Trade Commission assessed the company a total of $520 million in penalties, in part due to default settings the agency called “privacy invasive,” including voice and text communication enabled by default and auto-matching children with older players. The FTC also found that data was collected from children without any age screening, and that the process to delete that data was unnecessarily cumbersome. With complaints of sexual harassment via chat dating back to 2017, Epic Games found its practices in hot water.
According to Jason Williams, CEO of Kidoz Inc., a kid-safe mobile advertising network that reaches millions of children worldwide, regulations state that no data can be used when marketing to children. “Therefore any game, site, network or platform that knowingly employs data when advertising to children is in violation and at risk of fines,” he says. “In fact, the risks are steadily increasing for a large number of game publishers that offer games appealing to children and are, therefore, a ‘mixed audience,’” he notes, such as in the case of Fortnite.
But that doesn’t stop some of them.
“The platforms have made it clear that app developers must follow different rules depending on the type of users their app is intended for,” Williams says. “Apps designed for kids must use only kid-safe ad networks. Apps that want to monetize their adult and child traffic differently, or are designed for adults but still appeal to kids, are deemed ‘mixed audience’ and must use an age gate to segment their traffic.”
There’s a long chain of command that must be relied upon to adhere to the rules. “The responsibility of in-app, kid-safe advertising falls to the platform providers, app developers, advertising networks, and advertisers. Each actor in this value chain of data safety must play their part to keep kids secure and data private while gaming,” he says. “The platform owners, Apple and Google, set the policies that, if followed, ensure safe and compliant advertising for all gaming users who are minors. It is also the responsibility of the ad networks that choose to serve ads to children to ensure their technology is built to prevent any data sharing, storing or tracking.”
But with so many rules in place, grey areas still exist where bad actors are getting away with questionable behaviours.
“For the most part, metaverse platforms do not allow advertising at all. Roblox has introduced some advertising opportunities for brands, but it has disabled the ability to reach children entirely. Although this is a compliant strategy, it leaves children without a way to experience premium content unless they make a purchase. There are rules for in-app advertising that require publishers of mixed audience apps to only show ads to children from compliant ad networks,” says Williams.
And that’s where the trouble lies. “The problem is that, frequently, the app publishers don’t have an age gate, and so don’t know which users are children and which are adults. In this circumstance, app publishers often serve targeted (non-compliant) advertising to their entire user base as this strategy will generate more revenue and is the easiest to implement,” he says.
So what’s the next step for safe gaming?
“In order to have a future of safe gaming, it’s integral that digital minors are protected from their data being shared, saved, profiled and commercialized. In addition, the advertising content allowed into the user segments identified as digital minors must be safe. To get to a place of advertising safety, apps, networks, brands and platform owners need to ensure they follow the regulations and hold themselves accountable for distributing safe content appropriate for minors,” says Williams.
Laura Mingail, founder of Archetypes and Effects, notes that brands are able to interact with players in ways that are less obvious than traditional marketing methods – but those interactions should still be implemented with integrity. “At a higher level, it is important to ensure that the narratives being conveyed in kids’ content offer value and are not nefarious,” she says. “When brands of all categories invest in creating content or enhancing game experiences with digital merch or interactive worlds, it must be done in the context of offering genuine value to the play experience. And, this should only be done in games and platforms that have appropriate age recommendations.”
As it stands, Canadian regulations call for all children under 12 (under 13 in Quebec) to be protected from data tracking and profiling. But we might look to our neighbours to the south for further restrictions. In the U.S., COPPA prohibits games from using data tracking or targeting when handling underage users (under 13). Going further, a new bill passed in California raises the age of a digital minor to 18, aligning the threshold with other legislation around gaining the rights and responsibilities of adulthood.
“Minors are important customers who deserve to enjoy the services of these systems,” says Williams. And on the brand side, gaming can be a useful platform with many beneficial returns that should not be ignored. “Because game experiences can engage players for far longer than traditional content, there are more opportunities for meaningful brand interactions within them,” says Mingail.
But there need to be protocols in place to ensure safety.
“The most practical step operators of digital platforms and games can take is the introduction of age gates into their digital products. The handling of their data should always comply with the law and respect the sensitivity of their privacy,” says Williams.
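The age-gate logic Williams describes can be illustrated with a minimal sketch. Everything here is hypothetical – the function name, the age threshold constant and the ad-strategy labels are illustrative assumptions, not real platform or ad-network APIs. The key point it encodes is the one from the article: when a mixed-audience app has no age information for a user, the only compliant default is to treat that user as a minor.

```python
from typing import Optional

# COPPA threshold in the U.S.; other jurisdictions differ
# (e.g. under 12 federally in Canada, under 18 under California's new bill).
DIGITAL_MINOR_AGE = 13

def choose_ad_strategy(age: Optional[int]) -> str:
    """Pick an ad strategy from a user's age-gate response.

    If no age is known (the app has no age gate, or the user skipped it),
    defaulting to the kid-safe path is the conservative, compliant choice --
    the opposite of the "target everyone" shortcut the article describes.
    """
    if age is None or age < DIGITAL_MINOR_AGE:
        # Kid-safe path: contextual ads only, no profiling or tracking.
        return "kid-safe contextual ads"
    # Adult path: the app may monetize this segment differently.
    return "standard targeted ads"
```

The deliberate design choice is the `None` branch: an unknown age is routed to the restrictive path rather than the lucrative one.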
As for the future of Epic Games, the penalty takes a substantial bite out of the company’s expected $9.5 billion in annual revenue – likely not enough to put it out of business, but maybe enough to dissuade it from making the same mistakes again.