
Trusted Reviews is supported by its audience. If you purchase through links on our site, we may earn a commission. Learn more.

Will the metaverse be safe for kids?

The metaverse is still in its early stages, but with Mark Zuckerberg claiming that we’re just years away from its arrival, will it be safe for kids? We asked leading security experts to offer their insights and help answer that question. Here’s what we found out.

The metaverse is a phenomenon brought about by the rebranding of Facebook to Meta, and Mark Zuckerberg’s new goal of creating a virtual, simulated reality for people to use.

It has been described by Zuckerberg as an interconnected digital space where users will be able to have new experiences through avatars, independent of their real-world surroundings. It is being advertised as a place to make social connections, as well as a place to learn and collaborate even if you can’t be in the same physical space.

If you want a full rundown of what Meta is and why the company decided to rebrand, see our explainer linked above. This article focuses on the metaverse itself, and how it will work for children.

Kaspersky Safe Kids

Protect your kids online and offline with award-winning parental controls. Get flexible tools that help you safeguard their activities, monitor their behaviour and teach them self-control.

  • Kaspersky
  • £14.99 per year
Buy now

Do you expect any age verification to be introduced, given that none currently exists?

David Emm, principal security researcher at Kaspersky, explained to Trusted Reviews how Meta requires users to be over 13 years of age, but there are no further features to restrict what a child could be exposed to.

“As per Meta’s terms, Facebook requires everyone to be at least 13 years old before they can create an account. Whilst there’s an assumption that someone is old enough to have a Facebook account, there’s currently no way for a parent to automatically restrict what a child is exposed to,” said Emm.

“They could only do this by using the Oculus app to see what their children are doing – and using their own judgement to decide if it’s appropriate for their children.”

The Facebook Community Standards apply to people using Oculus devices, but currently there is no way for the company to verify the age of someone joining the platform.

Principal analyst Rupantar Guha, from GlobalData’s thematic team, told Trusted Reviews that as VR gains popularity, age verification will become standard.

“Age verification is not explicitly outlined on the Oculus app, but the requirement to link a Facebook account does it automatically. As VR gains popularity, the need to verify age will become standard,” Guha explained.

“VR headset makers will use this data for content recommendation and targeting advertisements. However, targeting personalized ads to children could attract public backlash, so all VR headset makers need to be mindful of that. Contextual advertising would be the best way to reach children, and in turn, their parents – the key monetization targets.”

Do you expect that the metaverse will introduce restrictions on inappropriate language?

Jeff Norton, author of MetaWars: Fight For the Future, told Trusted Reviews about the thin line between free speech and misinformation on the internet.

“If the last few years of social media have taught us anything, it’s how delicate the balance is between free speech and misinformation,” said Norton.

“As the internet develops and we interface on new platforms, we’ll have to wrestle with questions about how to govern and police everything from inappropriate language to deliberate misinformation to outright hate speech.”

Guha added to this, saying that both adults and children should be able to use the Metaverse with settings that restrict bad language.

“It should fundamentally be incorporated, both for adults and children. The metaverses should allow settings like safe search and profanity filters for the users. In addition, there should be artificial intelligence-based moderators to filter inappropriate language in public chats. As the metaverses evolve over the years and gain attention from consumers, the need for language moderation will become increasingly necessary,” Guha concluded.


Do you expect that it will dedicate time to moderating inappropriate behaviour?

Guha made further comments on the idea of moderating inappropriate behaviour, saying that it will be trickier to handle given that metaverses are not yet mainstream.

“Meta is already working on moderating toxic behaviour in its metaverses. But there is a lot of work to be done, given that metaverses are not mainstream yet,” Guha says.

“Meta and other VR headset makers must view moderating behaviour as a foundational aspect, given that toxic actions will only grow as more consumers sign up to the platforms. Failure to create a robust system to filter toxicity will have a detrimental impact on the company’s metaverse ambitions and reputation.”

David George, director of services at GlobalData’s thematic team, spoke to Trusted Reviews about the issues surrounding the metaverse, and how the parents and guardians of children will need to keep an eye on what their kids are watching.

“I think it’s also worth emphasising that the metaverse concept will cover many virtual realities from many providers with different target audiences and moderation policies.

“The biggest issue and challenge will be parents ensuring that their children keep to the age-appropriate ones, something that is already hard to do with the existing internet,” George says.

Do you expect there will be a way for parents/guardians to monitor what their children are engaging with in the metaverse?

Guha went on to talk about how the internet has created spaces for children, and how the metaverse will likely try to do the same.

“The metaverse is still in the early stages of development and the immediate target audience will be adults. As the metaverse matures, I’m sure there will be content and specific spaces for children in it,” Guha says.

“Many children use platforms like Fortnite and Roblox to socialize and play, so child-safety policies are already in place. While the metaverse will have larger privacy concerns, given its immersive nature and ability to gather biometric data, online safety policies must evolve accordingly.”

Emm added that the metaverse will likely implement ways for parents to monitor what their kids are looking at.

“I would expect that, in response to the ‘children’s code’ and the Online Safety Bill, Meta will implement ways for parents to restrict what their children can do in the metaverse and what they are exposed to,” Emm explained.

“I also think it is likely that specific spaces for children will be created – in much the same way that we saw the emergence of social networks made for children.”

Norton added that each platform will take on its own rules, in the same way that social media sites already do.

“Each platform will have its own rules, norms, and terms & conditions,” says Norton.

“The optimist in me would like to think that social norms will emerge whereby inappropriate behaviour won’t be tolerated, but sadly it seems there’s always space for abuse online and the metaverse may become more fertile ground for our worst instincts. There’s an illusion of utter freedom that comes from anonymity, and those that act and speak via an anonymous avatar may very well behave even worse than what we’ve seen people on Twitter already do.”

Would you let your kids use the metaverse?

Guha said he would be comfortable letting his children use the metaverse, but only if he were happy with the availability of the following factors:

  • Relevant content and experiences
  • Safety measures to curb inappropriate behaviour
  • Safe chat with a substantial level of filtering of inappropriate words
  • Parental monitoring of kids’ activities and experiences in the metaverse

Norton also mentioned that his children use online resources, noting that the Covid-19 pandemic has changed the way children use the internet.

“I have two boys and the pandemic ushered them into the metaverse with online learning during the first lockdown. As parents, we had actually done a pretty good job to limit screen time and prioritize reading over watching, but the pandemic changed everything,” Norton claims.

“And for the generation I call ‘covid kids’, there’s no going back. And after giving in… we allowed Minecraft into the house in the spring of ’21, and that genie is never going back into its bottle. At its best, the boys cooperate and coordinate building and playing in the virtual world. At its worst, it’s turned them into addicts.

“And that’s the conundrum for parents and users; the metaverse will offer incredible opportunities for connection and interaction, but it will likely come at the opportunity cost of participating in the real world. If there’s one thing we’ve learned as human beings it’s that we can’t be in two places at once.”

Emm also believes that a parent or guardian should help guide their child through the process.

“I would be reluctant to let children use the metaverse unsupervised and without having first checked what they would be doing. I think parental supervision is vital.”


Do you believe that the metaverse will be safe for under-13s to use?

GlobalData’s thematic analyst Emma Taylor explained to Trusted Reviews how existing social media sites can already be dangerous for children, and how the metaverse will likely exacerbate the situation.

“Any platform which is used to consume digital content comes with an array of potential dangers, especially for kids,” Taylor claims.

“The premise of the metaverse is to create one all-encompassing platform where you can work, shop, game and socialize and so will also undoubtedly be very enticing for kids. However, it is difficult to see how the metaverse will be regulated appropriately, especially to the extent that it could become safe for children.”

Taylor goes on to say that the sheer amount of data that will be extracted from the metaverse is currently hard to predict.

“The underlying technologies associated with the metaverse, like social media platforms, are already seen as widely damaging to children.

“Unfortunately, it is likely that the issues attributed to these platforms will be extended, or even exacerbated in the metaverse because not only will it follow a similar ad-based model, but it will be more immersive, integrate even further into most aspects of our lives and be harder to regulate.

“The reasons for this are: the sheer magnitude of personal data which can be reaped from the metaverse, the involvement of a vast number of different developers and corporations, its disassociation from any national authority, and its unknown reach and potential.”

Norton adds that the metaverse will pose similar problems to those already found on the internet.

“I don’t think it’ll be any more or any less safe than our current internet, or indeed the real world. There are plenty of risks with social interaction, especially online, that parents need to guard against,” Norton says.

“I do think the platforms themselves have a duty of care to ensure no harm comes to the most vulnerable users, and indeed may have to take special measures to keep some spaces off-limits. I’m reminded that it’s the responsibility of a swimming pool owner to keep the pool behind a fence and a locked door, to take reasonable steps to protect people from what the law calls an ‘attractive nuisance’. Big Tech probably needs some fences and locked doors on parts of the metaverse.”


Why trust our journalism?

Founded in 2004, Trusted Reviews exists to give our readers thorough, unbiased and independent advice on what to buy.

Today, we have millions of users a month from around the world, and assess more than 1,000 products a year.


Editorial independence

Editorial independence means being able to give an unbiased verdict about a product or company, with the avoidance of conflicts of interest. To ensure this is possible, every member of the editorial staff follows a clear code of conduct.


Professional conduct

We also expect our journalists to follow clear ethical standards in their work. Our staff members must strive for honesty and accuracy in everything they do. We follow the IPSO Editors’ code of practice to underpin these standards.