A group of online safety advocates and experts has sent a letter to the CEO of Meta, formerly known as Facebook, urging the company to reconsider its plan to invite teenagers and young adults to join its metaverse app, Horizon Worlds. The letter, sent on April 14, was signed by several prominent safety groups, including Fairplay, the Center for Countering Digital Hate, and Common Sense Media, among others. The signatories expressed concern over the risks of allowing young people to access a virtual world without adequate safeguards in place.
The metaverse is a virtual world that allows users to interact with one another in three-dimensional space. Meta’s Horizon Worlds app, which launched to adult users in late 2021, is designed as a social space where users can create their own virtual environments and interact with others. The safety groups argue, however, that allowing young people to participate in the metaverse without proper protections could lead to a range of negative consequences, including cyberbullying, exposure to inappropriate content, and online grooming.
In their letter to Meta CEO Mark Zuckerberg, the safety groups called on the company to scrap its plans to allow teenagers and young adults to join Horizon Worlds until adequate safeguards are put in place. The groups also urged Meta to work with them to develop effective safety measures that would protect young people in the metaverse. The letter stated that “the risks to young people are too great to be ignored” and that “Meta has a responsibility to ensure that its products do not harm the most vulnerable members of society.”
The safety groups’ concerns are not unfounded. Research has shown that young people are particularly vulnerable to online risks, including cyberbullying, harassment, and exposure to inappropriate content. A Pew Research Center study found that 59% of US teens have experienced some form of cyberbullying, while 63% have been exposed to offensive or inappropriate content online. The study also found that 48% of teens have been contacted by strangers online, and 31% have been asked for personal information.
Meta has responded to the safety groups’ concerns by stating that it is committed to ensuring the safety of its users, including young people. The company has said that it is working on a range of safety features for Horizon Worlds, including content moderation tools, reporting mechanisms, and age verification systems. Meta has also said that it will be partnering with safety organizations to develop best practices for online safety in the metaverse.
Despite these assurances, the safety groups remain unconvinced. They argue that the risks associated with the metaverse are too great to be left to individual companies to manage, and that there needs to be a coordinated effort to ensure that young people are protected. The groups have called on governments and regulators to take action to address the risks associated with the metaverse, including the development of standards and guidelines for online safety.
The debate over online safety in the metaverse is likely to continue as the technology matures and becomes more widespread. As more young people begin to access virtual worlds, companies and regulators will need to work together to ensure that appropriate safeguards are in place. Only a proactive approach to online safety can protect young people from the risks of the digital world.