Moderation in the metaverse: what are the new challenges?
On October 28, 2021, at the annual Facebook Connect conference on virtual reality, Mark Zuckerberg announced that the Facebook group would be renamed “Meta”. The new name signals a strategic shift: the group is now turning toward the metaverse (from the Greek prefix “meta”, meaning “beyond”, and the word “universe”), which its CEO describes as the key successor to today’s mobile internet.
But what is the metaverse?
Conceptualized in 1992 by Neal Stephenson in his science-fiction novel “Snow Crash”, the metaverse refers to virtual worlds in which users can interact with one another through connected devices such as virtual reality headsets or haptic gloves that reproduce the player’s movements. Each user creates an avatar (a digital double) to join the experience. By this definition, any virtual world in which users are represented by a fictional character made in their own image can be considered a metaverse.
By that broad reading, social networks and forums are already metaverses of a kind, insofar as users gather and interact in a parallel universe, acting through their digital doubles. To make the experience fully immersive and new for the player, the metaverse interweaves two technologies, augmented reality (AR) and virtual reality (VR), to create this collective universe.
In concrete terms, the metaverse represents an immersive experience in a three-dimensional virtual universe in which each individual is materialized by a hologram or an avatar.
Moderation in the metaverse
Moderating the metaverse would mean monitoring and controlling billions of simultaneous interactions. Meta’s incoming Chief Technology Officer, Andrew Bosworth, has expressed concern about this new challenge and the need to keep the metaverse a healthy and safe environment. Yet according to him, even with significant resources (an annual budget of several billion dollars devoted to the group’s goals in this area), moderation at that scale would be almost impossible. He has even called it an “existential threat.”
Moreover, recent events do not bode well for Facebook: the revelations of whistleblower Frances Haugen pointed to the network’s difficulties in controlling content flows and protecting its users from online hate and disinformation.
In addition, deviant behavior has already been reported in several online games that attract a predominantly young audience (Roblox, Fortnite): for example, recreations of Nazi-inspired villages in Roblox, or scenes unsuitable for minors. In this context, it is difficult for the authorities to trace and identify the individual behind such behavior when everyone appears as an avatar. This also raises the question of who should be responsible for moderating the metaverse: the companies that build it, such as Meta, or governments.
As a first response, Andrew Bosworth says he plans to create “safe zones” within the metaverse: spaces where users can isolate themselves if they wish. Will there be human moderators inside these worlds to detect, prevent, control, or sanction deviant behavior? Is there an artificial intelligence capable of taking on this mission? On this question, Facebook says it is “exploring the best uses of AI,” but that “this is still under construction.” Brittan Heller, a lawyer specializing in technology, voices her concern on this point: “In 3D, it is no longer content that needs to be regulated, but whole behaviors.”
To moderate the metaverse, Andrew Bosworth suggests building a next-generation monitoring system that would act as a kind of justice institution: enforcing strict and firm rules, warning and sanctioning individuals who break them, and, if necessary, expelling them from the virtual world.
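To make the escalation idea concrete, here is a minimal, purely hypothetical sketch of such a graduated-sanction ledger: warn on a first confirmed offense, sanction repeat offenders, and expel persistent ones. All names, thresholds, and interfaces below are illustrative assumptions, not part of any real Meta system.

```python
from collections import defaultdict

# Actions escalate as described: warn -> sanction -> expel.
WARN, SANCTION, EXPEL = "warn", "sanction", "expel"

class ModerationLedger:
    """Hypothetical per-avatar offense tracker with graduated sanctions."""

    def __init__(self, sanction_threshold=2, expel_threshold=4):
        self.offenses = defaultdict(int)  # avatar id -> confirmed offense count
        self.sanction_threshold = sanction_threshold
        self.expel_threshold = expel_threshold

    def report(self, avatar_id: str) -> str:
        """Record one confirmed rule violation and return the resulting action."""
        self.offenses[avatar_id] += 1
        count = self.offenses[avatar_id]
        if count >= self.expel_threshold:
            return EXPEL  # persistent offender: removed from the virtual world
        if count >= self.sanction_threshold:
            return SANCTION  # repeat offender: e.g. temporary mute or suspension
        return WARN  # first offense: warning only

ledger = ModerationLedger()
print(ledger.report("avatar-42"))  # first offense -> warn
print(ledger.report("avatar-42"))  # second offense -> sanction
```

The hard part, of course, is not this bookkeeping but the step it takes for granted: reliably detecting and confirming a violation in a live 3D space, which is exactly the open problem the article describes.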