Virtual worlds are increasingly providing sophisticated, realistic, and often immersive experiences that are the stuff of fantasy. You enter them by creating an avatar: a representation of yourself that could take the form of an animal, a superhero, or a historical figure, each some version of yourself or the image you’d like to project. You can often express yourself by choosing how to customize your character, and for many, avatar customization is key to a satisfying and immersive game or online experience. Avatars used to be relatively crude, even cartoonish, representations, but they are becoming increasingly lifelike, with nuanced facial expressions backed by a wealth of available emotes and actions. Most games and online spaces now offer at least a few options for choosing your avatar, and some provide in-depth tools to modify every aspect of your digital representation.

These avatars have a broad array of personal and business applications as well, from digital influencers and celebrities to customer service representatives and your digital persona in the virtual workplace. Virtual reality and augmented reality promise to take avatars to the next level, allowing an avatar’s movement to mirror the user’s gestures, expressions, and physicality.

The ability to customize how you are perceived in a virtual world can be incredibly empowering. It lets you embody rich personas to fit the environment and the circumstances, or adopt a mask to shield your private, personal self from what you wish to make public. You might use one persona for gaming, another in a professional setting, and a third for a private space with your friends.

An avatar can help someone shed constraints imposed on them by wider societal biases. For example, trans and gender non-conforming individuals can more accurately reflect their true selves, relieving the effects of gender dysphoria and transphobia, which has been shown to have therapeutic benefits. For people with disabilities, avatars can open up activities through which they can meet and interact with others. In some cases, avatars can help people avoid harassment: for example, researchers found that some women choose male avatars to avoid misogyny in World of Warcraft.

Facebook, which owns Oculus VR and is investing heavily in AR, has highlighted its technical progress in a Facebook Research project called Codec Avatars. The project focuses on ultra-realistic avatars, potentially modeled directly on users’ bodies and capturing the user’s voice, movements, and likeness. It aims to power the ‘future of connection’ with avatars that enable what Facebook calls ‘social presence’ on its VR platform.

Social presence combines the telepresence aspect of a VR experience with the social element of being able to share the experience with other people. To deliver what Facebook envisions as an “authentic social connection” in virtual reality, you have to pass the mother test: ‘your mother has to love your avatar before the two of you feel comfortable interacting as you would in real life’, as Yaser Sheikh, Director of Research at Facebook Reality Labs, put it.

While we’d hope your mother would love whatever avatar you make, Facebook seems to mean that Codec Avatars should be indistinguishable from their human counterparts: a “picture-perfect representation of the sender’s likeness” that has the “unique qualities that make you instantly recognizable,” captured by a full-body scan and animated by egocentric surveillance. While some may prefer exact replicas like these, the project is not yet embracing a future that allows people the freedom to be whoever they want to be online.

By contrast, Epic Games has introduced MetaHumans, which also enables lifelike animation via its Unreal Engine and motion capture, but does not require a copy of the user. Instead, it gives users the choice to create and control how they appear in virtual worlds.

Facebook’s plan for Codec Avatars is to verify users “through a combination of user authentication, device authentication, and hardware encryption,” and the company is “exploring the idea of securing future avatars through an authentic account.” This obsession with authenticated perfect replicas mirrors Facebook’s controversial history of insisting on “real names,” later loosened somewhat to allow “authentic names,” without resolving the inherent problems. Indeed, Facebook’s insistence on tying your Oculus account to your Facebook account (and its authentic name) already brings these policies together, for the worse. If Facebook insists on indistinguishable avatars, tied to a Facebook “authentic” account, in its future of social presence, this will put the names policy on steroids.

Facebook should respect the will of individuals not to disclose their real identities online.

Until the end of next year, Oculus will still allow existing users to log in with a separate, unique VR profile, which does not need to be your Facebook name. With Facebook login, users can still set their name’s visibility to ‘Only Me’ in Oculus settings, so that at least other people on Oculus won’t be able to find you by your Facebook name. But this is a far cry from designing an online identity system that celebrates the power of avatars to enable people to be who they want to be.

Lifelike Avatars and Profiles of Faces, Bodies, and Behavior on the Horizon

A key part of realistic avatars is mimicking the user’s body and facial expressions, derived from collecting non-verbal and body cues (the way you frown, tilt your head, or move your eyes) as well as your body structure and motion. Facebook’s Modular Codec Avatar system seeks to make “inferences about what a face should look like” to construct authentic simulations, whereas the original Codec Avatar system relied more on direct comparison with the person.

While still a long way from the hyper-realistic Codec Avatars project, Facebook recently announced a substantial step down that path, rolling out avatars with real-time animated gestures and expressions for its virtual reality world, Horizon. These avatars will later be available for other Quest app developers to integrate into their own work.

Facebook’s Codec Avatars research suggests that it will eventually require a lot of sensitive information about its users’ faces and body language: both their detailed physical appearance and structure (to recognize users for authentication applications, and to produce photorealistic avatars, including full-body avatars), and their moment-to-moment emotions and behaviors, captured in order to replicate them realistically in real time in a virtual or augmented social setting.

While this technology is still in development, the inferences drawn from these egocentric data collection practices require even stronger human rights protections. Algorithmic tools can leverage the platform’s intimate knowledge of its users, assembled from thousands of seemingly unrelated data points, to draw inferences from both individual and collective behavior.

Research using relatively unsophisticated cartoon avatars suggests that some of a user’s personality traits can be accurately inferred from their avatar. Animating hyper-realistic avatars of natural persons, such as Facebook’s Codec Avatars, will require collecting much more personal data. Think of it like walking around strapped to a dubious lie detector that measures your temperature, body responses, and heart rate as you go about your day.

Inferences based on egocentric collection of data about users’ emotions, attention, likes, or dislikes give platforms the power to control what your virtual vision sees, what your virtual body looks like, and how your avatar can behave. While wearing your headset, you will see the 3D world through a lens made by those who control the infrastructure.

Realistic Avatars Require Robust Security

Hyper-realistic avatars also raise concerns about “deep fakes.” Right now, a deep fake involving a synthetic video or audio “recording” may be mistaken for a real recording of the person it depicts. The unauthorized use of an avatar could likewise be confused with the real person it depicts. While any avatar, realistic or not, may be driven by a third party, hyper-realistic avatars, with human-like expressions and gestures, can more easily build trust. Worse, in a dystopian future, realistic avatars of people you know could be animated automatically, for advertising or to influence opinion. For example, imagine an uncannily convincing ad in which hyper-realistic avatars of your friends swoon over a product, or in which an avatar of your crush tells you how good you’ll look in a new line of clothes. More nefariously, hyper-realistic avatars of familiar people could be used for social engineering, or to draw people down the rabbit hole of conspiracy theories and radicalization.

‘Deep fake’ issues, in which a third party independently makes a realistic fake depiction of a real person, are well covered by existing law. The personal data captured to make ultra-realistic avatars, which is not otherwise readily available to the public, should not be used to act out expressions or interactions that people did not actually consent to present. To protect against this and put users in charge of their experience, they must have strong security measures around the use of their accounts, what data is collected, and how that data is used.

A secure system for authentication does not require a verified match to one’s offline self. For some, of course, verification linked to an offline identity may be valuable, but for others the true value may lie in a way to connect without revealing their identity. Even if a user is presenting differently from their IRL body, they may still want to develop a consistent reputation and goodwill with their avatar persona, especially if it is used across a range of experiences. This important security and authentication can be provided without requiring a link to an authentic-name account, or verification that the avatar presented matches the offline body.

For example, the service could verify that the driver of an avatar is the same person who created it, without simultaneously revealing who the driver is offline. With appropriate privacy controls and data use limitations, a VR/AR device is well positioned to verify the account holder biometrically, and thereby verify a consistent driver, even if that driver is never matched to an offline identity.
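As a minimal sketch of how such pseudonymous verification could work (assuming Python’s third-party cryptography library; the avatar handle, the challenge flow, and the on-device biometric gate are illustrative assumptions, not any platform’s actual design): the headset holds a private key that never leaves the device, the service stores only the matching public key bound to the avatar’s handle, and each session the device proves continuity by signing a fresh challenge.

    # Sketch: pseudonymous avatar authentication with a device-held key pair.
    # Assumes the headset gates access to the key locally (e.g., with an
    # on-device biometric check); the service only ever sees a public key and
    # an avatar handle, never an offline identity. Illustrative only.
    import secrets

    from cryptography.hazmat.primitives import serialization
    from cryptography.hazmat.primitives.asymmetric.ed25519 import (
        Ed25519PrivateKey,
        Ed25519PublicKey,
    )

    # Device side: generate a key pair when the avatar is first registered.
    device_key = Ed25519PrivateKey.generate()
    public_key_bytes = device_key.public_key().public_bytes(
        encoding=serialization.Encoding.Raw,
        format=serialization.PublicFormat.Raw,
    )

    # Service side: store only (avatar handle -> public key) and issue a
    # fresh random challenge for each login.
    registry = {"glorious-fox": public_key_bytes}
    challenge = secrets.token_bytes(32)

    # Device side: sign the challenge, proving this is the same driver who
    # registered the avatar, without saying who that driver is offline.
    signature = device_key.sign(challenge)

    # Service side: verify against the stored public key; raises
    # InvalidSignature if the driver does not match.
    Ed25519PublicKey.from_public_bytes(registry["glorious-fox"]).verify(signature, challenge)
    print("consistent avatar driver verified; offline identity never revealed")

In a design like this, the service learns only that the same device key that registered the avatar is present now; linking it to a legal identity would require an extra, deliberate step rather than being the default.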

Transparency and User Control Are Vital for the Avatars of the Virtual World

In the era of lifelike avatars, it is even more important for companies to give users transparency into and control over the algorithms that determine why their avatars behave in specific ways, and to provide strong user controls over the use of inferences.

Facebook’s Responsible Innovation Principles, which allude to more transparency and control, are an important first step, but they remain incomplete and flawed. The first principle (“Never surprise people”)  fortunately implies greater transparency moving forward. Indeed, many of the biggest privacy scandals have stemmed from people being surprised by unfair data processing practices, even if the practice had been included in a privacy policy.  Simply informing people of your data practices, even if effectively done, does not ensure that the practices are good ones. 

Likewise, the second principle (“Provide controls that matter”) does not necessarily ensure that you as a user will have controls over everything you think matters. One might debate what falls into the category of things that “matter” enough to warrant controls, such as the biometric data collected, the inferences generated by the service, or the look of one’s avatar. This is particularly important given how much data can be collected for a lifelike avatar, and it raises critical questions about how that data could be used, even while the technology is in its infancy. For example, if the experience requires an avatar designed to reflect your identity, what is at stake inside the experience is your sense of self. The platform won’t just control the quality of the experience you observe (like watching a movie); it will control an experience that has your identity and sense of self at its core. This is an unprecedented ability to produce highly tailored forms of psychological manipulation based on your behavior in real time.

Without strong user controls, social VR platforms or third-party developers may be tempted to use this data for other purposes, including psychological profiling of users’ emotions, interests, and attitudes, such as detecting nuances of how people feel about particular situations, topics, or other people.  It could be used to make emotionally manipulative content that subtly mirrors the appearance or mannerisms of people close to us, perhaps in ways we can’t quite put our fingers on.  

Data protection laws, like the GDPR, require that personal data collected for a specific purpose (like making your avatar more emotionally realistic in a VR experience) should not be used for other purposes (like calibrating ads to optimize your emotional reactions to them or mimicking your mannerisms in ads shown to your friends). 

Facebook’s VR/AR policies for third-party developers rightly prevent them from using Oculus user data for marketing or advertising, performing or facilitating surveillance for law enforcement purposes (without a valid court order), attempting to identify a natural person, or combining user data with data from a third party, among other things. But the company has not committed to these restrictions, or to allowing strong user controls, for its own uses of the data.

Facebook should clarify and expand upon their principles, and confirm they understand that transparency and controls that “matter” include transparency about and control over not only the form and shape of the avatar but also the use or disclosure of the inferences the platform will make about users (their behavior, emotions, personality, etc.), including the processing of personal data running in the background. 

We urge Facebook to give users control and put people in charge of their experience. The notion that people must replicate their physical forms online to achieve the “power of connection” fails to recognize that many people wish to connect in a variety of ways, including by using different avatars to express themselves. For some, their avatar may indeed be a perfect replica of their real-world body; it is critical for inclusion to allow avatar design options that reflect the diversity of users. But for others, their authentic self is what they’ve designed in their minds or know in their hearts, and what they are finally able to reflect in glorious high resolution in a virtual world.


