Virtual reality and augmented reality technologies (VR/AR) are rapidly maturing and becoming accessible to a wider audience, especially as the pandemic drives more people to virtual activities. This technology holds the promise to entertain and educate us, to connect and enhance our lives, and even to help advocate for our rights. But it also risks eroding them online. The headsets and devices can gather deeply personal information about you and the world around you, and VR/AR services often store this data on their servers. And by introducing artificial intelligence to VR/AR, the privacy and security hazards are augmented, too, as the devices and services gather and analyze data at an unprecedented level.

2020 confirmed that VR/AR users care about their privacy. Millions of users reacted furiously against Facebook when it announced it would force Oculus users to log into their headsets with a Facebook account within two years or potentially brick their devices. (All new Oculus Quest 2 users can only sign up using a Facebook account.) In If Privacy Dies in VR, It Dies in Real Life, we explained why Oculus users might not want to use their real names. Without anonymity, Oculus hangs vulnerable users out to dry, such as VR activists in Hong Kong and LGBTQ+ users who cannot safely reveal their identity.

VR/AR devices let us enter a virtual world or see the real world with overlaid digital content, but at a price. VR/AR devices track our environment and intimate details about our lives. A Stanford research study explained that virtual reality requires the system to measure body movements because the content responds accordingly:

For example, in VR, people turn their physical head around to make eye contact with other virtual reality users, use their legs to walk in the physical room to get across a virtual room, and move their physical arms to grasp virtual objects. These tracking data can be recorded and stored for later examination (...) With VR, in addition to recording personal data regarding people’s location, social ties, verbal communication, search queries, and product preferences, technology companies will also collect nonverbal behavior—for example, users’ posture, eye gaze, gestures, facial expressions, and interpersonal distance.

Linking various Facebook services under a "unified login account" also raised serious concerns about the consolidation of data collection practices across Facebook products. Facebook already has a vast collection of data gathered from across the web and from your devices. Combining this with sensitive body data and biometric identifiers detected by the headsets (including our interactions and reactions to objects and people) can further cement Facebook's monopolistic power in the online advertising ecosystem. In the European Union, forcing a user to sign up with a Facebook account may also run afoul of the "coupling prohibition" under the General Data Protection Regulation, which states that making a service dependent upon consent to something that has nothing to do with the service means consent is not, actually, voluntary.

Earlier this year, we called upon antitrust enforcers to address yet another Facebook broken promise about privacy. And they did. Recently, Germany's competition regulator began examining the linkage between Oculus hardware and the rest of the Facebook platform. (In 2019, the same regulator prohibited Facebook from extensively collecting and merging user data from different sources.)

VR/AR devices, which include cameras, microphones, and sensors, help us interact with the real world (and ensure you do not crash into your table). That means information about your environment, such as your home, office, or even your community, can also be collected and shared to target advertisements to you. And once collected, all this information is potentially available to the government. Even if you never use this equipment, sharing a space with someone who does may put your privacy at risk.

VR/AR devices in your home can also collect all-encompassing audio and video, along with telemetry about your movements, depth data, and images. This data can be used to build a highly accurate geometric representation of your home. In Come Back with a Warrant for my Virtual House, we explained why the government must not get warrantless access to this sensitive information, even when a third-party AR/VR provider holds that information. Our analysis builds on the landmark Supreme Court case Carpenter v. United States, which held that accessing historical records containing a cellphone's physical locations requires a search warrant, even though they were held by a third party. We also are protected by the longstanding rule in Kyllo v. United States: when new technologies can "explore details of the home that would previously have been unknowable without physical intrusion, the surveillance is a 'search'" and is presumptively unreasonable without a warrant.

As the use of augmented reality has grown, so have the promises and dangers it poses. Glasses that augment reality may also mean a wearer could be recording your conversations while mapping the environment around you in precise detail and in real time. As we explained in Augmented Reality Must Have Augmented Privacy, if these technologies are massively adopted, the scope and scale of AR recording could give rise to a global panopticon of constant surveillance in public or semi-public spaces.

Recognizing that people have historically enjoyed effective anonymity and privacy when in these spaces, we explained how the U.S. Constitution and international human rights law require the government to obtain a warrant to access the records generated by augmented reality, and require tech companies to respect and protect their users’ right to data privacy. Specifically:

Companies must, therefore, collect, use, and share their users' AR data only as minimally necessary to provide the specific service their users asked for. Companies should also limit the amount of data transmitted to the cloud, and the period it is retained, while investing in robust security and strong encryption, with user-held keys, to give users control over information collected. Moreover, we need strong transparency policies, explicitly stating the purposes for and means of data processing, and allowing users to securely access and port their data.

Augmented reality may pose unprecedented dangers of a dystopian future. But with strong policies, robust transparency, wise courts, modernized statutes, and privacy-by-design engineering, we can hold back that dystopia and reap the rewards of this technology.

Looking forward, we plan to delve into the privacy and data protection risks associated with the broad amount of information collected about our biometrics and body data, our fitness levels (and our vitals), and our "biometric inferred data." This technology has the potential to monitor the tone of our voice, our facial and gaze expressions, our heartbeats, and our body temperature. It can track the unconscious responses our bodies make, like when we blink, where we look, and how long we pay attention. With machine learning, providers can use this data to infer attitudes, emotions, personality traits, preferences, mental health, cognitive processes, skills, and advertisements' effectiveness. Fitness and health apps already ask users to input their feelings, and some are embarking on tone-of-voice analysis. Police have leveraged Fitbit's heart rate data in a criminal investigation. Law enforcement is incorporating AI into a vast range of criminal investigative contexts, with troubling implications. As with every new technology, we must enact (and enforce) the necessary privacy laws before police can abuse it.

Companies' continued efforts to quantify our public, social, and inner lives will profoundly impact our daily lives in the years ahead. By combining VR/AR with machine learning, and by continually expanding data collection beyond our behavior on devices and into the physical environment, these companies can shape a world where humans are more vulnerable to corporate influence and government pressure. But with proper safeguards and legal restrictions, a different, and better, reality is possible.

This article is part of our Year in Review series. Read other articles about the fight for digital rights in 2020.