Reality Check: How is the EU ensuring data protection in XR Technologies?


Extended reality (XR) technologies rely on large quantities of varied data about users and their environment, creating risks to users’ fundamental rights. Understanding XR technologies and their associated data flows is crucial because of their central role in enabling immersive experiences, which promise wide societal uptake.

Data processing in XR environments implicates the fundamental rights to respect for private life and personal data protection under the EU Charter of Fundamental Rights (Charter) and triggers the application of the General Data Protection Regulation (GDPR), as well as other new and pending EU laws governing data and the digital environment. This blog describes the spectrum of XR technologies and analyses the extent to which the EU legal framework provides safeguards for the rights and freedoms of its users, with a focus on privacy and data protection.                                                                                                                           

I. The Spectrum of Extended Reality Technologies: Augmented Reality, Virtual Reality, and Mixed Reality

XR is an umbrella term denoting a spectrum of technologies, including augmented reality (AR), virtual reality (VR), and mixed reality (MR). XR also encompasses potential or forthcoming advances that enhance, alter, or replace a user’s physical surroundings through varying levels of interaction between the digital and physical worlds. These may feature combinations of XR and related technologies, such as brain-computer interfaces (BCIs), digital identity, computer vision, and edge computing.

AR, VR, and MR have distinct features requiring different data processing operations. However, most XR technologies do or will likely depend on effective tracking and “six degrees of freedom”: the ability to look and move around an environment. To enable this, devices often contain a variety of sensors working in tandem to offset individual sensors’ shortcomings, such as inertial measurement units and optical and audio sensors. These gather large quantities of varied data about the user’s movements and environment, which are fed to algorithms that map out the device’s surroundings and determine its exact location. Certain sensor data may be distributed to other entities to power shared experiences and pursue other purposes, such as advertising, analytics, and security.
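
To make this tracking pipeline concrete, the snippet below sketches one widely used fusion technique: a complementary filter that blends gyroscope and accelerometer readings into a single orientation estimate. It is a minimal illustration written for this post (the function name and parameters are our own invention), not code from any actual XR device.

```python
import math

def fuse_pitch(prev_pitch, gyro_rate, accel_x, accel_z, dt, alpha=0.98):
    """Estimate a headset's pitch angle (radians) by fusing two sensors.

    The gyroscope tracks fast head movements smoothly but drifts over
    time; the accelerometer is noisy but anchored to gravity. Blending
    the two offsets each sensor's shortcomings.
    """
    # Short-term estimate: integrate the gyroscope's angular rate.
    gyro_pitch = prev_pitch + gyro_rate * dt
    # Long-term reference: derive pitch from the gravity vector.
    accel_pitch = math.atan2(accel_x, accel_z)
    # Weighted blend: trust the gyroscope now, the accelerometer over time.
    return alpha * gyro_pitch + (1 - alpha) * accel_pitch
```

A full six-degrees-of-freedom system runs estimators like this (or more sophisticated Kalman filters) for every axis of rotation and translation, many times per second, which is why XR devices generate such a continuous, fine-grained stream of movement data.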

One type of XR is AR, which layers virtual elements (like sound, video, and graphics) onto real-world environments. To present virtual content to a user, AR has thus far relied on smartphones and heads-up displays, such as those found in vehicles to relay route and other information to the driver. New media under development for AR include smart contact lenses.

While AR experiences incorporate the real-world environment, VR technologies replace it with a completely virtual one. VR technologies utilise a mixture of inputs and outputs that cause the user’s motor and perceptual functions to work largely as they would in physical reality. Examples include head-mounted displays (HMDs) and controllers that enable the user to interact with the environment. While most popular in gaming, VR also has applications in the workplace, education, and healthcare.

In the middle of the XR spectrum are MR technologies, which combine elements of both VR and AR. Like AR, MR superimposes virtual elements onto real-world settings, but these elements go beyond simple overlays by persistently interacting with the physical world much like a material object would. To enable this interaction, MR may use artificial intelligence (AI) systems, like machine learning models. Examples of HMDs used in MR include the Magic Leap headset and Microsoft HoloLens.

II. Privacy and data protection risks of XR technologies

As “[t]he constitutional dimension of Big Data is hidden behind the opacity of algorithmic technologies”, so it is with XR technologies. Users understand neither XR’s underlying technologies and data flows nor the monetisation of their highly personal data, and they may not realise that they could be subject to subliminal techniques or automated decisions that cause harm. The deployment of XR technologies in a setting such as the workplace is illustrative: employees who use XR technologies can unintentionally reveal ‘biometric psychography’, which employers could utilise for performance reviews.

Techniques that use eye or motion tracking and video analytics can purportedly recognise individuals’ emotional states. Such techniques would allow organisations to measure individuals’ reactions to specific stimuli, such as advertising content, particularly if this information is combined with data collected from the use of BCIs and analysed by AI systems. Such analysis may also allow sensitive data about individuals to be inferred, like ethnicity and health conditions. Users and bystanders may retain little control over access to and disclosure of their data.
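
As a hypothetical illustration of how such inference pipelines begin, the sketch below aggregates raw gaze samples into per-stimulus dwell times: a derived measure an advertiser might treat as a proxy for interest or engagement. The data format and function are invented for this example, not taken from any real XR platform.

```python
from collections import defaultdict

def dwell_time_per_stimulus(gaze_samples):
    """Total seconds the user's gaze rested on each on-screen stimulus.

    gaze_samples: chronological (timestamp_seconds, stimulus_id) pairs,
    one per eye-tracker frame, where stimulus_id names whatever the
    gaze ray currently hits (an advert, a product, another avatar).
    """
    totals = defaultdict(float)
    prev_time, prev_stimulus = None, None
    for timestamp, stimulus in gaze_samples:
        if prev_stimulus is not None:
            # Attribute the elapsed interval to the previously fixated item.
            totals[prev_stimulus] += timestamp - prev_time
        prev_time, prev_stimulus = timestamp, stimulus
    return dict(totals)

# Example: two seconds fixating an advert, one second on another avatar.
samples = [(0.0, "ad_banner"), (1.0, "ad_banner"), (2.0, "avatar_42"), (3.0, "avatar_42")]
print(dwell_time_per_stimulus(samples))  # {'ad_banner': 2.0, 'avatar_42': 1.0}
```

Trivial as it is, output like this becomes personal data once linked to a user identity, and accumulating it over time or combining it with other signals is what enables the profiling described above.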

XR-enabled worlds are arguably unique because of their realism and ubiquity. Users can become emotionally immersed in these virtual spaces, which they access through various devices using a single virtual identity. This may lead users to reveal more data about themselves. Additionally, some companies are rolling out ID systems for their XR-based products, even though the EU has proposed a new framework for an EU Digital ID. The latter aims to address some of the data combination and repurposing concerns raised by social media providers’ identity solutions.

XR interfaces like HMDs enable the collection and measurement of personal data that was previously unavailable in commercial settings, such as posture, gaze, gestures, and interpersonal distance. This may expose data subjects to the creation of increasingly detailed individual profiles (notably by exploiting the projection of small body variations onto users’ avatars), increasingly tailored advertising, and other uses. Some of these may be beneficial, like helping counter deepfakes, but others may have negative consequences.

III. The constitutional dimension of the EU’s privacy protections for XR products

XR technologies raise fundamental rights issues. This blog explores the potential consequences of XR for two of those rights: privacy and data protection under Articles 7 and 8 of the EU Charter. The application of both would be triggered by XR-based products and services that rely on the collection of large volumes of often sensitive personal data, and on the insights derived from it, as we describe below.

Commercial deployers of XR technologies that process personal data of EU individuals are subject to the GDPR and the ePrivacy Directive. Most data collected via XR technologies from users and their avatars relates to an identified or identifiable natural person. Furthermore, interfaces like HMDs would likely be qualified as “terminal equipment” under the ePrivacy Directive, which generally restricts the collection of information from those devices to what is objectively necessary for service provision. 

Since EU data protection law is agnostic to the devices and interfaces underlying data processing, it applies to most commercial applications of XR technologies. Below we analyse the tensions between some XR use cases and the GDPR, and whether novel EU legislation could have positive spillover effects on data protection in XR. 

IV. GDPR compliance poses significant hurdles for XR technologies

An initial tension relates to the alignment of XR technologies with core GDPR principles like data minimisation and purpose limitation. Organisations need to ensure that processed data is limited to what is necessary to pursue their legitimate purposes, and that it is not further processed in an incompatible manner. Given the vast amounts of granular personal data XR technologies can collect, data minimisation should be embedded into the design of XR interfaces and supporting software. Moreover, controllers must check whether specific re-uses of personal data collected in XR would pass the strict Article 6(4) purpose compatibility test, or seek the individual’s separate consent. 

Complying with transparency and notice obligations in XR environments may also prove difficult. This could warrant innovative solutions, such as audio cues or prominent just-in-time visual notices. However, the long list of mandatory information elements under Articles 13 and 14 GDPR may force controllers to adopt a layered approach in their notices. Furthermore, controllers should avoid implementing manipulative design (so-called ‘dark patterns’) to trick users into oversharing or nudge them away from legally required notices.

Explicit consent may be required to legitimise many data processing operations that occur in the XR ecosystem. Controllers should collect individuals’ consent when they are able to infer sensitive data covered by Article 9(1) GDPR from their behavioural patterns. They likely cannot rely on the ‘manifestly made public’ exemption: as with inferences made about social media users, data subjects do not manifestly make such data public by merely using XR technologies. Moreover, on 1 August 2022, the CJEU held that data capable of revealing sexual orientation by means of an intellectual operation involving comparison or deduction qualify as special categories of data. Since information collected via eye, gait, and other non-verbal body-based measurements could be used to deduce equally sensitive personal data, organisations using XR would likely be prohibited from processing such information absent explicit consent.

Behavioural data gathered in immersive settings can, when combined with information about virtual stimuli, enable conclusions to be drawn about individuals’ psychological profiles and interests. However, such data do not qualify as the more thoroughly protected ‘biometric data’ under the GDPR when they are not used to uniquely identify the individual. In its initial form, the AI Act likewise focused on strictly regulating remote biometric identification systems as ‘high-risk’ applications. More recently, a draft report from the European Parliament proposes to extend the Act’s scope to emotion recognition and biometric categorisation systems that rely on ‘biometric-based data’, that is, “data resulting from specific technical processing relating to physical, physiological or behavioural signals of a natural person (…) which may or may not allow or confirm the unique identification of a natural person”.

V. New EU laws will complement the GDPR’s requirements for XR-based services

Besides the GDPR and the proposed AI Act, the EU will also regulate the use of XR technologies through other pending and adopted laws. In fact, some of these new laws aim to fill data protection gaps in the EU and refer to several Charter rights that they purport to implement, including consumer protection, non-discrimination, privacy, data protection, and effective judicial remedies.

The Digital Services Act will likely cover deployers of XR technologies, notably centralised operators of immersive worlds. Intermediaries that qualify as online platforms may not present profiling-based advertising to users who are likely minors, nor base such advertising on sensitive data. If they use automated systems to recommend content, they may have to offer users a non-profiling-based option for receiving recommendations.

XR deployers may also be covered by the Digital Markets Act if they are ‘gatekeepers’, i.e., if they both provide ‘core platform services’ (such as online intermediation services, marketplaces, and operating systems) and meet certain thresholds. If so, they are prohibited from leveraging insights from user interactions with other businesses to deliver targeted ads, as well as from combining or cross-using personal data. The DMA also extends users’ GDPR portability rights, potentially encompassing telemetry data generated by XR platforms (e.g., a detailed record of their eye movements).

Given the complexity of data processing operations in XR-enabled applications, individuals may be confused about how to exercise their GDPR rights, which are also enshrined in Article 8(2) of the Charter. Duly notified and independent data intermediaries under the EU’s Data Governance Act (DGA) could empower consumers to make informed choices.

In conclusion, while the GDPR may address most of the privacy and data protection risks posed by XR technologies, there are potential gaps in protection and practical compliance hurdles for businesses. For example, the GDPR does not ensure protections for body-based data that reflect its heightened sensitivity, and it imposes transparency duties that are difficult to fulfil in immersive scenarios. New legislative initiatives, such as the AI Act, the DSA, and the DMA, aim to further operationalise the EU Charter’s fundamental rights and may place guardrails on pervasive personal data processing by regulating personalised ads and data combinations. In addition, the European Commission recently revealed that it will propose specific legislation “on virtual worlds, including the metaverse”. Legislators should remain apprised of technological advances in XR lest novel threats to fundamental rights remain unaddressed.

Suggested Citation

Sebastião Barros Vale and Daniel Berrick, ‘Reality Check: How is the EU ensuring data protection in XR Technologies?’ (The Digital Constitutionalist, 25 January 2023). Available at: https://digi-con.org/reality-check-how-is-the-eu-ensuring-data-protection-in-xr-technologies/

Sebastião Barros Vale
EU Policy Fellow at The Future of Privacy Forum (FPF)

Sebastião Barros Vale serves as Senior Counsel at The Future of Privacy Forum.

Daniel Berrick
Policy Counsel at The Future of Privacy Forum (FPF)

Daniel Berrick, JD, is Policy Counsel at the Future of Privacy Forum.
