Democracy and security in an age of quantum transparency
- By Chris Kremidas-Courtney
Peace, Security & Defence
Today, we face a choice between two futures. In one, we step into a metaverse where our every move is tracked, our privacy is a thing of the past, democracy is diminished and personal autonomy is limited. In the other, we move in and out of immersive digital experiences like the metaverse with our privacy protected, our voices within our democracies enhanced and our free will maintained. But is that even possible?
Shocking new research by virtual reality (VR) pioneer and XRSI Global Technology Advisor Louis Rosenberg and researchers from the University of California, Berkeley and RWTH Aachen University shows that motion data from the VR game ‘Beat Saber’ can uniquely identify players in less than ten seconds through the biomechanics of their movement. The implications for privacy and human autonomy could be enormous.
The researchers were able to identify individual users out of a pool of more than 50,000 candidates with 94.33% accuracy from 100 seconds of motion data, which is more accurate than most fingerprint scanners. They also achieved a 73.20% accuracy rate from just ten seconds of movement.
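To make concrete how little data such identification needs, the following is a minimal, illustrative sketch, not the researchers' actual method (which uses far richer features and large-scale machine learning). It summarises each player's head-motion trace with simple per-axis statistics and matches an unlabelled trace to the nearest enrolled player; all names and numeric values are hypothetical.

```python
import math
import random
from statistics import mean, stdev

def motion_features(trace):
    """Summarise a head-position trace (list of (x, y, z) tuples) with
    per-axis mean, standard deviation and mean per-sample movement --
    a crude stand-in for the biomechanical features used in the study."""
    features = []
    for axis in zip(*trace):
        steps = [abs(b - a) for a, b in zip(axis, axis[1:])]
        features += [mean(axis), stdev(axis), mean(steps)]
    return features

def identify(enrolled, unknown_trace):
    """Return the enrolled player whose feature vector is nearest
    (Euclidean distance) to the unknown trace's features."""
    f = motion_features(unknown_trace)
    return min(enrolled, key=lambda name: math.dist(f, enrolled[name]))

def simulate_player(centre, jitter, n=900, seed=0):
    """Hypothetical 10-second trace at 90 Hz around a characteristic pose."""
    rng = random.Random(seed)
    return [tuple(c + rng.gauss(0, jitter) for c in centre) for _ in range(n)]

# Two players with distinct movement styles (illustrative values).
alice = simulate_player((0.0, 1.6, 0.0), jitter=0.05, seed=1)  # calm mover
bob = simulate_player((0.1, 1.8, 0.1), jitter=0.20, seed=2)    # energetic mover
enrolled = {"alice": motion_features(alice), "bob": motion_features(bob)}

# A fresh trace from the energetic player is matched by movement style alone.
new_trace = simulate_player((0.1, 1.8, 0.1), jitter=0.20, seed=3)
print(identify(enrolled, new_trace))  # bob
```

Even this toy statistical fingerprint reliably separates the two simulated players, which is the core of the concern: no login, no face, no name is needed, only the way a body moves.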
This research, and the sophisticated level of surveillance that extended reality (XR) technologies now make possible, raise dangerous new concerns around privacy, autonomy and human free will.
Malicious actors can use this technology to amplify political or social divides
The internet today is fuelled by personal data through every click, like, share or any other type of recordable action we take online. In the past few decades, the creation, processing and sharing of data have become so common that most people are no longer aware of how much personal data they give away every day, despite countless data breaches and scandals.
Citizens relinquishing their data without realising the risks or consequences has become the norm everywhere. While this is not new, the difference today is that we are moving towards an era of constant data capture and its use in various ways to manipulate us, impacting our choices, worldview and trust in others. This is especially a concern with the increased adoption of immersive technologies and a strong push to build the next iteration of the internet, known as the metaverse.
While the metaverse has the potential to revolutionise the way we interact with each other and with digital content, it also presents potential dangers if this technology is not regulated in a timely manner.
One of the primary concerns with XR technologies is the massive data collection about individuals, including their location, behaviours, motion, gaze, posture and even emotions. Not only can the collected data be used for targeted advertising, surveillance and tracking, but the level of immersion and engagement could lead to addiction, making it difficult for individuals to disengage and maintain control over their own lives. We are already witnessing a similar phenomenon with mobile phone addiction.
XR can further erode our agency by stimulating our senses, manipulating our perceptions and influencing our behaviour in subtle ways. Malicious actors can use this technology to amplify political or social divides and to spread disinformation or propaganda, further eroding our ability to make informed decisions and act autonomously.
Personal data of this nature should only be collected and shared following a completed opt-in process by the person
Some of these new XR technologies go beyond monitoring and tracking and permit real-time capture of any actions that we take, whether voluntary or involuntary.
At the same time, while some current XR hardware can read EEG, ECG and galvanic skin responses, there is a disappointing lack of established regulations and guidelines to further uphold the individual’s privacy rights. For example, medical devices that capture many of the same bio-signals are subject to strict governance and regulatory guidelines, while XR technologies on the consumer market are not.
As these technologies evolve, the capability to infer moods and emotions, mental imagery and language is either already available or in an advanced stage of development.
Looking through an EU privacy lens, these devices are capturing the personal data of a person’s ‘mental identity’, which qualifies as ‘personal data’ under Article 4(1) of the General Data Protection Regulation (GDPR). Therefore, protecting the mental identity of a person in XR is paramount, and citizens should have the option to decide if they wish to share their personal data or not.
As with any personal data being collected under GDPR, personal data of this nature should only be collected and shared following a completed opt-in process by the person, with a clear definition of what they are agreeing to share.
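The opt-in principle described above can be sketched in code. The following is a minimal illustration under assumed field names and data categories, not a GDPR-compliance implementation: consent defaults to refusal, a grant is recorded only together with a clear description of what is being shared, and collection of any category without an explicit grant is rejected.

```python
from dataclasses import dataclass, field

@dataclass
class ConsentRecord:
    """Explicit opt-in consent: collection is permitted only for data
    categories the person has specifically agreed to share."""
    subject_id: str
    granted: set = field(default_factory=set)

    def grant(self, category: str, description: str) -> None:
        # Record an opt-in only alongside a clear definition of what
        # the person is agreeing to share (the description).
        print(f"{self.subject_id} agreed to share {category}: {description}")
        self.granted.add(category)

    def may_collect(self, category: str) -> bool:
        # The default is refusal: no explicit grant, no collection.
        return category in self.granted

# Hypothetical data categories an XR session might want to record.
consent = ConsentRecord(subject_id="user-123")
consent.grant("hand_motion", "controller position, used for gameplay only")

print(consent.may_collect("hand_motion"))  # True: explicitly opted in
print(consent.may_collect("eye_gaze"))     # False: never agreed, so refused
```

The design choice that matters here is the default: absent an affirmative, informed grant, every request to collect is denied, mirroring the opt-in model the GDPR requires.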
The metaverse and the data that we generate by using immersive technologies need regulations now
Furthermore, any capability to influence a person’s moods, emotions, mental imagery or language centres should require the explicit and timely permission of the participant at the start of any XR session where it may be used, with the mechanisms, potential outcomes and hazards made clear to them immediately before they are given the opportunity to opt in or out.
The danger here is not just that big technology organisations such as Meta own this sophisticated data and, in turn, pieces of our lives, but that authoritarian rivals like Russia, China and others may gain access to the data and misuse it in ways that weaken our democracies and divide our societies.
We must take lessons from both Brexit and the 2016 US presidential election, where Cambridge Analytica had access to over 5,000 data points on each person and ran influence campaigns to sway voters. As early as 2018, XR technologies could capture over two million unique body-movement recordings in just 20 minutes, and with this recent discovery it now takes only ten seconds of such data to uniquely identify a person.
In light of these new developments, our current norms for personal data or ‘sensitive data’ are inadequate for protecting data on human biomechanics, such as gaze, pose, gait and other inferences about us or entire segments of our societies.
XR data qualifies as biometrically inferred data (BID), and a lack of safeguards around health inferences and BID could lead to human rights violations; we will need novel ways to address these risks. The metaverse and the data that we generate by using immersive technologies need regulations now, before it is too late and they are turned into a weapon against us and our autonomy.
One of the most significant recommendations that Europe could enforce is banning addictive engagement algorithms
Just this week, Meta revealed its roadmap for AR/VR hardware and doubled down on its manipulative ad tech business model, as CEO Mark Zuckerberg noted in an internal meeting. “We should be able to run a very good ad business, I think it’s easy to imagine how ads would show up in space when you have AR glasses on. Our ability to track conversions, which is where there has been a lot of focus as a company, should also be close to 100%. If we’re hitting anything near projections, it will be a tremendous business,” he said. “A business unlike anything we’ve seen on mobile phones before.”
Meta’s XR lobbyists have already set up shop in Washington DC to influence the US Congress by various means with the goal of limiting regulation of their technology and business model. So, it’s up to Europe to draw a strict updated GDPR-like regulatory boundary to protect its citizens and their privacy amidst these new findings and revelations.
Based on everything XRSI has studied, one of the most significant recommendations that Europe could enforce is banning addictive engagement algorithms and encouraging ‘code for well-being’ practices. With the intersections of artificial intelligence and XR technology evolving at the speed of light, now is the time to regulate and not wait until much damage has been done.
The metaverse can enable us to create a better and more just world
Attention must be given by the European Commission and member states to the implementation of the Digital Services Act (DSA) and Digital Markets Act (DMA) when it comes to their application in the metaverse.
Explicit and contextual guidance on protecting BID is needed as an update to the GDPR, DSA and DMA to address these new developments that render privacy impossible to achieve.
But first, there is a dire need to educate citizens and policymakers so they can help navigate our immersive future.
Most importantly, however, Europe must demand transparency from tech companies so we can have better awareness of new developments and be better able to hold them accountable.
The metaverse can enable us to create a better and more just world, enable our green transition and open a new era of innovation and global connection – but only if we regulate it in ways that preserve and support privacy, human agency, democracy and equality.
This article is a contribution from a member or partner organisation of Friends of Europe. The views expressed in this #CriticalThinking article reflect those of the author(s) and not of Friends of Europe.