‘Privacy is Hard,’ a Look at the Ethical Implications of Mixed Reality

25 September 2019 | Augmented Reality, Conferences, Industry Leaders, Interactive Techniques, Virtual Reality

Dina Douglass © 2019 ACM SIGGRAPH

When we give a permission prompt, we’re asking the user to make a decision that they can’t possibly be well-informed about… no matter how good they are at technology or privacy.

Diane Hosfelt, Mozilla

On Monday, 29 July, during the SIGGRAPH 2019 conference, an elite group of professionals joined together for a panel to discuss “The Ethical and Privacy Implications of Mixed Reality.” Moderated by Voices of VR Podcast’s Kent Bye, the panel featured Diane Hosfelt (security and privacy lead, Mozilla), Matt Miesnieks (founder and CEO, 6D.AI), Samantha Mathews Chase (founder and CEO, Venn.Agency), and Taylor Beck (privacy operations lead, Magic Leap), and offered an exploration of best practices around navigating privacy when it comes to spatial computing.

What does this landscape mean? What positions should companies take? How do creators navigate this new realm?
Here, we recap the session for a look at what privacy means in mixed reality and ideas on how to protect users.

What Is ‘Privacy’ and Why Is It Important?

“When we give a permission prompt, we’re asking the user to make a decision that they can’t possibly be well-informed about … no matter how good they are at technology or privacy,” stated Hosfelt.

Early in the panel, Hosfelt noted that there are no decent legal definitions of “privacy.” Realistically, privacy is contextual, societal, and personal, which is to say that privacy is hard. Defining personally identifiable information is just as hard. Given the huge overlap between AI ethics and MR ethics, it is important to ask the seemingly obvious questions while a technology is still emerging.

So, what are the existing threats?

To start, the panel felt it important to note that privacy is a right: individuals’ privacy and security on the internet should not be a privilege. When we look at immersive technologies and mixed reality experiences, the sensors and derived data required to create those experiences are inherent privacy threats.

If there is a digital twin of someone, a collection of their preferences and influences, noted Mathews Chase, the person might start to develop a somewhat existential notion of an “other self.” It’s important for users and creators to think about what the collective “we” wants to keep private in order to maintain autonomy, so that the physical self retains control over the virtual self. If users are not given that control, they might start to give up and develop cognitive dissonance, which is an even bigger threat.

The positives of where we are right now in mixed reality and this undefined space is that we have a chance to actually visualize and articulate what [privacy] looks like for us.

Samantha Mathews Chase, Venn.Agency

One step that companies like 6D.AI and Magic Leap take is working to separate the interface from the user in order to maintain privacy. When it comes to spatial data, most people think of it one-dimensionally. For example, a room scan of a space is often assumed to be “just one file,” which is not true. It’s up to us, the creators, to determine what to keep and what to leave out. If a company starts to explore advertising, that adds a whole new layer to the equation, not to mention regulations like the GDPR and others.
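To make the “not just one file” point concrete, here is a minimal sketch of creator-side data minimization for a room scan. All layer names, fields, and the `minimize_scan` helper are invented for illustration; no real SDK is depicted.

```python
# Hypothetical illustration: a "room scan" is a bundle of layers, each
# with different privacy implications, not one opaque file.

SENSITIVE_LAYERS = {"camera_frames", "detected_objects", "text_in_scene"}

def minimize_scan(scan: dict, needed: set) -> dict:
    """Keep only the layers an experience actually needs, and never
    pass through known-sensitive layers by default."""
    return {
        layer: data
        for layer, data in scan.items()
        if layer in needed and layer not in SENSITIVE_LAYERS
    }

raw_scan = {
    "mesh": "...geometry...",            # room shape: needed for occlusion
    "planes": "...floors and walls...",  # needed for placing virtual objects
    "camera_frames": "...images...",     # raw video: highly identifying
    "detected_objects": "...labels...",  # reveals what the user owns
}

shared = minimize_scan(raw_scan, needed={"mesh", "planes"})
print(sorted(shared))  # ['mesh', 'planes']
```

The design choice is that sensitivity is opt-out by construction: even if an experience requests a sensitive layer, the filter drops it unless the policy is deliberately changed.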

“The positives of where we are right now in mixed reality and this undefined space is that we have a chance to actually visualize and articulate what [privacy] looks like for us,” noted Mathews Chase.

What Are We Doing Now?

“We’re really pushing the limits of not just the technology, but the science.”

Matt Miesnieks, 6D.AI

Commented Miesnieks, “We’re really pushing the limits of not just the technology, but the science.”

There are all sorts of new realms within immersive computing that open new vectors for companies to access data and tie it to identities. Many of the companies represented on the panel are working through this, navigating the data being collected in order to protect the user through firewalls or other means.

As one example, Magic Leap shared its practice of maintaining two paths for collected data: one for the data it collects itself and one for the data it exposes to other applications. If an application needs access to the Magic Leap API, shared Beck, the team works to determine how it can minimize that access. Overall, the company’s system does not collect much personal information, but it also goes a step further by allowing users to make meaningful decisions about what is being exposed. What Magic Leap cannot control technically, it tries to control contractually (i.e., vetting an application’s justified business purpose), as well as by implementing “scary” features aimed at deterring applications from misusing data.
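The “two paths” idea can be sketched in a few lines: the platform keeps full-fidelity sensor data internally while third-party applications receive only a coarsened view. The `GazeSample` class, field names, and thresholds below are invented for illustration; this is not Magic Leap’s actual API.

```python
# Hypothetical sketch: internal path keeps full fidelity; exposed path
# hands applications only a minimized view of the same sensor data.
from dataclasses import dataclass

@dataclass
class GazeSample:
    timestamp_ms: int
    x: float  # precise normalized gaze coordinates (sensitive)
    y: float

def internal_path(sample: GazeSample) -> GazeSample:
    # The platform keeps exact coordinates for rendering and foveation.
    return sample

def exposed_path(sample: GazeSample) -> dict:
    # Applications get only a coarse region, not exact coordinates,
    # making it harder to infer what the user is reading or looking at.
    region = ("left" if sample.x < 0.33 else
              "right" if sample.x > 0.66 else "center")
    return {"timestamp_ms": sample.timestamp_ms, "region": region}

s = GazeSample(timestamp_ms=1000, x=0.12, y=0.5)
print(exposed_path(s))  # {'timestamp_ms': 1000, 'region': 'left'}
```

The point of the split is that minimization happens at the platform boundary, so an application cannot request more precision than the exposed path is designed to give.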

Noted Mozilla’s Hosfelt, privacy should never be siloed. Everyone who is responsible for creating a product needs to be thinking about privacy and asking: How can my technology be misused? Will this make the world worse by enhancing biases or making abuse and harassment online worse?

Offering another example, Mathews Chase spoke of verifiable credentials. For those unfamiliar, a verifiable credential is a replacement for legacy systems, like direct logins, that offers a way of turning a piece of data into a certificate. For one of Venn.Agency’s products, the verifiable credential’s only objective is to answer three questions: Is the player a unique person? Are they in a place? Did they do the exercise (i.e., play the game)? Mathews Chase’s challenge for companies is to ask of themselves and their product, “What do we need to know?”
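The shape of that idea can be shown with a toy sketch: an issuer signs a narrow claim, and a verifier checks only that claim, with no identity data attached. Real verifiable-credential systems (e.g., the W3C Verifiable Credentials data model) use public-key signatures and richer data models; the stdlib HMAC below is a stand-in, and all key and field names are invented.

```python
# Toy sketch of a verifiable credential: the issuer signs a minimal
# set of yes/no claims, and the verifier checks only the signature.
import hmac
import hashlib
import json

ISSUER_KEY = b"issuer-secret"  # stand-in for the issuer's signing key

def issue(claims: dict) -> dict:
    """Sign a narrow claim set, producing a portable credential."""
    payload = json.dumps(claims, sort_keys=True).encode()
    sig = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return {"claims": claims, "sig": sig}

def verify(credential: dict) -> bool:
    """Check that the claims were signed by the issuer and not altered."""
    payload = json.dumps(credential["claims"], sort_keys=True).encode()
    expected = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, credential["sig"])

# The credential answers only the three questions, and nothing else.
cred = issue({"unique_person": True, "in_place": True, "did_exercise": True})
print(verify(cred))  # True
```

Tampering with any claim after issuance invalidates the signature, which is what lets a relying party trust the answers without ever seeing who the player is.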

How Can We Do More?

While no one person has all the answers (like Hosfelt said, it’s hard), several resources came up during the panel as tools to help when building a new product or determining what data to collect and how.

Do you have thoughts on how to best navigate privacy and ethics in spatial computing? Share them in the comments! Visit the ACM Digital Library to check out resources from this and other SIGGRAPH 2019 Panels, and don’t forget to submit future panel ideas for SIGGRAPH 2020 in Washington, D.C. next year.
