The Tough Questions of Ethical Content Creation in VR

Jodi Schiller

One of the qualities of virtual reality (VR) that excites creators and consumers most is that it is wide open and packed with potential.

Optimistically, that potential means people will create and experience things they never have before and push creativity to its farthest reaches. However, as Newton observed, for every action there is an equal and opposite reaction: if there is potential for good work that uplifts humanity, there is also potential for harmful and destructive content.

This concept was among those explored at a panel called Ethical Content Creation for VR at the Silicon Valley Virtual Reality Conference in San Jose, California.

Microsoft AR and VR developer and evangelist Liv Erikson moderated the panel, which included Jodi Schiller, founder of digital media and marketing company New Reality Arts; Jazmin Cano, marketing and events coordinator for open source social platform High Fidelity; and Jacob Ervin, technical lead for spatial computing company Occipital.

Erikson started the session by asking why it is worth talking about ethical content creation now.

"A lot of the patterns get established in the first year or two," Ervin said, and after that, they are hard to change. What's more, once we lay the foundation, we build everything on top. So, whether it is talking about diversity in VR or developing thoughtful practices regarding the types of VR experiences creators want to unleash into the world, the time to talk is now.

During the discussion, the panelists touched on a variety of topics, such as how to proceed with VR when there are no longitudinal studies on the effects of the medium on the adult brain, let alone a child's brain. Another topic focused on the balance between keeping platforms open and protecting users.

Cano brought up the idea of trust. Creators cannot violate the trust of viewers; if they do, viewers will not return to unpleasant or traumatic VR experiences, or perhaps to VR at all.

At a few points, the panelists floated the idea of biometric plug-ins that would let users set thresholds for measures like heart rate, so the system could automatically pull them out of an experience if their heart rate spiked past a chosen level. VR horror experiences, for example, are developing their niche, and a user could decide the point at which they needed to be separated from the experience.
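To make the panel's idea concrete, here is a minimal sketch of what such a biometric "comfort threshold" plug-in might look like. This is purely illustrative: the class names, the `on_sample` callback, and the threshold value are all assumptions, not any real headset or platform API.

```python
# Hypothetical sketch of a user-configured heart-rate threshold that
# triggers an automatic exit from a VR experience. All names here are
# illustrative assumptions, not a real API.
from dataclasses import dataclass
from typing import Callable


@dataclass
class ComfortSettings:
    max_heart_rate_bpm: int = 120  # user-chosen ceiling


class BiometricMonitor:
    def __init__(self, settings: ComfortSettings, on_exceed: Callable[[], None]):
        self.settings = settings
        self.on_exceed = on_exceed  # e.g. fade to a calm "safe room" scene
        self.triggered = False

    def on_sample(self, heart_rate_bpm: int) -> None:
        """Called for each reading from a wearable heart-rate sensor."""
        if not self.triggered and heart_rate_bpm > self.settings.max_heart_rate_bpm:
            self.triggered = True
            self.on_exceed()


# Example: a horror experience that exits when the user's pulse spikes.
events = []
monitor = BiometricMonitor(ComfortSettings(max_heart_rate_bpm=140),
                           on_exceed=lambda: events.append("fade_to_safe_room"))
for bpm in (88, 112, 151):
    monitor.on_sample(bpm)
print(events)  # prints ['fade_to_safe_room']
```

The key design point is that the threshold belongs to the user, not the content creator: the experience only receives the exit callback, never the raw biometric data.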

Schiller said the situation called for "creative ways of both allowing our imaginations to go wild but also looking after people at high distress levels."

Along those lines, Ervin brought up the idea that, while it's possible to treat trauma with VR, it could also be possible to cause it. VR is immersive, and what people often don't understand until they've tried it is the extent to which it can trick the brain into thinking it's on the edge of a building, being shot at, or standing too close to someone. Creators have to consider the consequences of what they create.

The talk also covered identity issues, from the potential impact avatars can have on real-world behavior to the potential for identity theft. Imagine if someone created an avatar of you without your permission and used it in unseemly ways. As Schiller pointed out, people already face the problem of controlling their image online. Fake social media profiles get created all the time using stolen pictures of unsuspecting people.

The panel talked about the idea of having layers of identity. Perhaps only close friends would see a user's true avatar, and others would see something different.

Again, on the optimistic side, Ervin said, "VR is a great medium for taking people and putting them in someone else's shoes." The hope is that VR can encourage and normalize positive social behavior rather than serve as a place where some of society's ills can continue to fester. One audience member brought up the example of a female avatar entering a social VR experience and getting attacked, as happens all the time online on platforms like Twitter and Reddit.

Cano said big platforms need to make tools to help users deal with such challenges, like creating private rooms for social interaction.

"We need to own our responsibility. We have to be always thinking, 'What world are we creating?'" Schiller said. "We are the founders of something, and let's make it good."

This article was originally published by TechRepublic and written by Erin Carson.
