Is the Metaverse Dangerous?

It turns out, yes, it can be.

Trigger warning: This blog post discusses sexual assault.

I glanced at a headline this week that interested me, but it did not make it into my reading material until this morning. I was too busy with our first webinar on Privacy and Security in the Metaverse to focus on it.

It sure did catch my attention today.

It caught my attention because I subscribe to a newsletter called the Morning Brew. The Morning Brew headline was: “Meta sets boundaries. Meta announced yesterday that it’s rolling out a ‘personal boundary’ tool to block avatars from invading a nearly four-foot perimeter around your virtual character in its Horizon VR experiences.”

I knew right away I needed to do some research about why, and it turns out the two are related. So I found the Medium blog post written by the victim of the sexual harassment referenced in that headline, I asked my son a few questions about his Oculus, and I read more about Facebook’s new protocol. Here is what I discovered.

A woman put on her headset and decided to try Facebook’s new virtual hangout room for adults over the age of 18, called Horizon Worlds.

The victim says that within a few seconds of entering the game, male avatars quickly approached her and virtually ganged up on her. They took pictures of her and handed them to her in the game, and they placed their avatars very close to hers and whispered in her ear through the virtual headset, which sounds creepy.

She had to rip off the headset to make it stop.

As a woman myself, my empathy for what she went through is very strong. Even the idea of that happening makes me shudder.

I had to find out if this could happen to one of my kids, to whom we gave an Oculus for Christmas. He said that he doesn’t belong to any games where you can create an avatar. Phew. I caught it in time.

Back to the story.

In response to this attack, Facebook said this past week that it was deploying a new “personal boundary” of 4 feet so that other avatars cannot invade your personal space.

I am angry at them. I am angry that they didn’t think of this before now. But then I kept reading and, um, maybe they did know or at least suspect it?

In reported comments from Facebook’s CTO Andrew Bosworth, he supposedly said in an interview late last year, “So I think we have a privacy trade-off - if you want to have a high degree of content, safety or what we would call integrity, well that trades off against privacy.” (Reported by the BBC, which I think had the best article on this story. Note: it is unclear whether he said this before or after being aware of the sexual assault incident.)

Ummm, what?

I kept reading.

He says, "To process visual information about an avatar or how close one is to another, that is going to be so expensive computationally, that is going to take up so much computer power, I don't know what technology can do that."

Apparently it is their own technology that can do it. And the fact that they rolled it out in February 2022 means it either wasn’t very difficult to figure out or, well, there is no “or.”

But that doesn’t change the fact that it looks like it took a woman becoming a victim for them to release it.

A couple of weeks ago, I wrote about how we are one pandemic away from a true metaverse. My logic was that health and safety from physical interaction will stimulate the move towards the virtual world. If you had asked me about someone being able to virtually harass me, I would have thought there were safety protocols already in place to make that highly unlikely.

I hate it when I am naive.

While this story is worthy of a much longer conversation over a few cups of coffee, I will focus my commentary on what could have been.

Imagine when she entered the room, she could confidently enter as any avatar, LGBTQIA+ fully represented.

I do like the idea of a personal boundary: she would be asked for her consent before another avatar could come closer than 4 feet. But can we go further? What if she also had a button she could push that backs her up 4 feet? Or maybe the button pushes everyone else around her away by 4 feet? I like that better. And maybe, instead of a button, she could utter the magic word “911” and everyone would be pushed away from her.
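Meta has not published how Horizon enforces its boundary, but the geometry behind a push-away feature is simple: if another avatar is inside your radius, slide it back along the line between you until it sits on the boundary. Here is a minimal sketch in Python; the function name, 2D coordinates, and feet-based units are all my own assumptions for illustration.

```python
import math

BOUNDARY_FEET = 4.0  # Meta's announced personal-boundary radius


def push_back(me, other, radius=BOUNDARY_FEET):
    """If `other` is inside `me`'s personal boundary, move it
    straight back along the line between the two avatars until
    it sits exactly on the boundary. Positions are (x, y) in feet."""
    dx, dy = other[0] - me[0], other[1] - me[1]
    dist = math.hypot(dx, dy)
    if dist >= radius:
        return other  # already outside the boundary, leave as-is
    if dist == 0:
        # Avatars stacked on the exact same spot: pick a direction
        return (me[0] + radius, me[1])
    scale = radius / dist
    return (me[0] + dx * scale, me[1] + dy * scale)
```

The “911” idea is just this function applied to every avatar in the room at once, which is part of why I find it hard to believe the computation was ever the obstacle.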

To avoid hearing unwanted whispers, she has a mute button so that she cannot hear what people around her say. Or, following the idea above, she can just say “stop talking” and it will mute anyone around her.

To address the pictures, we flip copyright concepts: anyone taking a picture of her must first obtain her consent.

I think we can develop even more virtual security and privacy. For example, certain precautions could be taken before entering a virtual room or game, such as offering a profile of the crowd. Is it a welcoming group, or are there indicators of a potential threat? How about a preview of the room before entering?

We can do in the virtual world what we cannot in the physical world, which is track and report on an avatar’s past behavior in real time. Has this avatar harassed other avatars before? Give me a little background on an individual avatar’s behavior in the virtual world.
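This is not how Meta stores moderation data — it hasn’t said — but a toy sketch shows how little machinery the idea needs: log every report against the offending avatar’s ID, then surface a warning before you interact with them. The class name, method names, and warning text below are all hypothetical.

```python
from collections import defaultdict
from datetime import datetime, timezone


class AvatarHistory:
    """Toy behavior ledger: each harassment report is logged against
    the offending avatar's ID so a room can warn newcomers about it."""

    def __init__(self):
        self._reports = defaultdict(list)

    def report(self, avatar_id, description):
        """Record a timestamped report against an avatar."""
        self._reports[avatar_id].append(
            (datetime.now(timezone.utc), description)
        )

    def warning_for(self, avatar_id):
        """Return a warning string if the avatar has prior reports,
        or None if its record is clean."""
        count = len(self._reports[avatar_id])
        if count == 0:
            return None
        return f"Avatar {avatar_id} has {count} prior report(s)."
```

Of course, tying the ledger to an avatar rather than a person is exactly the identity weakness I raise next.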

Yes, I know what you are thinking. They can just take down that avatar and spin up a new one, but that is a question of identity that we will explore in later webinars and blog posts.

My point here is that we don’t have to rely on victim reporting, which is itself traumatizing. The harassment happened online and was recorded by the surveillance technology. We know the who and the when. Maybe we can use it to warn others, perhaps even catch crime before it happens.

Oh no! I think I just made an argument in favor of the dystopian world of precogs pictured in Minority Report.

You’re the best,

Caroline

P.S. Can we bind our avatars, through smart contracts, to rules of behavior that we determine for ourselves in the Metaverse? Building on the concept that the Metaverse is about individualization?

I think my brain just exploded.

About the author: Caroline McCaffery is a co-founder at ClearOPS, an A.I. privacy tech company managing privacy and security operations data to make mundane tasks simple. She is a frequent blogger and speaker with over 20 years of experience as a lawyer working with tech startups. You can connect with her on LinkedIn.

About ClearOPS. Do you have clients that ask you to respond to their security questionnaires? Are you thinking about a SOC2 to get rid of them? A SOC2 takes 6 months, so what are you going to do in the meantime? You are going to use ClearOPS. ClearOPS now offers hosted security pages, providing you with even more proof for your customers that you “take their privacy and security seriously.” Inquiries: [email protected]
