Microsoft Cracks Down on Harassment in AltspaceVR

While there have always been instances of harassment in AltspaceVR, Microsoft's social VR platform, it's hardly the first place you'd look for a wide-open, anything-goes type of environment in virtual reality. VRChat (in its early days) easily takes that dubious spot. But the harassment in AltspaceVR finally reached the point where Microsoft felt the need to take action.

On Wednesday, the company took down the areas where users could freely congregate and talk to one another, including the Campfire, News, and Entertainment Commons social spaces. No warning, nothing. If you want a virtual space to simply hang out with other virtual beings, AltspaceVR is no longer your destination.

Alex Kipman, who heads Microsoft's mixed reality division, said:

As platforms like AltspaceVR evolve, it’s important that we review existing experiences and assess whether they adequately meet customer needs today and in the future. This includes helping people connect better with those who share common interests while ensuring that the spaces they access are free from inappropriate behavior and harassment.

Personal Bubbles Aren't Enough

Microsoft works to end harassment in AltspaceVR by shutting down open social spaces.

Harassment in AltspaceVR has been an issue even though the platform has long relied on an optional personal bubble to establish virtual distance between your avatar and other virtual beings. Along with shuttering the social spaces, personal bubbles will now be on by default, and users joining events will initially be muted.

Meta only recently implemented personal boundaries for avatars in its Horizon Worlds and Horizon Venues platforms, years behind AltspaceVR, so it's no surprise to see Microsoft taking additional steps.

More Measures to Combat Harassment in AltspaceVR

This is just the first of several changes Microsoft is planning for AltspaceVR. As Engadget noted:

In the coming weeks, Microsoft said it would require people to use a Microsoft Account to access AltspaceVR. As a result of that requirement, parents will have the option to use the company’s Family Safety feature to limit how much time their kids can spend within the app.

We'll likely see verified account requirements become the standard for almost all virtual environments in the future. Even with verified accounts, people find creative ways around the rules; just look at the number of underage children in the social VR games on Meta's Quest HMD. And tying people to specific accounts doesn't solve everything, since users can falsely report others who have done nothing wrong. None of this even touches the challenges of moderating content. Meta employs over 15,000 people (almost all outsourced at lower pay) to prevent "...the spread of images, videos, and messages about suicides, beheadings and sexual acts" on Facebook. The New York Times noted that digital content moderation would cost the tech industry about $8.8 billion in 2021.

Does anyone really think that virtual worlds could skirt these issues? It seems that the dreams of an open Metaverse are getting dashed against the jagged cliffs of human behavior.

A little historical perspective (in short supply in the tech industry) reminds us it has always been this way when new communities form, whether real or virtual. The rough-and-tumble towns of the 19th century American West all opted to hire sheriffs, positions that (contrary to the Hollywood versions) were often low-paying and involved mundane tasks such as cleaning the streets. By the early 20th century, most of those towns had ended up with a full-blown police force.

Real life has always had built-in safeguards for how people interact with one another. Our newborn, mostly consequence-free virtual environments will follow the same trajectory. Someone or something (in this case, code) will evolve to protect users from the few bad actors who can ruin a VR experience for everybody.

Is Microsoft Killing a Virtual Community?

Predictably, Microsoft's move to combat harassment in AltspaceVR has already drawn criticism. TechSpot offered what might be the most insightful perspective:

The real story here isn’t the fact that Microsoft is trying to make AltspaceVR a safer environment for users. Instead, they’re simply driving existing users away in order to rebrand the platform as an event-based space for artists, brands and businesses.

And we'll (partially) concede the point. It's no surprise to see social VR platforms focus more on "artists, brands, and businesses," mirroring the evolution of the web in its early days. Someone has to pay the bills for the server farms.

Otherwise, they’re going to be tracking your eyeballs for advertising, which may happen anyway.

But TechSpot's criticism ignores the larger issue here: the ethical challenges in virtual environments. Silicon Valley often seems to live in a child-like environment (witness the recent Meta announcement that employees will now be known as "Metamates"), even though tech companies shape and dominate our social interactions. Virtual environments need to be welcoming, diverse, and inclusive, and it's not enough to simply ask everyone to "play nice." How far these requirements evolve, and the consequences for bad behavior, remain to be determined. But if there are no consequences, we're back to the Ring of Gyges story.

AltspaceVR – Will the Party Continue?

AltspaceVR has always had a special place in our VR lives as we were there with the first users. And we stayed for the closing party in 2017 when the founders ran out of money and planned to shut down. It was a surreal moment when the lights didn’t go out, and a few days later, Microsoft announced it was the savior. The platform has been a fascinating virtual play and work space, and it often paved the way creatively for the virtual events we now see in other virtual worlds. We’ll see what happens now as new safeguards are implemented.

Let us know what you think of Microsoft’s effort to combat harassment in AltspaceVR.