Trust & Safety Research: Building Resilient Teams Behind Immersive Spaces

Published on September 5, 2025

Virtual reality (VR) isn’t only for games. People attend concerts, make friends and take classes via their digital selves (aka avatars). You can (almost) go anywhere, do anything and be anyone. As virtual lives begin to feel as real as the offline ones, the stakes rise. What happens when the rules break down? Who’s keeping these immersive worlds safe?

A new study from TaskUs’ Division of Research offers a rare look inside live VR moderation work and examines why wellness programs are a must for platforms that want to operate and scale responsibly.

A snapshot of the findings

VR moderators are on the front line. But unlike traditional content moderators working behind a screen, they are fully immersed in the experience. With headsets on and avatars live, they share the same digital spaces as users, guiding newcomers, de-escalating tense moments and enforcing rules in real time.

It’s a fast-evolving role shaped by emerging technologies, social norms and user behaviors. 

In interviews conducted for the study, many VR moderators described their roles as meaningful and even joyful. They find fulfillment acting as counselors, guides and protectors, especially for young users.

At the same time, the toll can be heavy. Hate speech, harassment and doxxing are alarmingly common in VR. Its immersive nature makes harmful encounters feel even more intense and immediate. For moderators, that means stepping into high-stress situations, with little room to detach or disengage.

The study identified a number of common challenges, including:

  • Cybersickness: Prolonged headset use often causes nausea, dizziness and headaches. While some moderators build a tolerance, others struggle daily.
  • Sensory overload: Crowded VR spaces are filled with constant noise, motion and interactions. Even with breaks, overstimulation is mentally and physically draining.
  • Toxicity: Moderators regularly face verbal abuse, from harassment to racial slurs. They must remain calm, but the impact is immediate and feels personal.
  • Doxxing and harassment: Moderators report being targeted with inappropriate comments or veiled threats. Many feel vulnerable when users hint at knowing personal details like their location or background.
  • Emotional exhaustion: Managing suicide-related interactions, enforcing unclear policies and constantly navigating emotionally charged situations wear moderators down over time.

Recommendations from our team

The study also shows that moderators are at their best when they’re supported mentally, emotionally and physically. When support breaks down, so does safety. Burnout sets in, judgment slips and situations can spiral, particularly when vulnerable users are involved.

Based on the study findings, our team recommends adopting a Wellness-centered Skill, Action and Support (W-SAS) framework. It emphasizes skill development, self-care actions and robust support systems tailored to VR moderation. It builds on the Job Demands–Resources (JD-R) model to help individuals and organizations balance the pressures of the role through adequate support and coping strategies.

Suggested best practices include:

  • Skill-building beyond policy enforcement: Training in emotional regulation, cultural sensitivity, conflict resolution and stress management
  • Practical self-care actions: Regular breaks between headset sessions, physical movement and grounding tools (like textured objects or calming scents) to help reduce mental fatigue and sensory strain
  • Supportive environments: Peer mentorships, team-based coaching and proactive leadership check-ins so moderators feel safe and supported
  • Privacy-first practices: Encouraging healthy boundaries, limiting personal information sharing and preparing for gender-specific risks to reduce anxiety

Collaboration across disciplines is also important. Technology teams, researchers and mental health professionals must work together to design systems that protect both users and the people keeping virtual spaces safe.

Read the full paper for an inside view of the realities of VR moderation work, its unique mental health risks and the wellness practices needed to build safer immersive spaces.



Migel Tanyag