Content moderators do the difficult work necessary to keep the world safe as online activity grows exponentially. We believe that content moderators, as the first responders of the internet, deserve the utmost respect and support. And while the broader industry has yet to establish clinical best practices for supporting the mental health of content moderators, due in part to a lack of research, we decided to get to work.
We hired industry-leading experts to research, create, test, and fine-tune what we’re now calling The TaskUs Method: a comprehensive, global psychological health and safety program for content moderators, guided by the practice of evidence-based psychology and grounded in neuroscience.
Our Method goes beyond reactive interventions, proactively supporting mental health and building the resiliency skills employees need to do their challenging, essential jobs.
In the spirit of driving discussion on how best to support content moderators, we’re sharing the TaskUs best practices we’ve learned through our extensive research and clinical programming:
- Develop and implement a comprehensive, evidence-based psychological health and safety program led by clinicians.
- Pre-screen, onboard, and provide continuing skills development for resiliency. Pre-screen candidates to ensure that they have the resiliency to safely perform a content moderator’s job. Expand onboarding to build resilience and provide regular skills training. Teaching skills and sharing information grounded in neurobiology and neuroscience better equips moderators to prevent the possible adverse effects of the job and to recover quickly if they are affected.
- Provide 24/7 access to psychological services. The goal should always be to lower the need for reactive care due to distress or personal crisis by increasing the effectiveness and utilization of preventative care offerings. However, access to on-demand counseling is an essential resource designed to meet employees at their specific time of need.
- Provide post-employment psychological services. These services can include case management to connect people with ongoing community services, mental health counseling, and continued research and data collection to determine whether employee attrition relates to the content’s graphic nature. Investing in post-employment services is vital, helping to mitigate harm, advance understanding of the field, and influence best practices in the broader content moderation community.
- Invest in technology to improve wellbeing for all content moderators. While AI will never replace humans, it can reduce the volume of posts they see. New obfuscation technologies like facial blurring, grayscale, and resizing can decrease the intensity of images, mitigating the impact on content moderators. Intelligently applying these technologies may reduce the psychological risks associated with this job.
- Research and test best practices, using data and validated assessments to track program performance in real time. At TaskUs, we have a dedicated research team conducting studies on the impact of content moderation, and we share outcomes and best practices across the industry. Because so many companies are trying different solutions of varying intensity, there has not yet been sufficient research to coalesce around clinical best practices. It is important not only to collect real-world data but also to use it to stay on top of outcomes. We offer non-diagnostic testing on a voluntary basis to assess moderators’ psychological well-being, using reliable and valid psychometric tests to measure resiliency, quality of life, cognitive flexibility, and burnout. We engage in pre-testing and post-testing and use this data to monitor program effectiveness.
- Take cultural differences into account. It is imperative to provide a flexible program framework for different cultures rather than imposing Western mental health care modalities wholesale. To address this issue and supplement our in-country mental health experts, we’ve trained a team of life coaches in basic psychological first aid and triage mechanisms. This stepped model of intervention helps bridge the gap in therapeutic support in those countries while ensuring that licensed professionals are always available to work in tandem to deliver programming.
- Emphasize the greater purpose. Moderators do meaningful work, acting as the first line of defense against bad actors, protecting the public from disturbing imagery, hate speech, bullying, and content that could cause real-world harm. We aim to continually emphasize the importance of the position and the impact the work has on the world. This is a focal point during group and individual sessions and is integrated into our onboarding practices during training.
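To make the obfuscation idea above concrete, here is a minimal, self-contained sketch of two of the techniques named (grayscale conversion and resizing), applied to a toy image represented as nested lists of RGB tuples. This is purely illustrative; the function names are our own, and a production pipeline would use a real imaging library rather than hand-rolled loops.

```python
# Toy sketch of image obfuscation steps: grayscale conversion and
# downsampling. An image is a list of rows; each pixel is an (r, g, b)
# tuple. Illustrative only -- not any specific vendor's tooling.

def to_grayscale(pixels):
    """Convert RGB pixels to luminance values (ITU-R BT.601 weights)."""
    return [[round(0.299 * r + 0.587 * g + 0.114 * b) for (r, g, b) in row]
            for row in pixels]

def downsample(pixels, factor):
    """Reduce resolution by keeping every `factor`-th pixel per axis."""
    return [row[::factor] for row in pixels[::factor]]

# A 4x4 solid-red test image.
image = [[(255, 0, 0)] * 4 for _ in range(4)]
gray = to_grayscale(image)   # each pixel becomes round(0.299 * 255) = 76
small = downsample(gray, 2)  # 4x4 -> 2x2
```

Removing color and detail in this way lowers the visual intensity of graphic material while leaving enough information for a moderator to make a policy decision.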
Expanding well beyond traditional “corporate wellness” offerings, these best practices address the unique needs of content moderation professionals, helping companies successfully guide and support their employees from recruitment through their advancing careers.
We are proud of our work and the steps we have taken — and we know there is still so much further to go. We look forward to continuing to drive this conversation forward for the broader industry.