Two online safety moderators at Microsoft have sued the company in the United States, saying they developed Post-Traumatic Stress Disorder (PTSD) after being forced to watch videos and pictures of "indescribable sexual assaults," murder and child abuse, a lawsuit said.
The complaint stressed that the moderators had to view "inhumane and disgusting content" on a continuous basis, and alleged that the work has had a grave psychological impact: the moderators find it difficult even to look at children without "triggering" an episode, and can no longer use computers without breaking down. The lawsuit has been filed on behalf of two Microsoft employees and their families.
The employees have accused Microsoft of "negligent infliction of emotional distress" in the lawsuit. The case has provided a glimpse of what the tech workers responsible for detecting and reporting digital content "designed to entertain the most twisted and sick minded people in the world" have to go through in their daily lives.
"It's horrendous. It's bad enough just to see a child get sexually molested. Then there are murders. Unspeakable things are done to these children," Ben Wells, one of the attorneys who filed the suit in Washington said.
A Microsoft spokesperson, however, said that the firm does not agree with the claims of the lawsuit and "takes seriously its responsibility to remove and report imagery of child sexual exploitation and abuse being shared on its services, as well as the health and resiliency of the employees who do this important work."
The Microsoft employees, Henry Soto and Greg Blauert, worked in the online safety department, which is responsible for complying with legislation passed in 2008 requiring tech companies to report child abuse images and other crimes. However, the suit claims that the firm did not warn the workers about the possible psychological dangers of taking on this kind of work and the potential for "debilitating injuries."
"Many people simply cannot imagine what Mr Soto had to view on a daily basis as most people do not understand how horrible and inhumane the worst people in the world can be," his lawyers were quoted as saying by the Guardian.
Blauert was also required to "review thousands of images of child pornography, adult pornography and bestiality that graphically depicted the violence and depravity of the perpetrators."
The suit said that although Microsoft created a "wellness program" and offered counselling, the services the company provided were ineffective and failed to help the workers understand the "vicarious trauma" and PTSD they were experiencing.