« We have decided to open an investigation against Teleperformance. »

For the first time anywhere in the world, a team of journalists was able to visit some of the moderation centres, in Morocco, Colombia and Paris*, that work for some of the world’s biggest platforms, applications and most popular sites: Meta, TikTok, Discord, Le Bon Coin, etc. Del Harvey @Delbius (former Director of Trust&Safety at Twitter), a clone of Yann Amiry (the moderator who became famous at Décathlon), Azure AI Content Safety, Accenture, Teleperformance, Concentrix: who can help, and what can be automated?

Everyone’s talking about moderation, and so much the better. The internet will never be safer, with less harassment and disinformation, as long as hate, alternative truths or videos of rapes and beheadings can be circulated and shared by the million in just a few minutes. You can, because you will sooner or later be affected, try to recruit your own Yann Amiry (who became famous at Décathlon), your own Del Harvey or Yoel Roth (who headed up Trust&Safety at Twitter), or rely on automation tools such as Azure AI Content Safety (Microsoft). And it won’t work, or not for very long: everything moves fast, regulatory bodies don’t yet exist, and automation isn’t everything. Moderating TikTok alone requires the services of tens of thousands of content moderators.

Fascinated by the subject and the issues at stake, our magazine – which is independent 🙂 – went on a reporting trip to meet some of the players in this market. En-Contact will also be organising the 1st Trust&Safety conference in France this year.

In Colombia, where the deputy labour minister decided to launch an investigation into Teleperformance.
For the first time in the world, a team of journalists was able to visit some of the centres where certain moderation operations are carried out for the world’s biggest platforms and most popular applications: Meta, TikTok, Discord. En-Contact was able to stay on site for 48 hours, with a photographer and a journalist, to visit the sites and meet the teams in charge, without supervision.

At the Teleperformance offices in Bogota, Colombia, there are plenty of places to rest and get to grips with the technologies.
They are numerous (Teleperformance employs more than 40,000 people in Colombia), young and happy to work on the subjects entrusted to them. « Making the internet and the applications we love safer is a vital issue, and one that gives meaning to our work ».

After this visit, we had the feeling that we had come a long way from the attack and the suspicion cast on this company, in this same country, with a single tweet, by the deputy minister of labour, Edwin Palma Egea, in November 2022: « We have decided to open an investigation against Teleperformance ». Following this tweet, on Thursday 10 November, the share price of the company, a member of the CAC 40, fell by 33.9%.

The photos were all taken on the spot, in Bogotá, at one of the two sites where these activities are carried out: Auto Norte and Connecta Campus.

The Trust&Safety market? Booming
The moderation of illicit, dangerous or illegal content, the verification of security-related issues and stolen identities – what is now known as Trust&Safety – is a booming sector, estimated to be worth €12 billion in 2022 in Europe and expected to reach €32 billion in 2032.

It is not well known, because it is in the throes of change, and it is secretive, because each of the service providers is bound by very strict confidentiality clauses, and because the major clients do not necessarily want the backroom to be revealed.

Few researchers have written books on the subject: Sarah T. Roberts, with Behind the Screen: Content Moderation in the Shadows of Social Media (2019), and before her Tarleton Gillespie, with Custodians of the Internet (2018).

I saw the worst of humanity
A few recent cases have forced the media and regulatory authorities to take an interest in moderation practices and the working conditions of moderators: in 2019, Daniel Motaung, a former moderator at Sama, a Kenyan service provider, took his employer and its client Facebook to court for slavery and human trafficking.

Present 24 hours a day, the medical team is young and salaried, headed by a doctor.
Before him, Chris Gray, in Ireland, had raised the same issues in The Moderator: Inside Facebook’s Dirty Work in Ireland. The headline of The Guardian’s article at the time was eloquent: As a Facebook moderator, I saw the worst of humanity. Murder, torture, child abuse: each day we see things that keep us awake at night.

What we saw and heard in Bogotá shows that the reality of practices and the profession is not the same in 2023 as it was a few years ago, at least at the service provider we visited. Teleperformance has pulled out all the stops, as the company was able to do in the late 1990s, when the call centre industry exploded worldwide.

Concentrix+Webhelp, the world’s number 2 BPO company, has also entered this market. But it did not authorise us to visit its sites, in the absence of an agreement from its customers. Accenture, Telus, Besedo, PartnerHero and Alorica, other Trust&Safety players, were contacted for this report but did not respond to our requests.

Key points to know
The Trust&Safety market goes far beyond moderation activities alone. The increasingly intensive use of social platforms and networks is creating risks and problematic situations. More and more companies are therefore having to launch or review their Trust&Safety practices. Text and video content, customer reviews, maps, comments on articles, polls and live streaming can and should be moderated.

The T&S division employs 2,749 people worldwide at Teleperformance, including 1,415 moderators, and works with 32 clients, in 37 countries, in 43 languages. The well-being index, monitored via wellness surveys, stood at 94% in mid-2023.

Business case example: the battle against time
The images of a massacre, filmed live with a GoPro camera mounted on the shooter’s helmet, show ten dead and three injured, all of them Black. The video was removed in two minutes by Teleperformance’s moderation teams; the same task took 17 minutes in 2019, during the mass shooting in New Zealand. In the case of the Buffalo massacre, only 22 people saw the video live, but millions have since seen it on the internet, because even in those two minutes it had been captured and rebroadcast. Angela Hession, Twitch’s VP of Trust&Safety, said she was impressed by the responsiveness of its service provider, Teleperformance.

AI-powered tools can do much of the moderation automatically. What these tools fail to moderate, or what requires human analysis and a human decision, is what reaches the Content Moderators.
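The division of labour described above can be sketched as a simple triage: an automated classifier acts alone only on clear-cut cases, and everything in between lands in a human review queue. This is a minimal, hypothetical illustration; the thresholds, names and scoring are assumptions, not Teleperformance’s or any platform’s actual system.

```python
# Hypothetical sketch: automated triage with human escalation.
# A model produces a risk score (0 = safe, 1 = harmful); only
# confident scores are handled automatically.
from dataclasses import dataclass, field

AUTO_REMOVE = 0.95  # assumed threshold: confident enough to remove alone
AUTO_ALLOW = 0.05   # assumed threshold: confident enough to leave alone

@dataclass
class Triage:
    removed: list = field(default_factory=list)
    allowed: list = field(default_factory=list)
    human_queue: list = field(default_factory=list)

    def route(self, content_id: str, risk_score: float) -> str:
        """Route one piece of content by its model risk score."""
        if risk_score >= AUTO_REMOVE:
            self.removed.append(content_id)
            return "auto-removed"
        if risk_score <= AUTO_ALLOW:
            self.allowed.append(content_id)
            return "auto-allowed"
        # Ambiguous: a Content Moderator makes the final call.
        self.human_queue.append(content_id)
        return "escalated"

t = Triage()
print(t.route("vid-1", 0.99))  # auto-removed
print(t.route("vid-2", 0.01))  # auto-allowed
print(t.route("vid-3", 0.60))  # escalated
```

In practice the grey zone between the two thresholds is where the tens of thousands of human moderators mentioned above do their work.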

The profile, health and psychological state of moderators, at the heart of the profession and its practices
Moderators are recruited through a sophisticated battery of more than 14 steps and specially developed tests: personality tests, language proficiency tests, resilience tests, probity tests and, in some cases, criminal record checks.

Some of these tests and assessment tools were developed in conjunction with Stanford University and psychology researchers.
The very large number of applications to be processed requires the use of bots, followed by qualified calls to the candidates on the priority list.
16.5% of candidates who complete the Application Form are eventually recruited.
The successive stages are: a remote interview, an in-person interview, an assessment centre and the job offer.

The profile of moderators in Colombia
TP employs more than 45,000 people in Colombia, at its centres in Bogota, Medellin and Cartagena. The average age is 26; 46% are women, 54% men.

Wellness Time Daily: 30 minutes
8 dimensions of well-being are taken into account to ensure employee health: « emotional, spiritual, physical, social, occupational, intellectual, environmental, financial ». After testing and comparing different techniques, painting has been judged one of the most effective activities for taking staff’s minds off the content and providing a beneficial mental break. Practised at large tables, the activity is compulsory, with eloquent results. It’s also a great way to have a chat.

Wellness Time Daily is a compulsory 30 minutes every day.
A typical moderator’s day
Refresh training: 15 minutes
Breaks: 30 minutes (2 × 15 min)
Coaching 1-to-1: 15 minutes
Lunch: 1 hour
Wellness Time Daily: 30 minutes
Production time: 6.5 hours
Total: 9 hours
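As a sanity check, the figures above do add up to the stated nine-hour day:

```python
# Daily schedule from the article, in minutes.
day = {
    "refresh training": 15,
    "breaks": 30,
    "coaching 1-to-1": 15,
    "lunch": 60,
    "wellness time daily": 30,
    "production time": 390,  # 6.5 hours
}
total_minutes = sum(day.values())
print(total_minutes / 60)  # 9.0 hours
```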

Code mauve, code green
In the event of exposure to content considered dangerous or critical, the moderator must leave the production floor; the duration of the break and the follow-up actions are codified according to the alert code. A mauve code signifies a different degree of seriousness from a green code, for example. The codes are displayed and notified at various points on the production floor.
Masseurs and psychologists are on site at all times, with a doctor in charge of the team.

En-Contact magazine n°130
The voice of the agents: employee retention and satisfaction indicators
91% of them are satisfied with their well-being and the quality of their working conditions.
89% of them see themselves as « Guardians of the Internet ».
The attrition rate over one year is 3%.
36% of staff have been with the company for more than a year, after being appointed to a permanent post.

« It’s the first time I’ve had a payslip »
One of the things that surprised me most in my discussions with these young employees was what they told me about their previous professional experience. There were Brazilians, Argentinians and Colombians in the room, and for many of them, one of the recurring comments, apart from the pleasure they took in working on the subject of social networks, was: « It’s the first time I’ve had a declared job, with a pay slip. And here, I’m doing useful work ». South America is not like Europe.

Manuel Jacquinet, photos by Edouard Jacquinet.

To find out more: read the previous report on En-Contact and the comments of Andres Bernal, Director of operations in Colombia.

Read also how Concentrix+Webhelp is working with Wizz.