Kenyan workers are the low-wage labor behind AI training, content moderation, and sex chatbots. The Data Labelers Association is fighting back.
Originally published on 404 Media.
Michael Jeffrey Asia spent eight hours straight every day in Kenya staring at porn on his laptop, annotating what was happening in every frame for an AI data labeling company. When he finished his shift, he started a second job as a human worker behind an AI sexbot, sexting real lonely people he suspected were in the United States. His boss was an algorithm that directed him to go back and forth between different personas.
“It took a lot of creativity and quick thinking, because if you’re talking to a man, you have to act like a woman. If you’re talking to a woman, you have to act like a man. If you’re talking to a gay person, you have to act like a gay person,” he told me at a co-working space in Nairobi. After several months of this, like other data labelers, he developed insomnia and PTSD, and had difficulty having sex.
“My body has gotten to the point where it can’t function. I can’t even feel it when I see someone naked. And I have a wife who expects a lot from you. I have a young family. She expects a lot from you intimately, and you can’t do that,” Asia said. “It broke a lot of things for me. My body is just not functioning.”
Asia eventually reached a breaking point and quit both AI jobs. He is now the executive director of a Kenyan organization called the Data Labelers Association (DLA) and the author of The Emotional Labor Behind AI Intimacy, an account of his time as the human worker behind AI sexbots. Through DLA, Asia is organizing workers earning just a few dollars a day to fight for better wages, better mental health services, better benefits, and an end to strict non-disclosure agreements. Data labelers train, refine, and adjust the output of AI tools created by the world’s largest companies, but they are paid vastly less and see none of the runaway valuations of AI companies.
Last month, DLA held one of its biggest events at the Nairobi Arboretum to register new members and help tell their stories.
These are the workers who stare at horrifying content for hours on end with few mental health resources, who are largely managed by opaque algorithms, and who, crucially, power the runaway valuations of the world’s richest and most powerful companies.
“If you don’t understand your history, you don’t understand where you stand,” Angela, one of the speakers that day, told the workers gathered there (many of the speakers at the event declined to give their full names). “When you think about colonialism, we were under the Imperial British East Africa Company (…) Literally, we are working under the company. We are just the product, we are part of its operation. Stakeholders, we can say, we are at the bottom of the bottom.”
“These multinational companies are starting to dominate here,” she added. “This is a very unfortunate supply chain, and my call today as a data labeler is to build on this further. We are fighting for workers’ rights and at the same time we are fighting for the environment (…) We are fighting big business. We are fighting today’s British imperialist companies. It’s Apple, it’s Meta, it’s Gemini. Those are the companies we’re still fighting. It’s a call for solidarity, to expand our thinking beyond what we’re doing.”
During a few days in Kenya earlier this year, where I spoke at a conference on AI and journalism, it quickly became clear to me that data labelers make up a significant portion of the country’s tech workforce. Almost everyone I talked to there had been a data labeler or content moderator themselves, or knew someone who was. Leaving Nairobi’s airport, you immediately pass Sameer Business Park, an office complex that houses Sama, a San Francisco-based “data annotation and labeling company” with contracts with Meta, OpenAI, and many other tech giants. Sama has been repeatedly sued over low wages and over the PTSD many of its employees have developed from repeatedly viewing graphic content. For years, a giant sign outside its office read, “Samasource THE SOUL OF AI.” My Uber driver asked me why I was going to a random office building in Nairobi’s central business district. I told her I was going to interview a data labeler. “Oh, I also label data,” she said.
Michael Jeffrey Asia. Image: Jason Koebler
Asia studied air cargo management at university. He expected to graduate and get a job planning freight and baggage routes, but he graduated into an industry ravaged by the coronavirus pandemic and couldn’t find work. Around this time, his child was diagnosed with lymphatic cancer, and he took out a loan of about $17,000 to pay for the treatment. He needed a job, and he found data labeling.
“To be honest, the pay wasn’t good,” Asia told me. “It was about $240 a month. But when the economic crisis hit and my child got sick, I felt I had no other choice.”
Asia got a job at Sama, where he worked on various Meta projects. “You’re given a video and asked to describe it, or you’re given a picture of a person and asked to identify a face. You’re supposed to draw a bounding box around the face and label it.” Last week, the Swedish newspaper Svenska Dagbladet reported that Sama’s Kenyan data labelers were viewing and annotating uncensored footage from Meta’s AI camera glasses, which included highly sensitive violent footage.
Through a group of colleagues and friends who call themselves the Brotherhood, Asia eventually found another data labeling job that allowed him to work from home. “We were a group of six friends and we all had to come in with three job opportunities each week,” he said. “I found another job, but it wasn’t a good one, and I had to annotate porn.”
In this job, Asia went through porn videos frame by frame, annotating what was going on and what category of porn it might be. “Every second of that video, you’re putting yourself in the minds of 8 billion people on the planet. So if someone in Cuba is searching for this porn, and they’re searching for ‘dog,’ this is a tag they might use, that sort of thing,” he said. “So I worked on porn eight hours a day, and I did that project for eight months.” His “boss” at the time was essentially a no-reply email sent daily with a link offering him work.
At the same time, Asia took a second job, which started right after his porn-tagging shift, “training” AI companion bots, with no way of knowing which company he was actually working for. He quickly deduced that he was simply taking on various AI sexbot personas and sexting with real people in real time.
“I felt a sense of humanity in the conversations. Most of them were lonely people,” he said. “I have several profiles and the profiles constantly switch depending on the needs of the people who show up on my dashboard. I think I’m sitting here talking to an old lady who needs love, but when she goes offline another conversation pops up and then I end up replying to a gay guy.”
Working these two jobs in quick succession left him with insomnia, PTSD, and difficulty having sex. Some data labelers work 18 hours a day. When I met him, he said he had essentially gone three full days without sleeping because his body had not yet readjusted from the chaotic schedule.
Asia said he was eventually able to receive mental health counseling through the Children’s Cancer Center. The counseling started because he was caring for a child with cancer, but it quickly turned into treatment for work-related PTSD. “It was very helpful to me as a person. It was one of the best services I’ve ever received, because they were there for me and I said, ‘We need to solve this.’”
“Technology is necessary, but it shouldn’t come at a human cost,” he said. “Is it really that difficult to provide emotional support to people working on graphic content? If this work were done in the United States, would they do the same thing they are doing in Kenya? Would they pay the salaries they pay here? People here are paid $0.01 per task, which makes no sense. Why is there this discrimination? If they can pay people in the US, that means they can pay people in Kenya too.”

Image: Data Labelers Association
The message from many data labelers and the lawyers who support them is that artificial intelligence is not a magic tool built by people in San Francisco who make millions of dollars a year and drive companies to exorbitant valuations. Artificial intelligence is an extractive technology that relies on the hard labor of low-wage workers around the world. For many years, the work of data labelers in Africa has been more or less “ghost work,” invisible, hidden labor that allows American tech companies to develop their products.
“AI is never AI without humans. This is not artificial intelligence. It is African intelligence,” Asia said. “Most of this is dirty work, and most of these jobs are being done here in Africa. And once the tools are working, all communication stops. You’re locked out. We’re training ourselves to death. We’re training ChatGPT, and it’s slowly killing us.”
Strict confidentiality agreements and terms of service that workers cannot opt out of create a culture of fear, and one of DLA’s goals is to make it easier for workers to speak out. When I met Asia in January, DLA had 870 members, but that number is growing rapidly.
“I’m doing this based on experience, not assumptions. I’ve been through this. I know what I’m talking about,” Asia said. “We have a monster called the NDA. The NDA is a slave tool used to enslave people not to talk about what they are going through. I am ready for any legal fight (related to the NDA) because we will not be silent. This is suffering for us and we cannot suffer in silence. This is not a colonial era. I have the right to speak out against any violation (of rights) and that is what I am doing.”
Mercy Mutemi, a labor rights lawyer who has sued several big tech companies, including Meta, over their treatment of content moderators and data labelers, said that when something happens in the United States – when a new gadget, product, feature, or policy is announced – it reverberates in Africa.
“When something happens in the United States, there is a cost to Africa,” she said. “Kenya has been pushing for a trade deal with the United States, right? And the conversation is going to be about immunity and protection for big tech companies. It’s like, ‘You want to do some business with us? Well, we’ve got to keep Meta out of this case.’”
Mutemi has been working on the Meta cases and campaigning against NDAs so that workers can speak more freely about their experiences. Tech companies “keep people in mental prisons and make them feel like they can’t talk about this, but NDAs are meaningless. Our laws don’t allow for these kinds of NDAs,” she said. “There’s a way to solve this in a way that’s not exploitative.”
Back at the Nairobi Arboretum, DLA’s message to its members is primarily that their work is important, it’s human, and they deserve better recognition.
“Africa is at the bottom of the AI supply chain. But right now, we are all here, and most of you are data labelers. You are the ones providing the workforce. If you think about the whole AI ecosystem, the image of AI that most of the world has is probably the engineers,” Angela said. “And that’s actually very intentional. Making [labor] invisible makes AI seem like this shiny object that no one can understand, very automatic and beautiful and technological. It’s intentional that the labor behind AI stays hidden.”
Jason is the co-founder of 404 Media. He previously served as editor-in-chief of Motherboard. He loves the Freedom of Information Act and surfing.


