On March 31, the Office of the Data Protection Commissioner confirmed in a formal letter that it has commenced suo moto investigations into privacy concerns related to Ray-Ban Meta smart glasses, specifically the processing of personally identifiable information for the training of Meta's AI systems.
Data Commissioner Immaculate Kassait's letter states: "The Office of the Data Protection Commissioner confirms that it has already commenced suo moto investigations into the privacy concerns raised in relation to the Ray-Ban Meta glasses and the processing of personally identifiable information for the training of Meta AI. The outcome and further developments will be communicated once the investigations are concluded."
Kenya now joins the United Kingdom (where the Information Commissioner's Office described the reports as "concerning" and demanded information from Meta) and the United States, where lawsuits have been filed over biometric privacy violations. A Kenyan regulator opening a formal investigation into a global technology giant over AI training practices is not a routine event. It is a signal that the Data Protection Act 2019 has operational teeth, and that Kenya intends to use them.
We wrote about this story on March 4. We called for this investigation. It has now begun.
The Timeline That Got Us Here
February 27 — Swedish newspapers Göteborgs-Posten and Svenska Dagbladet publish a joint investigation confirming that footage from Meta Ray-Ban smart glasses (including bathroom visits, people undressing, bank card details, and intimate encounters) was being routed to workers at Sama's Nairobi offices for AI training data annotation. Workers were paid approximately $2 per hour, required to sign NDAs, and told that raising concerns about the content risked termination.
March 4 — TechInKenya publishes "Nairobi Is Watching Your Bedroom. And Meta Knew." — profiling the Sama operation, its documented history of exposing Kenyan workers to harmful content for OpenAI in 2023, and making the explicit case that the ODPC has jurisdiction over this processing and the authority to investigate.
March 6 — The Oversight Lab, a Kenyan digital rights organisation led by Executive Director Mercy Mutemi, files a formal petition with the ODPC requesting an investigation. The petition cites Sections 8 and 9 of the Data Protection Act. It asks the ODPC to determine whether individuals recorded by the glasses consented to their data being used to train Meta AI, whether cross-border data transfers complied with the Act, and whether Meta, Sama, or associated entities conducted a data protection impact assessment before handling the material. The petition requests completion within 90 days.
March 6 onwards — Over 150 organisations and individuals sign letters of support urging the ODPC to conduct the investigation openly, transparently, and with public stakeholder participation.
March 31 — The ODPC formally confirms it has commenced investigations, placing Kenya alongside the UK and US in active regulatory scrutiny of Meta over this specific issue.
What the Investigation Covers
The ODPC's investigation has two distinct threads, and it is worth being precise about each.
The data subjects thread — individuals worldwide whose footage was captured by Ray-Ban smart glasses and processed by Kenyan workers at Sama. These are people in Sweden, Germany, France, the US, and elsewhere who were recorded by wearers of the glasses (sometimes unknowingly, sometimes in deeply private moments) and whose footage subsequently arrived in Nairobi for annotation. The data protection question for this thread is: was there a lawful basis for transferring that footage to Kenya for processing? Did Meta conduct the required data protection impact assessment? Were affected individuals informed that their data would be processed in Kenya?
The Kenyan workers thread — the Sama annotators who were required to view and label intimate footage as a condition of employment. Their data rights, their labour rights, and the psychological harm they suffered intersect with the Data Protection Act in ways that Kenya's regulators have not previously been asked to adjudicate at this scale.
The Oversight Lab's petition covers both threads. The ODPC's confirmation letter references the processing of personally identifiable information in connection with Meta AI training, which encompasses the first thread directly and implies examination of Sama's data handling practices, which touches the second.
The Pattern That Makes This Investigation Necessary
The details of the Sama operation were not new to anyone who had followed Kenyan tech labour reporting closely. This is the same company, in the same Nairobi offices, that TIME Magazine exposed in 2023 for paying workers $1.32 to $2 per hour to label child sexual abuse material, beheadings, and suicide footage for OpenAI. Workers from that operation described developing PTSD. Sama ended that contract after public pressure, then pivoted to computer vision annotation for Meta's Ray-Ban programme.
Mercy Mutemi's framing in the Oversight Lab petition draws the explicit parallel: "Just as we saw in the Worldcoin case, Kenya is once again being used as a training ground for exploitative and harmful AI." The Worldcoin biometric data collection scandal (where Kenyan participants were paid small amounts to have their irises scanned for a cryptocurrency project) resulted in the ODPC suspending Worldcoin's operations in Kenya and remains one of the clearest examples of the regulator acting assertively against a global tech platform.
The Ray-Ban glasses investigation presents a harder regulatory challenge than Worldcoin for one important reason: the data processing is not happening to Kenyans as subjects. It is happening by Kenyans as workers, processing data about people in other countries. The legal framework for that scenario under the Data Protection Act has not been fully tested.
What the ODPC Can Actually Do
The Data Protection Act 2019 gives the ODPC meaningful powers, but their application in a case of this complexity requires careful reading.
Investigation powers — the ODPC can require Meta, Sama, and associated entities to provide information, documentation, and access to their data processing systems. This includes demanding evidence of the legal basis for processing, records of data protection impact assessments, and documentation of consent mechanisms.
Enforcement powers — if the investigation finds violations, the ODPC can issue enforcement notices requiring specific remedial actions, impose administrative fines, and in serious cases refer matters to the Director of Public Prosecutions. The maximum penalty under the Act is Ksh 5 million or 1% of annual gross turnover, whichever is lower — figures that will not frighten Meta but that establish precedent and signal regulatory intent.
The cross-border limitation — the ODPC's jurisdiction covers data processing operations conducted in Kenya. Sama's annotation work happens in Kenya. Meta's collection of footage from wearers in Europe and the US is outside Kenya's jurisdiction directly, but the transfer of that footage to Kenya for processing is squarely within it. The legal question is whether the Act's requirements for lawful cross-border data transfers were met, specifically whether the transfer was authorised under Section 48, which requires either the data subject's consent, contractual necessity, or adequacy determinations.
What Comes Next
The Oversight Lab has requested the investigation be completed within 90 days, which would put a conclusion around late June 2026. The ODPC's confirmation letter does not commit to a timeline, saying only that findings will be communicated upon conclusion.
The investigation's credibility will depend on three things. First, whether the ODPC actually engages Meta and Sama directly — demanding documentation, responses, and evidence — rather than conducting a paper review. Second, whether civil society and affected stakeholders are given the opportunity to participate that Mercy Mutemi explicitly requested. Third, whether the findings, when they arrive, result in concrete enforcement action or recommendations rather than a report that is noted and filed.
Kenya has an opportunity here that goes beyond the specific facts of this case. The Worldcoin precedent demonstrated that the ODPC can act against a global platform with real consequences. A robust investigation into Meta's Ray-Ban data pipeline (one that examines the legal basis for data transfer, the adequacy of Sama's data handling, and the rights of both recorded subjects and annotating workers) would establish Kenya as a serious data protection jurisdiction in the global conversation.
That conversation is happening at the UK ICO, at Ireland's Data Protection Commission, and in US courts simultaneously. Kenya should be in that conversation as a regulator with standing and teeth, not as a footnote in someone else's investigation.
The ODPC has taken the first step. What it does with the investigation it has opened will determine whether that step matters.
This is a follow-up to our March 4 investigation "Nairobi Is Watching Your Bedroom. And Meta Knew." Read the full original piece here.