Artificial Intelligence (AI)
Iris-scanning cameras return to Spain: what does the law say?
Paloma Firgaira
2026-02-12
5 min read
At the beginning of 2024, the Avenida de América interchange in Madrid and numerous shopping centers in Spain were filled with queues in front of metallic devices known as orbs. These devices, equipped with cameras and sensors, scanned users' faces and irises in exchange for a reward in the cryptocurrency Worldcoin, valued at around 80 euros. The Spanish Data Protection Agency (AEPD) quickly intervened, considering that the company behind the orbs was improperly processing sensitive biometric data, such as the iris, which has particularly strict legal protection.
Almost two years later, the company, now called World, returns to Spain after expanding to countries like Germany, Austria, Italy, Poland, Portugal, and the UK. Today, it opens a location in Barcelona where users can try its technology, which promises to offer a "proof of humanity" through facial and iris scanning. According to the company, this verification is useful for dating apps like Tinder, which seek to avoid fake profiles, and for video game developers concerned about the proliferation of bots.
World's return raises several questions: Can it operate in Spain again? Why was it banned in 2024? What has changed since then? At the time, the AEPD received numerous complaints over the failure to properly inform users, the collection of data from minors, and the inability to delete data after consent was withdrawn.
The iris is one of the most accurate biometric data points for identifying a person, as its pattern is unique and permanent. If someone accesses the biometric template of the iris, they could impersonate the user. Therefore, after scanning about 400,000 Spaniards, the AEPD banned Worldcoin's activities in Spain in March 2024, marking the only time the Agency has imposed such precautionary measures.
The company Tools for Humanity, responsible for collecting data for Worldcoin, halted its activities following the AEPD's order and appealed the decision to the National Court, which did not rule in its favor. Subsequently, the Bavarian data protection authority (BayLDA), where the company was based, ordered in December 2024 the deletion of all iris records collected in Europe, including those from Spain.
Despite these restrictions, World continued to collect biometric data in other regions, reaching about 40 million records globally, according to the company itself. Now, World claims to have adopted a new technological approach that complies with the General Data Protection Regulation (GDPR). According to Tiago Sada, head of product, the current system uses anonymized multi-party computation (AMPC), fragmenting the code generated by the iris into unrecognizable parts distributed globally, so that no single entity has complete access, not even to the encrypted version.
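The core idea behind fragmenting a biometric code this way is secret sharing: each fragment on its own is statistically indistinguishable from random noise, and only the combination of all fragments recovers the original. As a minimal illustration (not World's actual protocol, which involves additional cryptography), here is an XOR-based n-of-n secret-sharing sketch in Python; `iris_code` is a hypothetical stand-in for a real iris template:

```python
import secrets

def split_secret(secret: bytes, n: int = 3) -> list[bytes]:
    """Split `secret` into n XOR shares; all n are needed to reconstruct."""
    shares = [secrets.token_bytes(len(secret)) for _ in range(n - 1)]
    last = bytes(secret)
    for s in shares:
        # XOR the secret with each random share; the residue is the final share.
        last = bytes(a ^ b for a, b in zip(last, s))
    shares.append(last)
    return shares

def reconstruct(shares: list[bytes]) -> bytes:
    """XOR all shares back together to recover the secret."""
    out = bytes(len(shares[0]))
    for s in shares:
        out = bytes(a ^ b for a, b in zip(out, s))
    return out

# Hypothetical stand-in for a 256-bit iris template.
iris_code = secrets.token_bytes(32)
shares = split_secret(iris_code, n=3)
assert reconstruct(shares) == iris_code
# Any proper subset of shares is uniformly random and reveals nothing.
```

Because each share is generated independently at random, a party holding fewer than all n fragments learns nothing about the biometric code, which is the property World invokes when it says no entity has complete access.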
Data is processed locally in the orb, which uses AI models to verify the user's humanity. After verification, a certificate is generated, and the original data is deleted from the device, remaining only on the user's mobile. When humanity needs to be demonstrated, the system creates disposable World ID identifiers, allowing verification without revealing the real identity.
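The "disposable identifier" idea can be sketched simply: a secret held only on the user's device is combined with each app's identity to derive a pseudonym that is stable within one app but unlinkable across apps. The names below are hypothetical, and World's actual World ID scheme relies on zero-knowledge proofs rather than bare hashing; this is only a simplified illustration of the unlinkability property:

```python
import hashlib
import hmac
import secrets

# Hypothetical device-held secret created at verification time.
device_secret = secrets.token_bytes(32)

def app_scoped_id(app_id: str) -> str:
    """Derive a per-app pseudonym from the device secret.

    The same user always gets the same ID within one app (so bots can't
    re-register), but IDs for different apps cannot be linked together.
    """
    return hmac.new(device_secret, app_id.encode(), hashlib.sha256).hexdigest()

dating_id = app_scoped_id("dating-app")
game_id = app_scoped_id("game-studio")
assert dating_id != game_id                     # unlinkable across apps
assert dating_id == app_scoped_id("dating-app")  # stable within an app
```

An app receiving such an identifier can check that a human verified once and only once, without ever seeing the biometric data or the user's real identity.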
World has notified the AEPD of its return, and the Agency is analyzing the situation. Privacy experts are skeptical. Jorge García Herrero, data protection delegate, points out that although traditional identifiers are not collected, biometric processing still exists and can be linked to mobile identifiers. Carissa Véliz, a professor at the University of Oxford, warns that even with high privacy standards, these systems can be vulnerable to authoritarian regimes or cyberattacks.
On the other hand, there is speculation that OpenAI, the company behind ChatGPT, could launch a social network that guarantees the absence of bots through a "proof of humanity," where World’s technology would play a key role.