Artificial Intelligence (AI)
Popular iPhone app pays to record calls for AI: serious security flaw exposed
Gianro Compagno
2025-09-26
5 min read
The sale of personal data is no longer a distant possibility: it is a booming reality. A recent precedent involved Spotify, where a third-party service paid users for sharing their profiles and listening habits, then resold that data to tech companies. What looked like a simple exchange of music data turned into a commercial transaction in our privacy. Now Neon takes this model into far more delicate territory: phone calls, transforming the intimacy of our conversations into a high-value product.
Neon: turning voice into money
Neon is an app that proposes monetizing phone calls. Its slogan is clear: “talk, record, and earn.” It offers users the chance to earn “hundreds or even thousands of dollars a year” by allowing their conversations to be used to train artificial intelligence systems. The proposal's appeal sent its popularity soaring: within days, Neon ranked among the top three most downloaded apps in the Social Networking category of the U.S. App Store.
How does Neon work?
Neon's system is simple: it pays 30 cents per minute when both parties use the app, and 15 cents if only one does, with a daily limit of $30. It also incentivizes user acquisition with a referral program that grants $30 for each successful registration. Recordings always capture the Neon user's side of the call; if both participants use the app, both sides are recorded.
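The payout rules above can be sketched as a small calculator. This is purely illustrative, built from the rates quoted in the article ($0.30/min two-sided, $0.15/min one-sided, $30 daily cap, $30 per referral); the function name is hypothetical, and it assumes the daily cap applies to call earnings only, which the article does not specify.

```python
# Illustrative sketch of Neon's published payout rules, as reported:
# $0.30/min when both parties use the app, $0.15/min when only one does,
# call earnings capped at $30/day, plus $30 per successful referral.
# Assumption (not stated in the article): the cap excludes referral bonuses.

def daily_earnings(minutes_both: float, minutes_single: float,
                   referrals: int = 0) -> float:
    """Estimate one day's payout under the stated rates."""
    call_pay = minutes_both * 0.30 + minutes_single * 0.15
    call_pay = min(call_pay, 30.00)            # daily cap on call earnings
    return round(call_pay + referrals * 30.00, 2)

print(daily_earnings(60, 0))        # 60 two-sided minutes -> 18.0
print(daily_earnings(120, 100))     # cap kicks in -> 30.0
print(daily_earnings(0, 10, 2))     # 10 one-sided minutes + 2 referrals -> 61.5
```

Even at the capped rate, a user would need a full $30 day, every day, for months to reach the advertised “thousands of dollars a year.”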
Terms of use: the real fine print
Beyond the financial incentives, Neon's true scope lies in its terms of use. Users grant the company a “worldwide, exclusive, irrevocable, and transferable” license over their recordings, allowing Neon to sell, modify, create derivative works from, and distribute the audio in any format, present or future. The app also includes beta features offered with no warranties or liability in case of failure, further widening the margin for data use.
Availability and popularity
Neon's success was as rapid as it was unexpected: it reached second place among the most downloaded social apps in the U.S. App Store. Its availability, however, appears limited to that country: tests conducted from Spain found the app neither listed nor downloadable.
A serious security flaw
Neon's story took a turn when a technical analysis revealed a serious vulnerability: the app did not adequately protect its users' data. According to TechCrunch, it was enough to create an account and analyze network traffic to access third-party information. Following the alert, the founder shut down the servers and announced a “security pause,” without mentioning the breach. The exposed data included:
- Phone numbers associated with accounts
- Public links to audio recordings
- Complete call transcripts
- Metadata with duration, date, and payments received
Exposing this data makes it possible to reconstruct private conversations and link them to specific individuals, opening the door to risks such as identity theft or the creation of synthetic voices.
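The flaw TechCrunch described, where any logged-in user could pull other people's recordings and transcripts off the wire, is a classic case of broken access control (often called an IDOR). A minimal sketch of the pattern, with entirely hypothetical names and data that are not Neon's actual backend:

```python
# Hedged sketch of the *class* of flaw reported: an endpoint that checks
# that the caller is authenticated, but not that the requested record
# belongs to them. All identifiers and data here are invented.

RECORDINGS = {
    "rec-1": {"owner": "alice", "transcript": "...", "phone": "+1-555-0100"},
    "rec-2": {"owner": "bob",   "transcript": "...", "phone": "+1-555-0199"},
}

def get_recording_insecure(session_user: str, rec_id: str) -> dict:
    # BUG: any authenticated user can fetch any recording by guessing IDs.
    return RECORDINGS[rec_id]

def get_recording_secure(session_user: str, rec_id: str) -> dict:
    rec = RECORDINGS[rec_id]
    if rec["owner"] != session_user:       # enforce ownership server-side
        raise PermissionError("not your recording")
    return rec

# "alice" can read bob's phone number through the insecure endpoint...
print(get_recording_insecure("alice", "rec-2")["phone"])   # +1-555-0199
# ...while the ownership check would have stopped her:
# get_recording_secure("alice", "rec-2")  -> raises PermissionError
```

The fix is conceptually trivial, which is what makes the incident notable: the data was exposed not by sophisticated attack, but by the absence of a per-record authorization check.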
Promises and reality
Neon claims to protect privacy through the anonymization of conversations, deletion of personal data, and sales only to verified companies. However, the incident demonstrated that these mechanisms are not infallible. After the temporary shutdown, official communication was limited to promising “new layers of security,” without acknowledging the magnitude of the breach.
Final reflection
Neon's downfall does not resolve the underlying question: how much is our privacy worth in the age of artificial intelligence? The model of paying for recorded calls could be replicated in other markets, driven by the growing demand for data to train intelligent systems. What happened in the United States is an early warning: the commercialization of intimacy is already a reality, and the decision to participate or not ultimately lies with each user.