An app that offers to record your phone calls and pays you for the audio so it can sell the data to AI companies is, unbelievably, the No. 2 app in the Social Networking section of Apple's U.S. App Store.
The app, Neon Mobile, pitches itself as a money-making tool, offering “hundreds or even thousands of dollars a year” for access to your audio conversations.
Neon's website says the company pays 30 cents per minute when you call other Neon users, and up to $30 per day for making calls to anyone else. The app also pays for referrals. The app first ranked No. 476 in the Social Networking category of the U.S. App Store on September 18, but later jumped to No. 10, according to Appfigures.
On Wednesday, Neon was spotted in the No. 2 position on the iPhone's top free charts for social apps.
Neon also climbed the overall chart of top apps and games on Wednesday morning, becoming the No. 6 top app.
According to the company's terms of service, its mobile app can capture users' inbound and outbound phone calls. However, Neon's marketing claims it records only your side of the call, unless the call is with another Neon user.
That data is sold to “AI companies,” Neon's terms of service state, “for the purpose of developing, training, testing, and improving machine learning models, artificial intelligence tools and systems, and related technologies.”
That an app like this exists, and is permitted on the app stores, is an indication of how far AI has encroached into areas of users' lives once considered private. Meanwhile, its high ranking on the Apple App Store is proof that some portion of the market is clearly willing to trade its privacy for pennies, regardless of the larger costs to themselves or society.
Whatever Neon's privacy policy says, its terms include a very broad license to user data, granting Neon:
… a worldwide, exclusive, irrevocable, transferable, royalty-free, fully paid right and license (with the right to sublicense through multiple tiers) to sell, use, host, store, transfer, publicly display, publicly perform (including by means of a digital audio transmission), communicate to the public, reproduce, and modify for the purpose of formatting for display … in any media formats and through any media channels, in each case whether now known or hereafter developed.
That leaves a lot of wiggle room for Neon to do more with users' data than it claims.
The terms also include an extensive section on beta features, which come with no warranty and may have all sorts of issues and bugs.

Though the Neon app raises many red flags, it may be technically legal.
“Recording only one side of the phone call is aimed at getting around wiretap laws,” Jennifer Daniels, a partner in law firm Blank Rome's privacy, security, and data protection group, tells TechCrunch.
“Under [the] laws of many states, you need the consent of both parties to a conversation in order to record it … It's an interesting approach,” says Daniels.
Peter Jackson, a cybersecurity and privacy attorney at Greenberg Glusker, agreed, and tells TechCrunch that the language around “one-sided transcripts” sounds as though it could be a way of saying that Neon records users' calls in full but simply removes what the other party said from the final transcript.
The legal experts also pointed to concerns about how anonymized the data may really be.
Neon claims it removes users' names, email addresses, and phone numbers before selling data to AI companies. But the company doesn't say how the AI companies or other partners it sells to could use that data. Voice data could be used to make fake calls that sound as if they're coming from you, or AI companies could use your voice to create AI voices of their own.
“Once your voice is out there, it can be used for fraud,” says Jackson. “Now this company has your phone number and basically enough information: they have a recording of your voice, which could be used to impersonate you and carry out all kinds of fraud.”
Even if the company itself is trustworthy, Neon doesn't disclose who its trusted partners are or what those entities are allowed to do with users' data further down the line. And Neon is subject to potential data breaches, like any company with valuable data.

In a brief test by TechCrunch, Neon gave no indication that it was recording the user's call, nor did it warn the recipient of the call. The app worked like any other voice-calling app, and the caller ID displayed the incoming phone number as usual. (We'll leave it to security researchers to verify the app's other claims.)
Neon founder Alex Kiam did not respond to a request for comment.
Kiam, who is identified only as “Alex” on the company's website, operates Neon out of a New York apartment, a business filing shows.
A LinkedIn post indicates that Kiam raised money from Upfront Ventures for his startup a few months ago, but the investor had not responded to TechCrunch's inquiry as of the time of writing.
Has AI numbed users to privacy concerns?
There was a time when companies looking to profit from data collection via mobile apps ran these kinds of operations on the sly.
When it was revealed in 2019 that Facebook was paying teens to install an app that spied on them, it was a scandal. Headlines followed again the next year when it was discovered that app store analytics providers were operating dozens of seemingly innocuous apps to collect usage data about the mobile app ecosystem. There are regular warnings to be wary of VPN apps, which are often not as private as they claim. There are even government reports detailing how agencies regularly buy personal data that is “commercially available” on the market.
Now AI agents regularly join meetings to take notes, and always-on AI devices are on the market. But at least in those cases, everyone consents to the recording, Daniels points out.
In light of this widespread use and sale of personal data, users may now be cynical enough to think that if their data is being sold anyway, they might as well profit from it themselves.
Unfortunately, they may be sharing more information than they realize, and exposing the privacy of others when they do.
“There is certainly a great desire, certainly among knowledge workers, and frankly among everyone, to make their work easier,” says Jackson. “And some of these productivity tools do so at the expense of, obviously, your own privacy, but also, increasingly, the privacy of the people you interact with every day.”