AI 'Nudify' Bots on Telegram Can Create Nude Photos Of Anyone, Reveals Probe

A recent investigation has revealed that AI-powered chatbots on Telegram are being used by millions of people to create nude and explicit images of real individuals. These bots allow users to alter photos with a few clicks, producing deepfakes that remove clothing or fabricate sexual activity.

According to a report by Wired, as many as 4 million users are now using these chatbots every month to create deepfakes. Experts such as Henry Ajder have raised the alarm about the situation, noting the serious risks such bots pose, especially to young girls and women.

Nightmare for women, girls
Ajder, who discovered these explicit chatbots on Telegram four years ago, described the situation as "nightmarish," citing the ease of access and the large number of people actively using them.

"It is really concerning that these tools - which are really ruining lives and creating a very nightmarish scenario primarily for young girls and for women - are still so easy to access and to find on the surface web, on one of the biggest apps in the world," he said.

In the past, celebrities such as Taylor Swift and Jenna Ortega have fallen victim to explicit deepfakes, but even more concerning are reports of teenage girls being targeted, leading to incidents of sextortion.

Images used for sextortion
The images generated through these bots can also be used by intimate partners as a form of blackmail or prolonged abuse. A survey revealed that 40% of US students reported seeing deepfakes circulating in schools.

Telegram bots, though more commonly used for benign tasks such as translations and alerts, have become a hub for this harmful activity.

When contacted by Wired about the explicit chatbots, the company did not respond, but the bots subsequently disappeared from the platform. Several of their creators, however, vowed to "make another bot" the next day.

Telegram CEO Pavel Durov was arrested in France earlier this year and charged with offences including facilitating the distribution of child sexual abuse material, but there has been little change in the platform's operations.

"Building technology is hard enough as it is. No innovator will ever build new tools if they know they can be personally held responsible for potential abuse of those tools," he had said.
