Dark Posted October 22, 2020

The cybersecurity company Sensitive AI published an investigation into a bot that uses artificial intelligence (AI) to 'strip' photos of women and then share them on the Telegram messaging service.

According to the report, the bot uses software that generates a realistic approximation of the intimate parts of the women's bodies in the photos. The program learns to detect clothing items and the location of body parts; then, trained on many photos of clothed and nude women, it can produce a nude version of a woman's body appearing in any image.

Investigators determined that more than 100,000 women have been victims of the bot and that there are at least seven Telegram channels where the photos are shared. Channels are a feature of the application for broadcasting messages to large audiences; internet users see the publications when they subscribe to a specific channel.

Sensitive AI also explained that, through these Telegram channels, people upload the photo they want to alter, wait a few minutes, and the bot sends the image back to the user completely free. However, anyone who wants the watermarks removed from the photo must pay around $1.50 (5,700 Colombian pesos).

In an interview with 'MIT Technology Review', a specialized magazine of the Massachusetts Institute of Technology (MIT), Giorgio Patrini, CEO of the company that led the investigation, said that "it is quite obvious that most of the photographs are of minors."

Through an anonymous survey of more than 7,000 users of the bot, it was established that the majority of them are from Russia. Likewise, 63% of those surveyed said they use it to 'undress' relatives, girls, or women they know; 16% use it on photographs of celebrities; 8% on images of models or women on Instagram; 7% on women found on the internet; and 6% said they had no intention of using it on girls.

The cybersecurity company also specified that most of the victims come from countries such as Argentina, Italy, Russia, and the United States.

The CEO of Sensitive AI also told the MIT magazine that the researchers tried to contact some of the women, but none of them wanted to share their experience.

He added that they also tried to contact Telegram and the FBI but, he said, had not yet received a response from either organization.

This type of artificial intelligence is known as 'deepfake' and, according to Patrini, this is not the first time it has been used: the researchers found another page dedicated to the creation and distribution of this content on the Russian social network VK.