The Grok AI tool on Elon Musk’s X will no longer be able to undress real people in the UK – but a number of similar tools remain easily accessible online, The i Paper has found.
The social media platform has told the UK Government it is “working to comply with UK law” after outcry at the use of Grok to manipulate images of women and children by removing their clothes.
X remains under investigation by UK regulator Ofcom over whether it has broken UK law by allowing users to create sexualised AI deepfakes.
However, despite the crackdown against Grok, The i Paper was able to find several other undressing apps within seconds using searches in Google, Bing and Yahoo that directed users to specific websites and Telegram.
To test the tools, The i Paper uploaded fully clothed AI-generated images of women. Some undressing apps were paywalled, but others allowed users to test them for free.
One tool, Unclothy, offered users the option of using Telegram or signing up with an email. Its website’s marketing enticed readers to “undress your boss”, “undress your colleague” and “undress your crush”.
Unclothy’s Telegram bot – which runs inside the encrypted messaging app and had more than 53,000 monthly users globally – took less than a minute to return an undressed image.
Another Telegram bot with nearly 200,000 monthly users called CyberREAL welcomed users with the words: “All your fantasies are REAL.”
Images generated using the free test option gave users the ability to “adjust breast size” on a scale from -5 to +5.
A third Telegram bot, Hottea, had more than 50,000 monthly users. It offered users one free daily image for its “deepnude” and “nudify” options, promising more to those who invite their friends.
Another “undressing” tool, Mad Journey, had both its own website and a version on Telegram. Its website gave users options to undress images as well as depict people in sexual positions with a written prompt.
After Telegram was contacted by The i Paper, all four bots were removed from the platform.
A Telegram spokesperson said: “Non-consensual pornography and the tools used to create it are explicitly forbidden by Telegram’s terms of service and such content is removed whenever discovered.
“Moderators empowered with custom AI tools proactively monitor public parts of the platform and accept reports in order to remove millions of pieces of harmful content each day, including non-consensual pornography.”
Children ‘targeted by apps’
The National Society for the Prevention of Cruelty to Children (NSPCC) said it has received growing numbers of calls to its helpline from distressed children who have been the victims of AI undressing tools.
Children as young as 13 have reported having fake images of themselves shared online or receiving threats that they would be, the child protection charity warned.
Rani Govender, policy manager for child safety online at the NSPCC, said such tools were a “very real threat that is impacting children’s safety and well-being already”.
She said children who are victims of these bots have found the experience “highly distressing” and worry that the technology is becoming so realistic that others, such as their parents, will not believe the images are fake.
“It can feel very isolating and very lonely when suddenly something like this has taken place,” she said. “We know that girls in particular have spoken about feeling like that.”
Grok’s AI tool “really accelerated” the scale of this issue but there are “definitely a whole range of nudifying apps that present a risk to women and children”, which are “easily accessible for adults and children”, she said.
Govender added: “It’s certainly a wake-up call of a risk and a harm that we’ve been seeing.”
‘I’m so scared’
Children contacting the NSPCC described being targeted by users of “nudifying” tools and feeling scared.
In one call log shared with The i Paper, a 14-year-old boy said he shared a few pictures of his face with a girl online because he was bored.
“Next thing I know, this person used some sort of deepfake AI thing to make a porn video with my face on it,” he told the helpline.
“They’re demanding money from me, and they said if I don’t pay, my life will be over.”
A 13-year-old girl said someone she met online threatened to post a fake nude of her and claim it was her if she did not send her “actual nudes”.
“She says she will tag my friends and show them that ‘it’s me’!” she said. “I worry my real friends will judge me if this happens.”
What is the law?
A law making the creation of non-consensual intimate images illegal came into force this week in the UK.
It was already illegal to share intimate images of someone without their consent under the Sexual Offences Act, but not to ask an AI tool to create them.
Elon Musk launched Grok on social media platform X in 2023. The Government will also amend another law – currently going through parliament – which would make it illegal for companies to supply the tools designed to make such images, too.
Govender said the legal changes were “welcome”, but it remains to be seen how they will be enforced. She added that tech platforms, including app stores and search engines, need to make sure such tools are not easily accessible.
Under the Online Safety Act, social media companies must act on intimate image abuse by reducing the likelihood of such content appearing in front of users and taking it down quickly once they become aware of it.
If Ofcom finds X has failed to meet these requirements, it could fine the company up to 10 per cent of its global revenue.
Ofcom already holds powers under the Online Safety Act to take action against nudification sites that fail to put age verification measures in place.
In November, it fined Itai Tech Ltd, which provides AI tools allowing users to edit images to seemingly remove someone’s clothing, a total of £55,000 for its age check failings.
A spokesperson for Mad Journey said they “take these concerns extremely seriously and have zero tolerance for the creation of non-consensual or illegal images of anyone, especially minors”.
They added that uploading content without consent of the person depicted or featuring children is clearly prohibited in its terms of service and that Mad Journey is “an experimental research product designed solely to test AI capabilities”.
Unclothy and Hottea were approached for comment. CyberREAL could not be reached for comment.
A spokesperson for Bing said: “We recognise the seriousness of this matter. As an internet search engine, Bing indexes content from across the open web and does not host or control the sites referenced.
“We take concerns about the quality and safety of search results seriously, and our systems are designed to reduce the visibility of low‑quality or harmful content in line with Bing’s principles. We’re reviewing the examples raised and will take appropriate action where results do not comply.”
A Google spokesperson said: “While search engines allow people to access sites that are available on the web, we’ve launched and continue to develop ranking protections that limit the visibility of harmful, non-consensual explicit content.”
Yahoo said it works to block searches which violate its terms from autocomplete searches and is working to completely block policy-violating searches linked to nudification.