Elon Musk's X to block Grok from undressing images of real people
Elon Musk's X has announced that its AI tool Grok will no longer allow users to remove clothing from images of real people in jurisdictions where doing so is illegal, following widespread concern over sexualized AI deepfakes. The UK government and the regulator Ofcom have welcomed the move, though their investigations into whether the platform broke UK law remain ongoing. Victims and campaigners have criticized the platform for the delay in implementing these changes, noting the harm that has already been done. X has also announced that only paid users will be able to edit images with Grok, saying the restriction adds an extra layer of protection and helps ensure that those who try to abuse the tool can be held accountable.