Apps that create non-consensual sexualised images—often referred to as “nudify” apps—are reportedly still available on Apple's App Store and Google Play, despite clear policies banning such content. According to a report by the Tech Transparency Project, searching terms like “nudify” or “undress” in the app stores continues to surface tools that can digitally alter images of individuals, including celebrities, to make them appear nude or partially undressed. These apps are not only accessible but, in some cases, also promoted through search suggestions and advertisements, raising serious concerns about platform accountability.
The scale of the issue is significant. The report estimates that such apps have been downloaded hundreds of millions of times and have generated substantial revenue through subscriptions and in-app purchases. While both companies maintain strict guidelines prohibiting pornographic or exploitative content, enforcement appears inconsistent. Researchers noted that even after previous crackdowns and removals, similar apps reappear quickly, often disguised as generic AI image tools but capable of misuse. This cycle highlights the difficulty in regulating rapidly evolving AI-driven applications.
Both Apple and Google have responded to the findings by taking action against some of the identified apps. Apple confirmed it removed several apps after being alerted, while Google stated that many apps violating its policies had been suspended and investigations were ongoing. However, experts argue that these reactive measures are not enough. The platforms’ review systems may fail to detect harmful capabilities when apps present themselves as harmless or multipurpose tools, allowing problematic features to slip through initial screening.
The issue has also drawn attention from policymakers and regulators globally. Governments are increasingly pushing for stricter rules to combat the spread of non-consensual sexual content online. Laws such as the “Take It Down Act” in the United States aim to criminalise the sharing of such material and hold platforms accountable for failing to remove it promptly. Similarly, proposed legislation in other countries could impose legal consequences on tech executives if their platforms do not adequately control harmful content.
Experts warn that beyond policy enforcement, the design of app store algorithms plays a major role in the visibility of these apps. Search rankings and recommendation systems often prioritise engagement, which can inadvertently amplify controversial or harmful applications. As AI technology continues to advance, the challenge for tech giants will be not only enforcing rules but also proactively preventing misuse—ensuring that innovation does not come at the cost of user safety and dignity.