With the rise of AI-generated content, there has been much discussion about its potential for misuse. Several apps advertised on Instagram were found promoting the creation of nude photographs without the subject’s consent. Apple has cracked down on these apps, removing them from the App Store.
According to 404 Media, some companies were using Instagram ads to promote apps that could “undress any girl for free.” These ads sent people directly to the App Store to download the apps, the report says.
According to the article, Apple declined to comment directly on the story but did reach out to 404 Media for further information. 404 Media then sent Apple direct links to the apps, and the company removed three of them from the App Store. Notably, the report states that the removals were only possible because 404 Media supplied those links, suggesting Apple was unable to identify on its own which apps breached its App Store standards.
Apple’s App Store regulations are strict, yet apps like these still find a way in. Apple’s App Store guidelines for developers plainly state that apps “should not include content that is offensive, insensitive, upsetting, intended to disgust, in exceptionally poor taste, or just plain creepy.”
Apple also has a policy for user-generated content, which says: “Apps with user-generated content or services that end up being used primarily for pornographic content, Chatroulette-style experiences, objectification of real people (e.g. ‘hot-or-not’ voting), making physical threats, or bullying do not belong on the App Store and may be removed without notice.”
While Apple has removed these three apps with the report’s help, the episode raises questions about how the company intends to address harmful AI-generated content in apps going forward. The report also underscores the need for Apple to improve its app review process so that such apps never appear on the App Store in the first place.