A new report has highlighted how Apple faces a steep challenge in stopping ‘Face Swap’ apps on the App Store that can otherwise be used to create deepfake images – including pornography.
The company polices its App Store via a “walled garden” policy under which apps are required to comply with Apple’s guidelines, but a report from 404 Media suggests “dual use” apps that incorporate features like face swapping can be used to swap faces onto pornographic content – sometimes involving minors.
Apple’s challenges with Dual Use apps
The reporter found a face swap ad on Reddit, with the app suggesting a series of websites, including pornographic ones, to swap faces into.
As the report itself says: “I tested the app and found that, for users willing to pay a subscription fee of $20 a month, it makes it incredibly easy to generate nonconsensual deepfake porn of anyone.”
“All I had to do was provide one image of the person I wanted to deepfake, and use an in-app web browser to navigate to the video I wanted them to appear in. As the ad I saw on Reddit suggested, when I navigated to a specific Pornhub video, the app automatically pulled the video and created deepfake porn of the person in the image I provided.”
“The entire process, from the moment I saw the ad on one of the most popular websites in the world to the completed deepfake video, took about five minutes, and created a highly convincing result.”
Given Apple doesn’t allow porn apps on the App Store, this seems like a way to somehow circumvent that policy while potentially sourcing content from an adult site.
While Apple Intelligence can’t generate images of that kind, the company might need to take a deeper look at third-party offerings on its App Store before long.