Apple pulls 500px photo-sharing app from App Store over nude images (Update: and possible Child Porn)


Apple has yanked popular photo-sharing app 500px for iPhone and iPad from the App Store over “concerns of nude photos,” according to a report by TechCrunch.

The Cupertino, Calif.-based company pulled it from the App Store around 1 a.m. EST on Tuesday. The startup’s COO, Evgeny Tchebotarev, told TechCrunch that Apple doesn’t want children to search and find nude photos unintentionally via the app:

The move came shortly after last night’s discussions with Apple related to an updated version of 500px for iOS, which was in the hands of an App Store reviewer. The Apple reviewer told the company that the update couldn’t be approved because it allowed users to search for nude photos in the app. This is correct to some extent, but 500px had actually made it difficult to do so, explains Tchebotarev. New users couldn’t simply launch the app and locate the nude images, he says, the way you can today on other social photo-sharing services such as Instagram or Tumblr. Instead, the app defaulted to a “safe search” mode in which these types of photos were hidden. To turn off safe search, 500px actually required users to visit its desktop website and make an explicit change.

Tchebotarev clarified that 500px does not allow pornography, as it is against the service’s terms and conditions, and the nudes found within the community tend to be “artistic” in nature. The app also depends on users to flag inappropriate images, but the company is working on a feature that will automatically identify and tag nude images so they won’t appear in search.

500px told Apple yesterday that it would make any necessary changes to the app to rectify the situation, but Apple apparently couldn’t wait. Tchebotarev said, as retold by TechCrunch, “the changes 500px promised Apple should be done now and are being submitted immediately.”


Update: An Apple spokesperson supplied The Next Web with the following statement about the removal:

The app was removed from the App Store for featuring pornographic images and material, a clear violation of our guidelines. We also received customer complaints about possible child pornography. We’ve asked the developer to put safeguards in place to prevent pornographic images and material in their app.

Get the full report at TechCrunch.

