Why does Apple even have ratings and parental controls?
I think maybe Apple needs to get out more. I’m pretty sure they have no idea what pornography is.
When the Hottest Girls app was released earlier this week claiming to be the “first iPhone app with nudity” on the iTunes app store, it was both exciting and expected. After all, Apple had announced that the new iPhone OS would have parental controls, and the new SDK would allow developers to assign adult-themed ratings to applications, so the general consensus was that apps with nudity and other adult themes were only a matter of time.
However, it now appears that the Hottest Girls app has been pulled by Apple only a day after we broke the news. Apple PR has released a statement saying: “Apple will not distribute applications that contain inappropriate content, such as pornography. The developer of this application added inappropriate content directly from their server after the application had been approved and distributed, and after the developer had subsequently been asked to remove some offensive content. This was a direct violation of the terms of the iPhone Developer Program. The application is no longer available on the App Store.”
Now, there are a couple of very big problems with Apple pulling this app from the store. First, technically, the app is not distributed with objectionable material – it pulls images from the developer’s server. In effect, Apple already distributes an app with similar capabilities with every iPhone and iPod touch: it’s called Safari. Secondly, the general consensus in the comments on this issue seems to be that topless women posing does not constitute “pornography” for the general population these days (well, the general internet-savvy population, at least). Given how readily available genuine hard-core images and video are on the web, the fact is most kids (and every adult) have seen far more graphic sexual images than were seen in the Hottest Girls app.
But the real problem is that Apple itself is sending confusing messages to its developers. The latest iPhone software developer kit includes a rating scheme where you, the developer, check off items from a list of things that may be inappropriate for some viewers, and the SDK will generate its own rating for your app when it appears on iTunes. The ratings interface carries this warning:
“Applications must not contain any obscene, pornographic, offensive or defamatory content or materials of any kind (text, graphics, images, photographs etc.,) or other content or materials that in Apple’s reasonable judgment may be found objectionable.”
That is extremely subjective and confusing, especially in the context of the Apple-supplied ratings interface. Just take a look at the screenshot below.
As you can see, Apple has TWO categories for flagging adult content of a sexual nature, each of which can be rated as “None”, “Infrequent/Mild”, or “Frequent/Intense”. The first, Mature/Suggestive Themes, would seem to refer to sexual themes where you may not actually see the act – such as a Grand Theft Auto-type game where you might take a hooker into a parked car, see it rocking, and then come out smiling. But the second and more confusing category, Graphic Sexual Content and Nudity, would seem to imply that Apple fully expects nudity, and perhaps quite a bit more, to be included in apps. If it didn’t, why put it there? Why not simply say “No nudity” instead of using subjective words like “pornography” and “obscene”? As it stands now, apps like Hottest Girls that actually DO fill out the form correctly will see their app listed as below, with a 17+ rating for following Apple’s rules, which seems right.
Above: This is how Hottest Girls appeared (briefly) on the iTunes store.
So, we are still left wondering, “why was this app pulled?” But the answer I suppose breaks down into the bigger question of “What is pornography?” Most people have rightly pointed out that in pretty much every other country outside of the US (and I will assume some Muslim countries) topless beaches are the norm, and photos of topless women grace newspaper pages uncensored. The type of images the Hottest Girls app provided would hardly be considered porn by most, and certainly not the “Intense/Frequent Graphic Sexual content and nudity” that the developers themselves rated their app.
I’m not saying Apple HAS to allow nudity/adult material on the app store, I’m saying it is sending out mixed signals. I understand the potential embarrassment Apple is trying to avoid when the top 100 paid apps would all be porn apps, but really, it isn’t Apple that should be embarrassed so much as society. My problem is that Apple claims topless women are “objectionable” while ultra-violent scenes of gore and torture are not. If the new ratings system and parental controls are designed to stop kids under 17 years old from seeing one thing, why can’t we trust them to work for both? There is no set definition of “offensive”, and while I have personally never been offended by anything I have ever seen, read, or heard, I know there are a ton of uptight people out there looking for fights who would claim violent video games are worse for their kids than seeing a naked woman.
So to me, the question left posed to Apple is, if topless shots of bikini models are too hardcore for the “Intense/Frequent Graphic Sexual content and nudity” category, what isn’t?
“If it didn’t, why put it there? Why not just rely on their initial “no pornography” statement and leave it at that?”
Maybe they put the message out twice to make sure it gets filtered out. There are zero restrictions on the kinds of images and videos that people can put on their own phone. But why does Apple have to get into the business of selling it? Since when does any company have any kind of obligation to sell a certain category of product? If Apple doesn’t think it fits Apple’s brand, then Apple doesn’t have to sell it. Companies make these kinds of decisions every day. So call it nudity, call it pornography, call it whatever you want, if Apple doesn’t want it in its App Store, then Apple doesn’t have to put it in its App Store.
Doc, I think the fact the developer “snuck” the content onto his servers AFTER the app was approved by Apple had something to do with it being pulled… And regarding Apple’s stance, I for one am very glad that the AppStore is – for now – safe from bombardment with thousands of trash soft-porn apps from devs looking to make a quick buck…
If you choose “Infrequent/Mild Graphic Sexual content and nudity” from the iTunes Connect interface, the web application tells you what graphic sexual content isn’t allowed on the App Store.
From the screenshot I can tell you that the author didn’t check the box for graphic sexual content, e.g. you can write about big, juicy and nice boobs, but you can’t show them.
Perhaps, Dr Mac, you could explain why you yourself, along with many other bloggers and websites, promoted this app as “porn”. Your article was entitled “And then there was porn”. So you are also responsible for mislabeling this app as porn when it was nothing more than naked women.
@DocEverywhere,
then how did Hottest Girls get on the iTunes store with that rating? Are you saying if an app is approved, and then changes its own rating to indicate sexual content, then it goes through without any approval process?
Doc haven’t you been reading the financial press?
Apple can’t get out more because it had a liver transplant 2 months ago!
Rather a confusing system Apple has set up. What’s the difference between “mildly” graphic and “intensely” graphic? And you failed to note that there’s yet a third category (below “Realistic Violence”) which is also called “sexual content or nudity” — how would one decide if it’s “intense” or “graphic”?
Seriously, it sounds like this was put together ad hoc in the wake of a couple of (if you’ll pardon the pun) bad apple apps.
It occurred to me that Apple should open an “adult” section of the App Store. That way parents can completely block their kids from those apps with parental controls and adults can buy whatever legal adult applications they want. Apple doesn’t have to include “adult” App Store applications with their Top 10, if they don’t want to.
Apple isn’t selling these apps so much as it is distributing them. Whether I want to buy a soft-porn app for my iPhone or not, I don’t want Apple making the choice for me. This reminds me that years ago, at Macworld Expos, Apple had a separate section, away from the main show floor, for “sexually mature” applications. There were a couple of short news segments on it, but no one seemed to care. It was what it was.
I personally feel that Apple should distribute any legal application with suitable labeling and let adults decide for themselves what apps they want to download.
Jonro, an Adults Only section in the App Store makes perfect sense.
It would be clearly defined and clearly able to be blocked by whoever decides not to go there.
I also think Apple needs to add a few more categories to the App Store. Specifically, there really needs to be a “Religion and Spirituality” category. I’ve run many searches (with the App Store’s woefully inadequate search function) and been inundated with religious apps (of all denominations) masquerading as “Education”, “Research” and “Lifestyle”.
I have nothing against religion (keep it to yourself please), but like any good bookstore, there are many categories and subcategories to make finding things easier.
It seems that Apple is twisting itself into knots trying NOT to offend people.
Earth to Apple… in this climate of Political Correctness and societal nannification, you can be guaranteed that almost ANYTHING will offend SOMEONE and all it takes is ONE complainer to cause a stink and get something banned. It may generate a lot of publicity and some will gain through controversy, real or fabricated.
Please Apple, let adults decide for themselves and for their OWN kids.
“Why does Apple even have ratings”?
(1) Because they want to (i.e. it is theirs).
(2) Because there is a large segment of the public that likes a venue that isn’t trashed. It’s a real differentiator. Disney doesn’t seem too harmed by similar policies.
“it pulls images from the developer’s server. In effect Apple already distributes an app with similar capabilities with every iPhone and iPod touch called Safari.”
I’m sorry, but this holds no water. Safari is a web browser; Hottest Girls was an app dedicated to pulling down these pictures off this server. It’s one thing when your Twitter app or your ebook reader connects to servers you don’t control where most of the content is harmless or even beneficial; it’s something else when the point of the app is to look at pictures of naked women. And who’s to say that graphic depictions of sexual acts wouldn’t appear on that server?
People don’t want the baby shaking game, but they want the porn. Oh, and Apple shouldn’t be capricious in its rejections. Whatever.
The whole nudity vs. violence in movies and games conflict has a name. Call me paranoid, but I consider it a global birth control.
Adults Only section in the App Store
And with its own list of top 100 … or is that topless 100? 🙂
iWhore, your new app to find relief anywhere, with full working geolocation, pictures, measurements, and rates per service.