I know I'm breaking the HN spirit, but I'm fucking horrified by this thread. _So_ many people being happy that a company is protecting them from themselves, or from their family members. Where are my controversial personal websites? Did I take a wrong left turn somewhere?
This is an opt-in feature. You can enable it for yourself. It is also available as a parental control for your kid's devices. I think that is 100% appropriate.
I don't want to see dick pic spam. Actually I don't want anyone to send me naked pictures. Previously it was up to every messaging app to figure this out themselves. Now they can use the Sensitive Content Analysis framework. It also means they don't need to give PTSD to humans building a model to classify crap like CSAM images.
All it does is detection. It is up to the app whether to prohibit sending/receiving the message, how to notify the user, etc. The API documentation says this:
> Apple provides the SensitiveContentAnalysis framework to prevent people from viewing unwanted content, not as a way for an app to report on someone’s behavior. To protect user privacy, don’t transmit any information off the user’s device about whether the SensitiveContentAnalysis framework has identified an image or video as containing nudity. For more information, see the Developer Program License Agreement.
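To make the "all it does is detection" point concrete, here is a minimal sketch of how a messaging app might call the framework before displaying a received image. This assumes iOS 17+/macOS 14+ and the `com.apple.developer.sensitivecontentanalysis.client` entitlement; `receivedImageURL` and `shouldBlurImage` are hypothetical names for illustration, not part of Apple's API.

```swift
import SensitiveContentAnalysis

// Decide whether to blur an incoming image before showing it.
// What happens after detection (blur, warn, block) is entirely up to the app.
func shouldBlurImage(at receivedImageURL: URL) async -> Bool {
    let analyzer = SCSensitivityAnalyzer()

    // The policy reflects whether the user (or a parent, via Communication
    // Safety) has enabled the feature in Settings; when it is .disabled,
    // there is nothing to analyze.
    guard analyzer.analysisPolicy != .disabled else { return false }

    do {
        // Detection runs entirely on-device; per Apple's guidelines, the
        // result must not be transmitted off the device.
        let analysis = try await analyzer.analyzeImage(at: receivedImageURL)
        return analysis.isSensitive
    } catch {
        // On analysis failure, fall back to showing the image unmodified;
        // an app could just as well choose to fail closed here.
        return false
    }
}
```

Note that the framework only returns a verdict (`isSensitive`); the warning screens, parental notifications, and so on discussed below are UI built by the app or OS on top of that boolean.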
This is how they sold it: of course, nobody wants to see unsolicited dick pics. Good framing, they obviously pay their marketing department very well.
Yet consider how many unsolicited dick pics you actually see in a day, week, or month, and whether sometimes blocking a percentage of them is worth permanently ceding more privacy to Apple.
If you’re seeing so many dick pics that this is a positive trade-off for you, you need to reflect on how you’re interacting with the web.
I’m going to go out on a limb and guess you’re a dude.
I suggest you expand your social circle and talk to some women that are moderately active on social media.
Women nowadays are confronted with AI-generated fake porn made from their faces; hell, some are confronted with generated sexual abuse imagery as a form of threat and harassment if they piss off the wrong person, and you’re out here talking about dick pics.
What’s worse is that you just couldn’t resist adding a dash of misogynistic victim blaming as a cherry on top.
It’s not even being prudish. It’s clearly informing you: hey, be careful about sending nudes because you might not realize the consequences. The messaging is about as nonjudgmental as it gets. As someone who has literally made amateur porn and is probably as far from prudish as it gets, this really doesn’t register on my prude detector.
Why? It's a slab of silicon scanning a representation of a photo and spitting out a probability. It's no less creepy than originally taking the photo, postprocessing it, and categorizing it in your Photos app.
If any kind of metrics leave the phone's chassis as a result, then it's quite creepy, but I was operating under the assumption that they do not.
There is a real problem out there of unsolicited nudes; I can see this being a welcome capability for a lot of people [0]. It doesn't nudge up against any privacy issues. Seems like a good idea overall.
Of course, there is also a real issue with the fact that, as closed-source software on a locked-down platform, we can't know what happens next. But that is just part of the deal with iPhones; there is already a lot of data like that (e.g., I'd expect the US uses iPhone GPS data from targets to hunt them down).
[0] Not sure what the feature actually does because nobody has posted details here, so there is some guesswork here.
If it detects cat pictures, what evil thing is it going to do? Label it as a photo of a pet (I don't even know, I don't use a scanning phone)?
If it detects nudity, what kind of unwanted behavior might it exhibit then, report to legal guardians? Not the picture itself but even just that the device is being used for that.
I can see how this scanning+warning is more creepy than scanning+labeling cat pictures, even if the information screen tells you it was just used for this warning screen.
It's not only about trust, but also about knowing what the system does above board. The documentation can say exactly what it does, but who reads that? At least not until they get a "hey, that's an interesting pic you took there" pop-up.
I'd also be creeped out, but honestly it's not bad to be a bit paranoid about who can see what you're sending onto the internet, and to double-check that things are as they should be.
Optional things like this are fine. Preventing me from joining e.g. NSFW Discord servers wholesale is not, imo. As an adult I should be able to use my phone however I want.
Given the internet as it is, and as it has been even back when FOSS discussions included hating GIF because of patent enforcement, kids shouldn't be on the (general) internet at all.
Smartphones are even worse, given the deliberate attempts to make content more addictive.
I'm not sure how to square that particular circle with the likelihood of social exclusion from not being online — it's not like me putting (general) in brackets in the first paragraph will convince the right people that there's money to be made in a genuinely safe subset, despite the existence of YouTube Kids and whatever Netflix' thing is called.
So, I can’t make stickers with penises in them for whatever reason (you’d think they’d lump creating and receiving them into the same setting), but the ‘are you sure you want to send or receive something that looks naughty’ warning was, in fact, turned off by default for me. Anyone else?
This was a "welcome to the new version, let's get started" walkthrough optional setting for me. So the user here turned it on and acts surprised it does what it says?
The new major OS releases include a version of this feature that you can enable without the parental control context/overhead. So for example if you don't want to see random penises that might get messaged to you, you can avoid seeing that.
It is optional and disabled by default, just like the child-oriented Communication Safety feature set. They call the adult-oriented version "Sensitive Content Warnings".
I’ve been to more sex parties than I can count and there’s definitely a significant amount of porn on my iOS devices, so I’d not consider myself prude.
Still, I think that giving people the option to hide unsolicited dick pics on their devices is a good idea, not a prudish one.
I think the same about an option hiding pictures of spiders.
Extrapolating from that, one day we may reach options for hiding pictures of queer or Black people, or of mixed couples, which will be an interesting situation.
I don't think this screen would prevent anyone doing CSA from sending that M, and a C independently wishing to send SM is not necessarily being A, so I'm not sure what you mean
Does that sort of content show up often for people who would not prefer it and don't go looking for it?
As someone who's used earlier versions of iOS for some years now, and who knows a bunch more people who also have, that's not a problem I'm aware of any of us ever experiencing. I realise that anecdotes are not data, but it doesn't seem like it should be a common issue at all...
That's not what your article is about, though, nor something it so much as mentions in passing, so I find the critique that the provided data is not about CSAM legitimate.