The scanning simply doesn't include contacting children - it covers CSAM. Talking to kids isn't CSAM. You're talking about something else altogether, something that is purely hypothetical.
> The scanning simply doesn't include contacting children - it covers CSAM.
My understanding is that they are not only talking about a list of hashes of known illegal images, but also about some kind of machine learning. And they are not only talking about scanning images, but about scanning text, too.
I don't know what you expect them to report when scanning conversations with machine learning.
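To make the distinction concrete, here is a minimal Python sketch of the hash-list half of the proposal. It assumes plain SHA-256 for illustration (real deployments use perceptual hashes such as PhotoDNA that survive re-encoding), and the blocklist entry and function name are hypothetical. The point: exact hash matching can only flag files that are already on a known list, whereas an ML classifier scores previously unseen images or text against a threshold.

```python
import hashlib

# Hypothetical blocklist of SHA-256 hashes of known images. The entry
# below is the hash of the empty byte string, included only so this
# demo is runnable.
KNOWN_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def matches_known_image(data: bytes) -> bool:
    """Exact-match lookup: can only flag files already on the list."""
    return hashlib.sha256(data).hexdigest() in KNOWN_HASHES

# A machine-learning classifier, by contrast, assigns a score to
# arbitrary new content and must pick a cutoff - which is where false
# positives on ordinary conversations come from.
print(matches_known_image(b""))   # True: on the (demo) list
print(matches_known_image(b"x"))  # False: unknown content, never flagged
```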
The reason is that it works. They're not stupid - they can use Signal.
The reality is that the privacy options not only exist, they're really good - often better and easier to use than the mainstream stuff.
They will just pivot to other tools.