A quick update - The European Commission and EU member states have been pondering, for years now, whether they should force WhatsApp/Apple/Signal/Telegram to scan all our private messages for suspected child sexual abuse material (CSAM). For various reasons it is a horrendous idea to break end-to-end encryption in this likely highly ineffective way. Variations of the proposal have also included a mandate to perform such scanning of images using AI, and even to read our text messages to check whether we are "grooming" children.
Too bad politicians don't have to base their decisions on expert opinion. Any credible expert will answer that this is a bad idea, but the appeal here is "easy and quick solutions to difficult problems".
They'll just do what they did before - "Our experts say it's perfectly safe and secure. No, we won't tell you the names of our experts, to protect their privacy and personal safety."
Every time you have a child that’s abused, you have a child that’s abused.
Seems like the solution to this is the normal way we deal with child abuse: social workers and school counselors.