The prevailing opinion here seems to be that we’d really like for there not to be an omnipresent panopticon, whatever the excuse: protect the children, or terrorists, or, apparently, malware. If your imagination is particularly lacking on how this might be weaponized, just remember that antifa is now designated as a terrorist organization in the US, so you had better not be a suspected member of it. As in, you had best not have sent a buddy a message on Signal about how those tiki-torch-carrying nazi LARPers aren’t exactly great guys, or off to a black site you go for supporting terrorism.
If you want to prosecute people, send physical goons, who exist in limited quantity, rather than relying on pervasive surveillance of everybody and everything, which is limitless and gets cheaper and better by the day.
OK, sorry to keep repeating myself here, but... I strongly oppose any kind of "panopticon" like ChatControl.
What I would like to see is, say, Signal complying with lawful interception orders in the same way that any EU telecoms provider currently does.
So: provide cleartext contents of communications to/from a clearly identified party, for a limited time, by judicial order, for a clearly specified reason.
> pervasive surveillance of everybody and everything
This is exactly what lawful intercept laws are supposed to prevent. And yes, of course there is abuse, but under a functioning rule of law there are at least ways to remedy that, unlike with mass surveillance and/or malware...
> I strongly oppose any kind of "panopticon" like ChatControl. What I would like to see, is, say, Signal [...] provide cleartext contents of communications to/from a clearly identified party
Those statements simply aren't compatible.
Right now, Signal is designed by cryptography experts to provide the best privacy we know how to build: messages are only readable by you or the intended recipient. "Lawful intercept" necessarily means some additional third party is given the ability to read messages.
It doesn't matter what kind of legal framework you have around that, because you can't just build a cryptosystem where the key is "a warrant issued under due process." There has to be a system, somewhere, that has access to plaintext messages and can give law enforcement and courts access. The judges, officers, technicians, suppliers, and software involved in building and using this system are all potential vectors by which this access can be compromised or misused -- whether via software or hardware attacks, social engineering, or abuse of power.
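The structural point can be seen in a toy sketch. This is not real cryptography (the XOR "cipher" and all names are invented for illustration): it just shows that any escrow-style intercept design must wrap the per-message key under some long-lived key held by a third party, so compromising that one key unlocks every message it ever wrapped.

```python
# Toy sketch (NOT real cryptography): why "lawful intercept" requires
# a third key-holder. The XOR keystream cipher and all names here are
# invented for illustration only.
import hashlib
import secrets

def keystream(key: bytes, n: int) -> bytes:
    # Derive n bytes of keystream from a key (toy stand-in for a real cipher).
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]

def xor(data: bytes, key: bytes) -> bytes:
    # Encrypt/decrypt by XOR with the derived keystream (self-inverse).
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

# End-to-end model: a fresh message key shared only by sender and recipient.
message = b"meet at noon"
msg_key = secrets.token_bytes(32)
ciphertext = xor(message, msg_key)

# Escrow ("lawful intercept") model: the same message key is ALSO wrapped
# under a long-lived escrow key held by a third party.
escrow_key = secrets.token_bytes(32)
wrapped_key = xor(msg_key, escrow_key)

# Whoever obtains escrow_key -- court order, insider, or Salt Typhoon-style
# compromise -- recovers msg_key, and the same goes for every other message
# wrapped under it. One long-lived secret now unlocks all traffic.
recovered_key = xor(wrapped_key, escrow_key)
assert xor(ciphertext, recovered_key) == message
```

The point of the sketch: the legal framework governs who may use `escrow_key`, but the cryptosystem cannot distinguish a warrant from a theft; both present the same bytes.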
Maybe your country has "functioning rule of law", and every single government official and all the vendors they hire are pure as snow, but what about all the rest of us living in imperfect countries? What about when a less-than-totally-law-abiding regime comes into power?
You're proposing that we secure our private conversations with TSA luggage locks.
> You're proposing that we secure our private conversations with TSA luggage locks
No -- that's an incredibly reductive summary, and the attitude you display here is, if left unchecked, exactly what will eventually allow something as ridiculous as ChatControl to pass.
There has been plenty of previous debate when innovations like postal mail, telegraph traffic and phone calls were introduced. This debate has resulted in laws, jurisprudence, and corresponding operating procedures for law enforcement.
You may believe there are no legitimate reasons to intercept private communications, but the actual laws of the country you live in right now say otherwise, I guarantee you. You may not like that, and/or not believe in the rule of law anymore anyway, but I can't help you with that.
What I can hopefully convince you of, is that there needs to be some way to bring modern technology in line with existing laws, while avoiding "9/11"-style breakdowns of civil rights.
We can draw an analogy between any two things. An encrypted chat is not a letter in the mail or a call on the telephone. It is an entirely new thing. Backdooring such chats is not "bringing technology in line with existing laws"; it is, very clearly, passing new laws and creating new invasions of privacy. It must be justified anew.

The justification for wiretapping was not that there was no way to fight crime without it. Otherwise, when criminals became wise to it and began to hold their conversations offline, there would have been a new law requiring that all rooms be fitted with microphones the police could tap into as necessary. No such law was passed. Instead, the justification for wiretapping was simply that, once police had identified some transmission as relating to the commission of a crime, they could obtain a warrant and then tap into the communication. The fact that the physical capacity already existed, without any effort required of uninvolved individuals, was the entire justification.

That capacity does not exist with encrypted chats. The analogy is therefore much closer to the "mandated microphones" described above: everyone is required to take action to reduce their own privacy, regardless of whether they are subject to a warrant.
What is most striking about our "mandated microphone" analogy is the utter futility of it. Criminals have no issue breaking the law, and hence have no issue outfitting a room with no microphones in which to carry out their dealings. The same is true of any law targeting encrypted chats.
For a real-world example of the problem you're describing, China's Salt Typhoon attacks compromised lawful intercept infrastructure in the USA to engage in espionage. A mandatory backdoor in Signal would be at risk from similar attacks.
The two sides in this debate seem to be talking at cross purposes, which is why it goes round and round.
A: "We need to do this, however it's done, it was possible before so it must be possible now"
B: "You can't do this because of the implementation details (i.e. you can't break encryption without breaking it for everyone)"
ad infinitum.
Regardless of my own views on this, it seems to me that A needs to make a concrete proposal.