If this gets pushed through, you will gradually lose control of your government much like how the people of the UK already lost control of theirs.
What are you going to do when the government's interests inevitably drift out of alignment with yours? Start a political movement? You will have the police knocking on your door for criticizing the establishment.
Start a revolution? You have no weapons. You can't even organize a resistance because all channels of communication are monitored.
You have neither the pen nor the sword. There is no longer an incentive for the government to serve you, and so it eventually won't.
No amount of protest will recover the freedom you once had. You're heading towards a society where everyone feels oppressed but no one can do anything about it.
>you will gradually lose control of your government much like how the people of the UK already lost control of theirs.
As a UK citizen, can you explain your reasoning here? We haven't implemented anything like the chat control proposal and while a few politicians have brought up similar ideas, there is a lot of pushback against it.
>You can't even organize a resistance because all channels of communication are monitored.
One of the awful things about this proposed legislation is that what I quoted you saying is not true. Software like PGP is easy to use, and criminals already use it. The government has absolutely no possibility of breaking RSA the way things are now, and as such scanning all messages will do nothing other than prove more definitively that criminals are still beyond their gavel. In reality, the only individuals who will get spied on are regular people who don't open their terminal just to send a text; exactly the people who should not be spied on in the first place.
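To give a sense of how low the bar is, here is a rough sketch in Python using the widely available cryptography package (purely illustrative; real PGP layers key management and signing on top of primitives like this):

    from cryptography.hazmat.primitives.asymmetric import rsa, padding
    from cryptography.hazmat.primitives import hashes

    # Generate a keypair locally; the private key never has to leave your machine.
    private_key = rsa.generate_private_key(public_exponent=65537, key_size=3072)
    public_key = private_key.public_key()

    oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                        algorithm=hashes.SHA256(), label=None)

    ciphertext = public_key.encrypt(b"meet at the usual place", oaep)
    # Only the holder of the private key can recover the plaintext;
    # anyone scanning the wire sees nothing but ciphertext.
    assert private_key.decrypt(ciphertext, oaep) == b"meet at the usual place"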
When the government realizes this invasive legislation is ineffective, they will probably crack down even harder. After all, what we are willing to accept from rulers has, by the looks of it, already increased dramatically. I wonder if at some point it becomes illegal simply to possess encryption software on your personal devices, perhaps even to possess prime numbers that could theoretically be used in modern encryption. How far will the government go to take this illegal math from you?
Criminalize encryption. Oh, you're using cryptography? Well then clearly you are a child molesting, money laundering, drug trafficking terrorist. No need to actually decrypt anything when cryptography is incriminating evidence unto itself.
Computers are subversive. Cryptography alone can defeat police, judges, governments and militaries, and computers have democratized access to cryptography to the point even common citizens have it. They cannot tolerate it.
It's a politico-technological arms race. They make their silly laws. We make technologies that completely nullify those laws. They need to increase their overall tyranny just to maintain the exact same level of control they had before. The end result is either an uncontrollable, ungovernable, unpoliceable population, or a totalitarian state that surveils, monitors and controls everything. There is no middle ground.
We are rapidly advancing towards this totalitarianism, and we are eventually going to find out if the people have what it takes to resist and become ungovernable.
One day we will need government signatures to run software on "our" computers. All the free software in the world won't help if we can't run it. The only way to resist that is to somehow develop the means to fabricate our own chips at home.
If all of your messages can be read in plaintext, you're going to have to transfer your keys some other way, and it will be very detectable that you are sending encrypted messages, which will be next on the chopping block.
Both Apple and Android are teeing their infra up to support deleting apps they don’t like. Windows is moving towards e2e attestation, and Mac is basically already there. Once that’s all done, you just need to force hardware manufacturers to boot only into ‘trusted’ operating systems. No more Linux. No more unsigned execution. No more encryption.
It's already this dystopic: any medium where people can talk freely eventually gets controlled by corrupt politicians and the like.
Anyways, the control of speech isn't only in surveillance, it's ingrained deeply in culture, taboos, education, etc.
I have talked to religious people before, and they all exhibited a certain characteristic: you could talk about some things, but you couldn't touch on others. Their minds won't accept it, so they bug out and start saying nonsense.
I've noticed the same thing with most people when it comes to certain subjects. You'd be talking to an educated person with a relatively high IQ and a mind capable of thinking critically in certain domains, yet once you point out something their mind has been trained to deem anti-cultural (like, for example, who controls what), they turn into Agent Smith and stop listening to reason.
Anyways, this is HN, so what I'm saying is that the control of the controllers is already absolute. It's been linearly increasing for years: they cause something, then tighten their control of us for "our safety", until one day we've had enough and some take out the guillotines and others the bald eagle, etc. It's been happening for millennia; if we as a species were able to rebel against authority before s*it absolutely hit the fan, history would've been a lot cleaner.
LOL. People nowadays don't start revolutions, and it's not because of weapons or the lack thereof. It's because they're thoroughly entertained and fed; even the entire political circus is a sort of morbid reality show: people tune in to the news to shake their heads in disgust at today's latest antics, and will do so again tomorrow, because it's all panem et circenses for grown-ups.
The Internet has become the greatest instrument of mass control ever created in the history of the world. It's done. As long as people have their DoorDash and Netflix, and are too busy working or scrolling to think deep thoughts and read anarchist philosophy, the kings have nothing to fear.
Also, no need to single out the EU. The entire government-as-reality-TV is well and truly an American creation, and your three-letter agencies don't even have to pass any laws to collect information about their citizens. We're all in the same shit, my brother/sister.
That's literally how we got here. People got a taste of unmitigated unprecedented freedom online for the last three decades, and found it so gross that they allowed things to swing the other way.
Even one decade ago, the threat of SOPA/PIPA rallied the internet successfully. Just over a decade later, we're at the point of allowing age verification, for morality's sake, with hardly a peep. The cypherpunks are losing, hard, and honestly, deserve failure for how well their utopia turned out.
What exactly did those people taste that upset them so much, and who exactly are those "people"? Last time I checked, these laws are pushed through as covertly and sneakily as possible, and no "people" asked for them. I can't recall any demonstrations with protesters asking to have their privacy violated to keep them safe from those evil internet trolls that want to have sexual intercourse with their relatives.
You're trying to frame the classic authoritarian power grab and desire to fully control the plebs as a push from society. This doesn't sound convincing.
> You're trying to frame the classic authoritarian power grab
Half of US states now have age verification for pornography; three will be requiring age verification to even download apps soon. There is indeed a push from society to get the internet under control, even if the EU is not necessarily connected the same way.
This is a huge, unprecedented reversal of opinion over the last decade that has almost completely gone over HN's head. The EFF, TechDirt, HN, Reddit view of the world has been tried, found wanting, and is being rejected. The EFF which once rallied the internet against SOPA/PIPA... currently is yelling into a void. Nobody believes in a free internet anymore.
Civil liberties, like elections and liberal principles in general, are unfortunately only popular when the right side (coincidentally one's own) is winning
You keep saying things that are completely unsubstantiated as though they were fact. "Nobody believes..." _all people_ this, _complete failure_ that...
You're either a shill, an ideologue or arguing dishonestly.
Don't worry; HN makes such statements all the time, you can't accuse me of not grasping the format. On that note, not once did I use the words "complete failure" or "all people" despite your quotation in this thread, so please don't argue dishonestly yourself.
I cited a reality: We went from SOPA/PIPA over copyright, to no question about age verification on morality grounds. It shows a trend towards zero interest in free and open internet activism. Such a trend indicates something is severely wrong, and the idea of an open internet has become disconnected from popular belief, internationally, as something to strive for. Prove me wrong.
How we literally "got here" was Section 230. You can easily stifle free speech by holding Facebook and X accountable for every single post ever made on their platforms. But that would capsize the American investment economy, so we have to protect them just a little bit. It creates a perverse, bipartisan incentive to export the most reprehensible opinions that still qualify as legal.
European citizens (and soon, American ones too) are discovering that they never held the cards. When you ask your OEMs, cloud providers and DNS resolver whose side they're really on, it's not yours. You, the customer, hold no guillotine over their head.
> Start a revolution? You have no weapons. You can't even organize a resistance because all channels of communication are monitored.
Unlike which country? The US I presume? I see very much a lack of any revolutions in the US, and the most resistance done in the past few decades was done by people with no weapons.
I'd say most revolution-like movements of any kind in the US since the Civil War happened without weapons.
Even further, those who have traditionally been most vocal about second amendment rights are currently the biggest cheerleaders for the current authoritarian trend. Quite the plot twist.
Please stop funding, allying with and protecting the manufacturers of surveillance tools. Stop exporting Palantir products and importing privacy-destroying devices from businesses like Grayshift and Cellebrite. Insist that the US government stop shielding hackers-for-hire like NSO Group who indiscriminately lease their products for discriminatory and illegal purposes. Stop defending "OEM" control that we have all known is a stand-in for federal steering since the Snowden leaks. Stop marketing E2EE while backdooring server and client hardware for "emergency" purposes.
Do that, and you'll never be accused of hypocrisy again. Signed, a US citizen.
Which of course never happened, as each member country retains full sovereignty in every possible way you can think of, which is actually fully enshrined in the way the EU works.
> Which of course never happened, as each member country retains full sovereignty in every possible way you can think of, which is actually fully enshrined in the way the EU works.
Which of course is false.
> The principle was derived from an interpretation of the European Court of Justice, which ruled that European law has priority over any contravening national law, including the constitution of a member state itself.
And if you read literally one paragraph from the bit that you quoted:
"The majority of national courts have generally recognized and accepted this principle, except for the part where European law outranks a member state's constitution. As a result, national constitutional courts have also reserved the right to review the conformity of EU law with national constitutional law"
And guess why and how they are able to do that - that's right, by retaining full sovereignty over their own justice systems. Even obeying rulings of the ECHR is more a matter of courtesy than anything, as neither the EU nor the ECHR has any enforcement mechanism beyond withholding funding, as many EU member states have proven time and time and time again.
From the article, the current flavor of "threat" this is being positioned to fight is CSAM.
Does anyone believe that predators commit those heinous offenses because of the availability of encrypted channels to distribute those products of their crimes? I sure don't. The materials exist because of predators' access to children, which these surveillance measures won't solve.
Best case scenario (and this is wildly optimistic) the offenders won't be able to find any 'safe' channels to distribute their materials to each other. The authorities really think every predator will just give up and stop abusing just because of that? What a joke.
More likely of course, those criminals will just use decentralized tools that can't be suppressed or monitored, even as simple as plain old GPG and email. Therefore nothing of value will be gained from removing all privacy from all communication.
This has nothing to do with CSAM, and that framing is deliberate: it distracts people and lets the politicians say “xp84 supports child pornography!”
It has everything to do with censorship and complete control over people’s ability to communicate. Politicians hate free speech and they want to control their citizens completely including their thoughts. This is true evil.
But politicians are - in general - neither evil, nor do they have any real incentive to ”control citizens’ thoughts”. It doesn’t make sense. They can be gullible. Non-Technical. Owned by lobbyists. Under pressure to deliver on the apparent problem of the day (csam, terror, whatever). But I don’t think there is a general crusade against privacy. That’s why I think it’s so infuriating: I’m sure it’s not even deliberately dismantling privacy. They’re doing it blindly.
This is pushed by parties that have a good track record of preserving integrity. That’s why it’s so surprising.
If they are "just doing their job" why are they asking for an exemption that would apply only to them? No, they firmly believe that safety should be gained at the cost of privacy.
> But politicians are - in general - neither evil, nor do they have any real incentive to ”control citizens’ thoughts”.
As someone coming from an authoritarian state, this is such an alien line of reasoning to me. By definition, those in power want more power. The more control over the people you have, the more power you get. Ergo, you always want more control.
It's easy to overlook this if you've spent your entire life in a democratic country, as democracies have power dynamics that obscure this goal, making it less of a priority for politicians. For instance, attempting to seize too much power can backfire, giving political opponents leverage against you. However, the closer a system drifts toward autocracy and the fewer constraints on power there are, the more achievable this goal becomes and the more likely politicians are to pursue it.
Oh, and also politics selects for psychopaths who are known for their desire for control.
> I’m sure it’s not even deliberately dismantling privacy.
But it is not even dismantling privacy. ChatControl would run client-side and only report what's deemed illegal. Almost all communications are legal, and almost all of the legal communications wouldn't be reported to anyone at all. They would stay private.
The problem I see is that the "client-side scanner" has to be opaque to some extent: it's fundamentally impossible to have an open source list of illegal material without sharing the illegal material itself. Meaning that whoever controls that list can abuse it. E.g. by making the scanner report political opponents.
This is a real risk, and the reason I am against ChatControl.
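A toy sketch of that structural problem (Python; the names and the use of exact SHA-256 digests are purely illustrative, since real proposals lean on perceptual hashing and/or classifiers):

    import hashlib

    # The authority ships only opaque digests. Users cannot tell whether an entry
    # corresponds to CSAM or to, say, a protest flyer; whoever controls this list
    # controls what gets reported.
    BLOCKLIST = {
        "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
    }

    def scan_before_encryption(attachment: bytes) -> bool:
        """Runs on the user's device, before end-to-end encryption is applied."""
        digest = hashlib.sha256(attachment).hexdigest()
        return digest in BLOCKLIST  # a match would be reported upstream

    print(scan_before_encryption(b"test"))  # True: "test" hashes to the listed digest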
> Does anyone believe that predators commit those heinous offenses because of the availability of encrypted channels to distribute those products of their crimes?
Who says that? I don't think they say that.
> The authorities really think every predator will just give up and stop abusing just because of that?
Nope, they think they will be able to arrest more predators.
> More likely of course, those criminals will just use [...]
You'd be surprised how many criminals are technically illiterate and just use whatever is the default.
The thing that is crazy to me is that they choose to go after Signal of all things. Certainly there would be higher priority targets than a messaging app that has no social networking features to speak of, if child predators were really the target here.
This is nonsense. Anyone who has the smallest clue would use Signal for anything sensitive. Of course people would use Signal to talk about illegal stuff.
I am against ChatControl. But I am amazed by all the bullshit arguments that people find to criticise ChatControl.
If you have more control, obviously it's easier to track criminals. That's not the question at all. The question is: what is the cost to society? A few decades ago, all communications were unencrypted and people were fine. Why would it be different now? That's the question you need to answer.
> A few decades ago, all communications were unencrypted and people were fine.
A few decades ago, the user base of online communication was about 99% smaller than it is now. And governments were so technically illiterate that, with the tech they had, they could not even read those unencrypted messages at scale.
So ChatControl means that e.g. Signal would be obligated to automatically scan pictures and messages sent for CSAM. This is beyond encryption. And if they were to actually do that, it would be nonsensical for people spreading this material to use it, as they would immediately be caught, so they would just use other tools.
But people are talking about both - the ridiculousness of the premise that this would help combat this and additionally of course the cost of privacy.
It's beyond encryption. Teenagers sending each other pictures could get flagged by AI, etc. Any of your messages and images has the potential to be falsely flagged.
So what? If predators cannot talk to children over SnapChat, that's a win, wouldn't you say?
The only valid argument I see against ChatControl is that fundamentally, you cannot know what it is reporting. It's not like if there would be an open source list of illegal material together with the hashes, right?
If you cannot audit what is being reported (with whatever means necessary to make sure it is doing what it should be doing), then whoever controls it could abuse it.
That's the problem. That's the reason not to implement it. But it's completely overwhelmed by the flood of invalid arguments.
> The only valid argument I see against ChatControl is that fundamentally, you cannot know what it is reporting. It's not like if there would be an open source list of illegal material together with the hashes, right?
By definition, they must state what is actually illegal, lest there be hidden laws with hidden punishments.
And those lists of 'illegal' need to be publicly disclosed, so we are aware.
At least in the USA a naked picture of someone who is 17y364d old is 'child porn', but that extra day makes it 'barely legal'. But yet, most USA jurisdictions say that 16y can have sex. Just that pictures are EVIL even if you take them yourself.
Again however, I tend to agree more with Stallman that CSAM or child porn picture possession should either be legal or have a mens rea requirement attached, not strict liability for possession. It's proof of a crime, and shouldn't in and of itself be a crime.
But because a picture is a crime, we get these horrific laws.
It was unencrypted and “it was fine“ because it was technically nearly impossible to store and process all communications. Now, one small server cluster can analyse all communication channels in a country in real time. The only thing stopping it is the encryption.
All communications were unencrypted because encrypting them would have incurred unduly burdensome processing. Nowadays computers can encrypt and decrypt on the fly for virtually free.
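For a sense of scale, a rough back-of-the-envelope sketch in Python with the cryptography package (numbers depend entirely on hardware; this is not a rigorous benchmark):

    import os, time
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    key = AESGCM.generate_key(bit_length=256)
    aesgcm = AESGCM(key)
    nonce = os.urandom(12)
    data = os.urandom(100 * 1024 * 1024)  # 100 MiB of "messages"

    start = time.perf_counter()
    ciphertext = aesgcm.encrypt(nonce, data, None)
    print(f"Encrypted 100 MiB in {time.perf_counter() - start:.2f}s")
    # On anything with AES-NI this typically finishes in a fraction of a second.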
People using online communication systems were a niche, not the norm, and most people didn't have the tools and knowledge to access someone else's digital communication.
You're all assuming that predators who are already deliberately using encrypted apps to share CSAM won't just move to something else where there is encryption – which will always be possible unless the EU finds a way to ban maths or reverts back to the pre-digital age.
This might catch the odd moron sharing stuff on Facebook or on their phone, but I doubt it will stop the average offender who is already going out of their way to use encrypted apps/services.
But okay, great, at least you catch the morons I guess, but at what cost? Here in the UK it's pretty common to be arrested for tweets as it is. There's no doubt in my mind this will be used to catch individuals committing speech crimes who are currently getting away with it because they share their opinions behind closed doors.
I strongly believe it will catch the average offender. The average human doesn't have a clue about cryptography.
It won't catch all of them, of course. My point is that it is invalid to say that it won't catch anyone.
> but at what cost?
EXACTLY! The problem is that whoever controls the list of illegal material can abuse it. We fundamentally cannot audit the list because the material on this list is highly illegal. There is a risk of abuse.
Most victims of child abuse know their aggressor because they are part of their social circle: dad, mother, uncle, brother, sports coach, or a friend of the parents/siblings.
Absolutely, evidence of abuse is secondary to the actual abuse.
Plus, the fact that you could have AI/LLMs/etc. generate nefarious content that is hard to tell is fake tells you the abuse isn't even what they are interested in.
No, you don’t get it. Hosting or possessing CSAM has criminal penalties even if no children were involved. For example AI generated imagery.
In fact, even if zero children are ever trafficked or abused going forward, and pedophiles only use old photos of children from 30 years ago, merely having these images is still an issue.
Conversely, the vast majority of sexual abuse of minors doesn’t involve images and goes unreported. "Considerable evidence exists to show that at least 20% of American women and 5% to 10% of American men experienced some form of sexual abuse as children" (Finkelhor, 1994). "Most sexual abuse is committed by men (90%) and by persons known to the child (70% to 90%), with family members constituting one-third to one-half of the perpetrators against girls and 10% to 20% of the perpetrators against boys" (Finkelhor, 1994).
In short - if they wanted to reduce child abuse, scanning everyone’s communications for CSAM would not be the most straightforward way to go about it.
"In short - if they wanted to reduce child abuse, scanning everyone’s communications for CSAM would not be the most straightforward way to go about it."
What would be the most straightforward way? Install a camera in every home?
Yes, abuse is usually more to be found inside families. And the solution is kind of complicated, involving social workers, phone numbers victims can call, safe houses for mothers with children to flee into, police officers with sensitive training who care, teachers who are not too burned out to actually pay attention to troubled kids ...
> if they wanted to reduce child abuse, scanning everyone’s communications for CSAM would not be the most straightforward way to go about it.
* First, this is not what politicians do. What they want is to look like they are fighting it.
* Second, what is your more straightforward way to fight CSAM? Asking for a backdoor is pretty straightforward, I find. I would rather say that fighting CSAM is more difficult than that.
This is completely untrue! Important communications have been enciphered since language was created, I'd wager, whether that cipher is specific terms ("grog" means attack that person in 10 seconds!) or a book cipher, i.e. the first letter of one bible verse, then the second letter of the next verse, etc. Humans have been encrypting communication since communication was possible.
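For the curious, a toy book cipher along those lines (Python; purely illustrative, with a tiny shared text and no attempt at real security):

    # Both parties share a text; each plaintext letter is sent as the index of a
    # word in that text beginning with the same letter. Without the shared text,
    # the numbers are meaningless to an observer.
    SHARED_TEXT = "in the beginning god created the heaven and the earth".split()

    def encode(message: str) -> list[int]:
        # Picks the first matching word each time; real users would vary the choice.
        return [next(i for i, w in enumerate(SHARED_TEXT) if w[0] == ch)
                for ch in message.lower()]

    def decode(indices: list[int]) -> str:
        return "".join(SHARED_TEXT[i][0] for i in indices)

    codes = encode("tea")   # -> [1, 9, 7]
    print(decode(codes))    # -> "tea"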
It has only recently become possible to dragnet communications en masse, then store and analyze them. The past decades have brought new threats to privacy and democracy through breaking encryption at the state scale.
At least where I'm from, there are pretty strong laws against reading the snail mail of others. To this day, any law enforcement that tries to open people's snail mail will be laughed out of the courtroom, and quite possibly out of their jobs too!
Today nobody uses snail mail. This proposal is the equivalent of proposing to read everyone's private letters back in the day.
Well, what is "the problem"? Is it children being abused, or is it the distribution of CSAM?
And if you say both - how would you rate the relative severity of the two problems? Specifically, if you had to pick between preventing the rape of a child, and preventing N acts of CSAM distribution, how big would N have to be to make it worth choosing the latter?
I don't think they care what N is, they are just scapegoating a vile group they know will have no defenders, and they can use it to silence the critics by associating them with that group.
What's worse for you? Being raped as a child. Or, having people sexually gratify themselves looking at images of you being abused; using those images to groom other children, or to trade and encourage the rape of other children?
You might as well ask someone which eye they prefer to have gouged out with a blunt screw.
Let's do both: try to stop child sexual abuse and try to stop images of abused children being used by abusers.
Ask anyone you know who has been sexually assaulted or raped what they think of the idea of pictures or recordings of that being both kept by the perpetrator and widely disseminated. I think you'll find very few who'd say that's totally fine. But given that there can be no CSAM without child abuse, the direct physical abuse is clearly the primary problem.
> You're seriously suggesting that any policy that doesn't 100% eliminate a problem is a joke?
I think a more charitable reading is that any policy that doesn't 100% _target_ a problem is a joke. This policy doesn't have a plausible way that it will protect children from being victimized, so I think it's reasonable to remove the "think of the children" cloak it's wearing and assess it on the merits of whether encryption is beneficial for the social discourse of a society.
> This policy doesn't have a plausible way that it will protect children from being victimized
Of course it does. "It will detect and report messages from predators to children, therefore preventing the child from getting to the point where they send revealing pictures or meet the predator in person". Done.
Well, maybe the word "plausible" is doing too much work in my statement.
Most abuse happens from people known to the child, and of that portion, most are family members. It seems like there is sufficient opportunity in in-person comms to route around this limitation.
Moreover, even the communications that do happen online can still easily happen through encrypted media; presumably the perpetrators will simply move to other ways of communicating. And kids, at least kids over 10 or so, don't seem like a demographic particularly likely to follow this law anyhow.
There's another nuance worth considering: by and large, parents _want_ their kids to have access to encrypted communications. I'll happily assist my kiddo in maintaining good opsec - that's much more important to me than some silly and uninformed policy decision being made far away by people I've never met.
So, the kids are still going to be where the encrypted comms are. I still think it's reasonable to say that the protections offered to kids by criminalizing encryption are implausible.
> Most abuse happens from people known to the child
Sure, but it means that at least some abuse happens from people unknown to the child. If ChatControl doesn't cause any problems but helps prevent those abuses, then it's worth it. The question is: what are the problems caused by ChatControl?
Saying "only a minority of children get abused this way, so it's not worth it" won't go far, IMO. It's not a valid argument against ChatControl in itself.
> presumably the perpetrators will simply move to other ways of communicating.
The perpetrators have to contact kids over apps that the kids use. Like Snapchat or TikTok. It's not like the kids will routinely install a weird app to talk to weird people...
> parents _want_ their kids to have access to encrypted communications.
But ChatControl doesn't remove the encryption! It scans everything locally, before it gets encrypted and sent.
> by criminalizing encryption
It's not criminalizing encryption: it's forcing a local scan on your device. Just like there are already scans happening on clouds for non-E2EE data.
Don't get me wrong: I am against ChatControl. For me the problem is that I see a potential for abuse with the "list" (whether it's a list or a sum of weights) of illegal material. This list cannot be made public (because it's highly illegal material), so it's hard to audit. So whoever has control over it can abuse it, e.g. to find political opponents. That's my problem with ChatControl.
> Best case scenario (and this is wildly optimistic) the offenders won't be able to find any 'safe' channels to distribute their materials to each other.
The theory is based on the documented fact that most crime is poorly thought through with terrible operational security. 41% is straight up opportunistic, spur of the moment, zero planning.
It won't stop technologically savvy predators who plan things carefully; but statistically that is probably only a few percent of predators; so yes, it's probably pretty darn effective. There is no shortage of laws that are less effective that you probably don't want repealed - like how 40% of murderers and 75% of rapists get away with it. Sleep well tonight.
Exactly. Econ 101: why do consumption taxes work at all? By increasing the amount of pain associated with purchasing a particular indulgent product, you decrease the consumption of that product on the margin. When you increase the price of cigarettes by 20%, cigarette smoking in a society decreases. But for the most addicted, no consumption tax will probably act as a deterrent.
Some individuals will find a way to distribute and consume child pornography no matter the cost. But other individuals, the ones consuming or distributing on the margin, will stop if doing so becomes laborious enough. I.e., imagine the individual who doesn't want to be consuming it, who knows they shouldn't—this type of deterrent may be the breaking point that gets them to stop altogether. And if you reduce the amount of consumption or production by any measure, you decrease a hell of a lot of suffering.
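The arithmetic, with illustrative assumed numbers:

    # Back-of-the-envelope, assuming a price elasticity of demand of -0.4
    # (purely illustrative; the real figure varies by product and population).
    elasticity = -0.4
    price_increase = 0.20                          # +20% "price" (pain/friction)
    change = elasticity * price_increase
    print(f"consumption changes by {change:.0%}")  # -8%: real, but not zero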
But anyway, the goal of this legislation is not to drive the level of distribution to 0. The goal of policymakers could be seen charitably as an attempt to curtail consumption, because any reduction in consumption is a good thing.
Let's say you're actually texting in a group. Even if you use perfect operational security, odds are terrible that all members of your group will perfectly uphold the same level of security every time they share their content.
One is going to slip up. He's going to get arrested. And he's going to turn the whole group in to reduce his sentence. Everyone else meanwhile has their operational security become proof of intent, proof of deliberation, proof of trying to evade authorities. They thought they were clever with the encrypted ZIP files, but the judge and jury are going to be merciless. I don't think most authorities have a problem with that.
I think the challenge for society here is not to simply reject attempts like this, but how to prevent them from being pushed over and over until a specific context allows it to be approved.
The accepted solution is to have a constitution that says otherwise.
Which is a bit complicated here, as the EU has no real constitution and this 'law' (really a regulation) is a blatant violation of the constitutions of countries that did choose to establish secrecy of correspondence.
> The accepted solution is to have a constitution that says otherwise
And the willingness and ability to enforce it. The current iteration of ChatControl is pushed by Denmark, which is at present the President of the Council of the European Union. The Danish Constitution itself enshrines the right to privacy of communication [0], but this is not stopping Denmark from wanting to ratify ChatControl anyway.
In the Netherlands we have the “Eerste Kamer” (First Chamber, also called the Senate), which is responsible for verifying that proposed laws are in accordance with our “constitution”. They are elected out of band with the normal government, which should ensure that no single party is able to steamroll laws through both chambers.
In theory the "Bundespräsident" in Germany is supposed to only ratify laws that are in accordance with the constitution, but I don't think it happens that he refuses to do this.
> but this is not stopping Denmark from wanting to ratify ChatControl anyway.
What the TLDR of the motivation behind this? Is it just politicians playing to their base (think of the children) or corporate lobbying. or religion, etc?
Seems to me that the negatives of passing something like this are super obvious and dystopian.
I suspect it's a mix of many Danish politicians' own authoritarian tendencies/ambitions and corporate lobbying, though I have no proof of the latter when it comes to ChatControl specifically.
Generally speaking, the Danish government also tends to behave in authoritarian ways. E.g., Denmark has wilfully violated EU regulations on data retention for many, many years. In 2021, a Danish court ruled that the Danish Ministry of Justice could continue its mass surveillance practices even though they were (and still are) illegal under EU law: https://www.information.dk/indland/2021/06/justitsministerie...
Currently Denmark is also trying to leverage its position as the President of the Council of the EU to legalise, on a EU-wide level, the form of data retention that Denmark has been illegally practising: https://ec.europa.eu/info/law/better-regulation/have-your-sa...
Interesting. I am no expert on the politics of Denmark, so my question is: is this push universal across political parties, or is it a feature of a specific political bloc that has ruled for the past X years and consistently worked in this direction?
Generalized, this looks to me like a question about why humans sometimes get hell-bent about some idea and become blind to the side effects and ignorant when it comes to risk management.
Sometimes it could be malice or personal gains. Sometimes, I think, it could be just a strong bias towards some idea that causes a mental blindness. Such blindness can happen to anyone, at any level of power (or lack thereof), politicians are not unique in this - the only difference is the scope of impact due to the power they have. And we aren't particularly filtering them against such behavior - on the contrary, I feel that many people want politicians to have an agenda and even cheer when they put their agenda above the actual reality, any consequences be damned.
Both the right to privacy and the right to protection of personal data appear to have pretty big exemptions for government.
The right to private communications was modified by the ECHR to give an exemption for prevention of crime/protection of morals/etc.[1] and the right to protection of personal data exempts any legitimate basis laid down by law[2].
I imagine they'd be able to figure out some form of Chat Control that passed legal muster. Perhaps a reduced version of Chat Control, say, demanding secret key escrow, but only demanding data access/scans of those suspected of a crime rather than everyone.
Legal rulings also seem to indicate that general scanning could be permitted if there was a serious threat to national security, so once a system to allow breaking encryption and scanning is in place, then it could be extended to what they want with the right excuse.
> I imagine they'd be able to figure out some form of Chat Control that passed legal muster. Perhaps a reduced version of Chat Control, say, demanding secret key escrow, but only demanding data access/scans of those suspected of a crime rather than everyone.
Isn't that pretty much exactly how it is done in Russia, which the ECHR ruled to be illegal[0]?
I'm not familiar with EU law, but reading Title II article 7 and 8 makes me feel this could be an optimistic interpretation of what the Treaty of Lisbon guarantees. I'm sure the supporters of chat control would love to argue something like "ChatControl respects the private communications of an individual by protecting how the data is processed to ensure only the legitimate basis of processing the data is incurred by the law" in court.
I would hope the EU courts would disagree, but I'm not sure if anyone can say until it's tested directly.
Even the EU council's legal service thinks the law as-proposed is probably incompatible with Article 7 and 8:
> The CLS concludes that, in the light of the case law of the Court of Justice at this stage, the regime of the detection order, as currently provided for by the proposed Regulation with regard to interpersonal communications, constitutes a particularly serious limitation to the rights to privacy and personal data protection enshrined in Article 7 and 8 of the Charter.
I think there are variants of the ChatControl proposal which were clearly problematic, but the different iterations of the proposal have tried to toe the line since. This report speaks to the 2022-era proposal.
I think of a constitution as a contract between the citizens, the state and the (judiciary?).
Like, a constitution both defines the rights of citizens and the limits of those rights, and the same goes for the state.
I feel as if the creators of constitutions think of it as a set of checks and balances...
Just as the state can punish a citizen who violates something written in the constitution,
in the same manner, I believe the idea was that if the state violates some constitutional right of a citizen, then citizens can point that out and (punish?) the state, as the legitimacy of the state comes through that constitution which it might be breaking...
I concur (fancy word for believe which I wanted to share lol); you are talking about America. The thing is, revolutions are often messy, and so many things are happening in America that I think people are just overwhelmed and have even forgotten all the stuff that happened in the past... Like, tariffs were a huge thing, then the Epstein news, then I think this autism thing by Trump.
Like, less and less political discourse is happening and idk, oh shit, just remembered the uh person-deporting thing, which was illegal and was done anyway.
If these things happened in isolation, they would each have prompted huge actions against the govt., but they are happening back to back and so everyone's just kinda silent I think, frankly I believe overwhelmed.
I believe that just as in Nepal, in America everyone is whining on social media but nobody's taking action. Nepal blocked social media, and so people in Nepal were kinda forced to take action irl, and it worked out kinda nicely in the end tbh.
So maybe it's social media which is enabling this thing.... which is funny to me as I am doing the same thing right now lol
A large portion of the population either does not believe or does not mind the violations of our constitution to achieve their desired outcomes. As an American, it came as a surprise to me that we do not, in fact, have broadly shared values about our system of governance. This year has been a devastating blow to my confidence in our democracy and the ability of people to govern themselves generally.
> This year has been a devastating blow to my confidence in our democracy and the ability of people to govern themselves generally.
The latter has been on my mind for quite some time.
The logical conclusion of "people can't govern themselves generally" kind of gestures at religion as a solution - after all, if people cannot govern themselves, why not rely on a higher power to manage them?
Of course, the problem with that point of view is that from the atheistic perspective, there is no higher power, and from the agnostic perspective, whatever higher power there is is inscrutable and beyond our ken.
This then leads me to the conclusion that religion is ultimately a creation of men, and is thus prone to the same power-corrupting vices as any other institution created by men.
Except that leaves no real solution to the problem of the governance of people. And it's a quandary I see no realistic chance of escape from.
> As an American, it came as a surprise to me that we do not, in fact, have broadly shared values about our system of governance.
It shouldn't, America is two very distinct nations. The shape and nature of those nations vary wildly in classical Baudrillardian sidewinding progression, but it's rooted in the very early history of British North America. Two distinct primogenitor colonies and societies, Jamestown and Plymouth. Founded for different reasons, in different contexts, by different people. Understanding the disparity is key to understanding a great deal about America. This divide has always persisted. Jefferson was of Tidewater, Hamilton was of Yankeedom. Democrats vs Whigs. Dixie vs Yankeedom. This split persists in history, and is much the reason why America is ostensibly a two party system. Even if the regional divide is not as hard and fast as it once was, even if the matters in which they differ change radically over time, the divide itself will always persist. It's wrapped up in the pre-revolutionary context the country was founded on. America will always be two countries in a trenchcoat, two echoes of wildly different cultures set against each other for dominance. You should always be keen to remember that. The union isn't of 13 distinct colonies, but two distinct cultures always in tension. It's a fundamental structure within our larger cultural blueprint.
Of course I understood there were vast cultural and political differences causing tension. I just also believed that we had a shared system of fundamental values enshrined in the constitution and when push came to shove, we would all rally behind it. That's what I thought American patriotism meant; I genuinely thought I could count on Red voters to rabidly defend the constitution.
You are most definitely not right. The EU Charter of Fundamental Rights is an agreement that is legally binding. The institutions that are supposed to uphold the charter are the CJEU, the European Commission, the FRA, and NHRIs.
The people who wrote this proposal said it themselves - "Whilst different in nature and generally speaking less intrusive, the newly created power to issue removal orders in respect of known child sexual abuse material certainly also affects fundamental rights, most notably those of the users concerned relating to freedom of expression and information."
This proposal is illegal. The fact that the CJEU at least hasn't issued a statement that this is illegal tells you everything you need to know about the EU and its democracy.
Plenty of EU states already have a constitution in which this proposal would be de facto unconstitutional.
The issue is what the European Commission is willing to do in order to guarantee that the fat contract check goes to Palantir or Thorn or whoever has the best quid pro quo of the day.
This is not the Stasi; this is tech billionaires playing kings and buying the EC and Europol for pennies on the dollar, and with them the privacy of virtually every citizen who is of zero interest to law enforcement or agencies.
fwiw, amending the US constitution generally requires a 2/3 majority in both houses of congress to propose the amendment, and then further ratification by 3/4 of the states makes the amendment law. it's a fairly long process, and amendments sometimes get bogged down and die in the 2nd phase.
(there is another process which calls for a convention, but such a convention would have broad powers to change many things and so far the "two sides" (US rules tilt toward two parties rather than more) have been too scared of what might happen to do that)
> The accepted solution is to have a constitution that says otherwise.
Constitutions don't enforce themselves. The US constitution has a crystal clear right to bear arms, but multiple jurisdictions ignore it, along with multiple Supreme Court rulings, and make firearm ownership functionally impossible anyway. Free speech protections have, thankfully, been more robust.
The only thing that stops bad things happening is a critical mass of people who believe in the values the constitution memorializes and who have enough veto power to stop attempts to erode these values.
The US has such a critical mass, the gun debate notwithstanding. Does the EU have enough people who still believe in freedom?
How so? My point is that US constitutional protections on firearm ownership have undeniably eroded. The presence of text on the page did not prevent this erosion. I'm using gun rights as an example of a situation in which text granting a right becomes irrelevant if people stop believing in the values behind the text.
People do believe in freedom of speech in the US, thankfully, even if they've stopped defending gun rights in some places.
EU free speech protections are in the same position gun rights are in the US, and for surprisingly similar reasons.
This simply isn't true. If anything, constitutional protections have dramatically expanded since the amendment was passed.
This is because until the 14th Amendment and the incorporation doctrine, the Bill of Rights only restricted the Federal government, not the States. Prior to that, state and local governments could (and did) restrict not just firearms, but other rights as well.
Hell, the Bill of Rights still hasn't been fully incorporated, so for instance, despite the 7th Amendment stating otherwise, you don't have the right to a jury trial in civil cases in every state nor the right to indictment by grand jury (5th Amendment).
Of course, some states copied parts of the constitution into their own and had some form of protection, but it was by no means universal. Massachusetts even had a state church until 1833.
when you are talking to a european audience, they tend to be in favor of gun control so they don't care about erosion of those rights (like the people in the US who also favor eroding them, wording of the rules be damned)
HN is to a large extent a popularity contest, and people here are more in favor of free speech than guns. the US record on protecting free speech is very good.
> you are talking to a european audience, they tend to be in favor of gun control so they don't care about erosion of those rights
You have accidentally properly identified the european problem and precisely the reason that chat control will pass: shortsightedness. If people only rise up to protect rights "they need", soon no rights will be left.
Most of the erosion is done through court challenges.
Historically, courts have maintained that legislation is pursued under "good faith". This was the justification for not overturning the ACA on the grounds of it being an unconstitutional tax: the lawmakers didn't mean to make it an unapportioned tax, even though it effectively is, so it's okay y'all. Washington State just did this with income taxes on capital gains, in direct violation of their state constitution, a year or two ago.
Where I live, you cannot open carry. That is a direct violation of 2A, but the courts have said it's okay baby because it's not an undue burden to pay a fee and waste a day of your life. Pure nonsense. Just change the constitution for goodness sake.
> The US constitution has a crystal clear right to bear arms
It looks like it was drafted by an ESL speaker. It's by far the worst-drafted amendment, grammatically speaking:
> A well regulated Militia, being necessary to the security of a free State, the right of the people to keep and bear Arms, shall not be infringed.
It's not even a valid English sentence, and it certainly never bothers to define "Arms." Not to mention that, as written, it appears to make it illegal for me to tell you that you cannot come to my house with a gun, because that would be me infringing your right. It doesn't constrain Congress. It constrains anyone who wants to take away your right to bear arms.
Sheer lunacy as written. Ungrammatical and implies some insane shit.
But no, you're right, it's crystal clear. Much like how the First Amendment says
> Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press
which in crystal clear terms makes it legal to mass-distribute child pornography. To prohibit it would restrict the freedom of the press.
Ok, get a 2/3 majority of the House and Senate to approve a proposed edit removing those commas, and then get 3/4 of the state legislatures to approve it.
Until then, the commas are officially part of the text.
But the text of the Constitution only changes through amendments.
That said, the effective meaning of the Constitution is "whatever a majority of the Supreme Court agrees it is."
And to a degree, given the power to impeach Supreme Court justices, "whatever a majority of the Supreme Court agrees it is, and with Congress sufficiently on board to not impeach sufficient justices to force a shift in the balance of the Court."
But you chose to tell us about your interpretation. :-)
I had in mind less the current legal interpretation and more the meaning at the time of writing, as it would reveal itself in the current text, which is relevant to this argument.
KPGv2 pointed out the phrasing of the 2nd amendment is not clear.
GLdRH said "Remove the first and last comma and the sentence works splendidly".
I said that modifying the literal text requires going through the amendment process.
You said "Comma rules change over time."
I reiterated that the literal text does not change except through the amendment process, and also noted that fundamentally the literal words don't matter much as it's up to a majority of the Supreme Court how to interpret any of it.
You then brought up modern language usages of commas.
I replied that how you or I today interpret the text is irrelevant because only the Supreme Court's opinion matters.
At no point in this conversation have I expressed a specific interpretation of the text, so your indication that I chose to tell the discussion about my interpretation seems weird and maybe you're misreading usernames somewhere along the way.
Then when you want to discuss meaning issues arising from grammar rules, you need to use 18th-century grammar. I perceived GLdRH to be using 21st-century grammar to encode the same sentence. The literal text does not need to be modified, since it uses 18th-century grammar rules. Only when you want to parse it with 21st-century grammar rules do you need to preprocess it to adjust the grammar first. This preprocessing doesn't need to be written back, since the grammar rules of the text haven't changed. We are only circumventing the parser not supporting the text's grammar.
> only the Supreme Court's opinion matters.
This is purely about syntactic issues, not about semantics. The Supreme Court applies also semantics, such as the other legal system definitions of the time. I wasn't replying to that aspect.
I've commented this elsewhere, but rights in the US are generally much more absolute than here in Europe.
For example, in the EU you technically have the right to freedom of expression, but you can also be arrested if you say something that could offend someone.
Similarly rights to privacy are often ignored whenever a justification can be made that it's appropriate to do so.
I don't know about elsewhere in the world, but here in the UK you don't even have a right to remain silent, because the government added a loophole so that if you're arrested in a UK airport they can arbitrarily force you to answer their questions and provide passwords for any private devices. For this reason you often hear reports of people being randomly arrested in UK airports, and the government does this deliberately so they can violate your rights.
> For example, in the EU you technically have the right to freedom of expression, but you can also be arrested if you say something that could offend someone.
Verbal violence also has consequences: from triggering or reinforcing mental illness, to fear and isolation, to blackmail and being socially judged while innocent. Do you accept random beatings on the street whenever people feel like it?
It all depends who’s defining “hate”. The people you like who are in charge today won’t be there in 20 years, and if any kind of extremism leaks in to society, you could find yourself unable to advocate for your beliefs without getting arrested.
For example, the US government is trying to label any posthumous criticism of Charlie Kirk "Hate Speech". You can see how dangerous this could be when the hate-mongers get to decide what is considered hate speech.
Honestly, the current administration baffles me. There is so much activity that flies squarely against the constitution in a not at all subtle or clever way; just blatant, "I don't care."
It's one thing to be disruptive and enforce immigration law "by the books" but entirely separate to then go out of your way to not enforce it legally while at the same time violating or attempting to violate the constitution on pretty fundamental levels.
The only way I see to prevent the constant pushing is that every single time some council or committee presents something like this, every single one of their private communications gets leaked for everyone to peruse at their leisure, from WhatsApp to bank statements.
They want to erode people's privacy? Let them walk their talk first and see how that goes.
Tempting though that is, I think that's the wrong way to resolve it: The people proposing it (law people) are a different culture than us (computer people), and likely have a fundamental misunderstanding about the necessary consequences of what they're asking for.
Why would they exclude themselves from the rule if they weren't worried about it? It's not like there are no pedophiles in those positions. I wonder who they are going to offer the job of watching the photos of families with kids for this.
> Why would they exclude themselves from the rule if they weren't worried about it?
They don't even understand that they haven't. Sure, they've written the words to exclude themselves (e.g. UK's Investigatory Powers Act), but that's just not how computers work.
The people who write these laws, live in a world where a human can personally review if evidence was gathered unlawfully, and just throw out unlawful evidence.
A hacked computer can imitate a police officer a million times a second, the hacker controlling that computer can be untraceable, and they can do it for blackmail on 98% of literally everyone with any skeleton in the closet at the same time for less than any of these people earn in a week.
The people proposing these laws just haven't internalised that yet.
> how to prevent them from being pushed over and over until a specific context allows it to be approved.
We need more diverse mobile OSes that can be used as daily drivers. Right now, it's almost a mono-culture with the Apple-Google duopoly. Without this duopoly, centralization and totalitarian temptations would be less likely.
There's GrapheneOS, which is excellent and can be used without Google, but it relies on Google hardware and might be susceptible to viability issues if/when Google closes down AOSP. Nevertheless, they are working on their own device that will come with GrapheneOS pre-installed, which is exciting.
There's also SailfishOS, which has a regular GNU/Linux userland and is almost usable at this stage with native applications. As a stopgap, it can also run Android applications with an emulation layer, and plenty of banking ones work just fine.
I think "hacktivist" here means hacking into the politician's inboxes and leaking the contents, like "politicians want to do this to you; let's see how they like it when it's done to them" sort of thing.
>The only way I see to prevent the constant pushing is that every single time some council or committee presents something like this
Yes, but it can't just be vague exhortations and generalities. I didn't know the pertinent bodies previously, but after GPT'ing on it, it looks like they include:
One is "DG Home," an EU department on security that drafts legislation.
Another is Europol, a security coordination body that can't legislate but frequently advocates for this kind of legislation.
And then there's LEWP, the Law Enforcement Working Party, a "working group" composed of security officials from EU member states, also involved in EU policy-making in some capacity.
I think the blocking states should be resisting these proposals at those respective bodies too.
I'm convinced the people suggesting this type of thing are influenced or even compromised by their constituents' enemies, and NOT the result of poor education on the topic.
This policy, for example, would be most helpful to enemies of the EU. It would lower the cost of acquiring the data for China and Russia, as it allows them to mass-acquire data in transmission without incurring the cost of local operations. The easiest system in the world to hack is that of a policy maker.
> It would lower the cost of acquiring the data for China and Russia
Yes, it would lower such barriers for countries that are commonly seen today as Europe's adversaries. But in this case, the U.S. (or rather, U.S. organisations and corporations) might be the primary bad actor pushing for ChatControl. See e.g.:
"Thorn works with a group of technology partners who serve the organization as members of the Technology Task Force. The goal of the program includes developing technological barriers and initiatives to ensure the safety of children online and deter sexual predators on the Internet. Various corporate members of the task force include Facebook, Google, Irdeto, Microsoft, Mozilla, Palantir, Salesforce Foundation, Symantec, and Twitter.[7] ... Netzpolitik.org and the investigative platform Follow the Money criticize that "Thorn has blurred the line between advocacy for children’s rights and its own interest as a vendor of scanning software."[11][12] The possible conflict of interest has also been picked up by Balkan Insight,[13] Le Monde,[14] and El Diario.[15] A documentary by the German public-service television broadcaster ZDF criticizes Thorn’s influence on the legislative process of the European Union for a law from which Thorn would profit financially.[16][17] A move of a former member of Europol to Thorn has been found to be maladministration by the European Ombudsman Emily O'Reilly.[18][19]"
Additionally, it would not surprise me at all if Palantir were lobbying for this as well. Many EU countries, like Germany and Denmark, have already integrated Palantir's software into the intelligence, defence, and policing arms of their governments.
But at the end of the day, while it is convenient to blame external actors like U.S. corporations, ultimately the blame lies solely on the shoulders of European politicians. People in positions of power will tend to seek more, and I'm sure European politicians are more than happy to wield these tools for their own gain regardless of whether Palantir or Thorn is lobbying them.
You have left out how it can be used to monitor violations of corporate copyright.
And what it means for silencing political speech is enormous.
I would argue that a surefire way of guaranteeing the right to privacy is to instead continuously push for absolute-transparency laws for politicians and governments. If they’re going to demand every private citizen’s records are always open for view, then the same should be said for governments - no security clearances, no redactions, no “National Security” excuse.
Is it patently unreasonable? Yes, but cloaked in the “combat corruption” excuse it can be just as effective in a highly-partisan society such as this - just like their “bUt WhAt AbOuT tHe ChIlDrEn” bullshit props up their demands for global surveillance.
If only we could show them how this kind of thing may go wrong. I don't know, say the leader of a nation they are having trouble with abusing similar access to their data.
But they will probably think that is only bad when others do it to them.
> If only we could show them how this kind of thing may go wrong.
We can. This has already happened with the fairly recent SALT TYPHOON hacks. China (ostensibly) abused lawful wiretapping mechanisms to spy on American (and other) citizens and politicians. The news at the time wasn't always explicit about the mechanism, but that's what happened.
China wouldn't have been able to do this if those mechanisms didn't exist in the first place.
The only real option is to get your country to leave the EU. An unelected cabal of people making sweeping decisions for countless member states isn't democratic, so yeet it while you can.
>European Commission: Commissioners are nominated by elected national governments and must be approved by the directly elected European Parliament.
With so many levels of indirection, citizens' votes become irrelevant and they don't need to care about them - only about the support of the major political groups at the top. And surprisingly enough, the Parliament is relatively stable.
>Council of the EU: Ministers are accountable to their national parliaments, which are elected by citizens.
same as above.
I don't advocate for leaving the EU, but this needs to change. Those positions, which are usually the ones pushing for such legislation, need to be held accountable by citizens. At least the EC.
No more rotations, or other such bullshit.
Right now the EU is sitting in a middle ground between a federation and a trade union, reaping (from the citizens' point of view) the downsides of both systems.
By implementing direct democracy via the internet, and using it to create laws which disallow that.
But, amongst a few others, there is a technical problem: how do we log in to vote? That mechanism must be unhackable, configurable by computer illiterates, and it must not invade privacy.
This has to be written into the constitution somehow; it comes down to the values of everyone, and I believe a lot of it has to do with education. Currently people are simply not tilted by it as much - or not in a way comparable to other topics.
The prevention has to be in the underlying layer of physics / math / the internet such that the state is _unable_ to make (or at least enforce) such laws.
We need to accept and celebrate a world in which the capabilities of states are constrained by our innovations, not merely the extremely occasional votes we cast.
Here's the proof: https://en.wikipedia.org/wiki/Mass_killings_under_communist_... . Those kinds of mass killings can only happen when the citizens are disarmed, because it's logistically impossible for a government to seize absolute power when a significant proportion of the citizens are armed.
> it's logistically impossible for a government to seize absolute power when a significant proportion of the citizens are armed.
This is literally, and provably, untrue. For example:
The Soviet Union: The Bolsheviks initially proclaimed that "the arming of the working people" was essential to prevent "restoration of the power of the exploiters". It was only later that they restricted private gun ownership.
The Nazis: Contrary to popular gun rights narratives, Nazi gun laws actually relaxed restrictions for most Germans while targeting specific groups. Sometimes authoritarianism is the same as populism.
Rwanda: Prior to the genocide, the government systematically distributed weapons to local administrators and militia groups while ensuring targeted populations remained defenseless.
Myanmar: Armed civilian resistance groups formed, but they were essentially wiped out by the overwhelming advantages in air power and heavy weaponry that an actual organized military had. The firearms were useless. Arguably worse than useless, as those who fought back died in large numbers.
Venezuela: The regime armed its supporters while systematically removing weapons from the general population. The population was well armed, they just couldn't fight back against an organized government response.
Really? Why does America, the country with the most guns by far, have the most gun deaths by far? It's very tiring arguing these very obvious points over and over.
Nazi Germany, Communist China and Soviet Russia have by far the largest number of deaths by _men with guns_, over a hundred million people killed by their own governments. The guns of US citizens have so far prevented this kind of government-led mass citizen genocide from happening. The number of people killed by gun violence in the US is a rounding error compared to the number of people killed by Mao, Hitler and Stalin.
Most of the people killed by these regimes were killed as aliens (outsiders). If you truly want to compare with the actions of the USA, you must also count its handling of its aliens (e.g. in wars).
> The guns of US citizens have so far prevented this kind of government-led mass citizen genocide from happening.
No they haven't. Our system of checks and balances has. At no point has there been a civil war in which the US's citizens attempted to fight back against the US military. If there were, the citizens would lose without even presenting a challenge.
>the citizens would lose without even presenting a challenge.
That's not true. The US Army spent 20 years and trillions of dollars trying to impose regime change on Afghanistan, but were defeated by a group (the Taliban) that had very little military capability beyond men with rifles and some explosives to make improvised bombs. (Yes, they also had decades-old weapons with which to shoot down helicopters.) Algeria's war of independence from France in the 1950s and early 1960s is another example where a group with very little in the way of military capability defeated one of the most powerful militaries in the world.
I don't necessarily buy the argument that the US should continue with the gun status quo just because all those guns would come in handy in a revolution, but you haven't successfully refuted the argument.
The Algerian war doesn't really prove much either, except that terrorism works.
The Algerians hid within the population and gradually picked at the French, like flies biting a bull. Eventually the French got bored and wandered off to find a new form of entertainment. If anything the French lost to propaganda, not guns.
Yes, but we're discussing a civil war or revolution in the US, where the rebels or revolutionaries would be able to engage in terrorism and to hide within the population -- and where there are so many long guns in private hands that the defending force (the government) probably wouldn't be able to deprive the attacking force (the rebels) of long guns simply by punishing any civilian found with a long gun in their home.
My point is that it wasn't the guns that saved the Algerians. Knives, bayonets, and IEDs would have been equally effective for the sort of guerrilla tactics that eventually won the war.
The Afghanistan bit is oversimplified, isn't it? My understanding is that the US military successfully imposed regime change between 2001 and 2003. I doubt those rifles slowed the tanks and bombers much at all.
The fact that we packed up and left eventually doesn't really change the fact that the US rolled over the men with guns like they weren't there in the early 2000's.
There are no solutions to that which wouldn't sound absurd. But if you could get past absurdity...
Politicians should agree to be executed if they lose an election. Only those willing to risk their lives should be allowed to legislate. This also gives the voters the option of punishing those who pass onerous laws at the next election.
If you need extra zing, this would also apply to recall elections, so they could even be punished early.
I think it would be better if they agree to be executed if they win the election, after serving their term.
Maybe a less extreme version of this is that if you become president you are stripped of all property and become the ward of the state after your term is over, enter a monastery sort of situation, for the rest of your life.
One could argue that Putin won't stop the current war against Ukraine for the very same reason. He is obsessed with Gaddafi's undignified end in a ditch and cannot be seen as weak.
The GP's idea is very bad. Quite to the contrary, losing power should not come with disastrous personal consequences.
If they can't be punished for continuing to push bad laws, then they will continue to push them... because they benefit from those when they inevitably pass. So there are no solutions. You live in a world where Putin still exists, is still doing these godawful things, but the suggestion that if a politician loses an election his life is forfeit makes you fear that the things that already happen would happen. Or something. It's sort of sad.
The motivation in Denmark was some big cases where organized crime was only caught thanks to a huge hacking operation in which the police were able to monitor communication on the apps commonly used by the criminals. That allowed them to take very dangerous people off the streets, and now they want to do more of that, more easily. I think the discussion can never be in terms of absolutes. If your family was murdered by some criminal who was never caught, but could have been caught earlier if the police had had access to their chats, would you still be against it? We need to remember that we're making that decision for some future victim if we agree that this will assist the police effectively. The other side says the police will undoubtedly abuse their powers. In which case, how do the results compare? If you think the answer is easy, one way or another, you are definitely wrong.
But the CSAM regulation under discussion doesn't do any of the things you're claiming. It mandates content scanning for CSAM and other related messages. It does not call for key escrow and decryption of messages involving organized crime. So it's not clear how you would do much against serious organized criminals with this law.
Nobody here argues against wiretaps after court rulings. The discussion here is about mandating sending a transcript of every communication you do to the state (unless you work for the specific parts of the state).
Google, Facebook, Apple, Microsoft and Amazon cannot send armed men to my front door.
Yes, they (well, Google and Amazon, I don't have accounts with other vendors) can terminate my accounts, but, to be honest, that is not a big deal for me, especially compared to being dragged out of my house by police - especially now, when I live in the EU on a residence permit and not full citizenship.
You only believe that because you have chosen to believe it.
Take Facebook end-to-end encrypted messages for example. There are certain links it won't let you send, even though it is supposedly E2EE. (I've seen it in situations like mentioning the Pirate Bay domain name, which it tries to auto-preview and then fails. I've seen the issue with hacking-related websites as well.)
It likes to pretend it is a mysterious error, but if you immediately send a different link, it sends just fine. I don't use chat apps much these days, so I'm not sure if others see similar behavior, but I'd wager some do. Facebook is about the least trustworthy provider I'm likely to use, FWIW, so I expect a certain amount of smoke and mirrors from them.
> the proposed legislation includes exemptions for government accounts used for “national security purposes, maintaining law and order or military purposes”. Convenient.
I can buy the military exemption, and maybe some very top level government workers that are effectively military (example: POTUS). But the EU parliament has no reason to be excluded. It is definitely a terrible law if it is so bad that they won't pass it unless they are excluded.
> top level government workers that are effectively military (example: POTUS)
POTUS is very specifically NOT a member of the military. Elected civilian control was the whole point. Even Eisenhower had to (temporarily) give up his general rank to serve as president.
Or as someone put it, "People shouldn't fear the government. The government should fear the people."
I feel like we've lost the vocabulary we ought to be using to talk about the legitimacy and role of the state. More people need to read J.S. Mill (and probably Hobbes.) Even today, works by both are surprisingly good reads and embed a lot of thoughtful and timeless wisdom.
Governments need privacy. They literally investigate child molestation cases. They hunt spies. They handle all sorts of messy things, like divorces between couples with abuse.
I'm not commenting on the government coming in and unveiling encrypted communications, but certainly a better approach than "governments should be transparent and the people should be opaque" would be "governments should be translucent and the people should be translucent too".
There is a clear difference between specific activities that need privacy (especially if it is temporary privacy or cases where it is protecting the privacy of the citizens not the government itself) and privacy by default for most or all government work.
I regularly see similar articles with similar comments here, but there's one thing I still don't understand:
From the European Convention on Human Rights[1]:
ARTICLE 8
Right to respect for private and family life
1. Everyone has the right to respect for his private and family
life, his home and his correspondence.
2. There shall be no interference by a public authority with the
exercise of this right except such as is in accordance with the
law and is necessary in a democratic society in the interests of
national security, public safety or the economic well-being of the
country, for the prevention of disorder or crime, for the protection
of health or morals, or for the protection of the rights and freedoms
of others.
So I wonder, what is the legal argument solid enough to justify interfering with everybody's right to privacy?
My layman understanding of the usual process is like, we want surveillance over those people and if it seems reasonable a judge might say ok but for a limited time. Watching everyone's communications also seems at odds with the principle of proportionality[2].
> what is the legal argument solid enough to justify interfering with everybody's right to privacy?
"... except such as is in accordance with the law"
And the "interfering" coming from ChatControl is that "some algorithm" locally scans and detects illegal material, and doesn't do anything if there is no illegal material.
> Watching everyone's communications also seems at odds with the principle of proportionality
It's a bit delicate here because one can argue it's not "watching everyone's communications". The scanning is done locally. Nobody would say that your OS is "watching your communications", right? Even though the OS has to "read" your messages in order to print them on your screen.
Note that I am against ChatControl. My problem with it is that the list of illegal material (or the "weights" of the model deciding what is illegal) cannot be audited easily (it won't be published as it is illegal material) and can be abused by whoever has control over it.
Ah, so we will fight child porn by detecting family pics of children in the shower (or w/e) and sending them off to a "trusted" 3rd party who will no doubt leak them at some point. Also, if I were a pedophile I know where I'd send my resume...
Imagine a future where it becomes easier to commit terrorism because of some technological advancements—like smaller, less traceable bombs, or chemical weapons that are easily accessible and lead to higher casualties—like in the 1,000s or more. Imagine in that scenario, that the likelihood of you or someone you know becoming the victim of a terrorist attack is now non-trivial in your society. In a future where this becomes the norm, it would be interesting to see if individuals are more willing to adopt a level of increased surveillance as it seems like the only reasonable protection against terror.
Right now this debate is oriented mostly around the fact that surveillance today is not a good deal—consumers give up their privacy and get nothing in return. But is there a tipping point? Technology draws us closer, day by day, and the threat matrix will become more sophisticated as time moves forward.
Most individuals on HN are privacy absolutists but one should recognize that tradeoffs exist. That tradeoff is just not compelling today, but that doesn't mean that will always be the case. If you go to China, where everything and everyone is surveilled, I think you'd be surprised to find that many Chinese don't mind. They feel incredibly safe and don't have to worry about being victims of crimes, having their packages stolen, walking around late at night alone, etc. Walking around in China with absolute peace of mind around my own personal safety is a very eye-opening experience as someone coming from the US. I've always advocated for stringent privacy protections; but when giving that up buys you absolute safety in your immediate environment, that's not an experience you forget.
I'm certainly not saying I'm a proponent of living in a surveillance state—I'm simply noting that tradeoffs exist and a sort of re-balancing is constantly occurring, which is just interesting to be aware of.
>Imagine a future where it becomes easier to commit terrorism because of some technological advancements
Imagine a future where aliens invade, and all of our civil rights have to be suspended in order for society to be re-focused on fighting an existential war against the invaders. I suppose this sci-fi hypothetical could happen and if it did happen then the sacrifice might even be necessary. But it's not happening now, and it's entirely reasonable to classify it as both (1) unlikely, and (2) an incredibly bad outcome we should hope that we never have to face.
I don’t know if it’s complete fearmongering to imagine a scenario in the future where chemical or biological weapons are easier to manufacture and therefore execute attacks. Hundreds of people died in Europe last year due to terrorist attacks, and compared to where our species will eventually be, many of the technologies used in these attacks are still in their infancy. The world may evolve, but the scriptures that evangelize future jihadists won’t, so the incentive to be a martyr will always exist. I just looked it up and Europe has a very bad track record at stopping attacks—of 54 planned terrorist attacks in 2024 only 19 were averted by intelligence. 35 were carried out successfully. The threat may come from factions other than just jihadists in the future, too. I agree that this is not something we have to worry about now, which is why I stated that I’m hypothesizing in the original comment. But I think it’s a bit less far fetched than a near term alien invasion :-)
> They feel incredibly safe and don't have to worry about being victims of crimes, having their packages stolen, walking around late at night alone, etc.
Em. I think feeling incredibly safe has more to do with the media telling people that no crime exists and all criminals are caught, rather than a reality of zero crime.
There is evidence that crime started being systematically under-recorded in China since they started assessing police on proportion of recorded crimes they solve.
It's not about the usefulness... it's that omnipotent surveillance creates a jarring imbalance of power between the surveillance state and the people.
If the employees of the state were subject to the same exact surveillance, then maybe it might be palatable.
Curiously, the Star Trek Universe exists in such a scenario. A common trope is asking the computer for evidence of a crime, where someone is at any time, etc. I've never heard complaints about this supposed contradiction between the utopia vision of Star Trek and the omnipotent, all-seeing computer.
But we all know the reality... a tale as old as time. The state will exclude themselves from the surveillance, and it will eventually be used as a tool for authoritarianism. It's only a matter of time with something as powerful as this.
this also assumes that criminals or terrorists will just follow the law.
You can always establish an encrypted channel via DH over steganography in plaintext messaging, and then just use any encrypted protocol.
If hardware is compromised, a black market for such devices will surface.
Worst case scenario, you create gigantic one-time pads and just use them.
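As a concrete illustration of the DH-over-a-plaintext-channel point above, here is a minimal sketch (not a vetted protocol) assuming Python's third-party "cryptography" package. The two raw public keys are the only things that cross the monitored channel; everything after that is opaque AEAD ciphertext. Names and messages are illustrative, and nothing here authenticates the peer, so a real deployment would still need to guard against man-in-the-middle.

    from cryptography.hazmat.primitives.asymmetric.x25519 import (
        X25519PrivateKey, X25519PublicKey)
    from cryptography.hazmat.primitives import hashes, serialization
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF
    from cryptography.hazmat.primitives.ciphers.aead import ChaCha20Poly1305
    import os

    # Each side generates a keypair; only the raw public keys travel in the clear.
    alice_priv = X25519PrivateKey.generate()
    bob_priv = X25519PrivateKey.generate()
    bob_pub_wire = bob_priv.public_key().public_bytes(
        serialization.Encoding.Raw, serialization.PublicFormat.Raw)

    # Alice combines her private key with Bob's public key and stretches the
    # shared secret into a 32-byte cipher key (Bob does the mirror image).
    shared = alice_priv.exchange(X25519PublicKey.from_public_bytes(bob_pub_wire))
    key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
               info=b"chat-demo").derive(shared)

    # Every later message is authenticated ciphertext, useless to a scanner.
    nonce = os.urandom(12)
    ciphertext = ChaCha20Poly1305(key).encrypt(nonce, b"meet at noon", None)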
The whole idea is flawed, as you get neither security nor privacy. In fact, it actually opens you up to abuse if encryption is backdoored. Not to mention it being a gigantic slippery slope.
And most importantly: how do you ensure that you can ALWAYS trust your government with such powers?
Probably, but I think you are giving most bad actors too much credence. Tyler Robinson took several precautions to cover his trail in his assassination of Charlie Kirk—but he also told many individuals about his plan on discord, as well as other non-encrypted channels, etc. Not all bad actors are sophisticated in the same way.
I wouldn't trust the government with the power. If the scenario I'm posing were to actually occur, it's only a matter of time until the gestapo starts showing up at the houses of innocent individuals. This sort of thing happens in China.
Still, again, if the threat is big enough, I am curious to ponder what role individuals would want government to take in using surveillance to reduce actual human deaths in terror attacks (or any type of attack, for that matter).
>Probably, but I think you are giving most bad actors too much credence. Tyler Robinson took several precautions to cover his trail in his assassination of Charlie Kirk—but he also told many individuals about his plan on discord, as well as other non-encrypted channels, etc. Not all bad actors are sophisticated in the same way.
You're comparing organized crime, which this is supposed to combat, with a lone gunman. Stupid criminals will always exist.
>Still, again, if the threat is big enough, I am curious to ponder what role individuals would want government to take in using surveillance to reduce actual human deaths in terror attacks (or any type of attack, for that matter).
The purpose of this isn't to stop deaths. It is to entrench state power and increase agencies' budgets... and as they have to demonstrate that they are useful, it will turn either into a totalitarian hellhole with plenty of 'making an example of' public cases... or some attacks will go through on purpose to justify their budget after cuts...
If murder is common in the populace, then that means the social norms of that society have already drifted to the point where murder is acceptable. In that society, the murderers are probably running the government.
On your tangent about China, the people there feel so absolutely safe that they have the urge to install metal bars on every window of almost every home.
Better imagine a future where this old manufactured problem / manufactured solution brainwashing trick no longer works and devil's advocates get what they deserve
> .. like smaller, less traceable bombs, or chemical weapons that are easily accessible and lead to higher casualties ..
it's very easy to build a bomb; you just need to "google" it and do your shopping... Killing random people in the street is easy too: you have, among other things, knives - very easy to buy and to commit a crime with in side streets, etc.
No I did not use chatgpt. I've always written with a lot of em dashes, Chatgpt probably got it from me :-)
> it's very easy to build a bomb [...]
Yeah, what I'm saying though is that these attacks are not happening at a scale that is large enough for people to need to worry about their own safety personally. Your personal chance of dying in a terrorist attack is so low that it's not worth thinking about (unless maybe you live in the Middle East). I'm simply noting that this might not always be the case. It's easy to imagine, with better weapons, terrorists becoming much more prolific in their ability to kill; under which scenario people could be willing to give up more to have more peace of mind.
Actually, you can kill people just fine with only your hands. You just need to open a medical book; there are a few spots where a light hit achieves the intended effect.
> it would be interesting to see if individuals are more willing to adopt a level of increased surveillance as it seems like the only reasonable protection against terror.
One presumes it would make terrorism easier if you could hack in and find out where your target is at any given time. What they're doing. What their plans are for this evening.
Also I think one could probably point to the current US president as proof for why this is an insane idea. Imagine if he really did have access to everything we say.
Yeah, totally. Again not saying I'm advocating for it in that form or manner. I'm just saying, tradeoffs could occur, that reasonable people may start to weigh differently based on the level of threat they feel to their lives personally.
I get your point, but this is baked into the social contract in China. You obey the party, give up some personal freedoms, and in exchange the party will make sure you live a prosperous safe life.
The current EU political class has completely lost their Mandate of Heaven, they command 0 respect because they’re spineless empty bureaucrats looking for a cushy consulting job after they’re done being lobbied by their future employers.
Even if your utopian idea makes sense, I don't trust EU politicians to bring it to life, just to virtue signal.
This was precisely some of the motivation behind pushing RCS onto Apple. The RCS spec has a termination point between providers -- a great spot to read some data for telecom providers and government agencies. Despite this, RCS is called "End to End" all the time. It's not. Use Signal or iMessage, depending on your security choices in iCloud.
RCS is not called "end to end" by anyone - even Apple and Google explicitly state it's not currently E2E encrypted. Apple has pledged to add E2EE to RCS on iPhones, but they've never claimed it's that way today.
They go out of their way to warn you it’s not the same level of security as iMessage.
Is CSA really that widespread in Europe that everyone's chat messages have to be monitored? And if it is that widespread, shouldn't they try to address it socially to prevent CSA as much as possible rather than try to catch just the subset of tech-savvy abusers, that too after they've already committed CSA?
It’s not about CSA, it’s about illegal content. And laws change all the time.
For example, an individual can generate AI images of Hollywood actors using Stable Diffusion and a decently powerful computer. Said individual had the right to share those images online with a community.
Now however the sharing and distribution of said images is considered illegal in my USA state.
So, are the images said individual created and shared three years ago subject to prosecution? Even if the law went into effect 3 months ago?
No. The right not to be tried for actions that weren't crimes at the time is pretty universally applied in the west (I am not aware of the legal situation in other parts of the world, but I imagine it's honored there too). (Article 7 of the European Convention on Human Rights for the EU, Article I, Section 9 & 10 of the constitution for the US)
> So, are the images said individual created and shared three years ago subject to prosecution?
Generally, criminal acts are judged according to the rules of the jurisdiction where they happened, so I wouldn't be too worried about this. This isn't a universal rule though, so you won't find it enshrined in constitutions or treaties.
Of course not, it's just a pretense for passing this law because its political suicide to instead say "We don't want to do any actual police work and instead want to create a massive surveillance state and monitor everything you say and do so we can better control our populations."
CSAM is just the excuse, as it is with any other laws of this nature in the past.
Agree completely. These laws are either a wedge for broader surveillance or a massive compromise on everyone else’s rights to catch a subset of a subset of users.
Everyone in this debate understands that CSA is a pretext. Nothing is going to make any sense to you if you think ChatControl is an earnest and sincere attempt to fight CSA in particular.
The ultimate goal is for computers to run only authorized programs and to license and monitor development tools like the Soviets monitored typewriters.
I wonder where platforms like Slack would land in all of this, and how would they go about keeping people from just using their own encryption, e.g. PGP over unencrypted channels? Is public key cryptography too weak to matter?
Slack is not end-to-end encrypted and belongs to a US company. So there is no need for ChatControl there: the US government already has access to everything that is written on Slack.
I believe they are referring to using GPG to encrypt data before putting it into Slack, much like using OTR out of band. In that case, all the data shared between those using GPG or OTR would only be accessible to those with the right out-of-band keys. There are probably not a lot of people doing this, or not enough for governments to care. I do this in IRC using irssi-otr [1].
If that ever became illegal because of the encryption, then groups of people could simply use scripts or addons to pipe through different types of encoding to make AI fuzzy searches harder. They can try to detect these chains of encoding, but it will be CPU-expensive to do every combination at scale, given there are literally thousands of forms of encoding that could be chained in any order and number.
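For what it's worth, the GPG-into-Slack approach is only a few lines when scripted. A rough sketch, assuming the third-party python-gnupg wrapper, a working GnuPG install, and a recipient key already imported; the address is purely illustrative. The armored output is ordinary text that can be pasted into Slack, IRC, email, or anything else:

    import gnupg

    gpg = gnupg.GPG()  # uses the default ~/.gnupg keyring
    result = gpg.encrypt("quarterly numbers attached",
                         recipients=["alice@example.org"])
    if result.ok:
        # -----BEGIN PGP MESSAGE----- ... ready to paste into any chat box
        print(str(result))
    else:
        print("encryption failed:", result.status)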
> I believe they are referring to using GPG to encrypt data before putting it into Slack
To a good approximation, nobody does that.
And anyone who is capable of communicating over PGP won't be covered by ChatControl anyway. They can keep using PGP over whatever they want, or just compile Signal from source.
> If that ever became illegal because of the encryption, then groups of people could simply use scripts or addons to pipe through different types of encoding to make AI fuzzy searches harder.
I don't think that this makes any sense at all. This is some kind of poor encryption. Either you honour the law and you send your messages in plaintext, or you don't and you use proper encryption. There is nothing worth anything in-between.
If encryption is illegal, those who really need it can still use steganography.
If you really want to use encryption under a state where it's forbidden and communications are monitored, you'd rather hide your encrypted messages inside cat pictures and TikTok videos, because blatant obfuscation might trigger warnings and draw attention.
In the end it's not about making encryption technically impossible but illegal, and if you use it you'll be prosecuted.
Anyone who does anything private or illegal will bypass that with tools that will become popular as a result.
The government is left with scanning the data of the remaining 90% of population.
They choose something sensitive as a pretext to push their agenda.
A nation is a concept that comes into existence only because people agree to give up some of their freedom, income and privacy. To what extent is the question. 100% privacy is not possible, and it simply derails a nation due to lack of visibility and lack of control.
Out of interest, what happens in the case of say an open source chat app developed outside the EU. Let's add that the developers are anonymous too, like truecrypt. What power does this legislation have then?
> Apps installed through alternative app distribution undergo a Notarization process to ensure every app meets baseline platform integrity standards...
> Notarization for iOS and iPadOS apps is a baseline review that applies to all apps, regardless of their distribution channel, focused on platform policies for security and privacy and to maintain device integrity.
Can anyone explain to me what keeps anyone who doesn't want to be monitored from just sending PNGs (or similar) containing messages encrypted in each pixel's LSBs?
Doesn't all that just force everyone who has something to hide to use something else, less obvious?
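For a sense of how trivial that is, here is a minimal LSB sketch, assuming Pillow and numpy, an already-encrypted payload, and a hypothetical cover file cat.png. (Dedicated steganalysis can spot naive LSB embedding, so this shows the ease, not best practice.)

    import numpy as np
    from PIL import Image

    def embed(cover_path: str, payload: bytes, out_path: str) -> None:
        """Hide payload in the least significant bits of a PNG's pixels."""
        pixels = np.array(Image.open(cover_path).convert("RGB"))
        flat = pixels.flatten()
        # 4-byte big-endian length prefix so the receiver knows where to stop.
        data = len(payload).to_bytes(4, "big") + payload
        bits = np.unpackbits(np.frombuffer(data, dtype=np.uint8))
        if bits.size > flat.size:
            raise ValueError("payload too large for this cover image")
        flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits
        Image.fromarray(flat.reshape(pixels.shape)).save(out_path, "PNG")

    def extract(stego_path: str) -> bytes:
        flat = np.array(Image.open(stego_path).convert("RGB")).flatten()
        length = int.from_bytes(np.packbits(flat[:32] & 1).tobytes(), "big")
        bits = flat[32:32 + 8 * length] & 1
        return np.packbits(bits).tobytes()

    # embed("cat.png", ciphertext, "cat_with_secret.png")  # PNG is lossless, so the LSBs survive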
But would that actually stop people? I can say with certainty a law such as this would encourage me to go out of my way to create and distribute such software.
No, probably not - but those bad guys with all their child porn and terrorist plans won't mind the friction (those will either encrypt or become EU politicians).
I don't think so. If they were, it would actually be better: one can have sympathy for insanity, and at least isolate it, if not treat it.
Instead, it's extreme insecurity combined with limitless regard for infallible authority. The thought that the hoi polloi might write or say things that are beyond scrutiny is intolerable. That's the insecurity part. And all intolerable things must be criminalized, because in Europe, laws infallibly fix everything. That's the authority part.
That's not insanity. That's just how you behave when you imagine it is your mandate to perfect the world and indulge hubris sufficient to believe you have the wisdom to do so.
This is the n-th attempt to install some regulation that would (a) lead to increased surveillance of most of the population; and (b) be trivial to circumvent by those whom the government is ostensibly trying to target. So clearly, the cost-benefit ratio is severely skewed for the EU population.
Assuming that the regulators are fully aware of the above points, it's not very hard to speculate what the real intentions behind all of this are.
> This is the n-th attempt to install some regulation
The sad part is that it would only take one attempt to codify the opposite into privacy laws as a basic right, should anyone ever bother to take up that gauntlet.
This is (mostly) about Tech companies' money, namely:
- Palantir Technologies
- 'not-for-profit' Thorn
> The Commission’s failure to identify the list of experts as falling within the scope of the complainant’s public access request constitutes maladministration. [0]
> ... the complainant contended that the precision rate of technologies like those developed by the organisation are often overestimated. It is therefore essential that any technical claims made by the organisation concerned are made public as this would facilitate the critical assessment of the proposal. [1]
> The Commission presented a proposal on preventing and combating child sexual abuse, looking in particular at detecting child pornography. In this context, it has mentioned that support could be provided by the software of the controversial American company Palantir... [2]
> Is Palantir’s failure to register on the Transparency Register compatible with the Commission’s transparency commitments? [2]
(Palantir only entered the Transparency Registry in March 2025 despite being a multi million vendor for Europol and European Agencies for more than a decade)
> No detailed records exist concerning a January meeting between European Commission President Ursula von der Leyen and the CEO of controversial US data analytics firm Palantir [3]
> Kutcher and CEO Julie Cordua held several meetings with EU officials from 2020 to 2023 - before the former stepped down from his role - including European Commission President Ursula von der Leyen, Home Affairs Commissioner Ylva Johansson, and European Parliament President Roberta Metsola.[4]
> The Ombudsman further concluded that Thorn had indeed influenced the legislative process of the CSAM regulation. “It is clear, for example, from the Commission’s impact assessment that the input provided by Thorn significantly informed the Commission’s decision-making. The public interest in disclosure is thus self-evident. [4]
> EU Ombudsman Emily O’Reilly has announced that she has opened an investigation into the transfer of two former Europol officials to the chat control surveillance tech provider Thorn. [5]
Take it like this: your phone already "reads" absolutely everything you put on that phone. Apple or Google could do anything they want with that, but you trust them. You trust that they don't send everything that goes into your phone to their servers.
ChatControl would run locally on your phone. It would compare the images that you receive/send to a list of illegal images, and if you happen to deal with one of them, it would report you.
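To make that mechanism concrete, here is a toy, stdlib-only sketch of the "compare against a list" step. It is deliberately simplified: real proposals envisage perceptual hashing or ML classifiers rather than exact SHA-256 matches, and the digest below is a made-up placeholder. The point is simply where the trust sits: whoever supplies the list decides what gets flagged.

    import hashlib

    # Hypothetical digests distributed by the scanning authority (opaque to the user).
    blocklist = {
        "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
    }

    def scan_outgoing(attachment: bytes) -> bool:
        """Return True if the attachment would be reported before sending."""
        return hashlib.sha256(attachment).hexdigest() in blocklist

    if scan_outgoing(b"holiday photo bytes ..."):
        print("flagged: report sent to the clearing house")
    else:
        print("clean: message goes out as usual")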
How is that destroying your democracy?
Disclaimer: I am against ChatControl, but too many people seem to not understand what the problem with ChatControl is.
Because it's closed source so you have no idea of what is happening. Whoever controls it can then scan for other things, such as "hate speech" or "tax evasion", and then the slope becomes more slippery than a lube party on a vinyl sheet, and Kim Jong Un awaits you at the Ski Bar at the bottom.
Those passive surveillance systems have a chilling effect on democracy, just like mandatory ID on social media, and provide politicians a lever so convenient that you know that it will be used, especially in the EU.
> Because it's closed source so you have no idea of what is happening.
Exactly! That's the problem!
It's not killing the encryption, it's not sharing all your communications with the government. Those are invalid arguments. The problem is that whoever controls the proprietary part of ChatControl (and that includes the list of illegal material) can abuse it to e.g. detect political opponents, or whatever they can imagine.
I am just asking that we use the valid arguments against ChatControl. I read a lot of invalid arguments that won't help convince politicians that it is a bad idea. They need to understand why it is a bad idea, the real reason.
I think the correct sentence would be that it kills the purpose of encryption, which is to prevent anyone aside from the recipient from reading your message.
They're such proponents of privacy that they've actively been encrypting as much as possible for decades, but now that the EU is about to break all that, they're silent.
They raised such a fuss when the FBI asked to decrypt that single iPhone years ago, but now that millions are on the line... nothing?
When Apple attempted to anticipate these laws and proposed a system which tried to navigate a compromise, the "pro-privacy" faction was so politically dumb they spread FUD about it and actively made sure no reasonable compromise could ever be reached. Now the public will reap what these advocates have sown, good and hard.
With regards to the FBI incident, Apple said at the beginning of their statement, “This moment calls for public discussion, and we want our customers and people around the country to understand what is at stake.”
The EU is proposing a law. People assure me their laws are democratic and reflect the will of the people. Who is Apple to reject the outcome of public discussion?
The FBI letter was written in a context where an agency was acting without the support of the public. That’s why the framing was all about misuse of the All Writs Act and lack of Congressional blessing for the requested power.
The number of people in these threads defending involuntary bugging of every phone because you can devil-advocate it maybe might actually save the children is insane for a forum called Hacker News. Either the contrarian population has been getting out of hand, or we have truly lost our minds and stand to lose what remains of our civil liberties.
Just need to pass it once, unfortunately. And despite all the talk against it, they get a partial fresh start to the general public every time one of these is proposed.
Honestly, I fully expect that the scanning method is already implemented and used. The US has intervened with some pretty deep surveillance in the past (e.g. the Sikh killing in Canada) and doesn't seem to need permission to get it.
Sounds to me like the EU is looking to get a more formal approval to act on data they already have.
They'll push the scanning to the OS level, mandate that the OS does it. Hence the seemingly coordinated effort with Google on the sideloading changes, and enforcing play protect, etc.
Like the TPM & Microsoft scare when TPM first started arriving in hardware, and we all thought it would be used to lock out other OSes. Only it's for real this time.
The proposed regulation only applies to publicly available services, and only binds service providers, not end users. There is nothing preventing you from sending encrypted emails, just as there is nothing preventing you from pasting encrypted messages into WhatsApp or storing and sharing encrypted files in Dropbox.
What would prevent me from writing my own program to do something simple like sending encrypted messages?
Nothing. That is, nothing until your application becomes popular. I will keep encrypting my emails and they can pound sand once legislation for this makes it to my country. It should be a while before these shenanigans are in every distribution or kernel for Linux.
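As a minimal sketch of just how little such a program takes (assuming the third-party "cryptography" package; Fernet is symmetric, so both sides share one key exchanged out of band):

    from cryptography.fernet import Fernet

    key = Fernet.generate_key()                 # share this once, out of band
    token = Fernet(key).encrypt(b"see you at the usual place")
    print(token.decode())                       # opaque blob, safe to paste into any chat
    print(Fernet(key).decrypt(token))           # b'see you at the usual place'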
The people who are trying to install this kind of law basically do!
They want to change the public perception from "Private encrypted communication is good and desirable" to "Encrypted is unsafe. Encrypted could be scary. Encrypted enables Bad People."
In a vain attempt to inhibit access to non-broken cryptography, we will probably see operating systems that allow actual root access to the user -- or even just allowing non-manufacturer-signed executables to run! -- being painted as "unsafe platforms." Apple has already transitioned most of the way to being fully in the "trusted computing" camp, since it takes a great deal of gymnastics to even modify the OS because of the Mac's sealed system volume, and out of the box all executables must be blessed by Apple, meaning governments can put their thumb on Apple to force them to disallow any non-broken crypto tools from being used. I know this can be changed in Settings for now, but that'll probably go away eventually.
Microsoft will be next of course, and Linux will be portrayed as a "hacking tool" by contrast to the commercial OSs.
I have a theory that everything that happens in regard to governmental control in China and Russia will eventually be copied in some form in western countries.
I don't see how they can come after anyone who's using a specific protocol [0] by law. Expanding on this thought: if Chat Control passes, it will just be the death of social media as a chat platform. People will swap to something more rudimentary where it can't be enforced, the primary reason being that it will simply be so much faster/more convenient than the apps which are forced to implement chat control.
The same reason as why streaming services are being ditched in favor of piracy will happen to social media.
I don't think ChatControl is a good idea. I also think that if you want to convince people of that, using the same misleading language tactics as the other side is not the way to go.
> These scanning systems get it wrong most of the time. [...] Irish law enforcement confirms this: only 20.3% of 4,192 automated reports actually contained illegal material.
Wrong most of the time that they report something. Technically correct, although a somewhat tricky formulation.
Literally next paragraph:
> Even with hypothetical 99% accuracy (which current systems don’t achieve), scanning billions of daily messages would generate millions of false accusations.
This is a different accuracy percentage: here the author means 99% of all messages, not only the reported ones, which the previous 20.3% referred to. Furthermore, these two paragraphs together sound very fishy: if current systems are even less accurate than the hypothetical one that would generate "millions of false accusations", presumably (?) they generate at least that many. But with the 20.3% true-positive fraction, that would mean hundreds of thousands of true accusations per day.
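A quick back-of-the-envelope run of those numbers shows why the precision ends up so low even at a flattering per-message accuracy. The volume and prevalence figures below are assumptions picked only to show the shape of the arithmetic, not estimates of real traffic:

    daily_messages = 1_000_000_000     # assumed scanned volume per day
    prevalence = 1 / 100_000           # assumed fraction of traffic that is actually illegal
    sensitivity = 0.99                 # flags 99% of truly illegal content
    false_positive_rate = 0.01         # wrongly flags 1% of innocent content

    illegal = daily_messages * prevalence
    innocent = daily_messages - illegal
    true_reports = illegal * sensitivity            # ~9,900
    false_reports = innocent * false_positive_rate  # ~10,000,000
    precision = true_reports / (true_reports + false_reports)
    print(f"{true_reports + false_reports:,.0f} reports/day, "
          f"{precision:.2%} of them correct")       # roughly 0.1% correct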
With Apple being able to forbid applications on the App Store, and Google now requiring developers to identify themselves and being able to block sideloading at any time, I don't see what choice is left if you want to bypass that privacy invasion.
I mean for the actual legitimate user. Pedophiles will still be able to use encrypted mail, Android phones that are not Google-certified and so free to sideload anything, or even just password-protected zips.
Most arguments I see against ChatControl sound like bullshit to me. How do we expect to convince anyone to go against ChatControl with those?
I feel unease when it comes to ChatControl; I don't want my devices to run proprietary, opaque algorithms on all my data. And it feels like it fundamentally has to be opaque: nobody can publish an open-source list of illegal material together with its hashes (precisely because it is illegal). That is why I don't want ChatControl: I would want someone to formally prove that it cannot be abused, just because of what it means. The classic example being: what happens if someone in power decides to use this system to track their opponents?
But most comments and most articles talk about anything but that, with honestly weird, unsupported claims:
> It's the end of encryption
How so? What appears on my screen is not encrypted and will never be encrypted, because I need to read it. We all decrypt our messages to read them, and we all write them unencrypted before we send them.
> It won't fight CSAM
Who are you kidding? Of course it will. It will not solve the problem entirely, but it will be pretty damn efficient at detecting CSAM when CSAM is present in the data being scanned.
> With ChatControl, every message gets automatically checked, assuming everyone is guilty until proven innocent and effectively reversing the presumption of innocence.
When you board a plane, you're searched. When you enter a concert hall, you're searched. Nobody would say "you should let me board the plane with whatever I put in my bag, because I'm presumed innocent".
> While your messages still get encrypted during transmission, the system defeats the purpose of end-to-end encryption by examining your content before it gets encrypted.
Before it gets encrypted, it is not encrypted. So the system is not breaking the encryption. If (and that's a big if) this system was open source, such that anyone could check what code it is running and prove that the system is not being abused, then it would be perfectly fine. The problem is that we cannot know what the system does. But that's a different point (and one of the only valid arguments against ChatControl).
> Proton point out this approach might be worse than encryption backdoors. Backdoors give authorities access to communications you share with others. This system examines everything on your device, whether you share it or not.
How is it worse? Backdoors give access to communications; this system (on paper) does not. This system is better, unless we admit that we can't easily audit what the system is doing exactly. Which, again, is the one valid argument against ChatControl.
> The regulation also pushes for mandatory age verification systems. No viable, privacy-respecting age verification technology currently exists. These systems would eliminate online anonymity, requiring users to prove their identity to access digital services.
This is plain wrong. There are ways to do age verification anonymously, period.
> Police resources would be overwhelmed investigating innocent families sharing vacation photos while real crimes go uninvestigated.
How to say you don't know how the police works without saying you don't know how the police works? Anyway, that's the problem of the police.
> Google’s algorithms flagged this legitimate medical consultation as potential abuse, permanently closed his account and refused all appeals.
The problem is the closing and refusing of appeals.
> The letter emphasizes that client-side scanning cannot distinguish between legal and illegal content without fundamentally breaking encryption and creating vulnerabilities that malicious actors can exploit.
Then explain how? How is it fundamentally breaking encryption and creating vulnerabilities? Stop using bad arguments. If you have actual reasons to go against ChatControl, talk about those. You won't win with the bullshit, invalid arguments.
> ChatControl catches only amateur criminals who directly attach problematic content to messages.
Yep, that's an argument in favour of ChatControl: it does catch some criminals. How many criminals are professionals? Do you want to make it legal to be an amateur criminal?
Don't get me wrong: I am against ChatControl. Because of one argument I believe to be valid: we fundamentally cannot know what the algorithm doing the scanning is doing, so those who control it could abuse it. Of all the discussions I have seen against ChatControl, I haven't seen another valid argument. But this one is enough.
Stop saying bullshit, start using the valid arguments. And maybe politicians will hear them.
> Don't get me wrong: I am against ChatControl. Because of one argument I believe to be valid: we fundamentally cannot know what the algorithm doing the scanning is doing, so those who control it could abuse it. Of all the discussions I have seen against ChatControl, I haven't seen another valid argument. But this one is enough.
It is not enough to know what the algorithm is doing. It also needs to be possible (for the average user as well) to stop it from doing reprehensible things. If a client-side scanning algorithm is actually searching for e.g. political content, it is possible to detect that via reverse engineering, but merely knowing it won't solve the problem; instead it will lead to self-censorship.
I was just thinking that if something like this ever does get through and become law, then creating open-source alternatives which do not obey these laws would be quite trivial. What would not be trivial would be deciding where to host the servers and source code, and how to actually get this software onto people's devices.
What country would be safe for hosting code that does this that people would also trust in general? Would this be hosted on the dark web or would someone actually be brave enough to host it on their private machines? Would there be DNS that could point to this?
Then how would you install the software? You'd need a way to side-load it, which means you'd want a way to sign it. Which means either adding a new root signing authority or being able to have an existing root authority sell you a signing certificate and not revoke it.
You kind of quickly end up in some weird dystopian cyberpunk setting thinking all of this through.
Good luck preventing people from loading up a web page that runs a pure JavaScript (or WebAssembly) implementation of common cryptography algorithms and lets people copy and paste each other encrypted messages.
Good luck convincing American tech to take on a liability like this. There's a reason big tech is moving to e2e encryption like Signal and it isn't user privacy. Telling governments to fuck off because you don't have the data limits liability.
Can anyone try to explain to me how this is not a strain of mind-reading and thought crime? I mean, sure, we're several decades away from the big event where society will adjudicate thought-crime, but this appears to be one of the first skirmishes.
ThoughtControl 2030: EU wants to scan all private thoughts and communications. Encryption as a concept prohibited except for corporations with security clearance and political connections.
If you are a smart kid in Europe, learn to vibecode XChaCha20 & Ed25519 encryption keys for you and your friends to chat with, so you can go tell your incompetent government to go fuck themselves.
First they came for the Lockdown skeptics
And I did not speak out
Because I was not a Lockdown skeptic
Then they came for the Social distancing skeptics
And I did not speak out
Because I was not a Social distancing skeptic
Then they came for the Face mask skeptics
And I did not speak out
Because I was not a Face mask skeptic
Then they came for the Vaccine skeptics
And I did not speak out
Because I was not a Vaccine Skeptic
Then they came for the Vaccine passport skeptics
And I did not speak out
Because I was not a Vaccine passport skeptic
Then they came for me
And there was no one left
To speak out for me
I'm absolutely convinced now that anti-war stances will be soon included in the scope of this client side scanning. Peaceniks beware, citizens should crave war and dying for their elites.
To me this is simply an act of terrorism. People who are behind those proposals should be charged and face trial.
There is no excuse for this and it is a stain on EU history for even letting this go so far.
Anyone proposing this should not only be sacked but also referred to a de-radicalisation / anti-terrorism programme in their country and forever banned from holding any kind of public sector office.
Why downvote? Because the terrorists wear suits, speak in committees, are mostly white, and there’s no blood on the floor (yet)? The method is different, but the aim is the same: intimidation and control of a population for political ends.
If terrorism is defined as using violence or threats to intimidate a population for political or ideological ends, then “Chat Control” qualifies in substance.
Violence doesn’t have to leave blood. Psychological and coercive violence is recognised in domestic law (see coercive control offences) and by the WHO. It causes measurable harm to bodies and minds.
The aim is intimidation. The whole purpose is to make people too scared to speak freely. That is intimidation of a population, by design.
It is ideological. The ideology is mass control - keeping people compliant by stripping them of private spaces to think, talk, and dissent.
The only reason it’s not “terrorism” on paper is because states write definitions that exempt themselves. But in plain terms, the act is indistinguishable in effect from terrorism: deliberate fear, coercion, and the destruction of free will.
You can argue legality if you like, but the substance matches the textbook definition.
Dear citizens of the EU:
If this gets pushed through, you will gradually lose control of your government much like how the people of the UK already lost control of theirs.
What are you going to do when the government's interests inevitably drift out of alignment with yours? Start a political movement? You will have the police knocking on your door for criticizing the establishment.
Start a revolution? You have no weapons. You can't even organize a resistance because all channels of communication are monitored.
You have neither the pen nor the sword. There is no longer an incentive for the government to serve you, and so it eventually won't.
No amount of protest will recover the freedom you once had. You're heading towards a society where everyone feels oppressed but no one can do anything about it.
>you will gradually lose control of your government much like how the people of the UK already lost control of theirs.
As a UK citizen, can you explain your reasoning here? We haven't implemented anything like the chat control proposal and while a few politicians have brought up similar ideas, there is a lot of pushback against it.
>You can't even organize a resistance because all channels of communication are monitored.
One of the awful things about this proposed legislation is that what I quoted you saying is not true. Software like PGP is easy to use, and criminals already do. The government has absolutely no possibility of breaking RSA the way things are now, and as such scanning all messages will do nothing other than prove more definitively that criminals are still beyond their gavel. In reality, the only individuals who will get spied on are regular people who don't open their terminal just to send a text; exactly the people who should not be spied on in the first place.
When the government realizes this invasive legislature is ineffective, they will probably crack down even harder. After all, what we are willing to accept from rulers has by the looks of it already increased dramatically. I wonder if it at some point it becomes illegal simply to posses encryption software on your personal devices, perhaps even possession of prime numbers that could theoretically be used in modern encryption. How far will the government go to take this illegal math from you?
> Software like PGP is easy to use
Criminalize encryption. Oh you're using cryptograhy? Well then clearly you are a child molesting, money laundering, drug trafficking terrorist. No need to actually decrypt anything when cryptography is incriminating evidence unto itself.
Computers are subversive. Cryptography alone can defeat police, judges, governments and militaries, and computers have democratized access to cryptography to the point even common citizens have it. They cannot tolerate it.
It's a politico-technological arms race. They make their silly laws. We make technologies that completely nullify those laws. They need to increase their overall tyranny just to maintain the exact same level of control they had before. The end result is either an uncontrollable, ungovernable, unpoliceable population, or a totalitarian state that surveils, monitors and controls everything. There is no middle ground.
We are rapidly advancing towards this totalitarianism, and we are eventually going to find out if the people have what it takes to resist and become ungovernable.
One day we will need government signatures to run software on "our" computers. All the free software in the world won't help if we can't run it. The only way to resist that is to somehow develop the means to fabricate our own chips at home.
If all of your messages can be read in plaintext, you're going to have to transfer your keys some other way, and it will be very detectable that you are sending encrypted messages, which will be next on the chopping block.
Both Apple and Android are teeing their infra up to support deleting apps they don't like. Windows is moving towards e2e attestation, and Mac is basically already there. Once that's all done, you just need to require that hardware manufacturers boot only into 'trusted' operating systems. No more Linux. No more unsigned execution. No more encryption.
It's already this dystopic, like any medium where people can talk freely gets eventually controlled by corrupt politicians etc.
Anyways, the control of speech isn't only in surveillance, it's ingrained deeply in culture, taboos, education, etc.
I have talked to religious people before, and they all exhibited a certain characteristic: you could talk about some things but you couldn't touch on other things, their mind won't accept it, so they bug out and start saying nonsense.
I've noticed the same thing with most people when it comes to certain subjects: you'd be talking to an educated person with a relatively high IQ and a mind capable of thinking critically in certain domains, yet once you point out something their mind has been trained to deem anti-cultural (like, for example, who controls what), they turn into Agent Smith and stop listening to reason.
Anyways, this is HN, so what I'm saying is that the control of the controllers is already absolute. It's been linearly increasing for years: they'd cause something, then tighten their control of us for "our safety", until one day we've had enough and some take out the guillotines and others the bald eagle, etc. It's been happening for millennia; if we as a species had been able to rebel against authority before s*it absolutely hit the fan, history would've been a lot cleaner.
> Start a revolution? You have no weapons.
LOL. People nowadays don't start revolutions, but not because of weapons or the lack thereof. It's because they're thoroughly entertained and fed; even the entire political circus is a sort of morbid reality show: people tune in to the news to shake their head in disgust at today's latest antics, and will do so tomorrow, because it's all panem et circenses for grown-ups.
The Internet has become the greatest instrument of mass control ever created in the history of the world. It's done. As long as people have their Doordash and Netflix, and are too busy working or scrolling instead of thinking deep thoughts, and reading anarchist philosophy, the kings have nothing to fear.
Also, no need to single out the EU. The entire government-as-reality-TV thing is well and truly an American creation, and your three-letter agencies don't even have to pass any laws to collect information about citizens. We're all in the same shit, my brother/sister.
You are exactly right. But most people will call you crazy and say that you are a tyrant against "democracy" or "rights".
> and reading anarchist philosophy
That's literally how we got here. People got a taste of unmitigated unprecedented freedom online for the last three decades, and found it so gross that they allowed things to swing the other way.
Even one decade ago, the threat of SOPA/PIPA rallied the internet successfully. Just over a decade later, we're at the point of allowing age verification, for morality's sake, with hardly a peep. The cypherpunks are losing, hard, and honestly, deserve failure for how well their utopia turned out.
There's no utopia. The value of unmitigated speech is in replacing unmitigated violence.
The people you mention, whoever they are, are grossed out by human nature.
What exactly did those people taste that got them so upset, and who exactly are those "people"? Last time I checked, these laws are pushed through as covertly and sneakily as possible, and no "people" asked for them. I can't recall any demonstrations with protesters asking to violate their privacy to keep them safe from those evil internet trolls who want to have sexual intercourse with their relatives.
You're trying to frame the classic authoritarian power grab and desire to fully control the plebs as a push from society. This doesn't sound convincing.
> You're trying to frame the classic authoritarian power grab
Half of US states now have age verification for pornography; three will be requiring age verification to even download apps soon. There is indeed a push from society to get the internet under control, even if the EU is not necessarily connected the same way.
This is a huge, unprecedented reversal of opinion over the last decade that has almost completely gone over HN's head. The EFF, TechDirt, HN, Reddit view of the world has been tried, found wanting, and is being rejected. The EFF which once rallied the internet against SOPA/PIPA... currently is yelling into a void. Nobody believes in a free internet anymore.
> Nobody believes in a free internet anymore.
Civil liberties, like elections and liberal principles in general, are unfortunately only popular when the right side (coincidentally one's own) is winning
You keep saying things that are completely unsubstantiated as though they were fact. "Nobody believes..." _all people_ this, _complete failure_ that...
You're either a shill, an ideologue or arguing dishonestly.
All three are bad equally.
Don't worry; HN makes such statements all the time, you can't accuse me of not grasping the format. On that note, not once did I use the words "complete failure" or "all people" despite your quotation in this thread, so please don't argue dishonestly yourself.
I cited a reality: We went from SOPA/PIPA over copyright, to no question about age verification on morality grounds. It shows a trend towards zero interest in free and open internet activism. Such a trend indicates something is severely wrong, and the idea of an open internet has become disconnected from popular belief, internationally, as something to strive for. Prove me wrong.
How we literally "got here" was Section 230. You can easily stifle free speech by holding Facebook and X accountable for every single post ever made on their platforms. But that would capsize the American investment economy, so we have to protect them just a little bit. It creates a perverse, bipartisan incentive to export the most reprehensible opinions that still qualify as legal.
European citizens (and soon, American ones too) are discovering that they never held the cards. When you ask your OEMs, cloud providers and DNS resolver whose side they're really on, it's not yours. You, the customer, hold no guillotine over their head.
> Start a revolution? You have no weapons. You can't even organize a resistance because all channels of communication are monitored.
Unlike which country? The US, I presume? I see very much a lack of any revolutions in the US, and most of the resistance in the past few decades was carried out by people with no weapons.
I'd say most revolution-like movements of any kind in the US since the Civil War happened without weapons.
Even further, those who have traditionally been most vocal about second amendment rights are currently the biggest cheerleaders for the current authoritarian trend. Quite the plot twist.
Dear citizens of the US:
Please stop funding, allying with and protecting the manufacturers of surveillance tools. Stop exporting Palantir products and importing privacy-destroying devices from businesses like Greyshift and Cellebrite. Insist that the US government stop shielding hackers-for-hire like NSO Group who indiscriminately lease their products for discriminatory and illegal purposes. Stop defending "OEM" control that we have all known is a stand-in for federal steering since the Snowden leaks. Stop marketing E2EE while backdooring server and client hardware for "emergency" purposes.
Do that, and you'll never be accused of hypocrisy again. Signed, a US citizen.
> you will gradually lose control of your government
That happened the moment European countries surrendered their sovereignty to the EU.
Which of course never happened, as each member country retains full sovereignty in every possible way you can think of, which is actually fully enshrined in the way EU works.
> Which of course never happened, as each member country retains full sovereignty in every possible way you can think of, which is actually fully enshrined in the way EU works.
Which of course is false.
> The principle was derived from an interpretation of the European Court of Justice, which ruled that European law has priority over any contravening national law, including the constitution of a member state itself.
https://en.wikipedia.org/wiki/Primacy_of_European_Union_law
And if you read literally one paragraph from the bit that you quoted:
"The majority of national courts have generally recognized and accepted this principle, except for the part where European law outranks a member state's constitution. As a result, national constitutional courts have also reserved the right to review the conformity of EU law with national constitutional law"
And guess why and how they are able to do that - that's right, by retaining full sovereignty over their own justice systems. Even obeying rulings of the ECHR is purely a matter of courtesy more than anything, as neither the EU nor the ECHR has any enforcement mechanism beyond withholding funding, as many EU member states have proven time and time and time again.
I'm looking forward to seeing this in practice when Chat Control passes.
From the article, the current flavor of "threat" this is being positioned to fight is CSAM.
Does anyone believe that predators commit those heinous offenses because of the availability of encrypted channels to distribute those products of their crimes? I sure don't. The materials exist because of predators' access to children, which these surveillance measures won't solve.
Best case scenario (and this is wildly optimistic) the offenders won't be able to find any 'safe' channels to distribute their materials to each other. The authorities really think every predator will just give up and stop abusing just because of that? What a joke.
More likely of course, those criminals will just use decentralized tools that can't be suppressed or monitored, even as simple as plain old GPG and email. Therefore nothing of value will be gained from removing all privacy from all communication.
This has nothing to do with CSAM, and that framing is there on purpose, to distract people so the politicians can say "xp84 supports child pornography!"
It has everything to do with censorship and complete control over people’s ability to communicate. Politicians hate free speech and they want to control their citizens completely including their thoughts. This is true evil.
But politicians are - in general - neither evil, nor do they have any real incentive to ”control citizens’ thoughts”. It doesn’t make sense. They can be gullible. Non-Technical. Owned by lobbyists. Under pressure to deliver on the apparent problem of the day (csam, terror, whatever). But I don’t think there is a general crusade against privacy. That’s why I think it’s so infuriating: I’m sure it’s not even deliberately dismantling privacy. They’re doing it blindly.
This is pushed by parties that have a good track record of preserving integrity. That’s why it’s so surprising.
If they are "just doing their job" why are they asking for an exemption that would apply only to them? No, they firmly believe that safety should be gained at the cost of privacy.
I could imagine that war orders may be interpreted as "illegal" and therefore reported. Which may not be desirable?
So it's ok if the database containing my nudes leaks, but not if it contains state secrets? I feel really protected!
> But politicians are - in general - neither evil, nor do they have any real incentive to ”control citizens’ thoughts”.
As someone coming from an authoritarian state, this is such an alien line of reasoning to me. By definition, those in power want more power. The more control over the people you have, the more power you get. Ergo, you always want more control.
It's easy to overlook this if you've spent your entire life in a democratic country, as democracies have power dynamics that obscure this goal, making it less of a priority for politicians. For instance, attempting to seize too much power can backfire, giving political opponents leverage against you. However, the closer a system drifts toward autocracy and the fewer constraints on power there are, the more achievable this goal becomes and the more likely politicians are to pursue it.
Oh, and also politics selects for psychopaths who are known for their desire for control.
> I’m sure it’s not even deliberately dismantling privacy.
But it is not even dismantling privacy. ChatControl would run client-side and only report what's deemed illegal. Almost all communications are legal, and almost all of the legal communications wouldn't be reported to anyone at all. They would stay private.
The problem I see is that the "client-side scanner" has to be opaque to some extent: it's fundamentally impossible to have an open source list of illegal material without sharing the illegal material itself. Meaning that whoever controls that list can abuse it. E.g. by making the scanner report political opponents.
This is a real risk, and the reason I am against ChatControl.
But it isn't dismantling privacy per se.
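To illustrate why that list can't be audited, here is a deliberately simplified toy: assume the scanner ships nothing but opaque digests (real proposals involve perceptual hashes or ML classifiers rather than exact SHA-256, so treat this as a caricature, not the actual mechanism). Whoever controls the set decides what gets flagged; an outside auditor sees only hex strings:

    # Toy illustration only: real client-side scanners use perceptual hashes or
    # classifiers, not exact SHA-256, but the auditing problem is the same.
    import hashlib

    OPAQUE_BLOCKLIST = {
        # Hypothetical entries pushed by the operator; nothing here reveals what they match.
        "3f79bb7b435b05321651daefd374cdc681dc06faa65e374e38337b88ca046dea",
    }

    def scan_before_send(payload):
        # Return True if the payload's digest appears on the unauditable list.
        return hashlib.sha256(payload).hexdigest() in OPAQUE_BLOCKLIST

    print(scan_before_send(b"a perfectly ordinary message"))  # almost certainly False

Swapping one digest in that set for the hash of a leaked document or a protest flyer changes nothing visible to anyone inspecting the client.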
Disclaimer: I am against ChatControl.
> Does anyone believe that predators commit those heinous offenses because of the availability of encrypted channels to distribute those products of their crimes?
Who says that? I don't think they say that.
> The authorities really think every predator will just give up and stop abusing just because of that?
Nope, they think they will be able to arrest more predators.
> More likely of course, those criminals will just use [...]
You'd be surprised how many criminals are technically illiterate and just use whatever is the default.
The thing that is crazy to me is that they choose to go after Signal of all things. Certainly there would be higher priority targets than a messaging app that has no social networking features to speak of, if child predators were really the target here.
This is nonsense. Anyone who has the smallest clue would use Signal for anything sensitive. Of course people would use Signal to talk about illegal stuff.
I am against ChatControl. But I am amazed by all the bullshit arguments that people find to criticise ChatControl.
If you have more control, obviously it's easier to track criminals. That's not the question at all. The question is: what is the cost to society? A few decades ago, all communications were unencrypted and people were fine. Why would it be different now? That's the question you need to answer.
> A few decades ago, all communications were unencrypted and people were fine.
A few decades ago, the user base of whatever communication tools were available was about 99% smaller than it is now. And governments were so technically limited that, with the tech they had, they could not even read those unencrypted messages at scale.
Snowden was more than a decade ago. The NSA was recording everything.
So ChatControl means that e.g. Signal would be obligated to automatically scan pictures and messages sent for CSAM. This is beyond encryption. And if they were to actually do that, it would be nonsensical for people spreading this material to use it, as they would immediately be caught, so they would just use other tools.
But people are talking about both: the ridiculousness of the premise that this would help combat the problem, and additionally, of course, the cost to privacy.
It's beyond encryption. Teenagers sending each other pictures could get flagged by AI, etc. Any of your messages and images could potentially be falsely flagged.
So what? If predators cannot talk to children over SnapChat, that's a win, wouldn't you say?
The only valid argument I see against ChatControl is that, fundamentally, you cannot know what it is reporting. It's not as if there could be an open-source list of illegal material together with the hashes, right?
If you cannot audit what is being reported (with whatever means necessary to make sure it is doing what it should be doing), then whoever controls it could abuse it.
That's the problem. That's the reason not to implement it. But it's completely overwhelmed by the flood of invalid arguments.
I think that a world where underage children can't access TikTok and Snapchat is an acceptable cost to keep our right to privacy.
> The only valid argument I see against ChatControl is that, fundamentally, you cannot know what it is reporting. It's not as if there could be an open-source list of illegal material together with the hashes, right?
By definition, they must state what is actually illegal, lest there be hidden laws with hidden punishments.
And those lists of 'illegal' need to be publicly disclosed, so we are aware.
At least in the USA, a naked picture of someone who is 17y364d old is 'child porn', but that extra day makes it 'barely legal'. And yet, most USA jurisdictions say that 16-year-olds can have sex. Just that pictures are EVIL even if you take them yourself.
Again, however, I tend to agree more with Stallman that CSAM or child porn possession should either be legal or have a mens rea requirement attached, rather than being a strict-liability possession offence. It's proof of a crime, and shouldn't in and of itself be a crime.
But because a picture is a crime, we get these horrific laws.
> The only valid argument
Really? The only one?
Really, yes. I am against ChatControl myself, and I am genuinely struggling to find credible arguments against it.
It was unencrypted and “it was fine“ because it was technically nearly impossible to store and process all communications. Now, one small server cluster can analyse all communication channels in a country in real time. The only thing stopping it is the encryption.
Ok, but with ChatControl, you still send your messages encrypted. They are scanned on your device.
So all communications aren't stored outside of your device, right?
All communications were unencrypted because encrypting them would have incurred unduly burdensome processing. Nowadays computers can encrypt and decrypt on the fly for virtually free.
Sure. Still people considered themselves free and living in democracies. Why wouldn't it be the case today?
People using online communication systems were a niche, not the norm, and most people didn't have the tools and knowledge to access someone else's digital communication.
It is not the case anymore.
You're all assuming that predators who are already deliberately using encrypted apps to share CSAM won't just move to something else where there is encryption – which will always be possible unless the EU finds a way to ban maths or reverts back to the pre-digital age.
This might catch the odd moron sharing stuff on Facebook or on their phone, but I doubt it will stop the average offender who is already going out of their way to use encrypted apps/services.
But okay, great, at least you catch the morons, I guess, but at what cost? Here in the UK it's pretty common to be arrested for tweets as it is. There's no doubt in my mind this will be used to catch individuals committing speech crimes who are currently getting away with it because they share their opinions behind closed doors.
> but I doubt it will stop the average offender
I strongly believe it will catch the average offender. The average human doesn't have a clue about cryptography.
It won't catch all of them, of course. My point is that it is invalid to say that it won't catch anyone.
> but at what cost?
EXACTLY! The problem is that whoever controls the list of illegal material can abuse it. We fundamentally cannot audit the list because the material on this list is highly illegal. There is a risk of abuse.
Anyone using a mobile device for CSAM is in prison by now.
Predators use mainstream social media to enter in contact with children.
Most victims of child abuse know their aggressor because they are part of their social circle: dad, mother, uncle, brother, sports coach, or a friend of the parents/siblings.
They better ban password protected zip files too!
They will when they can.
Absolutely, evidence of abuse is secondary to the actual abuse.
Plus, the fact that you could use AI/LLMs/etc. to generate nefarious content that is hard to tell is fake tells you the abuse isn't even what they are interested in.
Best case scenario would be, lots of children will be saved from abuse because the magic software somehow discovers that. I kind of doubt it though.
No, you don’t get it. Hosting or possessing CSAM has criminal penalties even if no children were involved. For example AI generated imagery.
In fact, even if zero children are ever trafficked or abused going forward, and pedophiles only use old photos of children from 30 years ago, merely having these images is still an issue.
Conversely, the vast majority of sexual abuse of minors doesn’t involve images and goes unreported. "Considerable evidence exists to show that at least 20% of American women and 5% to 10% of American men experienced some form of sexual abuse as children" (Finkelhor, 1994). "Most sexual abuse is committed by men (90%) and by persons known to the child (70% to 90%), with family members constituting one-third to one-half of the perpetrators against girls and 10% to 20% of the perpetrators against boys" (Finkelhor, 1994).
In short - if they wanted to reduce child abuse, scanning everyone’s communications for CSAM would not be the most straightforward way to go about it.
"No, you don’t get it."
Did you get my last sentence?
"In short - if they wanted to reduce child abuse, scanning everyone’s communications for CSAM would not be the most straightforward way to go about it."
What would be the most straightforward way? Install a camera in every home?
Yes, abuse is usually more to be found inside families. And the solution is kind of complicated, involving social workers, phone numbers victims can call, safe houses for mothers with children to flee into, police officers with sensitive training who care, teachers who are not burned out so they actually pay attention to troubled kids ...
> if they wanted to reduce child abuse, scanning everyone’s communications for CSAM would not be the most straightforward way to go about it.
* First, this is not what politicians do. What they want is to look like they are fighting it.
* Second, what is your more straightforward way to fight CSAM? Asking for a backdoor is pretty straightforward, I find. I would rather say that fighting CSAM is more difficult than that.
How do they know it's unreported if it's unreported? They mean unreported to police but reported in scientific self-report surveys?
>The authorities really think every predator will just give up and stop abusing just because of that? What a joke.
Yes, the framing is disingenuous, but so is yours. You're seriously suggesting that any policy that doesn't 100% eliminate a problem is a joke?
If the cost of the proposal is "let's throw democracy under the bus" as it is in this case, it better be damn close to 100% effective to be worth it!
I have a hard time imagining this will be more than 10% effective.
This proposal is a joke
A few decades ago, all communications were unencrypted. Would you say that democracies did not exist then?
This is completely untrue! Important communications have always been enciphered, since language was created I'd wager, whether that cipher is specific terms ("grog" means attack that person in 10 seconds!) or a book cipher, e.g. the first letter of a Bible verse, then the second letter of the next verse, etc. Humans have been encrypting communication since communication was possible.
It has only recently become possible to dragnet communications en masse, store them, and analyze them. The past decades have brought new threats to privacy and democracy through breaking encryption at the state scale.
> Humans have been encrypting communication since communication was possible.
Were most people encrypting their handwritten letters? Were most people encrypting their messages before sending them by SMS or with WhatsApp? Really?
No, because there was an expectation of privacy. That expectation is no longer there.
Not most but some.
At least where I'm from, there are pretty strong laws against reading other people's snail mail. To this day, any law enforcement officer who tries to open people's snail mail will be laughed out of the courtroom, and quite possibly out of their job too!
Today nobody uses snail mail. This proposal is the equivalent of proposing to read everyone's private letters back in the day.
Technical details are technical details
A few decades ago, few communications were tracked. When everything is tracked (as it is now), the only way to have privacy is with encryption.
Snowden said otherwise, more than a decade ago.
Which part are you disputing?
Well, what is "the problem"? Is it children being abused, or is it the distribution of CSAM?
And if you say both - how would you rate the relative severity of the two problems? Specifically, if you had to pick between preventing the rape of a child, and preventing N acts of CSAM distribution, how big would N have to be to make it worth choosing the latter?
I don't think they care what N is, they are just scapegoating a vile group they know will have no defenders, and they can use it to silence the critics by associating them with that group.
Bingo.
Today it's the pedophiles and the "15-17-philes" (a made-up category, since those same adolescents are also tried as adults when convenient).
Tomorrow, it's the adult sex workers.
Then it's whichever fringe group's topics are on the outs with the majority.
Then they come for you, and nobody is able to speak up because they banned protests.
... To paraphrase Martin Niemoller.
What do you think it would be for you?
What's worse for you? Being raped as a child. Or, having people sexually gratify themselves looking at images of you being abused; using those images to groom other children, or to trade and encourage the rape of other children?
You might as well ask someone which eye they prefer to have gouged out with a blunt screw.
Let's do both: try to stop child sexual abuse and try to stop images of abused children being used by abusers.
> Well, what is "the problem"? Is it children being abused, or is it the distribution of CSAM?
It seems obvious that it is entirely the former and not at all the latter. In other words, N is positive infinity. Am I missing something?
I only care about kids being hurt. And I think this view is close to consensus.
Ask anyone you know who has been sexually assaulted or raped what they think of the idea of pictures or recordings of that being both kept by the perpetrator and widely disseminated. I think you'll find very few who'd say that's totally fine. But given that there can be no CSAM without child abuse, the direct physical abuse is clearly the primary problem.
> You're seriously suggesting that any policy that doesn't 100% eliminate a problem is a joke?
I think a more charitable reading is that any policy that doesn't 100% _target_ a problem is a joke. This policy doesn't have a plausible way that it will protect children from being victimized, so I think it's reasonable to remove the "think of the children" cloak it's wearing and assess it on the merits of whether encryption is beneficial for the social discourse of a society.
> This policy doesn't have a plausible way that it will protect children from being victimized
Of course it does. "It will detect and report messages from predators to children, therefore preventing the child from getting to the point where they send revealing pictures or meet the predator in person." Done.
Well, maybe the word "plausible" is doing too much work in my statement.
Most abuse happens from people known to the child, and of that portion, most are family members. It seems like there is sufficient opportunity in in-person comms to route around this limitation.
Moreover, even the communications that do happen online can still easily happen through encrypted media; presumably the perpetrators will simply move to other ways of communicating. And kids, at least kids over 10 or so, don't seem like a demographic particularly likely to follow this law anyhow.
There's another nuance worth considering: by and large, parents _want_ their kids to have access to encrypted communications. I'll happily assist my kiddo in maintaining good opsec - that's much more important to me than some silly and uninformed policy decision being made far away by people I've never met.
https://web.archive.org/web/20210522003136/https://blog.nucy...
So, the kids are still going to be where the encrypted comms are. I still think it's reasonable to say that the protections offered to kids by criminalizing encryption are implausible.
> Most abuse happens from people known to the child
Sure, but it means that at least some happen from people unknown to the child. If ChatControl doesn't cause any problems but helps prevent those abuses, then it's worth it. The question is: what are the problems caused by ChatControl?
Saying "only a minority of children get abused this way, so it's not worth it" won't go far, IMO. It's not a valid argument against ChatControl in itself.
> presumably the perpetrators will simply move to other ways of communicating.
The perpetrators have to contact kids over apps that the kids use. Like Snapchat or TikTok. It's not like the kids will routinely install a weird app to talk to weird people...
> parents _want_ their kids to have access to encrypted communications.
But ChatControl doesn't remove the encryption! It scans everything locally, before it gets encrypted and sent.
> by criminalizing encryption
It's not criminalizing encryption: it's forcing a local scan on your device. Just like there are already scans happening on clouds for non-E2EE data.
Don't get me wrong: I am against ChatControl. For me the problem is that I see a potential for abuse with the "list" (whether it's a list or a sum of weights) of illegal material. This list cannot be made public (because it's highly illegal material), so it's hard to audit. So whoever has control over it can abuse it, e.g. to find political opponents. That's my problem with ChatControl.
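To pin down the ordering being described just above (scan locally, then encrypt and send), here is a minimal, purely illustrative pipeline; the function names, the reporting hook, and the use of an exact-hash list are all assumptions made for the sketch, not the actual proposal:

    # Illustrative only: "scan on the device, then encrypt" expressed as a tiny pipeline.
    import hashlib
    from cryptography.fernet import Fernet

    OPAQUE_BLOCKLIST = set()  # hypothetical operator-supplied digests; empty here

    def client_side_scan(payload):
        # Runs on the sender's device, before any encryption happens.
        return hashlib.sha256(payload).hexdigest() in OPAQUE_BLOCKLIST

    def report_to_authority(payload):
        pass  # placeholder: what exactly gets reported is the contested part

    def send(payload, key):
        if client_side_scan(payload):
            report_to_authority(payload)     # flagged content never reaches the wire
            return None
        return Fernet(key).encrypt(payload)  # everything else is still encrypted in transit

    key = Fernet.generate_key()
    print(send(b"an ordinary message", key) is not None)  # True: encrypted and sent as usual

The encryption itself survives, which is the commenter's point; the dispute is entirely about who controls the scan and the list it consults.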
Is text-only CSAM even a thing?
It is ! https://en.m.wikipedia.org/wiki/ASCII_porn
That's not a bug, that's a feature. They'll say that current surveillance tools are insufficient, and demand more.
> Best case scenario (and this is wildly optimistic) the offenders won't be able to find any 'safe' channels to distribute their materials to each other.
The theory is based on the documented fact that most crime is poorly thought through with terrible operational security. 41% is straight up opportunistic, spur of the moment, zero planning.
It won't stop technologically savvy predators who plan things carefully; but that is statistically probably only a few percent of predators; so yes, it's probably pretty darn effective. There is no shortage of laws that are less effective that you probably don't want repealed - like how 40% of murderers and 75% of rapists get away with it. Sleep well tonight.
Exactly. Econ 101: why do consumption taxes work at all? By increasing the amount of pain associated with purchasing a particular indulgent product, you decrease the consumption of that product on the margin. When you increase the price of cigarettes by 20%, cigarette smoking in a society decreases. But for the most addicted, no consumption tax will probably act as a deterrent.
Some individuals will find a way to distribute and consume child pornography no matter the cost. But other individuals, those consuming or distributing on the margin, will stop if doing so becomes laborious enough. I.e., imagine the individual who doesn't want to be consuming it, who knows they shouldn't - this type of deterrent may be the breaking point that gets them to stop altogether. And if you reduce the amount of consumption or production by any measure, you decrease a hell of a lot of suffering.
But anyway, the goal of this legislation is not to drive the level of distribution to 0. The goal of policymakers could be seen charitably as an attempt to curtail consumption, because any reduction in consumption is a good thing.
Wait. Are you calling child pornography an ”indulgent product?”
Was referring to tobacco, alcohol, soft drinks etc
Exactly my point, but also, to add to it:
Let's say you're actually texting in a group. Even if you use perfect operational security, odds are terrible that all members of your group will perfectly uphold the same level of security every time they share their content.
One is going to slip up. He's going to get arrested. And he's going to turn the whole group in to reduce his sentence. Everyone else meanwhile has their operational security become proof of intent, proof of deliberation, proof of trying to evade authorities. They thought they were clever with the encrypted ZIP files, but the judge and jury are going to be merciless. I don't think most authorities have a problem with that.
I think the challenge for society here is not to simply reject attempts like this, but how to prevent them from being pushed over and over until a specific context allows it to be approved.
The accepted solution is to have a constitution that says otherwise.
Which is a bit complicated here, as the EU has no real constitution and this 'law' (really a regulation) is a blatant violation of the constitutions of countries that did choose to establish secrecy of correspondence.
> The accepted solution is to have a constitution that says otherwise
And the willingness and ability to enforce it. The current iteration of ChatControl is pushed by Denmark, which is at present the President of the Council of the European Union. The Danish Constitution itself enshrines the right to privacy of communication [0], but this is not stopping Denmark from wanting to ratify ChatControl anyway.
[0]: https://danskelove.dk/grundloven/72
Yes but unfortunately courts are mostly reactive, not proactive
Sometimes there are mechanisms to block unconstitutional (or otherwise non-compliant) laws from passing, but they're limited
Not sure how that would apply at the EU level or even at the Danish level
> Yes but unfortunately courts are mostly reactive, not proactive
I think it’s always the case, no? Unless the unconstitutional law is approved, there is nothing to dispute in court.
In the Netherlands we have the "Eerste Kamer" (first chamber, also called the Senate), which is responsible for verifying that proposed laws are in accordance with our "constitution". They are elected out of band with the normal government, which should ensure that no single party is able to steamroll laws through both chambers.
In theory the "Bundespräsident" in Germany is supposed to only ratify laws that are in accordance with the constitution, but I don't think it happens that he refuses to do this.
Correct. Imagine the number of challenges in court based on mere rumor of a law.
> but this is not stopping Denmark from wanting to ratify ChatControl anyway.
What's the TL;DR of the motivation behind this? Is it just politicians playing to their base (think of the children), or corporate lobbying, or religion, etc.?
Seems to me that the negatives of passing something like this are super obvious and dystopian.
I suspect it's a mix of many Danish politicians' own authoritarian tendencies/ambitions and corporate lobbying, though I have no proof of the latter when it comes to ChatControl specifically.
Generally speaking, there is a lot of dark money in Danish politics, and the EU has repeatedly flagged Denmark as a country lacking in transparency with regards to corporate lobbying: https://www.altinget.dk/artikel/eu-kritik-af-danmark-puster-...
Generally speaking, the Danish government also tends to behave in authoritarian ways. E.g., Denmark has wilfully violated EU regulations on data retention for many, many years. In 2021, a Danish court ruled that the Danish Ministry of Justice could continue its mass surveillance practices even though they were (and still are) illegal under EU law: https://www.information.dk/indland/2021/06/justitsministerie...
Currently Denmark is also trying to leverage its position as the President of the Council of the EU to legalise, on a EU-wide level, the form of data retention that Denmark has been illegally practising: https://ec.europa.eu/info/law/better-regulation/have-your-sa...
Interesting. I am no expert on the politics of Denmark, so my question is: is this push universal across political parties, or is it a feature of a specific political bloc that has ruled for the past X years and consistently worked in this direction?
There was another thread on specifically our minister of justice, with comments that touch on the historical aspect: https://news.ycombinator.com/item?id=45248802
Generalized, this looks to me like a question about why humans sometimes get hell-bent about some idea and become blind to the side effects and ignorant when it comes to risk management.
Sometimes it could be malice or personal gains. Sometimes, I think, it could be just a strong bias towards some idea that causes a mental blindness. Such blindness can happen to anyone, at any level of power (or lack thereof), politicians are not unique in this - the only difference is the scope of impact due to the power they have. And we aren't particularly filtering them against such behavior - on the contrary, I feel that many people want politicians to have an agenda and even cheer when they put their agenda above the actual reality, any consequences be damned.
If I was leading another western nation I would be looking at the right wing takeover of the US government in terror.
For sure. Does anyone want Trump to know everything you write? Erdogan if Turkey ever does enter the EU?
EU has the Charter of Fundamental rights which is a part of the Treaty of Lisbon which is the constitutional basis of EU: https://en.m.wikipedia.org/wiki/Charter_of_Fundamental_Right...
In the charter, the protection of personal data and privacy is a recognized right. So chat control is also probably against the EU law.
Both the right to privacy and the right to protection of personal data appear to have pretty big exemptions for government.
The right to private communications was modified by the ECHR to give an exemption for prevention of crime/protection of morals/etc.[1] and the right to protection of personal data exempts any legitimate basis laid down by law[2].
I imagine they'd be able to figure out some form of Chat Control that passed legal muster. Perhaps a reduced version of Chat Control, say, demanding secret key escrow, but only demanding data access/scans of those suspected of a crime rather than everyone.
Legal rulings also seem to indicate that general scanning could be permitted if there was a serious threat to national security, so once a system to allow breaking encryption and scanning is in place, then it could be extended to what they want with the right excuse.
[1] https://fra.europa.eu/en/eu-charter/article/7-respect-privat...
[2] https://fra.europa.eu/en/eu-charter/article/8-protection-per...
> I imagine they'd be able to figure out some form of Chat Control that passed legal muster. Perhaps a reduced version of Chat Control, say, demanding secret key escrow, but only demanding data access/scans of those suspected of a crime rather than everyone.
Isn't that pretty much exactly how it is done in Russia, which was ruled by the ECHR to be illegal[0]?
https://hudoc.echr.coe.int/fre#{%22itemid%22:[%22001-230854%...
I'm not familiar with EU law, but reading Title II article 7 and 8 makes me feel this could be an optimistic interpretation of what the Treaty of Lisbon guarantees. I'm sure the supporters of chat control would love to argue something like "ChatControl respects the private communications of an individual by protecting how the data is processed to ensure only the legitimate basis of processing the data is incurred by the law" in court.
I would hope the EU courts would disagree, but I'm not sure if anyone can say until it's tested directly.
Even the EU council's legal service thinks the law as-proposed is probably incompatible with Article 7 and 8:
> The CLS concludes that, in the light of the case law of the Court of Justice at this stage, the regime of the detection order, as currently provided for by the proposed Regulation with regard to interpersonal communications, constitutes a particularly serious limitation to the rights to privacy and personal data protection enshrined in Article 7 and 8 of the Charter.
https://data.consilium.europa.eu/doc/document/ST-8787-2023-I...
I think there were variants of the ChatControl proposal which were clearly problematic, but the later variations of the proposal have tried to toe the line since then. This report speaks to the 2022-era proposal.
As shown on the other side of the Atlantic, that is worthless when no one upholds the constitution.
I think of a constitution as a contract between the citizens, the state, and the judiciary(?).
Like, the constitution both defines the rights of citizens and the limits of those rights, and the same goes for the state.
I feel as if the creators of constitutions thought of them as a set of checks and balances...
Just as the state can punish a citizen who violates something written in the constitution,
in the same manner, I believe the constitution's authors thought that if the state violates some constitutional right of a citizen, then citizens can point that out and (punish?) the state, as the legitimacy of the state comes from that very constitution which it might be breaking...
I concur (fancy word for believe which I wanted to share lol) you are talking about america. The thing is, revolutions are often messy and so many things are happening in america that I think people are just overwhelmed and have even forgotten all the stuff happening in the past... Like tariffs were a huge thing, then the epstein news, then this I think autism thing by trump.
Like, the amount of political discourse happening is less and idk, oh shit, just remembered the uh person deporting thing which was illegal which was done anyway
If these things happened in isolation, they would each have huge actions against the govt. but they are happening back to back and so everyone's just kinda silent I think, frankly I believe overwhelmed.
I believe that just as in nepal, in america everyone is whining on social media but nobody's taking action. Nepal blocked social media and so people in nepal were kinda forced to take action irl and it worked out kinda nice in the end tbh
So maybe it's social media which is enabling this thing.... which is funny to me as I am doing the same thing right now lol
All for sweet internet points tho.
A large portion of the population either does not believe or does not mind the violations of our constitution to achieve their desired outcomes. As an American, it came as a surprise to me that we do not, in fact, have broadly shared values about our system of governance. This year has been a devastating blow to my confidence in our democracy and the ability of people to govern themselves generally.
> This year has been a devastating blow to my confidence in our democracy and the ability of people to govern themselves generally.
The latter has been on my mind for quite some time.
The logical conclusion of "people can't govern themselves generally" kind of gestures at religion as a solution - after all, if people cannot govern themselves, why not rely on a higher power to manage them?
Of course, the problem with that point of view is that from the atheistic perspective, there is no higher power, and from the agnostic perspective, whatever higher power there is is inscrutable and beyond our ken.
This then leads me to the conclusion that religion is ultimately a creation of men, and is thus prone to the same power-corrupting vices as any other institution created by men.
Except that leaves no real solution to the problem of the governance of people. And it's a quandary I see no realistic chance of escape from.
> As an American, it came as a surprise to me that we do not, in fact, have broadly shared values about our system of governance.
It shouldn't, America is two very distinct nations. The shape and nature of those nations vary wildly in classical Baudrilliardian sidewinding progression, but it's rooted in the very early history of British North America. Two distinct primogenitor colonies and societies, Jamestown and Plymouth. Founded for different reasons, in different contexts, by different people. Understanding the disparity is key to understanding a great deal about America. This divide has always persisted. Jefferson was of Tidewater, Hamilton was of Yankeedom. Democrats vs Whigs. Dixie vs Yankeedom. This split persists in history, and is much the reason why America is ostensibly a two party system. Even if the regional divide is not as hard and fast as it once was, even if the matters in which they differ change radically over time, the divide itself will always persist. It's wrapped up in the pre-revolutionary context the country was founded on. America will always be two countries in a trenchcoat, two echoes of wildly different cultures set against each other for dominance. You should always be keen to remember that. The union isn't of 13 distinct colonies, but two distinct cultures always in tension. It's a fundamental structure within our larger cultural blueprint.
Of course I understood there were vast cultural and political differences causing tension. I just also believed that we had a shared system of fundamental values enshrined in the constitution and when push came to shove, we would all rally behind it. That's what I thought American patriotism meant; I genuinely thought I could count on Red voters to rabidly defend the constitution.
The thing I find most interesting about your reply is how it demonstrates that we live in wildly subjective realities.
Specifically, how? GP's claims can be factually substantiated. Pick whichever you claim can't.
He isn't calling the claim subjective, but underlining that what the claim posits entails that we live in subjective realities.
> I concur (fancy word for believe which I wanted to share lol) you are talking about america.
Just a heads up but concur means "agree", not "believe"
I assent to that statement
You are most definitely not right. The EU Charter of Fundamental Rights is an agreement that is legally binding. The institutions that are supposed to uphold the charter are the CJEU, the European Commission, the FRA, and NHRIs.
The people who wrote this proposal said it themselves - "Whilst different in nature and generally speaking less intrusive, the newly created power to issue removal orders in respect of known child sexual abuse material certainly also affects fundamental rights, most notably those of the users concerned relating to freedom of expression and information."
This proposal is illegal. The fact that the CJEU, at the least, hasn't issued a statement that this is illegal tells you everything you need to know about the EU and its democracy.
For practical purposes the EU does have a constitution, it's just a messy collection of treaties rather than a single codified constitution (see https://en.wikipedia.org/wiki/Treaty_establishing_a_Constitu... for why).
Plenty of EU states already have a constitution in which this proposal would be de facto unconstitutional.
The issue is what is the European Commission willing to do in order to guarantee that fat contract check goes to Palantir or Thorn or whoever has the best quid pro quo of the day.
This is not the Stasi; this is tech billionaires playing kings and buying the EC and Europol for pennies on the dollar, and with them the privacy of virtually every citizen of zero interest to law enforcement or the agencies.
Isn't a constitution easily changed by parliament?
Usually not "easily". I know Germany requires 2/3 majority.
fwiw, amending the US constitution generally requires a 2/3 majority in both houses of Congress to propose the amendment, and then further ratification by 3/4 of the states makes the amendment law. It's a fairly long process, and amendments sometimes get bogged down and die in the 2nd phase.
(there is another process which calls for a convention, but such a convention would have broad powers to change many things and so far the "two sides" (US rules tilt toward two parties rather than more) have been too scared of what might happen to do that)
> The accepted solution is to have a constitution that says otherwise.
Constitutions don't enforce themselves. The US constitution has a crystal clear right to bear arms but multiple jurisdictions ignore it and multiple supreme court rulings and make firearm ownership functionally impossible anyway. Free speech regulations have, thankfully, been more robust.
The only thing that stops bad things happening is a critical mass of people who believe in the values the constitution memorializes and who have enough veto power to stop attempts to erode these values.
The US has such a critical mass, the gun debate notwithstanding. Does the EU have enough people who still believe in freedom?
i think making your argument on free speech grounds would be stronger
How so? My point is that US constitutional protections on firearm ownership have undeniably eroded. The presence of text on the page did not prevent this erosion. I'm using gun rights as an example of a situation in which text granting a right becomes irrelevant if people stop believing in the values behind the text.
People do believe in freedom of speech in the US, thankfully, even if they've stopped defending gun rights in some places.
EU free speech protections are in the same position gun rights are in the US, and for surprisingly similar reasons.
This simply isn't true. If anything, constitutional protections have dramatically expanded since the amendment was passed.
This is because until the 14th Amendment and the incorporation doctrine, the Bill of Rights only restricted the Federal government, not the States. Prior to that, state and local governments could (and did) restrict not just firearms, but other rights as well.
Hell, the Bill of Rights still hasn't been fully incorporated, so for instance, despite the 7th Amendment stating otherwise, you don't have the right to a jury trial in civil cases in every state nor the right to indictment by grand jury (5th Amendment).
Of course, some states copied parts of the constitution into their own and had some form of protection, but it was by no means universal. Massachusetts even had a state church until 1833.
when you are talking to a european audience, they tend to be in favor of gun control so they don't care about erosion of those rights (like the people in the US who also favor eroding them, wording of the rules be damned)
HN is to a large extent a popularity contest, and people here are more in favor of free speech than guns. the US record on protecting free speech is very good.
> you are talking to a european audience, they tend to be in favor of gun control so they don't care about erosion of those rights
You have accidentally properly identified the european problem and precisely the reason that chat control will pass: shortsightedness. If people only rise up to protect rights "they need", soon no rights will be left.
In the EU you can have guns; you just have to pass some tests showing that you know how to use them, and you have to store them according to certain rules.
But guns are vastly insufficient in this century to overthrow the state, you basically only harm your fellow citizens with them.
Most of the erosion is done through court challenges.
Historically, courts have maintained that legislation is pursued under "good faith". This was the justification for not overturning ACA on the grounds of it being an unconstitutional tax: the lawmakers didn't mean to make it an unapportioned tax, even though it effectively is, so it's okay yall. Washington St just did this with income taxes on capital gains in direct violation of their state constitution a year or two ago.
Where I live, you cannot open carry. That is a direct violation of 2A, but the courts have said it's okay baby because it's not an undue burden to pay a fee and waste a day of your life. Pure nonsense. Just change the constitution for goodness sake.
> The US constitution has a crystal clear right to bear arms
It looks like it was drafted by an ESL speaker. It's by far the worst-drafted amendment, grammatically speaking:
> A well regulated Militia, being necessary to the security of a free State, the right of the people to keep and bear Arms, shall not be infringed.
It's not even a valid English sentence, and it certainly never bothers to define "Arms." Not to mention that, as written, it appears to make it illegal for me to tell you that you cannot come to my house with a gun, because that's me infringing your right. It doesn't constrain only Congress; it constrains anyone who wants to take away your right to bear arms.
Sheer lunacy as written. Ungrammatical and implies some insane shit.
But no, you're right, it's crystal clear. Much like how the First Amendment says
> Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press
which in crystal clear terms makes it legal to mass-distribute child pornography. To prohibit it would restrict the freedom of the press.
It also implies that the militia is regulated.
Remove the first and last comma and the sentence works splendidly
Ok, get a 2/3 majority of the House and Senate to approve a proposed edit removing those commas, and then get 3/4 of the state legislatures to approve it.
Until then, the commas are officially part of the text.
Comma rules change over time.
But the text of the Constitution only changes through amendments.
That said, the effective meaning of the Constitution is "whatever a majority of the Supreme Court agrees it is."
And to a degree, given the power to impeach Supreme Court justices, "whatever a majority of the Supreme Court agrees it is, and with Congress sufficiently on board to not impeach sufficient justices to force a shift in the balance of the Court."
I'm responding to:
>>> Remove the first and last comma and the sentence works splendidly
>> Until then, the commas are officially part of the text.
> Comma rules change over time.
Maybe the modern-grammar equivalent of the sentence as drafted is the version without the commas?
That depends on the opinions of 9 very specific individuals. How the text and its commas might be interpreted by you or me today is irrelevant.
But you chose to tell us about your interpretation. :-)
I had less the current legal interpretation and more the meaning at the time of writing down, as it would reveal itself in current text, in mind, which is relevant to this argument.
I see the conversation differently.
KPGv2 pointed out the phrasing of the 2nd amendment is not clear.
GLdRH said "Remove the first and last comma and the sentence works splendidly".
I said that modifying the literal text requires going through the amendment process.
You said "Comma rules change over time."
I reiterated that the literal text does not change except through the amendment process, and also noted that fundamentally the literal words don't matter much as it's up to a majority of the Supreme Court how to interpret any of it.
You then brought up modern language usages of commas.
I replied that how you or I today interpret the text is irrelevant because only the Supreme Court's opinion matters.
At no point in this conversation have I expressed a specific interpretation of the text, so your indication that I chose to tell the discussion about my interpretation seems weird and maybe you're misreading usernames somewhere along the way.
> maybe you're misreading usernames
Sorry for that.
> KPGv2 pointed out the phrasing of the 2nd amendment is not clear.
I thought GLdRH and I replied to KPGv2 stating that the 2nd amendment isn't valid grammar and that this causes some of the lack of clarity.
When you have this:
> A well regulated Militia, being necessary to the security of a free State, the right of the people to keep and bear Arms, shall not be infringed.
then when you want to discuss meaning issues due to grammar rules, you need to use 18th-century grammar. I perceived GLdRH to be using 21st-century grammar to encode the same sentence. The literal text does not need to be modified, since it uses 18th-century grammar rules. Only when you want to parse it with 21st-century grammar rules do you need to preprocess it to adjust the grammar first. This preprocessing doesn't need to be written back, since the grammar rules of the text haven't changed. We are only working around a parser that doesn't support the text's grammar.
> only the Supreme Court's opinion matters.
This is purely about syntactic issues, not about semantics. The Supreme Court also applies semantics, such as other definitions in the legal system of the time. I wasn't replying to that aspect.
I'm not here to argue about the right to bear arms in the USA, but the 2nd amendment is anything but crystal clear in its language.
Seems pretty clear to me, although I'm neither an american nor a lawyer.
I've commented this elsewhere, but rights in the US are generally much more absolute than here in Europe.
For example, in the EU you technically have the right to freedom of expression, but you can also be arrested if you say something that could offend someone.
Similarly rights to privacy are often ignored whenever a justification can be made that it's appropriate to do so.
I don't know about elsewhere in the world, but here in the UK you don't even have a right to remain silent, because the government added a loophole so that if you're arrested in a UK airport they can arbitrarily force you to answer their questions and provide passwords for any private devices. For this reason you often hear reports of people being randomly arrested in UK airports, and the government does this deliberately so they can violate your rights.
> For example, in the EU you technically have the right to freedom of expression, but you can also be arrested if you say something that could offend someone.
So you actually don't have freedom of expression?
No, offending people is not expression. What do you express with it, poor anger management?
Your right to something ends where a right of someone else is violated. That's the case here.
> Your right to something ends where a right of someone else is violated. That's the case here.
Ah yes, that memorable trifecta: Life, Liberty, and the Right to Never Hear Mean Words.
Verbal violence also has consequences: from triggering or reinforcing mental illness, to fear and isolation, to blackmail and being socially condemned while innocent. Do you accept random beatings on the street whenever people feel like it?
"... expression, but you can also be arrested if you say something that could offend someone. ..."
You probably mean hate speech.
We have laws like that too in Canada. It is a good thing.
It all depends who’s defining “hate”. The people you like who are in charge today won’t be there in 20 years, and if any kind of extremism leaks into society, you could find yourself unable to advocate for your beliefs without getting arrested.
I mean Canada's a pretty depressing example of how bad those laws can be abused.
How on earth are hate speech laws a good thing? Or did I miss a /s?
For example, the US government is trying to label any posthumous criticism of Charlie Kirk "Hate Speech". You can see how dangerous this could be when the hate-mongers get to decide what is considered hate speech.
Honestly, the current administration baffles me. There is so much activity that flies squarely against the constitution in a not at all subtle or clever way; just blatant, "I don't care."
It's one thing to be disruptive and enforce immigration law "by the books" but entirely separate to then go out of your way to not enforce it legally while at the same time violating or attempting to violate the constitution on pretty fundamental levels.
The only way I see to prevent the constant pushing is that every single time some council or committee presents something like this, every single one of their private communications gets leaked for everyone to peruse at their leisure, from WhatsApp to bank statements.
They want to erode people's privacy? Let them walk their talk first and see how that goes.
Tempting though that is, I think that's the wrong way to resolve it: the people proposing it (law people) are a different culture than us (computer people), and likely have a fundamental misunderstanding about the necessary consequences of what they're asking for.
Two cultures: https://benwheatley.github.io/blog/2024/05/25-12.04.31.html
Why would they exclude themselves from the rule if they weren't worried about it? It's not like there are no pedophiles in those positions. I wonder who they are going to offer the job of looking through the photos of families with kids for this.
> Why would they exclude themselves from the rule if they weren't worried about it?
They don't even understand that they haven't. Sure, they've written the words to exclude themselves (e.g. UK's Investigatory Powers Act), but that's just not how computers work.
The people who write these laws, live in a world where a human can personally review if evidence was gathered unlawfully, and just throw out unlawful evidence.
A hacked computer can imitate a police officer a million times a second, the hacker controlling that computer can be untraceable, and they can do it for blackmail on 98% of literally everyone with any skeleton in the closet at the same time for less than any of these people earn in a week.
The people proposing these laws just haven't internalised that yet.
> how to prevent them from being pushed over and over until a specific context allows it to be approved.
We need more diverse mobile OSes that can be used as daily drivers. Right now, it's almost a mono-culture with the Apple-Google duopoly. Without this duopoly, centralization and totalitarian temptations would be less likely.
There's GrapheneOS, which is excellent and can be used without Google, but it relies on Google hardware and might be susceptible to viability issues if/when Google closes down AOSP. Nevertheless, they are working on their own device that will come with GrapheneOS pre-installed, which is exciting.
There's also SailfishOS, which has a regular GNU/Linux userland and almost usable at this stage with native applications. As a stopgap, it can also run Android applications with an emulation layer, and plenty of banking ones work just fine.
I like this idea frankly. Where are the hacktivists when we need them?
You can become a "hacktivist" by taking 15 minutes of your time to write an email to your MEP.
https://www.europarl.europa.eu/meps/en/home
I think "hacktivist" here means hacking into the politician's inboxes and leaking the contents, like "politicians want to do this to you; let's see how they like it when it's done to them" sort of thing.
No, you silly man, the politicians are protected from this law, this is just for the plebs.
>The only way I see to prevent the constant pushing is that every single time some council or committee presents something like this
Yes but.. it can't just be vague exhortations and generalities. I didn't know the pertinent bodies previously, but after GPT'ing on it, it looks like they include:
One is "DG Home," an EU department on security that drafts legislation.
Another is Europol, a security coordination body that can't legislate but frequently advocates for this kind of legislation.
And then there's LEWP, the Law Enforcement Working Party, a "working group" comprised of security officials from EU member states, also involved in EU policy making in some capacity.
I think the blocking states should be resisting these proposals at those respective bodies too.
I'm convinced the people suggesting this type of thing are influenced or even compromised by their constituents' enemies, and NOT the result of poor education on the topic.
This policy, for example, would be most helpful to enemies of the EU. It would lower the cost of acquiring the data for China and Russia, as it allows them to mass-acquire data in transit without incurring the cost of local operations. The easiest system in the world to hack is that of a policy maker.
> It would lower the cost of acquiring the data for China and Russia
Yes, it would lower such barriers for countries that are commonly seen today as Europe's adversaries. But in this case, the U.S. (or rather, U.S. organisations and corporations) might be the primary bad actor pushing for ChatControl. See e.g.:
Thorn (organization) - https://en.wikipedia.org/wiki/Thorn_(organization)
"Thorn works with a group of technology partners who serve the organization as members of the Technology Task Force. The goal of the program includes developing technological barriers and initiatives to ensure the safety of children online and deter sexual predators on the Internet. Various corporate members of the task force include Facebook, Google, Irdeto, Microsoft, Mozilla, Palantir, Salesforce Foundation, Symantec, and Twitter.[7] ... Netzpolitik.org and the investigative platform Follow the Money criticize that "Thorn has blurred the line between advocacy for children’s rights and its own interest as a vendor of scanning software."[11][12] The possible conflict of interest has also been picked up by Balkan Insight,[13] Le Monde,[14] and El Diario.[15] A documentary by the German public-service television broadcaster ZDF criticizes Thorn’s influence on the legislative process of the European Union for a law from which Thorn would profit financially.[16][17] A move of a former member of Europol to Thorn has been found to be maladministration by the European Ombudsman Emily O'Reilly.[18][19]"
Additionally, it would not surprise me at all if Palantir is lobbying for this either. Many EU countries, like Germany and Denmark, have already integrated Palantir's software into the intelligence, defence, and policing arms of their governments.
But at the end of the day, while it is convenient to blame external actors like U.S. corporations, ultimately the blame lies solely on the shoulders of European politicians. People in positions of power will tend to seek more, and I'm sure European politicians are more than happy to wield these tools for their own gain regardless of whether Palantir or Thorn is lobbying them.
you have left out how it can be used to monitor violations of corporate copyright. And what it means for silencing political speech is enormous.
I would argue that a surefire way of guaranteeing the right to privacy is to instead continuously push for absolute-transparency laws for politicians and governments. If they’re going to demand every private citizen’s records are always open for view, then the same should be said for governments - no security clearances, no redactions, no “National Security” excuse.
Is it patently unreasonable? Yes, but cloaked in the “combat corruption” excuse it can be just as effective in a highly-partisan society such as this - just like their “bUt WhAt AbOuT tHe ChIlDrEn” bullshit props up their demands for global surveillance.
If only we could show them how this kind of thing can go wrong. I don't know, say, the leader of a nation they are having trouble with abusing similar access to their data.
But they will probably think that is only bad when others do it to them.
> If only we could show them how this kind of thing can go wrong.
We can. This has already happened with the fairly recent SALT TYPHOON hacks. China (ostensibly) abused lawful wiretapping mechanisms to spy on American (and other) citizens and politicians. The news at the time wasn't always explicit about the mechanism, but that's what happened.
China wouldn't have been able to do this if those mechanisms didn't exist in the first place.
Wait, isn’t that the law working exactly as planned?
The elephant in the room here is the US.
The only real option is to get your country to leave the EU. An unelected cabal of people making sweeping decisions for countless member states isn't democratic, so yeet it while you can.
> An unelected cabal of people
European Commission: Commissioners are nominated by elected national governments and must be approved by the directly elected European Parliament.
Council of the EU: Ministers are accountable to their national parliaments, which are elected by citizens.
European Council: Composed of heads of state/government who were elected in their own countries.
European Parliament: Members are directly elected by EU citizens every five years.
>European Commission: Commissioners are nominated by elected national governments and must be approved by the directly elected European Parliament.
With so many levels of indirection, citizen votes are irrelevant and they don't need to care about them - only about the support of the major political groups at the top. And surprisingly enough, the Parliament is relatively stable.
>Council of the EU: Ministers are accountable to their national parliaments, which are elected by citizens.
same as above.
i don't advocate for leaving the EU, but this needs to change. Those positions, which are usually the ones pushing for such legislation, need to be held accountable to citizens. At least the EC.
No more rotations, or other such bullshit.
Right now the EU is sitting in a middle ground between a federation and a trade union, reaping (from the citizens' point of view) the downsides of both systems.
Strip the privileges from the bureaucrats who are involved in any type of government work or activity. No immunities, no security.
If you want to be a servant to the public be one.
By implementing direct democracy via the internet, which then creates laws that disallow that.
But, amongst a few other issues, there is a technical problem: how do we log in to vote? That mechanism must be unhackable, configurable by the computer illiterate, and it must not invade privacy.
Serious question.
This has to be written into the constitution somehow; it comes down to the values of everyone, and I believe a lot of it has to do with education. Currently people are simply not bothered by it as much, or not in a way comparable to other topics.
Explicit digital privacy right in each country constitution?
Privacy rights are already there in most countries' constitutions, but maybe adding the digital part will make it harder to push back.
Can't be done. It's pushed by the Commission - the technocratic deep state.
The prevention has to be in the underlying layer of physics / math / the internet such that the state is _unable_ to make (or at least enforce) such laws.
We need to accept and celebrate a world in which the capabilities of states are constrained by our innovations, not merely the extremely occasional votes we cast.
Agreed. In this case, there needs to be some sort of 'privacy bill of rights'. Something fundamental where any law like this cannot be passed.
This exists. But courts have to balance conflicting rights, so there is always room for interpretation.
Laws don't stop men with guns. Men with guns stop men with guns. Laws not enforced and rights not protected don't matter.
As the old saying goes, the price of freedom is eternal vigilance.
> Laws don't stop men with guns. Men with guns stop men with guns.
Prove it. Every statistic I've ever seen shows the exact opposite of this to be true.
Here's the proof: https://en.wikipedia.org/wiki/Mass_killings_under_communist_... . Those kinds of mass killings can only happen when the citizens are disarmed, because it's logistically impossible for a government to seize absolute power when a significant proportion of the citizens are armed.
> it's logistically impossible for a government to seize absolute power when a significant proportion of the citizens are armed.
This is literally, and provably, untrue. For example:
The Soviet Union: The Bolsheviks initially proclaimed that "the arming of the working people" was essential to prevent "restoration of the power of the exploiters". It was only later that they restricted private gun ownership.
The Nazis: Contrary to popular gun rights narratives, Nazi gun laws actually relaxed restrictions for most Germans while targeting specific groups. Sometimes authoritarianism is the same as populism.
Rwanda: Prior to the genocide, the government systematically distributed weapons to local administrators and militia groups while ensuring targeted populations remained defenseless.
Myanmar: Armed civilian resistance groups formed, but they were essentially wiped out by the overwhelming advantages in air power and heavy weaponry that an actual organized military had. The firearms were useless. Arguably, worse than useless, as those who fought back died in large numbers.
Venezuela: The regime armed its supporters while systematically removing weapons from the general population. The population was well armed, they just couldn't fight back against an organized government response.
Those kinds of mass killings also happen in authoritarian regimes, which typically emerge from violent societies.
>Men with guns stop men with guns.
Really? Why does America, the country with the most guns by far, have the most gun deaths by far? It's very tiring arguing these very obvious points over and over.
Nazi Germany, Communist China and Soviet Russia have by far the largest number of deaths by _men with guns_, over a hundred million people killed by their own governments. The guns of US citizens have so far prevented this kind of government-led mass citizen genocide from happening. The number of people killed by gun violence in the US is a rounding error compared to the number of people killed by Mao, Hitler and Stalin.
Most of the people killed by these regimes were killed as outsiders. If you truly want to compare the actions of the USA, you must also count its handling of its outsiders (e.g. in wars).
> The guns of US citizens have so far prevented this kind of government-led mass citizen genocide from happening.
No they haven't. Our system of checks and balances has. At no point has there been a civil war in which the US's citizens attempted to fight back against the US military. If there were, the citizens would lose without even presenting a challenge.
>the citizens would lose without even presenting a challenge.
That's not true. The US Army spent 20 years and trillions of dollars trying to impose regime change on Afghanistan, but were defeated by a group (the Taliban) that had very little military capability beyond men with rifles and some explosives to make improvised bombs. (Yes, they also had decades-old weapons with which to shoot down helicopters.) Algeria's war of independence from France in the 1950s and early 1960s is another example where a group with very little in the way of military capability defeated one of the most powerful militaries in the world.
I don't necessarily buy the argument that the US should continue with the gun status quo just because all those guns would come in handy in a revolution, but you haven't successfully refuted the argument.
The Algerian war doesn't really prove much either, except that terrorism works.
The Algerians hid within the population and gradually picked at the French, like flies biting a bull. Eventually the French got bored and wandered off to find a new form of entertainment. If anything the French lost to propaganda, not guns.
>The Algerians hid within the population
Yes, but we're discussing a civil war or revolution in the US, where the rebels or revolutionaries would be able to engage in terrorism and to hide within the population -- and where there are so many long guns in private hands that the defending force (the government) probably wouldn't be able to deprive the attacking force (the rebels) of long guns simply by punishing any civilian found with a long gun in their home.
My point is that it wasn't the guns that saved the Algerians. Knives, bayonets, and IEDs would have been equally effective for the sort of guerrilla tactics that eventually won the war.
The Afghanistan bit is oversimplified, isn't it? My understanding is that the US military successfully imposed regime change between 2001-2003. I doubt those rifles slowed the tanks and bombers much at all.
The fact that we packed up and left eventually doesn't really change the fact that the US rolled over the men with guns like they weren't there in the early 2000's.
There are no solutions to that which wouldn't sound absurd. But if you could get past absurdity...
Politicians should agree to be executed if they lose an election. Only those willing to risk their lives should be allowed to legislate. This also gives the voters the option of punishing those who pass onerous laws at the next election.
If you need extra zing, this would also apply to recall elections, so they could even be punished early.
I think it would be better if they agree to be executed if they win the election, after serving their term.
Maybe a less extreme version of this is that if you become president you are stripped of all property and become the ward of the state after your term is over, enter a monastery sort of situation, for the rest of your life.
Yeah let's ensure only the craziest, most desperate for power type to be the regulators.
Hitler knew if he had lost, he would have been executed. That didn't stop him from going to war.
One could argue that Putin won't stop the current war against Ukraine for the very same reason. He is obsessed with Gaddafi's undignified end in a ditch and cannot be seen as weak.
The GP's idea is very bad. Quite to the contrary, losing power should not come with disastrous personal consequences.
If they can't be punished for continuing to push bad laws, then they will continue to push them... because they benefit from those when they inevitably pass. So there are no solutions. You live in a world where Putin still exists, is still doing these godawful things, but the suggestion that if a politician loses an election his life is forfeit makes you fear that the things that already happen would happen. Or something. It's sort of sad.
> prevent them from being pushed over and over
Solve the problem it's trying to solve, then it won't be proposed again.
The problem it's trying to solve is mass surveillance...
The motivation in Denmark was some big cases where organized crime was only caught thanks to a huge hacking operation in which the police were able to monitor communication on the apps commonly used by the criminals. That allowed them to take very dangerous people off the streets, and now they want to do more of that, more easily. I think the discussion can never be in terms of absolutes. If your family was murdered by some criminal who was never caught earlier, but could have been if the police had access to their chats, would you still be against it? We need to remember that we're making that decision for some future victim if we agree that this will assist the police effectively. The other side says the police will undoubtedly abuse their powers. In which case, how do the results compare? If you think the answer is easy, one way or another, you are definitely wrong.
But the CSAM regulation under discussion doesn't do any of the things you're claiming. It mandates content scanning for CSAM and other related messages. It does not call for key escrow and decryption of messages involving organized crime. So it's not clear how you would do much against serious organized criminals with this law.
Nobody here argues against wiretaps after court rulings. The discussion here is about mandating sending a transcript of every communication you do to the state (unless you work for the specific parts of the state).
You mean like the mass surveillance already implemented by Google, Facebook, Apple, Microsoft, and Amazon?
That's already here. I think you should consider that this law might be aiming at some other goal.
Google, Facebook, Apple, Microsoft and Amazon cannot send armed men to my front door.
Yes, they (well, Google and Amazon, I don't have accounts with other vendors) can terminate my accounts, but, to be honest, that is not a big deal for me, especially compared to being dragged out of my house by police, especially now, when I live in the EU with a residence permit and not full citizenship.
> You mean like the mass surveillance already implemented by Google, Facebook, Apple, Microsoft, and Amazon?
No, GP is referring to mass collection and analysis of all of your communications. Google, Apple, et al. don't have that capability today.
Hell, apple can’t even read my text messages, nor do they know I’m writing this - and I’m doing it on an iPhone.
You only believe that because you have chosen to believe it.
Take Facebook end-to-end encrypted messages, for example. There are certain links it won't let you send, even though it is supposedly E2EE. (I've seen it in situations like mentioning the piratebay domain name, which it tries to auto-preview and then fails. I've seen the issue with hacking-related websites as well.)
It likes to pretend it is a mysterious error, but if you immediately send a different link, it sends just fine. I don't use chat apps much these days, so I'm not sure if others see similar behavior, but I'd wager some do. Facebook is about the least trustworthy provider I'm likely to use, FWIW, so I expect a certain amount of smoke and mirrors from them.
The fact that EU politicians exclude themselves from the ChatControl is all you need to know about this.
Source on that?
From TFA
> the proposed legislation includes exemptions for government accounts used for “national security purposes, maintaining law and order or military purposes”. Convenient.
I can buy the military exemption, and maybe some very top level government workers that are effectively military (example: POTUS). But the EU parliament has no reason to be excluded. It is definitely a terrible law if it is so bad that they won't pass it unless they are excluded.
> top level government workers that are effectively military (example: POTUS)
POTUS is very specifically NOT a member of the military. Elected civilian control was the whole point. Even Eisenhower had to (temporarily) give up his general rank to serve as president.
I do understand your core point tho.
Interestingly Parliament is against Chat Control: https://edri.org/our-work/chat-control-what-is-actually-goin...
Page 36, section 2a here: https://www.patrick-breyer.de/wp-content/uploads/2024/04/202...
Governments should be transparent and the people should be opaque. Any government that attempts to make things otherwise loses legitimacy.
> Governments should be transparent and the people should be opaque.
I'm going to add this to my repertoire since it's a lot more concise than most of my rantings on the topic
Yes, I love this idea. I've heard it framed as "Transparency for the powerful and privacy for the weak."
Or as someone put it, "People shouldn't fear the government. The government should fear the people."
I feel like we've lost the vocabulary we ought to be using to talk about the legitimacy and role of the state. More people need to read J.S. Mill (and probably Hobbes.) Even today, works by both are surprisingly good reads and embed a lot of thoughtful and timeless wisdom.
But isn't the government fearing the people exactly why they're relentlessly pushing ChatControl?
if they feared the populace, they couldn't push for legislation that entrenches their position without any benefits to citizens.
US cops fear everyone else, and look what that gets us.
Governments need privacy. They literally investigate child molestation cases. They hunt spies. They handle all sorts of messy things, like divorce between couples with abuse.
I'm not commenting on the government coming in and unveiling encrypted communications, but certainly a better approach than "governments should be transparent and the people should be opaque" would be "governments should be translucent and the people should be translucent too".
There is a clear difference between specific activities that need privacy (especially if it is temporary privacy or cases where it is protecting the privacy of the citizens not the government itself) and privacy by default for most or all government work.
Interview from DR (Danish public news broadcast) with the Danish judicial minister Peter Hummelgaard, the politician who conceived the proposal:
https://www-dr-dk.translate.goog/nyheder/viden/teknologi/ana...
It is very obvious that he doesn't understand e2e, yet he will not listen. Bro couldn't even read the Wikipedia page
I regularly see similar articles with similar comments here, but there's one thing I still don't understand:
From the European Convention on Human Rights[1], Article 8:
> 1. Everyone has the right to respect for his private and family life, his home and his correspondence.
> 2. There shall be no interference by a public authority with the exercise of this right except such as is in accordance with the law and is necessary in a democratic society in the interests of national security, public safety or the economic well-being of the country, for the prevention of disorder or crime, for the protection of health or morals, or for the protection of the rights and freedoms of others.
So I wonder, what is the legal argument solid enough to justify interfering with everybody's right to privacy? My layman understanding of the usual process is like: we want surveillance over those people, and if it seems reasonable a judge might say ok, but for a limited time. Watching everyone's communications also seems at odds with the principle of proportionality[2].
[1]https://www.echr.coe.int/documents/d/echr/Convention_ENG
[2]https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX:12...
> what is the legal argument solid enough to justify interfering with everybody's right to privacy?
"... except such as is in accordance with the law"
And the "interfering" coming from ChatControl is that "some algorithm" locally scans and detects illegal material, and doesn't do anything if there is no illegal material.
> Watching everyone's communications also seems at odds with the principle of proportionality
It's a bit delicate here because one can argue it's not "watching everyone's communications". The scanning is done locally. Nobody would say that your OS is "watching your communications", right? Even though the OS has to "read" your messages in order to print them on your screen.
Note that I am against ChatControl. My problem with it is that the list of illegal material (or the "weights" of the model deciding what is illegal) cannot be audited easily (it won't be published as it is illegal material) and can be abused by whoever has control over it.
> Nobody would say that your OS is "watching your communications", right?
No what? Everyone has been hating on the spying Microsoft has been doing in windows for years. How do you ask this with a straight face.
Ah, so we will fight child porn by detecting family pics of children in the shower (or w/e) and sending them off to a "trusted" 3rd party who will no doubt leak them at some point. Also, if I were a pedophile I know where I'd send my resume...
Imagine a future where it becomes easier to commit terrorism because of some technological advancements—like smaller, less traceable bombs, or chemical weapons that are easily accessible and lead to higher casualties—like in the 1,000s or more. Imagine in that scenario, that the likelihood of you or someone you know becoming the victim of a terrorist attack is now non-trivial in your society. In a future where this becomes the norm, it would be interesting to see if individuals are more willing to adopt a level of increased surveillance as it seems like the only reasonable protection against terror.
Right now this debate is oriented mostly around the fact that surveillance today is not a good deal—consumers give up their privacy and get nothing in return. But is there a tipping point? Technology draws us closer, day by day, and the threat matrix will become more sophisticated as time moves forward.
Most individuals on HN are privacy absolutists but one should recognize that tradeoffs exist. That tradeoff is just not compelling today, but that doesn't mean that will always be the case. If you go to China, where everything and everyone is surveilled, I think you'd be surprised to find that many Chinese don't mind. They feel incredibly safe and don't have to worry about being victims of crimes, having their packages stolen, walking around late at night alone, etc. Walking around in China with absolute peace of mind around my own personal safety is a very eye-opening experience as someone coming from the US. I've always advocated for stringent privacy protections; but when giving that up buys you absolute safety in your immediate environment, that's not an experience you forget.
I'm certainly not saying I'm a proponent of living in a surveillance state—I'm simply noting that tradeoffs exist and a sort of re-balancing is constantly occurring, which is just interesting to be aware of.
>Imagine a future where it becomes easier to commit terrorism because of some technological advancements
Imagine a future where aliens invade, and all of our civil rights have to be suspended in order for society to be re-focused on fighting an existential war against the invaders. I suppose this sci-fi hypothetical could happen and if it did happen then the sacrifice might even be necessary. But it's not happening now, and it's entirely reasonable to classify it as both (1) unlikely, and (2) an incredibly bad outcome we should hope that we never have to face.
I don’t know if it’s complete fearmongering to imagine a scenario in the future where chemical or biological weapons are easier to manufacture and therefore execute attacks. Hundreds of people died in Europe last year due to terrorist attacks, and compared to where our species will eventually be, many of the technologies used in these attacks are still in their infancy. The world may evolve, but the scriptures that evangelize future jihadists won’t, so the incentive to be a martyr will always exist. I just looked it up and Europe has a very bad track record at stopping attacks—of 54 planned terrorist attacks in 2024 only 19 were averted by intelligence. 35 were carried out successfully. The threat may come from factions other than just jihadists in the future, too. I agree that this is not something we have to worry about now, which is why I stated that I’m hypothesizing in the original comment. But I think it’s a bit less far fetched than a near term alien invasion :-)
The ultimate surveillance state cannot keep you ultimately safe.
This concept already exists. It has for centuries. It's called war.
> They feel incredibly safe and don't have to worry about being victims of crimes, having their packages stolen, walking around late at night alone, etc.
Em. I think feeling incredibly safe has more to do with the media telling people that no crime exists and all criminals are caught, rather than a reality of zero crime.
There is evidence that crime started being systematically under-recorded in China since they started assessing police on proportion of recorded crimes they solve.
https://archive.is/20250624235740/https://www.economist.com/...
It's not about the usefulness... it's that omnipotent surveillance creates a jarring imbalance of power between the surveillance state and the people.
If the employees of the state were subject to the same exact surveillance, then maybe it might be palatable.
Curiously, the Star Trek Universe exists in such a scenario. A common trope is asking the computer for evidence of a crime, where someone is at any time, etc. I've never heard complaints about this supposed contradiction between the utopia vision of Star Trek and the omnipotent, all-seeing computer.
But we all know the reality... a tale as old as time. The state will exclude themselves from the surveillance, and it will eventually be used as a tool for authoritarianism. It's only a matter of time with something as powerful as this.
this also assumes that criminals or terrorists will just follow the law.
you can always establish an encrypted channel via DH over steganography in plaintext messaging, and then just use any encrypted protocol.
if hardware is compromised a black market for such devices will surface.
Worst case scenario you create gigantic one time pads and just use them.
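To illustrate how low the bar is: a one-time pad is nothing more than XOR with a truly random key as long as the message. A minimal Python sketch (the message is a placeholder and the key exchange is assumed to happen out of band; the pad must come from a cryptographically secure source and must never be reused):

```python
import secrets

def make_pad(length: int) -> bytes:
    # The pad must come from a cryptographically secure RNG,
    # be at least as long as the message, and never be reused.
    return secrets.token_bytes(length)

def xor(data: bytes, pad: bytes) -> bytes:
    # XOR is its own inverse, so the same function encrypts and decrypts.
    return bytes(a ^ b for a, b in zip(data, pad))

message = b"meet at the usual place"
pad = make_pad(len(message))      # exchanged in person beforehand
ciphertext = xor(message, pad)    # information-theoretically secure
assert xor(ciphertext, pad) == message
```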
the whole idea is flawed as you get neither security nor privacy. in fact - it actually opens you to abuse if encryption is backdoored. Not to mention it being a gigantic slippery slope argument.
and most importantly - how do you ensure that you can ALWAYS trust your government with such powers?
> a black market for such devices will surface
Probably, but I think you are giving most bad actors too much credence. Tyler Robinson took several precautions to cover his trail in his assassination of Charlie Kirk—but he also told many individuals about his plan on discord, as well as other non-encrypted channels, etc. Not all bad actors are sophisticated in the same way.
I wouldn't trust the government with the power. If the scenario I'm posing were to actually occur, it's only a matter of time until the gestapo starts showing up at the houses of innocent individuals. This sort of thing happens in China.
Still, again, if the threat is big enough, I am curious to ponder what role individuals would want government to take in using surveillance to reduce actual human deaths in terror attacks (or any type of attack, for that matter).
>Probably, but I think you are giving most bad actors too much credence. Tyler Robinson took several precautions to cover his trail in his assassination of Charlie Kirk—but he also told many individuals about his plan on discord, as well as other non-encrypted channels, etc. Not all bad actors are sophisticated in the same way.
you're comparing organized crime, which this is supposed to combat - with a lone gunman. Stupid criminals will always exist.
>Still, again, if the threat is big enough, I am curious to ponder what role individuals would want government to take in using surveillance to reduce actual human deaths in terror attacks (or any type of attack, for that matter).
the purpose of this isn't to stop deaths. It is to entrench state power and increase agencies' budgets... and as they have to demonstrate that they are useful, it will turn either into a totalitarian hellhole with plenty of 'making an example of' public cases... or some attacks will go through on purpose to justify their budget after cuts...
If murder is common in the populace, then that means the social norms of that society have already drifted to the point where murder is acceptable. In that society, the murderers are probably running the government.
On your tangent about China, the people there are feeling so absolutely safe that they have the urge to install metal bars on every window of almost every home.
But China wasn't a hotbed of crime and fraud before they had the firewall, facial recognition, and so on, either.
It is true that many Chinese citizens don't give it a thought.
But there's no demonstrable cause and effect going on there.
Better imagine a future where this old manufactured problem / manufactured solution brainwashing trick no longer works and devil's advocates get what they deserve
did you write this message with ChatGPT?
> .. like smaller, less traceable bombs, or chemical weapons that are easily accessible and lead to higher casualties ..
it's very easy to build a bomb, you just need to "google" and do your shopping... Killing random people in the street is easy too; you have, among other things, knives - very easy to buy and to commit a crime with in side streets, etc.
No I did not use chatgpt. I've always written with a lot of em dashes, Chatgpt probably got it from me :-)
> it's very easy to build a bomb [...]
Yeah, what I'm saying though is that these attacks are not happening at a scale large enough for people to need to worry about their own safety personally. Your personal chance of dying in a terrorist attack is so low that it's not worth thinking about (unless maybe you live in the Middle East). I'm simply noting that this might not always be the case. It's easy to imagine, with better weapons, that terrorists become much more prolific in their ability to kill; under which scenario people could be willing to give up more to have more peace of mind.
Actually you can kill people just fine with only your hands. You just need to open a medical book; there are a few spots where a light hit achieves the intended effect.
> it would be interesting to see if individuals are more willing to adopt a level of increased surveillance as it seems like the only reasonable protection against terror.
One presumes it would make terrorism easier if you could hack in and find out where your target is at any given time. What they're doing. What their plans are for this evening.
Also I think one could probably point to the current US president as proof for why this is an insane idea. Imagine if he really did have access to everything we say.
Yeah, totally. Again not saying I'm advocating for it in that form or manner. I'm just saying, tradeoffs could occur, that reasonable people may start to weigh differently based on the level of threat they feel to their lives personally.
I get your point, but this is baked into the social contract in China. You obey the party, give up some personal freedoms, and in exchange the party will make sure you live a prosperous safe life.
The current EU political class has completely lost their Mandate of Heaven, they command 0 respect because they’re spineless empty bureaucrats looking for a cushy consulting job after they’re done being lobbied by their future employers.
Even if your utopian idea makes sense, I don’t trust the EU politicians to bring it to life, just virtue signal
This was precisely some of the motivation behind pushing RCS onto Apple. The RCS spec has a termination point between providers -- a great spot to read some data for telecom providers and government agencies. Despite this, RCS is called "End to End" all the time. It's not. Use Signal or iMessage, depending on your security choices in iCloud.
RCS is not called “end to end” by anyone - even Apple and Google explicitly state it’s not currently E2E encrypted. Apple has pledged to add E2EE to RCS on iPhones, but they’ve never claimed it’s that way today.
They go out of their way to warn you it’s not the same level of security as iMessage.
Google Messages shows "This chat is now end-to-end encrypted" between compatible devices today.
Is CSA really that widespread in Europe that everyone's chat messages have to be monitored? And if it is that widespread, shouldn't they try to address it socially to prevent CSA as much as possible rather than try to catch just the subset of tech-savvy abusers, that too after they've already committed CSA?
It’s not about CSA, it’s about illegal content. And laws change all the time.
For example, an individual can generate AI images of Hollywood actors using Stable Diffusion and a decently powerful computer. Said individual had the right to share those images online with a community.
Now however the sharing and distribution of said images is considered illegal in my USA state.
So, are the images said individual created and shared three years ago subject to prosecution? Even if the law went into effect 3 months ago?
> Even if the law went into effect 3 months ago?
No. The right not to be tried for actions that weren't crimes at the time is pretty universally applied in the west (I am not aware of the legal situation in other parts of the world, but I imagine it's honored there too). (Article 7 of the European Convention on Human Rights for the EU, Article I, Section 9 & 10 of the constitution for the US)
> So, are the images said individual created and shared three years ago subject to prosecution?
Generally, criminal acts are judged according to the rules of the jurisdiction where they happened, so I wouldn't be too worried about this. This isn't a universal rule though, so you won't find it enshrined in constitutions or treaties.
Of course not, it's just a pretense for passing this law, because it's political suicide to instead say "We don't want to do any actual police work and instead want to create a massive surveillance state and monitor everything you say and do so we can better control our populations."
CSAM is just the excuse, as it is with any other laws of this nature in the past.
Agree completely. These laws are either a wedge for broader surveillance or a massive compromise on everyone else’s rights to catch a subset of a subset of users.
Everyone in this debate understands that CSA is a pretext. Nothing is going to make any sense to you if you think ChatControl is an earnest and sincere effort to fight CSA in particular.
The ultimate goal is for computers to run only authorized programs and to license and monitor development tools like the Soviets monitored typewriters.
With the access to phones, underage teenager may be taking nude pictures of themselves. They should be put in jail where they belong. /s
I wonder where platforms like Slack would land in all of this, and how would they go about keeping people from just using their own encryption, e.g. PGP over unencrypted channels? Is public key cryptography too weak to matter?
Slack is not end-to-end encrypted and belongs to a US company. So there is no need for ChatControl there: the US government already has access to everything that is written on Slack.
I believe they are referring to using GPG to encrypt data before putting it into Slack, much like using OTR out of band. In that case all the data shared between those using GPG or OTR would only be accessible to those with the right out-of-band keys. There are probably not a lot of people doing this, or not enough for governments to care. I do this in IRC using irssi-otr [1].
If that ever became illegal because of the encryption, then groups of people could simply use scripts or addons to pipe messages through different types of encoding to make AI fuzzy searches harder. They can try to detect these chains of encoding, but it will be CPU expensive to do every combination at scale, given there are literally thousands of forms of encoding that could be chained in any order and number.
Mon -> base64 -> base2048 [2]
Tue -> base2048 -> base131072 [3]
...and so on.
[1] - https://irssi.org/documentation/help/otr/
[2] - https://github.com/qntm/base2048
[3] - https://github.com/qntm/base131072
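As a rough sketch of that rotation idea (illustrative only: it uses Python stdlib codecs as stand-ins for the base2048/base131072 encoders linked above, and the per-day table is a made-up convention, not anything standardized):

```python
import base64
from datetime import date

# Stand-in encoding chains, one per weekday; real chains could mix in
# base2048/base131072 or anything else, in any order and number.
CHAINS = {
    0: [base64.b64encode, base64.b85encode],   # Monday
    1: [base64.b32encode, base64.b64encode],   # Tuesday
    # ... and so on for the rest of the week
}

def encode(message: bytes, day: int) -> bytes:
    # Apply the day's chain of encodings in order.
    for step in CHAINS.get(day, CHAINS[0]):
        message = step(message)
    return message

print(encode(b"hello", date.today().weekday()))
```

Decoding is just the same chain applied in reverse; both sides only need to agree on the rotation schedule out of band.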
> I believe they are referring to using GPG to encrypt data before putting it into Slack
To a good approximation, nobody does that.
And anyone who is capable of communicating over PGP won't be covered by ChatControl anyway. They can keep using PGP over whatever they want, or just compile Signal from sources.
> If that ever became illegal because encryption then groups of people could simply use scripts or addons to pipe through different types of encoding to make AI fuzzy searches harder.
I don't think that this makes any sense at all. This is some kind of poor encryption. Either you honour the law and you send your messages in plaintext, or you don't and you use proper encryption. There is nothing worth anything in-between.
If encryption is illegal, those who really need it can still use steganography.
If you really want to use encryption under a state where it's forbidden and communications are monitored, you'd rather hide your encrypted messages inside cat pictures and TikTok videos, because blatant obfuscation might trigger warnings and draw attention.
In the end it's not about making encryption technically impossible but illegal, and if you use it you'll be prosecuted.
I fear that when it becomes illegal not to have a remote scanner on my computer broadcasting file contents, invoking GPG will be of much less use.
This legislation makes every digital communication open to being policed at the source. It is far too overreaching and too ripe for abuse.
You are already looking for workarounds like people struggling under authoritarian regimes.
This is completely unacceptable.
Anyone who does anything private or illegal will bypass that with tools that will become popular as a result. The government is left with scanning the data of the remaining 90% of the population.
They choose something sensitive as a pretext to push their agenda.
They want the power to arrest you for your private thought crimes too.
and keep them forever to use them against you in the future, if you become a "problem"
The one thing I never see addressed in the proposals is a simple answer to "what's stopping CSAM users from using open-source encryption?".
You can ban this at a provider scale, but you simply can't track or enforce custom implementations at a small scale.
Which political parties in which countries should one vote for?
It's a good campaign, but say national elections are coming; one should know which politicians are in favour and which are against.
How else can we let our opinion be known other than by voting for the right politicians?
A nation is a concept that comes into existence only because people agree to give up some of their freedom, income and privacy. To what extent is the question. 100% privacy is not possible, and it simply derails a nation, due to lack of visibility and lack of control.
Indeed, the world was a chaotic place before the Soviets invented CCTV and thereby allowed the creation of civilization.
Out of interest, what happens in the case of say an open source chat app developed outside the EU. Let's add that the developers are anonymous too, like truecrypt. What power does this legislation have then?
They can just mandate it at the OS level. I don't know if the proposal envisions that already, but if it becomes popular surely that would come next.
App stores that operate in the EU are subject to EU law, and can be forced to remove noncompliant apps.
Ahh, but they’ve already mandated side loading to piss off apple! Bit of an own-goal there.
> Apps installed through alternative app distribution undergo a Notarization process to ensure every app meets baseline platform integrity standards...
> Notarization for iOS and iPadOS apps is a baseline review that applies to all apps, regardless of their distribution channel, focused on platform policies for security and privacy and to maintain device integrity.
https://support.apple.com/en-us/118110
Why do you think the EU hasn’t opposed Apple’s plan to notarize every app, even sideloaded ones? They like the censorship potential.
I think many outside of EU dismiss this as an EU only thing and don't think much about it.
1. Have you ever texted someone from the EU? You are now chat controlled too.
2. The EU is pumping billions into foreign countries to promote EU values. How long until they condition this "help" on chat control?
Can anyone explain to me what keeps anyone who doesn't want to be monitored from just sending PNGs (or similar) containing messages encrypted in each pixel's LSBs?
Doesn't all that just force everyone who has something to hide to use something else, less obvious?
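For what it's worth, the LSB trick is trivial to implement. A minimal sketch using Pillow, assuming a lossless format like PNG and that the payload is already encrypted so the hidden bits look like noise (the function names and structure here are mine, purely for illustration):

```python
from PIL import Image  # Pillow

def embed(cover_path: str, payload: bytes, out_path: str) -> None:
    """Hide payload bits in the least-significant bit of each channel."""
    img = Image.open(cover_path).convert("RGB")
    bits = [(byte >> i) & 1 for byte in payload for i in range(8)]
    flat = [c for px in img.getdata() for c in px]   # flattened channel values
    if len(bits) > len(flat):
        raise ValueError("payload too large for this image")
    for i, bit in enumerate(bits):
        flat[i] = (flat[i] & ~1) | bit               # overwrite the LSB
    pixels = [tuple(flat[i:i + 3]) for i in range(0, len(flat), 3)]
    img.putdata(pixels)
    img.save(out_path)                               # keep it lossless (PNG)

def extract(stego_path: str, length: int) -> bytes:
    """Read back `length` bytes from the LSBs."""
    flat = [c for px in Image.open(stego_path).convert("RGB").getdata() for c in px]
    out = bytearray()
    for i in range(length):
        byte = 0
        for j in range(8):
            byte |= (flat[i * 8 + j] & 1) << j
        out.append(byte)
    return bytes(out)
```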
Presumably the distribution of an app that facilitates that would become illegal as well.
But would that actually stop people? I can say with certainty a law such as this would encourage me to go out of my way to create and distribute such software.
Probably friction. Will you be able to convince your friends to do that?
No, probably not - but those bad guys with all their child porn and terrorist plans won't mind the friction (those will either encrypt or become EU politicians).
You would be surprised.
I mean, look at how many technically savvy people use Telegram and think it is "safe".
Ever heard of top government officials mistakenly inviting a journalist into a group sharing top secret information?
Are the Europeans insane? The modern world is becoming a horror. I think I would rather live in a dark forest. Life is becoming pointless.
> Are the Europeans insane?
I don't think so. If they were, it would actually be better: one can have sympathy for insanity, and at least isolate it, if not treat it.
Instead, it's extreme insecurity combined with limitless regard for infallible authority. The thought that the hoi polloi might write or say things that are beyond scrutiny is intolerable. That's the insecurity part. And all intolerable things must be criminalized, because in Europe, laws infallibly fix everything. That's the authority part.
That's not insanity. That's just how you behave when you imagine it is your mandate to perfect the world and indulge hubris sufficient to believe you have the wisdom to do so.
This is the n-th attempt to install a regulation that would (a) lead to increased surveillance of most of the population, and (b) be trivial to circumvent by those whom the government is ostensibly trying to target. So clearly, the cost-benefit ratio is severely skewed against the EU population.
Assuming that the regulators are fully aware of the above points, it's not very hard to speculate what the real intentions behind all of this are.
> This is the n-th attempt to install a regulation
The sad part is that it would only take one attempt to codify the opposite into privacy laws as a basic right, should anyone ever bother to take up that gauntlet.
This is (mostly) about Tech companies' money, namely:
- Palantir Technologies
- 'not-for-profit' Thorn
> The Commission’s failure to identify the list of experts as falling within the scope of the complainant’s public access request constitutes maladministration. [0]
> ... the complainant contended that the precision rate of technologies like those developed by the organisation are often overestimated. It is therefore essential that any technical claims made by the organisation concerned are made public as this would facilitate the critical assessment of the proposal. [1]
> The Commission presented a proposal on preventing and combating child sexual abuse, looking in particular at detecting child pornography. In this context, it has mentioned that support could be provided by the software of the controversial American company Palantir... [2]
> Is Palantir’s failure to register on the Transparency Register compatible with the Commission’s transparency commitments? [2]
(Palantir only entered the Transparency Register in March 2025, despite being a multi-million vendor for Europol and European agencies for more than a decade)
> No detailed records exist concerning a January meeting between European Commission President Ursula von der Leyen and the CEO of controversial US data analytics firm Palantir [3]
> Kutcher and CEO Julie Cordua held several meetings with EU officials from 2020 to 2023 - before the former stepped down from his role - including European Commission President Ursula von der Leyen, Home Affairs Commissioner Ylva Johansson, and European Parliament President Roberta Metsola.[4]
> The Ombudsman further concluded that Thorn had indeed influenced the legislative process of the CSAM regulation. “It is clear, for example, from the Commission’s impact assessment that the input provided by Thorn significantly informed the Commission’s decision-making. The public interest in disclosure is thus self-evident.” [4]
> EU Ombudsman Emily O’Reilly has announced that she has opened an investigation into the transfer of two former Europol officials to the chat control surveillance tech provider Thorn. [5]
[0] https://www.ombudsman.europa.eu/en/decision/en/176658
[1] https://www.ombudsman.europa.eu/en/recommendation/en/179395
[2] https://www.europarl.europa.eu/doceo/document/E-9-2024-00016...
[3] https://www.euractiv.com/news/commission-kept-no-records-on-...
[4] https://www.euronews.com/next/2024/07/18/european-ombudsman-...
[5] https://www.patrick-breyer.de/en/chat-control-eu-ombudsman-l...
And if people point out that the EU is thoroughly corrupt and that the agencies meant to keep it in check have completely broken down, they get downvoted.
The EU is turning into a fascist (policies controlled by corporations) quasi-state before our eyes.
If you are working for any crime agency, put away the biscuits and move your lazy arse to work!
My answer to "think of the children" is "I am thinking of the children"
* of their right to privacy
* of their right to live in a democracy
* of the value of warrant-based search vs. Nazi SS style
* I want them to enjoy at -least- as much privacy as I currently enjoy
* I don't want rando creeps reading their personal messages and keeping them forever, there's a reason memory fades, it lets us grow as people
Take it like this: your phone already "reads" absolutely everything you put on that phone. Apple or Google could do anything they want with that, but you trust them. You trust that they don't send everything that goes into your phone to their servers.
ChatControl would run locally on your phone. It would compare the images that you receive/send to a list of illegal images, and if you happen to deal with one of them, it would report you (a rough sketch of that matching step is at the end of this comment).
How is that destroying your democracy?
Disclaimer: I am against ChatControl, but too many people seem to not understand what the problem with ChatControl is.
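For concreteness, here is a rough sketch of that matching step, assuming exact hash matching against an opaque blocklist (real deployments use fuzzier perceptual hashes in the PhotoDNA/NeuralHash family, which match visually similar images rather than byte-identical ones):

```python
# Minimal sketch of client-side matching against an opaque hash list.
# Illustration only; the blocklist entry below is a made-up example.
import hashlib

# The list is distributed as opaque hashes; the client cannot tell what
# they correspond to, which is exactly where the trust problem lies.
BLOCKLIST: set[str] = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",  # hypothetical entry
}

def should_report(image_bytes: bytes) -> bool:
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in BLOCKLIST

# Usage: scan an outgoing attachment before it is encrypted and sent.
if should_report(open("photo.png", "rb").read()):
    print("match found -> report to authority")
```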
Because it's closed source so you have no idea of what is happening. They can then scan for other things, such as "hate speech" or "tax evasion", and then the slope becomes more slippery than a lube party on a vinyl sheet, and Kim Jong Un awaits you at the Ski Bar at the bottom.
Those passive surveillance systems have a chilling effect on democracy, just like mandatory ID on social media, and provide politicians a lever so convenient that you know that it will be used, especially in the EU.
> Because it's closed source so you have no idea of what is happening.
Exactly! That's the problem!
It's not killing the encryption, it's not sharing all your communications with the government. Those are invalid arguments. The problem is that whoever controls the proprietary part of ChatControl (and that includes the list of illegal material) can abuse it to e.g. detect political opponents, or whatever they can imagine.
I am just asking that we use the valid arguments against ChatControl. I read a lot of invalid arguments that won't help convincing politicians that it is a bad idea. They need to understand why it is a bad idea, the real reason.
I think that the correct sentence would be that it kills the purpose of encryption, which is to prevent anyone aside from the recipient from reading your message.
Oh, is this the infamous 'redacted list of attendees' when people inquired about who initially worked on this legislation/proposal?
EU seems to be really good at some things, but this is an example of a legislation that can do way much harm than benefit.
Where is Apple in all of this?
They're such proponents of privacy that they've been actively encrypting as much as possible for decades, but now that the EU is about to break all that, they're silent.
They raised such a fuss when the FBI asked to decrypt that single iPhone years ago, but now that millions are on the line... nothing?
When Apple attempted to anticipate these laws and propose a system which tried to navigate a compromise, the “pro-privacy” faction was so politically dumb they spread FUD about it and actively made sure no reasonable compromise could ever be reached. Now the public will reap what these advocates have sowed, good and hard.
With regards to the FBI incident, Apple said at the beginning of their statement, “This moment calls for public discussion, and we want our customers and people around the country to understand what is at stake.”
The EU is proposing a law. People assure me their laws are democratic and reflect the will of the people. Who is Apple to reject the outcome of public discussion?
The FBI letter was written in a context where an agency was acting without the support of the public. That’s why the framing was all about misuse of the All Writs Act and lack of Congressional blessing for the requested power.
What would you call a "reasonable compromise" between encryption and privacy?
ChatControl is exactly what Apple did. It's client-side, so no one is able to see your messages. The police only see whether content hashes match known CSAM.
This must be one of the least popular pieces of regulation ever.
The number of people in these threads defending the involuntary bugging of every phone, because you can devil's-advocate that it might actually save the children, is insane for a forum called Hacker News. Either the contrarian population has been getting out of hand, or we have truly lost our minds and stand to lose what remains of our civil liberties.
This will never not be in the news, will it? I feel like it's been continuously for the past 10-15 years, under various names.
Just need to pass it once, unfortunately. And despite all the talk against it, they get a partial fresh start to the general public every time one of these is proposed.
The IRA quote to Thatcher comes to mind
The people that want this to happen really, really, really want it to happen. They are never going to give up, so people need to remain vigilant.
Honestly, I fully expect that the scanning method is already implemented and in use. The US has intervened with some pretty deep surveillance in the past (e.g. the Sikh killing in Canada) and doesn't seem to need permission to get it.
Sounds to me like the EU is looking to get a more formal approval to act on data they already have.
The EU should rather look at the issues at the eastern border these days.
What would prevent me from writing my own program to do something simple like sending encrypted messages? Or just emails...
They'll push the scanning to the OS level, mandate that the OS does it. Hence the seemingly coordinated effort with Google on the sideloading changes, and enforcing play protect, etc.
Like the TPM & Microsoft scare when TPM first started arriving in hardware, and we all thought it would be used to lock out other OSes. Only it's for real this time.
> They'll push the scanning to the OS level
I don't know if this is so easily possible. Does the OS scan the memory of all applications? How does it know what is text and what is image data?
What if it is encrypted or even just obfuscated? Does the OS then track all changes of memory, etc.?
Or do you think it'll just have a rolling keylogger so you can't type in something malicious?
Everything a process does beyond touching memory is going through a syscall. The OS serves every key press to such a program.
The proposed regulation only applies to publicly available services, and only binds service providers, not end users. There is nothing preventing you from sending encrypted emails, just as there is nothing preventing you from pasting encrypted messages into WhatsApp or storing and sharing encrypted files in Dropbox.
> What would prevent me from writing my own program to do something simple like sending encrypted messages?
Nothing. That is, nothing until your application becomes popular. I will keep encrypting my emails and they can pound sand once legislation for this makes it to my country. It should be a while before these shenanigans are in every distribution or kernel for Linux.
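For reference, the "program" really is a few lines. A minimal sketch with PyNaCl, assuming the two parties exchange public keys out of band (no forward secrecy, no metadata protection; illustration only):

```python
# Minimal public-key messaging sketch with PyNaCl (libsodium).
# Assumes the two parties exchanged public keys out of band.
from nacl.public import PrivateKey, Box
from nacl.encoding import Base64Encoder

# Each party generates a long-term keypair once.
alice_sk = PrivateKey.generate()
bob_sk = PrivateKey.generate()

# Alice encrypts to Bob's public key; the result is safe to paste into
# any channel (email, WhatsApp, a forum post, ...).
box = Box(alice_sk, bob_sk.public_key)
wire = box.encrypt(b"meet at noon", encoder=Base64Encoder)
print(wire.decode())  # base64 ciphertext, nonce included

# Bob decrypts with his private key and Alice's public key.
plain = Box(bob_sk, alice_sk.public_key).decrypt(wire, encoder=Base64Encoder)
assert plain == b"meet at noon"
```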
Good luck being a DOD contractor overseas, wtf?
Good luck having a bank account
Same thing that prevents you from buying a knife and walking around stabbing people.
So you think this is comparable to sending around some data over TCP or UDP?
The people who are trying to install this kind of law basically do!
They want to change the public perception from "Private encrypted communication is good and desirable" to "Encrypted is unsafe. Encrypted could be scary. Encrypted enables Bad People."
In a vain attempt to inhibit access to non-broken cryptography, we will probably see operating systems that allow actual root access to the user -- or even just allow non-manufacturer-signed executables to run! -- being painted as "unsafe platforms." Apple has already transitioned most of the way to being fully in the "trusted computing" camp, since it takes a great deal of gymnastics to even modify the OS because of the Mac's sealed system volume, and out of the box all executables must be blessed by Apple, meaning governments can put their thumb on Apple to force them to disallow any non-broken crypto tools from being used. I know this can be changed in Settings for now, but that'll probably go away eventually.
Microsoft will be next of course, and Linux will be portrayed as a "hacking tool" by contrast to the commercial OSs.
I have a theory that everything that happens in regard to governmental control in China and Russia will eventually be copied in some form in Western countries.
So what if I host my own messaging service? As in: bring back IRC?
The way I understand it, if your solution became popular, the law could come after you to demand a log of messages in plain text.
Also they will have the legal power to force the popular operating systems to enforce generic keylogging/packet capturing and whatnot.
I don't see how they can come after anyone who's using a specific protocol [0] by law. Expanding on this thought: if Chat Control passes, it will just be the death of social media as a chat platform. People will swap to something more rudimentary where it can't be enforced, the primary reason being that it will simply be so much faster/more convenient than the apps forced to implement chat control.
What's happening to streaming services, which are being ditched in favor of piracy, will happen to social media for the same reason.
[0]: https://en.wikipedia.org/wiki/IRC
I don't think ChatControl is a good idea. I also think that if you want to convince people of that, using the same misleading language tactics as the other side is not the way to go.
> These scanning systems get it wrong most of the time. [...] Irish law enforcement confirms this: only 20.3% of 4,192 automated reports actually contained illegal material.
Wrong most of the time that they report something. Technically correct, although a somewhat tricky formulation.
Literally next paragraph:
> Even with hypothetical 99% accuracy (which current systems don’t achieve), scanning billions of daily messages would generate millions of false accusations.
This is a different accuracy percentage: here the author means 99% of all scanned messages, not only the reported ones, which is what the previous 20.3% referred to. Furthermore, these two paragraphs together sound very fishy: if current systems are less accurate than the hypothetical 99%, presumably (?) they generate at least that many false accusations. But combined with the 20.3% true-positive fraction, that would mean hundreds of thousands of true accusations per day.
Which part am I misunderstanding?
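One way to square the two figures: the hypothetical "99% accuracy" describes behaviour over all scanned messages, while the 20.3% is precision over the (tiny) flagged subset, and with a rare-positive base rate both kinds of numbers can coexist. A back-of-the-envelope sketch with purely illustrative figures (none of these come from the article):

```python
# Illustrative base-rate arithmetic: why "99% accurate" scanning of
# billions of messages still floods reviewers with false positives,
# while precision among flagged items stays low. All numbers made up.
messages_per_day = 3_000_000_000      # hypothetical volume scanned
prevalence       = 1e-6               # hypothetical share of truly illegal items
sensitivity      = 0.90               # hypothetical: flags 90% of illegal items
false_pos_rate   = 0.01               # "99% accurate" on innocent content

illegal   = messages_per_day * prevalence
true_pos  = illegal * sensitivity
false_pos = (messages_per_day - illegal) * false_pos_rate

precision = true_pos / (true_pos + false_pos)
print(f"flagged per day  : {true_pos + false_pos:,.0f}")
print(f"false accusations: {false_pos:,.0f}")   # ~30 million with these inputs
print(f"precision        : {precision:.4%}")    # well under 1%
```

A 20.3% precision would imply either a far lower false-positive rate or a far higher prevalence in what actually gets reported, which is exactly why mixing the two metrics is confusing.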
With Apple being able to ban applications from the App Store, and Google now requiring developers to identify themselves before their apps can be installed and being able to block sideloading at any time, I don’t see what choice is left if you want to bypass that privacy invasion.
I mean for the actual legit user. Pedophiles will still be able to use encrypted mail, Android phones that are not Google-certified and thus free to sideload anything, or even just password-protected zips.
The USA wants this to remain a monopoly.
Don't worry the governments would NEVER use this against you for political reasons later.
Then they're not encrypted apps.
Unenforceable tripe. Do not comply.
Ugh, I hate this but literally no one is paying attention.
It's hard because every time this gets defeated, all the EUSSR people just wait a year and try again…
Most arguments I see against ChatControl sound like bullshit to me. How do we expect to convince anyone to go against ChatControl with those?
I feel unease when it comes to ChatControl; I don't want my devices to run proprietary, opaque algorithms on all my data. And it feels like it fundamentally has to be opaque: nobody can publish an open-source list of illegal material together with its hashes (precisely because it is illegal). That is why I don't want ChatControl: I would want someone to formally prove that it cannot be abused, just because of what it means. The classic example being: what happens if someone in power decides to use this system to track their opponents?
But most comments and most articles talk about anything but that, with honestly weird, unsupported claims:
> It's the end of encryption
How so? What appears on my screen is not encrypted and will never be encrypted, because I need to read it. We all decrypt our messages to read them, and we all write them unencrypted before we send them.
> It won't fight CSAM
Who are you kidding? Of course it will. It will not solve the problem entirely, but it will be pretty damn efficient at detecting CSAM when CSAM is present in the data being scanned.
> With ChatControl, every message gets automatically checked, assuming everyone is guilty until proven innocent and effectively reversing the presumption of innocence.
When you board a plane, you're searched. When you enter a concert hall, you're searched. Nobody would say "you should let me board the plane with whatever I put in my bag, because I'm presumed innocent".
> While your messages still get encrypted during transmission, the system defeats the purpose of end-to-end encryption by examining your content before it gets encrypted.
Before it gets encrypted, it is not encrypted. So the system is not breaking the encryption. If (and that's a big if) this system was open source, such that anyone could check what code it is running and prove that the system is not being abused, then it would be perfectly fine. The problem is that we cannot know what the system does. But that's a different point (and one of the only valid arguments against ChatControl).
> Proton point out this approach might be worse than encryption backdoors. Backdoors give authorities access to communications you share with others. This system examines everything on your device, whether you share it or not.
How is it worse? Backdoors give access to communications, this system (on the paper) does not. This system is better, unless we admit that we can't easily audit what the system is doing exactly. Which again is the one valid argument against ChatControl.
> The regulation also pushes for mandatory age verification systems. No viable, privacy-respecting age verification technology currently exists. These systems would eliminate online anonymity, requiring users to prove their identity to access digital services.
This is plain wrong. There are ways to do age verification anonymously, period (a toy sketch is at the end of this comment).
> Police resources would be overwhelmed investigating innocent families sharing vacation photos while real crimes go uninvestigated.
How to say you don't know how the police works without saying you don't know how the police works? Anyway, that's the problem of the police.
> Google’s algorithms flagged this legitimate medical consultation as potential abuse, permanently closed his account and refused all appeals.
The problem is the closing and refusing of appeals.
> The letter emphasizes that client-side scanning cannot distinguish between legal and illegal content without fundamentally breaking encryption and creating vulnerabilities that malicious actors can exploit.
Then explain how? How is it fundamentally breaking encryption and creating vulnerabilities? Stop using bad arguments. If you have actual reasons to go against ChatControl, talk about those. You won't win with the bullshit, invalid arguments.
> ChatControl catches only amateur criminals who directly attach problematic content to messages.
Yep, that's an argument in favour of ChatControl: it does catch some criminals. How many criminals are professionals? Do you want to make it legal to be an amateur criminal?
Don't get me wrong: I am against ChatControl. Because of one argument I believe to be valid: we fundamentally cannot know what the algorithm doing the scanning is doing, so those who control it could abuse it. Of all the discussions I have seen against ChatControl, I haven't seen another valid argument. But this one is enough.
Stop saying bullshit, start using the valid arguments. And maybe politicians will hear them.
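As for the age-verification point above, a toy sketch of the simplest attribute-only approach: a bearer token asserting just "over 18", signed by a trusted issuer and carrying no identity. Real schemes add unlinkability via anonymous credentials or zero-knowledge proofs; this is only meant to show that identity disclosure is not technically required.

```python
# Toy sketch of attribute-only age verification with Ed25519 (PyNaCl).
# The issuer (e.g. a bank or ID provider) signs a claim containing no
# identity; the website verifies the signature and learns only "over 18".
import json, os
from nacl.signing import SigningKey, VerifyKey

# Issuer side: long-term signing key, public key published.
issuer_key = SigningKey.generate()
ISSUER_PUB = issuer_key.verify_key

def issue_token() -> bytes:
    claim = {"over_18": True, "nonce": os.urandom(16).hex()}  # no name, no ID
    return issuer_key.sign(json.dumps(claim).encode())

# Service side: verifies the signature, never learns who the user is.
def verify_token(token: bytes, issuer_pub: VerifyKey) -> bool:
    try:
        claim = json.loads(issuer_pub.verify(token).decode())
        return claim.get("over_18") is True
    except Exception:
        return False

token = issue_token()                   # user obtains this once from the issuer
print(verify_token(token, ISSUER_PUB))  # True; no identity involved
```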
> Don't get me wrong: I am against ChatControl. Because of one argument I believe to be valid: we fundamentally cannot know what the algorithm doing the scanning is doing, so those who control it could abuse it. Of all the discussions I have seen against ChatControl, I haven't seen another valid argument. But this one is enough.
It is not enough to know what the algorithm is doing. It also needs to be possible (for the average user as well) to stop it from doing reprehensible things. If a client-side scanning algorithm is actually searching for e.g. political content, it is possible to detect that via reverse engineering, but merely knowing won't solve the problem; instead it will lead to self-censorship.
Thanks for your feedback. You’ve raised some interesting points, I’ll take them into account and try to update some of my arguments.
I was just thinking that if something like this ever does get through and become law, then creating open-source alternatives which do not obey these laws would be quite trivial. What would not be trivial would be deciding where to host the servers and source code, and how to actually get this software onto people's devices.
What country would be safe for hosting code that does this that people would also trust in general? Would this be hosted on the dark web or would someone actually be brave enough to host it on their private machines? Would there be DNS that could point to this?
Then how would you install the software? You'd need a way to side-load it, which means you'd want a way to sign it. Which means either adding a new root signing authority or being able to have an existing root authority sell you a signing certificate and not revoke it.
You kind of quickly end up in some weird dystopian cyberpunk setting thinking all of this through.
EU CRA disallows shipment of non-accredited binaries in "critical" software categories.
Okay so are they going to block foreign github repos? This seems totally unenforceable.
You underestimate the power EU believes it has
> believes
Subset of industry feedback on EU CRA, https://github.com/orcwg/cra-hub/blob/main/product-definitio...
You just mandate the scanning into the OS, then mandate what OSes hardware is allowed to boot.
> You kind of quickly end up in some weird dystopian cyberpunk setting thinking all of this through.
The most dystopian concept out of everything you mentioned is still "you can't install unsigned software" to me.
Good luck preventing people from loading up a web page that runs a pure JavaScript (or WebAssembly) implementation of common cryptography algorithms and lets people copy and paste each other's encrypted messages.
Chat Control wants to require on-device scanning, so if this becomes common they can move to mandating scanning at the OS or browser level as well.
Good luck convincing American tech to take on a liability like this. There's a reason big tech is moving to e2e encryption like Signal and it isn't user privacy. Telling governments to fuck off because you don't have the data limits liability.
"Luck" wasn't what coerced American tech businesses into subsuming the PRISM program liability. Your naivete is admirable though.
Privacy for me and not for thee?
They'll push for it repeatedly until they succeed, and then it will be irreversible.
I guess they don’t know you can encrypt files before you send them. They don’t even have to look like encrypted files.
Chat Control imagines your device being required to scan and report on all your plaintext.
Encrypted data can be input via analog device sensors.
Can anyone try to explain to me how this is not a strain of mind-reading and thought crime? I mean, sure, we’re several decades away from the big event where society will adjudicate thought-crime, but this appears to be one of the first skirmishes.
ThoughtControl 2030: EU wants to scan all private thoughts and communications. Encryption as a concept prohibited except for corporations with security clearance and political connections.
Thought crime has been illegal in the EU/UK for quite some time. But only certain kinds of thoughts.
Isn't this the same regulatory body that enforced GDPR to supposedly provide citizens with more rights as to what happens to their data? Amusing.
What a classic "Think of the children!" excuse for abuse.
If you are a smart kid in Europe, learn to vibecode XChaCha20 & Ed25519 encryption keys for you and your friends to chat with, so you can tell your incompetent government to go fuck themselves.
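In that spirit, a minimal sketch of what such a toy layer could look like (PyNaCl; assumes the group shares a symmetric key and each other's Ed25519 public keys out of band; no forward secrecy, no key rotation, no metadata protection):

```python
# Toy end-to-end message layer: XChaCha20-Poly1305 for confidentiality,
# Ed25519 for sender authentication. PyNaCl / libsodium primitives.
from nacl import bindings, utils
from nacl.signing import SigningKey, VerifyKey

GROUP_KEY = utils.random(32)          # shared 256-bit symmetric key
me = SigningKey.generate()            # my long-term signing identity

def send(plaintext: bytes) -> tuple[bytes, bytes]:
    signed = me.sign(plaintext)                       # signature || message
    nonce = utils.random(24)                          # XChaCha20 nonce
    ct = bindings.crypto_aead_xchacha20poly1305_ietf_encrypt(
        signed, None, nonce, GROUP_KEY)
    return nonce, ct

def receive(nonce: bytes, ct: bytes, sender_pub: VerifyKey) -> bytes:
    signed = bindings.crypto_aead_xchacha20poly1305_ietf_decrypt(
        ct, None, nonce, GROUP_KEY)
    return sender_pub.verify(signed)                  # raises if forged

nonce, ct = send(b"they can't read this")
print(receive(nonce, ct, me.verify_key))
```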
but then they'll make this a crime
exactly, this is just step 1
they're too slow,
by the time they do, the kids can just vibecode another chat app for themselves
First they came for the Lockdown skeptics
And I did not speak out
Because I was not a Lockdown skeptic
Then they came for the Social distancing skeptics
And I did not speak out
Because I was not a Social distancing skeptic
Then they came for the Face mask skeptics
And I did not speak out
Because I was not a Face mask skeptic
Then they came for the Vaccine skeptics
And I did not speak out
Because I was not a Vaccine skeptic
Then they came for the Vaccine passport skeptics
And I did not speak out
Because I was not a Vaccine passport skeptic
Then they came for me
And there was no one left
To speak out for me
Sounds like a complete tyrannical dystopian hell hole to live in.
But nevermind, We love the EU! /s
I'm absolutely convinced now that anti-war stances will soon be included in the scope of this client-side scanning. Peaceniks beware: citizens should crave war and dying for their elites.
To me this is simply an act of terrorism. People who are behind those proposals should be charged and face trial.
There is no excuse for this and it is a stain on EU history for even letting this go so far.
Anyone proposing this should not only be sacked but also referred to a de-radicalisation / anti-terrorism programme in their country and forever banned from holding any kind of public-sector office.
There is no excuse.
Why downvote? Because the terrorists wear suits, speak in committees, are mostly white, and there’s no blood on the floor (yet)? The method is different, but the aim is the same: intimidation and control of a population for political ends.
If terrorism is defined as using violence or threats to intimidate a population for political or ideological ends, then “Chat Control” qualifies in substance.
Violence doesn’t have to leave blood. Psychological and coercive violence is recognised in domestic law (see coercive control offences) and by the WHO. It causes measurable harm to bodies and minds.
The aim is intimidation. The whole purpose is to make people too scared to speak freely. That is intimidation of a population, by design.
It is ideological. The ideology is mass control - keeping people compliant by stripping them of private spaces to think, talk, and dissent.
The only reason it’s not “terrorism” on paper is because states write definitions that exempt themselves. But in plain terms, the act is indistinguishable in effect from terrorism: deliberate fear, coercion, and the destruction of free will.
You can argue legality if you like, but the substance matches the textbook definition.