Digital Privacy and Freedoms Are Under Threat
The STOP CSAM Act threatens encryption and online privacy. Learn why the EFF opposes it — and how you can help defend digital freedom.

TL;DR:
A new bill in Congress, the STOP CSAM Act of 2025 (S.1829), aims to tackle child exploitation online, a goal we all support. But in the process, it threatens to dismantle key digital protections that keep all of us safe: encryption, free speech, and privacy. The Electronic Frontier Foundation (EFF), a trusted champion of digital rights, is sounding the alarm. We should all be listening.

Read the message from the EFF
What Is the STOP CSAM Act?
On its face, the STOP CSAM Act sounds like common-sense legislation: protect children, punish predators, and help victims. No reasonable person opposes that mission. But buried in the bill are provisions that could radically expand government and corporate power over the internet, in ways that weaken encryption and open the door to surveillance and censorship.
This bill would:
- Undermine encrypted services by creating liability for tech providers that offer end-to-end encryption, pushing companies to create backdoors.
- Force takedowns of lawful content based on vague accusations, turning platforms into over-cautious moderators or outright censors.
- Expose private communications and user data to government and corporate scrutiny, even if no crime has been committed.
If you’ve ever relied on encrypted messaging, cloud backups, or privacy tools, this bill affects you.
Why It Matters (Even If You’re Not a Techie)
Imagine a world where your private messages, health records, or stored files are only “private” until someone in power says otherwise. This bill threatens to make that world a reality. The moment we allow backdoors into encryption “just for the good guys,” we also give the keys to hackers, foreign governments, and abusive regimes.
Encryption works because it’s absolute. Either everyone is protected, or no one is.

The EFF Is Taking a Stand, and So Should We
The Electronic Frontier Foundation (EFF), founded in 1990, is one of the most principled and consistent defenders of digital rights in the world. They’ve fought for your online privacy, freedom of speech, and the right to use strong encryption for over three decades.
In their recent article, the EFF lays out the problem with the STOP CSAM Act clearly: it would break the tools that keep us safe, while offering no meaningful improvement to child safety. The result? More surveillance, less privacy, and a dangerous precedent.
Good Intentions, Terrible Consequences
As a parent, I believe strongly in protecting children. But government overreach, especially when it comes to surveillance and private communications, is never the answer. The road to tyranny is often paved with noble-sounding laws.
The STOP CSAM Act gives both the government and private corporations the ability to sidestep encryption and hold service providers liable for not monitoring users closely enough. That’s a recipe for abuse. Today it’s child safety. Tomorrow it’s political speech, protest coordination, or encrypted journalism.
Current Status
As of today, the STOP CSAM Act of 2025 (S. 1829) is still in the early stages of the legislative process:
- Introduced in the Senate by Senator Josh Hawley (R-MO), alongside cosponsors including Sen. Dick Durbin (D-IL), on May 21, 2025. It has been read twice and referred to the Senate Judiciary Committee.
- No further progress, such as committee hearings, markup, amendments, or a vote, has occurred yet.
- It is awaiting consideration by the Senate Judiciary Committee, which will determine if it advances to a full Senate vote.
- This is the critical window to influence the bill's trajectory: while it is under Senate Judiciary review.
- Congressional outreach now can shape amendments or halt dangerous provisions.
- The lack of a companion House bill means advocates can also reach out to House leadership to prevent parallel legislation.
How You Can Help: Contact Your Representatives
Tell your Senators and Representatives: Do not support S.1829.
Here’s how:
🔍 Step 1: Find your lawmakers
Visit https://www.congress.gov/members or https://www.commoncause.org/find-your-representative/
✉️ Step 2: Call or send a message
You can use this sample message:
Dear [Senator/Representative Last Name],
I’m a constituent writing to express my strong opposition to the STOP CSAM Act of 2025 (S.1829). While I fully support efforts to protect children, this bill poses a serious threat to encryption, privacy, and free speech. It could force companies to weaken security for all users and opens the door to censorship and surveillance.
Please oppose this bill and stand up for digital freedoms and privacy.
Thank you,
[Your Name]
Want to Do More? Support the EFF
You can:
- Follow the EFF on Twitter and Facebook
- Donate to their work at https://supporters.eff.org/donate
- Share their action page and educate others: EFF’s Campaign Against STOP CSAM

A Very Deep Dive
Read on if you want to join me deep in the weeds…
Encryption and Privacy Under the STOP CSAM Act of 2025

The STOP CSAM Act would, in effect, hand the government the keys to your digital safe. They, not you, would get to decide what gets accessed, and when.

End-to-End Encryption: A Cornerstone of Privacy and Security
End-to-end encryption (E2EE) is a method of secure communication that ensures only the sender and intended recipient can read the contents; not even the service provider or a government can. In practice, this means data is encrypted on a user’s device and only decrypted on the recipient’s device, with no intermediate party holding the keys. This provides the “best protection” for personal data, shielding it from hackers, companies, and surveillance. Apps like Signal, and features like Apple’s iCloud Advanced Data Protection, rely on E2EE so that messages, backups, and calls stay confidential. Strong encryption protects individuals from cyberattacks and also empowers free expression: people can communicate without fear of eavesdropping, censorship, or warrantless monitoring. In short, if encryption is weakened, everyone’s privacy and security are weakened; if it remains strong, it safeguards not only personal chats but also journalism, activism, attorney-client communications, and other sensitive exchanges fundamental to a free society.
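To make the mechanics concrete, here is a minimal sketch of the E2EE model using the PyNaCl library (a Python binding to libsodium). This is a toy illustration, not Signal’s actual protocol; real messengers add forward secrecy via the double ratchet on top of this kind of public-key exchange.

```python
# pip install pynacl
from nacl.public import PrivateKey, Box

# Each party generates a keypair on their own device.
# Private keys never leave the device; only public keys are exchanged.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts with her private key and Bob's public key.
# Any server relaying this message sees only ciphertext.
sender_box = Box(alice_key, bob_key.public_key)
ciphertext = sender_box.encrypt(b"meet at 7pm")

# Only Bob, holding his private key, can decrypt.
receiver_box = Box(bob_key, alice_key.public_key)
assert receiver_box.decrypt(ciphertext) == b"meet at 7pm"

# No third party (provider, government, or attacker) holds a key that
# opens this message. That property is what backdoor mandates break.
```

Notice that the provider never appears in the code at all: there is no key it could hand over, which is why compliance with a scanning or decryption mandate requires redesigning the system itself.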
What is CSAM Scanning and How Does It Work?
CSAM scanning refers to automated systems that scan user content for known Child Sexual Abuse Material (CSAM). This can happen on cloud platforms (e.g. Google or Apple scanning photos uploaded to servers) or even on end-user devices (“client-side scanning”). These systems typically use databases of hashes (digital fingerprints) of known illegal images and compare them to user files. For example, in 2021 Apple announced a plan to have iPhones scan photos before upload and alert Apple if a certain number of CSAM matches were found. The goal is to catch criminals, but implementing this means that every photo or message would be automatically inspected by algorithms. To work around encryption, such scanning often happens at the device endpoint (before data is encrypted, or after it’s decrypted on arrival). Critics note that this essentially turns a personal device into a surveillance scanner: even innocent private content gets analyzed. Signal’s president Meredith Whittaker warned that client-side scanning would “turn everyone’s phone into a mass surveillance device that phones home to tech corporations and governments,” fundamentally undermining user privacy. The implications are serious: if mandated, CSAM scanning would end the notion of truly private storage or communications, since an automated watchdog would always be peering into your data.
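A rough sketch of the hash-matching idea follows. It is deliberately simplified: the hash value is a made-up placeholder, and deployed systems such as Microsoft’s PhotoDNA or Apple’s NeuralHash use perceptual hashes that tolerate resizing and re-encoding, not the exact cryptographic digest shown here.

```python
import hashlib
from pathlib import Path

# Placeholder standing in for a real CSAM hash database; real deployments
# receive opaque hash lists from clearinghouses such as NCMEC.
KNOWN_BAD_HASHES: set[str] = {"0" * 64}  # hypothetical entry, not real

def sha256_of(path: Path) -> str:
    """Exact cryptographic digest of a file's bytes."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def scan_library(photos: list[Path]) -> list[Path]:
    """Compare every file against the blocklist and return matches.
    Every photo is inspected, match or not: that universal inspection
    is the privacy cost described above."""
    return [p for p in photos if sha256_of(p) in KNOWN_BAD_HASHES]
```

An exact digest changes completely if a single pixel changes, which is why real scanners use fuzzier perceptual hashing; that fuzziness, in turn, is what produces false positives like the case described next.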
Unintended Consequences of Scanning: False Flags and Broken Lives
While the intent of CSAM detection is to protect children, real-world cases show how these systems can misfire with harmful results. One striking example is Google’s false flag of a father’s account in 2022. A man in San Francisco took photos of his toddler’s infected groin area to consult a doctor, an innocent act of parenting. But because his Android phone auto-backed up images to Google Photos, Google’s automated filters flagged the medical images as CSAM. Without context, the system saw “child’s groin photo” and sounded the alarm. Google shut down the man’s Gmail and other services, and even filed a report to the authorities. The account remained disabled, and the man (initially treated as a potential criminal) faced a police investigation until it was proven he did nothing wrong. Google later refused to reinstate his account, branding his content as a severe policy violation. Privacy experts call this an “inevitable pitfall” of trying to solve a complex social problem with automation. No algorithm can understand context. As a result, a family’s life was upended, with important emails, photos, and digital history locked away, due to a false positive. This case underscores how even well-intentioned scanning can punish innocent users, illustrating the collateral damage when private data is subject to constant surveillance. It’s a sobering warning that “the machinery” of automated scanning and human review can make grave mistakes, and those errors can devastate people’s lives.

The Slippery Slope: From CSAM Scanning to Mass Surveillance

A critical concern with mandated scanning systems is the “slippery slope” – once the capability exists to scan everyone’s communications for CSAM, what stops it from being repurposed? Technology built into our devices to scan for one category of illegal content can easily be redirected to scan for other material. Privacy advocates warn that a backdoor or client-side scanner for CSAM would create a new form of surveillance. If companies like Apple or Google can scan files on your device, governments could compel them to search for other content. Indeed, the American Civil Liberties Union cautioned that if Apple had built its 2021 client-side scanning system, authorities could demand it be used to detect political or religious content: images “that politicians find objectionable” or that “praise opposition parties, mock political leaders… or circumvent government censorship”. In other words, a system intended to shield children could be twisted into a general tool for censorship and political repression. This isn’t hypothetical – it’s a well-documented pattern in digital surveillance. Once a monitoring mechanism exists, the list of targets tends to expand. Today it’s CSAM; tomorrow, it could be drug content, hate speech, copyrighted media, or dissident speech – all scanned automatically under government mandate. Such mission creep is why cybersecurity experts call these proposals dangerous. As Signal’s president put it, it’s “magical thinking” to believe you can have a backdoor that “only works for the good guys” – any weakness in encryption or any scanning system will eventually be exploited for broader surveillance. This is how a purported child-safety measure can pave the way for a full-fledged mass surveillance regime monitoring everyday citizens.
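Nothing in a scanner like the sketch above is specific to CSAM; the matching logic is entirely content-agnostic, and retargeting it is just a database swap. A hypothetical illustration, reusing the scan_library function and hash set from the earlier sketch (all hash values are placeholders):

```python
# The scanner's code does not change; only the list it is fed changes.
BANNED_PROTEST_IMAGERY: set[str] = {"1" * 64}  # hypothetical placeholder
BANNED_POLITICAL_MEMES: set[str] = {"2" * 64}  # hypothetical placeholder

# One update call retargets the same machinery at dissident content.
KNOWN_BAD_HASHES.update(BANNED_PROTEST_IMAGERY | BANNED_POLITICAL_MEMES)
# scan_library() now flags political speech with zero code changes.
```

That the repurposing requires no engineering effort at all is precisely the slippery slope: the expensive part is building the scanning pipeline, and the pipeline is content-neutral.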
Lessons from the UK: How Anti-Encryption Laws Backfire Globally
We don’t have to speculate about these risks – recent events in the UK provide a cautionary tale. The United Kingdom’s Online Safety Bill and related efforts included provisions that could require messaging platforms to remove encryption or implement scanning to combat CSAM. In fact, earlier this year the UK government issued a secret order under the Investigatory Powers Act forcing Apple to disable certain security features. Specifically, Apple was pressured to backdoor its new end-to-end encryption for iCloud backups (the “Advanced Data Protection” feature). Rather than comply, Apple chose to pull that feature for UK customers, reducing security for millions of users. Digital rights groups were alarmed. Amnesty International called the UK order an “alarming overreach” – not only would it let UK authorities pry into people’s private data, it would also undermine the privacy of users worldwide (since a backdoor for one country is a backdoor for all). As Amnesty and Human Rights Watch noted, “access to device backups is access to your entire phone,” and strong encryption is vital to prevent such intrusive access. In other words, the UK essentially tried to compel a weakness in Apple’s encryption, threatening the privacy rights of users far beyond its borders. The backlash has been intense: technologists and even U.S. lawmakers warned that forcing encryption backdoors in the UK would create “systemic vulnerabilities” exploitable by criminals and governments worldwide. Signal flatly stated they would refuse to weaken their encryption – even if it meant pulling out of the UK market – because doing so would endanger all their users globally. The UK example shows that anti-encryption laws don’t just impact one country; they send a ripple effect through the tech ecosystem, jeopardizing the security tools that people around the world rely on.
Privacy-Focused Apps Under Threat: Signal and the Sanctity of Secure Chats
Encrypted messaging apps like Signal are built around a simple promise: no one but you and the intended recipient can ever access your messages. This promise is not just a feature but the very core of the service. For instance, Signal does not even keep cloud backups or metadata accessible to the company; it is designed so that if law enforcement asks Signal for your messages, Signal cannot provide them, because it literally has no access. Laws like the STOP CSAM Act of 2025 directly attack this model. The bill would create new criminal and civil liabilities for platforms that “promote or facilitate” child exploitation or fail to remove CSAM. Crucially, it’s written so broadly that even an encrypted service that unknowingly hosts illegal images (which it cannot see) could be found liable. In practice, this means an end-to-end encrypted app is at risk simply for being encrypted: because it is unable to monitor content, prosecutors might argue it is turning a blind eye or “recklessly” facilitating abuse. The STOP CSAM Act offers an “affirmative defense” if a provider can prove it was technologically impossible to remove the CSAM without breaking encryption. But this flips the burden onto the service: it must fight costly legal battles and somehow prove a negative (that it truly couldn’t have done more), which many smaller companies can’t afford. The end result is that apps like Signal face an impossible choice: either weaken their encryption to scan user content (undermining their fundamental promise), or risk constant lawsuits and even criminal charges. Signal’s leadership has been adamant that it will choose the latter, exiting any market or facing punishment rather than building a backdoor. “We will not walk back, adulterate, or otherwise perturb the robust privacy and security guarantees that people depend on,” said Signal’s president, emphasizing that they refuse to undermine encryption even under legal pressure. STOP CSAM, however, threatens to make such apps effectively illegal unless they betray their privacy principles. This would be a devastating blow to global privacy: millions of activists, journalists, lawyers, and ordinary citizens rely on these tools. Breaking Signal’s encryption “for the children” would break it for everyone, destroying a vital refuge of secure communication in a world where data surveillance is the norm.

Authoritarian Parallels: Surveillance Tech and Suppression of Dissent
It is worth noting that the kind of surveillance infrastructure being debated in democracies under the banner of CSAM prevention is strikingly similar to tactics used by authoritarian regimes. In countries like China and Russia, government authorities routinely demand access to private communications in the name of security or public order. China, for instance, has long required tech companies to maintain encryption backdoors or key escrow so that the state can decrypt any data it wants. This “secure and controllable” mandate ensures that no conversation is truly private from the government’s eyes. Russia has implemented laws (like the Yarovaya law) compelling providers to hand over encryption keys, and it has attempted to ban services that refuse (such as when Telegram declined to enable government access). These regimes provide a chilling preview of a world with weakened encryption: widespread surveillance of citizens’ communications, and the use of that surveillance to crush dissent. Indeed, repressive governments eagerly exploit encryption weaknesses to target their critics. If given a “master key” or built-in scanner, they will use it to persecute journalists, opposition figures, lawyers, minority groups – anyone deemed a threat to the regime. The STOP CSAM Act’s scanning mandates and backdoor pressures could create tools that authoritarian states would love to get their hands on. As one secure email provider noted, even liberal democracies have abused surveillance powers, so imagine what China or Russia would do with a mandate for encryption backdoors. By normalizing the defeat of encryption, we also normalize the practices of digital dictatorships. This is why the head of WhatsApp warned that if a democracy like the UK undermines encryption, “governments around the world [especially where democracy is weaker] will do exactly the same thing.” In effect, adopting laws like STOP CSAM (or its UK/EU equivalents) risks handing a playbook to authoritarian regimes: they can cite “child protection” as precedent to demand the same or go even further. The endgame looks a lot like Orwell’s worst nightmares – a world where private discourse is dead, and every message is subject to potential monitoring by the state.


Conclusion: Balancing Safety and Freedom
Protecting children online is vitally important, but breaking encryption and mandating mass scanning is a cure worse than the disease. End-to-end encryption is not a loophole for criminals – it’s a fundamental safeguard for everyone’s security, from the vulnerable individual to national security as a whole. Client-side scanning and backdoors, as promoted by the STOP CSAM Act of 2025, would force a betrayal of that security and open the door to pervasive surveillance and abuse. The implications discussed above (false positives ruining innocent lives, mission creep toward political surveillance, global tech companies withdrawing services, and authoritarian-style monitoring) demonstrate that the stakes couldn’t be higher. Once we lose truly private, secure communication, we lose a cornerstone of democracy and personal freedom.
Lawmakers must carefully weigh these consequences. The experiences of big tech companies, activists, and even other countries all point to the same truth: we can fight child abuse without abolishing digital privacy. In crafting solutions, it’s critical to uphold the encryption and privacy protections that keep us all safe, because a world without places to speak freely and securely is a world that is dangerous in ways we cannot afford to ignore.