I've already explained the dangers of the EARN IT Act, which is supported by 19 Senators who are misleading people with a "fact" sheet that is itself mostly myths. As Senator Wyden has explained, EARN IT will undoubtedly make the problem of child sexual abuse material (CSAM) worse, not better.
In my initial posts, I compared it to FOSTA, because EARN IT repeats the basics of the FOSTA playbook. But -- and this is very important since EARN IT appears to have significant momentum in Congress -- it's not just FOSTA 2.0. It's significantly more dangerous in several ways that haven't been highlighted in most discussions of the bill.
First, let's look at why FOSTA was already so problematic -- and why many in Congress have raised concerns about the damage it has done, or called for its outright repeal. FOSTA "worked" by creating a carveout from Section 230 for anything related to "sex trafficking." As we've explained repeatedly, the false premise of the bill is that if Section 230 "doesn't protect" certain types of content, that will magically force companies to "stop" the underlying activity.
Except that's wrong. Section 230 provides immunity not just for hosting content, but for the decisions a company makes to deal with that content. By increasing liability, you actually disincentivize websites from taking action against such content, because any action to deal with "sex trafficking" content on your platform can be turned around and used against you in court to show you had "knowledge" that your site was used for trafficking. The end result is that many sites either shut down entirely or put blanket bans on perfectly legal activity to avoid having to carefully review anything.
And, as we've seen, the impact of FOSTA was to put women -- especially sex workers -- in very real danger. Whereas in the past they could take control of their own business via websites, FOSTA made hosting them untenable and risky for the websites. This actually increased the amount of sex trafficking, because it opened up more opportunity for traffickers to step in and provide the services that sex workers had previously gotten from those websites to control their own lives. This put them at much greater risk of abuse and death. And, as some experts have highlighted, these were not unintended consequences. They were widely known and expected consequences of the bill.
On top of that, even though the DOJ warned Congress before the law was passed that it would make it more difficult to catch sex traffickers, Congress passed it anyway, and members patted each other on the back, claiming they had successfully "fought sex trafficking." Except, since then, every single report has said the opposite is true. Multiple police departments have explained that FOSTA has made it harder for law enforcement to track down sex traffickers, even as it's made it easier for traffickers to operate.
Last year, the (required, but delivered late) analysis of FOSTA by the Government Accountability Office found that the law made it more difficult to track down sex traffickers and did not seem to enable the DOJ to do anything it couldn't already do before (things it mostly wasn't doing anyway). The DOJ just didn't seem to need this law that Congress insisted it needed, and has basically not used it. Instead, what FOSTA has enabled in court is not an end to sex trafficking, but ambulance-chasing lawyers suing companies over nonsense -- companies like Salesforce and MailChimp, which are not engaging in sex trafficking, have had to fight FOSTA cases in court.
So, FOSTA is already a complete disaster by almost any measure. It has put women at risk. It has helped sex traffickers. And it has made it harder for law enforcement to find and apprehend sex traffickers.
Already you should be wondering why anyone in Congress would be looking to repeat that mess all over again.
But, instead of just repeating it, they're making it significantly worse. EARN IT differs from FOSTA in a few ways, each of which makes the bill much more dangerous. And, incredibly, it does this without being able to point to a single case in which Section 230 got in the way of a CSAM prosecution.
The state law land mine:
Section 230 already exempts federal criminal law violations. With FOSTA, there was a push to also exempt state criminal law. This has been a pointed desire of state Attorneys General going back at least a decade, and in some cases further (notably, when EARN IT lead sponsor Richard Blumenthal was Attorney General of Connecticut, he was among the AGs who asked for Section 230 to exempt state criminal law).
Some people ask: since federal criminal law is already exempt, what's the big deal with exempting state criminal law too? The question only highlights ignorance of the nature of state criminal laws. Let's just say that states have a habit of passing some incredibly ridiculous laws -- laws that can be impossible to parse (and can even be contradictory). As you may have noticed, many states have become less laboratories of democracy and much more testing grounds for totalitarianism.
Making internet companies potentially criminally liable based on a patchwork of 50+ state laws opens them up to all sorts of incredible mischief, especially when you're dealing with state AGs whose incentives are, well, suspect.
CDT has detailed examples of conflicting state laws and how they would make it nearly impossible to comply:
For instance, in Arkansas it is illegal for an "owner, operator or employee" of online services to "knowingly fail" to report instances of child pornography on their network to "a law enforcement official." Because this law has apparently never been enforced (it was passed in 2001, five years after Section 230, which preempts it) it is not clear what "knowingly" means. Does the offender have to know that a specific subscriber transmitted a specific piece of CSAM? Or is it a much broader concept of "knowledge," for example that some CSAM is present somewhere on their network? To whom, exactly, do these providers report CSAM? How would this law apply to service providers located outside of Arkansas, but which may have users in Arkansas?
Maryland enables law enforcement to request online services take down alleged CSAM, and if the service provider doesn't comply, law enforcement can obtain a court order to have it taken down without the court confirming the content is actually CSAM. Some states simply have incredibly broad statutes criminalizing the transmission of CSAM, such as Florida: "any person in this state who knew or reasonably should have known that he or she was transmitting child pornography . . . to another person in this state or in another jurisdiction commits a felony of the third degree."
Finally, some states have laws that prohibit the distribution of "obscene" materials to minors without requiring knowledge of the character of the material or to whom the material is transmitted. For example, Georgia makes it illegal "to make available [obscene material] by allowing access to information stored in a computer" if the defendant has a "good reason to know the character of the material" and "should have known" the user is a minor. State prosecutors could argue that these laws are "regarding" the "solicitation" of CSAM on the theory that many abusers send obscene material to their child victims as part of their abuse.
Early versions of FOSTA had a similar carve-out for all state criminal laws, but after these concerns were raised with Congress, it was modified so that it only applied to state criminal laws if the conduct also violated federal law. EARN IT has no such condition. In other words, EARN IT opens up the opportunity for significantly more mischief, letting state legislatures modify their laws in dangerous ways... and then enabling state AGs to go after companies for criminal violations. Given the current power of the "techlash" to attract grandstanding AGs who wish to abuse their power to shake down internet companies for headlines, all sorts of nonsense is likely to be unleashed by this unbounded state law clause.
The encryption decoy:
I discussed this a bit in my original post, but it's worth spending some more time on. When EARN IT was first introduced, the entire tech industry realized that it was clearly designed to completely undermine end-to-end encryption (a goal of law enforcement for quite a while). Realizing that those concerns were generating too much negative attention for the bill, its backers worked out a "deal" to add Senator Pat Leahy's amendment, which appeared to say that the use of encryption couldn't be used as evidence of a violation of the law. However, in a House companion bill that came out a few months later, that language was modified in ways that looked slight but actually undermined the encryption carve-out entirely. From Riana Pfefferkorn, who called out this nonsense two years ago:
To recap, Leahy's amendment attempts (albeit imperfectly) to foreclose tech providers from liability for online child sexual exploitation offenses "because the provider": (1) uses strong encryption, (2) can't decrypt data, or (3) doesn't take an action that would weaken its encryption. It specifies that providers "shall not be deemed to be in violation of [federal law]" and "shall not otherwise be subject to any [state criminal charge] ... or any [civil] claim" due to any of those three grounds. Again, I explained here why that's not super robust language: for one thing, it would prompt litigation over whether potential liability is "because of" the provider's use of encryption (if so, the case is barred) or "because of" some other reason (if so, no bar).
That's a problem in the House version too (found at pp. 16-17), which waters Leahy's language down to even weaker sauce. For one thing, it takes out Leahy's section header, "Cybersecurity protections do not give rise to liability," and changes it to the more anodyne "Encryption technologies." True, section headers don't actually have any legal force, but still, this makes it clear that the House bill does not intend to bar liability for using strong encryption, as Leahy's version ostensibly was supposed to do. Instead, it merely says those three grounds shall not "serve as an independent basis for liability." The House version also adds language not found in the Leahy amendment that expressly clarifies that courts can consider otherwise-admissible evidence of those three grounds.
What does this mean? It means that a provider's encryption functionality can still be used to hold the provider liable for child sexual exploitation offenses that occur on the encrypted service -- just not as a stand-alone claim. As an example, WhatsApp messages are end-to-end encrypted (E2EE), and WhatsApp lacks the information needed to decrypt them. Under the House EARN IT bill, those features could be used as evidence to support a court finding that WhatsApp was negligent or reckless in transmitting child sex abuse material (CSAM) on its service in violation of state law (both of which are a lower mens rea requirement than the "actual knowledge" standard under federal law). Plus, I also read this House language to mean that if WhatsApp got convicted in a criminal CSAM case, the court could potentially consider WhatsApp's encryption when evaluating aggravating factors at sentencing (depending on the applicable sentencing laws or guidelines in the jurisdiction).
In short, so long as the criminal charge or civil claim against WhatsApp has some "independent basis" besides its encryption design (i.e., its use of E2EE, its inability to decrypt messages, and its choice not to backdoor its own encryption), that design is otherwise fair game to use against WhatsApp in the case. That was also a problem with the Leahy amendment, as said. The House version just makes it even clearer that EARN IT doesn't really protect encryption at all. And, as with the Leahy amendment, the foreseeable result is that EARN IT will discourage encryption, not protect it. The specter of protracted litigation under federal law and/or potentially dozens of state CSAM laws with variable mens rea requirements could scare providers into changing, weakening, or removing their encryption in order to avoid liability. That, of course, would do a grave disservice to cybersecurity -- which is probably just one more reason why the House version did away with the phrase "cybersecurity protections" in that section header.
So, take a wild guess which version is in this new EARN IT. Yup: the House version. Which, as Riana describes, means that if this bill becomes law, encryption becomes a liability for every website.
FOSTA was bad, but at least it didn't also undermine the most important technology for protecting our data and communications.
The "voluntary" best practices committee tripwire:
Another difference between FOSTA and EARN IT is that EARN IT includes a very, very strange best practices committee, called the "National Commission on Online Child Sexual Exploitation Prevention," or NCOSEP. I'm going to assume the similarity of that acronym to NCOSE (the National Center on Sexual Exploitation -- formerly Morality in Media -- which has been beating the drum for this bill as part of a plan to outlaw all pornography) is on purpose.
In the original version of EARN IT, this commission wouldn't just come up with "best practices" -- Section 230 protections would only be available to companies that followed them. That puts a tremendous amount of power in the hands of the 19 Commissioners, many of whose seats are designated for law enforcement officials, who don't have the greatest history of caring one bit about the public's rights or privacy. The Commission is also heavily weighted against those who understand content moderation and technology: it would include five law enforcement members (the Attorney General, plus four others, including at least two prosecutors) and four "survivors of online child sexual exploitation," but only two civil liberties experts and only two computer science or encryption experts.
In other words, the commission is heavily biased toward moral panic and toward ignoring both privacy rights and the limits of technology.
Defenders of the bill note that this Commission is now effectively powerless: in theory, the best practices it comes up with don't carry any additional legal weight. But the reality is that we know such a set of best practices, coming from a government commission, will undoubtedly be used over and over again in court to argue that this or that company -- by not following every such best practice -- is somehow "negligent" or otherwise malicious in intent. And judges buy that kind of argument all the time (even when best practices come from private organizations, not the government).
So the best practices are likely to be legally meaningful in practice, even as the bill's backers insist they're not. (Which raises a separate question: if the Commission's best practices are meaningless, why are they in the bill at all?) Since they'll certainly be used in court, they'll carry great power -- and the majority of the Commission will be made up of people who have no experience with the challenges and impossibility of content moderation at scale, no experience with encryption, and no experience with the dynamic, rapidly evolving nature of fighting content like CSAM. They will write these "best practices" while the actual experts in technology and content moderation sit in the minority on the panel.
That is yet another recipe for disaster that goes way beyond FOSTA.
The surveillance mousetrap:
Undermining encryption would already be a disaster for privacy and security, but this bill goes even further in its attack on privacy. While it's not explicitly laid out in the bill, the "myths and facts" document that Blumenthal & Graham are sending around reveals -- repeatedly -- that they think the way to protect yourself against the liability regime this bill imposes is to scan everything. That is, this is really a surveillance bill in disguise.
Repeatedly in the document, the Senators claim that surveillance scanning tools are "simple [and] readily accessible" and suggest that only companies that don't spy on every bit of data would have anything to worry about under this bill.
It's kind of incredible that this comes just a few months after the huge public uproar over Apple's plans to scan people's private data. Experts highlighted how such automated scanning was extremely dangerous, open to abuse, and a serious threat to privacy. Apple eventually backed down.
But it's clear from Senators Blumenthal & Graham's "myths and facts" document that they think any company that doesn't try to surveil everything should face criminal liability.
And that becomes an even bigger threat when you realize how much of our private lives and data have now moved into the cloud. It wasn't that long ago that we stored our digital secrets on local machines; these days, more and more people keep more of their information in the cloud or on devices with continuous internet access. And Blumenthal and Graham have made clear that, in their view, companies that do not scan the cloud storage and devices they have access to should face liability under this bill.
So, beyond the threat of crazy state laws, beyond the threat to encryption, beyond the threat from the wacky, biased Commission, this bill also suggests the only way to avoid criminal liability is to spy on every user.
So, yes, more people have now recognized that FOSTA was a dangerous disaster that has literally gotten people killed. But EARN IT is way, way worse. This isn't just a new version of FOSTA. This is a much bigger, much more dangerous, much more problematic bill that should never become law -- but it has tremendous momentum to do just that in a very short period of time.