Congress Remains Undefeated in Pushing Bad Tech Bills

There’s a thing called the politician’s fallacy, which goes like this:
- We must do something.
- This is something.
- Therefore, we must do this.
This fallacy explains a lot about congressional dysfunction, but it’s missing a key piece: where does Congress get The Something? And when it has to do with technology or speech, why is it usually bad?
The answer requires a bit of history. While there was no golden age of bill drafting, there is a clear dividing line: before and after Newt Gingrich’s Contract with America. Gingrich seized upon fears of rising government debt to put forward a plan to dramatically cut the size of the government, a plan largely drafted by the Heritage Foundation (sound familiar?).
Among other things, it amounted to a massive federal brain drain. Gingrich slashed committee staff by a third, reduced the number of committees, and defunded the Office of Technology Assessment, a nonpartisan office in Congress that analyzed complex scientific and technical issues. Gingrich also successfully politicized hiring expert staff and paying them adequate wages, making it difficult to this day for Congress to retain staff.
Even after Gingrich gutted internal knowledge in DC, Congress had fallbacks to ensure that members would be educated about the important topics they make laws on—but journey with us as we watch those dominoes fall:
Congressional committees are supposed to hold hearings with expert witnesses, seek their advice on current business, and then use that advice to “mark up,” or improve, legislation. That doesn’t happen anymore. Instead, hearings are carefully choreographed to either generate clips for TV and social media or promote legislation favored by committee leadership. The result is that much of the legislation in Congress is either drafted directly by favored stakeholders (often big business) or drafted in concert with a coalition of favored stakeholders (often constituents with real issues, but who are rarely experts and are often convinced that ineffective legislation will solve their problems).
Once a bill draft is in hand, more problems arise. Regulating tech is complicated, and well-intentioned bills often do little good and much harm in practice (like SESTA/FOSTA). Congress insists on ignoring experts and trying to pass bad bills because of a political sunk cost dynamic that roughly tracks the politician’s fallacy (see above). Once members of Congress have the support of a coalition of favored stakeholders, they are assured credit for whatever passes, or blame if it does not. They repeatedly prioritize getting that credit over the actual impact of the law being considered.
Once a bill’s cosponsors rally the necessary support from coalition partners, they have a path to pass the bill (a very hard thing to do in a Congress plagued by inaction). At that moment, any changes to the bill text become adversarial to their true cause: getting credit. Concerns about bill structure, language, or constitutionality are brushed aside as irrelevant to getting credit.
Tragically for the communities harmed by this style of careless lawmaking, so far none of these after-the-fact problems have come back to bite the lawmakers who ignored the flaws in their legislation. In the lead-up to the passage of SESTA/FOSTA, Congress got a huge bump from celebrities who cut ads thinking they were going to stop sex trafficking. When SESTA/FOSTA fell on its face, did any of those celebrities show up and ask Congress to fix it? No.
While this may sound like Congress being actively malicious, the bleaker truth is that most members simply fail to learn enough about tech bills to meaningfully engage with their content. Their undervalued staff are juggling dozens of other priorities and can’t spend the time necessary to get their boss to understand. In this dynamic, the easiest path is the most likely one, so if a bill has a good story, that’s good enough for these ill-informed lawmakers and their overworked staff.
Meanwhile, some lawmakers are truly malicious, actively using popular tech bills to push pet projects like censoring speech they don’t like or killing encryption, without their cosponsors, or the people in Congress who ostensibly care about human and constitutional rights, even picking up on it. Often it’s hard to tell whether the problems in well-intentioned but poorly drafted bills, like Take it Down, are intended or unintended.
The story of KOSA, the so-called Kids Online Safety Act, and our opposition to it, is a case study of this broken system. KOSA was never really the product of an honest fact-finding effort to get to the bottom of the harms of Big Tech and offer solutions. One of its core provisions, the Duty of Care, is a can-kicking measure: it amounts to telling tech companies to stop doing bad things, and it gives the FTC the authority to punish them if things the FTC decides are bad still occur. The details are left for the FTC and tech companies to figure out. This is bad at the best of times, because it hands the FTC enormous power over speech (and similar state bills have been struck down on First Amendment grounds), but in this administration? It’s a distinctly horrific idea. After all, this is the same FTC that is stretching the definition of “unfair and deceptive trade practices” to target trans care and DEI.
Under enormous, sustained, multi-year pressure from civil society and queer kids, KOSA’s authors have made some small changes in response to First Amendment concerns. Yet tellingly, the drafters have been hostile toward those who raise the underlying speech problems and opportunities for abuse, because of the sunk cost of already having a coalition behind the bill (at least in the Senate). The Senate coalition backing KOSA includes many who explicitly want tech companies to filter what they call “bad” content, and they’ve been playing semantic games, pretending that regulating content isn’t a core part of the bill, to dodge the deserved criticism that the government shouldn’t be deciding which speech the public gets to see.
Unfortunately for them, they can’t keep the cat in the bag. The bill’s proponents keep rolling out examples of content they want censored: Fairplay has said it’s about stopping tech from sending extreme diet videos. Senator Blumenthal has said it’s about stopping “destructive content.” Senator Blackburn has said it’s about “protecting minor children from the transgender in this culture.” They are clear that what they want is a censorship lever to pull whenever those in power decide they don’t like a certain kind of speech.
In a transparent and bizarre upside-down world of doublethink, the bill’s sponsors give these examples and yet still insist that KOSA is not about censoring speech. This hypocrisy is fairly typical for a bill in this position: instead of definitively cutting the First Amendment violations out of the bill’s text, the sponsors express their intentions for how the bill should be used and wash their hands of it, hoping someone else, like a Big Tech company or Trump’s FTC, will just so happen to interpret this bad, unclear law in the manner they would prefer.
The current state of KOSA is as follows: a Duty of Care requires tech platforms to “exercise reasonable care in the creation and implementation of any design feature to prevent and mitigate the following harms to minors” (harms that we will consider below). The bill’s proponents have reworked it to claim that these regulations concern only “design features,” and that the bill therefore cannot be about regulating content because it focuses on design.
The regulated designs are defined as anything “that will encourage or increase the frequency, time spent, or activity of minors on the covered platform.” This definition goes on to explicitly include “personalized design features,” aka algorithms. Algorithms are what social media and other platforms use to rank and return content to users—in other words, how platforms decide what speech will be shown to whom. This definition of “design features” sweeps everything about content back in and onto the chopping block at Trump’s FTC.
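To make concrete why a “personalized design feature” can’t be separated from content decisions, here is a minimal, hypothetical sketch of a personalized feed. It is not drawn from any platform’s actual code or from the bill’s text; the post fields, interest weights, and scoring are invented purely for illustration:

```python
# Hypothetical, simplified sketch of a "personalized design feature":
# a recommender that ranks posts for a given user. All names and numbers
# are invented for illustration.
from dataclasses import dataclass

@dataclass
class Post:
    post_id: int
    topic: str               # e.g. "fitness", "news", "support"
    engagement_score: float  # how much engagement the post has drawn so far

def rank_feed(posts: list[Post], user_interests: dict[str, float]) -> list[Post]:
    """Order posts by how engaging we predict they'll be for this user."""
    def predicted_engagement(post: Post) -> float:
        interest = user_interests.get(post.topic, 0.1)  # default: mild interest
        return interest * post.engagement_score
    return sorted(posts, key=predicted_engagement, reverse=True)

# The "design feature" and the content decision are the same step:
# whatever scores highest is the speech this user is shown first.
feed = rank_feed(
    [Post(1, "fitness", 0.9), Post(2, "news", 0.4), Post(3, "support", 0.7)],
    user_interests={"fitness": 0.8, "support": 0.5},
)
print([post.post_id for post in feed])  # [1, 3, 2]
```

The only thing this “design feature” does is decide which posts rank highest, which is to say, which speech a given user sees first. Regulate the ranking and you are regulating the content.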
Beyond this faulty definition, the bill’s authors still have not purged its true focus on “bad content” from the text. As mentioned above, the Duty of Care lists several harms that platforms must prevent or mitigate. Of those, four will sweep in protected speech to some degree:
- Eating disorders, substance use disorders, and suicidal behaviors.
- Depressive disorders and anxiety disorders when such conditions have objectively verifiable and clinically diagnosable symptoms and are related to compulsive usage.
- Patterns of use that indicate compulsive usage.
- Physical violence or online harassment activity that is so severe, pervasive, or objectively offensive that it impacts a major life activity of a minor.
In a country with laughably insufficient healthcare, education, and social support, where many parents have to work multiple jobs and still remain in poverty, and where youth face a looming climate crisis that is proving existential, it is profoundly disingenuous to ascribe the responsibility for young people’s suffering to their one consistent tool for education, connection, and social empowerment: the Internet. Even the Surgeon General’s report, which many rely on to say social media is harmful to children, finds that it is a mixed bag of risks and potential benefits.
What KOSA does by listing these harms is point to profound suffering that we all have sympathy for, allow the FTC to stand up a fictional youth who might be harmed by exposure to whatever content the FTC and the current administration dislike, and let the agency unilaterally ban any algorithm that promotes such content. Mental health is not objective, and it would be hard enough to determine whether any use of technology or platforms has caused specific harms to a specific individual. But under KOSA, the FTC gets to declare what is harmful to an entire population and make scary legal threats to a company if it does not change its design and content moderation practices to conform to the FTC’s demands. This opens the door far too wide to abuse.
As tech blogger Mike Masnick has cataloged, far-right proponents of KOSA like Heritage have explicitly framed censoring trans content as protecting kids under this bill. And the Trump FTC has shown a willingness and a desire to use any authority it has to settle scores and pressure companies into adopting policies that align with Trump’s ideology. The Trump administration has already used every opportunity to police the speech it itself controls.
All of this is in the context of mental health issues that are deeply personal and contextual. As the NIH states: “there is no objective testing available to diagnose depression.” This bill paints mental health with broad strokes and enlists private corporations as arbiters of mental health goals and decisions for which they are ill-suited. This is wrong.
If passed, KOSA will result not only in over-moderation or wholesale censorship of any content that Trump, or any future administration, would prefer not exist on the Internet. It will also destroy the content that counters the harms KOSA claims to address. Big Tech companies will not be able to discriminate between eating disorder content and content that fights back against eating disorders. They will censor suicide prevention just as they censor suicide promotion. They will do this because it’s virtually impossible for algorithms to tell the difference, especially when it comes to honest content that treats the audience with respect, and algorithms are the only way to sort content at the current scale of the Internet. They will cut children, some of the Internet’s most vulnerable users, off from the very resources they may need most in a moment of crisis. And they will do so because it will be required by law.
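To illustrate the over-removal problem, consider a deliberately naive, hypothetical moderation filter. Real platform classifiers are far more sophisticated, but they grapple with the same underlying issue: harmful content and the speech that counters it share the same vocabulary. Every term and example below is invented for illustration:

```python
# Deliberately naive, hypothetical keyword filter, invented for illustration.
# The point: a filter keyed to a topic cannot tell promotion from prevention.

FLAGGED_TERMS = {"suicide", "self-harm", "eating disorder"}

def would_suppress(post_text: str) -> bool:
    """Flag any post that mentions a regulated topic, regardless of intent."""
    text = post_text.lower()
    return any(term in text for term in FLAGGED_TERMS)

# Promotion and prevention trip the exact same filter:
print(would_suppress("tips for hiding your eating disorder from your family"))        # True
print(would_suppress("how I recovered from my eating disorder, and how you can too"))  # True
print(would_suppress("if you're thinking about suicide, you can call or text 988"))    # True
```

A platform facing legal liability for the first post has every incentive to suppress all three, and recovery stories and crisis resources get swept out along with the content that harms.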
This is the politician’s fallacy at work: a law to protect kids that, if passed, will likely hurt more kids than it helps—and censor the Internet for the rest of us in the process.