Bad Internet Bills w/ Taylor Lorenz: KOSA, SCREEN Act, & Repealing Section 230
E3

Every top expert has said this is a complete moral panic.

This is nonsense.

Sites like Reddit couldn't exist.

Platforms like Twitter couldn't exist.

A major Biden, you know, White House advisor got up on stage and said,

"We are going to remove anonymity from the internet."

This is a bipartisan effort.

So there's not just one group that wants to do this.

RIP Privacy Guides.

We need to talk about what happened in Washington last week,

because while a lot of people are distracted by the holidays,

a House subcommittee is trying to take a sledgehammer to the open internet.

On Thursday, the Energy and Commerce Subcommittee on Commerce, Manufacturing and Trade officially

advanced a bundle of nearly 20 bills, many of which have names that sound harmless or

even helpful, like the Kids Online Safety Act, or KOSA, and the SCREEN Act.

But when you actually take a look at these bills, the reality is much darker.

These bills are not about safety at all.

They are about implementing mass surveillance on an unprecedented scale, and dismantling

privacy and free speech on the internet as we know it.

Among these bills being advanced is the SCREEN Act, which would effectively mandate identity

verification via a government ID to access completely legal content across most of the

internet.

Then there's KOSA, a bill which would give state attorneys general and the government the power

to decide what content will require identity verification just to access, which will be

used against educational resources and life-saving healthcare information.

Looming over all of this is an ongoing push to repeal or gut Section 230, a law

which was specifically created to prevent the courts from misinterpreting the

First Amendment and it protects technology platforms from being sued

into oblivion just because they provide the tools which enable user-generated

content on the internet. These bills are moving fast and they're being fast-tracked

by lawmakers who are counting on the idea that people won't notice until it's too late.

If these pass, the internet will become a place where privacy and anonymity are illegal,

where encryption makes you a suspect, and where Americans' ability to speak freely is determined

by the most restrictive state government in the country.

To break down these bills, and more importantly, what we need to know to stop them, my colleague

Nate sat down with technology journalist Taylor Lorenz.

Taylor has been covering the bad internet bills beat very extensively recently, and

I'm also going to link to badinternetbills.com in the description, and I highly recommend

you open that website up while you listen to this interview.

It lists the bills, it explains the threats, and it gives you the tools you need to contact

your representatives, which is super important.

This is one of the most critical fights for digital rights that we've seen in a decade.

Here's Nate's conversation with Taylor.

Hello, everyone. I am Nate from Privacy Guides, and today we are talking to Taylor Lorenz.

Hi, Taylor.

Hi. Thanks for having me.

Thank you so much for being here today. Taylor's going to talk to us a little bit about some

upcoming laws in the US that are impacting the privacy and security of everyone. But

first things first, for our listeners who may not be familiar with your work, could

you tell us a little bit about yourself? Who are you and what's your background on these

kinds of issues?

I'm Taylor Lorenz. I've been a technology journalist for over 15 years now, crazy.

I cover mostly tech from the user side, so how people use technology, how people are

affected by technology, and I cover a lot related to tech policy and privacy issues.

So a lot about sort of free speech issues on the internet and also just surveillance,

government surveillance, corporate surveillance, the data broker industry, things like that.

And I have a podcast and a YouTube channel and a newsletter and yeah, I write about all this stuff.

Also, I worked in the mainstream media, so I covered a lot of this stuff for the Washington

Post, New York Times, The Atlantic, the Guardian, and a bunch of other mainstream

outlets.

So, like I said, we're here today to discuss a few laws that are currently in front of

Congress and they are very worrying laws.

There's actually a bunch of laws right now.

I think there's like 18 last time I checked.

But specifically, we're going to highlight three that we at Privacy Guides feel are particularly

concerning.

And those are KOSA, the Kids Online Safety Act, the SCREEN Act, and, we'll

say, some proposed changes to Section 230.

So I'm going to let you decide how we should kick this off.

Which one of those would you say is either most worrying or at very least you think we

should start talking about first?

Repealing Section 230 is insane.

And just to be clear, that's what the Democrats tweeted yesterday.

It wasn't just like, we want to reform it, it's, we're working on repealing it,

which is ultimately what these reform conversations are about.

They're about decimating it.

That's to me so terrifying because if we repeal Section 230, like all these other conversations

will be irrelevant because we will no longer have free expression, user-generated content

online.

I agree.

That's a good one to start with.

So for listeners, in a nutshell, and please correct me if I get this wrong, Section 230

basically says that platforms are not legally liable for the content of their users as long

as they make a good faith effort to moderate the content.

For example, at Privacy Guides, we have our own forum.

And as long as we make a good faith effort

to remove any sort of illegal content,

that means that we're not responsible for anything

that slips through the cracks.

So from our perspective, this seems like a good move.

We don't wanna be hosting harmful content.

So what is the argument to get rid of 230?

I'm trying to wrap my head around that one.

- There is no argument other than mass censorship.

I mean, so let's be clear sort of how this all began.

Section 230 is a law that's part of the Communications Decency Act,

which passed back in the '90s when the internet was just starting out.

And when the internet was starting out, there was this big debate over what the internet should be.

You know, was it going to be like movies?

Like, was it going to be like, you know, where there's a rating system on every single piece of content

and content has to go through review before it's posted on the internet?

You know, should there be some sort of body that regulates what sort of information you can upload and access?

And back then, rightfully so, people who were proponents

of the internet were like, no, that's not

what the internet's about.

The internet is this free place.

We want it to be this global--

I hate the phrase town square, but we

want it to be this global information resource, right?

Where anybody anywhere around the world

can access information, connect with each other.

This is when email was just getting started.

Social media barely was just kicking off back then.

So this is years before even Facebook launched.

When social media started to get traction, obviously they rely so heavily on Section

230. As you mentioned, sites like Reddit couldn't exist. Platforms like Twitter couldn't

exist because they rely on hosting user-generated content, which means that they, the platform,

they don't have to review every single piece of content before it gets published.

This is also how every single forum exists online, and even some email and

messaging services function, you know, relying on this landmark internet law that truly protects

the free internet. Now there's been all this nonsense about how the internet

is harmful for children. Now let me just be clear. Actually every single expert that studies

this topic is pretty consistent and has come out. Every top expert has said, this is a complete

moral panic. This is nonsense. Like actually there's no evidence at all that social media

is harming children. In fact, it helps a lot of their mental health. Like this is all just

sort of media driven moral panic. It's very similar to what we saw with video games,

television, comic books, we see these exact same arguments

come up over and over and over again.

That has given a lot of people that have wanted

to censor the internet, people in the government

that don't like what the public is saying,

an opportunity to say, ooh, we can sort of use this thing

of like protecting children to censor the internet,

'cause what the government ultimately wants

is to control speech online.

They don't want you criticizing the government.

They don't want journalists out there challenging power,

et cetera, especially not the government these days,

which is, I would argue, more authoritarian

than even a decade ago.

So they've decided that Section 230 has become this boogeyman,

where basically they're like, let's destroy Section 230.

That will destroy user-generated content.

We'll eliminate free speech.

And they claim, oh, we just want to reform it.

We want to make-- we want to hold big tech accountable.

You know, why isn't Facebook more accountable for the things

that are being said on the platform?

First of all, we have accountability.

Like, you can sue these platforms

if they're doing illegal things.

You can prosecute these platforms for certain types of harm.

Like, it's not like it doesn't exist.

If Facebook played an active role in doing something harmful,

like Facebook has been subject to countless lawsuits.

It's not like you cannot sue Facebook.

But as you mentioned, we want to hold the people

responsible for the speech.

So if somebody is using a messaging app, for instance,

to mass message people information

that is considered harmful or is defamatory or something,

that person who says the speech should

be liable for the speech.

For instance, we don't go around, you know, making it possible to sue AT&T because you

said something over a phone call that made somebody else mad, which by the way, they

wanted to do that.

Let's be clear.

That's what the media was arguing for.

Back with landline telephones.

They were arguing that, you know, landline telephones, people were getting so heated on

conversations on telephones.

It was making people angry and driving violence in society and harming people's mental health

because they'd get really upset on the phone.

One guy tried to use it as a defense in his murder case, actually, that he was so angered

by this phone call. And the media called on telephone companies to be regulated.

It's so crazy because the exact same arguments have been used against technology every single

time.

They argued this about newspapers, they argued this about novels.

All of that's to say is none of this is about protecting kids.

It's about the government seizing control over online speech.

Section 230, like you said, it relates to a wide range of political issues.

And I think some listeners can probably already hear, you know, you mentioned like, you

can sue Facebook and Facebook is still around, but if you sue some of these smaller

platforms like Signal or Mastodon, they don't have the same kind of money to withstand these

kind of lawsuits.

No, no, no, let me be clear.

Currently they don't have to have that money because they are not liable.

The people that post on them are liable for their speech.

You could sue those companies now if they were complicit.

If they break the law, if Mastodon breaks the law or something, right,

you could sue them for doing something illegal.

Or you could sue-- again, there are these lawsuits that come up.

There are more protections for any platform

that hosts user-generated content.

You mentioned, if you revoke Section 230, first of all,

yes, it would consolidate the power of Big Tech,

but Big Tech would have a lot more moderation capabilities,

so they would have the ability to host more content.

But pretty much every user-generated--

every small site that relies on user-generated content,

it's not that they just don't have the money to

withstand the lawsuits, it's that they don't have the money to moderate every single piece of content.

They don't have the money. A small forum doesn't have the resources to read every single comment.

And then of course, they also want these platforms to verify the identity, the offline legal identity

of every person as well, which is a whole other issue. That's the problem with the other two

laws that you mentioned. Yeah, thank you for clarifying that. That's probably a better way

to get at that. How does Section 230 relate to privacy specifically?

Our forum isn't even that big, relatively speaking.

And I couldn't imagine having our volunteer moderators

or even our staff, like that would be multiple full-time jobs

just for our forum to sift through that.

What are some other ways this might impact

privacy tools and the privacy community?

- Yeah, privacy specifically.

I mean, you're just not gonna have a lot of privacy.

Like for instance, a lot of people choose not to use

the major social networks because of privacy reasons,

because these are massive data harvesting operations

and they don't wanna give their data to Meta

or Google or whatever.

They want to use some of these smaller sites,

or they want to engage in discussions around things

like abortion or their gender identity,

or things that they want to keep private.

And they don't necessarily want to log into a Facebook

account to have those conversations,

or log into Instagram and have to verify their information

to do that.

So they want to use some of these smaller forums.

Those forums will all be wiped out.

Disabled people rely on this as well.

They want to discuss medical conditions

without their insurance premiums going up,

because a data broker bought their info

from a social platform.

So even just the ability to participate in some of these smaller, more niche forums that

have really good privacy protections will go away.

You will have to use these major social networks, really just Google and Meta, which

have a horrible privacy record, of course, as we know.

Yeah, especially Meta, really bad.

Oh yeah.

I think it's really important for people to note that this is a bipartisan effort.

So there's not just one group that wants to do this, but the Democrats are really leading

the charge. And Democrats like have been so aggressive in pushing these surveillance laws

and censorship laws. It's scary. I think a lot of people associate Trump with these things.

And Trump has done a lot to attack the traditional media, but the Democrats are enacting policies

that are further to the right of Trump on the internet.

So we've been mentioning age verification and identity verification. And based on my

research, it is both KOSA and the SCREEN Act,

but I'm mostly seeing it with the SCREEN Act,

which I have not been hearing about very much.

I had to really dig to find information about this one.

So again, correct me if I'm wrong,

but if I understand it correctly,

the SCREEN Act is an age verification bill.

Like there's kind of not much else to it.

- It's an identity verification bill.

- Identity, yes, thank you.

I should really start saying that instead of age verification.

So it would require platforms to verify users.

Is this again, because it's kind of hard to find information about this one.

Is this just across the board, all platforms? Or are they trying to say,

like, is this one of those, oh, only larger platforms? How is that one supposed to work?

Yeah.

So, um, this came out of anti-porn groups and these extreme far-right, sort of

Project 2025, you know, Heritage Foundation-affiliated groups, and so did

KOSA, to be clear.

These both originated in these far-right anti-LGBTQ circles where they

thought, okay, how can we get LGBTQ content

off the internet, designated as profane, and they also

designate any sort of sex ed as profane.

And then we force it through Congress

by saying, oh, we'll protect the kids from porn.

What they mean by porn is sex ed, abortion information,

information about feminism, like LGBTQ content.

And by the way, I'm not making this up--

Mike Masnick at Techdirt has done a great job

of documenting the Heritage Foundation coming out and saying this.

Like, they're saying it publicly.

It's not even a secret.

They're like, yes, we can't wait to get this content

off the internet.

And so they want to do that through identity verification.

As you mentioned, the SCREEN Act is more overt than KOSA,

but both of them have the exact same effect.

The SCREEN Act just covers anything

that has adult content, which is literally anything.

Because again, adult content is completely subjective.

And if you look at, for instance,

there was this Trump terror memo called NSPM-7,

where they designate who's hostile to America,

who's considered a terrorist.

And it's just like anybody that is

against quote unquote American values,

anybody that's against quote unquote Christian values.

These are all highly subjective things.

So it's similar with the SCREEN Act,

where adult content is just any content

that the government doesn't like.

So we saw this happen with the Online Safety Act.

We have here the Kids Online Safety Act.

In the UK, back in 2023, they passed the Online Safety Act,

which just went into effect a couple months ago.

And that's why you saw, for instance,

the subreddit for war crimes was removed

under these anti-porn laws or whatever,

because any sort of police violence video,

people speaking out against sexual assault, rape victims,

Alcoholics Anonymous forums have been shut down.

Actually, all these places for kids to report

incidents of molestation and things like that

have been shut down.

So that is the type of content.

When you see adult content, content that's

quote unquote not safe for kids,

that's the type of content that they're targeting directly.

And yes, it would remove anonymity

from the internet basically completely.

- So this would require again, like all platforms

because you know, the common argument is like,

When I go buy alcohol, I have to show ID, right?

- No, you don't.

Identity verification is nothing like showing your ID

in the real world, the IRL world.

First of all, you don't show your identification

to every single person that you interact with,

to every single store,

before you can even walk in. You have to show ID,

maybe, to a bar, but certainly not to buy things,

or to engage, to access information.

And then your ID, your government ID,

is not tied to every single piece of content that you read.

It's not stored forever when you go to the library

or when you go to a bookstore to buy a book.

And then your government ID is stored

and they can monitor every single word that you've read.

And then that can be used in a court case against you.

There's actually nothing like that.

Not to mention, it's a quick ID check.

A human being looks at it quickly,

doesn't memorize the info, checks it, and then moves along.

Here, it's stored in a database forever,

which we know has absolutely no privacy.

These databases leak constantly.

This is a massive data privacy concern.

And that data is then used forever,

can be used to target you, exploit you.

And often they're harvesting not just your physical offline address,

they are harvesting biometric data

that is tied to it as well.

- One thing I noticed with a lot of these laws

is that they're vague specifically

when they talk about identity verification.

And they leave it up to the platforms

to decide how to implement this.

Do you think that's a better way?

Actually, maybe this isn't even a good question

in light of this conversation.

I think I know what you're getting at.

A lot of times lawmakers will leave

sort of specifics around identity verification in vague terms

to argue that they're not technically

mandating identity verification.

That's such a lie.

They know exactly what they're doing,

and they know what they're doing because the identity

verification lobby has basically written

half of these bills.

This massive industry-- they're about to get billions of dollars

if these laws pass.

But it's also just like a farce.

There is no way to verify anybody's offline identity

in any sort of platform, whether you're

a third party platform or a major platform,

without violating their privacy.

It just doesn't exist.

You have to either harvest their biometric data,

but even if you harvest the biometric data,

that biometric data and then your behavior patterns

can easily be tied to your offline identity,

or you have to verify things with your offline identity

and often confirm things in other ways. There is

no privacy-forward way to verify your identity

because they have to harvest a lot of information

to verify your identity.

And any time you're trying to cordon off parts of the internet

for children or make things safe for children

and try to age-gate any part of the internet,

a lot of people hear, oh, age verification.

OK, so kids will have to verify their ages.

No, in order to know who's a child,

everybody has to give up their information.

And in case you weren't thinking, oh, maybe you're still

giving them the benefit of the doubt, let me tell you.

I was at the Biden White House last August at this big day

that they had for content creators

to push their agenda, whatever.

Neera Tandon, a major Biden White House advisor,

got up on stage and said, we are going

to remove anonymity from the internet.

Don't you wish you could unmask every troll

is how they were selling it.

Because they also try to sell these things

as some sort of answer to online bullying,

even though we know actually removing anonymity

doesn't help bullying at all.

Like literally go to the Facebook comment section

of any post.

You can see people are happy to bully under their government

names.

But they're explicitly saying it.

So this has been a goal of the Democrats for a while,

and the Republicans as well, is to completely remove

anonymity from the internet, in part to prosecute people for their speech.

Like you're saying already this happened

with people getting fired or in legal trouble

for comments about Charlie Kirk or comments

that the government doesn't like about Israel

or foreign policy, things like that.

I think this SCREEN Act is so insidious.

It's just as bad, if not worse, than KOSA.

But KOSA has been this thing for so many years,

and there's been a little bit more activism around it.

People are just more aware of it, and they're not as aware of the SCREEN Act.

The SCREEN Act is just as bad, to be clear.

All 18 of these laws in this child online safety package are very bad,

very bad. There's the App Store Accountability Act, which is also really bad.

That puts age verification at the app store level,

which actually is worse in a lot of ways. Like, trust me, there is not a single good one.

All of these laws are very bad, but they have so many different names.

It's hard to keep up.

I pointed out recently actually on the topic of the app store one,

I said that I think it's really telling that even Apple and Google, or specifically a lot

of these companies are like, yeah, we're all in favor of this age stuff, but we don't want

it.

We don't want the IDs.

We don't want the data.

Make Apple and Google do it.

And they're like, no, we don't want it either.

And I think that's really telling that all these companies are like, yeah, it's a great

idea, but not in my backyard.

Well, what's really scary, they say that, but then they pre-comply.

Meta and YouTube are already harvesting a huge amount of data.

They announced this publicly over the summer where they said, we're going to start age

verification things, we're going to start harvesting biometric data, we're going to start monitoring

more about how people use Meta and YouTube products.

So if you watch Skibidi Toilet videos, uh-oh, now you're classified as a teen and

you have to verify your identity.

I spoke to an undocumented woman who, this happened where she was using her main computer,

her child was watching YouTube.

It flagged her as a child.

Obviously, you can understand why someone who's undocumented is extremely concerned

about that.

That's already happening now.

And I think it's really scary because this

is what we hauled Mark Zuckerberg in front of Congress

for in 2017 and 2018.

Remember, he was like, sir, I sell ads.

They're like, Cambridge Analytica,

you're collecting all this data.

Now, they're mandating that they collect even more data,

and they're giving them complete cover

to start collecting huge amounts more data.

This is like six years or eight years later.

I'm like, what year is it even?

And they went from, hey, you're not

doing enough to protect users' data to, hey, yeah,

go ahead and harvest some, like monitor everything,

'cause we're gonna pass these laws anyway,

like as long as you give that info to the government

when we ask, collect all you want.

And the TikTok ban is part of this as well, of course,

because of course our data is actually less safe now

under this new ownership structure

than it was previously.

- That will take us to the Kids Online Safety Act,

which you just mentioned a minute ago.

Like you said, our listeners are probably

a little bit more familiar with,

'cause this one has been in the public eye a little bit.

It was originally introduced in 2022,

and the original idea, again, on paper, is that platforms like Facebook and YouTube should be

responsible for mitigating "potential harms." You mentioned that a minute ago too, to children

who are using the service. I don't even want to say that language because that's not what it does.

That's not what it says and that's not what it does. What it says is that platforms need to

censor more content and they need to censor more content in line with what the government perceives

as harmful. I just want people to understand that, because I think some people read it and they're

like, well, the goal is to protect kids. Or you see these headlines, right?

Congress passes law to protect children.

That's not what these laws do.

These laws harm children.

And we have research, actual scientific-based evidence

of researchers that have studied these things,

and we know that they actually harm children,

especially LGBTQ and marginalized youth.

But that's the guise under which it was passed.

- That is one of the questions I had written down.

You mentioned how these acts will do a lot of harm

to minorities, LGBTQ people, and these kinds of communities.

And so I think it's really easy for conservative folks to kind of,

they say like, oh, that's fear mongering or they may even agree with it

because they don't agree with those viewpoints.

But you've, you've pointed out, like this is a bipartisan thing.

What would you say are some of the risks that would get people on the more

conservative side of the aisle to realize like, no, this is bad for you too.

This is bad for everyone.

Part of Project 2025 was about, like, censoring trans people off the internet.

Again, the Heritage Foundation has come out and said, here is how we plan to use

KOSA to remove abortion content online.

Here's how we plan to use KOSA to censor, you know, LGBTQ and trans content off the internet,

all LGBTQ and women's rights content.

They want to block that.

So they're open about it.

They want that.

These Republicans, I've tried to explain to them like, well, what if the Democrats were

in power? You guys had all this drama with Joe Biden, saying that he was jawboning over

COVID stuff and vaccines.

Like, I personally believe in vaccines, but I try to make this case to

them of like, well, would you want the government censoring that?

I would argue that we don't want the government determining speech.

We want to hear from actual experts online.

We don't want government propaganda.

This is, again, what we would always accuse China and Russia

and authoritarian states of doing.

So now we're trying to replicate that exact system.

That's what you guys were supposedly against.

They're not actually against it.

Once they seize power, their whole thing is like, oh, well,

OK, we're going to pass it under us.

So they're going to bake it into law, where it's written in a way

that will be used to censor all the content

that they don't like, and then the left will never

get power again, which they're probably

right actually about that.

If they can effectively control the internet

and skew it so effectively to the right wing, which they have

already done in some ways, the left won't ever get power again.

So that's when you talk to these staffers,

they're like, well, there's not going

to be another democratic presidency.

We won't have to worry about it, which is bleak.

The Democrats are just happily going along with it

because they also want to censor people.

And they're like, yeah, we'll align with the Heritage

Foundation, because we hate when people criticize us online.

We hate when people say things that are anti-Israel.

I'm Chuck Schumer, and I hate that somebody says

I shouldn't give $500 billion to whatever Israeli defense

fund thing or whatever.

It's a lot of foreign policy criticism,

also a lot of criticism that the Democrats are not

fighting Trump hard enough.

They just want to shut-- they just also

want control over speech.

So actually, KOSA has been very much led by the Democrats.

And the Democrats, one other thing I'll say,

that's a little history for people to understand.

Obama was very pro-tech.

So Obama was fully in bed with the tech industry.

One of the last events that he had before leaving office

was called South by South Lawn, where

he brought Uber, Facebook, all just the worst,

like, biggest tech companies ever to come out

and have this celebration of what they've done

and essentially get everyone in the White House jobs

at these major companies. Amazon,

Microsoft, all these big companies were there.

I reported on it.

When Trump won, the tech lash started.

And that's when liberals started to realize, oh, wait,

maybe these platforms are not just all rainbows and sunshine.

They're actually being used for fascism and bad things.

So that's when, again, we're going

to haul Mark Zuckerberg in front of Congress

and really crack down on him, whatever.

It's also when you saw the rise of the Black Lives Matter

movement, Me Too movement, more and more social justice

movements that were challenging Democratic politicians

for not being progressive enough.

And they hate that.

They don't want that.

They don't want any, they don't want anyone speaking out.

They want to do their corporate bullshit, whatever in peace.

And they don't want any backlash.

And they want to seem like they're tough on big tech because they feel like

they weren't tough enough on big tech originally.

And these laws are actually a massive reward to big tech.

And if you look at who backs this stuff, like, big tech funds it. I

mean, Meta was one of the biggest lobbyists in DC last year and the year before.

They want to have this air of cracking down on big tech.

So that's why a lot of them sign onto it.

You kind of covered a lot of the, so my question here was, I saw your recent

interview with Ari Cohn, which was great, by the way. He mentioned that the

latest incarnation has no carve-outs for smaller platforms.

So again, my goal is, I'm kind of trying to bring this home to viewers.

You know, again, at Privacy Guides, we have a forum, a Mastodon instance, a

PeerTube instance, as a US organization.

Exactly.

We would be subject to KOSA compliance.

So I'll just say something too.

Like I think people, again, because there's this framing of cracking down on

big tech, people think like, oh, we're going to get that.

We're going to really stick it to Meta.

We're going to whatever.

And they don't actually realize how many smaller internet

services they do rely on.

Because when you think of social media,

you think of Google and YouTube and stuff.

And I understand not everyone's on Mastodon and PeerTube

and things like that.

Those would go away under these new laws.

But they might turn to a subreddit for information

when they're looking into something.

They might end up on a forum.

They might just be on a website that has a contributor model that doesn't necessarily

moderate so heavily, or they might want to participate in a campaign.

You know, hey, they're going to, I don't know, build a giant cell phone

tower in my backyard.

Let's all come together and do this social media campaign to prevent that.

You won't have that ability anymore, because all of your stuff will have to be approved

by some intermediary that is willing to take on the liability of your activism and your speech.

So there won't be any internet activism online. There won't be a way to engage in that because

nobody's going to like, I mean, any organization that would take on any sort of mass liability for

that stuff would just be sued out of existence. So it's just, it really, there's such a mass

chilling effect. And I think a lot of people don't realize actually how much, I mean, even platforms

like Nextdoor, right?

Like you might not think of these when you're thinking of mainstream social media, but

there's value in what people get out of them, even these marketplaces.

Like there's just a lot of user generated content online that people don't

think of as user generated content and that they engage in.

And messaging apps might be subject to this as well.

So it's just like, what about your WhatsApp group?

You know, like what is the threshold for that?

Is that counting as social media?

Probably. Like, you won't be able to mass-distribute information really in the same way

anymore.

One thing I want to add on to that: I think the key there was "mass-distribute

information," because, and it frustrates me, but I see this a lot in the privacy community where,

you know, for example, Google says they're going to stop allowing sideloading on their phones,

and there's so many people that are like, oh well, I'm super tech savvy, I know how to get

around that. And it's like, that's really great. Most people don't. And so, you know,

there are always the people that are like, oh well, I know how to roll my own messenger

app and still be able to use it. And it's like, cool, you and like six other people.

And that's just not enough for the kind of mass communication that that you're talking

about there.

Yeah. Also, just like, you have to think of who do we want to protect. We want to

protect the most marginalized users. You don't want to write regulation that censors

or harms the people that rely on these platforms the most: activists, journalists,

academics, people speaking out and challenging power.

Like again, this is what America always criticized other countries for. We criticized Russia and

China, saying you guys don't have a free and open internet.

Oh, Iran doesn't have a free and open internet.

You know, the government approves everything, and LGBT people, such-and-such

groups, marginalized people can't speak out.

Okay, we're about to do that here in America.

Isn't that literally what they spent like 20 years fearmongering

about with these other countries? Which, by the way, I don't support those

other countries.

I think those are authoritarian versions of the internet that I don't want.

Personally, I think to have the people that we currently have in power enacting

that authoritarian version of the internet is extra scary.

Cause I think there's a lot of people in power right now that don't value

civil liberties and don't value free expression.

I do agree.

And I really appreciate you pointing out that this is a bipartisan thing

because the privacy community is very diverse.

Like we have people on the left, we have people on the right.

And that's why I asked that question about like, you know,

realizing that this opens the door that even if there is a power shift in the future,

if the Democrats come into power in the future, now they're wielding this.

And, you know, this really is bipartisan. Like, everyone is impacted by this,

regardless of whether you buy into the narrative that's being sold to us or not.

Which the far right doesn't think will happen.

And I think they're just delusional.

Like I think some of them, and I talk to these people for work.

And I just-- the Democrats are also delusional.

I talked to Democrats last summer

that were saying a lot of this, like, well,

there's no way Trump would win again.

And I would just say to both of them,

you guys are both delusional.

You know, we do have this system.

At some point, someone you don't like will be in power.

And you should write laws so that when--

no matter who's in power, your rights are protected.

Because when you write these laws--

like, and I think the left and the right both

have become very anti-free speech.

Like, leftists just want to censor people on the right.

People on the right just want to censor people on the left.

And you see very few organizations--

this was criticism of ACLU and some other orgs--

you see very few organizations, except FIRE,

and I think some others, that have

done a good job of really being bipartisan and being like,

we're standing up for free speech, even speech

that we don't like, even speech that we find morally

reprehensible.

Like, we will defend speech.

Again, not if it's criminal, not if it's directly--

but you know what I mean.

Just protecting people's right to expression is very important.

And their right to privacy is very important.

I think that's the other thing: we deserve anonymity on the internet.

I think it's very dystopian to have everything that we say online and read and consume and

watch online tracked by the government.

Imagine the worst person you can imagine, the worst person you hate on

the other political team, coming into power.

You know, is that what you want?

No, you don't want them to have control over your information ecosystem.

We've covered all the questions that I had written down, but I always like to kind of

open the floor.

Is there anything that didn't come up that you're like, no, I really want to make sure we talk about

this before I go?

Well, I just would tell people to get involved.

Like I see a lot of nihilism online these days where people think, oh, it doesn't matter.

Oh, what can I do?

Truly, I promise you, I've been talking to people in these offices, these congressional offices,

and it does make a difference.

Fight with them.

I've had people call and say that, you know, the staffer dismissed them

and said, oh, they're not doing identity verification.

Yes, they are.

Like, don't let them gaslight you.

Call them, make your voice heard.

There's a really great website called badinternetbills.com, which Fight for the Future,

a digital rights organization, put together.

It's so great.

It makes it so easy.

You can sign their form.

You can send a letter to Congress.

They tell you exactly what to say.

They have overviews of all of these bad laws.

I really encourage people to go to badinternetbills.com and just make their voices heard.

As you mentioned, we actually were able to stave off KOSA before.

We were able to stave it off

and make it clear that we, the public, do not want this gross invasion of privacy.

Yeah, we've been pushing that website a lot on our weekly live streams for sure.

It's great.

And I've been seeing it pop up on a lot of what I was doing research for this.

I'm seeing it pop up in a lot of other places too, which makes me really happy.

It's so great.

I'm so glad that they put it together because it's just very easy.

Yeah, Fight for the Future rocks.

They're doing great work.

Taylor, thank you so much for your time today.

You are very active in continuing to discuss these kinds of issues and talk to

experts about it.

So where can viewers and listeners continue to follow your work on this stuff?

Yeah, I'm on YouTube, just at Taylor Lorenz. I have a series called Free Speech Friday,

where every Friday I talk about these issues, and I'm starting live streaming soon too,

because there's just so many issues. One video a week is barely enough. But yeah,

you can find me on YouTube or my newsletter, which is just usermag.co.

I think that's all we got. Thank you so much again for your time. We really appreciate it.

We want to thank Taylor again for making time in her busy schedule to come and talk to us.

This interview came together on very short notice due to the pace of current events,

so we really appreciate her being flexible and lending her expertise.

Privacy Guides is an impartial, non-profit organization that is focused on building a

strong privacy advocacy community and delivering the best digital privacy and consumer technology

rights advice on the internet. If you want to support our mission, then you can make a direct

donation on our website, privacyguides.org. To make a donation, click the red heart icon

located in the top right corner of the page. You can contribute using standard fiat currency

via debit or credit card, or opt to donate anonymously

using Monero or with your favorite cryptocurrency.

Becoming a paid member unlocks exclusive perks

like early access to video content and priority

during the This Week in Privacy livestream Q&A.

You'll also get a cool badge on your profile

in the Privacy Guides forum

and the warm fuzzy feeling of supporting independent media.

Thank you for watching and we will see you in the next video.

(gentle music)
