
‘So to Speak’ podcast transcript: Elon Musk, PayPal, and is New York trying to destroy Twitch?

Note: This is an unedited rush transcript. Please check any quotations against the audio recording.

Nico Perrino: Welcome back to So to Speak: The Free Speech Podcast, where every other week we take an uncensored look at the world of free expression through personal stories and candid conversations. I am, as always, your host, Nico Perrino. And we’re doing something a little bit different today; we’re back in FIRE’s Philadelphia offices. It’s the first time we’ve been in this studio in, I think, three years, since pre-pandemic. It’s a tight studio.

And so, it was hard to set up the tables and chairs and all that. But we used to do that. So, when Aaron, who’s behind the camera over here, was talking to me, he’s like, “How did we used to do this? How did we used to get six people, or five people into this office, into this studio to record a podcast?” So, we had to kinda go back into the cobwebs of our memory to figure that out. But Will was here, I believe, the last time we had it –

Will Creeley: Yeah, I was. It was an honor then and it’s an honor now. It’s cozy in here, I like it.

Nico: It is cozy in here. Will Creeley, of course, is FIRE’s legal director. And we also have a new guest on the podcast, Aaron Terr; he is a senior program officer here at FIRE. And it was in June, I believe, that I came to our podcast listeners and explained FIRE’s expansion. I was with Greg Lukianoff, our president and CEO, and with Alisha Glennon, our chief operating officer. And we were talking about what we were doing. And one of the things that was kind of always part of the plan, but has become a really fun part of the plan, are these rapid response meetings that we do every morning at FIRE, 9:30 a.m. FIRE senior leadership is there; Aaron is there with us as well, working on the team.

And we discuss, essentially, the free speech news that has come in over the last 24 hours. Having worked on college campuses for 23 years, we kind of know the issues, right? And we know our take on the issues. Of course, new things come up. But for the most part, we know where we’re going. Off campus, we don’t always have the work done on a lot of the issues that we’re gonna be confronted, or asked to comment on, or potentially litigate. So, we got these 9:30 meetings in place with senior leadership to kind of work through them, to debate, to discuss them.

And for this podcast, I wanted to re-create that feel now. A lot of the stuff that we’re gonna discuss today, we’ve discussed before. And we sort of have FIRE’s take on it. But there’s been a lot of debate, some disagreement within the organization about it. And so, I’m hoping to kinda re-create that and look at these issues that we’ve discussed in the rapid response meeting from all different angles.

Things like Musk acquiring Twitter, PayPal threatening to fine people $2,500 for spreading misinformation, the New York attorney general seeking to ban live streaming in the state of New York. And Charlottesville – Will was in the Daily Progress in Charlottesville, Virginia, talking about a new personnel policy they have there for city employees.

And then, potentially, if we have time, getting into some of the new laws – well, one of the new laws that was passed in California, protecting artistic expression in criminal trials – and potentially some of the jawboning that the government has been doing with social media companies surrounding election misinformation. So, Will, welcome back. Aaron, welcome for the first time.

Aaron Terr: Yes, thanks. Good to be here.

Will: It’s been a party, Nico. It’s been a wild three months. I’m glad you used the word fun in describing those morning meetings because my experience – and Aaron, I imagine you feel the same way – is that it is fun. 9:30 a.m., everybody’s got that cup of coffee, and you’re looking at headlines. You maybe saw something last night before you went to bed, you maybe saw something that floated across the feed on the way in.

And the experience of just getting together and hashing it out with folks who you know and like, right, your colleagues and your trusted voices. And even then, exploring the nooks and crannies, it’s been fun. The way I’ve described it to folks is like – what’s the expansion like? I say, “Well, some days it’s like being a kid in a candy store, right?” Because for 16 years working on campus – campus free expression issues are very interesting. And it’s been very exciting and satisfying to do that work. I feel honored and privileged to have done it for so long –

Nico: And it’s been kinda like the leading free speech story in the United States for the past decade, right?

Will: Exactly. You got it. And I’ve always said that whatever you see, you’re gonna see it on campus first, and then you’re gonna see it elsewhere. But at the same time, for 15 years, I’ve been kinda looking over the fence, just off campus, thinking, “Boy, that’s kinda fun. I wish I could get in on that.”

So, on the good days, it feels like being a kid in the candy store. We’ve got the wide range of free expression issues to talk about and engage with. And on the tough days or the overwhelming days – there aren’t too many tough days, but overwhelming maybe is the way to put it – you feel like the dog that caught the mail truck, like, “All right, here we go. You wanted off campus, you got off campus. Let’s talk five issues that are all front page.”

Nico: Well, talking about the issues isn’t the challenge with the experience, that’s the fun part, that’s why I came to FIRE, right? The challenge is the capacity and organization building that comes with it, that happens off camera; people don’t see that. But it’s very necessary for us to take the discussions from those rapid response meetings and actually implement them as part of our mission to advance free expression in America. And that’s required a lot of late nights, a lot of time spent diligently hiring the right people, getting them on the bus.

But the rapid response meetings are, of course, a bright spot at 9:30 in the morning. They’re supposed to last 15 minutes, they often go for 30 minutes. But what are you gonna do? This morning, what did we talk about? We talked about Dick Durbin, senator from Illinois. I’m from Illinois, outside of Chicago myself. He had a tweet, because everyone’s talking about Musk and Twitter, right, that said, “Free speech does not include spreading misinformation to downplay political violence.”

And it was kinda funny because Twitter itself fact-checked it. You know they have these context-adding boxes that they put beneath certain tweets, that said, “No, this sorta speech is protected by the First Amendment,” and then linked to Wikipedia’s First Amendment page. I thought that was kinda funny.

Will: Yeah, it’s a nice way to do it. With all respect to the senator, it’s just not the case, right? And as I understand it, from our colleague, Adam Steinbaugh, in the 9:30 this morning, there was kind of a bipartisan, across the political divide response to Senator Durbin, saying, “Actually, senator, here’s the deal, right, it is protected.” And as we commented in that meeting, boy, the word downplay in that tweet could do an awful lot of damage.

Nico: Yeah, it sure can, right?

Will: Downplay is elastic as a rubber band. You could really stretch that one.

Nico: Well, a lot of our discussions these past couple weeks, months, has obviously been about Musk and Twitter, right? That’s driving a lot of the free speech conversation. So, Aaron, I’m curious what your thoughts are on the whole situation. Not just the thoughts around free speech per se, but also just kinda how it’s all played out, right? It’s been a whirlwind.

Aaron: Yeah, yeah. I’m trying to be cautiously optimistic about the new regime over at Twitter, with Musk at the helm. I think he said a lot of things that give reason to be optimistic about the direction that the platform is gonna go in, the direction of a more free and open speech environment. And I think that would be something to celebrate. On the other hand, he’s also made some comments that maybe give a little cause for concern.

He’s mentioned a couple times about having Twitter follow the laws of a country in which the service operates, which makes you think, “Okay, well, what about activists and dissidents in countries under authoritarian regimes? Does that mean that their posts are gonna be removed, or perhaps even worse, that Twitter would comply with the government’s request and turn over information about dissidents?” So, that would not be very good for free speech, obviously.

Nico: Yeah. But one of the questions about that is like, so, what do you do if you wanna operate a platform for free speech in some of these foreign countries where they have laws that are restrictive of free speech? I just saw something going around the news. The CEO of Rumble, which is this kind of upstart conservative competitor to YouTube, a video streaming service. France has this law, or they got reached out to by French authorities asking them to take down certain posts or to not allow uploads of posts of a certain category or type, I don’t remember exactly what it was.

And they essentially said, “Well, if that’s what you’re gonna ask us to do, we as a platform that values freedom of expression, we’re just not gonna operate in France.” Now, that’s not something that a lot of these social media companies that say they are for free expression have done – recall Twitter said it was the free speech wing of the free speech party early in its heyday, when it was first started. That’s not a call that they’ve made. They’ve kind of bowed, in a certain sense – not Twitter always, but certain companies for sure. They don’t do what Rumble does.

Will: Two quick things on this point, because this is fascinating – Aaron, you teed it up nicely. Over the summer, I had the pleasure of reading Jillian C. York’s most recent book. She’s at the Electronic Frontier Foundation. And she wrote a really fascinating, in-depth, almost semi-biographical piece about her experience navigating activism, digital activism, and the kind of birth and rise of social media over the years, overseas. So, she was talking to contacts in the heat of the Arab Spring, that promising moment where it looked like social media really would fulfill –

Nico: 2011.

Will: Yeah, fulfill this promise of expanding democratic access and providing printing press for everybody, et cetera. And she talks about the way that things start to change, right. That her contacts at Twitter stopped answering her calls, the ways that Twitter kind of walked away, in addition to Facebook, and others walked away from that promise of being the free speech wing, the free speech party.

And it’s really fascinating, it’s very deeply reported, it’s based on personal relationships. Highly recommend it if you’re out there interested in this. The second point I wanna raise, with regard to Musk, particularly, and the idea of law of the land and what this means for platforms beyond domestic borders. Check out Matt Yglesias’ post from last week about Musk and his relation–

Nico: On his Substack?

Will: Yeah, on his Substack. Slow Boring is the name of the Substack. On his relationship – Musk’s relationship – to China and Tesla, and what that might mean for Twitter and this kind of, I think, uncomplicated notion of Musk as this free speech savior. It’s an interesting post. And Yglesias takes Apple to task; he takes American corporations to task generally for talking a good game, but then when it comes to access to that Chinese market, kind of –

Nico: They’re not doing what Rumble did, right? Just leaving France, right?

Will: Right.

Aaron: I had no idea that Apple TV just has a policy against any of its shows having any kind of content that’s anti-China or –

Nico: Yeah, the CCP, yeah, right.

Aaron: Yeah.

Nico: Well, Musk’s in it to run a business, right, and to make a profit. I think they’re gonna say he overpaid for Twitter, $44 billion –

Aaron: He says that’s not his motivation, but –

Nico: Yeah, but Tesla’s his baby, right? And I think he’s kind of over-leveraged on Twitter, and that could cost Tesla if he doesn’t start making a profit on Twitter. And one of the ways to make a profit – at least the argument on Twitter happening right now – is to censor in certain cases, or to make it a place where a lot of people wanna be. So, he’s got the strategy.

And this is something I wanna get your guys’ perspective on. Because I do remember 2011 and seeing the Arab Spring and all the takes that social media is democratizing the world. That went away. And I wonder if it’s because people have gotten more toxic on social media, and that could be part of it. Or, there really is a bot problem, a troll problem, where authoritarian governments or bad actors have figured out a way to manipulate the conversation by creating inauthentic accounts that amplify the trolls as a way to sow discord.

I’m just curious what you guys think about it. Because when Musk and his Twitter team are talking on Twitter right now about solving free speech, part of it is eliminating the bots.

Will: Right. Aaron, you wanna take the first swing at this? And I’ll jump in, either way.

Aaron: Yeah. I think when it comes to bots, I do have concerns about that. I do have –

Nico: On the anonymous speech front? Just trying to authenticate accounts?

Aaron: Well, yeah. So, the argument on one side, right, is that if we authenticate accounts, then we ensure that we don’t have a lot of these inauthentic accounts, where you have one person or one government that’s creating a bunch of accounts that don’t actually each represent a real individual person. And there’s an argument there that to restrict that isn’t anti-free speech because it’s kind of an artificial scenario, where you have all these inauthentic accounts. But of course, authenticating accounts raises concerns about the right to anonymous speech.

If you make it easier for the platform and foreign governments to identify users, and then take action, retaliate against users for their speech, that’s a big problem. That’s why the First Amendment protects anonymous speech. It’s so that people can speak out against the government, against power, without having to worry about political or economic retaliation and harassment.

Nico: Yeah, especially if you’re operating in one of those foreign countries, the authoritarian foreign countries you were talking about before. I do wonder – so, Greg’s Lab in the Looking Glass theory is that it’s important to know the world as it actually is. I do wonder if the existence of bots manipulates the world, so it’s hard to know the world as it actually is; you’re not actually hearing from other citizens participating in a democratic process, you’re hearing from Joe Schmo who’s marshaled a bot army to make it sound like his minority viewpoint is a majority viewpoint.

Will: That’s right.

Aaron: Exactly.

Will: And I think that’s true even without bots on Twitter, generally, right? How fascinating has it been for the past five years, when you see those estimates of the number of registered accounts on Twitter and the number of people who actually speak, right? Most folks on Twitter are lurkers. And Nico, I just sent you yesterday – and Aaron, I’ll forward it to you today – that recent Charlie Warzel post; he writes over at The Atlantic about tech and social media, et cetera. And he had this interesting post about what he called geriatric social media.

Social media is just fundamentally conversations between human beings. And sometimes, conversations turn boring, right? Sometimes you’ve just been talking to the same people for a long time. And he made this point about Twitter: most of the people he knows, who are what he called Twitter power users, are now power lurkers because there’s a script now, right? In 2011, it was new and fresh. In 2022, as Warzel says, you know what everybody is gonna say, you know there’s gonna be a main character for the day.

You can predict what these folks on the left are gonna say, what these folks on the right are gonna say, and it’s just all kinda this charade at this point. It’s very kind of, I don’t know, been done. It's predictable. And at some point, young people, and this is where you and I got into it, Nico, are just gonna say, “Why the hell would I wanna waste my time participating in this increasingly boring, increasingly old, increasingly insular community?” If you ever spend too much time on Twitter, and I, for the record, quit.

Nico: I don't know, Will, I hear from you about things you've seen on Twitter. So, you’re lurking –

Will: I will log on once in a while. People will send me things. But I took the advice of Caitlin Flanagan a year ago and quit, basically, mostly. So, whenever you try and talk about Twitter to people who aren’t on Twitter, you sound like a crazy person, right? “Here’s the latest thing. And then, so and so said this.” And if people aren’t on Twitter, they don’t know about it.

And I think the vast majority of people aren’t on Twitter. The vast majority of young people are signing up for TikTok and other more interesting things, right? So, maybe Twitter is increasingly unimportant. And if that’s the case – I mean, it still is important, right? Lots of powerful people, big media folks on there, academics, et cetera. But I’ll be just very curious to see where Musk’s bet goes, right?

Nico: Yeah.

Will: Because I don't know, if you’re trying to make something safe for advertisers, safe for users, and I mean safe, like quote unquote safe.

Nico: Yeah.

Will: We'll see. We'll see.

Aaron: You often hear people complain about Twitter as just this hellscape, right? It’s just terrible, it’s so toxic. And by the way, those people are saying now that that’s what’s gonna happen with Musk. But people were saying that constantly before Musk took over. So, I don’t know, maybe we’re just going to a lower circle of the hellscape now –

Will: Yeah, hellscape, hellsite, since at least 2012.

Aaron: Yeah, yeah. No, but also, when I hear that – I actually like Twitter, because you do have a lot of room to curate your experience on the platform, right? So, if it’s so toxic, maybe you should stop –

Nico: But now, they’re recommending a lot of stuff for me –

Aaron: Well, that's true.

Nico: I used to be able to follow accounts and my feed was those accounts. But now, most of what I see in my accounts are someone who I follow liked this, and now it's in my feed. And maybe there's a way on the back end to curate that. But it's not the experience I've curated for myself.

Will: As soon as you stop being able to see things in chronological order from the people you follow and have that be it, I thought, “Yeah, here we go. Look out.”

Nico: Yeah.

Will: Whereas, something like TikTok, right – I think this is a Warzel piece – and hello to everybody who is watching on video, right? Video is such an informationally dense medium that it communicates all kinds of subtext and interesting points, without even the written word, without even the verbal acknowledgment of what’s going on. That it is so much more powerful, particularly for younger folks who are spending more time on it, that Twitter might just – you know, all social media sites have a lifespan. You know what I mean? We’ll see.

Aaron: Well, isn’t it just true, in general, over history, that once anything becomes too popular among older people, the younger generation thinks it best to move on to something new?

Nico: Well, that’s Facebook.

Aaron: Exactly.

Nico: The only people I see posting on it are people over 60. It's like my grandma is on there all the time sending me messages.

Will: Yeah. I got an eight-year-old. Is my eight-year-old ever gonna have a Facebook account?

Nico: Probably not, I would–

Will: Is my eight-year-old ever gonna log on to Twitter? I don't know. And my five-year-old, let's say definitely not –

Nico: And I think the same thing is starting to happen with Instagram, particularly over the last couple of months. It’s no longer the platform it was. TikTok’s taking off; FIRE creates a lot of content for TikTok that’s only put on TikTok. So, I encourage our viewers to check it out.

Will: Yeah, speaking of that, I should say everybody should subscribe to FIRE’s social media accounts, which are not dying, which are alive and well –

Nico: And in which Will Creeley is a main character.

Will: Right. Yeah, cheers.

Nico: Of course, in our lawyer video series. But I'm curious to see where the Musk stuff all goes. I think he's gonna be pressured on the financial side, to not take the free speech maximalist position that he was articulating in the spring. Do I think it'll probably be better than what it was before, where they're shutting down stories about Hunter Biden's laptop, or tagging theories about COVID as misinformation, probably not.

And might some of the people who got banned from Twitter be brought back, like Jordan Peterson? I'd say probably. But I don't think Musk will take that maximalist position and he's already started to walk it back. We'll see. He's a mercurial guy, right, he waves with the wind. But he's also a very successful businessman. So, Twitter might end up becoming something entirely different from what it is now. He talked about making it the x.com everything app. We'll see where that goes.

Will: We shall indeed.

Nico: But we could spend this whole conversation talking about Elon Musk and Twitter. I wanna move now to PayPal, which was another big story over the past month. PayPal put out a new, what was it called? User agreement?

Aaron: Acceptable use policy.

Nico: Yeah, acceptable use policy.

Aaron: Proposed changes to their [inaudible – crosstalk].

Nico: Well, it wasn't proposed changes, right Aaron? Because they sent an email to all of their users, that said that they were making updates to the –

Aaron: Or, let's say intended changes.

Nico: Yeah.

Aaron: They were never actually implemented. But yeah, they sent notice of upcoming changes.

Nico: This was at the end of September; the changes would go into effect on November 3rd. And the new policy – and this is something that you wrote about, Aaron, for our blog, in a post titled “PayPal is No Pal to Free Speech” – would dramatically expand PayPal’s power to take action against users for activity on the service involving disfavored speech. That includes, “Any messages, content, or materials, that in PayPal’s sole discretion” are harmful or objectionable, depict or even appear to depict nudity, depict, promote, or incite hatred or discrimination of protected groups, present a risk to a user’s well-being, or promote misinformation.

Or, in PayPal’s opinion – and this is the exception that swallows the rule – it’s otherwise unfit for publication. And that’s kind of a cobbling together of all the quotes, but it’s directionally correct. Those are the categories of speech that would have been prohibited under this acceptable use policy. And anyone who violated it would be subject to a $2,500 fine, deducted from their account. Now, PayPal quickly walked this back, Aaron. So, can you talk a little bit about that, and how FIRE was thinking about this issue in our rapid response meetings? Because you’ve done a lot of the work on this front for us.

Aaron: Right, yeah. So, like you said, they walked it back. There was a lot of criticism –

Nico: The stock tanked.

Aaron: Yeah. And there was criticism from FIRE about this. So, I was the primary author on a report that we recently released – it’s on FIRE’s website – about free speech and online payment processors. This is an issue, with the expansion, that we’ve been focusing on more broadly: the role of online intermediaries, and how they can create a very restrictive speech environment because they exercise so much control over our lives, essentially – our ability to access the internet, and in the case of payment processors like PayPal and Venmo, our ability to send and receive money. And that’s –

Nico: Or, with Cloudflare, which protects against denial of service attacks, the ability to stay on the internet when you're subject to denial of service attacks.

Aaron: Right, right. Yeah, domain registries, all sorts of online infrastructure. And with payment processors, it's essential to so many things. Online content creators, right, they use services like PayPal to raise money and make a living for what they do. Nonprofit organizations like ֭, right? We use PayPal to raise money. And just everyday Americans, right, they use these services all the time to buy and sell things online.

So, if these companies get in the business of policing users’ speech and viewpoints, then that can be very corrosive to a culture of free expression, even if they have the legal ability to do it. And so, FIRE’s position here is that even if you have the legal ability to do it, you shouldn’t. And there are good reasons why they shouldn’t get into that game. And one thing I like to point out is that even if you’re kinda comfortable – and this goes for social media, I think, too – even if you’re comfortable with the current regime of censorship that’s taking place because maybe your views aren’t the ones that are being targeted –

I wouldn’t rely on your favorite tech company having a CEO who’s sympathetic to your views all the time. And case in point, Twitter, right? All the people that are now worried about Elon Musk taking over because of his perceived lurch to the right side of the political spectrum. Well, if Elon was gonna start instituting policies or practices that censor based on viewpoint, what kinda viewpoints do you think he’s gonna censor? So, yeah, I think it’s really concerning with PayPal and companies like that –

Nico: Well, it’s no surprise that their policies, which are vague, broad, and opaque, are being applied to their users in arbitrary ways. The Free Speech Union, which is a UK group started by Toby Young, had its PayPal account suspended or taken down. Toby Young had his personal account taken down, and then their news and opinion website, The Daily Sceptic, had its PayPal account taken down. And PayPal didn’t give them a reason why.

And when PayPal spokespeople were asked for comment, they sort of just gave a broad platitude to free expression, but also protecting diversity in various forms. But other things – you have PayPal’s Greatest Hits in your broader piece – but it includes things like shutting down writer Colin Wright’s account shortly after Etsy had banned his account. And there seems to be a snowball effect, a domino effect, where one online platform does it and then all the others follow. They also shut down, or gave a warning to, an eBook distributor to remove certain works of erotic fiction.

Will: What about the heavy metal band? Nico, get that one in there –

Nico: Yeah, PayPal suspended a user for buying a t-shirt from ISIS, the heavy metal band. So, they kinda have a ham-fisted algorithm that probably does some work for them too. But you can’t get answers from them. If you ask them – I believe it was Colin Wright or Toby Young who asked them why their account was suspended – they said, “You either have an attorney or a law enforcement officer submit a legal subpoena.”

Aaron: Yeah, I think that was Colin Wright.

Nico: Right?

Aaron: Yeah.

Nico: So, you can't even find out. And in the meantime, your money is locked up. And some people, when this acceptable use policy came up, tried to shut down their PayPal accounts. And we were seeing reports that they couldn't, they were not allowed to do it. So –

Aaron: Yeah. Yeah, and you raise a good point about there's due process concerns, too, where users don't receive meaningful notice about why the service took action against them. They don't get detailed reasons. It's usually just, “You violated our policy.” And that's it. And then, they may also not have a meaningful opportunity to appeal the decision, and present evidence from their side. If you don't know why you were banned, then it's kinda tough to appeal the decision. So, I think that only kind of exacerbates the situation.

Nico: Yeah. Well, Aaron – and I wanna ask you about this, Will – you mentioned kinda these are online intermediaries, and they have the legal right to do it, at least for now. But they shouldn’t. How do you think about these online entities, like social media platforms and internet intermediaries? How should we think about their duties to free expression more generally, Will?

Will: Well, that’s a great question. It’s one that Aaron and I have been hammering around internally with you, Nico, and FIRE president Greg Lukianoff, our director of legislative policy, Joe Cohn, and Ronnie London, our general counsel, who worked on kinda these issues for years over at Davis Wright Tremaine. And it’s been kind of the fun, freewheeling debate that we’ve had over the past few months: how to differentiate or distinguish between internet infrastructure, like Cloudflare – Cloudfire? Cloudflare?

Nico: Cloudflare. Yeah.

Will: Thank you very much. Old man over here. And Twitter and social media –

Nico: And you're not the only one that confuses that. I hear that confused all the time. And there might be actually another company called Cloudfire.

Will: Well, so, the Cloudflare CEO – again, shoutout to Jillian C. York, who co-authored a great Electronic Frontier Foundation statement on internet infrastructure and the importance of viewpoint neutrality and clear terms of service for internet infrastructure, like PayPal, like your Amazon Web Services, like the nuts and bolts of things that make the internet work. And I think you can put those in one camp, and then put the social media platforms and other things in another camp, right? One of the concerning things from that Fifth Circuit opinion in NetChoice v. Paxton for me was the idea –

Nico: This is the case that we talked about, we debated two weeks ago –

Will: Right. Exactly, with Ilya and –

Nico: Ilya and Bradley Smith.

Will: Bradley Smith, thank you so much –

Nico: Yeah, former FEC chair.

Will: Of course. No, but the idea that the social media platforms, as advanced by Judge Andrew Oldham, in the Fifth Circuit opinion are kind of common carriers, right? If you're a big, powerful social media company with X number of users, the state of Texas says you gotta let everybody come in, you're gonna deal with certain viewpoint neutrality rules enforced by the state, and we're gonna essentially treat you in the same way that we treat bus services or phone lines, right. You're available to everybody, you can't censor speech.

And I think that's really concerning on that front because I think that what Twitter does and may continue to do, as we were just talking about, is provide some curation, provide some editorial discretion, provide some work on the algorithm side of things, or sometimes even on the manual side of things, to make sure that users’ feeds reflect their interest and reflect a certain kind of site that Twitter wants to create, right. I think that's a big difference from PayPal.

No one’s using PayPal because they think PayPal aligns with their values, or PayPal has some expressive message, right? Or PayPal is somehow involved in communicating an idea. We use PayPal because it’s a bank and it’s an easy way to pay people. That’s the bottom line. And likewise, you use Amazon Web Services or Cloudflare because you want your site to load. And I think we get into really dicey territory if we, first of all, start conflating the two. Right, if we start thinking of Twitter and PayPal as the same thing. They’re not the same thing.

And if we start allowing the state to come in with heavy-handed, ham-fisted solutions to questions about viewpoint neutrality, or equal application of terms of service or user moderation policies between the two. So, again, check out this Jillian C. York co-authored piece for EFF. But her basic point was, you need transparency and consistency and viewpoint neutrality with regard to internet infrastructure. And I think that’s correct. And when it comes to websites that are social media platforms, I think you think of them more like – this is an imperfect analogy, but more like newspapers, where they’re doing some editorial function that is protected by the First Amendment.

They've got their own capacity as a private business to pick and choose which users comment. If you don't like it, you start Rumble, or you start GETTR, or Truth Social, or whatever. I think that's kinda more the First Amendment-consistent principle that arises out of the case law, arises out of our common understandings of free speech. That's where I'd like to go with –

Nico: Well, it's interesting. Is there any comparison, analog comparison, to some of the debates happening around the Masterpiece Cakeshop case and the 303 Creative case? Although, it seems like the political valence of those issues is flipped, when you bring it into analog, as opposed to digital.

Will: Right.

Nico: When you think that on the digital side, you have conservatives, not all of them, of course, but some of them who are arguing that social media companies operate as common carriers, right? And should be allowed to be regulated by the state. And the left is saying, “No, these have an editorial function.” They have a particular message they're trying to send, they need to have the freedom to associate and express themselves under the First Amendment –

Will: But if I'm a wedding website designer, can I put up a message saying that I don't serve same sex couples?

Nico: Conservatives would say yes.

Will: Conservatives say yes and folks on the left would say no. Yeah, no, I think that the through line is yes, you're allowed to put up that message. The market will sort it out, right? There are plenty of other graphic designers or website designers in the state of Colorado, and you go find the one that will serve you and would be happy to serve you. I mean, that's –

Nico: And then, the analog is, of course, there are places of public accommodation, which do not have an expressive component. Hotels, banks, right, that is comparable to the intermediaries, or the – what have we been calling them internally, the middleware.

Will: Yeah.

Nico: In the digital environment.

Will: We’ve had fun fights at FIRE about this, right?

Nico: Yeah.

Will: When we come to the public, we have a unified message. But man, making that sausage can be a lot of fun internally. Because we all come with the general understanding and principle that we're here to protect expressive rights, but the contours of that, and the nuts and bolts of how we think about the doctrine and its growth, and where we'd like things to go, and what our plan is as a freedom of expression advocacy group that believes in not just law but also culture. That's fascinating. We've had fun debates in here.

Nico: Yeah. Well, I wanna put a bow on this, Aaron. I want you to kinda bring this PayPal story full circle. So, we talked about how PayPal, after intense public backlash and a tanking stock price, reversed its updates to the acceptable use policy. But that's not the end of the story, because they eliminated the fine, the $2,500 fine for spreading misinformation. But some of the other stuff that was in that acceptable use policy still remains in what, a user agreement or something? They've got multiple policies that are confusing the shit outta people.

Aaron: Yeah, yeah.

Nico: So, can you tell us where that’s all landed?

Aaron: Oh, yeah. There's been a lot of what you might call misinformation going around, I think because the policies are kinda confusing. So, there are two different policies that people have been scouring for bad provisions. And one is the acceptable use policy, that’s the one that we've been talking about, that they were gonna make all these changes to, including adding the prohibition on promoting misinformation.

The other one is the PayPal User Agreement, which you have to click and agree to in order to use the service. And so, people found, after PayPal walked back the changes to the acceptable use policy, that, well, actually, the user agreement has a ban on – it says, “Users may not provide false, inaccurate, or misleading information, in connection with their use of PayPal services or in their interactions with PayPal, other PayPal customers, or third parties,” which is kind of weird and vague. So, people are saying, “Oh, look, they actually brought the misinformation thing back.”

But this was actually a provision that had already existed in the User Agreement, that I think people were just discovering. And I do think PayPal would do well to revise that provision. Because it's not clear when you look in the user agreement, where it's surrounded by prohibitions on fraudulent activity.

Nico: Like commercial fraud.

Will: Yeah. So, it might just be intended to say, “Well, when you're selling a product through PayPal, you can't state false information about it, which is commercial fraud.” But the way it's written, it could conceivably be interpreted to reach broader, so-called misinformation, all the political misinformation that's become a trend now and trying to fight back against that. So –

Nico: And you would be right to be wary of that, considering how it’s been exercised against people like Colin Wright and Toby Young and the Free Speech Union, and the purveyors of erotic fiction.

Aaron: Right, right. And the other thing to mention is just that the acceptable use policy also already had a couple provisions in there that are still there now: basically, a hate speech prohibition is in there. There's a prohibition against certain sexually oriented materials.

Nico: Is that clause about “otherwise unfit for publication” still in there?

Aaron: I think that was just one of the new ones. That's not in there –

Nico: Yeah, that was a crazy one –

Aaron: Yeah, basically saying anything.

Nico: So, we'll see where that story goes. One of the things that surprises me about the work we've been doing in rapid response, and we kind of anticipated that tech and the electronic frontier, to take a phrase from our friends at the Electronic Frontier Foundation, was gonna be kinda where the cutting-edge free speech discussions were happening. But I didn't anticipate it would be this much of our conversations every morning. Which brings me to our next topic, which is the New York Attorney General recommending restrictions on live streaming.

So, the shooting happened in Buffalo in May, the tragic shooting that killed 10 people. And coming out of that, the New York Attorney General and lawmakers in the state of New York kinda figured out ways to try and prevent that sort of thing from happening again in the future. And one of the things that happened during that shooting, if I’m understanding the facts correctly, is that the shooter live streamed the shooting for two minutes before Facebook caught it and took it down.

Aaron: Or Twitch, yeah.

Nico: Was it Twitch?

Aaron: Yeah.

Nico: Yeah. And took it down. And so, coming out of that, the New York Attorney General has recommended changes to the law that would limit live streaming abilities on certain platforms. The report recommends, for example, that live streaming be limited to people who have been verified, or who have a sufficient number of followers, or that they implement a tape delay like you see in some, but not all, broadcast television. And that there would be civil liability for individuals and platforms who do not abide by that law.

The report also includes recommendations that Congress, for example, reform Section 230 to require platforms to, “Take reasonable steps to prevent unlawful violent criminal content, and the solicitation and incitement thereof from appearing on the platform and in order to reap the benefits of Section 230.” This recommendation hasn't been adopted by the state legislature yet.

So, nothing's happening with it. And the New York AG has no control over Section 230, which is a federal law, of course. But it is concerning, right? It does raise some significant free speech questions. For example, that report conflates inspiring criminal acts with inciting them. The latter, of course, being unprotected speech if it meets the Brandenburg standard for incitement to imminent lawless action. What else concerns you about this –

Will: Well, there's a lot. So, first of all, I wanna say, as a proud native of Buffalo, New York, I was just in Buffalo last weekend. Yep, I got my Bills cup here and I have friends who have family who were among those killed. So, this feels personal. But one thing that anybody who works in civil liberties knows is that after a tragedy, and this surely was one in many awful respects, the urge to restrict civil liberties is always at its strongest: “I think we have to do something.”

So, this feels like a well-intentioned but misguided effort to do something. Isn't there something we can do? Because obviously, live streaming a mass shooting is a horrible act. But as we know, when civil liberties are threatened like this, they're at their weakest point. And so, think: what could the possible value be of streaming something like this? Well, possible value, if you think about it for a minute, what about police brutality? What about the ways that capturing images, or live streaming stops or public demonstrations in real time, have changed the accountability for police brutality or law enforcement over-enforcement, et cetera?

So, you think about what would be lost with a policy like this, and the hands in which it would fall. If we allow the government to control live streaming this way, then we lose the ability to act as a check on government abuse in serious ways. And also, just to say, the expressive medium itself is threatened. The right of folks to get out there and spread their message is threatened. So, you lose a lot of core, important political speech that we think we'd all agree is protected, by giving the government the keys in this particular way, to drive this vehicle in this particular way.

So, I understand it, and I feel the hurt from which it is sprung. But, again, that's when folks who care about civil liberties need to be on their highest guard, right after a tragedy. Somebody once said, “Be wary of any law that's named after somebody who has been tragically killed.” Because it might have been passed in a rush, it might have been passed at a moment where emotions are understandably extremely high. And there's often an opportunity for encroachments on civil liberties at that particular moment, right?

So, I think this is one of those cases. I think that the threat to free expression is real here. And I think if this was passed, it would be challenged, and it would lose that challenge. Those are my quick thoughts on it.

Nico: Well, Aaron, you're working on a piece for FIRE on this law and sort of the general issue. Wouldn't this effectively shut down streaming platforms like Twitch, that require, for their effective function, real-time engagement? You’re playing a video game, users need to be able to respond as you're playing the game, right?

Aaron: Right.

Nico: And so, it would just render the platforms unusable as a practical matter.

Aaron: Yeah, for a lot of streamers at least. Yeah, it's true that a lot of live streamers, on the platform Twitch, or on YouTube, where you use YouTube Live, they kinda rely on that engagement with their audience. That's part of how they build a community and how they get support and followers, is through that real-time engagement. So, you can't have that if you have a forced tape delay, right?

Nico: Yeah.

Aaron: Or a broadcast delay. And another concern –

Nico: And you were working on a piece on this, right? Or was it –

Aaron: Yeah, with –

Nico: Ryan.

Aaron: Ryan.

Nico: Yeah, Ryan Weiss.

Aaron: Here at FIRE. And the other thing is that the recommendations are to restrict – impose this broadcast delay just on unverified users, or users who don't have a threshold number of followers. So, you're essentially discriminating against unpopular streamers and handicapping their ability to –

Nico: Become a popular streamer.

Aaron: To become a popular streamer. Yeah.

Will: And it's also a mirror image, again, with different motives and a different impetus, to the same kind of state oversight and mandates that we are wary of in the Fifth Circuit case, that we just talked about a minute ago. With the state of Texas saying, “If you get to be this big social media platform, you have these restrictions now imposed on you by the state, about the kinda content you can publish and the kinds of rules that you must impose.” Same thing here, right? And it's equally troubling, I think.

Aaron: Yeah. I don't wanna get too much into whether these restrictions would actually be effective in achieving their goals. I personally don't think that they would be.

Nico: Well, Twitch took it down after two minutes. They saw it, they flagged it.

Will: I mean, listen, I was a high school senior in 1999, and that's Columbine. And there was no streaming then, and there have been shootings since then. The copycat nature of school shootings existed before streaming. And I don't think you take away streaming and all of a sudden we're good.

Aaron: Yeah, that's the sad reality of it.

Nico: Right, right.

Will: This feels like dealing with a symptom, rather than an underlying cause.

Nico: Yeah. And it also assumes that the people who would be liable to copycat happen to be watching that particular stream in real time. Because you just have to assume, once this happens, or after it's shut down, the video is gonna be taken down. It's not gonna be there. So, who is it, to use their words, inspiring?

Will: It’s an awful, ugly thing. And again, I understand the legislative impulse here to do something. I get it. But yeah, I think it'd be ineffective, in addition to being unconstitutional.

Aaron: Yeah, just one other thing that I think is important to point out about this inspiring-versus-incitement issue. There's a reason the exception to the First Amendment is incitement: imminent incitement, or speech that's directed to causing imminent lawless action and is likely to do so. So, these restrictions wouldn't satisfy that imminence requirement.

Nobody watches one of these videos and immediately grabs their gun and goes out and commits a copycat crime. Now, could it inspire people to eventually do something down the line? I mean, maybe. But there's good reason why we draw the line where we do. I think talking about inspiring violent acts is kind of reminiscent of the old bad tendency –

Will: I was gonna say, we get back to Whitney v. California and Justice Brandeis, right?

Aaron: Well, the test that the Supreme Court discarded almost as quickly as it adopted it, because it was restricting speech that would have a bad tendency to produce acts that the government has the power to outlaw. But how was that applied? It was applied against socialists distributing pamphlets calling for the overthrow of our capitalist society. It was applied against people protesting World War One –

Will: And Eugene Debs locked up for talking to steel workers, or for train workers, saying, “You are fit for something better than cannon fodder.” Yeah, no, you don't wanna give the government the –

Nico: Yeah, the sort of amorphous “inspiring.” I mean, that's what was used to go after Ozzy Osbourne for “Suicide Solution,” right? The more the arguments change, the more they're the same, just in different contexts.

Will: Right. And on that point, that's a great point because it gets back to Hudnut, and Professor Catharine MacKinnon, and Andrea Dworkin, and their idea that if you allow pornography, pornography will warp people's brains in a certain way, that violence against women and violence against others will be the natural result. It’s always that line. And that's a great point, Aaron, that it has to be imminent. That we say it has to be imminent and likely to produce the unlawful action that the government can regulate. And here, turning off the stream or having a tape delay on the stream, yeah, doesn't meet it.

Nico: All right, last topic for today. It is a mercifully not [inaudible – crosstalk] topic. This is an old school prohibition on public employee speech. So, on Monday, October 17th, the Charlottesville, Virginia City Council issued a new personnel policy regarding on- and off-duty conduct and speech for city employees.

This came in the wake of one city employee joining the January 6th riot at the Capitol. The employee was investigated, never charged with a crime. The city had a lot of public pressure to fire this employee, but ultimately did not. Presumably, because they thought that the policy that they had in place for on- and off-duty conduct didn't reach this employee’s activity. So, they implemented a new policy that presumably was done so that they could target that sort of conduct.

Aaron: Yeah, it was definitely done. Yeah.

Nico: So, they issued this new policy regarding conduct, and the term conduct as used in the section includes internet and social media communication. So, as is often the case with conduct, it reaches expression. And it says that employees should refrain from the following conduct on and off duty: “conduct that impairs discipline or harmony among coworkers, conduct that impairs the performance of the employee’s job duties, conduct that impairs city business operations, disclosure of confidential or sensitive government information, and conduct that undermines close working relationships that are essential to the effective performance of an employee's job duties.”

Now, Will, you had an op-ed in the Daily Progress, which is a local paper in Charlottesville, Virginia, saying, “One, this is a bad idea. And two, this is unconstitutional.” Will, so why is it both of those?

Will: Yeah, it's a bad idea, first of all, because, again, it's just kinda the common theme of our conversation here at this point. Maybe this is a common theme of free speech work in general, you're handing the state a pretty big hammer to wield against dissenting or critical voices. Let's say you work for the city of Charlottesville, and you have an opinion that your boss doesn't like.

Maybe you're a Black Lives Matter bumper sticker person and your boss is a Blue Lives Matter bumper sticker person. And you drive your car and your boss sees your bumper sticker, and maybe you put your bumper sticker on after your boss did. And your boss says, “Well, wait a second. I don't like your views.” At this point, maybe you have “impaired discipline or harmony among coworkers.” And if you have this free-floating policy out there, then you may be subject to discipline for it. And the idea that this policy is constitutional because some of these factors track case law.

Well, the problem is that the case law says that if you are engaged in conduct that has impaired discipline and harmony, post facto, right? That past-tense “impaired”: you've come to work and they can say, “No, no. You impaired discipline and harmony. Here are the problems, here's how it's impacted your job. Our interest in efficient operations outweighs your First Amendment interest here.” That's all done post facto. The problem with these kinds of policies, and Charlottesville is a great example of this, is that a policy like this acts as a prior restraint. Right? It takes effect and tells employees what they can say before they've even spoken. So, that –

Nico: Which is the sort of conduct or the sort of speech restriction that the First Amendment frowns on the most –

Will: The most.

Nico: – it can receive strict scrutiny, it’s –

Will: Government's got a very heavy burden to bear. Very heavy burden to bear on these. And in a case called National Treasury Employees Union v. United States, the Supreme Court considered these kinds of far-reaching bans on what public employees can say off the clock. In that case, it involved honoraria for government employees’ speech off the clock.

And the plaintiffs were folks who wrote about, for example, I think one of them was a lecture on Quaker issues and was sometimes paid modest sums for his public lectures. And somebody else, I think, wrote advice columns, like those kinds of things, right. They're getting small sums, but they would have been banned by that policy. And so, the Supreme Court said, “Well, this is a prior restraint.” And so, absent any kind of evidence that any of these things have happened, you can't just preemptively say, “You can't talk about this stuff.”

And there's a great case from the Fourth Circuit that I found, that's directly on point here, out of Virginia, where you've got two cops who are talking on social media, on Facebook, about what's happening at their department. And one of the cops is saying, “You know, these days, all these rookies, all these young cops are getting elevated into leadership positions, and it really is putting the public in harm's way because they are being counted on to save people's lives, but they just don't have the experience.”

And they got dinged on a policy that looked quite a bit like this one. And they went and they challenged the application of the policy and the constitutionality of the policy, which was kinda this broad-reaching social media policy. And the Fourth Circuit agreed with them. They said, “Look, the government cannot, in the name of efficient operations, shut down public employee speech off the clock before it happens.” And this is a great example. Why? Because that speech is about a matter of public concern.

They're talking about community safety and issues of departmental organization and operation that are really important. So, yeah, we'll see. The thing is, there are a lot of policies like this. This looks a lot like, frankly, the policy that we are currently litigating against in the Collin College case, the Michael Phillips case, where you have a public employer, in that case a community college, putting out a broad ban on all kinds of speech that the employer, the government employer, may not like.

And yeah, I just don't think courts are gonna be very friendly to it, nor should they be. Whether you're working for the community college, the police department, or the city of Charlottesville, when you're off the clock, you're off the clock. You should be able to engage and debate about matters of public concern without your employer breathing down your neck or preemptively telling you to shut up.

Nico: Yeah.

Aaron: Right. And –

Nico: Go ahead.

Aaron: Oh, no, I was just gonna say, including criticism of your very employer. So, if you work for a government entity, the people who work there are often gonna be the people in the best position to know what's wrong with it, what can be improved, or to act as whistleblowers for misconduct within the organization. But it would be easy for the government to turn around and say, “Well, by engaging in that kinda commentary, yeah, you're interfering with our ability to run an efficient operation here.”

Will: Right. The seminal modern government employee speech case, Pickering, is the high school teacher writing the letter to the editor about the school board and criticizing their spending choices. And he would know, because he's there. Yeah, it's an interesting case. And I will say too, I was at a Case Western Reserve Law Review symposium last week, and credit to all the folks who did the great work, it was a great symposium.

But there was a local practitioner, Emily Spivak, who's a lawyer there and was detailing some of the types of incidents that she's had with high school teachers and their social media commentary. And they maintain a policy just like this one. So, I think there are a lot of these out there, and we shall see what happens to them. I think more litigation is coming, as the local municipalities and government entities try to regulate preemptively their employees’ speech.

Nico: Yeah, that case, that Fourth Circuit case that you mentioned, is Liverman v. City of Petersburg, and it was a unanimous panel that said, “We do not deny that officers’ social media use might present some potential for division within the ranks, particularly given the broad audience on Facebook.” Facebook, which, as we discussed at the top, might be one of the geriatric social media platforms –

Will: This is like a 2016 opinion, too, right? Yeah, it's been a little while. It already seems kinda old. Yeah, right.

Nico: But they continue, “The speculative ills targeted by the social networking policy are not sufficient to justify such sweeping restrictions on officers’ freedom to debate matters of public concern.” But even more than that, I mean, just kinda trying to understand our hyperpolarized age, where people are increasingly retreating to their ideological or political camps.

You can easily see how a minority political opinion, say, Aaron, you hold a minority political opinion that I disagree with vehemently, could, and has, all across our society, within families and elsewhere, affect harmony, affect the relationships between people. And under this policy, that minority political opinion would justify that person getting fired, which I do not think is the right thing for a government to do to people participating in the give and take of democratic debate. So, we'll see where that one goes. As Will says, it’s probably not the only policy like that –

Will: No, we’ll see more.

Nico: – and will continue to be litigated on –

Will: I like that we ended on the analog example, that is an old meat and potatoes free speech issue, right. There’s nothing – Well, I guess there is kinda something digital about it if you're posting on Facebook, but it felt old school in a way. And that just goes to show the kind of three months we've had.

Aaron: Yeah, it's been a ride, man. More to come.

Nico: So, we put together these memos about all these different issues. Carrie Robison, who's our rapid response director, does that, and it kind of memorializes our position. And we use them as talking points to go out there and talk about these issues, so staff knows where we come out. But this is just four of them. I think we've got dozens and dozens of them. But you need to pick and choose when you only have an hour to do something.

And I’d ask our listeners, who can reach us at sotospeak@thefire.org, if you enjoy this sort of format, I think it's interesting, we get to cover a lot of different topics over the course of an hour, rather than just one, which is what we traditionally do. So, please reach out to us. Again, sotospeak@thefire.org, if you enjoy this, and we'll try and do more of them. The other thing I want to mention is our FIRE student network has this new program going on called FIRE Scholars, where we bring in five different students at colleges and universities across the country and work with them in an in-depth way, to kinda generate free speech programming on their campuses.

And I just wanted to make a plug for one of these programs that’s being put on by Rohan Krishnan over at Yale. He's got a new podcast out called Voices of the World, which interviews international Yale students about all the free expression issues abroad. So, I just encourage our listeners who enjoy this podcast to go and check out what Rohan’s doing. Great guy.

Will: Really great guy, yeah.

Nico: Excellent podcast. Again, it's called Voices of the World, and I'll link it in the show notes. Check it out. And if you're interested in the FIRE Scholars Program, you can email us. We've already got our class for this year, but perhaps future classes, we can bring you aboard. We provide training and support as you're kind of putting together speaking events on campus. I'm speaking at one, it’s either at Brown or Northeastern. I'm speaking at both colleges on back-to-back days, but one of them is a FIRE Scholars program. So, reach out to us, again, sotospeak@thefire.org.

Will: I didn’t know, but I'm glad you mentioned that because Rohan is a thoughtful dude. I will definitely check that out.

Nico: Yeah, Voices of the World. He sent me a Spotify link, so I know it's there. But it's probably on other platforms as well. Guys, I appreciate you taking the time to do this. I know we've all got meetings coming up here.

Will: Who knows what lurks in our inboxes when we get outta here.

Nico: I'm always like, “Oh, boy. What did I miss?”

Will: We could keep talking, I'm good with this.

Nico: So, this podcast is hosted and produced by me, Nico Perrino. And recorded and edited by my colleagues, Chris Mulkey, who's behind camera two, one, I don't know what we’ll call it. And Aaron Reese, who's behind the other camera, which I'm looking into. Hi, Aaron. To learn more about So to Speak, you can subscribe to our YouTube channel, which is linked in the show notes.

Most of our episodes, including this one, can be found on our YouTube channel, which is now distinct from the FIRE YouTube channel, for strategic reasons, which we can get into. Namely, that we're trying to curate a certain type of content on our FIRE channel and a certain type of content on our So to Speak channel, to drive YouTube subscribers. And if you don't subscribe to FIRE’s YouTube channel or to the So to Speak channel, please do.

Will: Get on that.

Nico: Get on it. Get on it. But we're also on Instagram, you can find us by searching the handle free speech talk. It's also the handle on Twitter. Or on Facebook at facebook.com/sotospeakpodcast. And again, feedback to sotospeak@thefire.org. If you enjoyed this episode, please consider leaving us a review wherever you listen to podcasts. As we talked about with the social media companies, if you leave a positive review, the podcast gets put in front of other people who might be interested in this podcast. So, it’s the best way to get this show and the messages that we send to new audiences.

Will: And if you didn’t enjoy it, my email is will@thefire.org. I'm always on, just shoot me a line and we can talk it out.

Nico: That's actually his email, too.

Will: Yes, will@thefire.org.

Nico: There you go. See, that's one of the things about FIRE: if you're old school, before a certain time, you just get your firstname@thefire.org.

Aaron: No, there’s too many Aarons.

Will: Yeah, there’s a couple Aarons, there’s a couple Wills. You’re still the only Nico though. You still got that going –

Nico: I’m still the only Nico. Yeah because there’s multiple ways to spell it, N-I-C-O, N-I-K-K-O.

Will: But will@thefire.org, bring me whatever you got. We'll talk it out.

Nico: All right, let's close this one up. Until next time, thanks again for listening. Cheers.

 
