Introducing the Exformation Podcast

Welcome to the Exformation Podcast

“Effective communication depends on a shared body of knowledge between the persons communicating. In using words, sounds, and gestures, the speaker has deliberately thrown away a huge body of information, though it remains implied. This shared context is called exformation.

Exformation is everything we do not actually say but have in our heads when, or before, we say anything at all – whereas information is the measurable, demonstrable utterance we actually come out with. In this new podcast, we utilize Exformation to discuss new ideas, analyze current events, and explore the possibilities of the future. Welcome to the conversation!” -Will Rinehart

 

Episode 1

In this first episode of Exformation, host Will Rinehart and co-host Caden Rosenbaum review and discuss the recent decisions made by the controversial Facebook Oversight Board and how those decisions will impact the future of free speech, the internet, and antitrust.

 

Episode Transcript

Will: This is Exformation, brought to you by the Center for Growth and Opportunity at Utah State University. I’m Will, and today I’m joined by my colleague Caden Rosenbaum. We’re talking about Facebook’s oversight board, which just had its first series of decisions, five decisions I think, recently. But before we get into that, let’s get into the context, right? Like, what’s going on, man? How are you doing today, before we even talk about this?

Caden: This is the entire mindset of social media and free speech and the First Amendment, but it’s all come together in a regulate-first type of scenario. I think it’s worth discussing. I don’t know about you.

Will: Oh, no. No. No. I think this is very much what everyone is trying to figure out, right? I’m kind of surprised there hasn’t been more attention paid to what happened with these decisions.

Caden: Yeah, exactly.

Will: They didn’t really get very much news. So I’m hoping today to talk through some of these finer elements. But yeah, let’s start with the context, right? I think the really big problem we’re seeing with Facebook, and really with all the social media platforms in particular, is that they’re pretty much besieged on all fronts, right? You’ve seen that especially last year and the two years beforehand, really since Cambridge Analytica, which happened in 2018.

Caden: That was not good news.

Will: This entire [crosstalk].

Caden: Not good news, was it?

Will: Yeah. Yeah. No, not at all. But what are you thinking about it, as far as this content moderation is concerned? I know you’ve been following this closely. What is it that you’re seeing within content moderation that particularly irks you or has been interesting to you within this space?

Caden: Well, on the one hand, you’ve got those that say there’s too much content moderation on Facebook. You had a massive conservative swarm to Parler and whatever the other apps are, I don’t follow them closely because I don’t use them, but the [inaudible] on one hand says we’re moderating too much, and then you’ve got the other hand where it’s you’re not doing enough, right? Because there were some Facebook groups that led to the Capitol riots, the insurrection at the Capitol. That was organized through a group on Facebook. So there’s got to be a balance there. But as far as public outcry goes, it’s just anarchy, right? It’s just one side versus the other and there’s no in-between, and it doesn’t matter because we’re all just on Facebook commenting on each other’s posts. There’s really no resolution to that issue without [crosstalk].

Will: Do you think there’s ever going to be a balance, though? This is something I’ve been wondering about extensively. You’re right that there are these elements saying there’s too much moderation or there’s not enough moderation, right? But how each person sees whether moderation is done correctly or incorrectly is always a very individual understanding of the goodness of content moderation practices, right? Each individual is always going to think that the moderators are against them in some way or another, and we’ve actually seen this across a lot of other social media.

Caden: I mean, it gets down to, like, liberals are running our education and there are no conservative voices or moderate voices teaching our kids, right? There’s always going to be this thought that the platform is against you.

Will: Yes.

Caden: And [inaudible] that kind of thought, that kind of narrative, isn’t so much a moderation issue as it is just a systemic issue that we all have, right? You go to a restaurant, you try to order first, and there’s some other guy in line because he just created his own line, and all of a sudden you think he cheated, but then you look down at the floor and there’s a “line starts here” sign.

Will: Yeah, exactly and he’s right behind it.

Caden: That doesn’t mean you’re not frustrated.

Will: Yeah, exactly.

Caden: I’m always frustrated when it happens, but there’s really no good solution. It’s just a problem that we have to be aware of.

Will: I mean, Facebook in particular, and Zuckerberg, has been trying to solve this problem through regulation, or at least has been calling for regulation in order to solve this problem, right? And that to me is also a worrying part of all of this. That’s obviously part of what we’re going to get into with the context of the oversight board, which I think is really interesting: where did it come from? In part, you see all of these things happening because Facebook, and all these other social media sites, really just want the bleeding stopped. They want all of the social pressures that they’re facing to stop. And part of that comes from pressure by Zuckerberg in particular, who said we should have new regulation, particularly on privacy, and they’re even willing to talk about Section 230 issues, which at some point in the future I feel like we’ll talk about. But that seems to me to be part and parcel of all of this: how do you stop the bleeding? If you do stop the bleeding through some sort of agency action, some new piece of legislation, it just kind of exports the problem that they’re facing, which is that they feel like they’re not doing the right thing, whatever the right thing should be.

Caden: I mean, I sort of view it differently, right? Like, what is the bleeding really? Because it’s not money; they’re definitely fine on money, revenue is still coming in. The bleeding is just political, and when we’re talking about what’s causing the bleeding, it’s this content issue, right, that we just went over. And so there’s this quote from Mark Zuckerberg: “private companies shouldn’t make so many decisions alone when they touch on fundamental democratic values.” What he’s doing is saying, US government, please take the blame from me for all these content moderation decisions so that people don’t hate me for blocking them out of Facebook. They’ll be mad at you, and they can do what they will with you. I think that’s a fundamental way to look at it. I don’t think it’s about stopping the bleeding in terms of actual fallout if no regulation or anything comes through. I think it’s just about placing the blame somewhere else so that they don’t really have to deal with it. That’s sort of what concerns me whenever Mark Zuckerberg or Jack Dorsey come out asking for regulation: we need to be regulated, this is not something we can do alone. It is, it absolutely is, but they don’t want the blame for it.

Will: Yeah, they don’t want the blame for it. That quote you just read is also really fascinating because it gets into this democracy point, which I think is kind of the subtext to this entire conversation about the oversight board, and you see it in a couple of different cases that the oversight board is actually getting into. But at the end of the day, I constantly think, wait a second: if Facebook is the place in which people are maintaining and supporting and extending democracy, what kind of view and theory of democracy do we really have? That is not a good place to be as far as a society is concerned, right? But at the same time, that’s not what democracy is; democracy is a whole bunch of other things, of which Facebook is one part. And I really think it’s been completely overblown, this idea that Facebook or Twitter taking down some of these people is a complete and massive affront to democracy and free speech. We can talk about that later because that is…

Caden: Well, I mean, it goes into that naive notion. I read a piece by Evelyn Douek on Lawfare, and she says the notion that connecting everybody together was going to remove all these barriers in society was just sort of naive, and that’s [crosstalk].

Will: But that was pretty endemic. That was the goal, right? That was the goal throughout the 1990s. That was the utopian ideal. I feel like people have kind of forgotten that the thing Facebook has done, and a lot of these social media sites have done, is to actually enact a kind of utopian ideal that we long wanted, and there are some really big positives to this. In particular, I very much enjoy my time on social media, in limited amounts; there are a lot of things I do in order to not get the kind of content I don’t want on social media. I use filters extensively, and I use the newsfeed killer. I always forget the name of it, but it’s like a newsfeed killer or whatever it’s called. On Facebook, it completely gets rid of your newsfeed, right? So whenever you need to work, you throw it on, it kills the feed, and you don’t interact with the actual website. That’s not really used extensively by people; it’s not a very widely downloaded app.

Caden: I’ve never even heard of it.

Will: See, exactly. And you know what’s going on; I mean, you’re very adept at these issues. So those sorts of tools, to me, also aren’t part of this conversation. But anyway, we’re getting slightly off track, so let’s talk about the oversight board. They’ve had, what, nine cases so far that they’re taking on, and five of them have been decided. That first tranche really came down last week. Just by way of context, to make sure everyone understands: we did in fact end up filing in one of these cases, the Joseph Goebbels case, which I think was the only one really to apply to the United States. There was another one dealing with Myanmar, which, I mean, Myanmar is right now under a coup, so who knows what kind of content is going to come out of that? But yeah, let’s talk about the Goebbels case, unless you want to talk about some of the others that were effectively made moot, but I don’t necessarily know that that’s a really huge point.

Caden: I mean, I’m sure I could get to them later and I know I’m going to talk about them. But [crosstalk] first, right?

Will: Yeah. So this case. Do you want to do a real quick overview of the case?

Caden: No, not so much.

Will: Did you not prep for this Caden?

Caden: I did [crosstalk].

Will: I knew you prepped. Look, you’ve got your little notepad and everything. I knew you prepped for this, okay.

Caden: I just wrote down a Zuckerberg quote and everything [crosstalk].

Will: Yeah. Exactly. Exactly. So the Goebbels case: basically, there’s a guy who posted a picture of Joseph Goebbels, and it was talking about the one big lie, right? This idea of Goebbels and the big lie, which is the idea that you don’t tell people small lies, you tell them one big lie and coalesce ideas around it. And interestingly enough, I was going through the materials that Facebook eventually put out on each one of these cases, and I don’t know if it is correct, but the poster actually had a little one-pager. He did his little one-pager and kind of explained his position on it.

Caden: Yes. He did comment to the [crosstalk].

Will: Yeah. Yeah. I thought that was kind of cool.

Caden: I read that. “Yes, I am the one who posted this”.

Will: I’m the one— I’m the poster which is yeah. So anyways…

Caden: It wasn’t a wacko comment either; it was pretty well done, I thought…

Will: No. So he was commenting about Joseph Goebbels, right? He was talking about Joseph Goebbels and this idea of the one big lie, and he was relating it to a larger, broader context: issues he had with the GOP, with Republicans, with Steve Bannon, and with this constant issue he has seen around misinformation. And that’s what I think is really interesting about this piece: in part, we really don’t have the context, and there’s this really important context that would be needed to judge the post.

Caden: Right.

Will: Yeah, and in our filing, as our colleague Chris really pointed out, you’ve got to have a lot more for us to even understand what’s going on, right? We really do actually need to see the post itself.

Caden: Right. They sort of just gave us, like, a hundred and four words, something like a hundred and four, and that was all we had; most of it was about Facebook taking it down.

Will: Yeah.

Caden: That’s nothing to comment on. What kind of substantive comments could you possibly get from a hundred-and-four-word description of what happened? If you expect people who know about Nazism and misinformation to comment and say, hey, yes, that is a bad thing, you’re not going to get it, because they’re not going to know what you took down without seeing it. On the other hand, if you’re looking for experts, like you and me, who are looking at content moderation and what kind of policies should be in place, you’re really not going to get that either, because all you’re asking us is whether the Joseph Goebbels quote was good or not. And if you ask that, we’re going to say no, of course, but that’s not the point.

Will: Yeah.

Caden: The point is the post, the context, and how that should be construed.

Will: Yes. So, related to this: Facebook took it down because it fell under what they call the dangerous individuals and organizations policy, which basically means that individuals can’t do any sort of posting related to these dangerous individuals or organizations. Which I think is kind of interesting, right, because the side comment to this is that you can’t even comment about these things; you effectively can’t even express subtle disagreement with these dangerous individuals and organizations. And I think that’s also a really broad problem, because then you’re looking at a very weird set of historical figures we would care a lot about, like Joseph Goebbels and, obviously, the Nazis, who for very good reasons are triggering historical characters, and yet other just as monstrous individuals aren’t getting critiqued in the same sort of way, right? Maybe that’s because there is, again, more context to Nazism and Joseph Goebbels in a way that there isn’t for, say, Genghis Khan or Pol Pot or even Chairman Mao, who killed a whole bunch of people, even though Chairman Mao was a very dangerous individual. I don’t want to die on that one.

Caden: Okay.

Will: So yeah, I don’t know how to place this question in context, though, or even what this dangerous individuals and organizations policy should be. I mean, that one I think would probably need a deeper dive.

Caden: Yeah. I mean, it would, and there’s got to be some question of how you distinguish dangerous individuals posting, people posting about dangerous individuals, and people idolizing dangerous individuals, right? Because if we’re looking at the Joseph Goebbels thing in a very narrow context, let’s just dumb it down to a really simplistic layer: we’re saying that this person posted a Joseph Goebbels quote, it was criticizing him, and because there’s tension there with Nazism and all of that, we’re just going to take it down altogether, right?

Will: Yep.

Caden: If that’s the way they wanted to go, then anyone posting about Genghis Khan’s massacres and escapades is also going to be taken down, but not, like, Ancestry.com, which is going to post about all the descendants of Genghis Khan, because that’s a very fluffy piece.

Will: That is actually— yeah, that’s fascinating, right?

Caden: When you look at it in that context, and this isn’t my idea, this is our colleague Chris’s idea, but when you look at it in that context, it becomes really clear how difficult this whole issue is. And then if you want to move into the oversight board, we can talk about it broadly here. When you look at the oversight board taking things case by case, they’re basically just another content moderator, another person in front of a screen looking at content saying yes or no, except they have this sort of independent superpower thing going on, right? They can tell you yes or no on this content, and the comments that they’re receiving as feedback from the public aren’t necessarily all that substantive, because the public hasn’t been given enough information.

Will: Yes, exactly, right? And I think that’s one of the things we were pretty critical about with the oversight board, which I hope they’re going to expand: just more context about the actual post itself and the content itself. And I think you’re exactly right; how do you even apply this policy consistently? However, on a small element, before we really get into the oversight board itself: this is one of the things I’m still trying to figure out about the oversight board, and I really don’t know that I have any great answers, which is that there’s a real inherent tension between the two things that are effectively happening. The oversight board is making changes to individual cases, whereas there are really two things we care about, right? One is the overall process by which the rules are determined: the process by which the dangerous individuals and organizations policy comes about. That’s one sort of process we would care a lot about, which the oversight board effectively doesn’t have any decision power over. And then there’s the second question, which is how you make a decision within that particular overarching rule. So, for example, how do you actually apply the dangerous individuals and organizations policy, and what are the strategies for applying that sort of policy in all of these kinds of weird cases, even the Genghis Khan case we were talking about?

Caden: At least give some certainty, right, to anyone who’s posting in the future. That way, whenever something is taken down, you don’t have Ted Cruz or Josh Hawley banging their fists saying someone’s been censored, but instead understanding why they’ve been censored. At least if they go bang their fists that time, we all know that it’s just political showmanship.

Will: Yeah. Yeah, and there’s actually some really good research that we can put in the liner notes on all this. Back years and years ago, some of my colleagues over at the University of Illinois were actually working on a lot of these questions when they first came about, right around 2011, 2012, about what happens when you actually take down content, because that’s really when a lot of these policies started ramping up: 2011 to 2012.

Caden: I was still in high school.

Will: Yeah, I’m slightly older than you are. I’ve been working on this now for almost ten years, in fits and starts. It’s been fascinating to see the entire industry change. But when I was still in the academy and doing work on content moderation and content issues writ large, and this was the very, very beginning of it, this is one of the things that at least some of my colleagues were trying to understand: how do individuals respond to a content moderation takedown, and how do they respond to the fact that there’s this opaque system, which they really don’t understand, taking down their content? And one of the things you find is that if you tell individuals, hey, here’s why we’re doing this, then typically two things happen long term. One, they usually feel a little more positive about the actual experience, and second, they often become better actors, or it’s highly likely that they’ll become better actors, because it no longer feels like a game. That was one thing that at least I’ve kind of theorized: they no longer feel like this is a game of cat and mouse, but rather that there are certain expectations on these social networking communities, or whatever the community may be. The more there is this give and take of information about how these decisions are actually being made, instead of just, here’s the determination, you violated the policy, without really explaining or having some way to define why the policy was violated, the better. Which in fact might be difficult to do. I’m not gonna lie, there may just be limits there.

Caden: Yeah, absolutely difficult.

Will: I’m sorry, you were saying?

Caden: I said, yeah, it’s absolutely difficult, but I am totally with you. I mean, imagine someone gets nothing other than “your content has been removed,” and then they Google why, and the first thing that comes up is just algorithms and algorithmic bias. Because I’m guessing, if their content was removed, maybe not, but probably, they were also looking into some things that were a little racy, and the way Google’s advertising metrics work, they’re going to see more racy content.

Will: Yeah.

Caden: That will just kind of confirm the bias that Facebook is trying to censor them, which isn’t true. It’s not a concerted effort. It’s either a mistake, or they violated a very clear policy that no one has explained to them.

Will: Yeah. So, getting into the last part we want to hit on: the oversight board, we have these five cases, and we have another one coming up on Trump, which is going to be super interesting. Basically, the oversight board is trying to determine whether or not the takedown of Trump’s content and his page and all of that is going to continue, which I think is going to be really interesting. But what are your thoughts about the effectiveness, the usefulness of the oversight board writ large as kind of an institutional body? Because I think that’s really the most important thing, right? Why all of this? Why does it matter? Why does it matter that we’ve got this oversight board?

Caden: Well, I mean, if we go back to the very beginning of what we started talking about, we’ve got all this public outcry, we’ve got regulatory calls, we’ve got antitrust lawsuits coming down now. So they really needed this independent body; if it wasn’t going to be the government taking the blame off of them, it had to be something else. So they’ve set up this oversight board that is supposed to be independent. It’s funded by an independent trust. It has no binding to whatever Facebook wants, technically. But if you get down to it, the board is sort of set up like the early Supreme Court was, right?

Will: Yeah.

Caden: So the Supreme Court has its origins in Article III of the Constitution, except that didn’t exactly create a court; all it did was give the authority to review cases. Then you have the Judiciary Act of 1789, which actually established the Supreme Court and let it sit in session and hear those cases. What the charter does is establish this board and say, here is your mandate, which is actually still part of the charter: this is what you’re going to be working on, the kind of cases you’ll be deciding. And from there, what the board has done in these last five cases, really the four that weren’t moot, is sort of say, okay, well, this is our scope, and it has sort of been expanding that scope much like Chief Justice Marshall’s Court did back in the early days. Thomas Jefferson, well, this was before Marshall, but Thomas Jefferson would say to the Supreme Court, “Hey, I want to put all these ships out here. Is this good?” and the Court said back, we’re not an advisory committee; we don’t tell you this unless there’s an actual conflict at play. So that was number one, right? That’s the scope. And then Chief Justice Marshall would take a case that was not really what anyone thought the Court would handle, and he would say, no, no, see, in the Constitution it says we have this authority, so we’re going to broaden our scope; we’re going to cover this. And that’s sort of what the oversight board has been doing, or at least trying, right? It’s sort of this regulatory reviewing body, and in the regulatory sense it takes comments in and says, “Hey public, what do you think about this? Are we missing something?” It’s got its own little board of experts, and in an administrative agency sense this would be considered more of a technocratic agency, because the experts head the board. But in a general sense, it’s got good intentions, right? It’s got good foundations in historical thought and the development of jurisprudence, up until the point where you get to the actual effectiveness of the board: how do the decisions of the board actually affect Facebook? On a case-by-case basis, whenever you’ve got the Joseph Goebbels content or the breast cancer awareness content, they can say, yes, you have to put it back up, or no, you were right; overturn it or uphold it. But then, in the more policy context, in order to distinguish themselves from just a case-by-case regulatory body, they sort of propose these policy changes that would clarify or limit Facebook’s policies as they are, especially where they’re not clear. But the thing is, Facebook doesn’t have to follow these policy proposals. So whenever I think about the oversight board, I really look at it and think it’s sort of the same as Subaru’s “Share the Love” thing. It’s a really great thing they do, right? They sell cars, they donate money, and it’s great.

Will: Yes, Subaru.

Caden: [Crosstalk] whenever they advertise it, it’s just an ad. It’s PR. So the oversight board may be founded on really good intentions, but at the end of the day, they’re doing this because it’s good PR; it gets people off their back. That’s sort of my view on the oversight board as it is. Now, I don’t think that’s going to stay the same. Facebook could prove me wrong by actually implementing some of the policies the board proposes. But at this very moment, it’s just advertising. It’s just good press.

Will: Yeah, and to be very, very clear, I think it’s important to draw the fine-grained distinction between the two, right? There is a policy recommendation, which is generally separate from the content moderation decision, so the oversight board really only has power over the content moderation decision itself, whether or not to keep up the Joseph Goebbels post. That’s the only thing over which the oversight board, within the charter, is given true power, but then there’s this…

Caden: Awaiting the bylaws [?] of course.

Will: Yes within the bylaws but then there’s this kind of…

Caden: They’re coming, they’re coming. So we’ll figure that out later. But for now, yeah, for sure, you’re right.

Will: Yeah, but then there are also these overarching policy recommendations, which the oversight board is also making, right? And I think the big question is whether or not Facebook itself is actually going to adopt them, and how often.

Caden: The best example is the COVID-19 case, right? It’s case 2020-006 if you want to go look it up. It was a post in France, and it said something to the effect that there is a cure for COVID, and I guess it is [crosstalk].

Will: Of course, it’s in France, right? It’s probably cheese and wine, the baguette.

Caden: Don’t feel barren [?] about that.

Will: Yeah, no, no, no, and a baguette, that’s super cheap, because they regulate the price of bread and baguettes in France. I will say French baguettes are delightful, but yes.

Caden: I’m an uncultured fool, so I have no idea what you’re talking about, but it sounds awesome to me. Anyway, so this post was basically saying that there’s a cure, and I’m guessing it was bad timing with that. There wasn’t a cure yet.

Will: Yeah. Bad timing. Okay, yeah.

Caden: They took it down because it was supposedly misinformation, and so when the board got its hands on it, it said, hey, your misinformation policy is not clear, so we’re going to overturn this; put that back up, because there’s no way to figure out whether a post is going to violate the policy. Then it said, hey, you need to put out some guidelines that clarify this. And what Facebook said was, okay, yeah, we’ll put the post back up, but, hey, we’ve consulted with all these people and we know all this stuff, and we’re just not going to do it during a global pandemic. We’re just not going to change our policy. Tough. So then there’s the question of what the board does now. With the Supreme Court, whenever they say, hey, you need to change this law, like with the Civil Rights Act or something, there are actually two other branches that have to enforce that decision; the Court doesn’t itself have an army. And it’s the same thing with the oversight board. The oversight board doesn’t have an army. It’s just relying on Facebook to say, yes, okay, fine, oversight board, I will go ahead and clarify my policies, and that’s a big issue. If you really try to look at the oversight board as an end-all-be-all to this content moderation problem, it’s definitely not.

Will: Yeah, no, and I think that’s very, very astute. I think that’s exactly right. The other question, which I’ve constantly been thinking about with regard to this whole policy question, and just to go back to what I said earlier: how are the rules decided, as compared to how individual cases are decided within those rules? Those are two different things, right? The decision within the ruleset is different from how the rules themselves are determined. And the one thing I wonder about the oversight board is how often they might be able to effectively overturn local decisions such that it forces the hand of Facebook. If that becomes pretty consistent, then the power for them is just the ability to keep saying, no, we’re basically going to get rid of all of these decisions that you’ve made. And that, I think, is going to be interesting, because they’re also limited in the number of cases that they can take. But to at least push back on that a little bit, I think the COVID thing is really, really interesting, especially since the science of COVID is changing, right? We have to understand, or at least respect, that science itself is not a singular product; science is an open, dynamic process that is continually changing. That makes it difficult, because people are looking for singular truths to hang their hats on, which is not what science provides. But more interesting is the question that I think Facebook and the oversight board and all of this is trying to tease out, which is how do we get back to a place of legitimacy? I think that’s really, ultimately, what Facebook and a lot of the big tech companies are trying to shift back toward, right? How do they signal, one way or the other, to users, to the people they interact with, to policymakers, to lawyers, to everyone, that they are now a legitimately good actor again? And that’s actually going to be very, very difficult.

Caden: The question is, how did they lose that reputation as a good actor? All I’ve seen from Facebook is that they’re open to norms of free expression. They’ve never actually violated any laws; they’ve always been open to this norm, and I said norm, not law, of free expression and openness, up until a point where something is clearly dangerous, right? So if it were somebody speaking terribly and threatening violence and trying to incite a mob in a public square, we’d probably crack down on that, and it would go under the First Amendment, because that’s a whole issue, but the government would probably win. They would still have to go to court, though.

Will: Yeah.

Caden: Now, Facebook can act a lot quicker, and the problem people have is that it’s just heavy-handed when it comes to that stuff. Like, you have a shooting in New Zealand, and Facebook is going to try to take it down as fast as possible. That’s a whole other issue if we talk about response time, but there’s all this content out there that Facebook takes down because it’s clearly not good, and they don’t have time to go to a court and say, hey, can we take this down? And that’s sort of the beauty of an internet company: you don’t have to wait to take down things that are clearly bad.

Will: Yes.

Caden: But then you just get back to the problem of okay, well, define bad.

Will: Exactly. Exactly. And especially since the one thing that has worried me in all of this is that this is still an oversight board effectively for Facebook, and it really isn’t looking at content that comes from other sites, right? Not to say that we want to implement a kind of court system within social media content moderation, but this idea has been floating around, which we’ve kind of hinted at here in this conversation as well, of a self-regulatory organization; the oversight board does kind of act like an SRO, as they’re called. This idea was floating around at least in 2015, 2016, and I know our colleague Chris has worked a lot on this, and a couple of my other colleagues have worked pretty extensively on it, and we’ll try to get one of them on a future podcast to actually talk through it. But this idea of having a broader content moderation conversation, I don’t know that it really solves the legitimacy question, which is endemic within all institutions, not just big tech. Honestly, if we’re going to be very honest about it, big tech was the holdout: almost every other institution, corporations, government, newspapers, all these other institutions that existed for the longest time, were well respected until effectively fifteen to twenty years ago. And it seemed that big tech, for a very long time, until about two years ago, really bucked that trend, right? These new, nascent tech companies held the respect and legitimacy and trust, whereas everything else was effectively not trusted and had all of these endemic legitimacy problems. To me, that’s personally going to be the really, really big set of questions over the next five to ten years: how you rebuild that, or whether there are any institutions that can rebuild it. I don’t know; I think trying to put the pieces back together is going to be almost impossible. And maybe this is just a wave sort of thing, where people say, oh, the issues that we were dealing with in this time period just go away. I don’t know if that’s going to happen, but yeah.

Caden: [Crosstalk] but have you seen the progress of 2021? I just put out a piece that was like, hey, be positive, and then there was this Myanmar coup [crosstalk].

Will: Whoopsie daisy. Yeah, a coup. Yeah, major. Yeah, major [crosstalk]. They shut down the internet and yeah.

Caden: Oh, and don’t forget about Robinhood. Yeah, the whole GME thing, shutting down the free market, if that’s what you want to call it. I don’t know; I don’t know enough about that. But anyway, back to the idea of the systemic problem, or what did you say, was it endemic? [crosstalk] I don’t know how you put Humpty Dumpty back together again, right? I don’t know if you can. Maybe it’s like McDonald’s, right? People just stopped caring; something had to have happened there.

Will: But with McDonald’s, yeah. That’s actually a really interesting kind of analogy.

Caden: They have a big Burger.

Will: Yeah. Well, it could have been when they started selling salads, right? Now a major component is the fact that McDonald’s does sell salads.

Caden: But the biggest component was when McDonald’s started selling breakfast all day and all night. It really just shifts the culture of what we care about. And so I guess the biggest thing for tech now, what I would project, and I’m not an expert in sociology or anything, but I would just say we would have to have a shift in culture. We would have to care about something more than Facebook and social media regulation, and that doesn’t look like it’s coming down the pipeline anytime soon.

Will: Yeah. No, you’re exactly right. I was going to try to make a joke about the fact that they also got rid of the pancakes in the McDonald’s breakfast, but I’m not going to make any comment about that, and I think that…

Caden: You know what, it’s okay to like the pancakes. Good for you.

Will: They were sugary and just so buttery. I loved them. That was the thing.

Caden: I’m a McGriddle kind of guy. So, I’m really…

Will: Really?

Caden: Yeah. Sausage McGriddle.

Will: You got those little tiny syrup packets; they were just pure bliss. Oh yeah, and you put like four of them on there with those little packets of butter.

Caden: Yeah, then you just drove straight to the dentist and that was your morning.

Will: Whoopsie daisy. Yeah. Yeah, I don’t have any cavities but I do hate going to the dentist.

Caden: How often do you see a big burger documentary anymore, right? Nobody’s eating McDonald’s for like fifty days straight anymore, because people just know, and we accept it, and we still like the convenience of a McDonald’s.

Will: Yeah.

Caden: That’s just an interesting thought.

Will: I think we’re out of time. As always, great chatting with you. This has been Exformation. You can find all of our content at our website, theCGO.org. That’s with “the” at the very beginning. My name is Will, and I’ve had Caden talking with me today about Facebook. Thanks, Caden, for taking the time to chat.

Caden: Thanks for having me.

[END]

CGO scholars and fellows frequently comment on a variety of topics for the popular press. The views expressed therein are those of the authors and do not necessarily reflect the views of the Center for Growth and Opportunity or the views of Utah State University.