Why Government Regulation of Tech Censorship is Unnecessary and Undesirable

Cato Vice President John Samples on his role on the Facebook Oversight Board.

Bob Zadek
17 min read · Jul 10, 2020


We are witnessing yet another purge on major social media networks from Twitter to YouTube. Conservative and libertarian voices are among those being banned or suspended for vague violations of the terms of service, including Stefan Molyneux, whose 15-year-old channel was flagged for promoting "hate speech."

Meanwhile, the general climate of free speech and expression is perhaps at an all-time low, with cancel culture accelerating to the point that we might question how much longer our First Amendment protections will be worth anything.

However, I've taken a stand against the government regulation of social media, despite the de-platforming of voices that are supposedly "on our side." The reason has to do with these companies' status as private actors, who themselves have speech rights. To force them to publish content against their will — no matter how mild or offensive — cuts against the Framers' vision of the First Amendment.

John Samples, Vice President of the Cato Institute, is part of a solution that I can get behind. As a member of the Oversight Board, he is among a diverse group of academics who will hold Facebook accountable for its policies, maximizing freedom of expression without allowing its users to post harmful content. Admittedly, this is a fine line, but the power ultimately resides with users, who are free to exit the platform for "freer" alternatives.

As we've seen with this past week's exodus of conservatives from Twitter to the knock-off app Parler, the problem of content moderation runs much deeper than the ideological leanings of a platform's founders. Parler quickly earned a reputation for being trigger-happy with bans, despite its claim to allow any speech that would be allowed "on the streets of New York City." Those targeted appear to be people who dared to question the app's less-than-transparent terms of service.

Twitter proves that whoever invents and sustains lively discussion on the online equivalent of the public square can get rich. This explains why the big companies are now outsourcing the role of moderation to credible outsiders like the Oversight Board, and have a vested interest in solving the problem.

This Sunday, Samples joined me for the full hour to discuss the process the Oversight Board will use to balance concerns about free speech against the need to prevent certain forms of speech that have never been protected (e.g., incitement to violence).

The marketplace of ideas is alive and well on the show of #ideanotattitude.

TRANSCRIPT

Bob Zadek: Hello everyone, and welcome to The Bob Zadek Show, the longest-running live libertarian talk radio show on all of radio. Always the show of ideas, never attitude.

President Trump rules over approximately 330 million people. Governor Newsom is a distant third; he rules over about 40 million people, the population of California. This morning's guest looks after and protects the rights of 60 million people.

Who is this person? This morning's guest is John Samples. He is Vice President of the Cato Institute. That itself makes John a significant public figure. John is also a member of a very exclusive group: the Oversight Board of Facebook, an independent organization he has been asked to serve on. John's responsibility, along with the other members of the Oversight Board, is to protect our rights — not rights in the Bill of Rights sense, or in the legal sense, but our right to have our views heard by others in the public domain. It is a very important role. The Oversight Board was created probably for political as well as for somewhat profit-making reasons, but his position is unique, and it is important to all of us. John will help us understand the complex relationship between Facebook, its customers, the government, foreign affairs, and the American electoral process.

John, welcome to the show this morning.

John Samples: Thanks for having me, Bob. I didn't know I had all these responsibilities. I have to say, if I am representing 60 million people, I was appointed, not elected. If I have to stand for election, I'll be going home. This was a group of people with different backgrounds who were in some way related to the task. Nobody was meant to be left out; we've got another 20 members coming for anyone who might have felt left out the first time around. It is not, strictly speaking, representative, but it has to have some relationship to the users. Ultimately, I do think that if this thing is going to work, everybody is going to have to represent all Facebook users as well as possible.

Exploring Facebook’s Business Model

Bob Zadek: So John, let’s start with the basics. First of all, Facebook is a corporation for profit. It is owned by its shareholders. It is managed by a board of directors like every for-profit company. The board of directors is elected by the shareholders, the board of directors hires officers and officers hire employees. It is straightforward stuff. However, Facebook is representative of a business activity which is social media in the broadest sense. They are a platform. We start with Facebook being a company. Just like Amazon, just like General Motors. It’s a company, but in many ways its business model makes it different. Help us understand the meaning of the business activity called “platform.”

John Samples: Economists talk about multi-sided platforms and two-sided platforms, and a lot of the new Silicon Valley businesses, the high-tech businesses, fit this model. On the business side, which you mentioned, Facebook is not like GM. It doesn't produce a product and sell it to its customers. What it does is really provide a service. At one time Mark Zuckerberg was talking about it as a utility that attracts people to use the service. And when people use it, their data, their behavior, and their actions online are known and are then used to better direct advertising to them. The advertising you see in your Facebook newsfeed is actually the heart of the business. Every business has to generate revenue some way.

This is the business model that Facebook shares with many of these other platforms. They connect people. There are users on one side and advertisers on the other. Without the users being there, there's not going to be any revenue. If you don't attract as many users as you can, then you're also not meeting your obligation to maximize value for your shareholders. So that's the basic idea.

Now, there's another thing that follows from the business model that people don't often note. People wonder why some kinds of speech are excluded. The reason is that if you do not moderate the content and exclude some things from the platform, you're going to have a problem attracting users. You are going to have a situation where some users are repelled by it. We know, for example, that there are sites on the internet that don't moderate content, or that have low moderation. Over at Reddit, there is not heavy moderation. That has attracted political controversy, and Reddit has never been a big moneymaker. If you let everything stay up on Facebook, you're going to alienate users and you may have fewer users.

Bob Zadek: A good analogy, in terms of business model, is something which is now on the wane: network television. It offers a product, free entertainment. It provides content at no charge. How does a business get to offer free content? Because it can then ask advertisers to pay.

You are assured of getting eyeballs. Not only that, but based upon our studies, we can tell you what the demographic of the eyeballs is. This is a married woman. This is a single woman. This is a male, etc. This is what Facebook has mastered by being very granular. It offers the service free, like television, to the users. The users disclose everything, or at least a lot, about their lives, which Facebook can then organize and deliver — not just young kids or married women, but very granular eyeballs and ears — to the advertisers. So that's the model. Now we have Facebook providing content to draw eyeballs and other senses to its attention. Is that a pretty fair summary of the Facebook business model? It basically provides potential customers to advertisers. That's what it does.

John Samples: The crucial thing is the granular part. This is the best form of directed advertising in the history of the world. It can speak to small groups. When you advertise on TV, you waste a lot of the effort because a lot of people don't care. On Facebook in the past, the targetable group could be as small as 500 people. The other day I got an advertisement on my Facebook that had my name on it; it was directed at me. That's a pretty granular piece of advertising.

Section 230 of the Federal Code: Distinguishing Editorial Content

Bob Zadek: So that is the business model. Why is it of interest to the government? Why does anybody care? Why do Senators get apoplectic, and why are there hearings? Why are they inviting social media executives to Washington to draw a great amount of voter attention? If all these companies are doing is providing customers to advertisers, that is fairly benign, so why would the country care? Now there is this fear of regulation and of litigation. Section 230 has a very special meaning to you, doesn't it?

John Samples: It does. I have to admit, it sounds off-putting and ominous, but it is just a section of the federal code. When 230 was passed in 1996, it was written with bipartisan support. Congress was responding to a situation: there were two court cases prior to the passage of 230 that explain what was going on. On the one hand, there was a company called CompuServe that had bulletin boards. At one point somebody came onto one of the discussion groups and defamed another person. They libeled them, which is a tort. The person who was defamed filed suit against CompuServe as well. So the question was whether CompuServe was liable. The answer was no: CompuServe just hosted the discussion forum and did not edit it.

On the other side was another company that no longer exists, called Prodigy, which had a site with discussion forums, including one on money matters that was very popular. Someone got on there and defamed an entire company; it was a serious libel matter involving finances. So the question was presented again, this time in a different part of the country. But Prodigy had filtered its discussion boards for decency. So the court said that because Prodigy exercised editorial control, it was partially liable and could be sued for defamation. So there were contrary court cases here.

Now 230 decides the issue. It takes it away from the courts and says that these companies are not liable for what people say on the website. During the '90s there was a great deal of concern about obscenity and pornography online. So the idea was that you wanted to empower the companies and give them the right to do content moderation, and to remove things that were violent and distasteful from the site.

Congress made sure that they have that right. That's part of Section 230. So there are two sides to it. One is that the companies are protected from being held responsible for defamation and other things that their users do. On the other hand, they have the power to remove anything from the site, including speech protected by the First Amendment, if they want to. They have that power, and you cannot hold them liable for exercising it. So that's the whole story.

Bob Zadek: You mentioned free speech. Free speech doesn't apply here because the Bill of Rights protects citizens from having their speech limited by the government. Free speech has nothing to do with transactions between private, non-governmental actors. The Bill of Rights has nothing to say about private communications. The distinction, before Section 230 became the law of the land, was this: if you exercised editorial control over what appeared in your newspaper, magazine, or online forum, then you owned part of the content, and if that content was attacked as defamatory, you could be liable as well. If you did not exercise editorial control, like a telephone line, where the common carrier cannot control what people say, then you could not be held liable for defamation.

Editorial control makes one a publisher. And being a publisher makes you responsible for the content, and liable for libel, which can make the business model fail.

There are a whole lot of customers who are baring their lives and providing granular detail about them, which makes the advertising very efficient and very effective. And Facebook has all of this information. It is of course proprietary, and Facebook rules the roost. Now we have Facebook connecting potential customers with potential sellers: capitalism as it's supposed to be. So how does politics get involved? How do we start to care about Facebook, a commercial activity that is doing nothing other than connecting buyers and sellers, a benign activity?

John Samples: It is a lot like campaign finance, money, and speech. There were effectively no laws prior to 1968. Then Eugene McCarthy raised a bunch of money and drove the President out of the race. He showed how money in politics could affect elections. Fast forward to 2016: lots of people came to the view that these platforms can affect elections. People in Congress look at that and worry that it could lead to their departing Congress.

Bob Zadek: Obama was the first president to really understand and master how to use social media effectively. So now it is not only people selling products, but politicians selling their brand of politics. They discover that they have all of these millions and millions of potential voters. Politics discovers this highly efficient delivery system, delivering not customers but voters. Politics gets involved, and to quote Pericles, "Just because you do not take an interest in politics doesn't mean politics won't take an interest in you." So now we have "politics discovers the internet."

John Samples: My friend Arnold Kling has this idea called the "Fear of Others' Liberty." There's a fear on the left that the right is on Facebook with all this dangerous speech, so you fear their liberty. On the other hand, the right fears that the people running Facebook are left-wingers who want to suppress their speech. You're tied up at the congressional level with politicians worrying about elections. So it is not surprising that we have all this going on when you think about all of that.

Accusations of Facebook Bias from the Left and Right

Bob Zadek: Both the left and the right have appreciated the enormous power of social media. What is the suspicion that the political class has, and therefore voters perhaps have, about how Facebook and others are a “bad thing” in politics. What is the fear?

John Samples: For the politicians, it reduces the cost of information, which threatens incumbents, whose tremendous advantage is name recognition. That's why Mark Zuckerberg talks about the ads policy and not wanting to restrict political advertising, because it really does work against incumbents. So that's one side.

Bob Zadek: What is the accusation about suppression?

John Samples: There's a widely held belief, particularly on the right, that the platforms are taking down conservative speech in a biased way. On the left, the belief is that, whether to serve the business model, or because Zuckerberg is a secret Trumpy, or for some other reason, material is staying up that should be taken down because it is a threat to democracy. What those two views share is the idea that Facebook, in pursuing its business model, cannot by itself provide an independent and fair judgment of what should stay up and what should come down, assuming, as we must, that some things can't stay on the platform or the share price drops like a stone.

Bob Zadek: Therefore, even if Facebook is suppressing speech, there's nothing wrong with Facebook doing it. It may jeopardize the business model, but that is Facebook's issue, not the government's. So even if Facebook is suppressing and censoring speech, if we stop right here in the conversation, there's nothing wrong with it. Facebook may find its business model starts to fail as users who want to be exposed to that content find they are not. They will become unhappy and they will leave, but that's Facebook's economic problem, not a political problem. Even if Facebook is doing all that, it is a private actor and it can't possibly threaten democracy. Private actors are allowed to do whatever they please.

John Samples: For the libertarian, when you sign onto Facebook, there's an agreement, a bargain: you get access to Facebook and its services, but you agree to abide by the rules. There is a consent element right there. That's the end of the story. And certainly Facebook has a right under Section 230. The real question is: is Facebook so ubiquitous that it's actually "the public square," or is it a private business?

Bob Zadek: Facebook is allowed to exercise some editorial discretion, some moderation of content that is harassing or too sexually explicit. There are all those community standards, without crossing the line into becoming a publisher. And then there are catch-all phrases in 230, which give Facebook great latitude to moderate content under the protective umbrella of "otherwise objectionable." So there is suspicion that Facebook is editing content so that if I am interested in getting conservative points of view, I may be denied them in a way I am not even aware of.

What is Facebook’s Oversight Board?

Bob Zadek: So now there is the fearful threat of government regulation that gets social media's attention. This is what causes the hearings and the formation of the Oversight Board, which is probably a way for Facebook to respond to these issues itself. John, do you agree that the Oversight Board was created to preempt the need for government regulation? What does the Oversight Board do, and what is its mission?

John Samples: I think you have to understand that a social media site like Facebook cannot be a publisher. It just can't be. It's true it does some editorial things, but it's not like here at Cato, where we go in and edit content before publishing it. With Facebook, you can simply post things without permission. Social media would lose its character of letting you come on board and just start saying what you want if Facebook had to decide in advance what was right or wrong.

You do have content moderation after the fact, after people have come there, so how do you make it legitimate? There are going to be people upset about what is taken down, even though they agreed to the rules when they joined the site. That's the problem. A lot of politics gets involved, and the company becomes distrusted for various reasons, which I think have to do with politics in a lot of ways. To make this work you have to have some kind of institution, separate from Facebook, that people can believe is making fair judgments about what stays up and what comes down online.

Facebook set up a trust and put money toward the Oversight Board. It has to lay down $130 million for the first six years, money that can't be taken back. The people on the board cannot have their salaries reduced or be removed for the judgments they make. So Facebook went through this process to create an independent space for the board. There are several reasons for this. The company wants people to believe in the product. There has to be content moderation, but it doesn't have to be unfair or against the values that Facebook professes. That's the point, but ultimately it is so that people can feel this is a good product they want to use.

There's ultimately a reconciliation between the long-run business mission of Facebook and setting up this Oversight Board. That doesn't mean that the Oversight Board is going to be worrying about Facebook's quarterly returns or getting involved in business decisions. It just tries to give people the feeling that they can trust Facebook. They can trust the decisions being made, because people whose posts are taken down have a right to appeal, to go to this Oversight Board and say they don't think they violated Facebook's rules. The board can make Facebook reverse that action, and Facebook then has to comply. It is an innovation that is trying to give the users trust and to gain legitimacy. Taking down expression is a big deal in this country, which is historically very committed to free speech.

Bob Zadek: Facebook wants the public to believe that there is a system in place so that even if the employees of Facebook bring a bias to their jobs, Facebook has set up a process by which somebody who feels their content has been denied access to the customers can appeal outside of Facebook. The suspicion is that most of Silicon Valley is left-wing. That is bad for business. The employees have no say in this process. You are structurally independent.

An Attempt at “Universal” Representation

Bob Zadek: For Americans, whom would you feel more protected by: John and his colleagues, or Josh Hawley and his colleagues? The answer is pretty clear to me. So that is the Oversight Board. Facebook is a worldwide enterprise. As you pointed out, only about 10% of its users are Americans. To apply standards of fairness to somebody in Kenya, as well as to somebody in Indonesia and to somebody living in some red state in America, is a pretty challenging task. So tell us, in a few words, how is the Oversight Board composed?

John Samples: There are 20 people who have been selected by Facebook. Another 20 are probably coming in the next year or so; they will be selected by the current board members, so it's a self-replicating board. There's also a staff for the board that was hired earlier in the year, in consultation with board members. I should say that there are four "first among equals" on the board, the so-called co-chairs, who include two Americans: Michael McConnell at Stanford Law and Jamal Greene at Columbia. So we've got a couple of law professors from the United States heading it. Then you've got a former Danish prime minister as another co-chair. And finally, a law professor from Colombia who has been very active in international organizations protecting free speech.

Then you've got another 16 people from various parts of the world. When everyone meets online, it is a very difficult undertaking because people live everywhere; six in the morning here is eleven at night in Australia. They tried to run the gamut, because there are 3 billion people on Facebook. The other interesting thing is that this is an attempt at a universal, single set of rules. It's a pretty extraordinary experiment.

Bob Zadek: In the introduction to this morning's show, I said you have the largest constituency in America behind Donald Trump and ahead of Governor Newsom. You are a free speech advocate, highly informed in matters of political speech. Facebook tries to have a balanced Oversight Board representing all points of view. So you represent us, the listeners of this podcast and of this broadcast, on the assumption that there might very well be 60 million like-minded people out there. Thus, I said at the outset that you are my representative. You have a very august responsibility, John. You had better be on top of your game every day and not let anything slip by, because 60 million of my buds out there are counting on you.

To what extent should Americans be concerned that they could be harmed by Facebook and others exercising content moderation and denying them access to certain conservative speech? Is it a meaningful fear, or is it all overblown?

John Samples: I'm very skeptical that it has a great effect. Political scientists have found how very difficult it is to change public opinion and move people. The Russian effort was small and had little effect; it would be very hard to do. Actually, advertising in the general election, even if there were no Facebook, might not have a great effect on elections. So that's advertising and speech in general. As far as the bias issue, I am the libertarian voice there. If you don't like it at Twitter, for example, you can set up your own. At some point, if these companies pursue strong content moderation in one direction, they are going to lose people. I will be impressed by the argument when there really are no other alternatives.

To the extent you're unhappy with it, instead of turning to the government, there is venture capital out there. There's money to be made in building alternatives. My concern there is not great. It's not even clear that Facebook will be dominant in a few years.

Bob Zadek: I think you're a hundred percent right. I actually hope that Facebook succeeds in its experiment with the Oversight Board, because it keeps the government away. I firmly believe that Facebook will be subject to lots of competition as soon as the suspicion exists that it is managing the information that people receive. I believe the free market will overcome any alleged Silicon Valley bias.

Originally published at http://www.bobzadek.com on July 10, 2020.
