A Victory in the War for Freedom of Speech | Jason Fyk
Jason Fyk had 38 million followers on Facebook, until they killed his page. And the reasons had nothing to do with content. He has since founded SocialMediaFreedom.org and is challenging the section of American law which is allowing content hosts…
SUMMARY KEYWORDS content, publisher, means, jason, section, communications decency act, called, case, court, unlawful, fourth circuit court, apply, statute, government, facebook, speech, pages, people, law, bill

SPEAKERS Will Dove, Jason Fyk

Will Dove 00:13
Many of us are rightly very concerned with Bill C-11, the censorship bill which will soon see its third and final reading in the Senate before being passed into law. We recognize that whatever they may claim this bill is for, the reality is that its true purpose is to stifle freedom of speech online in Canada. But Canada is hardly alone in this. I have with me today Jason Fyk, an American who at one point had 38 million followers on Facebook. He lost them all overnight when Facebook took down his page, and the reasons had nothing to do with the content. Since then, Jason has founded socialmediafreedom.org, an organization which is fighting censorship in the US, specifically in regards to Section 230 of the Communications Decency Act, which was passed in 1996. However, Section 230 is being used just as our own pending Bill C-11 will be: as a tool to censor online information. Jason, it's a pleasure to have you on the show.

Jason Fyk 01:11
Thank you, Will. Thank you for having me.

Will Dove 01:13
I'd like to start with you giving some background, because of course our Canadian viewers aren't familiar with your story. 38 million followers on Facebook, you were making something like $300,000 a month, and they just killed it. But it wasn't because of the content.

Jason Fyk 01:27
No, no. Content is generally not actually the issue; they just use that as an excuse to hide behind Section 230 of the Communications Decency Act. In fact, what happened was, I got in very early, like 2010, and I got very aggressive in growing pages. By 2012, I had somewhere around 17 million fans, and 2013 was really the breakout year where we started making a lot of money.
In fact, we were just marketing other products, pretty much the same exact way that Facebook, Google, and Twitter were making their money: they take advertising dollars to show you content, right? Well, in 2014, right around January, their business model kicked in. What most people don't realize about these business models is that they're unique: they're not banner ads or side ads; they show you content that they're paid to show you. They show it in the newsfeed, which of course creates a rift between the site and the users, because they're competing for the same space. So what happens is they hide behind Section 230, claim your content is somehow improper or offensive or whatever, and they take you down. That's exactly what happened to me in 2014. Except they didn't delete it. There are a couple of different ways that content gets restricted: one is unpublishing and the other is deletion. Unpublishing means a page just isn't seen; deletion means it's gone. What I had was something else: the reach, the ability for the content to connect with other people, got diminished to basically nothing. Like you said, we were doing a little over $300,000 a month in late 2013, and the following month we made $6. It absolutely annihilated my business. And that was because they were competing with users like me, so they needed to diminish guys like me. Well, fast forward: I had no real way of arguing my content was proper or improper, because they just got rid of it. They didn't say why; they didn't do anything. But by 2016, we had built back up to, you know, 25 to 30 million fans. It was crazy. And although the reach didn't connect to people, we did have the numbers, so to speak.
Well, in October 2016, they shut down six of my pages overnight, without any notice, any warning, or any reason. They just shut them down and unpublished them, which cumulatively was about 14 million fans, give or take. And I was basically stuck. My other pages virtually ceased working; there was no reach on them whatsoever. And the simple reason is that they didn't need me; there was no benefit to keeping me there. So fast forward a few more months. I had worked with this other company that got fairly big; they were doing a lot more business with Facebook, and they had spent millions, like $22 million, in advertising. Of course they had Facebook reps who worked with them directly, who came to their office. We were still friends, but competitors. And I said, hey, would you mind seeing if Facebook would reinstate my pages, you know, finding any way to get back in the game? So they said sure, and they went to Facebook and asked, and Facebook said, no, we're not going to do that for Jason Fyk. But if you own it, we'll do it for you. What this meant was that all they cared about was who owned it, and what value that person was to them. Sound familiar? So what ended up happening is I sold it at a fire sale, for a nominal sum compared to what I could have made from those pages had they not infringed upon them. And sure enough, three days later, as soon as the contracts were inked, my content was no longer improper; it was all turned back on again. People ask me, well, what did you post? What was it? It was funny memes, stuff like that. But it doesn't matter. Whatever it was when it was shut down was the same content when it was turned back on. The only difference is it wasn't in my possession anymore. It was in the possession of another company.
That tells you everything you need to know about content right there.

Will Dove 05:35
Yes. I think we should take a moment here to explain for our Canadian viewers what Section 230 and the Communications Decency Act are. I'm just going to read a couple of lines directly from your website, so that we can fill people in: "In 1996, Congress passed the Communications Decency Act to help rid the internet of filth and protect our children from harmful content." Sounds good so far. "In Section 230, Congress sought to protect interactive computer service providers," basically hosts, people like Facebook, "when acting as a 'Good Samaritan' voluntarily restricting offensive materials online." In short, what this is saying, and please correct me if I'm wrong, Jason, is that they legislated for big tech the right to kill any content without liability, because they said it's offensive or it's dangerous.

Jason Fyk 06:28
Okay, so that's a complex answer, because it's not a simple yes or no. Let me explain it like this. The title says exactly that: protection for "Good Samaritan" blocking and screening of offensive materials. The intent was to remove offensive materials to protect children. That was its original intent. But they have to act as a Good Samaritan, and the courts over the past 26 years have basically ignored what is formally known as an intelligible principle, also known as a general provision, which is that they have to be a Good Samaritan. Of course, they're not being Good Samaritans, so that motivation is not being followed. But was it written to restrict all material? It was not. So there's a complexity here. There are two aspects of Section 230: (c)(1) and (c)(2), just like an outline, right? Under (c)(1), they say that no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.
Now, the problem is actually in 230(c)(1). Most people think it's 230(c)(2) that's overly broad, because it says they can remove objectionable content. The reason the problem is in (c)(1) is that the original court that read Section 230 read it wrong. It's a very subtle mistake. Where it says that they shall not be treated as the publisher, "publisher" is actually preceded by the word "the," right? "The" is important here because "the" is what's called a definite article: it defines who the publisher is. And it generally means that we're referring to somebody already known in the story, which is another information content provider. Meaning, under its correct interpretation, textually, it means that they cannot be treated as "the publisher," someone else, that is, another information content provider. Pretty simple, right? But they misread that, and they applied it synonymously with "you can't treat them as a publisher," period. Think about the change: it went from "you can't be treated as the original publisher, who is someone else" to "you just can't be treated as a publisher in general, even when you are one." So that became, effectively, sovereign immunity: all publishing and distributor liability disappeared entirely. We know that can't make any sense. It doesn't fit. There's what they call the harmonious canon, right: when you're building a law, it can't create a nonsensical situation, and it can't be redundant. Well, if you can't be treated as a publisher, wouldn't removing objectionable content be publishing? It becomes redundant; it's useless. So we knew that there was something wrong with it. But the thing is, everybody is trying to find all sorts of ways to explain it. They've tried to reconcile this thing and come up with some bizarre theories. We simply took it and did something called looking at a statute de novo, right?
That's Latin for "anew": we looked at it from the beginning again. Forget the precedent; let's look at the whole thing again. We did that, and we came up with a whole different idea of the way this thing works. It's real simple. The first part, 230(c)(1), essentially means you can't be treated as someone else for somebody else's publishing actions and their content. Done. It's distributor liability protection, that's it. The second part applies when they actually are a publisher. It gives them certain duties from Congress, and it says that they can remove lewd, lascivious, excessively violent, harassing, or otherwise objectionable content. Now, there's ambiguity there too, because "otherwise objectionable" can be objective or subjective. It can mean "otherwise objectionable: whatever they choose is objectionable." Or there's a canon that says it should apply to what they previously stated, which was lewd, lascivious, excessively violent, harassing. Meaning it was supposed to mean illegal, unlawful stuff. But that drifted into something completely different. So this thing was a mess. And so far, Will, we'd had no success. Not at all. I've been dismissed by the California courts seven times. But there's been an interesting turn of events recently.

Will Dove
Jason, just before we move on to that, I want to make sure that I understand, for the benefit of our Canadian viewers, if I'm getting this correct. Section one is essentially saying that a company such as Facebook is not the publisher, and therefore is not responsible for the content that appears on Facebook. But then with section two, they're contradicting themselves by giving them the rights of a publisher to remove content.

Jason Fyk
Correct. It's what's called an irreconcilable statutory conflict. Internally, (c)(2) conflicts with (c)(1), because as soon as content is considered in any fashion, shape, or form, they become a publisher, right? Because that's what a publisher does.
So if they become a publisher, they are a content provider; if they knowingly allowed something, then (c)(1) wouldn't apply. But by proxy, if you consider what to restrict, you're also considering what to allow. It creates what we call an irreconcilable loop, this big circular thing where it's just a mess. And, you know, we challenged that.

Will Dove 12:00
Well, I've got to move on then to that point where I interrupted you, where you said you had your case thrown out of multiple courts. I'm not a lawyer. I'm not an American. I've sat here for five minutes listening to you explain this, and I can see the problem. You've got a contradiction in the section of the Act. You can't have it both ways, guys. Either you're the publisher and you're responsible for what shows up on your site, or you're not the publisher and you have no authority to remove it. Correct?

Jason Fyk 12:28
Exactly. What it means is that we have to isolate the sections to what they mean, which is that 230(c)(1) can only apply when they unknowingly host content. All of the actions, all of the conduct, all of the content, everything to do with publishing is excluded; the only publishing it covers is what the service itself does, meaning the functionality. Now, I want to read this quote real quick, because this is the interesting turn of events. The Fourth Circuit Court was the court that decided a case called Zeran v. America Online, the first case ever to consider Section 230. And unfortunately, they applied 230(c)(1) as "a publisher," not "the publisher," and it became overly broad protection. What's interesting is the Fourth Circuit Court just did what the California courts didn't do in my case, but should have done, and what we did: they considered the statute de novo, completely anew. I want to read you a couple of quotes.
These are just the most important quotes that have come out of this Fourth Circuit decision on November 3. What they have done here is clarify the original precedent that was overly broad and, frankly, wrong. And in doing so, they've changed the internet. Nobody knows it yet. I mean, this is big stuff, right? So they said, "Zeran's list of protected functions must be read in context. And that context cabins that list to merely editorial functions; it cannot be stretched to include actions that go beyond formatting and procedural alterations." In other words, passive hosting functions. Anything that changes the substance of the content is an active content moderation decision. In other words, 230(c)(1) only applies to the service function, the hosting. That's it. They clarified that, and they went on to clarify a few other things that were pretty important here: "An interactive computer service provider becomes an information content provider whenever their actions..." Remember, "actions" is a key word here, because if it's passive, they're not involved; they're not responsible, obviously, right? "...whenever their actions cross the line into substantively altering the content at issue in ways that make it unlawful." Meaning if they do anything that's unlawful, they're on the hook. That's what we've been saying all along. Their action was removing my content, forcing it to be sold to a new owner, and then reinstating that content for that other owner. That makes it unlawful. What was it? It was unfair competition, extortion, fraud, a whole bunch of things. They went on to say that Zeran prevents suits under 230(c)(1) that cast "the defendant in the same position as the party who originally posted the offensive messages." That's what I said: they can't be treated as the publisher, because the publisher is someone else.
So it's if they're cast as "the publisher," not as "a publisher": in the same position as what that original publisher did, not what they themselves did. The courts got it textually wrong; they misinterpreted it. I'll give you one last one: "...thus for 230(c)(1) protection to apply, we require that liability attach to the defendant on account of some improper content within their publication." It means that 230(c)(1) only applies when you're casting them in the same light as the party who posted the improper content. Let me ask you: we started out this conversation with me telling you that they took down my content because it was supposedly improper. But they put it back up again and made it proper again. So what about my case has anything to do with improper content, when the content is literally irrelevant? See the problem?

Will Dove 16:29
Yes. And of course, the obvious answer is it has nothing to do with content. It has everything to do with the profits of big tech.

Jason Fyk 16:38
Will, a lot of it has to do with the courts just not doing their job. The predominant weight of this problem actually falls on the courts not fixing it over two and a half decades. To the Fourth Circuit Court I'd say: well done. Finally they did the job. They went in, they looked at it correctly, they looked at it de novo, and they got it right. Now, what's interesting is I'm just about to head to the Supreme Court with what's called a circuit court conflict, and the only court that can resolve a circuit court conflict is the United States Supreme Court. We've got them caught.

Will Dove 17:12
Right. Because they passed this law with contradictory sections, and now, as of November 3, I think, just two weeks ago, this decision was made. Very, very recent.
So we've had 26 years of this contradictory law enabling organizations like Facebook to take down people's content and be indemnified against doing that, against violating freedom of speech, and in your case, yes, fraud, coercion, whatever, a long list of wrongs. And now you've got this court that has said, well, wait a minute, no, these two sections are in contradiction to each other. Correct? And if the Supreme Court upholds that, what happens then?

Jason Fyk 18:05
Well, that fundamentally changes the world, and that is not hyperbole. If Section 230 were applied correctly... and these are the challenges we're making: 230(c)(1) is what's called an "as applied" problem, meaning it was applied incorrectly by the courts. That's what created 95% of the problem. Most people don't know that, but the vast majority of cases are actually dismissed under (c)(1) as absolute immunity when they really involve the content moderation covered by (c)(2). That's the way it's sitting currently. If we sort that out, and the Supreme Court respects what the Fourth Circuit just said, and what we have said for four-plus years, (c)(1) would become immunity only when they're not involved in any content moderation. Meaning, say Joe Blow puts something on the internet and they're unaware of it: they shouldn't be held accountable for it. If they took that protection away, it would be catastrophic for the internet, right? You can't be responsible for everything. But as soon as they consider the content (the word "consider" is in (c)(2)), (c)(1) disappears, because now they become publishers. They become a content moderator. And that's fine. The law said, we can give you that job, but you're supposed to do that job in good faith. Now you have a measure of good faith at issue, right?
So the second portion will then apply to content moderation cases, which I think will help dramatically. Now, as a separate aside, there's the constitutional challenge that we have going on, which a lot of people are unaware of: we filed a case against the United States. That case brings into question the entire constitutionality of the statute, because in truth, the government is not allowed to restrict lawful speech. They're just not, right? It's not a power that they can exercise. The only time they can restrict speech is if it is recognized as unlawful, like child pornography. There are certain categories that are limited, and we know they should be limited, like a threat upon somebody's life.

Will Dove 20:11
So I think you and I came up with a very good definition before the interview when we were talking about this. Jason and I have agreed, and I think most of you would agree, that if content portrays breaking a law, or encourages the breaking of a law, it should not be allowed. But beyond that, freedom of speech should apply. So once again, if I'm understanding what you're saying properly, Jason: if these laws were changed to make sense, to not contradict each other, and somebody posted, say, a snuff film on Facebook, and Facebook could provide reasonable evidence in court that they didn't know it was there, then they can't be held responsible.

Jason Fyk 20:52
Bingo. Exactly. That's the purpose of it: if they didn't know it was there. But, for example, there was a case, Doe versus, I think it was Twitter. Yeah, it was Twitter. There was child pornography; somebody on the internet had saved it and put it on Twitter. And the parents told Twitter to take it down. They didn't take it down. The parents went to the police; the police told Twitter to take it down; they didn't take it down. And then they went to Homeland Security, and finally Twitter took it down.
In that period of time, 143,000 views occurred. That is gross negligence, contributory negligence, because they considered the content they were providing. As soon as they considered it and allowed it, they made a substantive contribution to it. They said, "we are allowing it," and in doing so they were knowingly distributing illegal content. And of course, my opponents will say, well, they can do whatever they want. No, they can't. Matter of fact, there's another conflict, which Justice Clarence Thomas actually brought forward. If Zeran v. America Online were correct that Section 230 eliminates all distributor liability, so they can allow or remove anything, why would Congress impose distributor liability in the same exact statute? It would both impose and eliminate distributor liability, because Section 502 of the Communications Decency Act, the same statute, says that it is unlawful to knowingly distribute child pornography, even if it's third-party provided. Well, that makes no sense. So we're trying to get it set straight. And then we still have to look at 230(c)(2), where the last words are very telling: it says even if the material is constitutionally protected. The government gave them the ability to remove lawful content. That's not a power that the government has. Our government can't do that. And they'll say, well, they're a private entity. Well, no, actually, they're a corporate entity, a little bit different, right? Corporations are regularly held open to the public. There's a case called Pruneyard, about a shopping mall. Corporate entities like that are subject to the Bill of Rights. They're not private entities; they're privately owned, but they're still corporate.
In that circumstance, the case says they have to respect the Bill of Rights. But here's the kicker that nobody is aware of. Everybody keeps using the term "state actor." I'm sure you've heard of it, right? Are they state actors? Well, the statute asks them to block and screen offensive materials, right? Is that a job? It's an obligation; it says, hey, we want you to do this, and in return, we'll give you protection. So they've got an obligation with compensation in the event that they do the job. Well, that's what you give to a commission, right? All of the commissions have the same kind of thing: they have an intelligible principle, like "Good Samaritan," and they're asked to do a job. And that commission makes them government agents, because they have agency power. And when that agency, private or public, it doesn't matter, takes any action, it still has to follow the Bill of Rights. So we've also challenged it that way and said, no, we don't care whether you're a state actor acting at the direction of an outside source like the FBI, which of course we now know is happening. We're saying it's right in the statute: if you're seeking protection for doing the job, you have to prove that you acted within what the government asked you to do. Right? That's an agent.

Will Dove 24:34
And we end up with the same problem here in Canada with Bill C-11, which is purportedly to give the CRTC authority over Canadian content on YouTube. But if that goes through, at that point in time YouTube becomes an agent of the Canadian government.

Jason Fyk 24:48
It does, absolutely. You're exactly right. What happens is it transitions them into an agent of government, and they're acting as an agent of government. Now, I'm obviously not as familiar with Canadian laws and how they do that.
But effectively, that's what they're trying to do here too: legislatures, a lot of them in the United States, are trying to find a way to reconfigure Section 230 so that it becomes an even more powerful speech weapon. I mean, it's bad enough now, but they're laundering that sort of action through a private entity, corporation, whatever you want to call it. But the fact is, luckily, we have a constitution, and the Constitution says that we are the power, not the government. And matter of fact, that's actually where some of my cases are turning: we're now turning to common law and saying to the judges, you lack jurisdiction over me; you're violating my constitutional rights here, because I'm being denied my Fifth Amendment right; I'm entitled to a hearing to redress my grievances, and all I've gotten so far is thrown out of court seven times. That's ridiculous. What they did, effectively, is an illegal taking. They took my property, they took my liberty, and the government wrote a law that insulates that taking, meaning the government's law is unconstitutional. We're calling it out. And like I said, I have had zero successes, and everybody, of course, throws precedent in my face, which of course is wrong. I've hit a wall everywhere I go; nobody wants to listen. But now the Fourth Circuit Court has confirmed, after four years, against all odds, with everybody telling me I'm wrong: we're right. And this thing is going to come to blows, I think, in the Supreme Court in the coming three months.

Will Dove 26:27
Wow. Yes. Well, I wish you the best of luck with that, because of course we're fighting the same war.
And it should be obvious to people who are listening to this that if this Bill C-11 goes through, and YouTube and other content providers under the same umbrella become agents of the Canadian government, well, now the Canadian government can control the content if they don't want people like me. And we're not even on YouTube, because just about anything I put up there gets taken down. But it's going to go beyond that, because they don't want to stop at YouTube. They want to control all social media. So that means channels where we've been able to publish freely, like, say, Rumble, will suddenly be agents of the Canadian government, and the government can say, you take that down; we don't want that content distributed.

Jason Fyk 27:14
Correct. Well, look what happened here with the truckers. First they try to take your speech; then, Will, they tried to take their bank accounts. I mean, when does it stop? First it's speech, then it's your life. That's how governments have functioned throughout time, and that's why they collapse: because they do not respect that the speech they don't want to hear is the speech you protect most. That's why it's protected.

Will Dove 27:45
And another thing that Jason and I agreed on prior to the interview is that freedom of speech is the foundation of all of our freedoms, whether you're Canadian or American. Once they take away your freedom of speech, your other freedoms won't be far behind. So this absolutely must be the hill we die on. Freedom of speech: we must protect it. Jason, I wish you the very best with the Supreme Court case. Please stay in touch and let me know what happens, because one way or the other, I'd like to have you back to discuss this and the ramifications of it.

Jason Fyk 28:19
Absolutely.
I would also welcome anybody who wants to understand more about this: we have a lot of information hosted on our site, socialmediafreedom.org. There's specifically a Human Events op-ed that I wrote that really explains what just changed. That was before I found out that I was actually endorsed by the Fourth Circuit, and I will most likely have an op-ed coming out with The Federalist; they're looking at publishing it very soon. But I would also say this is an extremely expensive fight. I've borne most of it myself, and if you have the means to contribute, this is a real fight, and it is for you. You can donate to our cause on the site; it keeps me fighting so that I can continue on, to maintain freedom online.

Will Dove 29:11
Yeah, and as always, folks, the link to socialmediafreedom.org will be directly beneath this interview on our website. I do encourage you to visit it; there's a great deal of very good information there. Once again, Jason, thank you for your time.

Jason Fyk
Thank you for having me, Will.