The Eleventh Circuit Opines that Much of Florida’s New Regulation of Social Media May Violate the First Amendment, in Contrast to Recent Orders of the Fifth Circuit to the Contrary Now Awaiting Emergency Review in the Supreme Court


NetChoice, LLC and Computer & Communications Industry Association, d/b/a CCIA v. Attorney General of the State of Florida, et al., No. 21-12355 (11th Cir.) Order and Opinion issued May 23, 2022, affirming in part and vacating in part an injunction issued by the United States District Court for the Northern District of Florida.


Several states, including Florida and Texas, have enacted legislation aimed at compelling social media platforms to be open to all, without banning, de-prioritizing, or de-platforming entities or posts because they present disfavored views. Texas’s law applies to the general practices of large social media sites, while Florida has addressed access by political candidates and journalists.

When the U.S. Court of Appeals for the Fifth Circuit refused to enjoin the effectiveness of Texas’s statute, NetChoice and CCIA presented an emergency petition to the U.S. Supreme Court on May 13, 2022. Just as briefing closed on the emergency petition, the Eleventh Circuit issued its opinion, which has been added to the record of the emergency petition as supplemental authority.

Unlike the Fifth Circuit, which held its legal cards close to the vest, so to speak, issuing a non-unanimous order without opinion, the Eleventh Circuit has published a 67-page opinion examining whether NetChoice and CCIA are likely to succeed in demonstrating that Florida’s law is unconstitutional. Concluding that the law is likely to be shown unconstitutional, observing that ongoing infringements of First Amendment rights are presumed to cause irreparable harm, and noting that neither the state nor the public has any interest in enforcing an unconstitutional law, the Eleventh Circuit has upheld most, but not all, of the injunctive relief granted by the Northern District of Florida.

Principles Endure. The Eleventh Circuit opened its opinion by noting that new principles are not necessarily needed when new technologies emerge. The First Amendment continues to prohibit government interference in speech while protecting the speech of private actors.

‘Not Really Private’ Private Entities. Florida asserts that social media platforms are not truly private entities and has enacted legislation prohibiting de-platforming political candidates, de-prioritizing messages about political candidates, or removing content provided by “journalistic enterprises” because of its content.

The Eleventh Circuit Disagrees. The appellate court has found that social media entities are private actors that enjoy First Amendment protections. Editorial judgments about content are protected. That protection would be unconstitutionally burdened by Florida’s legislation, not only in its editorial and content-based directives but also in its demands for disclosure of a rationale supporting any and all content moderation decisions. These observations support enjoining aspects of the Florida law.

How It Works. The Eleventh Circuit has offered a ‘primer’ on what social media platforms are: collectors of others’ speech, broadly defined to include text, photography, and video “posts” published to others. Platforms may have billions of users or exist as smaller sites for specialized interests. Several social media platforms are household names: Facebook, Twitter, and YouTube.

Private Enterprises, Private Choices. No one is obliged to avail themselves of the content social media entities provide. The government cannot restrict citizens’ access to social media platforms but that right of access attaching to citizens does not include a right to compel the platforms to accept or consume any content.

Whose Speech Is It? Much, if not most, speech on social media platforms is not created by the platforms themselves, but some speech belongs to the platforms, as is the case with publishing terms of service or community standards defining what is permitted, creating addenda or warnings, or publishing a platform’s own content.

Neither Conduits nor Storage Devices, but Curators. Social media enterprises are best seen as curators and arrangers of content according to users’ wishes, while at the same time removing content that violates the terms of service or community standards.

These activities make the platforms active intermediaries who have created virtual spaces where participants can be both speakers and listeners.

The Eleventh Circuit views content moderation as curation that promotes the creation and development of niches and communities, and promotes values and points of view.

Why Florida Sought Legislative Intervention. Florida’s social media legislation was intended to address perceived silencing of conservative views by technology ‘oligarchs’.

Florida perceives social media platforms to be akin to public utilities which, as common carriers, are to remain accessible to all persons and all viewpoints.

Sweeping and Problematic. The Eleventh Circuit notes that Florida’s law, while aimed at “big tech oligarchs,” as defined by size and revenue, does sweep in smaller sites, such as Wikipedia and Etsy. An initial specific exclusion of Disney Corporation was repealed.

Three features of the Florida legislation are problematic, in the appellate court’s view: content moderation, disclosure obligations, and user data retention.

Strict in Theory, Fatal in Fact. The Eleventh Circuit perceives that Florida’s legislation regulates speech within the meaning of the First Amendment, and its content moderation provisions are subject to strict scrutiny, making it unlikely the legislation will survive.

Pre-Emption Awaits Another Day. As the court based its analysis on the First Amendment, it is not necessary to consider the issue of federal preemption of the Florida law by 47 U.S.C. Section 230.

Gutting Editorial Discretion. Denying social media platforms the ability to prohibit some posts, as the Florida law does, impairs the very exercise of editorial discretion that the First Amendment protects, the Eleventh Circuit observes.

Not an Indiscriminate Host. The notion that, by opening a social media space to some speakers — essentially serving as their host — a social media enterprise must open that space to all, as certain historic decisions might suggest, failed to persuade the Eleventh Circuit with respect to the Florida legislation.

Social Media’s Own Speech. If the issue of mandating open doors and open access were not enough to impair the social media companies’ editorial discretion, and by extension, their First Amendment rights, the Florida law, in the court’s view, impedes the platforms’ capacity to exercise their own speech rights.

Common Carrier Analogy Fails. Seeking to minimize the impact of First Amendment review, the state relied heavily on the notion that social media platforms are common carriers indispensable to society. The Eleventh Circuit rejected that idea, notwithstanding its uncertainty whether the state asserts that common carrier status has already been attained or that the state would legislate that status into existence.

Social media platforms do not behave as common carriers available to all to transmit communications of their own choosing, the Eleventh Circuit observes. Social media platforms may appear to be open to all but in fact users must accept the platforms’ terms and community standards. Moreover, Supreme Court opinions have not considered cable operators to be common carriers, and the Court has declined to place online media on the same footing as broadcast media for supervisory and regulatory purposes.

The Eleventh Circuit sees online platforms as analogous to cable providers that retain editorial discretion over their offerings.

Finally, Congress has specifically distinguished and exempted internet services from other communications media in the Telecommunications Act of 1996, and within the same legislation has protected social media from liability for publication in ways not extended to common carriers that must serve all, the Eleventh Circuit reasoned.

What Part of “Constitutional Guarantees” Did Florida Not Understand? If the social media platforms are not already common carriers, which the appellate court finds they are not, the state possesses no power to legislate the platforms’ First Amendment rights out of existence by nomenclature. Even if the social media platforms’ vast market powers suggest that they ought to be treated as common carriers, this would not carry the day. Legislation cannot create in social media the fundamental characteristic inherent in and required of common carriers: to hold themselves out to the entirety of the public, without exception. While some entities may come to be a means of rendering services of public interest, marketplace success in itself will not compel forfeiture of First Amendment rights.

The exercise of expressive editorial judgment by the social media platforms means that those platforms are not common carriers. Any imposition of limits on their First Amendment rights must survive strict scrutiny, which, with some exceptions, is not the case with Florida’s law.

The Nature of the Violations. Florida’s law would restrict editorial judgment by forbidding de-platforming political candidates, manipulating the presentation of content by or about candidates, and censoring or manipulating journalistic enterprises. Legislatively requiring consistency in decision-making and imposing time limits on restrictions present similar, if less obvious, impositions on social media platforms.

Permitting users to opt out of the platforms’ curation would interfere with the editorial processes and discretion exercised by the platforms as to those users.

Compelled disclosures of platform activities inherently burden editorial judgment, but such commercial disclosures are subject to lesser scrutiny.

The Eleventh Circuit finds no First Amendment issues arise with respect to requiring platforms to permit users to access their stored records for at least sixty days after de-platforming.

Gimlet Eye or Casual Glance: Standards of Review. Content based speech regulations must survive strict scrutiny. While the state has admitted that the aim of its legislation is to address perceived mistreatment of conservatives and conservative views, this does not persuade the Eleventh Circuit to adopt the technology associations’ argument that this causes the entirety of the legislation to fail.

The state’s motivation in enacting legislation is not outcome determinative in review of an otherwise facially constitutional law. Moreover, the applicability of the law to some social media platforms and not others, while of concern, is insufficient to condemn the legislation in its entirety.

The Eleventh Circuit’s Reasoning. The appellate panel has concluded that NetChoice and CCIA may succeed on the merits of their content moderation claims. As some provisions refer specifically to the content of messages, those provisions trigger strict scrutiny, whereas the de-platforming and opt-out provisions are content-neutral.

The “consistency” demanded of the social media platforms partakes of both content-based and content-neutral regulation. Because at their core these provisions involve expressive activity, intermediate scrutiny is triggered, but even at that level, they are not likely to survive.

Disclosure of factual information in commercial settings need not meet even intermediate scrutiny, and may be reviewed on a rational relationship basis, making those regulations likely to survive.

The Eleventh Circuit has concluded that none of the content moderation measures would survive intermediate scrutiny and that the ‘explanatory’ disclosure requirements — why decisions were made — are likely unconstitutional. However, there is no likelihood of success on the merits as to the rest of the legislation.

When intermediate scrutiny is applied to the legislation’s content moderation restrictions, the court is asked to consider whether the content moderation restrictions are narrowly drawn, that is, no greater than is essential, to further a substantial government interest unrelated to speech suppression.

The content moderation restrictions do not, in the court’s view, further any substantial government interest, which does not seem to have been seriously argued by the state. (Slip op. at 53.)

While it might be that the state, had it pursued such arguments, would claim an interest in curtailing private censorship, or in fostering use of the internet, the government has no interest in “leveling the expressive playing field,” nor may it intervene where there is no right to a social media account.

The idea of restricting the speech of some to enlarge the voices of others is “wholly foreign to the First Amendment,” the Eleventh Circuit has concluded. (Slip op. at 59, quoting Buckley v. Valeo, 424 U.S. 1, 48-49 (1976).)

The assertion of a state interest in “promoting the widespread dissemination of information from a multiplicity of sources” would fail, as social media platforms do not act as gatekeepers exercising control over most or all information. (Slip op. at 49, quoting Turner Broadcasting System v. FCC, 512 U.S. 622, 662 (1994).) A wealth of communications resources exist and are available to speakers. Even if they are not of the magnitude of the social media platforms, this does not justify inhibiting the speech rights of private social media companies as the Florida law would do.

Moreover, the appellate court thinks it unlikely that the government has an interest in the platforms’ consistent application of rules, in prohibiting users from changing messages within certain time frames, in addressing sequencing of content, or in permitting or precluding participation in these processes.

Even if a substantial government interest were found, there is little likelihood that the preclusive restrictions and mandated activities are “no greater than is essential to the furtherance of interests.” (Slip op. at 61, citing United States v. O’Brien, 391 U.S. 367, 377 (1968).)

Prohibitions on “deplatforming, deprioritizing, or shadow-banning” would make it impossible to address obscenities or terrorist threats, and indeed raise the specter of minors’ access to pornography. (Slip op. at 62.) So wide a sweep turns the narrowness constraint applicable to speech regulations on its head, the court concludes.

Compelled disclosures. Disclosure requirements will survive constitutional scrutiny if, as commercial speech, they are related to protection of consumers, which is a recognized state interest, and are not unjustified or unduly burdensome, effectively chilling protected speech. (Slip op. at 63, citing Milavetz, Gallop & Milavetz v. United States, 559 U.S. 229, 250 (2010).)

An exception to the likely unconstitutional disclosure requirements is the requirement that consumers be informed of the terms of access to the platform and that content moderation policies not be misleading. The court observed that there has not been a sufficient showing that publication of standards, or provision of information about rule changes, views, and advertising, would be unduly burdensome.

The court has agreed with NetChoice that requiring detailed justification for and notice of each content moderation decision is likely unconstitutional even under commercial speech standards. The time constraints, compliance burdens, and prohibitive fines for insufficient “thoroughness” compound those burdens.

And in Conclusion. The remaining factors requiring review to substantiate injunctive relief are easily met, the Eleventh Circuit has determined. Ongoing First Amendment violations are presumptively irreparably harmful, and neither the state nor the public has any interest in enforcing an unconstitutional statute.

The district court’s order will be upheld in part and vacated in part, and the case remanded.

WHERE MATTERS STAND. JustLawful is not sage enough to know what the Supreme Court will do now that there is an apparent, if only partially articulated, conflict between two federal circuit courts of appeal. Others’ prognostications are welcomed.

In a Nutshell. Here is a link to the Eleventh Circuit’s synopsis of its parsing of the Florida statute.

Summary 11th Cir. Opinion

And in Full:

Here is the entire opinion.

NetChoice v. Florida No. 21-12355 (11th Cir.) Opinion May 23, 2022

 

Federal Officials Cannot Evade First Amendment Constraints on Speech Suppression Through Intimidation and Collusion with Internet Platforms, or Creation of an Unauthorized Disinformation Governance Board, State Attorneys General Assert in Suit Against an Array of Federal Officials


Missouri and Louisiana v. Biden, et al., No. 3:22-cv-01213-TAD-KDM (W.D. La.).  Complaint filed May 5, 2022.

The Missouri and Louisiana Attorneys General, claiming injury to state constitutional interests and to state citizens’ speech freedoms, have filed a complaint against President Biden and multiple executive officials and federal agency heads, asserting that the Biden administration has colluded with technology platforms such as Facebook, YouTube, and Twitter in order to suppress and censor information unfavorable to federal government aims.  The recent creation of a bureaucratic governing board to manage removal of disfavored speech only advances these unconstitutional practices, the state plaintiffs say.

Plaintiffs seek declaratory relief declaring the administration’s actions violate the First Amendment as well as injunctive relief forbidding further unconstitutional activity.

The First Amendment serves as the cornerstone of the free exchange of ideas and information, without which competent self-governance is impossible, the states say.  The federal government is constrained by the First Amendment from interfering with the guaranteed freedoms embodied in that amendment, including speech freedoms.  The government cannot escape its obligation to refrain from inhibiting speech by engaging private entities to censor speech.

Although the First Amendment does not ordinarily reach private actors, acts undertaken at the behest of or in collusion with the government may violate the First Amendment.  This is particularly so, the plaintiffs state, where the federal government has coerced private entities to cooperate with the government by means of threats of antitrust proceedings or revocation of immunities enjoyed under Section 230 of the Communications Decency Act of 1996.

Truncating the flow of information to suit federal officials’ aims impairs states in protecting the interests of state citizens, particularly where state constitutions may secure more expansive speech protections than the United States Constitution, plaintiffs claim.

The Complaint filed on May 5 in the United States District Court for the Western District of Louisiana details instances in which, either directly or in collusion with technology platforms, federal officials have acted to suppress speech, serving their own political ends to the injury and detriment of the public, frequently cloaking their actions as attempts to guard against undefined and opaque “disinformation.”

Threats of antitrust actions or threats of loss of immunities have ensured technology companies’ compliance with federal officials’ dictates.   The adoption of facially private governing documents and policies that in fact are employed to serve the government, and which may operate in collusion with the government, cannot be interposed to shield either private or public actors from liability for suppressing and chilling speech.

An atmosphere of intimidation pervades social media sites, plaintiffs observe. Undertaken in fear of or in collusion with federal officials, the private companies’ practices of banning, shadow banning, limiting publication, and outright removal of social media account holders create unconstitutional prior restraints, chilling participation lest a similar fate ensue.

The state plaintiffs’ Complaint provides a chronicle of activity asserted to constitute First Amendment violations. If true, the plaintiffs’ allegations paint a picture of a government intent on serving its own ends and not those of the public its officials were elected or appointed to serve.  Digital media fail to behave as an ‘electronic public square’ where those media represent an unparalleled “concentrated control” of speech.  Complaint, para. 53, citing Knight First Amendment Institute, 141 S. Ct. 1220, 1221 (2021).

Federal officials have conferred with private digital platforms to advise the platforms about content that ought to be flagged for removal, plaintiffs state.

Online platforms accomplish speech monitoring by means such as mechanical algorithms or outright speech suppression by permanent banishment of disfavored speakers, the plaintiffs offer, thereby denying the exiled any ability to communicate publicly.  Such measures not infrequently censor core political speech, to the detriment of political opponents and to the benefit of those directing the private companies’ actions.

Examples of digital platforms’ interference with First Amendment speech guarantees, undertaken to please or to appease federal officials, have included suppression, on the eve of the presidential election, of information about the President’s son’s laptop, said to contain damaging information.

Plaintiffs aver that open discussion of the origins of the Covid-19 virus was precluded where, by agreement with a social media platform, a federal official who had been engaged in funding gain-of-function research abroad provided messaging favoring a government narrative, which insulated the government and the official from review.

Relevant evidence that would permit public evaluation of the efficacy of face masks and government edicts demanding home confinement was also suppressed, plaintiffs submit.

The promotion of narratives favoring voting by mail, a methodology traditionally dismissed as inviting voter fraud, has also been alleged to involve social media.

Both the Executive and the Legislative branches have threatened technology companies directly and publicly, at times demanding removal of political opponents’ statements.

The recent creation of a board to govern “disinformation” is an Orwellian measure intended to withhold content from the public and to insulate the federal government from criticism, plaintiffs insist.   This has been done notwithstanding that there exists a constitutional guarantee of free speech, such guarantee not to be interfered with by curating and removing from public discourse that which disfavors the government.

Similarly dystopian, plaintiffs observe, is the view that speech is not speech but infrastructure, and thus susceptible of government regulation and oversight.  To this has been added the opinion that the public reacts emotionally and thoughtlessly to speech, and that speech is linked to violence, requiring online policing to protect the public.  One legislator has suggested that the public lacks the capacity to discern fact from fiction, a circumstance not to be addressed by providing more information, but instead, in the view of current federal officials, less information or none at all.

These activities, whether singularly or in combination, violate the First Amendment and severely damage public discourse, the plaintiffs say, causing sufficient danger to open discourse as to merit an injunction against further constitutional violations.

Public Figures, Private Law: Facebook Oversight Board Upholds Initial Removal of President’s Statements and Presence but Condemns Facebook’s Failure to Articulate Standards or Time Limits


Case No. 2021 -001 – FB – FBR.  Facebook Oversight Board, May 5, 2021.


Facebook is an online social media platform that welcomes all except those determined to have acted badly according to its internal standards, which are described generally in its Terms of Service, with which users promise compliance.   For the errant poster, Facebook may administer rebukes, suspend or terminate service, and remove content it deems unsuitable.

Facebook thus administers and enforces rules of its own making by its own employees.  In light of persistent concerns about this insularity, Facebook founder Mark Zuckerberg created a board of review, funded by Facebook but administered independently.  

This week the Facebook Oversight Board issued an opinion, unsigned by its constellation of prominent international figures, concluding that Facebook did not err in removing statements of then-President Donald J. Trump at the time of and concerning violence that erupted on January 6, 2021 in the nation’s Capitol following a rally of Trump supporters.

While Facebook was correct in the immediacy of its removal and ban in light of the circumstances at the time, in which the then-President’s words were perceived to have incited insurrection, the Facebook Oversight Board condemned Facebook’s failure to articulate the reasons and applicable standards supporting the removal, the ban, and the apparently eternal silencing of Facebook account holder Trump.

The Facebook Oversight Board sent the case back to Facebook for further proceedings. 

The decision is no small matter and some have deemed it a landmark of equal stature with Marbury v. Madison, 5 U.S. 137 (1803), the first enunciation by the United States Supreme Court of its reason for being and its power of judicial review.  

This proceeding can be seen as a foundational attempt to provide some structure for review of platform providers’ decisions.

This matters greatly (“bigly”, some might say) because internet service providers are almost entirely immune from suit for questionable decisions and at the same time the government of the United States cannot intervene to regulate online speech as it is constrained by the First Amendment to the Constitution of the United States.  

Section 230:  the good, the bad, and the sometimes ugly. When widespread public adoption of the internet was in its infancy, Congress sought to inhibit unprotected speech while protecting internet service providers from liability for statements not of their own creation posted on platforms.  Section 230 of the Communications Decency Act of 1996 preempts inconsistent state law and precludes suit against any platform provider who does not create content.  The platform is free to remove or to otherwise police its product without losing those immunities.

This would leave a user without recourse unless the platform’s actions could be challenged in court on contract grounds, which in limited measure can be done, or through internal review with the platform provider, as is the case in this week’s opinion.

The creation of an international body not necessarily bound by the laws of any one nation cannot be other than a major inflection point in modern law.  Prominent First Amendment authorities question whose law should govern such cases.  

It is far too soon to tell whether this new thing is a good thing, and much is lost in cheers and jeers attaching to personalities, whether that of the former President or of the founder and CEO of Facebook.  What is to the Facebook Oversight Board’s credit is that the reviewing body articulated not only the facts determined but also the standards embraced.  The virtue of its reliance on standards drawn from international human rights declarations, which remain aspirational domestically if not adopted by the United States, awaits further reflection.  

Links to the decision and to other materials are posted below. 

The Facebook Oversight Board opinion:  

2021 001 FB FBR Oversight Board Opinion

The Facebook Oversight Board announcement and overview of its opinion:

Oversight Board Upholds Trump Suspension While Finding Facebook Failed to Apply Proper Penalty

The composition of the Oversight Board:

Facebook Oversight Board

A primer on the creation of the Oversight Board and a reflection on this week’s opinion:

Lawfareblog: About the Facebook Oversight Board

Lawfareblog: It’s Not Over: Oversight Board Trump Decision is Just the Start

Reflections on jurisprudential questions prompted by the Facebook Oversight Board determination:

Volokh Conspiracy: Whose Rules Should Govern How Americans Speak with Other Americans Online

Responses to announcement of the decision and opinion in the mainstream media:

Facebook Oversight Board Tells Zuckerberg He’s the Decider on Trump – The New York Times

Trump Is Still Banned on YouTube. Now the Clock Is Ticking. – WSJ

Facebook Oversight Board’s Trump Decision was Marbury v Madison Moment – CNBC

Two recent cases discussing Section 230 of the Communications Decency Act of 1996:

Daniels v Alphabet Inc ND Cal 2021

Murphy v Twitter Inc Cal App 2021

Discussions of United States’ positions on international human rights conventions:

Where the United States Stands on 10 International Human Rights Treaties – The Leadership Conference Education Fund

Human Rights and the United States

Public commentary on the controversy submitted to the Facebook Oversight Board:

Facebook Oversight Board Public Comments

School Is Out! Or Is It? Supreme Court to Consider School’s Constitutional Capacity to Discipline Student’s Off-Site Online Speech


Mahanoy Area School District v. B.L., et al., No. 20-255 (S. Ct.).  Oral argument scheduled for April 28, 2021 at 10 a.m.


Student B.L., who was all in on cheerleading activities, was distressed to learn that a less senior student had jumped the line to the varsity squad, while she, with a year’s experience to her credit, remained on the junior varsity squad.  As is normative among digital natives, B.L. made her views known online on the social media application Snapchat.  B.L. did not have a good word to say, and indeed she used some words that a grandmother might kindly term “unladylike.”

Soon thereafter the school was abuzz with the news of B.L.’s postings.  School administrators, displeased with her having posted material that it considered disrespectful and disruptive of school and school-related activities, determined that she ought to sit the cheerleading season out.  This was fiercely protested by B.L. and her family.  The school would not budge, and this case, which questions how much off-site speech a school may discipline, ensued.

During the Viet Nam War, students protesting the United States’ participation in that conflict came to school wearing black arm bands to signify their disagreement.  When a school tried to countermand this activity, the Supreme Court disciplined the school instead.  In Tinker v. Des Moines Independent Community School District, et al, 393 U.S. 503 (1969), the Court concluded that minor students are not without Constitutional rights, including speech and expressive rights.  Schools may not interfere with students’ speech and expressive activities except where the ordinary activity of the school or the rights of others may be substantially disrupted thereby.

Life today is no longer constrained geographically as in the past.  Communication online is instant, and that communication may reach an audience at any time and in any place.   Boundaries as they once were known are no more, leaving schools to wonder how they might navigate the shoals of order and expression.

The petitioning school district argues that it was error for the trial and appellate courts to interpret Tinker as inapplicable to off-site activity.  Schools, responsible for so much of students’ lives in the day to day, must be able to maintain civility when offsite online behavior interferes with order or threatens others.

B.L. counters that the First Amendment rights recognized in Tinker would be meaningless if students, fearful of condemnation and harsh consequences from school authorities, were not able to communicate online as they would wish.

The United States, as amicus with a bit more clout than many other amici, while favoring the school's position, suggests that there are several lenses through which to evaluate the interests of the parties, but asks the Supreme Court to return the case to the lower courts for further proceedings.

Mahanoy Area School District v. B.L., No. 20-255 Brief for Petitioner

Mahanoy Area School District v. B.L., No. 20-255 Joint Appendix

Mahanoy Area School District v B.L., No. 20-255 Brief for Respondents

Mahanoy Area School District v. B.L., No. 20-255 Reply Brief for Petitioner

Mahanoy Area School District v. B.L., No. 20-255 United States’ Amicus Curiae Brief

When Zeal Outstrips Reason: Second Circuit Upholds Judgment Stemming from Website’s Publication of Allegations of Child Sexual Abuse

Powell v. Jones-Soderman and Foundation for the Child Victims of Family Courts, No. 20-532-CV (2nd Cir.) February 26, 2021.


The United States Court of Appeals for the Second Circuit recently upheld a Connecticut federal court judgment that the founder of a child advocacy foundation had libeled a Connecticut father when, during pending divorce proceedings, she published on her website allegations that the father had committed child sexual abuse. 

On appeal, Jones-Soderman argued that the trial court erred in finding her liable because proof of the falsity of her statements was lacking, and such proof was necessary to overcome her First Amendment defense. Moreover, she said that the trial court failed to give consideration to her good faith belief that she was publishing the truth.  

While the First Amendment may protect commentary on matters of public interest, no such protection extends to demonstrably false statements.  The appellate court found that the falsity of the statements was amply examined by the federal trial court, which took testimony and admitted to the record state court findings that the allegations of sexual abuse were without merit.

Jones-Soderman is not entitled to reliance on an “actual malice” standard for publication of defamatory material, the Second Circuit found, but even if she were, that standard would have been met, and it would negate any qualified privilege she might have.  

Jones-Soderman published statements about the plaintiff while in his ex-wife's employ during a custody battle, and with knowledge that clinicians, state authorities, and the state court had found the abuse claims to be without foundation.  No qualified privilege may serve as a shield in such circumstances, nor may a "good faith belief" in the truth of the published statements be invoked where Jones-Soderman knew of evidence contradicting the claims.

Jones-Soderman’s status as a mandated reporter of child abuse is of no moment with respect to the facts in this case, particularly where no complaint to Child Protective Services was ever made.

Powell v. Jones-Soderman, No. 20-532 (2nd Cir.)

“Sure sounds like a termination.”–Judge in Parler Dispute With Amazon Web Services Appears to Appreciate Impact, But Questions Need for Injunctive Relief

Parler LLC v. Amazon Web Services, No. 2:21-cv-00031(BJR) (W.D. Wash). Argument concerning injunctive relief held January 14, 2021.


Today the U.S. District Court for the Western District of Washington heard arguments concerning whether Amazon Web Services (AWS) ought to be ordered to restore service to Parler, LLC, whose site was deplatformed on short notice on January 9 because, AWS believed, Parler was not ably managing removal of unacceptable content in compliance with its agreement with Amazon.

Counsel for Amazon downplayed any non-compliance on Amazon's part, asserting that Parler had not complied, and could not comply, with its obligations whether AWS had suspended or terminated Parler.

AWS noted that as of January 6, 2021, what had been long feared became painfully real in the attacks at the U.S. Capitol. AWS perceived a need for action.  

AWS noted that its actions respecting Twitter differ from its actions respecting Parler because AWS does not access or engage with Twitter's live feed as it does with Parler's.

Parler submitted that losses to Parler are irreparable.  Advertisers, the site’s sole revenue source, no longer provide income, and fifteen million account holders no longer can access Parler.

Although Parler offered that it had recently been in discussions about adopting AWS' software and obtaining venture capital, no counsel present would opine on whether their respective clients would be interested in further discussions.

Parler admitted that some harms might be remedied by money damages, but pointed to the immediate losses of income and customers as worthy of injunctive redress.

On inquiry by the court, counsel for Parler did not articulate a present emergency which would justify injunctive relief.

The court, without elaboration, promised its order would issue promptly.

Parler Resists War of Words with Amazon Web Services and Insists Parler Will Likely Go Out of Business Absent Judicial Intervention

Parler, LLC v. Amazon Web Services, No. 2:21-cv-00031-BJR (W.D. Wash.).  Telephone conference with court set for 10 a.m. PST on January 14, 2021.


In Reply to Amazon Web Services’ (AWS) Opposition to Parler’s Motion for Injunctive Relief, Parler argues that AWS miscasts termination as suspension, a position negated by AWS’ statement to Parler that Parler could do nothing to be restored to service.

Parler offers that AWS never advised Parler what contractual obligation Parler had allegedly breached. Most significantly, AWS breached the contract by failing to adhere to the thirty-day notice period before termination that the agreement requires.

AWS has always been aware of, and never questioned, Parler's practices concerning problematic posts, which are reactive and use a jury system to address issues with posts.  Parler envisioned moving to prospective artificial intelligence screening in the coming year. Moreover, AWS expressed interest in Parler's adoption of AWS' proprietary software, an arrangement which, if consummated, would essentially marry the two entities.

Parler states that it has always responded to any posting issues presented to it by AWS.  When competitor Twitter terminated Donald Trump's account, prompting users to create Parler accounts, the mass migration from Twitter to Parler caused Parler not only to crash but to face a backlog of troublesome posts.

Parler worked diligently to address problematic material, advising AWS of its progress, and was all but finished with the backlog when AWS terminated service to Parler.

Parler notes that no one arrested in connection with the January 6th violence in the U.S. Capitol had a Parler account.  An individual killed there had an account that had been dormant since November.  The posting of videos by account holders does not establish that the posters were present at the Capitol.

Parler argues that AWS has succumbed to pressure to suppress conservative speech as well as to deny the President social media access. 

Parler further argues that AWS has unlawfully preferred the bigger and wealthier Twitter, ensuring Twitter's market dominance by forcing Parler out of business.

Surely AWS can be seen as having interfered with business relationships, Parler argues, as AWS’ termination of Parler interfered with Parler’s relationships with every one of its fifteen million users.

Section 230 of the Communications Decency Act does not operate as a bar to an antitrust action:  Section 230 immunizes speech, not anticompetitive conduct, as the Ninth Circuit has recognized.

Parler states that AWS’ termination has made it difficult for Parler to find a new web hosting partner, making it likely that Parler will go out of business absent judicial intervention.  

If the court fails to enjoin AWS, Parler submits, AWS’ termination will likely be fatal to Parler, but an injunction will require only that AWS provide services as required in its contract with Parler, balancing the equities in Parler’s favor.

Parler LLC v. Amazon Web Services, No. 2:21-cv-00031 (W.D. Wash.). Parler Reply (2021-01-13)

It’s not us, it’s them: Amazon Web Services States Parler’s Breach of Agreement with AWS Permitted Suspension, Denies Antitrust Violation, and Claims Immunity under Section 230 of the Communications Decency Act of 1996

Parler, LLC v. Amazon Web Services, No. 2:21-cv-00031 (BJR) (W.D. Wash.). Opposition to motion for injunction filed January 12, 2021.


Amazon Web Services (AWS) has opposed Parler’s motion for injunctive relief, asserting that its agreement with Parler permitted AWS to suspend or terminate Parler because of repeated troubling postings after the November election and after the January 6th eruption of violence in the Capitol.

AWS states that its agreement with Parler specifically permits the actions that it took.  Parler, AWS says, was slow to remedy or failed to remedy threatening postings, and when tens of thousands of posts went unaddressed, AWS was within its contractual rights to terminate or suspend Parler.

Parler cannot state a claim for tortious interference with business relationships in the absence of a breach of contract, AWS reasons.  AWS states that Parler has not in fact been harmed, given Parler’s assertion that it would be offline for only half a day.

AWS argues that Parler cannot state a claim for violation of the Sherman Act where there is no evidence of any anti-competitive communication, let alone agreement, between AWS and Parler’s competitor Twitter.  Any difference in treatment between Parler and Twitter by AWS exists because of differences in AWS’s agreements with the two entities. 

Finally, and perhaps most importantly, AWS asserts that Section 230 of the Communications Decency Act of 1996 immunizes AWS from liability for any actions it has taken to remove offensive or harmful material from Parler, including suspension or termination.  The immunities conferred by Section 230 preclude Parler's claims for breach of contract and anticompetitive conduct, AWS argues.

AWS states that injunctive relief is inappropriate where an injunction would inhibit or preclude AWS from entering into or policing its agreements.

AWS has submitted redacted copies of allegedly problematic postings from Parler and has submitted, with a request that they remain under seal, unredacted copies of such material.

Parler may submit a response today. At this writing no time for oral argument has been established.

Parler LLC v. Amazon Web Services, No. 2:21-cv-00031 (W.D. Wash.) Opposition to Motion for Injunction

David Versus Goliath (and Goliath). Parler Challenges Amazon Web Services’ Suspension as Anti-Competitive and in Breach of Contract

Parler LLC v. Amazon Web Services, No. 2:21-cv-00031 (BJR) (W.D. Wash.) Verified Complaint filed January 11, 2021.


Amazon Web Services (AWS) has suspended webhosting services to Parler, a relative newcomer to the social media marketplace, because, AWS has stated, AWS doubts Parler's capacity to monitor postings that incite violence.

AWS suspended Parler almost immediately after Parler's competitor Twitter permanently terminated the account of Donald J. Trump.  This termination prompted a mass migration of customers from Twitter to Parler as well as a significant spike in new customers.

AWS towers above other web hosting services globally.  By comparison with the now-shuttered Parler, Parler observes, AWS has promised Twitter hosting of its timeline and enhanced services.

Parler asserts in its Complaint in federal court in Washington that because of the suspension, which Parler says has been presented like a termination, AWS has irreparably damaged Parler’s business and reputation.  

Even if Parler is able to find another platform, Parler avers, the time and other costs associated with rewriting Parler’s AWS-compatible code will be extraordinary.

Parler alleges that AWS’ agreement to enhance services to Twitter while forcing Parler from the marketplace violates the Sherman Antitrust Act. 

Parler also asserts that by effectively terminating Parler without the thirty days' notice required by the agreement between the two, AWS has breached its agreement with Parler.

Parler denies any breach of its agreement with AWS, stating that it removed any allegedly unacceptable comments that AWS brought to Parler’s attention.  Parler observes that similar content has been retained without comment on Twitter.

Briefing concerning injunctive relief will close January 13th.  A time for oral argument has not been set.

Parler LLC v. Amazon Web Services, No. 2:21-cv-00031 (W.D. Wash.) Verified Complaint

Life Online: Court Declines to Order Discovery of Litigant’s Internet Identities and Activities in Its Entirety


Lindke v. Freed, No. 20-10872 (E.D. Mich.) November 2, 2020.


Plaintiff sued the city manager of Port Huron, Michigan, asserting that deleting unfavorable or politically disadvantageous comments from the city manager's Facebook page violates Lindke's First Amendment rights.

The Second Circuit has concluded that public officials' public social media accounts may not exclude opinion because of disagreement.  Knight First Amendment Institute at Columbia University v. Trump, 928 F.3d 226 (2nd Cir. 2019), petition for cert. filed August 20, 2020 (No. 20-197).

Freed seeks discovery, broadly stated, of all plaintiff’s social media history and activity, which plaintiff argues is beyond the scope of the lawsuit.

Defendant objects to the idea that the discovery must be cabined to the case:  the information sought is essential to establishing that plaintiff is a "cyberbully."

The court recognized that discovery in support of a cyberbully defense could be had, but not until Freed better articulates the nature of the defense he intends to present, so that discovery can be reasonably related to the case and not overly broad or unduly burdensome.

This is particularly important, the court pointed out, where states have adopted varying definitions of "cyberbullying."  The court noted that whether such activities qualify for First Amendment protections may remain open for exploration, as definitions of "cyberbullying" range from unprotected "true threats" to mere annoyance.  Michigan criminal law tends toward "true threats," but of interest concerning discovery is which definition Freed intends to advance.

In addition, the issue of whether the plaintiff posted using multiple pseudonyms may be relevant, but the discovery request remains too broad.  Freed may be able to seek information about plaintiff's behavior on Freed's site, but not throughout the internet.  Postings and accounts unrelated to Freed are not discoverable, the court has concluded.

The court declined to enter a protective order limiting discovery to matters in the complaint, as discovery is already limited in that way.  Further refinement at this time is not necessary, the court concluded, but the court left open the issue of whether an order would be appropriate in light of the defendant's refinement of his defense.

Lindke v. Freed (E.D. Mich. 2020) Order November 2, 2020