By Brad Balach

For nearly 30 years, Section 230 of the Communications Decency Act has shielded internet platforms from liability for content created by third parties.[1] The authors of Section 230 originally intended it to clarify the liability of online services for content that others post on their platforms.[2] However, as social media companies emerged and benefitted from favorable judicial interpretations over the decades, Section 230 has become a source of total immunity, even for harmful and tragic events to which platform use has contributed.[3]

With little regulation, internet platforms have had the latitude to moderate content as little or as much as they see fit. That may change soon, as the Supreme Court is expected to rule on a pair of cases addressing Section 230 and the content moderation practices of social media platforms.[4]

In Gonzalez v. Google, the family of a victim of the November 2015 ISIS attack in Paris alleges that ISIS used the Google-owned service YouTube to recruit and radicalize combatants, and that the platform provided material support to terrorists by sharing advertising revenue with them.[5] The Ninth Circuit Court of Appeals affirmed the dismissal of the case, holding that Section 230 protected YouTube and that the revenue sharing was part of its normal business operations.[6]

In Twitter v. Taamneh, family members of a victim of a 2017 ISIS attack in Istanbul alleged that Twitter, Google, and Facebook aided and abetted ISIS by allowing the distribution of its material without editorial supervision.[7] The Ninth Circuit found that the companies could face claims for playing an assistive role.[8] The Supreme Court granted certiorari in both cases and heard oral arguments in the first quarter of 2023.[9]

The plaintiffs in Gonzalez and Taamneh argue that Section 230's assumption that online platforms merely transport the work of third parties no longer reflects how companies use digital technology today.[10] They contend that algorithmic recommendation, now a standard feature on most platforms, transforms a platform from an interactive computer service protected under Section 230 into an unprotected information content provider.[11]

Several co-authors of Section 230 have argued in an amicus brief that the law anticipated recommendation algorithms and content-curation efforts.[12] The Department of Justice also submitted an amicus brief arguing that algorithmic promotion is a distinct form of conduct, and that social media platforms differ from the idealized public square because they are closed businesses designed to maximize revenue.[13] How these factors bear on Section 230's liability protections is likely to be a major issue before the Supreme Court.

Justice Thomas has hinted in the past at two possible approaches the Court could take in deciding a Section 230 case.[14] In 2020, he noted that many courts have interpreted Section 230 too broadly, effectively giving platforms total immunity for the content they distribute.[15] He suggested that scaling back this immunity would not necessarily make these companies liable for online misconduct, but it would give plaintiffs a chance to bring claims against them.[16]

The second proposed approach suggests that some platforms may be regulated as common carriers or places of accommodation.[17] The common carrier concept traditionally applies to telephone companies, and the plaintiffs in Gonzalez and Taamneh argue that online platforms are part of the communications infrastructure, which provides a potential opening for this argument.[18]

The Court’s decision on the content moderation issue is expected to spark public debate and prompt calls for Congress, rather than the courts, to take the lead. It remains uncertain how the Court will address the Section 230 issue, as there are several directions it could take on content moderation, but what is certain is that the coming decision will likely change the landscape of the internet for decades to come.


[1] Nina Totenberg, Supreme Court showdown for Google, Twitter and the social media world, NPR (Feb. 21, 2023), https://www.npr.org/2023/02/21/1157683233/supreme-court-google-twitter-section-230.

[2] Id.

[3] Id.

[4] Id.

[5] Gonzalez v. Google LLC, 2 F.4th 871, 880 (9th Cir. 2021).

[6] Id.

[7] Twitter, Inc. v. Taamneh, 143 S. Ct. 81, 214 L. Ed. 2d 12 (2022).

[8] Id.

[9] Totenberg, supra note 1.

[10] Mark McCarthy, Congress Should Reform Section 230 in Light of the Oral Argument in Gonzalez, Lawfare (Mar. 22, 2023), https://www.lawfareblog.com/congress-should-reform-section-230-light-oral-argument-gonzalez.

[11] Id.

[12] Tom Wheeler, The Supreme Court takes up Section 230, Brookings (Jan. 31, 2023), https://www.brookings.edu/blog/techtank/2023/01/31/the-supreme-court-takes-up-section-230/.

[13] Id.

[14] Id.

[15] Id.

[16] Id.

[17] Wheeler, supra note 12.

[18] Wheeler, supra note 12.