War of The Mods



A landmark case involving Twitter and the family of a man murdered by ISIS is currently in progress, and the Supreme Court seems befuddled about how to proceed. The case was heard on Wednesday and concerns the Justice Against Sponsors of Terrorism Act (JASTA). JASTA reads like it was written by a sadist who takes a perverse pleasure in watching lawyers and judges try to navigate convoluted regulation. Briefly, the law permits “any national of the United States” who is injured by an act of international terrorism to sue anyone who aids and abets the act by knowingly providing substantial assistance to whoever commits it. It also instructs courts to look to a federal appeals court’s 1983 decision in Halberstam v. Welch.

Two cases were heard this week and are awaiting adjudication: Gonzalez v. Google and Twitter v. Taamneh.

The case against Google was brought by the family of Nohemi Gonzalez, a 23-year-old college student who was killed in a restaurant in Paris during the terrorist attacks in November 2015. The coordinated ISIS attacks that night also targeted the Bataclan concert hall, where Eagles of Death Metal were playing a live show. Gonzalez’s family filed suit under the Antiterrorism Act, which allows the families of victims of terrorism to sue entities that aid terrorist groups. The claim is apparently based on a proliferation of ISIS recruitment videos being recommended by YouTube’s algorithm at the time of the attack. Google’s defense was that Section 230 of the Communications Decency Act gives it immunity from civil liability.

Twitter v. Taamneh concerns the murder of Nawras Alassaf in Istanbul by ISIS. The lawsuit asks whether social media companies violated JASTA on the proposition that the terrorist group ISIS sometimes uses social media platforms like Twitter, Facebook, and YouTube, and the justices spent the argument urging the lawyers to give them some modicum of an objective framework to latch onto.

Seth P. Waxman, a lawyer for Twitter, stressed that the plaintiffs had not accused his client of providing “substantial assistance, much less knowing substantial assistance, to that attack or, for that matter, to any other attack,” adding that it was undisputed that Twitter “had no intent to aid ISIS’s terrorist activities.”

He went on: “What we have here is an alleged failure to do more to ferret out violations of a clear and enforced policy against assisting or allowing any postings supporting terrorist organizations or activities.” That was not enough, Mr. Waxman argued, to indicate “aiding and abetting an act of international terrorism.”

Justice Sonia Sotomayor told Mr. Waxman that the fact remained that “you knew that ISIS was using your platform.”

Justice Brett M. Kavanaugh summarized Twitter’s position: “When there’s a legitimate business that provides services on a widely available basis in an arm’s length manner, it’s not going to be liable under this statute even if it knows bad people use its services for bad things.”

Justice Elena Kagan asked Edwin S. Kneedler, a lawyer for the federal government arguing in support of Twitter, how the case differs from those involving the provision of banking services to known terrorists.

“They provide a hundred other clients who are not terrorists with the same banking services, but they provide this known terrorist with these banking services that are very important to its terrorist activities,” she said. “Can you go after that person under this statute?”

Mr. Kneedler said yes, so long as the customer was “somebody who is a leader or somebody who you know has committed or is about to commit a terrorist act.”

Section 230 of the Communications Decency Act, which protects these companies, has faced criticism across the political spectrum. Many liberals say it has shielded tech platforms from responsibility for disinformation, hate speech and violent content. Some conservatives say the provision has allowed the platforms to grow so powerful that they can effectively exclude voices on the right from the national conversation.

This landmark case is echoed by another, NetChoice v. Paxton, a suit against Attorney General Ken Paxton of Texas over HB 20, a Texas law that would govern the behavior, particularly with regard to content moderation, of social media companies with more than 50 million users. The same plaintiff has also filed against Florida in NetChoice v. Moody, regarding Florida’s SB 7072.

The court’s request for the Biden administration’s views in the two new cases — NetChoice v. Moody, No. 22-393, and NetChoice v. Paxton, No. 22-555 — probably means that it will rule on the previous cases concerning Section 230 before it decides whether to hear the new cases.

Popper’s paradox of tolerance states: “Unlimited tolerance must lead to the disappearance of tolerance. If we extend unlimited tolerance even to those who are intolerant, if we are not prepared to defend a tolerant society against the onslaught of the intolerant, then the tolerant will be destroyed.” It’s the ouroboros snake. Although, who decides what is intolerant and what isn’t? It then becomes an ethical and epistemological debate. There isn’t a simple answer, but one could surely concede that whatever isn’t life affirming or supportive of inalienable rights is intolerant.
