Supreme Court to hear arguments on Google and Twitter’s liability for terrorists’ online postings

The Supreme Court will hear arguments this week in a pair of cases about the legal liability of tech platforms like Google and Twitter for content posted by terrorists.

The Supreme Court will hear arguments on Tuesday and Wednesday in a pair of cases regarding the legal liability of technology platforms like Google and Twitter for content posted by adherents of terrorist groups such as ISIS, and the platforms’ role in allowing that content to circulate.

At issue in one case is Section 230 of the Communications Decency Act – a landmark law written in 1996 that helped shape the modern internet. Section 230 states, "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider."

In practice, this means that tech companies like Facebook, Google and Twitter are generally immune from liability for what’s posted on their platforms by users – although they’re required to remove content that’s prohibited by federal law, such as material infringing on copyrights or violating sex trafficking laws.

Section 230 has been controversial in recent years, leading to calls for reform by Congress. Proponents of Section 230 argue it has helped spawn innovations like social media platforms by not holding them liable for content posted by users, while critics argue it goes too far in shielding those platforms when users post harmful content or the companies censor or suppress certain viewpoints.

With the pair of cases set to come before the court, justices will have an opportunity to review Section 230 and how federal counterterrorism laws apply to internet platforms. Here’s a look at the cases:

The first case, Gonzalez v. Google, was brought against the company by the family of Nohemi Gonzalez, who was killed at the age of 23 in the November 2015 ISIS terror attack in Paris.

The lawsuit claims that Google, which is the parent company of YouTube, helped ISIS’s recruitment efforts by allowing the terror group’s members to post videos on YouTube aimed at inciting violence and by recommending extremist videos to some of the platform’s viewers through algorithms that take viewers’ interests into account.

When the case reached the Ninth Circuit Court of Appeals, a panel of judges held that Section 230 protects platforms’ algorithmic recommendations. The judges said that while the language of Section 230 may be broader than Congress intended, it’s up to the legislative branch to clarify it, not the courts.

The plaintiffs argue that the Ninth Circuit erred in dismissing their case because YouTube provided notifications for recommended content and wasn’t simply acting as a provider in that process. Further, they claim that even if YouTube and Google were protected by Section 230 in this case, the recommendations still violated the Anti-Terrorism Act by helping ISIS recruit potential members.

Google and YouTube counter that while they condemn terrorism and have taken steps to increase their capacity to remove materials posted by terror groups, the lawsuit should be dismissed because Section 230 protects their actions as a publisher.

The Supreme Court will hear arguments in the Gonzalez v. Google case on Tuesday, Feb. 21.

The second case, Twitter v. Taamneh, was brought by the American family of Nawras Alassaf, a Jordanian citizen who was killed in a January 2017 ISIS terror attack on a club in Istanbul, Turkey.

Their claim doesn’t focus on Section 230 and was brought instead under the Anti-Terrorism Act, alleging that Twitter’s hosting of terrorists’ content online constitutes "aiding and abetting" the terror group because the social media company provided a platform for the gunman, who was recruited by ISIS.

The Ninth Circuit Court of Appeals upheld the aiding and abetting claim in the family’s lawsuit but stopped short of holding that all content posted by terror groups on social media platforms is sufficient to support an aiding and abetting claim under the Anti-Terrorism Act.

Twitter argues that the appellate court was mistaken in its ruling and that the relevant Anti-Terrorism Act provisions would only hold the social media company liable if it had provided assistance for a specific act of terror or declined to block accounts it knew were plotting a specific terror attack.

The Supreme Court will hear arguments in the Twitter v. Taamneh case on Wednesday, Feb. 22.

Fox News’ Shannon Bream and Bill Mears and the Associated Press contributed to this report.
