The families of three men killed at a gay nightclub in Orlando are suing Google, Twitter and Facebook, accusing the companies of allowing the Islamic State to spread extremist propaganda on their sites.
The lawsuit argues the sites provided “platforms” that “radicalized” Omar Mateen, the gunman who killed 49 people in one of the deadliest mass shootings in American history.
The lawsuit is not the first attempt to sue the sites on these grounds, but some experts argue that other victims could, and should, follow the same path.
On June 17, 2015, Dylann Roof entered Emanuel African Methodist Episcopal (AME) Church in Charleston, South Carolina, opened fire and killed nine African-Americans.
Roof has made no apologies for the killings and has spoken extensively about his radicalization process in his manifesto. He writes that a Google search for “black on white crime” following Trayvon Martin’s death changed him forever.
Google did not comment on the case but told theGrio.com in a statement:
“The views expressed by hate sites are not in any way endorsed by Google, but search is a reflection of the content and information that is available on the Internet. We do not remove content from our search results, except in very limited cases such as illegal content, malware and violations of our webmaster guidelines.”
The explanation is not good enough for advocates at the Southern Poverty Law Center, who say it’s time private companies like Google, Twitter and Facebook took some ownership of these violent incidents.
“They should be held responsible for the content that they are allowing to be promoted and to reach a large audience through that mechanism,” said Keegan Hankes of the Southern Poverty Law Center. “And they should hold themselves to a higher standard given how much of the country gets their news and their information from social media platforms.”
As it stands, U.S. federal law shields the technology industry from liability for content posted by other people. Site providers have maintained that they are platforms, not content contributors.
“Whether it’s inadvertent or not, they allow really repugnant and racist views to propagate there,” Hankes adds. “And I think that they have a responsibility to take a stance.”
The issue hits home for people like Brittany Packnett, co-founder of Campaign Zero. The organization protests police violence in America, and its members have consistently been targets of hate speech.
“Social media is helping create a community, [so] there should be guideline standards for how one engages with that community and who’s allowed to be in that community based on how they treat other people,” Packnett said. “And I don’t know that the guidelines that currently exist are enough.”
Instead, Packnett has had to take steps on her own to block hateful internet speech while still fearing for her personal safety. Combined, she and her fellow organizers have blocked more than 10,000 accounts.
“I will tell you the hate speech that I’m getting every day is not coming from any person of color; it’s coming from white radicalized people who we want to pretend were radicalized themselves.”
Twitter and Facebook declined to comment for this story.
Facebook has previously stated its policy on hate speech in a blog post: “We prohibit content deemed to be directly harmful, but allow content that is offensive or controversial.”
In part, Twitter’s abusive behavior policy reads: “You may not make threats of violence or promote violence, including threatening or promoting terrorism.”
Roof was convicted in December on all 33 federal counts, including hate crime charges, for murdering nine people inside Emanuel AME Church.
The sentencing phase of Roof’s trial is likely to end Monday. Prosecutors are arguing that Roof should face the death penalty.
Ashantai Hathaway is a reporter at theGrio. Keep up with her on Twitter @ashantaih83.
The post Could Charleston victims’ families sue sites like Google over Dylann Roof’s rampage? appeared first on theGrio.