Are social media companies doing enough to police terror activity?
Sen. Dianne Feinstein, a California Democrat, argues emphatically that they are not. The top Democrat on the Senate Judiciary Committee is planning a new push to force social media companies to report “knowledge of terrorist activity” as relatives of victims in the 2015 San Bernardino, California, terror attack sue Twitter, Google and Facebook, accusing them of supporting extremists online.
Social media companies are feeling heightened pressure to do more in the U.S. and abroad. A new report from the British Parliament asserts the Silicon Valley companies are “shamefully far” from taking the needed action to quickly remove terror-related content and should face major fines.
The escalating demands on companies like Google, Facebook and Twitter to censor or turn over information about their users raise major questions, though, about national security, privacy and how much power the government should have to know what you do on your smartphone.
Eric Goldman, co-director of the High Tech Law Institute at Santa Clara University in California, questioned where service providers would draw the line if Congress forced them to report all “terrorist activity.”
“What are the risks to innocent people who are mistakenly flagged as engaging in terrorist activities?” said Goldman, a law professor and former Silicon Valley lawyer. “Such a flag has the potential to destroy their lives based on a mistake.”
Congress is closely watching whether social media companies can solve the problem on their own.
Google, which owns YouTube, is taking a hit, with major advertisers pulling out after it turned out their ads were appearing next to extremist or hate-speech videos. Facebook is adding 3,000 staffers under pressure to monitor content after Facebook Live was used to post a series of horrific acts, including a murder in Cleveland.
Governments are increasingly frustrated with the growing use of encrypted devices and apps to mask content. The FBI told Congress last week that in the past six months alone it had been unable to unlock half of the 6,000 phones and other devices for which it held search warrants or court orders to access the data. The Trump administration is concerned about the issue and debating how to respond, according to the agency.
“It is clear these issues are becoming increasingly problematic for law enforcement and that ultimately some sort of legislative solution will be required to address them,” said Susan Hennessey, a former National Security Agency lawyer who is a cybersecurity expert at the Brookings Institution, a Washington research group.
Feinstein wants to work with the FBI on a measure based on child pornography statutes that would require tech companies to report online “terrorist activity” to law enforcement.
Feinstein, who has the support of Republican senators such as Richard Burr of North Carolina and Marco Rubio of Florida, has tried before to force tech companies to act. She’s now in a stronger position as the top Democrat on the Judiciary Committee, the panel that has broad jurisdiction over internet issues.
Feinstein is also talking about resurrecting efforts in Congress to require Silicon Valley companies to give law enforcement access to encrypted data when there is a court order. Apple refused a court order last year to unlock the iPhone of San Bernardino shooter Syed Farook. The FBI ended up spending $900,000 to hack into the phone.
“I've gone out, I tried to talk to the tech companies that are in my state. One – Facebook – was very good and understood the problem. But most do not,” Feinstein said at a recent Senate Judiciary Committee hearing.
Tech companies and privacy advocates counter that it’s dangerous for the government to demand “backdoor” access to encrypted information from phones and other technology that hackers and thieves can then exploit.
Mark MacCarthy, vice president of the Software and Information Industry Association, whose members include Facebook and Google, said social media companies were constantly sharpening their efforts to block terrorist propaganda.
He said the companies were “vigilant about conduct that creates an imminent danger to the public by reporting this information to law enforcement.”
“While it is a legitimate concern that terrorist organizations are using social media platforms to spread their propaganda and recruit fighters, an obligation for social media companies and others to report undefined ‘terrorist activity’ to the government is a terrible idea,” he said. “It would create First Amendment issues and potentially place innocent people under government surveillance for protected expression while doing nothing to make us safer.”
Pressure grows after each terror attack. Relatives of three of the 14 people killed in San Bernardino have sued Twitter, Google and Facebook for monetary damages in the U.S. District Court for the Central District of California.
Courts have dismissed similar lawsuits under the Communications Decency Act, a federal statute that shields online providers from liability for what users post.
“It has very low odds of success,” Goldman said.
The lawsuit, though, could be more effective as a public relations attempt to pressure the companies into greater action – or to help prod Congress into deciding it is time to pass legislation forcing Silicon Valley’s hand.
“For years, defendants have knowingly and recklessly provided the terrorist group ISIS with accounts to use its social networks as a tool for spreading extremist propaganda, raising funds and attracting new recruits,” the lawsuit claims. “Without defendants Twitter, Facebook and Google (YouTube), the explosive growth of ISIS over the last few years into the most feared terrorist group in the world would not have been possible.”