Facebook Increases Efforts Against Terrorism


Hours after the December shootings in San Bernardino, Mark Wallace, head of the nonprofit Counter Extremism Project, asked his employees to comb social media for profiles of the alleged attackers.
They came up empty, because a team at Facebook Inc. had already removed a profile for Tashfeen Malik after seeing her name in news reports.
The incident highlights how Facebook, under pressure from government bodies, is more aggressively policing material it views as supporting terrorism. The world’s largest social network is quicker to remove users who back terror groups and investigates posts by their friends. It has put together a team focused on terrorist content and is helping promote “counter speech,” or posts that aim to discredit militant groups like Islamic State.

This move comes as attacks on Westerners proliferate and U.S. lawmakers and the Obama administration intensify pressure on Facebook and other tech companies to curb extremist propaganda online. Top U.S. officials flew to Silicon Valley in January to press their case with executives including Facebook Chief Operating Officer Sheryl Sandberg. Last week, Twitter Inc. said it suspended 125,000 accounts associated with Islamic State.
Tech companies “have a social responsibility to not just see themselves as a place where people can freely express themselves and debate issues,” Lt. Gen. Michael Flynn, who ran the U.S. Defense Intelligence Agency until 2014, said in an interview.
Facebook’s tougher approach puts the company in a tight spot, forcing it to navigate between public safety and the free-speech and privacy rights of its nearly 1.6 billion users.
After the Jan. 8 meeting, the Electronic Frontier Foundation, a nonprofit privacy organization, urged Facebook and other tech companies not to “become agents of the government.”
Facebook said it believes it has an obligation to keep the social network safe.
Political winds have shifted since the 2013 disclosures by former National Security Agency contractor Edward Snowden about government surveillance. At the time, Facebook reassured users that the U.S. government didn’t have direct access to its servers and began regularly reporting the volume of government requests for its user data. The enhanced push against extremist content is a separate initiative.
Leading Facebook’s new approach is Monika Bickert, a former federal prosecutor whose team sets global policy for what can be posted on the social network.
Facebook takes a hard line toward terrorism and terrorists, said Ms. Bickert, Facebook’s head of global policy management.
“If it’s the leader of Boko Haram and he wants to post pictures of his two-year-old and some kittens, that would not be allowed,” she said.
Facebook now relies on users to report posts that violate its standards, such as images that “celebrate or glorify violence.” After an attack, it scours news reports or asks police agencies and activists for names so it can remove suspects’ profiles and memorialize victims’ accounts.
Facebook has strengthened the process over the past year. The social network says it uses profiles it deems supportive of terrorism as a jumping-off point to identify and potentially delete associated accounts that also may post material that supports terrorism.
Executives say they began this fanning-out process, which they use only in terrorism cases, about a year ago after consulting with academics and experts who said terrorists normally operate in groups. The searches are conducted by a multilingual team that examines the events people have attended or pages they have “liked,” among other things.
In some cases, Ms. Bickert consults Facebook lawyers about whether a post contains an “imminent threat.” Facebook’s legal team makes the ultimate decision about whether to notify law enforcement. Simply posting praise of Islamic State may not be what the U.S. government is after.
Neither Facebook nor law-enforcement agencies would discuss in detail how closely they cooperate, in part to avoid tipping off terror groups, they said. Facebook also wouldn’t discuss the criteria it uses to determine what material supports terrorism and how many terror experts it has hired.
Some counterterrorism experts say Facebook shouldn’t quickly delete user accounts, so police can monitor them and possibly snare others. But Ms. Bickert, who worked on public corruption and gang-related violence cases as a prosecutor, said leaving up terrorist messages could cause harm.
