One of Britain’s most senior police chiefs says migrants are being “lured to their death” because Facebook is failing to shut down trafficking pages.
Tom Dowdall, the deputy director of the National Crime Agency, pointed the finger of blame at Facebook after the agency identified more than 800 pages run by organised gangs, which ferry illegal immigrants across Europe in boats unfit for the crossing.
People smuggling remains a major problem in Europe, with the UN revealing last week that more than 1,500 refugees and migrants lost their lives attempting to cross the Mediterranean in 2018 alone.
In an interview with The Evening Standard, Dowdall said: “Since December 2016, we have identified over 800 Facebook pages which we consider as being associated with organised immigration crime. That is largely offering vessels, documents, transport services. There is enough we are seeing to indicate to us that it supports criminality.”
He added: “They are being lured to their deaths using an application that they are using every day of the week.”
Dowdall said it is a problem that needs to be addressed by Facebook, but the National Crime Agency “haven’t had enough willingness yet” from the Silicon Valley company. The technology exists, he said, to target the offending pages, but Facebook is “not stepping up in the way we would want.”
He explained: “Facebook have developed a fantastic ability to be able to identify patterns and how everybody operates on a day to day basis.
“This is no different: there will be patterns that are developed here which we know that Facebook and others can be onto really quickly. We need their cooperation to be able to identify and to either close down these sites or be able to further investigate them.”
A Facebook spokeswoman said: “People smuggling is illegal and any posts, pages or groups that coordinate this activity are not allowed on Facebook.
“We work closely with law enforcement agencies around the world including Europol to identify, remove and report this illegal activity, and we’re always improving the methods we use to identify content that breaks our policies, including doubling our safety and security team to 20,000 people and investing in technology.”