Google, Government, Islamic State, Research, Society, Technology

Jihadi propaganda still active on YouTube

RESEARCH STUDY BY CEP

A study has revealed that YouTube repeatedly fails to remove jihadist videos within two hours of them being posted – because of “staggering holes” in its monitoring.

It found that the Google-owned video sharing site missed its target for taking down Islamic State films in one in four cases.

Dozens of terrorist-propaganda and recruitment videos were left for public viewing for more than three days at a time, clocking up tens of thousands of views, according to the three-month study by the Counter Extremism Project (CEP).

Disturbing, too, is that six in ten of the IS supporters who posted the hate videos were not even banned from the site and their accounts remain active.

The failings come after YouTube rejected an offer of free technology to instantly block any previously identified extremist content, preferring to develop its own system that it says deletes millions of banned videos before they are seen.

At the G7 summit in October last year, YouTube joined with Facebook, Twitter and Microsoft in an accord aimed at removing extremist content from their platforms within two hours.

But in the first in-depth independent study of IS videos on YouTube, the CEP found this was not happening because of “inexcusable” holes in the service’s monitoring system. Researchers found 229 previously identified terror videos were uploaded 1,348 times and viewed on 163,000 occasions over three months from March 8 to June 8, with 24 per cent left on the site for more than two hours.

They included the film Caliphate 4 – uploaded six times during the trial period – in which a terrorist taunts former soldier Prince Harry.

Another video called Hunt Them O Monotheist was uploaded 12 times during the study and on one occasion allowed to remain for 39 hours.

Computer scientist Dr Hany Farid, from Dartmouth College in the US, who developed a system that stops child abuse films being uploaded, created a similar program that instantly identifies and removes terror videos.

YouTube, Facebook and Google were all offered the eGlyph system free by the CEP in 2016 but decided not to use it.
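
The eGlyph technique is reported to work along the same lines as hash-matching systems for child-abuse imagery: previously identified extremist files are reduced to compact fingerprints, and every new upload is fingerprinted and checked against that list before it is published. eGlyph's internals are not public, so the sketch below only illustrates the general matching idea; it uses an ordinary cryptographic hash as a stand-in for the robust perceptual hashing a real system would need, and all names and data in it are illustrative.

```python
import hashlib

# Illustrative only: a real system such as eGlyph uses robust (perceptual)
# hashing that survives re-encoding and cropping; SHA-256 stands in here.

def fingerprint(video_bytes: bytes) -> str:
    """Reduce a video file to a compact fingerprint."""
    return hashlib.sha256(video_bytes).hexdigest()

# Fingerprints of previously identified extremist videos (hypothetical data).
BLOCKLIST = {
    fingerprint(b"known-terror-video-1"),
    fingerprint(b"known-terror-video-2"),
}

def screen_upload(video_bytes: bytes) -> bool:
    """Return True if the upload matches known extremist content
    and should be blocked before it goes live."""
    return fingerprint(video_bytes) in BLOCKLIST

if __name__ == "__main__":
    # A re-uploaded copy of a known video is caught instantly ...
    assert screen_upload(b"known-terror-video-1")
    # ... while unrelated content passes through for normal processing.
    assert not screen_upload(b"holiday-footage")
    print("screening sketch ran OK")
```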

Dr Farid said it was “infuriating” that companies worth billions refused to implement systems that could instantly stop jihadist videos. “Spectacular failures are allowing terror groups to continue to radicalise and recruit online,” he added.

Former Conservative Party minister Mark Simmonds, now a senior adviser to CEP, said: “This study dispels any lingering myth that YouTube are doing enough to stop their site being used as an IS recruitment tool.

“The research shows that YouTube are not even meeting their own promise to delete all extremist content within two hours. For them to fail in a quarter of all cases, with much of the content still available three days or more after first being uploaded, is unacceptable.”

He added: “Even videos that stayed online for less than two hours received a total of nearly 15,000 hits – any one could become a potential terrorist.

“It is staggering and inexcusable that well over half of the IS supporters who upload this dangerous content are not even banned and their accounts remain active . . . spreading IS propaganda and grooming potential recruits.”

Google said it “rejects terrorism and has a strong track record of taking swift action against terrorist content”.

A spokesman added: “We’ve invested heavily in people and technology to ensure we keep making progress to detect and remove terror content as quickly as possible.

“We’re a founding member of the Global Internet Forum to Counter Terrorism, which sees tech companies collaborate to keep terror content off the web.”

European Court, Google, Government, Legal, Society

Google deletes 1m links in ‘right to be forgotten’

GOOGLE & SOCIAL SEARCH ENGINES

GOOGLE has been asked to remove 2.4 million web links under the so-called ‘right to be forgotten’.

The web giant has deleted 900,665 links from its search results since the European ruling came into force three years ago – including links to news websites and government documents.

This comes as Google faces its first ‘right to be forgotten’ legal battle in the English courts, against a businessman it has accused of attempting “to rewrite history” by using the rule to try to hide articles about his criminal past.

The European Court of Justice ruled in May 2014 that Google must remove links to websites containing content that is “inadequate, irrelevant or no longer relevant”. Removing a link from search engine results makes the content very difficult to find online.

Requests have included demands from killers, terrorists, fraudsters and internet trolls who want to hide their criminal pasts.

The court case will be closely watched by convicted criminals and others who wish to hide their pasts.

It involves a businessman – who cannot be named for legal reasons – who was imprisoned for “conspiracy to account falsely” in the 1990s.

Google, Government, Technology, Terrorism

Google finally acts to block internet terrorists

GOOGLE

The internet search giant Google is to start automatically searching for extremist material online. After years of campaigning by human rights groups and other lobbyists, it finally appears to be taking seriously the threat posed by terrorists on the web.

Google will adapt its algorithms so that its systems are better able to spot potentially dangerous content. Flagged material will then be reviewed to decide whether it should be taken down.
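
Google has not published how this pipeline is built, so the sketch below is only a hypothetical illustration of the workflow described above: an automated classifier assigns each video a risk score, and anything over a threshold is queued for a human reviewer's final decision. The names (`Video`, `ReviewQueue`) and the threshold are invented for the example.

```python
from dataclasses import dataclass, field
from typing import Callable, List

RISK_THRESHOLD = 0.8  # placeholder value, not Google's

@dataclass
class Video:
    video_id: str
    risk_score: float          # assumed output of an upstream classifier
    removed: bool = False

@dataclass
class ReviewQueue:
    pending: List[Video] = field(default_factory=list)

    def scan(self, video: Video) -> None:
        """Machine step: flag potentially dangerous content for review."""
        if video.risk_score >= RISK_THRESHOLD:
            self.pending.append(video)

    def review(self, decide_removal: Callable[[Video], bool]) -> None:
        """Human step: a reviewer decides whether flagged content comes down."""
        for video in self.pending:
            video.removed = decide_removal(video)
        self.pending.clear()

if __name__ == "__main__":
    queue = ReviewQueue()
    for vid in [Video("a", 0.95), Video("b", 0.30), Video("c", 0.85)]:
        queue.scan(vid)
    # Toy reviewer policy: remove everything the classifier flagged.
    queue.review(lambda v: True)
```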

Technology firms, including Google and its video streaming site YouTube, have been accused of foot-dragging and failing to remove extremist material quickly enough.

Kent Walker, Google’s senior vice-president, has now announced a plan to tackle the problem. He admits the search engine had previously not done enough.

“There should be no place for terrorist content on our services. While we and others have worked for years to remove content that violates our policies, the uncomfortable truth is that we, as an industry, must acknowledge that more needs to be done,” he said.

As part of the new effort, Google will use new technology to help identify extremist videos.

It is also extending its funding of experts who decide whether material should be taken down from the web.

The firm has pledged a “tougher stance” on videos that do not directly violate its rules but contain, for example, inflammatory religious or supremacist content.

In future, these will appear with a warning and adverts will not run with them, meaning those who post them online will not make money from such content.

And YouTube will re-direct potential Islamic State recruits who search for extremist material to anti-terror videos aimed at stopping them from being radicalised.

Mr Walker said: “Collectively these changes will make a difference. And we’ll keep working on the problem until we get the balance right.

“Extremists and terrorists seek to attack and erode not just our security but also our values – the very things that make our societies free. We must not let them. Together, we can build lasting solutions that address threats to our security. We are committed to playing our part.”

Labour MP Yvette Cooper, who chaired the home affairs select committee in the last Parliament, welcomed Google’s announcement.

She said: “The committee recommended that they should be more proactive in searching for – and taking down – extremist content.

“News that Google will now proactively scan content is therefore welcome, though there is still more to do.

“Still today there is illegal content easily accessible on YouTube – including terrorist propaganda. Google cannot delay in implementing these new rules.

“As with any other business, social media companies have a responsibility to make sure their platforms are safe. These steps are the first in a series which need to be taken to ensure they are fulfilling their important obligations.”

Meanwhile, Foreign Secretary Boris Johnson has called on fellow EU ministers to apply joint pressure on technology firms to do more to tackle the problem of extremism online.

At a meeting in Luxembourg, Mr Johnson hopes that all 28 foreign ministers will agree to establish an industry-led forum on preventing radicalisation via the internet.

Even in the wake of the Manchester bomb attack last month, links to handbooks urging extremists to murder, and providing instructions for constructing home-made bombs, remain readily available.
