Facebook, YouTube, and Twitter could face fines if they fail to take down terrorist content within an hour
- The EU Commissioner for Security Julian King is drawing up legislation that would force tech companies to take down terrorist content or face fines.
- King told the Financial Times that the European Commission has not seen enough progress on the removal of terrorist material from tech platforms.
- A senior EU official said the draft legislation would likely impose a limit of one hour for platforms to delete terror-related material flagged by enforcement agencies.
The EU is planning to crack down on tech companies like Facebook, YouTube, and Twitter by imposing fines if they don't remove terrorist material from their platforms quickly enough.
The EU Commissioner for Security Julian King told the Financial Times that, in a draft regulation due to be published next month, the EU will take a harder line with tech companies.
Until now, the EU has adopted a policy of allowing tech companies to self-regulate, but King said the EU has "not seen enough progress" and is taking a stronger position "in order to better protect our citizens."
The exact details of the proposed regulation are still being thrashed out, but a senior EU official told the FT that tech companies would have a time limit of one hour to remove any material flagged as terrorist content by police or other relevant law enforcement agencies. If companies like Facebook, Google, and Twitter fail to do so, they could face fines.
"We cannot afford to relax or become complacent in the face of such a shadowy and destructive phenomenon," said King.
This would be the first time the EU has targeted tech companies' handling of illegal content with punitive measures, but the Commission has butted heads with big tech before.
It has demonstrated a willingness to punish Silicon Valley giants for wrongdoing, such as the record-breaking $5 billion fine levied on Google in July for abusing the dominance of its Android operating system.
King made it clear that the draft legislation would apply to all websites, large or small.
"The difference in size and resources means platforms have differing capabilities to act against terrorist content and their policies for doing so are not always transparent," he said. "All this leads to such content continuing to proliferate across the internet, reappearing once deleted and spreading from platform to platform."
Once published, the draft regulation would have to be approved by a majority of the EU's 28 member states. There is likely to be support for the plans, with British Prime Minister Theresa May previously warning tech firms to get their act together on terror content. Germany has already introduced fines of up to €50 million ($57 million) for firms that fail to remove hate speech.
Business Insider has contacted the European Commission, Facebook, Google, and Twitter for comment.