Friday 29 September 2017

EC makes idiotic plans for dangerous and unworkable mass online censorship

Copyright. Because - disappointingly, I couldn't find any good images for 'copyshite'
The European Commission has outlined plans for mandatory copyright filters on any site the public can post to. The filters will automatically decide whether user-posted content - a video, say - infringes copyright and will take that content down without input from any pesky, expensive humans.
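To make concrete what 'automatically decide' means in practice, here's a rough sketch - entirely my own hypothetical code, not anything the Commission or any platform has published - of the logic such a filter boils down to: fingerprint the upload, compare it against a database of works claimed by rights holders, and block anything above a similarity threshold. Spot what never enters the decision: quotation, parody, criticism, or any other kind of context.

# Hypothetical sketch of an upload filter's core logic. All names and the
# threshold are illustrative assumptions, not a real system's.

MATCH_THRESHOLD = 0.8  # arbitrary similarity cut-off


def fingerprint(media: bytes) -> set[str]:
    """Stand-in for a perceptual hash: overlapping byte chunks of the file."""
    return {media[i:i + 8].hex() for i in range(0, max(len(media) - 8, 0), 4)}


def similarity(a: set[str], b: set[str]) -> float:
    """Jaccard similarity between two fingerprints."""
    union = a | b
    return len(a & b) / len(union) if union else 0.0


def filter_upload(upload: bytes, claimed_works: list[set[str]]) -> bool:
    """Return True to block the upload.

    Note what is missing: whether the match is a quotation, a parody, a
    review, news reporting, or a public-domain work someone has falsely
    claimed. The filter only measures resemblance.
    """
    fp = fingerprint(upload)
    return any(similarity(fp, work) >= MATCH_THRESHOLD for work in claimed_works)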

This is the sort of thing Theresa May means when she talks about companies like Facebook using technology to take down 'extremist' content. But more on that later.

There are two problems with this idea. 

First, it is a clear violation of our human rights. These measures would constitute mass surveillance of all the content we post online and the automatic removal of anything a company or government doesn't want us to post. This should worry you. Governments and companies - however apparently benign - always want to censor complaints. There's never been such a thing as a benign government, and companies tend to get less benign in direct proportion to the power they gain over their customers. More worryingly still, governments are going to want access to the outputs of these filters. They're going to want to know who is posting what kind of content across multiple sites. They'll want to know who's 'making trouble' by criticising party lines, and what better way to do that than by recognising what materials they've referenced in their works? Companies will want access to the filters too, and the ones running the filters will be all too happy to sell that access. For a lot of companies, this kind of data would be pure gold: it would help them to better identify people as sales targets and to take down fair and protected criticism of their content.

Second, it won't work.  We know this because YouTube.  YouTube has spent years and several fortunes trying to solve this problem and the results are notoriously terrible.

Julia Reda is a German MEP and a member of the Pirate Party Germany. She wrote an article on (mostly) this second point: filters of this sort don't work, have never worked and likely never will work, even if we wanted them to - which we shouldn't.

Here are some highlights:
5. Memory holes in the Syrian Archive
Another kind of filter in use on YouTube, and endorsed by the Commission, is designed to remove “extremist material”. It tries to detect suspicious content like ISIS flags in videos. What it also found and removed, however: Tens of thousands of videos documenting atrocities in Syria – in effect, it silenced efforts to expose war crimes.
I'll also wave vaguely in the direction of my point above about companies ravenously buying up access to filters. What better way to identify and censor whistleblowers?
6. Political speech removed
My colleague Marietje Schaake uploaded footage of a debate on torture in the European Parliament to YouTube – only to have it taken down for supposed violation of their community guidelines, depriving citizens of the ability to watch their elected representatives debate policy. Google later blamed a malfunctioning spam filter.
It was sweet of them to blame an oaf, but really it was.... Ironically, I can't find the Simpsons clip I wanted to link to illustrate this. I wonder why.
7. Marginalised voices drowned out
Some kinds of filters are used not to take down content, but to classify whether it’s acceptable to advertisers and therefore eligible for monetisation, or suitable for minors.
Recently, queer YouTubers found their videos blocked from monetization or hidden in supposedly child-friendly ‘restricted mode’ en masse – suggesting that the filter has somehow arrived at the judgement that LGBT* topics are not worthy of being widely seen.
Read the full article. And share it. But I think it's useful to set out here the lessons Reda draws - lessons the EC has obviously not learned:
  • Lesson: That such a ridiculous error was made after years of investment into filtering technology shows: It’s extremely hard to get this technology right – if it is possible at all.
  • Lesson: Copyright exceptions and limitations are essential to ensure the human rights to freedom of expression and to take part in cultural life. They allow us to quote from works, to create parodies and to use copyrighted works in education. Filters can’t determine whether a use is covered by an exception – undermining our fundamental rights.
  • Lesson: Public domain content is at risk by filters designed only with copyrighted content in mind, and where humans are involved at no point in the process.
  • Lesson: Automatic filters give big business all the power. Individual users are considered guilty until proven innocent: While the takedown is automatic, the recovery of an illegitimate takedown requires a cumbersome fight.
  • Lesson: Filters can’t understand the context of a video sufficiently to determine whether it should be deleted.
  • Lesson: There are no safeguards in place to ensure that even the most obviously benign political speech isn’t caught up in automated filters.
  • Lesson: Already marginalised communities may be among the most vulnerable to have their expression filtered away.
  • Lesson: Legitimate, licensed uploads regularly get caught in filters.
  • Lesson: Not even those wielding the filters have their algorithms under control.
Remember: all these lessons are illustrated with examples in Julia Reda's article.

Remember: lawmakers are recommending forcing this kind of clusterfuck on millions of people without understanding that these filters don't work or caring that they violate our most basic human rights.

Remember: Julia Reda and others (the EFF, for example) are fighting this, and you can support them.
