• With Facebook becoming a key electoral battleground, researchers are studying how automated accounts are used to alter political debate online

    A Facebook Live broadcast hosted by ITV News had Theresa May answering questions sent in by users of the site. Photograph: Facebook/PA

    One of the most powerful players in the British election is also one of the most opaque. With just over two weeks to go until voters go to the polls, there are two things every election expert agrees on: what happens on social media, and Facebook in particular, will have an enormous effect on how the country votes; and no one has any clue how to measure what’s actually happening there.

    “Many of us wish we could study Facebook,” said Prof Philip Howard, of the Oxford Internet Institute at the University of Oxford, “but we can’t, because they really don’t share anything.” Howard is leading a team of researchers studying “computational propaganda” at the university, attempting to shine a light on the ways automated accounts are used to alter debate online.

    “I think that there have been several democratic exercises in the last year that have gone off the rails because of large amounts of misinformation in the public sphere,” Howard said. “Brexit and its outcome, and the Trump election and its outcome, are what I think of as ‘mistakes’, in that there were such significant amounts of misinformation out in the public sphere.

    “Not all of that comes from automation. It also comes from the news culture, bubbles of education, and people’s ability to do critical thinking when they read the news. But the proximate cause of misinformation is Facebook serving junk news to large numbers of users.”

    Emily Taylor, chief executive of Oxford Information Labs and editor of the Journal of Cyber Policy, agreed, calling Facebook’s effect on democratic society “insidious”. Taylor expressed similar reservations about “fake news” (a term Howard eschews due to its political connotations, preferring to describe such sources as “false”, “junk” or simply “bad”) being spread on social media, but she added there was a “deeper, scarier, more insidious problem: we now exist in these curated environments, where we never see anything outside our own bubble … and we don’t realise how curated they are.”


    A 2015 study suggested that more than 60% of Facebook users are entirely unaware of any curation on Facebook at all, believing instead that every single story from their friends and followed pages appears in their news feed.

    In reality, the vast majority of content any given user subscribes to will never appear in front of them. Instead, Facebook shows an algorithmic selection, based on a number of factors: most importantly whether anyone has paid Facebook to promote the post, but also how you have interacted with similar posts in the past (by liking, commenting or sharing them) and how much other people have done the same.
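    The selection process described above can be sketched as a toy scoring function. To be clear, this is an illustrative assumption, not Facebook’s actual ranking algorithm: the field names and weights are invented, and the real system weighs far more signals. It simply shows how paid promotion, a user’s own interaction history and overall engagement might combine so that only a fraction of subscribed content surfaces:

    ```python
    from dataclasses import dataclass

    @dataclass
    class Post:
        author: str
        is_promoted: bool       # has anyone paid to boost this post?
        user_affinity: float    # 0..1: how often this user liked/commented on/shared similar posts
        global_engagement: int  # likes + comments + shares from other users

    def score(post: Post) -> float:
        """Toy ranking score (illustrative weights only): promotion dominates,
        then the user's own interaction history, then everyone else's."""
        return (
            (100.0 if post.is_promoted else 0.0)
            + 10.0 * post.user_affinity
            + 0.01 * post.global_engagement
        )

    def build_feed(posts: list[Post], limit: int = 10) -> list[Post]:
        # Only the highest-scoring subset of everything the user subscribes
        # to is ever shown; the rest never appears in the feed.
        return sorted(posts, key=score, reverse=True)[:limit]
    ```

    Under weights like these, a post someone paid to promote outranks an organic post with thousands of engagements, which is one reason researchers want visibility into how the real selection is made.
    
    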

    It is that last point that has Taylor worried about automation on social media sites. Advertising is a black hole of its own, but at least it has to be vaguely open: all social media sites mark sponsored posts as such, and political parties are required to report advertising spend at a national and local level.

    No such regulation applies to automation. “You see a post with 25,000 retweets or shares that comes into your timeline,” Taylor said, “and you don’t know how many of them are human.” She sees the automation as part of a broad spectrum of social media optimisation techniques, which parties use to ensure that their message rides the wave of algorithmic curation on to as many timelines as possible. It is similar to search engine optimisation, the art of ensuring a particular web page shows up high on Google’s results pages, though much younger and less well documented.
