Facebook risks starting a war on knowledge


Facebook is closing its doors to researchers in the wake of the Cambridge Analytica scandal. The latest casualty is Netvizz, a research tool used by hundreds of academics to gather public Facebook data, which the social network has recently banned. The app has attracted more than 300 academic citations and has been used to produce studies on everything from Norwegian political party videos, to public opinion about the London 2012 Olympic Games, to Asian American student conferences. But now this fruitful source of data has been shut down.

More significantly, Facebook’s action sounds a death knell for civic access to public Facebook data. Inevitably, all apps like Netvizz will be wiped from the platform. The Cambridge Analytica scandal, which saw Facebook user data gathered supposedly for academic purposes but instead used by a private firm for political campaigning, created an opportunity for positive change. But Facebook sadly appears to be making its platform more opaque, unknowable and unaccountable to the public.

Once apps like Netvizz are gone, there will be no accessible way of gathering large amounts of public Facebook page data. Facebook offers only highly restrictive search options for normal users. It has started new initiatives to offer access to its data for scholarly research, but these are dependent on a “hand-picked” group of scholars who “define the research agenda”. Without broader access for other researchers, the social, academic and political consequences are dark.

Netvizz offers users the ability to extract basic data from public Facebook pages, such as the content and frequency of posts, likes, shares and comments. This can be used to analyse what users are discussing, how they feel about certain things, or how they respond to certain content. And this can feed into studies on a huge range of important topics, such as how fake news spreads or how social media can affect young people’s mental health.
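To illustrate the kind of analysis this enables, here is a minimal sketch in Python, assuming a Netvizz-style export of public page posts saved as a tab-separated file. The file name and column names are illustrative placeholders rather than the tool’s actual output schema.

```python
# A minimal sketch of analysing a Netvizz-style export of public page posts.
# Assumptions: a tab-separated file "page_posts.tab" with hypothetical columns
# "post_message", "post_published", "likes_count", "comments_count", "shares_count".
import pandas as pd

posts = pd.read_csv("page_posts.tab", sep="\t", parse_dates=["post_published"])

# Posting frequency: how many public posts the page published each month
posts_per_month = posts.set_index("post_published").resample("M").size()

# Engagement: average likes, comments and shares per post
engagement = posts[["likes_count", "comments_count", "shares_count"]].mean()

print(posts_per_month.tail())
print(engagement)
```

From counts like these, a researcher can chart how a page’s output and audience response change around events such as elections, without touching any individual user’s personal data.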

Netvizz is an app that runs within Facebook’s platform and uses the social network’s Graph API (application programming interface), a piece of software that provides access to data. Netvizz then organises this data into a spreadsheet format that can be easily read by anyone. Importantly, it doesn’t gather personal data on users. But Facebook’s API is becoming a closed system, meaning that this basic public data is becoming impossible to access, threatening our knowledge of the world.
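As a rough outline of what a tool like Netvizz does behind the scenes, the sketch below queries the Graph API for a public page’s posts and aggregate engagement counts. The page name, API version, access token and field list are assumptions based on how older versions of the Graph API worked; Facebook’s current rules may no longer permit requests like this at all.

```python
# A sketch of gathering public page posts via Facebook's Graph API.
# Assumptions: a valid access token, an API version and field list resembling
# those available in early 2018, and "labourparty" as an example public page.
import requests

ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"  # placeholder
PAGE = "labourparty"                # example public page
url = f"https://graph.facebook.com/v2.12/{PAGE}/posts"
params = {
    "fields": "message,created_time,shares,likes.summary(true),comments.summary(true)",
    "limit": 100,
    "access_token": ACCESS_TOKEN,
}

rows = []
while url:
    data = requests.get(url, params=params).json()
    for post in data.get("data", []):
        rows.append({
            "message": post.get("message", ""),
            "created_time": post.get("created_time"),
            "shares": post.get("shares", {}).get("count", 0),
            "likes": post.get("likes", {}).get("summary", {}).get("total_count", 0),
            "comments": post.get("comments", {}).get("summary", {}).get("total_count", 0),
        })
    # Follow pagination links until the archive ends or access is cut off
    url = data.get("paging", {}).get("next")
    params = {}  # the "next" link already carries the query string

# "rows" now holds only public, page-level data, ready to be written to a spreadsheet
```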

Without access to public data in this way, it will be a lot harder to spot patterns in what users are doing and saying on Facebook. In response, Netvizz’s creator, Bernhard Rieder of the University of Amsterdam, said: “academic research is set to be funnelled into new institutional forms that offer (Facebook) more control than API-based data access.” He added: “independent research of a 2+ billion user platform just got a lot harder.”

This isn’t just a headache for thousands of academics worldwide. Given the growing influence Facebook has over political debate and behavioural trends, it means that the public could be denied important information that is vital to protecting democracy, social relationships and even public health.

For example, my own research into British political parties’ campaigns on Facebook is set to become much more difficult. Without apps like Netvizz offering a gateway to extract public political content, messages sent to voters during elections will be too discreet to investigate. In this way, society’s capacity to question what political parties are doing is being curtailed by Facebook, undermining democratic accountability and our power to understand politics on social media.

User data can reveal things about political parties. https://www.facebook.com/labourparty/

The questionable use of Facebook data by academic researchers and political campaigners in the Cambridge Analytica scandal highlights the need for new privacy and security measures. But Facebook has already successfully altered API access over the last few years, preventing further personal data from being gathered in the manner of Cambridge Analytica, while allowing research with public data to continue.

Facebook had struck the perfect balance between privacy and access. But the company now appears to be building a wall around its data, not just to protect users but also to protect itself. And in doing so, Facebook is also protecting the powerful, curtailing our ability to scrutinise and question the influence of politicians, corporations and others with the money to spend on large advertising campaigns. By prioritising privacy over transparency, Facebook is in effect banning this kind of knowledge.

A legal framework is needed to guarantee Facebook users and researchers at least some access to API data for public pages, especially for those of national interest such as political parties, media organisations and government bodies. Facebook must go further than its current restrictive plans and open its data to help promote research and democratic accountability.

Several petitions have been started, including one I have launched, to encourage Facebook to do this. But a bigger “#openfacebook” campaign is needed that could work in conjunction with similar campaigns to make targeted advertising more transparent.

It’s still possible for Facebook to rethink its data policy in a way that respects individual privacy and limits the potential for data misuse, but also promotes transparency, accountability and independent research. If Facebook does not alter course, it will catastrophically undermine our ability not only to understand the social network machine and its millions of pages, but also the entire political and social order that the internet has created.

Facebook is restricting search results.


Last year, the capacities of modern technology allowed me to research my great-grandfather’s role in World War I. Sadly, my investigation reached an impasse. The records of his service were destroyed in September 1940, when a German incendiary bomb landed on the War Office’s Arnside Street depot in London. Now, 78 years on, I have been confronted with a new dead end in my current research into the use of Facebook by UK political parties: an impasse created not by an inferno, but fashioned by Facebook itself.

Facebook is deliberately limiting the number of archived public messages anyone can see, hiding many messages sent by all manner of Facebook pages. This is important, as many of these pages have public interest roles, including political party and media pages. Accounts such as the Conservative Party, Jeremy Corbyn and the BBC News page send thousands of messages to voters every year – pages that are open to view without even requiring Facebook membership. However, people wishing to see the archive of these pages will find themselves facing severe limitations.

Facebook offers search functions to view historical page posts; however, access to millions of posts has been restricted. For example, if a voter wishes to find all the public posts made by the Labour Party in 2015, they can’t. Using the search function by year, one can only view the 600 “top posts” by the page in the given year – less than half the posts published by Labour that year. Searching by month does not improve the situation. The “recent posts” and “top posts” by month functions, which allow users to see historical posts, also return similarly limited results.

Facebook’s search function provides only limited results for users: both ‘top posts’ and ‘recent posts’ offer incomplete data. http://www.facebook.com

This problem is the same for other highly active pages on Facebook, from what Donald Trump posted during the 2016 presidential campaign, to what news sites such as the Daily Mirror published during the 2017 Westminster attack, to what the Green Party posted in 2015. This constitutes a massive policy of deliberate data occlusion.

Voters can’t see what they have been sent publicly by parties and politicians in the past. Thousands of public posts are now impossible to view without the associated URL addresses of the messages, a policy tantamount to deletion. The only people who have these URLs are researchers like me who have extracted post data from Facebook near the time of posting. There is no other way to find posts’ URLs.
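For readers who already hold archived identifiers, the mapping from a Graph API post ID to a viewable URL has historically been straightforward, as the sketch below illustrates. The “{page_id}_{post_id}” ID format and the resulting URL pattern are assumptions drawn from how the Graph API has labelled page posts in the past; they are not guaranteed to hold for every post type.

```python
# A sketch of reconstructing a public post's URL from an archived Graph API ID.
# Assumption: the ID follows the historical "{page_id}_{post_id}" format, which
# maps onto facebook.com/{page_id}/posts/{post_id}.
def post_url(graph_post_id: str) -> str:
    page_id, _, post_id = graph_post_id.partition("_")
    return f"https://www.facebook.com/{page_id}/posts/{post_id}"

# Example with a made-up identifier
print(post_url("10123456789_987654321"))
```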

The messages being hidden are not targeted advertisements and do not include personal data. The public has a right to see these messages. But Facebook now operates a policy of restricting its 2.13 billion users’ access to millions of public posts. I am sure that the Labour Party, the BBC, the Conservative Party and thousands of others would prefer their information to be accessible. However, it appears that Facebook has total control over the visibility of its content creators’ outputs.

Why it matters

Given the exponential growth in voters receiving political information via social media, Facebook is now integral to modern campaigns. Facebook has not made it clear why the viewing of archived posts is restricted. Perhaps it takes the view that less is more, but for public content this is short-sighted.

The messages that have been hidden contain news stories and arguments made to persuade people to vote. They are full of information that shaped voters’ opinions and of policy promises. How can people examine or question posts that are “fake news”, or judge whether a party has fulfilled the promises it made in the past, if those posts are hidden from view?

The solution for Facebook is simple. It needs to change its application programming interface (API) to allow the extraction of all public post data, especially from political and news pages. It must also open its search function to cast a wider net, allowing people to see all the public content they were sent.

Facebook’s mission statement is to “give people the power to build community and bring the world closer together”. Facebook must therefore take up the challenge of becoming more transparent, not less so. It should help to develop core values at the heart of democracy, including information access and democratic accountability, rather than hinder them.