New Publications

It is hopeless trying to keep up with all the publications on 'fake news', conspiracy theory, post-truth, Big Data / artificial intelligence, cybercrime and information warfare that are appearing — well over 500 in the last 4 years, and that's just in English! I have reproduced below the book / report covers of many of these publications, roughly sorted under broad headings — clearly some titles could be listed under a number of categories...
I have made no attempt to evaluate or sort the titles, but have provided brief notes on half a dozen of the key ones. Please note that a number of the publications identified are available online (for free). The book covers are shown roughly in the order in which I became aware of them (i.e. newest additions at the top of each section). Do please let me know if I've missed any important recent publications, or if you think any of the items selected are inappropriate, for whatever reason. Many thanks!


Ten Starter Texts
Here are ten publications that should give you a good introduction to the field of 'fake news' and disinformation, and the rise of populism and the crisis of trust in democracy. You can find the details on Google or Amazon. If you would like to suggest other titles that you think should be included, please also indicate which publication(s) you think should be removed. 
1    Democracy & Public Trust
2    Big Data & Artificial Intelligence
3    The Internet & Surveillance Capitalism
4    Social Media & Media Literacy
5    The Media & Media Bias
6    Fake News & Post-Truth
7    Conspiracy Theories & Denialism
8    The Dark Web & Cybercrime
9    Information Warfare / National Cybersecurity
“A new type of war has emerged, in which armed warfare has given up its decisive place in the achievement of the military and political objectives of war to another kind of warfare - information warfare.”       V. Kvachkov [Russia’s Special Purpose Forces]

10   Fighting Fake [1]
A   Analysis
B   Action
C   Analysis & Action
Brief Notes on Some of the Key Publications
Here are brief notes on some important recent publications, which together provide a clear description of the problems created by big data / social media / malign state actors, and outline different ways in which these might be addressed. [The key recommendations made by some of these reports are also covered on the Recommendations Page.]
a)   Tackling the Information Crisis: A policy framework for media system resilience
The LSE Truth, Trust & Technology Commission has spent the last year addressing questions such as 'How should we reduce the amount of misinformation?’ 'How can we protect democracy from digital damage?’ and 'How can we help people make the most of the extraordinary opportunities of the Internet while avoiding the harm it can cause?’ Its report [published on 20 Nov 18] makes an important contribution to the debate.
It recommends the formation of an Independent Platform Agency (IPA), a watchdog which would evaluate the effectiveness of platform self-regulation and the development of quality journalism, reporting to Parliament and offering policy advice. It says the IPA should be funded by a new levy on UK social media and search advertising revenue. The Agency should be a permanent forum for monitoring and reviewing the behaviour of online platforms, and should provide annual reviews of ‘the state of disinformation’. In addition to its key recommendation, the Commission also proposes a new programme of media literacy and a statutory code for political advertising. Watch a short video about the information crisis and the work of the Commission [3:29 min], and read the full report here.
b)   The Case #ForTheWeb
"Open a newspaper, turn on the television or scroll through your Twitter feed, and you’re likely to see a story about how the World Wide Web is under threat. We’ve lost control of our personal data and that data is being weaponised against us. The power to access news and information from around the globe is being manipulated by malicious actors. Online harassment is rampant, and governments are increasingly censoring information online — or shutting down the internet altogether."

This report, unveiled by Sir Tim Berners-Lee [on 5 Nov 18], outlines a set of principles "to protect and enhance the web’s future, as well as craft a collective contract for May 2019."
Together with his organization, the Web Foundation, Sir Tim wants the World Wide Web to be 'accessible and affordable for everyone,' 'safe and welcoming for everyone,' and 'empowering for everyone.' Read the report here.
c)   Freedom on the Net 2018
Fake news, data collection, and the challenge to democracy

Governments around the world are tightening control over citizens’ data and using claims of 'fake news' to suppress dissent, eroding trust in the internet as well as the foundations of democracy, according to 'Freedom on the Net 2018' [published early Nov 18]. Out of the 65 countries assessed in Freedom on the Net, 26 experienced a deterioration in internet freedom. Almost half of all declines were related to elections. At the same time, the regime in China has become more brazen in providing like-minded governments with technology and training that enable them to control their own citizens.
“Democracies are struggling in the digital age, while China is exporting its model of censorship and surveillance to control information both inside and outside its borders,” said Michael J. Abramowitz, President of Freedom House. Read the full report here.
d)   Technology as Enabler of Fake News and a Potential Tool to Combat it
"On the Internet, every day is April Fool's day."

This quote is from an impressive report, published in May 18 by the European Parliament's Directorate-General for Internal Policies, which investigates the role of technology in the circulation of so-called 'fake news'. The report argues that whilst technology is a major tool for the dissemination of mis/disinformation, it also offers methods to analyse and reduce its impact.
The report stresses the importance of being vigilant, to avoid falling prey to fake news. As the author, Žiga Turk, writes: "Citizens need to become aware that the internet is a different media environment than TV and newspapers. There are no editors and no gatekeepers."

After having tried other approaches, such as partnering with fact-checking organizations, big tech companies like Facebook and Google now seem particularly keen on tackling the fake news problem with technical solutions, such as scoring web pages or using artificial intelligence to detect false stories. They and others are also working on improving the media literacy of users. Read the full report here.
e)   Kremlin disinformation campaigns
LSE's recommendations to counter Russian computational propaganda in the UK. "Kremlin digital influence engineering involves the creation and dissemination of false narratives as well as technological manipulation such as the use of fake or automated social media accounts to distort public perceptions. The aim is to promote the Kremlin’s foreign-policy agenda, boost its local proxies, erode trust in democratic institutions, increase polarisation and spread confusion during crises.
Examples in Britain include a campaign to cast doubt on the integrity of the vote count following the Scottish Independence Referendum in 2014; the amplification of ethnic and religious hatred following terrorist attacks in Britain in 2017; and the undermining of public confidence in the British government’s explanation of the poisoning of Sergei Skripal in 2018.

Evidence-based analysis in this area is at an early stage. Challenges include identifying unattributed activity as emanating from or generated by Russia, as well as measuring the scale and impact of activity. Measuring these key indicators would be significantly easier given more cooperation with the tech companies. Kremlin digital disinformation and computational propaganda have been enabled and facilitated, albeit unwittingly, by the nature and policies of social media platforms. These hinder analysis and countermeasures.
Countering Russian digital influence engineering also requires dealing with urgent issues of privacy, online identity, data usage and wider digital rights. More broadly, digital disinformation and computational propaganda is just part of a much bigger 'full-spectrum warfare' arsenal, which includes the use of money, cyber-attacks, military (kinetic) intimidation and abuse of the legal system. Though these tactics fall outside the scope of this paper, responding to them will require an unprecedented whole-of-government response."  Read the full report here.
f)   Disinformation & 'Fake News' [Final Report of the HoC DCMS Select Committee]
"This is the Final Report in an inquiry on disinformation that has spanned over 18 months, covering individuals’ rights over their privacy, how their political choices might be affected and influenced by online information, and interference in political elections both in this country and across the world — carried out by malign forces intent on causing disruption and confusion..."
"In a democracy, we need to experience a plurality of voices and, critically, to have the skills, experience and knowledge to gauge the veracity of those voices. While the Internet has brought many freedoms across the world and an unprecedented ability to communicate, it also carries the insidious ability to distort, to mislead and to produce hatred and instability. It functions on a scale and at a speed that is unprecedented in human history. One of the witnesses at our inquiry, Tristan Harris, from the US-based Center for Humane Technology, describes the current use of technology as “hijacking our minds and society”. We must use technology, instead, to free our minds and use regulation to restore democratic accountability. We must make sure that people stay in charge of the machines. Read the Report."
g)   The Age of Surveillance Capitalism — Shoshana Zuboff (2019) [2]
Central Argument: Surveillance capitalists are after our identities, our personalities, and our emotions. Once surveillance capitalists are able to understand who we are, they try to modify our behaviour. This means that we are no longer free. The easier our behaviour is to predict, the more valuable our data is to them.
Behavioural Surplus:  Surveillance capitalists want to gather behavioural surplus. Some of the data generated by our actions on a platform is used to improve the platform for our benefit, but the surplus is sold to other companies. Behavioural surplus can be called ‘surveillance assets’; this can be turned into ‘surveillance revenues’ and translates into ‘surveillance capital’. This means that serving the user themselves is less valuable to a platform than others’ bets on our future. Users are the objects from which raw materials are extracted and expropriated for Google’s prediction factories.
Monopolies:  Google has built fortifications around its supply chains in order to protect surplus flows from challenge. For example, it built extensive relationships with Obama’s government to help with his re-election. This meant that regulation was delayed. Other companies saw Google’s precedent and made their own forays into surveillance capitalism; Verizon is used as an example. Surveillance capitalism is making its way into physical spaces, too, via ‘smart cities’. Google has invested in smart-city companies like Sidewalk Labs. Cities are also required to invest in these technologies by ensuring their infrastructure aligns with the software. This diverts funds from other causes, like low-cost public bus services. We shouldn’t be asking ‘who owns the data?’ but rather ‘why is our experience rendered as behavioural data in the first place?’ It is also very difficult to avoid: if we turn off behavioural surplus, we often lose functionality, and it is becoming harder to avoid IoT (Internet of Things) products, especially given that many of us have mobile phones.
Freedom and Surveillance:  Once surveillance capitalists understand who we are, they can try to modify our behaviour, and this means that we are no longer free.

Freedom is defined as follows: ‘I live in an expansive landscape that already includes a future that only I can imagine and intend. In my world, this book I write already exists. In fulfilling my promise, I make it manifest. This act of will is my claim to the future tense’ (p. 330). When we make promises, we are trying to bridge the gap between the known and the unknown. It is an important part of human interaction. But now, in an era of surveillance capitalism, this will has been usurped by surveillance capital’s exclusive claims on our futures. The purpose of surveillance capitalism is to fabricate predictions, which become more valuable as they approach certainty. This is a new type of market power. Zuboff calls it instrumentarianism, ‘defined as the instrumentation and instrumentalization of behaviour for the purposes of modification, prediction, monetisation and control’ (p. 357). It is contrasted with totalitarianism.

The Big Other and Instrumentarianism:  This power imposes its will through digital apparatus that Zuboff calls the ‘Big Other’. This digital apparatus reduces human experiences to measurable, observable behaviour while remaining steadfastly indifferent to the meaning of that experience. These methods reduce individuals to the lowest common denominator of sameness, an organism among organisms, despite all the vital ways in which we are not the same. We are just reduced to organisms that behave. ‘Surveillance capitalism departs from the history of market capitalism in three startling ways. First, it insists on the privilege of unfettered freedom and knowledge. Second, it abandons long-standing organic reciprocities with people. Third, the spectre of life in the hive betrays a collectivist societal vision sustained by radical indifference and its material expression in Big Other’ (p. 495).
Notes
1    I've recently separated the Fighting Fake entries into two categories: A) Analysis and B) Action. Earlier books / reports are classified together (under heading 'C').

2   These notes are from ‘Surveillance and Privacy’ by Alice Thwaite [Sept 2019].

