Democracy Disrupted: Data Privacy, Social Media & Election Interference


On 5 March 2019, our Data & Privacy Director, John Tsopanis, spoke at the Data Protection Forum event in London. His talk, ‘Democracy Disrupted: Data Privacy, Social Media, and Election Interference’, is presented here in article form.


When discussing social media, it’s important to understand that it is a visual medium: one with the power to evoke strong emotions in individuals, in groups, in tens of millions of people whose relationship with and opinion of the world is formed by the content they consume. So when we talk about the scale of political disinformation campaigns, we are attempting the impossible: trying to articulate the psychological impact that billions of messages have on tens of millions of individuals. The scale of influence is critical; according to data from Nielsen, Americans spend an average of 10 hours and 39 minutes a day consuming media across their devices, five of those hours on mobile. What we see on our screens is now the overwhelming driver of political opinion and consensus.

UK Parliament DCMS Fake News Report

UK Parliament’s DCMS report into fake news, disinformation and interference in the Brexit referendum concludes that data privacy rights were violated by Facebook and Cambridge Analytica during the referendum, and that tens of millions of people were microtargeted with political disinformation as a result. The DCMS concludes that the institutions designed to protect us from this type of abuse are neither fit for purpose nor appropriately funded. It has called for urgent action to safeguard our democracy from microtargeted political disinformation campaigns, funded by countries such as Russia, that aim to fracture the British political consensus into gridlock, and are succeeding.

The DCMS acknowledges that the GDPR has been a necessary first step in establishing privacy rights for British citizens, but that more protections are needed to safeguard citizens’ online safety, given the privacy violations that have already occurred.

The DCMS report summarises as follows:

“We have always experienced propaganda and politically-aligned bias, which purports to be news, but this activity has taken on new forms and has been hugely magnified by information technology and the ubiquity of social media. In this environment, people are able to accept and give credence to information that reinforces their views, no matter how distorted or inaccurate, while dismissing content with which they do not agree as ‘fake news’. This has a polarising effect and reduces the common ground on which reasoned debate, based on objective facts, can take place. Much has been said about the coarsening of public debate, but when these factors are brought to bear directly in election campaigns then the very fabric of our democracy is threatened.

This situation is unlikely to change. What does need to change is the enforcement of greater transparency in the digital sphere, to ensure that we know the source of what we are reading, who has paid for it and why the information has been sent to us. We need to understand how the big tech companies work and what happens to our data.

In a democracy, we need to experience a plurality of voices and, critically, to have the skills, experience and knowledge to gauge the veracity of those voices. While the Internet has brought many freedoms across the world and an unprecedented ability to communicate, it also carries the insidious ability to distort, to mislead and to produce hatred and instability. It functions on a scale and at a speed that is unprecedented in human history. One of the witnesses at our inquiry, Tristan Harris, from the US-based Center for Humane Technology, describes the current use of technology as “hijacking our minds and society”. We must use technology, instead, to free our minds and use regulation to restore democratic accountability. We must make sure that people stay in charge of the machines.”

Data Privacy and British Democracy

The problem British democracy faces has two core components:

The first is the need to safeguard personal privacy and restrict the ability for personal data to be harvested, profiled and leveraged at scale by unknown actors. The GDPR has given individuals the rights of access and erasure, which offer a solution for the individual; but if the organisations conducting the microtargeting are unknown and/or criminal, it is very difficult for the individual to exercise these rights. What is needed is greater capacity for enforcement.

The suggested solution from the DCMS is to impose a 2% levy on big data and social media companies and ring-fence it to fund the ICO’s enforcement work. This would extend the powers offered under the GDPR, enabling the ICO to identify, investigate and take down dark data and disinformation operations at scale. It is the international scale of the operations working against British democracy, through the vehicle of unregulated social media, that has overwhelmed our current domestic regulatory bodies and our politics. An urgent boost to the resources of the regulators is therefore needed to tackle the problem at source.

The second problem is tackling disinformation. The DCMS has called for the following:

“There is now an urgent need to establish independent regulation. We believe that a compulsory Code of Ethics should be established, overseen by an independent regulator, setting out what constitutes harmful content. The independent regulator would have statutory powers to monitor relevant tech companies; this would create a regulatory system for online content that is as effective as that for offline content industries.

As we said in our Interim Report, such a Code of Ethics should be similar to the Broadcasting Code issued by Ofcom—which is based on the guidelines established in section 319 of the 2003 Communications Act. The Code of Ethics should be developed by technical experts and overseen by the independent regulator, in order to set down in writing what is and is not acceptable on social media. This should include harmful and illegal content that has been referred to the companies for removal by their users, or that should have been easy for tech companies themselves to identify.

The process should establish clear, legal liability for tech companies to act against agreed harmful and illegal content on their platform and such companies should have relevant systems in place to highlight and remove ‘types of harm’ and to ensure that cyber security structures are in place. If tech companies (including technical engineers involved in creating the software for the companies) are found to have failed to meet their obligations under such a Code, and not acted against the distribution of harmful and illegal content, the independent regulator should have the ability to launch legal proceedings against them, with the prospect of large fines being administered as the penalty for non-compliance with the Code.”

The scale of disinformation on social media platforms is currently the largest threat to British democracy. It is one that data privacy professionals have yet to truly understand, primarily because the 20% professional class are rarely the targets of microtargeted disinformation campaigns, given their inferred socioeconomic status. This perfect storm means that our privacy legislation now lags significantly behind the technology it needs to regulate, and an overcorrection is needed to set the course right.

Cambridge Analytica, Disinformation and Brexit

Cambridge Analytica was responsible for delivering the social media campaigns for both the Trump presidential campaign and Leave.EU’s Brexit campaign.

‘Today, in the United States, we have close to 4000 to 5000 data points on every individual. So we model every personality across the United States, some 230 million people’ – Alexander Nix, CEO of Cambridge Analytica, October 2016

See 6:40-11:07 of Channel 4’s undercover reporting for Cambridge Analytica’s political disinformation tactics.

The integrity of the information supply is the cornerstone of a free and functioning democracy

“A democracy needs good quality information, and fair distribution of that information in order to articulate, aggregate, and defend its own national interests. Without it, democracy falls.” said Professor AC Grayling, moral and political philosopher, and author of over 30 books on ethics, philosophy and the history of human rights. He went on to say:

“In a mature democracy, citizens must be free to choose the information they consume, and to be able to easily identify and trust the source of that information at the point of consumption. The ability for citizens to do this, to opt out of illicit messaging from untrusted sources, is what we might consider exercising our right to privacy. Without these freedoms, we cannot meaningfully escape unwanted influence, and in a truly Orwellian sense, our vulnerability to psychological manipulation by unknown individuals and organisations makes us all less free”.

Foreign Interference in Brexit

The DCMS, along with tackling data privacy violations and disinformation, has also called for an urgent investigation into Russian interference in Brexit. The aim is to establish the source of Arron Banks’ £9m donation to the Leave.EU campaign, the largest donation in British political history, which remains unclear.

What is clear is that the disinformation networks that operated during the Brexit referendum are still active and more effective than ever. The prevalence of known Kremlin Twitter and Facebook accounts amplifying pro-Brexit politicians (e.g. Conservative members of the European Research Group, or ‘ERG’) and pro-Brexit social media pages such as Leave.EU and Westmonster is deep cause for concern for British citizens. Leave.EU alone generated 661 million impressions on Facebook and 221 million impressions on Twitter in 2018.

The full nature of this relationship must be investigated by an independent counsel, similar to the USA’s Mueller inquiry into the Trump Organisation’s ties with Russia, and revealed to the public as a top priority.


Britain needs to take back control of its politics, and to do so it needs to take back control of its data: give the necessary regulatory bodies the investigative and enforcement powers to conduct investigations at scale, and create new institutions fit for holding social media companies accountable for the disinformation campaigns run through their platforms.

Trump, Brexit, Cambridge Analytica – Global Data Privacy Regulations

Privacy legislation advanced by leaps and bounds in 2018, with Europe (GDPR), California (CCPA) and India (PDPB) leading the way in privacy protection for their citizens.

For many organisations, 2018 was the year that ‘data privacy’ became the two most cumbersome words in the professional lexicon. To comply with the new legislation, organisations assessed their data practices and their ability to protect citizens’ privacy rights. With GDPR fines of up to €20m or 4% of global turnover, whichever is higher, 2018 was the year that businesses started taking data privacy seriously.
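The ‘whichever is higher’ rule for the GDPR’s upper tier of fines (Article 83(5)) can be sketched as a one-line formula; amounts below are illustrative, not real cases:

```python
def gdpr_max_fine(annual_global_turnover_eur: float) -> float:
    """Upper tier of GDPR administrative fines (Article 83(5)):
    the greater of EUR 20m or 4% of total worldwide annual turnover."""
    return max(20_000_000.0, 0.04 * annual_global_turnover_eur)

# At EUR 300m turnover, 4% is only EUR 12m, so the EUR 20m floor applies;
# at EUR 2bn turnover, the 4% figure dominates.
print(gdpr_max_fine(300_000_000))    # 20000000.0
print(gdpr_max_fine(2_000_000_000))  # 80000000.0
```

The fixed floor means even small organisations face a maximum penalty of €20m; the percentage cap only begins to matter above €500m of turnover.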

2018 Key Privacy Events

Europe and the GDPR – May 2018

Europe implemented the GDPR in May 2018, granting European residents the right to access and erase their personal information upon request, and mandating that organisations report security breaches to affected citizens.

In the UK, reporting of data breaches to the Information Commissioner’s Office (ICO) increased by 260% in the three months after May 2018 compared to the same three months in 2017; a remarkable cultural change in identifying and reporting data breaches.

The ICO also took its first enforcement action under the GDPR against AIQ, the Canadian data firm linked to Cambridge Analytica, before acting against Cambridge Analytica itself for failing to comply with a subject access request (SAR) from Professor David Carroll.

Key Privacy Trigger:

Cambridge Analytica, Brexit and Trump – 87 million US and UK citizens were psychologically profiled and microtargeted with political messaging and misinformation to influence the Brexit and Trump votes. There are 11 ongoing criminal inquiries into breaches of electoral law in the UK, and illegal data practices are the cornerstone of those investigations. These investigations will escalate and conclude in 2019, heightening citizens’ understanding of how their privacy rights were abused.

USA and the California Consumer Privacy Act (CCPA) – July 2018

California announced the incoming CCPA, which comes into effect on January 1st 2020. The CCPA provides rights of access and erasure similar to the GDPR’s, and also requires organisations to disclose, upon request, the third parties from which they buy and to which they sell personal data.

The CCPA has led New York to follow suit with data privacy regulation of its own, and there is talk of a federal privacy law being developed in 2019, as a state-by-state patchwork of data privacy laws seems too impractical to sustain. This point was made clear after the two largest American data breaches of 2018 affected Americans across all 50 states:

  • Exactis – 340 million records breached
  • Marriott Hotels – 323 million records breached

Key Privacy Trigger: California Consumer Privacy Act and the right for Americans to sue

The CCPA provides California residents with a private right of action, allowing individuals to pursue their own lawsuits against organisations rather than waiting for regulatory enforcement. Individuals can exercise this right when a breach occurs due to a demonstrable lack of appropriate security controls.

In the USA, a litigious society, we can expect the individual right to sue to drive interest in data privacy rights more quickly than in the build-up to the GDPR, which will in turn lead to calls for those same rights at the federal level.

India and the Personal Data Protection Bill (PDPB) – September 2018

Six months after India’s national identity system was breached, exposing the data of 1.1 billion Indians, India announced its Personal Data Protection Bill. Openly modelled on the GDPR, the PDPB gives Indian citizens rights of access and erasure, and the right to report breaches to a new Indian data protection authority (DPA) that will also have the power to influence rulemaking (unlike the ICO in the UK) and levy hefty fines.

The PDPB will also include sectoral considerations similar to the CCPA, and provisions for national security concerns similar to the Chinese data protection regulations (CDPR).

Key Privacy Trigger – Aadhaar Data Breach

In March 2018, a breach of India’s national identity database, Aadhaar, left the personal and biometric information of 1.1 billion Indians exposed. The data was detailed enough to open bank accounts, enrol in state financial programmes and register SIM cards, sparking a nationwide debate on data privacy and national security, and prompting the announcement of the PDPB within six months.

What to Look For in 2019

  1. Public outrage at AI’s abilities to psychologically profile and microtarget citizens in real time

The investigations into AIQ/SCL/Cambridge Analytica’s role in both the Brexit and Trump campaigns will escalate through 2019. As indictments are served in relation to data crimes, the public will develop an understanding of how AI algorithms psychologically profile and microtarget them in real time.

The use of these data practices by authoritarian regimes to suppress opposition via social media platforms will come under specific scrutiny. This will strengthen the political movements calling for AI transparency and major regulatory reform of big tech and microtargeting data practices.

  2. Big Tech vs Regulators battle it out over US federal privacy law

The fight over the details of the CCPA is ongoing, and we can expect the lobbyists of Google, Amazon, Facebook and Apple to continue actively resisting tighter regulation at every opportunity. We can expect pushback on citizens’ rights to access their data, a renewed conversation about consent for data usage, and attempts by journalists to reveal the network of third-party data analytics firms that would be the worst violators of the new data privacy laws.

  3. The first £100m GDPR fine?

It is difficult to grasp the privacy impact of a data breach, especially when the number of citizens affected runs into the hundreds of millions. These numbers are too large for individuals to comprehend, but the privacy impacts will be accounted for by regulators in the form of mega-fines in 2019.

The maximum fine for Facebook under the GDPR is approximately $1.6bn, and with investigators across the world scrutinising the data practices of multiple technology companies, 2019 could be the year of the first truly eye-watering fine.

4 Questions, All The Answers. What You Need to Know About GDPR

GDPR seems to be on everyone’s lips at the moment. While the regulation doesn’t come into force until May 2018, many organisations have already begun preparing. For some, however, GDPR still raises a number of questions.

We asked the former Head of Fraud, Risk and Security for Vodafone UK and now Exonar’s Chief Operating Officer, Julie Evans, what GDPR means for Exonar, what we will be doing about it and what the potential implications for other UK businesses are.

What does GDPR Mean to Us and Our Clients?

GDPR significantly increases the level of proactive management required for Personally Identifiable Information (PII), raising the requirements on any organisation that handles the personal information of EU citizens, whether customers or employees. No-one is yet clear on what the post-Brexit world of GDPR will look like in the UK, but it will still affect most UK organisations.

The UK’s exit from the EU will not be complete before GDPR is implemented; there will be a significant period of overlap following the triggering of Article 50. Even after Brexit, there is a strong possibility that similar regulations will be sought by the ICO and demanded by international companies, which will look for ‘adequacy’ in UK law to ensure the UK can compete and operate seamlessly across Europe and the world. Further, GDPR requires adequate privacy protection in states outside the EU if EU companies are to store their data there. In all, it seems nearly inconceivable that the privacy of personal information will not be a significant factor in the coming years.

As well as increasing privacy requirements, GDPR introduces significant penalties for non-compliance and broadens the scope of what is considered PII. Although somewhat lacking in absolute clarity, the Regulation defines PII as information that enables the identification of a person.

What does GDPR mean for Exonar?

As a relatively new company, Exonar is not burdened by a legacy of old IT infrastructure, although we must still ensure that the way we hold data is compliant with GDPR. For us, this is primarily employee and shareholder data. In common with most organisations, the first task is to find the data and create a register of it. Even a relatively small organisation like Exonar uses multiple platforms to store information: documents, spreadsheets, PDFs and presentations, located across file shares, email and cloud drives. It’s not an insignificant task; however, we do at least have our own Exonar software at our fingertips to map where this information is stored.
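The first step of that register-building exercise can be illustrated with a minimal sketch. This is not Exonar’s product; it is a hypothetical example that walks a set of file shares, flags plain-text files containing email addresses (a crude proxy for PII), and writes the findings to a CSV register. Real discovery tools parse many formats (PDF, Office documents, mailboxes) and detect far more data types:

```python
import csv
import os
import re

# Simple pattern for email addresses, used here as a stand-in for PII detection.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

def build_register(roots, out_path="pii_register.csv"):
    """Walk each root directory, record files containing email addresses,
    and write a simple data register as CSV. Returns the register rows."""
    rows = []
    for root in roots:
        for dirpath, _dirnames, filenames in os.walk(root):
            for name in filenames:
                path = os.path.join(dirpath, name)
                try:
                    with open(path, encoding="utf-8", errors="ignore") as fh:
                        text = fh.read()
                except OSError:
                    continue  # unreadable file: skip it
                emails = set(EMAIL_RE.findall(text))
                if emails:
                    rows.append({"path": path, "email_count": len(emails)})
    with open(out_path, "w", newline="") as fh:
        writer = csv.DictWriter(fh, fieldnames=["path", "email_count"])
        writer.writeheader()
        writer.writerows(rows)
    return rows
```

Even this toy version shows why discovery is the necessary first phase: you cannot protect, report on, or erase personal data until you know which files hold it and where they live.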

As well as identifying where all of our PII is, we’ll also need to designate a Data Protection Officer (DPO): an individual directly tasked with identifying and protecting individuals’ information within our organisation. It does not need to be a full-time role, but there must be clarity of accountability, and we are re-apportioning job roles to accommodate this requirement.

How can We and Other Organisations get Ready for GDPR?

Understanding the key changes proposed by GDPR is the first step towards compliance. The table below (courtesy of consulting firm EY) highlights the key areas that need addressing:

Depending on the level of organisational maturity, the new regulations could demand changes to resourcing, training, process definition and applications, as well as to how data is handled. The requirements could be significant.

How Is Exonar Going About GDPR Compliance?

I am confident that the leadership team of any organisation would tell you they would love insight into the customer journey from the customer’s perspective. GDPR is, for us, a fantastic opportunity to use our own product and experience the output. We set up the ‘discover’ phase of the Exonar journey to crawl all of our data stores; given that we only hold a couple of terabytes of data, we achieved this in our first afternoon.

Our next phase is to ‘understand’ what we ‘discovered’: determining what PII is where, who put it there and why. We’re able to do this through our software’s querying function, its ‘Find More Like This’ capability for identifying all data relevant to a topic, and the results graphs and charts that show what information we have, what format it’s in and in which application or filestore it sits.
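Exonar’s actual implementation is proprietary, but the idea behind a ‘more like this’ query can be illustrated with a naive word-overlap (Jaccard) similarity ranking. Everything below is a hypothetical sketch of the concept, not the product’s algorithm:

```python
def jaccard(a: str, b: str) -> float:
    """Similarity between two documents, measured as the overlap
    of their word sets (intersection over union)."""
    sa, sb = set(a.lower().split()), set(b.lower().split())
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

def more_like_this(seed, corpus, top_n=3):
    """Rank corpus documents by similarity to a seed document and
    return the top_n closest matches."""
    ranked = sorted(corpus, key=lambda doc: jaccard(seed, doc), reverse=True)
    return ranked[:top_n]
```

Given a seed snippet such as "data subject access request", documents sharing most of those words rank ahead of unrelated ones, which is the behaviour a topic-based discovery query relies on (production systems use much richer representations than raw word sets).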

Now that I know what we’ve got, I can act on it. Our next phase in GDPR readiness is to review our policies and processes, as well as our use of applications, and communicate our recommendations clearly to the whole team. It takes time, so it’s perhaps a good thing that we are not leaving GDPR compliance until the last minute…