New Features Announcement – Topic Extraction

Topic Extraction is another new and exciting feature now available in our ever-growing data discovery platform. Users of this new feature will benefit from understanding the topics that summarise their data: search data by topic, and create topic word clouds to visualise and filter it. Watch our demo video to see these new features in action.


Book a demo or a free trial to learn more about how the Exonar platform can transform your data.

New Features Announcement – Enhanced Search

Search just got better! With our new Enhanced Search feature, users can now benefit from simple search, phrase search, proximity search, fuzzy search, must/not include, and so much more, making it easier to find the data you need swiftly, simply and at scale. Watch our demo video to see these new features in action.


Book a demo or a free trial to learn more about how the Exonar platform can transform your data.

Your questions answered – IAPP webinar Q&A

Recently, Exonar organised a webinar hosted by the IAPP. ‘Thriving in Generation Privacy: Capitalising on DSAR Data from the Field’ was a great event, with a large number of attendees and a thought-provoking programme that raised a number of questions from the floor. The webinar summary was as follows:

With the introduction of the EU GDPR, the CCPA and other global privacy laws, people have increased expectations of how their personal data will be handled and protected. This is driving up the number of data subject access requests and requests to exercise the right to be forgotten. We commissioned our own research into how businesses are coping with the increased demand, and the findings were remarkable.

If you missed the webinar, you can find it here:

Due to time constraints, it was not possible to address all the questions asked during the webinar, so we’ve gone through them all and you can find a complete list of questions and our answers below:


Q1. Is there any clarity (under EU GDPR Guidance etc) on what personal data can be safely classed under Legal Privilege and therefore remain undisclosed to the data subject?

Answer: Where Legal Privilege protects sensitive content within confidential or privileged documents, that content should be redacted when copies of the documents are provided to a data subject who has legitimately requested access to their personal information. Personal information within confidential and sensitive documents still belongs to the individual, and they have a right to request access to it. For instance, an ex-employee might request access to emails about their performance, the contents of which also contain sensitive information about the client their performance relates to. The organisation should redact the sensitive information relating to the client and satisfy the request for access to the performance-related data.


Q2. Common challenges (Identifying the data subject) and fake SARs could be a real challenge too – how is this handled?

Answer: The steps you take to identify the individual will be particular to your organisation. In summary, ensure that you are asking for the same amount of verification as you would if that individual were requesting their information for any other reason. Practically speaking, this will mean requesting key identifying information and potentially some form of identity verification.


Q3. Is unstructured data covered by GDPR?

Answer: Yes. The GDPR covers all personal data relating to individuals in the EU, whether structured or unstructured.


Q4. What is the percentage of SARs for which you know, explicitly, the reason for submission, as there is no requirement for the individual to state the reason they want the data?

Answer: In practice, with the SARs we handle, the individual only occasionally states why they are requesting their data. Sometimes it becomes obvious during the review process, and it may be appropriate to intervene in a different way (for example, where it becomes evident that they are a customer with a grievance).


Q5. Do you have any recommendations to streamline the SAR intake process?

Answer: Yes: pay close attention to what data you are providing, spread the load and invest in automation where appropriate. We often find organisations default to disclosing lots of context (i.e. the contents of files and emails). In reality, the regulation requires that you disclose the personal data you hold, the purpose, where it is stored, and the third parties you have provided it to. It may be appropriate to provide more information to defuse a situation, but it isn’t a requirement. Exonar can help automate this process: it needn’t take days; it can be achieved in minutes using our platform.


Q6. How do the regulators prioritise SARs? Aren’t they far busier with data breaches and other more “serious” incidents? In short, if they are inundated with SARs, it could take a long time for a data subject to get a response.

Answer: Satisfying the right to access through SARs is very high on the ICO’s priority list. Jonathon Bamford, the director of strategic policy at the ICO, told us this at a recent Westminster eForum: “Well, actually, the biggest issue that’s raised is subject access, and it isn’t about little changes around if you can charge a fee, or how long it takes or things like that. It’s the core thing about securing somebody’s right to have access to their data, and that’s the biggest thing that we’ve got there, so when I’m talking about data protection back to basics it’s that one. I think the fact that we’ve got Subject Access Request (SAR) complaints up by 98% tells me something. Complaints have increased significantly since May and we’re on track to receive over 43,000 individual complaints by the end of the year, and certainly by the end of quarter 2 we’d received 94% more complaints than we had the year before. So that’s interesting. I think from May to October I think we got 16,000, nearly 17,000 complaints, in the previous period in 2017 that was 7,000. The biggest issue that’s raised is subject access”.


Q7. Are the panel aware of any significant increase in SARs as a result of equal pay (and similar) reporting requirements? For example, if the company holds an employee name and + or – average salary. Are there any exemptions to disclosure that could apply here?

Answer: We’ve asked around and we’ve not encountered this use case before, but in theory, an employee would be able to ask for their ‘relation to average salary’ data if it existed. That employee could not access the details of other individual employees, and could otherwise access aggregate salary details in company reports, so the answer for the organisation is ‘don’t create politically toxic categories of personal data that employees and customers could potentially ask for’.


Q8. Is there any easy way to automate consent management in addition to the information itself?

Answer: There are automated consent management solutions on the market, and we’d be happy to give you our opinion on the solutions we have seen if that helps you.


Q9. Might we see the courts (and potentially the CJEU) eventually rule on SARs that are used abusively and contrary to the spirit, even if not letter, of the GDPR?

Answer: The GDPR already gives organisations the right to challenge the scope and legitimacy of a data subject access request to counter the types of trolling or excessive requesting that some might have expected. There has yet to be a high-profile instance of such an abuse of the SAR rules and I imagine that privacy regulators will respond if that threat does indeed materialise. To this point I don’t think the courts have been given any meaningful incentive to tighten those rules.


Q10. As a non-European/non-American, how do I know if I’m subject to GDPR or CCPA?

Answer: You are subject to the GDPR if you process the personal data of individuals in the EU.


Q11. How do you collect enough information to verify the data subject without creating another record by receiving that information?

Answer: Under GDPR there are six lawful bases for processing personal data. One of these is legal obligation. As it is your legal obligation to comply with a SAR, this is the basis for processing this information.


Q12. How do you verify the identity of the person requesting the SAR? A qualifier for my question; I’m referring of course to complaints to the regulator concerning unfulfilled SARs.

Answer: See Q2.


Q13. Can a SAR ask for details of technical and organisational measures taken to protect their data?

Answer: The right of access does not include disclosure of the methods used to protect information. However, taking appropriate measures is a legal obligation in itself.


Q14. Don’t SARs also apply to paper records?

Answer: Yes, the GDPR is technologically neutral. The regulation applies in two situations: first, where processing of personal data is conducted by “automated means”; and second, where processing is not conducted by automated means but forms part of a filing system, or is intended to form part of one. This second condition clearly applies to paper filing systems.


Q15. Given that the average cost of a SAR is £525, did any of the organisations that took part in the survey ask the data subject for a reasonable fee? £525 seems very costly to small organisations.

Answer: Under the GDPR, organisations can no longer charge a fee for fulfilling a SAR (except where a request is manifestly unfounded or excessive). It is for this reason that organisations must quickly move from a highly costly manual process to an automated SAR solution that can reduce this financial burden long term.


Q16. There are some data breaches caused by a mishandling of SARs, such as the Amazon/Alexa case in Germany. Could you please talk a bit about this? Are there any other similar cases you might share with us, please?

Answer: Your response to a SAR is likely to contain a highly concentrated profile of personal information about the data subject. Using your data privacy impact assessment process, you should classify your SAR response communications as high risk, and apply the same high-risk security controls your organisation uses to protect other high-risk communications and data transfers, e.g. using secure file shares, encrypting the file and sending keys separately.


Q17. Given the pending final guidelines on the territorial scope of the GDPR (Article 3), how should entities outside of the EU who are unsure of their nexus respond to a SAR? With regards to Article 3(2).

Answer: The GDPR applies to any organisation holding personal data relating to individuals in the EU. If this is you, you will need to respond to the SAR or you will be in breach of the regulation.


Q18. Is there a danger that some organisations are asking for too much information to confirm proof of identity? Some insist on a copy of a passport – something I might not be happy to share with a company I may already be unhappy with.

Answer: The IAPP has a great article on this.


Democracy Disrupted: Data Privacy, Social Media & Election Interference

Summary of Data Protection Forum speech

On March 5th, 2019, our Data & Privacy Director, John Tsopanis, spoke at the Data Protection Forum event in London. His talk, ‘Democracy Disrupted: Data Privacy, Social Media, and Election Interference’, is presented here in article form.


When discussing social media, it’s important to understand that it is a visual medium; one with the power to evoke powerful emotions in an individual, in groups of individuals, and in the tens of millions of individuals whose relation to, and opinion of, the world is formed by the content they consume. So when we talk about the scale of political disinformation campaigns we are attempting the impossible: articulating the impact that billions of messages are having on the psychology of tens of millions of individuals. The scale of influence is critical; according to data from Nielsen, Americans spend an average of 10 hours and 39 minutes consuming media across their devices every day, five hours of which are spent on mobile devices. What we see on our screens is now the overwhelming driver of political opinion and consensus.

UK Parliament DCMS Fake News Report

UK Parliament’s DCMS report into fake news, disinformation and interference in the Brexit referendum concludes that data privacy rights were violated by Facebook and Cambridge Analytica during the referendum, and that tens of millions of people were microtargeted with political disinformation as a result. The DCMS conclude that the institutions designed to protect us from this type of abuse are neither fit for purpose nor appropriately funded. The DCMS have called for urgent action to safeguard our democracy from microtargeted political disinformation campaigns, funded by countries like Russia, that aim to fracture the British political consensus into gridlock – and are succeeding.

The DCMS acknowledge that the GDPR has been a necessary first step in establishing privacy rights for British citizens, but more protections are needed to safeguard citizens’ online safety given the privacy violations that have already occurred.

The DCMS report summarises as follows:

“We have always experienced propaganda and politically-aligned bias, which purports to be news, but this activity has taken on new forms and has been hugely magnified by information technology and the ubiquity of social media. In this environment, people are able to accept and give credence to information that reinforces their views, no matter how distorted or inaccurate, while dismissing content with which they do not agree as ‘fake news’. This has a polarising effect and reduces the common ground on which reasoned debate, based on objective facts, can take place. Much has been said about the coarsening of public debate, but when these factors are brought to bear directly in election campaigns then the very fabric of our democracy is threatened.

This situation is unlikely to change. What does need to change is the enforcement of greater transparency in the digital sphere, to ensure that we know the source of what we are reading, who has paid for it and why the information has been sent to us. We need to understand how the big tech companies work and what happens to our data.

In a democracy, we need to experience a plurality of voices and, critically, to have the skills, experience and knowledge to gauge the veracity of those voices. While the Internet has brought many freedoms across the world and an unprecedented ability to communicate, it also carries the insidious ability to distort, to mislead and to produce hatred and instability. It functions on a scale and at a speed that is unprecedented in human history. One of the witnesses at our inquiry, Tristan Harris, from the US-based Center for Humane Technology, describes the current use of technology as “hijacking our minds and society”. We must use technology, instead, to free our minds and use regulation to restore democratic accountability. We must make sure that people stay in charge of the machines.”

Data Privacy and British Democracy

The problem British democracy faces has two core components:

The first is the need to safeguard personal privacy and restrict the ability for personal data to be harvested, profiled and leveraged at scale by unknown actors. The GDPR has given individuals the rights to access and erasure which offer a solution for the individual, but if the organisations conducting the microtargeting are unknown and/or criminal it is very difficult for the individual to exercise these rights. What is needed is greater capacity for enforcement.

The suggested solution from the DCMS is to impose a 2% levy on big data and social media companies and ring-fence it to fund the ICO’s enforcement work. This would extend the powers offered to the ICO under the GDPR, enabling it to identify, investigate and take down dark data and disinformation operations at scale. It is the international scale of operations working against British democracy through the vehicle of unregulated social media that has overwhelmed our current domestic regulatory bodies and our politics. Therefore, an urgent boost to the resources of the regulators is needed to tackle this problem at source.

The second problem is tackling disinformation. The DCMS has called for the following:

“There is now an urgent need to establish independent regulation. We believe that a compulsory Code of Ethics should be established, overseen by an independent regulator, setting out what constitutes harmful content. The independent regulator would have statutory powers to monitor relevant tech companies; this would create a regulatory system for online content that is as effective as that for offline content industries.

As we said in our Interim Report, such a Code of Ethics should be similar to the Broadcasting Code issued by Ofcom—which is based on the guidelines established in section 319 of the 2003 Communications Act. The Code of Ethics should be developed by technical experts and overseen by the independent regulator, in order to set down in writing what is and is not acceptable on social media. This should include harmful and illegal content that has been referred to the companies for removal by their users, or that should have been easy for tech companies themselves to identify.

The process should establish clear, legal liability for tech companies to act against agreed harmful and illegal content on their platform and such companies should have relevant systems in place to highlight and remove ‘types of harm’ and to ensure that cyber security structures are in place. If tech companies (including technical engineers involved in creating the software for the companies) are found to have failed to meet their obligations under such a Code, and not acted against the distribution of harmful and illegal content, the independent regulator should have the ability to launch legal proceedings against them, with the prospect of large fines being administered as the penalty for non-compliance with the Code.”

The scale of disinformation on social media platforms is currently the largest threat to British democracy. It’s one that data privacy professionals have yet to truly understand, primarily because the 20% professional class are rarely the targets of microtargeted disinformation campaigns due to their inferred socioeconomic status. This perfect storm has meant that our privacy legislation now lags significantly behind the technology that needs to be regulated, and an overcompensation is needed to correct course.

Cambridge Analytica, Disinformation and Brexit

Cambridge Analytica were responsible for delivering the Trump and Leave.EU Brexit social media campaigns.

‘Today, in the United States, we have close to 4000 to 5000 data points on every individual. So we model every personality across the United States, some 230 million people’ – Alexander Nix, CEO of Cambridge Analytica, October 2016

See 6:40-11:07 for Channel 4’s undercover reporting of Cambridge Analytica’s political disinformation tactics:

The integrity of the information supply is the cornerstone of a free and functioning democracy

“A democracy needs good quality information, and fair distribution of that information in order to articulate, aggregate, and defend its own national interests. Without it, democracy falls.” said Professor AC Grayling, moral and political philosopher, and author of over 30 books on ethics, philosophy and the history of human rights. He also went on to say:

“In a mature democracy, citizens must be free to choose the information they consume, and to be able to easily identify and trust the source of that information at the point of consumption. The ability for citizens to do this, to opt out of illicit messaging from untrusted sources, is what we might consider exercising our right to privacy. Without these freedoms, we cannot meaningfully escape unwanted influence, and in a truly Orwellian sense, our vulnerability to psychological manipulation by unknown individuals and organisations makes us all less free”.

Foreign Interference in Brexit

The DCMS, along with tackling data privacy violations and disinformation, has also called for an urgent investigation into Russian interference in the Brexit referendum. The aim is to investigate the source of Mr Arron Banks’ £9m donation to the Leave.EU campaign, the largest donation in British political history – the source of which is still unclear.

What is clear is that the disinformation networks that were operating during the Brexit referendum are still active and more effective than ever. The prevalence of known Kremlin Twitter and Facebook accounts amplifying pro-Brexit politicians (e.g. Conservative members of the “European Research Group”, known as the ‘ERG’) and pro-Brexit social media pages like Leave.EU and Westmonster is a deep cause for concern for British citizens. Leave.EU alone generated 661,000,000 impressions on Facebook and 221,000,000 impressions on Twitter in 2018.

The full nature of this relationship must be investigated by an Independent Counsel, similar to the USA’s Mueller inquiry into the Trump Organisation’s ties with Russia, and revealed to the public as a top priority.


Britain needs to take back control of its politics, and to do so it needs to take back control of its data: give the necessary regulatory bodies the investigative and enforcement powers needed to conduct investigations at scale, and create new institutions that are fit for holding social media companies accountable for disinformation campaigns run through their platforms.