Posts

What Next With Your Personal Data Inventory (Article 30)?

3 Step Guide, Survey Results and Article 30² Toolkit

Data privacy legislation requires organisations to discover and document their personal data processes, e.g. GDPR Article 30, ‘Records of Processing Activities’. For most organisations the simplest way to fulfil this obligation is to create and maintain a Personal Data Inventory.

Understanding what data you have, why you have it, where it is processed, who can access it, when it should be deleted, and how it is secured is the foundation of any data privacy or cyber security programme that aims to protect personal data and comply with data privacy legislation, e.g. the GDPR, CCPA, PIPEDA, PDPB and more.

Exonar Survey

Exonar surveyed 104 organisations to understand their experience of discovering and operationalising their Personal Data Inventory. We have detailed the findings of this survey alongside a 3 Step Guide to Personal Data Inventory and an Article 30² Toolkit.

Our first section on data discovery and personal data inventory will be most useful for organisations who are planning to create their Personal Data Inventory (e.g. those preparing for the California Consumer Privacy Act (CCPA) in 2020).

Our second and third sections will be most useful for organisations who have already created their Personal Data Inventory (e.g. those complying with General Data Protection Regulation (GDPR) from May 2018) to explain next steps for monitoring and compliance activities.

The Article 30² Toolkit can be filled in to help you structure your journey through this process.

Download: What Next With Your Personal Data Inventory (Article 30)?

Download: Article 30² Toolkit

 

Join us for our Exonar Meet-ups in London and Reading

We’re delighted to be hosting two meet-ups in May. Both will provide the perfect opportunity for data privacy and information governance professionals to share their experiences of the GDPR one year on and ask – ‘What does the perfect privacy programme look like?’.

 

At Exonar, we are in the process of building information governance modules that leverage our discovery technology, in order to automate as many of the manual GDPR compliance processes as possible. We invite you to share your thoughts on what the ideal product should look like over pizza and a few beers.
For our London event, we look forward to hearing from our guest speaker James Cole, partner at Brown Rudnick LLP.
We hope you can join us!

 

Let us know which date is best for you:
London – May 7th, 6:00PM to 8:00PM: http://exo.nr/london-meetup
Reading – May 22nd, 6:00PM to 8:00PM: http://exo.nr/reading-meetup

 

You can also register for our webinar – ‘GDPR One Year On: What Does a Perfect Privacy Programme Look Like?’ here: http://exo.nr/Privacy-Webinar

 

Free Webinar – The Perfect Privacy Programme. Register Now!

GDPR One Year On: What Does a Perfect Privacy Programme Look Like?

Free Web Conference – Brought to you by Exonar

Broadcast date: 2:00pm, April 24, 2019

One year on from the introduction of the EU General Data Protection Regulation (GDPR), join Exonar and experts from the field in discussing ‘What does a perfect privacy programme look like?’

In this web conference we will hear from our panel of experts as they discuss:

  • What are the necessary components of an enterprise-level privacy programme?
  • How do we optimally assign roles and responsibilities within a privacy programme?
  • How can we most effectively create and manage an accurate personal data inventory? (Article 30 – Records of Processing Activities)
  • How do we best monitor for GDPR compliance using both manual and technical controls?
  • What is the best way to deliver privacy training to our employees?
  • What are the most effective tools available to satisfy individual rights, e.g. Subject Access Requests (SARs), the right to be forgotten, data deletion and retention?

In addition to discussion from the field, our panel will present Exonar’s recent findings, based on surveys of 100+ organisations and consumers, into:

“What’s Next with Personal Data Inventory?” – Exonar have profiled 100+ organisations’ attempts to create a personal data inventory. One year on, we ask what monitoring and compliance actions they are now planning to take as a result.

“Consumer Attitudes to Subject Access Requests (SARs): A SARvey” – Exonar have surveyed 100+ consumers to assess their sentiment towards data privacy and the ability to exercise their privacy rights.

There will be a live Q&A session in the final 15 minutes of the webinar so, to avoid missing your chance to contribute, register on the form below:

Host:
John Tsopanis, Data and Privacy Director, Exonar

Panelists:
Ralph O’Brien CIPM, Vice Chair of the UK Data Protection Forum and Principal at Reinbo Consulting
Sophie Payne, Customer Success Lead and Data Scientist, Exonar
Ben Falk, CEO of Yo-Da, Your Data

 

Book your place now:

Your questions answered – IAPP webinar Q&A

Recently, Exonar organised a webinar hosted by the IAPP. ‘Thriving in Generation Privacy: Capitalising on DSAR Data from the Field’ was a great event with a large number of attendees and a thought-provoking programme that raised a number of questions from the floor. The webinar summary was as follows:

With the introduction of the EU GDPR, the CCPA and other global privacy laws, people have increased expectations of how their personal data will be handled and protected. This is driving up the number of inquiries for data subject access requests and requests to exercise the right to be forgotten. We commissioned our own research into how businesses are coping with the increased demand; the findings of which were remarkable.

If you missed the webinar, you can find it here: http://exo.nr/Watch-IAPP-Webinar

Due to time constraints, it was not possible to address all the questions asked during the webinar, so we’ve gone through them all and you can find a complete list of questions and our answers below:

 

Q1. Is there any clarity (under EU GDPR guidance etc.) on what personal data can be safely classed under legal privilege and therefore remain undisclosed to the data subject?

Answer: Where legal privilege protects sensitive content within confidential or privileged documents, that content should be redacted when copies of the documents are provided to the data subject. Personal information within confidential and sensitive documents still belongs to the individual, and they have a right to request access to it. For instance, an ex-employee may request access to emails about their performance, the contents of which also contain sensitive information about the client their performance relates to. The organisation should redact the sensitive information relating to the client and satisfy the request for access to the performance-related data.

 

Q2. Common challenges (Identifying the data subject) and fake SARs could be a real challenge too – how is this handled?

Answer: The steps you take to identify the individual will be particular to your organisation. In summary, ensure that you are asking for the same amount of verification as you would if that individual were to request their information for any other reason. Practically speaking, this will mean the key identifying information regarding them and potentially some form of identity verification.

 

Q3. Is unstructured data covered by GDPR?

Answer: Yes. The GDPR is format-neutral: all personal data relating to individuals in the EU is covered, whether it is held in structured or unstructured form.

 

Q4. What is the percentage of SARs for which you know, explicitly, the reason for submission, as there is no requirement for the individual to state the reason they want the data?

Answer: In practice, with the SARs we handle, individuals only occasionally state why they are requesting their data. Sometimes it becomes obvious during the review process, and it may then be appropriate to intervene in a different way (for example, when it becomes evident that the requester is a customer with a grievance).

 

Q5. Do you have any recommendations to streamline the SAR intake process?

Answer: Yes: pay close attention to what data you are providing, spread the load and invest in automation where appropriate. We often find organisations default to disclosing lots of context (i.e. the contents of files and emails). In reality, the regulation requires that you disclose the personal data you hold, the purpose of processing, where it is stored, and the third parties you have provided it to. It may be appropriate to provide more information to defuse a situation, but it isn’t a requirement. Exonar can help automate this process; it needn’t take days and can be achieved in minutes using our platform.

 

Q6. How do the regulators prioritise SARs? Aren’t they far busier with data breaches and other more “serious” incidents? In short, if they are inundated with SARs, it could take a long time for a data subject to get a response.

Answer: Satisfying the right to access through SARs is very high on the ICO’s priority list. Jonathon Bamford, the director of strategic policy at the ICO, told us this at a recent Westminster eForum: “Well, actually, the biggest issue that’s raised is subject access, and it isn’t about little changes around if you can charge a fee, or how long it takes or things like that. It’s the core thing about securing somebody’s right to have access to their data, and that’s the biggest thing that we’ve got there, so when I’m talking about data protection back to basics it’s that one. I think the fact that we’ve got Subject Access Request (SAR) complaints up by 98% tells me something. Complaints have increased significantly since May and we’re on track to receive over 43,000 individual complaints by the end of the year, and certainly by the end of quarter 2 we’d received 94% more complaints than we had the year before. So that’s interesting. I think from May to October I think we got 16,000, nearly 17,000 complaints, in the previous period in 2017 that was 7,000. The biggest issue that’s raised is subject access”.

 

Q7. Are the panel aware of any significant increase in SARs as a result of equal pay (and similar) reporting requirements? For example, where the company holds an employee’s name alongside whether they are above or below the average salary. Are there any exemptions to disclosure that could apply here?

Answer: We’ve asked around and we’ve not encountered this use case before, but in theory, an employee would be able to ask for their ‘relation to average salary’ data if it existed. That employee couldn’t access the details of other individual employees, and can otherwise access aggregate salary details in company reports, so the answer for the organisation is ‘don’t create politically toxic categories of personal data that employees and customers could potentially ask for’.

 

Q8. Is there any easy way to automate consent management in addition to the information itself?

Answer: There are automated consent management solutions on the market, and we’d be happy to give you our opinion on the solutions we have seen if that helps you.

 

Q9. Might we see the courts (and potentially the CJEU) eventually rule on SARs that are used abusively and contrary to the spirit, even if not letter, of the GDPR?

Answer: The GDPR already gives organisations the right to challenge the scope and legitimacy of a data subject access request to counter the types of trolling or excessive requesting that some might have expected. There has yet to be a high-profile instance of such an abuse of the SAR rules and I imagine that privacy regulators will respond if that threat does indeed materialise. To this point I don’t think the courts have been given any meaningful incentive to tighten those rules.

 

Q10. As a non-European/non-American, how do I know if I’m subject to GDPR or CCPA?

Answer: Broadly, you are subject to the GDPR if you process the personal data of individuals in the EU, for example by offering them goods or services or monitoring their behaviour (Article 3). You are subject to the CCPA if you do business in California and meet its applicability thresholds.

 

Q11. How do you collect enough information to verify the data subject without creating another record by receiving that information?

Answer: Under GDPR there are six lawful bases for processing personal data. One of these is legal obligation. As it is your legal obligation to comply with a SAR, this is the basis for processing this information.

 

Q12. How do you verify the identity of the person requesting the SAR? A qualifier for my question; I’m referring of course to complaints to the regulator concerning unfulfilled SARs.

Answer: See Q2.

 

Q13. Can a SAR ask for details of technical and organisational measures taken to protect their data?

Answer: The right of access does not include disclosure of the methods used to protect information. However, taking appropriate technical and organisational measures is a legal obligation in itself (Article 32).

 

Q14. Don’t SARs also apply to paper records?

Answer: Yes, GDPR is technologically neutral. The regulation applies in two situations; firstly, where processing of personal data is conducted by “automated means,” and/or where processing of personal data is not conducted by automated means, but it forms part of a filing system or is intended to form part of a filing system. This second condition clearly applies to paper filing systems.

 

Q15. Given that the average cost of a SAR is £525, did any of the organisations who took part in the survey ask the data subject for a reasonable fee? £525 seems very costly for small organisations.

Answer: Under the GDPR you cannot normally charge a fee for fulfilling a SAR; a reasonable fee is only permitted where a request is manifestly unfounded or excessive, or for further copies. It is for this reason that organisations must quickly move from a highly costly manual process to an automated SAR solution that can reduce this financial burden long term.

 

Q16. There are some data breaches caused by a mishandling of SARs, such as the Amazon/Alexa case in Germany. Could you please talk a bit about this? Are there any other similar cases you might share with us, please?

Answer: Your response to a SAR is likely to contain a highly concentrated profile of personal information about the data subject. Using your data privacy impact assessment process, you should classify your SAR response communications as high risk and apply the high-risk security controls your organisation uses to protect other high-risk communications and data transfers, e.g. secure file shares, encrypting the file and sending keys separately.
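As an illustration of the “encrypt the file, send the keys separately” control, here is a minimal sketch using Node’s built-in crypto module with AES-256-GCM; the function names are illustrative only and not part of any specific product:

```typescript
// Minimal sketch: encrypt a SAR response so it can travel over an
// untrusted channel, with the key shared separately out of band.
import { createCipheriv, createDecipheriv, randomBytes } from "node:crypto";

function encryptSarResponse(plaintext: Buffer) {
  const key = randomBytes(32); // share this via a separate channel
  const iv = randomBytes(12);  // GCM nonce, sent alongside the ciphertext
  const cipher = createCipheriv("aes-256-gcm", key, iv);
  const ciphertext = Buffer.concat([cipher.update(plaintext), cipher.final()]);
  return { key, iv, ciphertext, tag: cipher.getAuthTag() };
}

function decryptSarResponse(key: Buffer, iv: Buffer, ciphertext: Buffer, tag: Buffer) {
  const decipher = createDecipheriv("aes-256-gcm", key, iv);
  decipher.setAuthTag(tag); // GCM rejects any tampered ciphertext
  return Buffer.concat([decipher.update(ciphertext), decipher.final()]);
}

const sar = Buffer.from("Subject: Jane Doe – personal data extract");
const sealed = encryptSarResponse(sar);
const opened = decryptSarResponse(sealed.key, sealed.iv, sealed.ciphertext, sealed.tag);
console.log(opened.toString());
```

The design point is simply that the ciphertext and the key never travel together, so interception of the file alone discloses nothing.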

 

Q17. Given the pending final guidelines on the territorial scope of the GDPR (Article 3), how should entities outside the EU who are unsure of their nexus respond to a SAR, with regard to Article 3(2)?

Answer: Under Article 3(2), the GDPR applies to organisations outside the EU that offer goods or services to, or monitor the behaviour of, individuals in the EU. If this describes your organisation, you will need to respond to the SAR or you will be in breach of the regulation.

 

Q18. Is there a danger that some organisations are asking for too much information to confirm proof of identity? Some insist on a copy of a passport – something I might not be happy to share with a company I may already be unhappy with.

Answer: The IAPP has a great article on this. https://iapp.org/news/a/how-to-verify-identity-of-data-subjects-for-SARs-under-the-gdpr/

 

Democracy Disrupted: Data Privacy, Social Media & Election Interference

Democracy Disrupted: Data Privacy, Social Media and Election Interference – Summary of Data Protection Forum speech

On March 5th, 2019 our Data & Privacy Director, John Tsopanis, spoke at the Data Protection Forum event in London. His talk, ‘Democracy Disrupted: Data Privacy, Social Media, and Election Interference’, is presented here in article form.

 

When discussing social media, it’s important to understand that it is a visual medium: one with the power to evoke powerful emotions in individuals, in groups, in tens of millions of people whose relation to and opinion of the world is formed by the content they consume. So when we talk about the scale of political disinformation campaigns we are attempting the impossible: articulating the impact that billions of messages have on the psychology of tens of millions of individuals. The scale of influence is critical; according to data from Nielsen, Americans spend an average of 10 hours and 39 minutes consuming media across their devices every day, five hours of which are spent on mobile devices. What we see on our screens is now the overwhelming driver of political opinion and consensus.

UK Parliament DCMS Fake News Report

UK Parliament’s DCMS report into fake news, disinformation and interference in Brexit concludes that data privacy rights were violated by Facebook and Cambridge Analytica during the Brexit referendum, and that tens of millions of people were microtargeted with political disinformation as a result. The DCMS conclude that the institutions designed to protect us from this type of abuse are neither fit for purpose nor appropriately funded. They have called for urgent action to safeguard our democracy from microtargeted political disinformation campaigns, funded by countries like Russia, that aim at, and are succeeding in, fracturing the British political consensus into gridlock.

The DCMS acknowledge that the GDPR has been a necessary first step in establishing privacy rights for British citizens, but more protections are needed to safeguard citizens’ online safety given the privacy violations that have already occurred.

The DCMS report summarises as follows:

“We have always experienced propaganda and politically-aligned bias, which purports to be news, but this activity has taken on new forms and has been hugely magnified by information technology and the ubiquity of social media. In this environment, people are able to accept and give credence to information that reinforces their views, no matter how distorted or inaccurate, while dismissing content with which they do not agree as ‘fake news’. This has a polarising effect and reduces the common ground on which reasoned debate, based on objective facts, can take place. Much has been said about the coarsening of public debate, but when these factors are brought to bear directly in election campaigns then the very fabric of our democracy is threatened.

This situation is unlikely to change. What does need to change is the enforcement of greater transparency in the digital sphere, to ensure that we know the source of what we are reading, who has paid for it and why the information has been sent to us. We need to understand how the big tech companies work and what happens to our data.

In a democracy, we need to experience a plurality of voices and, critically, to have the skills, experience and knowledge to gauge the veracity of those voices. While the Internet has brought many freedoms across the world and an unprecedented ability to communicate, it also carries the insidious ability to distort, to mislead and to produce hatred and instability. It functions on a scale and at a speed that is unprecedented in human history. One of the witnesses at our inquiry, Tristan Harris, from the US-based Center for Humane Technology, describes the current use of technology as “hijacking our minds and society”. We must use technology, instead, to free our minds and use regulation to restore democratic accountability. We must make sure that people stay in charge of the machines.”

Data Privacy and British Democracy

The problem British democracy faces has two core components:

The first is the need to safeguard personal privacy and restrict the ability for personal data to be harvested, profiled and leveraged at scale by unknown actors. The GDPR has given individuals the rights of access and erasure, which offer a solution for the individual; but if the organisations conducting the microtargeting are unknown and/or criminal, it is very difficult for the individual to exercise these rights. What is needed is greater capacity for enforcement.

The DCMS’s suggested solution is to impose a 2% levy on big data and social media companies and ring-fence the proceeds to fund the ICO’s enforcement work. This would extend the powers offered to the ICO under the GDPR, enabling it to identify, investigate and take down dark data and disinformation operations at scale. It is the international scale of operations working against British democracy, through the vehicle of unregulated social media, that has overwhelmed our domestic regulatory bodies and our politics. An urgent boost to the resources of the regulators is therefore needed to tackle this problem at source.

The second problem is tackling disinformation. The DCMS has called for the following:

“There is now an urgent need to establish independent regulation. We believe that a compulsory Code of Ethics should be established, overseen by an independent regulator, setting out what constitutes harmful content. The independent regulator would have statutory powers to monitor relevant tech companies; this would create a regulatory system for online content that is as effective as that for offline content industries.

As we said in our Interim Report, such a Code of Ethics should be similar to the Broadcasting Code issued by Ofcom—which is based on the guidelines established in section 319 of the 2003 Communications Act. The Code of Ethics should be developed by technical experts and overseen by the independent regulator, in order to set down in writing what is and is not acceptable on social media. This should include harmful and illegal content that has been referred to the companies for removal by their users, or that should have been easy for tech companies themselves to identify.

The process should establish clear, legal liability for tech companies to act against agreed harmful and illegal content on their platform and such companies should have relevant systems in place to highlight and remove ‘types of harm’ and to ensure that cyber security structures are in place. If tech companies (including technical engineers involved in creating the software for the companies) are found to have failed to meet their obligations under such a Code, and not acted against the distribution of harmful and illegal content, the independent regulator should have the ability to launch legal proceedings against them, with the prospect of large fines being administered as the penalty for non-compliance with the Code.”

The scale of disinformation on social media platforms is currently the largest threat to British democracy. It is one that data privacy professionals have yet to truly understand, primarily because the 20% professional class are rarely the targets of microtargeted disinformation campaigns due to their inferred socioeconomic status. This perfect storm has meant that our privacy legislation now lags significantly behind the technology that needs to be regulated, and overcompensation is needed to correct course.

Cambridge Analytica, Disinformation and Brexit

Cambridge Analytica were responsible for delivering the Trump and Leave.EU Brexit social media campaigns.

‘Today, in the United States, we have close to 4000 to 5000 data points on every individual. So we model every personality across the United States, some 230 million people’ – Alexander Nix, CEO of Cambridge Analytica, October 2016

See 6:40-11:07 for Channel 4’s undercover reporting of Cambridge Analytica’s political disinformation tactics:

The integrity of the information supply is the cornerstone of a free and functioning democracy

“A democracy needs good quality information, and fair distribution of that information in order to articulate, aggregate, and defend its own national interests. Without it, democracy falls.” said Professor AC Grayling, moral and political philosopher, and author of over 30 books on ethics, philosophy and the history of human rights. He also went on to say:

“In a mature democracy, citizens must be free to choose the information they consume, and to be able to easily identify and trust the source of that information at the point of consumption. The ability for citizens to do this, to opt out of illicit messaging from untrusted sources, is what we might consider exercising our right to privacy. Without these freedoms, we cannot meaningfully escape unwanted influence, and in a truly Orwellian sense, our vulnerability to psychological manipulation by unknown individuals and organisations makes us all less free”.

Foreign Interference in Brexit

The DCMS, along with tackling data privacy violations and disinformation, has also called for an urgent investigation into Russian interference in Brexit. The aim is to establish the source of Arron Banks’ £9m donation to the Leave.EU campaign, the largest donation in British political history, which is still unclear.

What is clear is that the disinformation networks that were operating during the Brexit referendum are still active and more effective than ever. The prevalence of known Kremlin Twitter and Facebook accounts amplifying pro-Brexit politicians (e.g. Conservative members of the “European Research Group”, known as the ‘ERG’) and pro-Brexit social media pages like Leave.EU and Westmonster is a deep cause of concern for British citizens. Leave.EU alone generated 661,000,000 impressions on Facebook and 221,000,000 impressions on Twitter in 2018.

The full nature of this relationship must be investigated by an independent counsel, similar to the USA’s Mueller inquiry into the Trump Organisation’s ties with Russia, and revealed to the public as a top priority.

Conclusion

Britain needs to take back control of its politics, and to do so it needs to take back control of its data and give the necessary regulatory bodies the investigative and enforcement powers to conduct investigations at scale. It should also create new institutions fit for holding social media companies accountable for disinformation campaigns run through their platforms.

Facebook labelled ‘digital gangsters’ – Sky News Interview

Parliamentary report on fake news and disinformation finds Facebook broke privacy and competition law and failed to tackle election manipulation

A new parliamentary report reveals that Facebook broke privacy and competition law and warns that the organisation should be regulated urgently. The final report of the Digital, Culture, Media and Sport select committee’s 18-month investigation into disinformation and fake news accused Facebook of purposefully obstructing its inquiry and failing to tackle attempts by Russia to manipulate elections.

Following the announcement of this report, our Data and Privacy Director, John Tsopanis, was invited to discuss the findings live on Sky News’ Sunrise programme on Monday, 18th February. Watch the full interview here:

Missed Our IAPP Webinar? Watch ‘Thriving in Generation Privacy’

Exonar Webinar hosted by the IAPP: ‘Thriving in Generation Privacy: Capitalising on DSAR Data from the Field’. Your chance to view the recorded webinar.

With the introduction of the EU GDPR, the CCPA and other global privacy laws, people have increased expectations of how their personal data will be handled and protected. This is driving up the number of inquiries for data subject access requests and requests to exercise the right to be forgotten. We commissioned our own research into how businesses are coping with the increased demand; the findings of which were remarkable.

First broadcast on the IAPP website on February 7th 2019, watch this recorded webinar to hear from the field about these survey results and more, including:

  • The cost of handling data subject access requests (a UK public sector example)
  • The results of a subject access request to a UK-based high street bank
  • How the world’s leading tech companies dealt with recent requests for personal data
  • How organisations are profiting from their privacy programmes
  • The toxic data you’re storing and what to do about it
  • How companies have prepared for Generation Privacy and what you can do now

Host:
Dave Cohen, CIPP/E, CIPP/US, Knowledge Manager, IAPP

Panelists:
Adrian Barrett, CEO, Exonar
Phil Lee, CIPP/E, CIPM, Partner, Privacy, Security and Data Protection Practice, FieldFisher, London, U.K.
Steve Wright, GDPR Advisor at Bank of England, CEO, Data Privacy Architect, Privacy Culture, London, U.K.

Run time – 60 minutes.



ePrivacy a 2019 Priority – Online tracking regulations to tighten

Sweeping GDPR Fines from German Regulator Send a Clear Message: ‘ePrivacy is a 2019 Priority’

 

A new ePrivacy Regulation that tightens rules for online ‘tracking tools’ such as cookies is expected to replace the ePrivacy Directive in late 2019.

Its importance was emphasised last week when the German DPA (Data Protection Authority) announced that they intend to fine forty organisations for using ‘tracking tools’ on their websites, violating the GDPR.

With ePrivacy Regulation set to tighten GDPR rules on ‘tracking tools’, the announcement of sweeping fines for non-compliant cookie practices under GDPR sends a clear message to organisations in 2019: ‘ePrivacy is a priority’.

How will ePrivacy Regulation seek to protect personal privacy?

The ePrivacy Regulation will outline how organisations must uphold Article 7 of the Charter of Fundamental Rights of the EU which guarantees individuals the right to a private life and private communications.

Where the GDPR has a focus on protecting personal data, ePrivacy Regulation will have a specific focus on protecting personal privacy, seeking to empower individuals to opt-out of unwanted data tracking, processing and digital communications.

The ePrivacy Regulation will be ‘lex specialis’ to the GDPR, detailing specific applications of the rules within the scope of the GDPR. The ePrivacy Regulation will specify rules for the use of:

  • Online tracking technologies
  • Citizen profiling and behavioural advertising
  • Metadata processing and brokerage, e.g. geolocation, IP address and device number
  • IoT – Smart Device communications
  • Spam marketing

Why is protecting personal privacy and the integrity of digital communication important?

The profiling and microtargeting of 87 million UK and US citizens by SCL/AIQ/Cambridge Analytica with disinformation from 2016 onwards has been cited in Parliamentary Enquiries across the world as direct evidence for the need for ePrivacy Regulation.

A vast unregulated network of data tracking technologies, profiling software and microtargeting practices has left citizens vulnerable to unsolicited digital influence. These practices leave citizens with little control over who is collecting, analysing and leveraging their personal information for commercial and political gain as they browse the internet.

The ePrivacy Regulation will allow for GDPR-sized fines against firms who perform data tracking without consent, which is likely to lead to a collapse in such tracking practices. This will help re-establish boundaries between citizens and the private and political actors who wish to influence them. It will also allow citizens to better distinguish between legitimate and illegitimate actors online, and provide a fundamental safeguard to ensure that Article 7 of the Charter of Fundamental Rights of the EU is upheld.

How are regulators signalling that ePrivacy is a priority?

The German DPA has taken a major step towards enforcement on ePrivacy by announcing fines for forty large organisations who were found to be tracking visitors on their websites without appropriate consent. The German DPA audited forty “large websites” from the following industries:

(a) Online retail;
(b) Sports;
(c) Banking & insurance;
(d) Media;
(e) Automotive & electronics;
(f) Home and residential; and
(g) Other.

The investigation showed that all forty websites had non-compliant cookie practices with “tracking tools” inappropriately integrated into their sites.

The three major violations found were:

1. No Active Cookie Consent – Cookies and tracking technologies were gathering data on users before obtaining consent. The German DPA said that most of the forty websites used cookie banners to inform users about cookie usage but none of these banners resulted in active consent being obtained from the user before the cookies gathered user data.

2. No Informed Cookie Consent – Thirty of the forty cookie policies were ‘insufficiently transparent’. The German DPA defines ‘sufficiently transparent’ as: (a) individually identifying all cookies/trackers (and, presumably, the companies behind them); and (b) letting users know the specific purposes for which data collected by the identified cookies will be used.

3. Third Party Processing Without Consent – Most of the forty websites automatically sent data to third-party cookie providers as soon as a user visited the website.

How will the ePrivacy Regulation affect your organisation?

Organisations will have to adapt their cookie practices to adhere to the new regulation, most likely moving to an explicit and informed opt-in consent model for advertising cookies. There will also be specific requirements for assessing the legitimacy of third-party data processing and the brokerage of metadata. Organisations will be required to demonstrate a higher level of due diligence and data auditing for third-party data processors, and to keep accurate records of data processing in preparation for heightened scrutiny from regulators.

Free IAPP Web Conference – Registration Now Open

Thriving in Generation Privacy: Capitalising on DSAR Data from the Field

Free IAPP Web Conference – Brought to you by Exonar

Broadcast date: Thursday, February 7, 2019
Time: 8:00–9:00 a.m. PT, 11:00 a.m.–noon ET, 4:00 – 5:00 p.m. GMT

With the introduction of the EU General Data Protection Regulation, the California Consumer Privacy Act and other global privacy laws, people have increased expectations of how their personal data will be handled and protected. This is driving up the number of inquiries for data subject access requests and requests to exercise the right to be forgotten. Exonar recently surveyed a number of organizations to understand how they have been coping with these new and increased privacy control operations, and the results were remarkable.

Join us for this upcoming web conference to hear from the field about these survey results and more, including:

  • The cost of handling data subject access requests (a U.K. public sector example).
  • The results of a subject access request to a U.K.-based high street bank.
  • How the world’s leading tech companies dealt with recent requests for personal data.
  • How organizations are profiting from their privacy programs.
  • The toxic data you’re storing and what to do about it.
  • How companies have prepared for Generation Privacy and what you can do now.

Host:
Dave Cohen, CIPP/E, CIPP/US, Knowledge Manager, IAPP

Panelists:
Adrian Barrett, CEO, Exonar
Phil Lee, CIPP/E, CIPM, Partner, Privacy, Security and Data Protection Practice, FieldFisher, London, U.K.
Steve Wright, GDPR Advisor at Bank of England, CEO, Data Privacy Architect, Privacy Culture, London, U.K.

Book your place now: exo.nr/IAPP-webinar