Between protecting personally identifiable information (PII), the impact of artificial intelligence (AI), avoiding phishing and ransomware attacks, and GDPR sanctions, lawyers and law firms have a lot to be concerned about. In light of this complex landscape of surging data, litigation, and regulatory obligations, where do the risks really lie?
Jonathan Armstrong, a partner at U.K. compliance law firm Cordery who advises on eDiscovery, investigations, and GDPR issues, addressed our PIIP 2023 audience with a fascinating review of international trends to watch, considerations for U.S. businesses operating globally, and predictions for U.S. data privacy regulations and enforcement. Enjoy this lightly edited summary of his comments. For additional thoughts on GDPR and data privacy, read Part 1 and Part 2.
Do you have to comply with GDPR if the personal data is anonymous?
Another issue that we see in any form of eDiscovery is, “I don’t have to comply with GDPR because it’s anonymous data.” GDPR applies to identified or identifiable data, and the definition of personal data under GDPR is far wider than the definition of PII in a U.S. scenario. That’s why personal data and PII aren’t terms we should use interchangeably – they’re not the same. I can also talk about somebody’s personal data without referring to them by name. I don’t need to name names for that to be their personal data.
Whenever I hear a client or vendor say, “We don’t have to comply with GDPR because it’s all anonymous data,” I always worry because I’ve never seen a useful piece of data that is truly anonymized. If we don’t mention the names of the people involved, then it could be what’s called pseudonymized data, but that’s still within GDPR. Even with a relatively small amount of data, I can probably pinpoint it to an exact address, time, and date.
GDPR treats things like geolocation data even more sensitively because, in some cases, it can be what’s called special category data – data that relates to places of worship, hospital premises, or something of that nature. What we’re increasingly seeing is people failing to comply with GDPR because they’ve been told the data is anonymous – but it never was anonymous, and they’ve made other assumptions based on that false premise. Whenever we’re setting up some sort of eDiscovery or data collection exercise, we’ve got to ask what is genuinely anonymous data, and I’m going to suggest that’s going to be a very rare thing. We are also seeing individuals involved in eDiscovery and investigations use their GDPR rights more and more, because they don’t want things to happen to them and they object to their data being used as a means to an end.
Besides GDPR, there are criminal provisions in U.K. law as well – in the Data Protection Act 2018. People in the eDiscovery world can innocently violate this Act, particularly if they try to re-identify anonymized or pseudonymized data, don’t respond to a subject access request in time, or don’t provide data to a third party when they’re asked for it. GDPR also created some new rights and extended others, like the subject access right. These rights are being used by individuals to gain advantage, stay out of investigations, and avoid litigation – particularly litigation outside their home country.
Do you anticipate a rise in class action lawsuits related to GDPR?
We’ve got a number of things to thank the U.S. for – fizzy soft drinks, McDonald’s, Kentucky Fried Chicken, and also the rise in class actions. We’re certainly seeing the importation of U.S.-style class action litigation into parts of the EU. Liability and compensation are definitely on the agenda for these class action law firms with GDPR-type breaches, and there is a right to compensation under GDPR in some circumstances.
There’s a lot of litigation around at the moment about cookies on websites, which has a similar basis, and there’s an emerging trend of data damages claims. We’re seeing data subject requests and subject access requests being used as a cheap form of pre-action discovery. And we’re seeing more countries – Germany and the Netherlands, for example – allow forms of class actions where traditionally much of mainland Europe was resistant to this type of litigation.
How does GDPR look at artificial intelligence (AI)?
Obviously, there’s good AI and there’s bad AI. There are good AI providers and bad ones, but increasingly we’re seeing AI featured in a lot of GDPR cases. Another big myth is to say, “Well, the law doesn’t really regulate our AI,” or, “The EU and the U.K. have proposals to legislate for AI, but they’re not in effect yet.” That’s nonsense.
Our firm did a quick analysis, and I think the figure was something like €146 million in fines related to AI from regulators over the last 12 months. There are many cases – a lot of them against Clearview AI, but others as well. It’s a growing area of GDPR enforcement. AI is always likely to require a data protection impact assessment, and it’s a hot area for employee and customer concern.
There’s all sorts of interesting stuff in this area – if you’re really into AI – including a very strange meeting of German regulators in a German castle, which tried to set principles for AI enforcement in Germany. GDPR does restrict automated processing, and chatbots bring their own issues. AI is not infallible, and we shouldn’t rely on it blindly. We should assess, as I say, good providers and bad providers. As with anything, there’s quality out there – and a lack of quality as well.
How vulnerable are lawyers and law firms to phishing and ransomware attacks?
I’m seeing a real rise in attacks on lawyers and law firms. Ransomware attacks are far more sophisticated, and I know that many law firms are falling for these types of attacks at the moment. I think there are a number of reasons for that, and some human factors are influencing behavior as a whole – this whole “great resignation” and “quiet quitting” phenomenon of people being at their desks but not being tuned in. We’ve certainly seen people who’ve clicked on phishing emails as a result. The pandemic also reduced loyalty to employers, and there’s nobody looking over anyone’s shoulder. Even the world of document review is done differently now.
With people oftentimes in disparate places rather than in one room with a supervisor walking around, we’ve lost what I call the “Clever Sandra” in the working environment – when we could turn to somebody next to us and say, “Sandra, have a look at this.” In a number of our cases, we’ve seen people fall for attacks because they’ve lost their Clever Sandra. They just click on the link and say, “What’s the worst that could happen?” We’ve lost those corridor chats, the ability to nudge behavior. And I think we’ve also got, to some extent, the return of the corporate maverick, which in itself has consequences.
Most ransomware attacks will be reportable. We will have to tell a regulator; we might have to tell others involved. We might also have obligations under other laws, which are in the process of being updated across Europe. Depending on the sectors that we’re in, regulators are being more thorough. We have had an echo of the Sullivan and Uber litigation over here. For those of you who haven’t followed it, that was a ransomware payment disguised as something else, which led to criminal consequences for Uber’s former CSO.
We are again concerned about transparency obligations. I think a lot of people, including law firms, are pre-programmed whenever they have an attack to say, “Oh, it’s a zero-day sophisticated cyberattack from a nation-state actor.” Be aware that saying those words probably invalidates your insurance because of the way insurers’ rules have changed. Lloyd’s rules change again next month.
One of the U.K.’s leading criminal defense law firms had a ransomware attack on August 24, 2020, and reported it to the regulator within 24 hours. However, its backup was also encrypted by the ransomware gang, and it lost 972,000 files in that one incident alone due to the late application of a patch – most ransomware attacks come from just a handful of known vulnerabilities. The firm had previously asked a third party to look at how it could improve its cybersecurity; it got back a list of actions, which it buried and did nothing with. That led to a relatively small fine, but costly reputational consequences as well.