
Data Protection for Law Firms: Expert Insights from Judith Ratcliffe

Law firms face growing challenges in safeguarding client data, from evolving privacy laws to the risks of AI misuse. In this interview, data protection consultant Judith Ratcliffe shares practical insights on mitigating compliance risks, protecting reputations, and maintaining client trust in an increasingly complex regulatory landscape.

Legal Sector · AI in Legal · Legal Case Management · Legal Practice Management · Compliance

Posted 18/09/2025

Caveat: Nothing in this blog post is, or is intended to be, legal advice. If you feel you need legal advice as a result of anything you read here, please speak to a barrister, solicitor or attorney of your choice.

Judith Ratcliffe is not a fully-qualified barrister, as she has not completed pupillage.

Nothing in this blog post is, or is intended to be, case-specific Privacy Advice. It reflects the views of the writer, Judith Ratcliffe (Privacy Professional). If you feel you need case-specific Privacy Advice, please reach out to her, or to another Privacy Professional.

What are the top data protection compliance risks law firms face and how could they mitigate them?

The top risk is giving bad or wrong advice, because firms fail to understand the Law (including overarching Privacy Law) and fail to focus on resolution.

Another known problem is the (arguably) unlawful refusal of erasure requests by default, when lawyers should know, and behave, better.

Mitigation is easy: make sure all lawyers are properly trained in Privacy Law (including personal autonomy, reputation protection and all the rest of it) and in resolving matters to the satisfaction of all parties concerned, remembering that fighting battles for the sake of it is actually against the best interests of your clients.

What are the reputational risks for firms that mishandle personal data, particularly in high-profile or sensitive cases? 

Loss of prospective and current clients, public censure, and increased regulatory scrutiny. Also, loss of reputation within the profession, jeopardising future collaborations, employment moves and the attraction of future talent.

Could changes to UK Data Protection Law (e.g. the Data Use and Access Act) lead to increased litigation or regulatory scrutiny for law firms?

Re: Regulatory Scrutiny

Yes and No.

No, in relation to the Information Commissioner’s Office: for example, Privacy and Electronic Communications Regulations audit powers appear to have been extinguished, reducing regulatory scrutiny.

Yes, in relation to EU Countries’ Regulators, considering the clear and obvious divergences from the EU-GDPR, including those which arguably put EU Nationals at risk of harm, such as those that erect barriers to actioning Rights Requests and/or facilitate delays in actioning Rights Requests.

Yes, in relation to the USA’s Federal Trade Commission, regarding AI and algorithmic harms, which may be exacerbated by DUAA provisions.

Re: Litigation

No – people still won’t be able to afford to litigate, where they couldn’t previously.

Yes – when people can afford to litigate – to overturn the arguably unlawful provisions within the DUAA, including those in relation to recognised legitimate interests and those in relation to biometrics.

How should law firms mitigate risks related to international data transfers, especially with the UK’s adequacy decision under EU GDPR up for review in 2025? 

‘Transfer’ includes viewing, accessing or storing data in another country, not just sending data from or to it.

Before any transfer happens, consider: should you even have that data on your systems in the first place? If not, you definitely shouldn’t transfer it.

Local law firms shouldn’t need to ‘transfer’ any data. They shouldn’t even need to use ‘the cloud’. If and where ‘the cloud’ genuinely needs to be used, keep storage in ‘local’ cloud environments, accessed and viewed only by the local cloud provider’s staff and your own local employees.

Standard Contractual Clauses should include technical and organisational measures for prompt and effective Rights Request handling.

Big Tech often seems to refuse to action Rights Requests relating to data in ‘domains’ outside the UK and EU, claiming they fall outside the territorial effect of the GDPR/UK-GDPR. Such (arguable) abuses of Law need to be called out and stopped. If the situation is left as it is, law firms and their clients may suffer, particularly where breached data and/or other harmful content remains accessible more or less worldwide.

How could law firms improve their client onboarding and KYC processes to better protect Privacy, while still respecting financial crime prevention rules?

Keep genuine, informed and free choices at the heart of all communications, onboarding and KYC processes, including allowing people to submit paper-based verification documents and to hand them over physically, rather than forcing people to verify ‘digitally’. It also means keeping those paper documents offline.

Free and informed means no extra charges for paper-based/ face to face verification.

It means explaining the higher risks of identity theft and theft of money that clients and staff will face if they choose to verify digitally and to undergo digital KYC checks.

Forcing people to ‘verify digitally’ or lose legal assistance may be considered to facilitate economic / financial crime, contrary to new UK legislation, as well as breaking the ‘freely given’ part of consent.

Keeping processes in the hands of local staff on local systems is also key to protecting people and their data better.

How should law firms approach the use of AI in legal research or document review in light of the evolving rules on automated decision-making (noting recent cases before the SRA and courts showing solicitors using ‘fake authorities’ generated by AI and the resulting dangers and consequences)? 

Short answer: Don’t use it. There is no genuine need to use it.

AI tends to be inaccurate and to plagiarise. Where AI is used, teams tend to do the same work twice, re-doing the work AI failed to do correctly the first time.

Document review systems can miss relevant material and include undisclosable material. De-duplication often appears incomplete.
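The de-duplication gap is easy to see with a minimal sketch (illustrative code only, not any specific review platform): exact-hash de-duplication removes only byte-identical copies, so near-duplicates, such as the same email with a changed footer or extra whitespace, slip through.

```python
import hashlib

def dedupe_exact(documents):
    """Keep the first copy of each byte-identical document.

    Near-duplicates (same text with a changed footer, different
    whitespace, etc.) hash differently and are NOT removed -- one
    reason automated de-duplication can appear incomplete.
    """
    seen = set()
    unique = []
    for doc in documents:
        digest = hashlib.sha256(doc.encode("utf-8")).hexdigest()
        if digest not in seen:
            seen.add(digest)
            unique.append(doc)
    return unique

docs = [
    "Dear client, please find the draft attached.",
    "Dear client, please find the draft attached.",   # exact duplicate: removed
    "Dear client, please find the draft attached. ",  # trailing space: kept
]
print(len(dedupe_exact(docs)))  # 2 -- the near-duplicate survives
```

Catching near-duplicates requires fuzzier techniques (normalisation, similarity hashing), each of which introduces its own risk of wrongly merging genuinely distinct documents, so human checking remains necessary either way.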

What are the risks of using generative AI tools in client work, especially regarding confidentiality and data minimization? 

Confidentiality
Lawyers need to keep intellectual property and personal data, confidential.

Lawyers will want to avoid accusations of facilitating intellectual property theft, trademark misuse and/or copyright breaches caused by running clients’ intellectual property through AI.

Copyrighted works can be ‘devalued’ and creators/ copyright/ trademark owners can be disenfranchised when unlawfully created copies and ‘derived’ works enter marketplaces.

Competition Law may be broken, where cartel-like activities occur.

Personal Data Breaches happen when personal data enters AI, unless the specific individuals (to whom the personal data belongs) have freely consented to such data entry. Even disclosures to AI providers count as personal data breaches.

Harms that may be caused include theft of money and identities, as well as blackmail.

Data Minimisation requires data processed to be relevant and not excessive.

AI processes and produces irrelevant data, which must not be relied on by lawyers.

Examples of irrelevant data, include:

  • Poor translations;
  • Miswritten names;
  • Misstatements of fact and law (when AI makes things up and/or mixes fact with fiction) – these can also adversely impact financial crime prevention activities;
  • Tainted evidence.

Risks to lawyers include cases being thrown out of court, revocation of licences to practice, public admonishment, wasted costs orders and referrals to regulators.

Excessive data collection, use and storage arises from:

  • Organisational failures to collect the minimum amount of personal data by AI and/or for AI use;
  • Organisations collecting data through AI when they have no genuine need to do so, and/or AI uses that aren’t genuinely necessary because the same processes can be done manually;
  • Organisational failures and refusals to destroy data produced by AI and/or taken from AI systems.

Risks include:

  • Algorithms stealing data from law firms without anyone knowing until it’s too late;
  • Losing prospective clients who fear their personal data will be used for AI purposes against their wishes.

“Privacy is far more than Data Protection alone. It is vital to maintain client and court trust, it’s at the heart of winning cases and avoiding harms. Those things will impact reputations, and law firms and chambers’ profits.”

Judith Ratcliffe