 
No. Here’s why.
Today I noticed a couple of start-ups recruiting lawyers to train AI models in legal reasoning. “No contracts”, “no commitments of any kind”, PhD and master’s students or graduates preferred.
My first thought was that the legal profession should go after the scumbags applying for these positions and burn them at the stake for selling out their profession. Then I realised: they could not sell it out even if they wanted to.
Firstly, the whole idea rests on a false premise: that higher education reflects the way a lawyer reasons and thinks in practice. The worlds of legal practice and legal academia are vastly different, and understanding how a legal academic thinks is no clue to how a practising lawyer thinks. Any model trained this way will be flawed.
Secondly, it will not be able to deliver legal advice. Why not? Because an AI application is not a natural person and cannot hold a practising certificate. End of story.
At best, the AI, no matter how well trained and debugged, will be able to deliver “advice” of some sort, peppered with disclaimers. It will still need to be checked and signed off by a real lawyer.
In the meantime, some PhD and master’s students may earn a little pocket money on the side by providing their input for a fee, at least until the start-up they are working for burns through its investors’ cash and disappears, as most of them do.
Our jobs as lawyers are safe. Not just “for now”. For now and forevermore.
(This article was not written by AI — nor with its assistance, in case you were wondering.)
Author: Eric Kalde