Venture capital-backed AI law firms: malpractice machines or democratizing tools?
A new "law firm" funded by Sequoia Capital has set up a structure so that non-lawyers (i.e. AI) can analyze contracts
A few weeks ago, venture capital firm Sequoia Capital announced its backing of the “first AI-powered” law firm, Crosby AI. According to a report in TechCrunch, “the company is currently promising that its AI software, with human overseers, can review a new client contract in under an hour.”
Since Crosby soft-launched in January of this year, the company claims it has already processed many Master Service Agreements, data processing agreements, and non-disclosure agreements on behalf of startup clients.
Naturally, there is skepticism about whether legal services can be performed without a human double- or triple-checking the details and nuance. But beyond that, in most places, it is not legal for non-lawyers to advise people on legal matters. This is in large part due to ethical concerns: conflicts of interest could inadvertently pit non-lawyers against their clients, for example when fiduciary duties to investors and duties to clients pull in opposite directions. Lawyers are held to a specific ethical standard—and to keep practicing, they must be admitted to the bar.
At the same time, the billable hours model is often not sustainable. The vast majority of people can’t pay $400-$1,000+ an hour for a contract review. Some say that law firms need outside investment to innovate, and that there is little pressure from equity partners to change the billable hours model—but that all of this needs to be transparent and in compliance with bar rules.
We asked a range of people what they make of “AI law firms,” from lawyers to individuals and businesses in need of legal services. Is it concerning that a law firm might be tied to a tech platform owned by investors? Is there a blend of AI and human work that could lower the cost?
Our respondents range from the highly skeptical to the cautiously optimistic—and they include startup founders and employees in need of contract analysis, experts in trust and safety, and lawyers themselves:
Britta Mulderrig, founder of start-up Jasper: “AI’s real power is in filling the gap between no help and human help. It’s not here to replace people. It’s here for the in-between moments.
In the legal world, AI can be a game-changer for startups and small businesses. Most founders don’t have the money or time for traditional firms, so contracts get delayed or skipped. If AI tools can streamline that process, great. The innovation is about access.
Bottom line: AI should be seen as infrastructure. It’s not about eliminating jobs. It’s about expanding access to services that are often out of reach because of cost or availability. Still, it needs to be used responsibly. AI should support, not replace, human connection.”
NYC-based startup employee in her early 30s: “I’d be happy to use a contract review tool for documents I feel that I understand somewhat on my own. For example, I had ChatGPT review my employment agreement for a startup job and my apartment lease, and its analysis seemed fine. For a more complicated exit review scenario, I’d rather have a lawyer review, although I wouldn’t mind if they were also using some kind of AI tool.
It also depends on the stakes of the situation. My partner works at a hedge fund and typically makes the vast majority of household income. His contracts include some potentially complicated financial and economic structures. So his are well worth paying the billable hours to have someone review.
My employment agreements are pretty simple, and as long as the salary details are right and there is nothing wildly off-standard, I’m happy. So if an AI tool is a little bit faster or better than doing it on my own, I would pick that over a lawyer.
Lastly, I’m a bit of a “Nervous Nellie” so I read agreements myself, have ChatGPT review, and sometimes ask a lawyer to read. I’ve been horrified at how many of my friends sign leases or employment contracts without even reading them, because they think ‘well I don’t understand what I’m looking for and can’t afford a lawyer to review anyway, so what’s the point?’
So I would love to see a more affordable option, so more people are able to have the benefit of at least some level of review and advice.”
Tech founder with expertise in trust and safety: “Based on everything I see of Crosby—from their “three-hour turnaround” to the roles they’re hiring for—it’s giving Builder.AI.
These are some pretty green founders, both in general experience and in the landscape they’re in. When you’re working with the law, you want adults in the room, and I’m not seeing that. They’ve each had one real job and while one went to law school, he only spent a year practicing law. I wouldn’t hire them as my lawyer, so why would I trust them to do my legal work?
They don’t have a cookies policy, terms of service, or privacy policy on their site—which is not technically illegal, but again, I wouldn’t take legal advice from a firm that isn’t following legal best practices itself. I think they just built an outsourced dev shop, but for lawyers.
Like they say it’s tech and ‘lawyer in the loop’ to review, but I don’t see the need for some advanced tech with what they’re doing?
They’re focused on some of the easiest legal work around: contracts and agreements. As a business owner, I can review these documents myself in under an hour. Plus, having the knowledge and ability to review contracts as a business owner is important.
Businesses generally have templates and if a potential customer redlines things, they’re generally pretty clear already. And the art of contracts is less about the complex legal artwork they talk about, and more about negotiations and understanding the intent behind the changes—all of which require talking to the other side versus handing over more data to a ‘law firm.’
So now I’m supposed to pay for a lawyer’s time, plus the AI costs? Also, legal being a bottleneck is an issue, but not for the reason they apparently think based on positioning.
With contract review, I don’t think it’s complex legal work, it’s just lawyers having bandwidth to review quickly.
I’ve used AI for a variety of legal needs like interpreting redlines on potential strategies, but never would I fully trust AI to review contracts and approve changes without internal human oversight.
Also: this seems like a privacy and cybersecurity nightmare waiting to happen! Especially because they don’t provide any transparency on their usage or security of your data found within legal docs!
Vincent White, employment attorney: “We certainly have hourly clients and we bill at $1,160.00 per hour, but the current state of AI would make the quality of the AI work highly questionable. (Our hourly work tends to be wealthy clients who do not want to share a percentage on their employment litigation with a law firm, or custom executive compensation package work wherein we negotiate comp packages for high-end executives.)
I produced two long-form videos on the serious issues we have been seeing in AI work product within the legal realm and employment law specifically.
Lots of specificity in those two videos, but the long story short is: AI just can’t do the work yet, and it doesn’t look like it will be able to do credible legal work in the next 3 to 5 years. All current AI models I have touched and our teams have played with—whether large general LLMs or legal-specific targeted products—are nothing more than malpractice machines actively designed to harm clients and law firms who involve themselves with these products.”
What do you think about AI’s intrusion into the legal field? We’d love to hear from you; you can respond to this email directly or I am on Signal at @aristeinhorn.17.
P.S. I talked with friends in high places at large firms who will be happily retired before anything like Crosby comes along. Younger partners, not so much.