AI can’t handle the truth when it comes to the law

Updated: 2024-03-11
Description


Almost one in five lawyers are using AI, according to an American Bar Association survey. But a growing number of legal horror stories involve tools like ChatGPT, because chatbots have a tendency to make things up, such as legal precedents from cases that never happened. Marketplace’s Meghan McCarty Carino spoke with Daniel Ho at Stanford’s Institute for Human-Centered Artificial Intelligence about the group’s recent study on how frequently three of the most popular language models, from OpenAI, Meta and Google, hallucinate when asked to weigh in on or assist with legal cases.




Marketplace / Marketplace Tech Staff