this post was submitted on 11 Feb 2024
Futurology
Is there no risk of the LLM hallucinating cases or laws that don't exist?
How to use ChatGPT to ruin your legal career.
AI does help with discovery, so firms no longer need to spend 8 days scanning emails before trial, but they'll still need lawyers and junior lawyers.
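To make the discovery point concrete, here is a minimal sketch of the kind of relevance triage an AI-assisted review tool automates. It is a toy: plain keyword overlap stands in for the semantic ranking an LLM-backed product would actually do, and the emails, request terms, and `score_relevance` function are all illustrative assumptions, not any real e-discovery API.

```python
import re

# Toy sketch of discovery triage: rank emails by relevance to a request for
# production so reviewers read the likely hits first. Keyword overlap stands
# in for the semantic/LLM ranking a real tool would use.

def score_relevance(email_body: str, request_terms: set[str]) -> float:
    """Fraction of request terms that appear in the email body (0.0 to 1.0)."""
    words = set(re.findall(r"[a-z0-9]+", email_body.lower()))
    terms = {t.lower() for t in request_terms}
    return len(terms & words) / len(terms) if terms else 0.0

emails = [
    {"id": 1, "body": "Please pull the Q3 invoices before Friday."},
    {"id": 2, "body": "Lunch on Friday?"},
    {"id": 3, "body": "Forwarding the Q3 invoices and the audit schedule."},
]
request_terms = {"Q3", "invoices", "audit"}

# Highest-scoring emails go to human reviewers first.
ranked = sorted(emails, key=lambda e: score_relevance(e["body"], request_terms), reverse=True)
for e in ranked:
    print(e["id"], round(score_relevance(e["body"], request_terms), 2))
```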
GPT-4 is dramatically less likely to hallucinate than GPT-3.5, and we're barely at the start of the exponential growth curve.
Is there a risk? Yes. But humans make the same kinds of mistakes, and all AI has to do is be better than humans, a milestone it already has within sight.
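On the hallucination risk specifically, here is a minimal sketch of one obvious guardrail: treat every citation an LLM drafts as unverified until it matches a trusted source. The citation regex, the `VERIFIED_CITATIONS` set, and the example draft are simplified assumptions for illustration, not a real legal-research API.

```python
import re

# Simplified guardrail against hallucinated authority: extract case citations
# from LLM output and flag any that cannot be matched against a trusted
# database before they go into a filing. The regex only covers simple
# "Volume Reporter Page" citations, and VERIFIED_CITATIONS is a stand-in
# for a real legal-research lookup.

CITATION_RE = re.compile(r"\b\d{1,4}\s+[A-Z][\w.]*(?:\s+[A-Z][\w.]*)?\s+\d{1,5}\b")

VERIFIED_CITATIONS = {
    "410 U.S. 113",   # Roe v. Wade -- a real, checkable citation
    "347 U.S. 483",   # Brown v. Board of Education
}

def audit_citations(llm_output: str) -> list[str]:
    """Return citations that could not be verified and need human review."""
    return [c for c in CITATION_RE.findall(llm_output) if c not in VERIFIED_CITATIONS]

# "Smith v. Nonexistent Corp." is a made-up citation used purely for illustration.
draft = "Compare Roe v. Wade, 410 U.S. 113, with Smith v. Nonexistent Corp., 123 F.4th 456."
print(audit_citations(draft))  # ['123 F.4th 456'] -> flag for human verification
```

A real workflow would query an actual citator instead of a hard-coded set, but the principle is the same: nothing the model cites goes out the door unchecked.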