• Gork@lemm.ee
    8 months ago

    Legal questions are very case-sensitive, no pun intended. It’s like asking an extremely specific programming implementation question: LLMs don’t do well with those prompts because the narrower the focus, the less of their training data applies and the more likely they are to just straight up hallucinate. And they don’t yet have the nuance to recognize that an area of case law may be unsettled and sit in a legal grey area.

  • synae[he/him]@lemmy.sdf.org
    8 months ago

    I had a realization recently. These things are the reverse of the mythical Cassandra: no one can ever be sure their information is correct, yet everyone trusts what they say.

  • AdmiralShat@programming.dev
    8 months ago

    This article is pointless.

    These chatbots are generative. Yes, they generate fake laws, fake cases, and fake outcomes; that’s how they work. Expecting anything else from something designed to create is pointless and a waste of time. They aren’t designed not to lie. That’s so well established at this point that I think the people doing this research on fucking ChatGPT for law questions are either mooching funding just to keep a job or are bored.

    If they trained an LLM on nothing but a dictionary, law books, and actual case outcomes, it would probably be a reasonable tool for law offices. Make sure it only outputs indexes to real cases and real laws, and make sure law offices are legally required to follow up on and verify those citations. I see that as an actual use case for these types of bots.
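    The "only output indexes to real cases" idea boils down to a verification step: check every citation the model emits against a trusted index before anyone relies on it. A minimal sketch of that gate, assuming a hypothetical `REAL_CITATIONS` set standing in for a real legal database (the citation strings and function name are illustrative, not any actual legal API):

    ```python
    # Hypothetical citation gate: the model's output is only trusted if every
    # citation it produces exists in a verified index of real cases.
    REAL_CITATIONS = {
        "410 U.S. 113",   # stand-in entries; a real system would query
        "347 U.S. 483",   # an authoritative legal database instead
    }

    def verify_citations(cited: list[str]) -> tuple[list[str], list[str]]:
        """Split model-emitted citations into verified and flagged lists."""
        verified = [c for c in cited if c in REAL_CITATIONS]
        flagged = [c for c in cited if c not in REAL_CITATIONS]
        return verified, flagged

    verified, flagged = verify_citations(["410 U.S. 113", "123 Fake 456"])
    ```

    Anything in `flagged` would go back to a human before the output leaves the office, which is the "law offices have to verify" half of the proposal.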

    There is still a lot of nuance involved, especially for a layman who wouldn’t even know the terminology needed to start the search, so a human lawyer would (and should) still be involved. But these tools could absolutely help speed up the judicial system and probably lower costs.