• DokPsy@infosec.pub
    1 year ago

    Letting a language model do the work of thinking is like building a house and using a circular saw to drive nails. It will do it, but you should not trust the results.

    It is not Google. It can, will, and has made up facts, as long as they fit the expected format.

    Not proofreading and fact-checking the output, at the very least, is beyond lazy and a terrible use of the tool. Using it to create the end product and using it as a tool in the creation of an end product are two very different things.