• Nougat@fedia.io · 19 points · 5 months ago

    Read the fine print on the milk one:

    … as well as clam juice, glue, sunscreen, toothpaste, and hand lotion.

  • brian@lemmy.ca · 16 points · 5 months ago

    While it’s amusing that it regurgitates these Onion articles, it’s also a bit worrying that search queries worded a certain way can produce such stark confirmation bias.

    It’s very similar to asking ChatGPT the same question phrased differently and getting entirely different answers.

  • LostXOR@fedia.io · 9 points · 5 months ago

    That’s what you get when you train an AI that can’t tell the difference between fact and fiction to give “correct” information from the internet. It’s also pulling from Reddit and telling people to jump off a bridge.

  • SkyNTP@lemmy.ml · 6 points · 5 months ago

    The only idiots here are the humans who bought into the AI craze without understanding how Large Language Models actually work or what their limitations are.

  • Corroded@leminal.space · 6 points · 5 months ago

    I imagine this is going to get fixed in the next couple of months, but I feel like it would be really funny if Google kept a fork of this current iteration.