• Patch@feddit.uk · 14 hours ago

    The AI models can be hilariously bad even on their own terms.

    Yesterday I asked Gemini for a population figure (because I was too lazy to look it up myself). First I asked it:

    what is the combined population of place 1, place 2 and place 3

    It answered:

    Unfortunately, providing an absolutely precise, up-to-the-minute combined population figure is difficult. Population numbers are constantly changing.

    To find these figures, I would recommend checking the Office for National Statistics (ONS) website. They provide detailed population statistics for local authorities in England.

    On a whim, I asked it again as:

    combined population of place 1, place 2 and place 3

    And then it gave me the answer, sweet as a nut.

    Apparently I was being too polite with it, I guess?
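
    (If anyone wants to poke at this phrasing sensitivity outside the chat UI, here's a rough sketch using the google-generativeai Python SDK. The model name and API key handling are my assumptions; I was only using the web interface, so this isn't exactly what Gemini's app does.)

        # Rough sketch: assumes the google-generativeai SDK and the
        # "gemini-1.5-flash" model name; neither is what the web UI uses.
        import google.generativeai as genai

        genai.configure(api_key="YOUR_API_KEY")  # placeholder key
        model = genai.GenerativeModel("gemini-1.5-flash")

        # Try both phrasings and compare the replies.
        for prompt in (
            "what is the combined population of place 1, place 2 and place 3",
            "combined population of place 1, place 2 and place 3",
        ):
            reply = model.generate_content(prompt)
            print(prompt, "->", reply.text, sep="\n")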

    • LinyosT@sopuli.xyz · edited · 7 hours ago

      I slapped a picture of a chart into Gemini because I didn’t know what that type of chart was called, but I wanted to mention it in a uni report. I was too lazy to go looking through chart types and figured that would be quicker.

      I just asked it “What kind of chart is this?” and it ignored the question entirely. Instead it launched into analysing the chart, stating what it was about and offering insights into it. It never told me what kind of chart it was, even though that was the only thing I asked.

      Bear in mind that I had deliberately cropped out any context precisely to stop it from doing that, so everything it gave me was pure hallucination. It was making shit up that I hadn’t asked for.

      I switched to the reasoning model and asked again, and only then did it give me the info I wanted.
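
      (The same experiment is easy to script if you’d rather not use the app. Rough sketch below with the google-generativeai Python SDK; the model name and the chart.png filename are placeholders I made up, not what I actually used.)

          # Rough sketch: SDK usage and model name are assumptions,
          # and "chart.png" is a made-up filename for the cropped chart.
          import google.generativeai as genai
          import PIL.Image

          genai.configure(api_key="YOUR_API_KEY")  # placeholder key
          model = genai.GenerativeModel("gemini-1.5-flash")

          img = PIL.Image.open("chart.png")
          # The SDK accepts mixed text-and-image content in one request.
          reply = model.generate_content(["What kind of chart is this?", img])
          print(reply.text)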