After Nine blamed an ‘automation’ error in Photoshop for producing an edited image of Georgie Purcell, I set out to find out what the software would do to other politicians.

  • pixxelkick@lemmy.world · edited · 10 months ago

    They didn’t post the pics of the women they started with.

    That’s enough for me to discard this as clickbait at best.

    Post your data if you want me to take your journalism seriously.

    If you want a fair comparison, start with a woman wearing an actual suit, followed by a woman wearing a button-up shirt, as your shoulders-up pic, and see what gets generated.

    20 bucks says the woman in the suit… generates the rest of the suit.

    And 20 bucks says the button-up shirt… generates jeans or something similar as well.

    If you compare apples to oranges, don’t pretend that getting oranges instead of apples is surprising.

    The fact that people aren’t calling this out here speaks volumes too. We need to have higher standards than garbage-quality journalism.

    “Men clearly wearing suits generate the rest of the suit, whereas women generate ??? Who knows — we won’t even post the original pic we started with, so just trust me bro.”

    0/10

      • pixxelkick@lemmy.world · 10 months ago

        Everyone shut the fuck up for a second.

        Why does this extremely specific gif exist? Who made it? Are there more? I love it lol

        • ArxCyberwolf@lemmy.ca · 10 months ago

          Judging by the style of the smiley faces, this is some ancient gif from a forum. It was probably for when people would make wild claims in a thread without evidence, and others wanted proof.

    • CrystalEYE@kbin.social · 10 months ago

      @pixxelkick Thank you! This article is clearly written with a bias. Photoshop’s AI generator tries to interpret the whole picture in order to expand the cropped image. So in the case of the original Georgie Purcell photo, the AI sees “woman, tank top, bare shoulders and arms, water in the background,” and of course it tries to generate clothing it considers fitting for the seaside or a beach.
      I just tried the same with a male model in a tank top on a beach, and it did not magically put him in a suit — it generated swimwear.
      If I use a picture of Georgie Purcell in more formal clothing, it generates more formal clothing.

      Georgie Purcell in generated swimwear
      Georgie Purcell in generated suit/dress
      Male in generated swimwear

      But, to be fair, this quote from the article:

      But what it proves is that Adobe Photoshop’s systems will suggest women are wearing more revealing clothing than they actually are without any prompting. I did not see the same for men.

      is indeed true. In general, pictures of women tend to generate more “sexy” output than pictures of men.

      And, of course, Nine clearly edited the image badly and could have chosen another generated output with no effort at all.

      @LineNoise