I lost my job after AI recruitment tool assessed my body language, says make-up artist

A make-up artist says she lost her job at a leading brand after an AI recruitment tool that used facial recognition technology marked her down for her body language.

  • Flying Squid@lemmy.world · +154/-4 · 9 months ago

    Absolute bigotry against neurodivergent people. Normalizing body language is exactly the sort of prejudice neurodivergent people have to put up with all the time.

      • Flying Squid@lemmy.world · +66/-1 · 9 months ago

        Thinking more on this, you don’t even need to have autism or ADHD or any other form of diagnosable neurodivergence. You could just be an introverted person who doesn’t do well in such situations.

        And then there’s the nationality issue. Different nationalities and cultures have different body language. Making eye contact is considered disrespectful in Japan but expected in the U.S. So what does this AI do with a Japanese candidate?

        • originalucifer@moist.catsweat.com · +35/-2 · 9 months ago

          resting bitch face is a thing. i am constantly having to tell people i am not angry, that’s just what i look like.

          like Jim Breuer and his ‘resting stoned face’

          • Flying Squid@lemmy.world · +18/-1 · 9 months ago

            It sounds like a few people stand to get rich from suing whatever company made this crap into oblivion.

    • psud@lemmy.world · +18 · 9 months ago

      And people from a different culture than the people who trained the AI

    • agitatedpotato@lemmy.world · +4 · 9 months ago

      That was my first thought too, anyone with a condition with a sensory component is inherently discriminated against.

  • RotaryKeyboard@lemmy.sdf.org · +71/-1 · 9 months ago

    If a company requires you to re-apply for the job you already have, you lost your job long before you ever recorded yourself with HireVue.

    • treadful@lemmy.zip · +34 · 9 months ago

      These kinds of tools can easily just be the fall guy. An excuse they use to get rid of you. Then if you complain, they can just be like “the AI did it!”

      • Taleya@aussie.zone · +30 · 9 months ago · edited

        Suddenly, viciously reminded of that quote: “If a machine cannot be held accountable, it cannot be allowed to make a management decision” (paraphrased)

        …should prooooobably start legislating that shit

      • wizardbeard@lemmy.dbzer0.com · +7 · 9 months ago

        Just like return to office, relocations, and the overwhelming majority of performance metrics.

        Cheap excuses for shit managers who can’t or won’t manage their employees (including firing them) properly.

  • Meron35@lemmy.world · +47 · 9 months ago

    This isn’t new. Vendors such as HireVue have been pushing out “AI” interviewing platforms that automatically judge your body language, fashion choices, tone of voice, etc. since at least 2018.

    Companies are using AI to stop bias in hiring. They could also make discrimination worse. - https://www.vice.com/en/article/qvq9ep/companies-are-using-ai-to-stop-bias-in-hiring-they-could-also-make-discrimination-worse

    • Shirasho@lemmings.world · +42 · 9 months ago

      For a brief moment I worked in that industry as a programmer. The whole point is not to find the most qualified candidate but to find the one that fits into the company culture the most in order to reduce turnover. These algorithms will throw away applications from people of color because they have “behaviors not in line with the company culture” or applications from disabled people because they would “not react properly to certain situations”.

      Of course they aren’t explicitly rejecting these people, but the questions and answers on the tests for applications are specifically and painstakingly crafted to filter out these people without making it clear what type of person the question is trying to filter out.

      This doesn’t necessarily have to do with the AI in question, but my point is that the entire hiring/firing process is totally fucked, and companies are constantly looking for ways to get around discrimination laws.

      • linearchaos@lemmy.world · +15/-1 · 9 months ago

        Using an AI to grade someone’s body language seems like a horrible thing.

        Although I will say there is some validity to being careful about who you hire, company-culture-wise, and I’m not talking about race, gender, or disability.

        We’ve turned down the ‘best programmer’ numerous times, some people that really had some solid skills, because they came in aggressive and brash.

        One guy got his “sorry, but no thanks” and replied, look at my resume, I’m an absolute master at everything you do. And he wasn’t kidding; he was very good. We told him we recognized his skills, but that he had been difficult and abrasive just in the interviews, and there was no way we could subject the rest of the company to that. He unleashed a string of profanities and asked couldn’t we just have him work somewhere else on his own separate projects. No, that wasn’t going to be an option.

        Nobody wants to hire somebody that’s going to make a workplace toxic. That means that sometimes you turn down some of the better skilled opportunities, but you can always find somebody nicer and train/educate them.

        As far as race, gender, and quirks go: we have you meet with everybody, and a group takes you out to lunch. You can be shy, flighty, uncomfortable, awkward; the basic test is, can you mostly do the job, and would other people want to work with you? If the people come back with the answer of no, we don’t bring you on. We’ve done that since the very beginning, so everybody there is already pretty much a tolerant, nice person.

        I had this one guy interview for my department. He made it through the morning interviews, no problems. Gold star. Then the lunch crew took him out to lunch. He turned it into a people-watching affair and started making horrible comments about all the people coming in the door. One of the strongest personalities I know was on that lunch; they came back and told me he made them very uncomfortable. I sent him packing.

        I hope we don’t get to the point where all jobs are using AI to weed people out without humans checking behind it.

      • Nawor3565@lemmy.blahaj.zone · +7 · 9 months ago

        Funny. That sounds exactly like how they tried to use “intelligence tests” to prevent Black people from voting. The questions didn’t explicitly exclude Black people, but were written in a vague and subjective way so that the test-giver could claim any answer was right or wrong and thereby exclude anyone they wanted.

    • TheObviousSolution@lemm.ee · +13 · 9 months ago

      They are not using it to stop bias. If history has proven anything, it’s that AI is biased as shit. They are using AI to excuse bias, because “computers ergo cold hard logic”, while ignoring that nobody is training these systems in ethical and moral considerations.

  • bionicjoey@lemmy.ca · +29/-1 · 9 months ago

    To paraphrase Groucho Marx, I don’t want to work anywhere that would use an AI body language analyzer in their recruitment pipeline.

    No way a place that does that has a good culture.

    • BearOfaTime@lemm.ee · +6/-2 · 9 months ago

      Well, at least not very soon, for sure.

      May have had a good culture, then the C-suite jackasses decided to destroy it. And that’s being optimistic (my default is cynicism).

    • Khanzarate@lemmy.world · +13/-1 · 9 months ago

      Only if the AI used discriminatory criteria from a protected class.

      They CAN fire you for feeling you’re likely to sue. They can’t retaliate against a lawsuit, but there isn’t one yet. At-will employment sucks, and the thing that protects against this is a union, not discrimination laws.

  • TheObviousSolution@lemm.ee · +22 · 9 months ago

    It classified her as “most likely to be critical and sue company for ethical violations”.

    But seriously, I don’t know what it is with the AI craze. Today, HAL 9000 seems like a documentary, because that’s how most of these AIs behave: highly reliable until they suddenly go completely off the deep end. They are at their worst when deployed in highly subjective and dynamic situations, like the one mentioned in this article.

    • Theoriginalthon@lemmy.world · +5 · 9 months ago

      All I’m reading from this is that the company has “ethical violation” problems and should probably be investigated