• Dubious_Fart@lemmy.ml · 1 year ago

    This technology should be illegal, and the databases containing the facial/biometric data seized and destroyed.

    • CeeBee@lemmy.world · 1 year ago

      This technology should be illegal

      Why?

      and the databases containing the facial/biometric data seized and destroyed.

      What would that do?

  • lmnjello@lemm.ee · 1 year ago

    This wasn’t just a bad facial recognition issue. During the photo lineup, the detective intentionally used a years-old photograph of the victim that more closely matched the person in the surveillance video, even though she had a current photo available. She also knew the woman in the surveillance video was definitely not eight months pregnant, so it couldn’t possibly have been the woman they identified, but she arrested the victim anyway.

    Yes, the facial recognition gave a false positive, but any reasonable person would have recognized instantly that they had the wrong person. The detective was either incompetent or a liar.

    • doctorcherry@lemmy.ml · 1 year ago

      Incompetent. You don’t even need an undergraduate degree to be a detective in Detroit:

      Must be at least 18 years of age
      Must be a United States citizen
      Must have 20/20 vision or corrected 20/20 vision
      Must have normal color and depth perception vision
      Must hold a valid driver’s license
      Must have a good driving record
      Must have a high school diploma or GED
      Must have no felony convictions

      This technology is going to be a nightmare when combined with decades of deliberately selecting for low IQ in the police hiring process.

      It seems obvious to check whether the woman in the surveillance footage is also pregnant. But if you have poor critical reasoning skills and you’re only looking for confirmation, I can totally see how a pregnancy could be missed.

      Incompetence.

  • AutoTL;DR@lemmings.world (bot) · 1 year ago

    This is the best summary I could come up with:
    According to The New York Times, this incident is the sixth recent reported case where an individual was falsely accused as a result of facial recognition technology used by police, and the third to take place in Detroit.

    Advocacy groups, including the American Civil Liberties Union of Michigan, are calling for more evidence collection in cases involving automated face searches, as well as an end to practices that have led to false arrests.

    A 2020 post on the Harvard University website by Alex Najibi details the pervasive racial discrimination within facial recognition technology, highlighting research that demonstrates significant problems with accurately identifying Black individuals.

    Further, a statement from Georgetown on its 2022 report said that as a biometric investigative tool, face recognition “may be particularly prone to errors arising from subjective human judgment, cognitive bias, low-quality or manipulated evidence, and under-performing technology” and that it “doesn’t work well enough to reliably serve the purposes for which law enforcement agencies themselves want to use it.”

    The low accuracy of face recognition technology comes from multiple sources, including unproven algorithms, bias in training datasets, different photo angles, and low-quality images used to identify suspects.

    Reuters reported in 2022, however, that some cities are beginning to rethink bans on face recognition as a crime-fighting tool amid “a surge in crime and increased lobbying from developers.”


    I’m a bot and I’m open source!

  • ChrisostomeStrip@lemmy.world · 1 year ago

    What a time to be alive. Remember when China was frowned upon for using this kind of technology? And here we are, in the country of freedom…

  • stopthatgirl7@kbin.social · 1 year ago (edited)

    All six individuals falsely accused have been Black. The Detroit Police Department runs an average of 125 facial recognition searches per year, almost exclusively on Black men, according to data reviewed by The Times.

    Oh.

    It’s particularly risky for dark-skinned people. A 2020 post on the Harvard University website by Alex Najibi details the pervasive racial discrimination within facial recognition technology, highlighting research that demonstrates significant problems with accurately identifying Black individuals.

    Oh. I see.

  • Jourei@lemm.ee · 1 year ago

    They should run the developers’ and directors’ faces through the system before deploying it. Modify the images so their skin is a bit darker.

    Either they actually make it work or they get to live with a constant fear of being arrested for something bullshit.

    I know I might as well ask for the moon from the sky. :(