New research shows driverless car software is significantly more accurate with adults and light-skinned people than with children and dark-skinned people.

  • albinanigans · 10 months ago

    Great, self-driving cars can be childfree and racist, just like some human drivers!

    • stopthatgirl7 (OP) · 10 months ago

      Researchers ran more than 8,000 images through the software and found that the self-driving car systems were nearly 20% better at detecting adult pedestrians than children, and more than 7.5% better at detecting light-skinned pedestrians than dark-skinned ones. The AI was even worse at spotting dark-skinned people in low-light settings, making the tech even less safe at night.

      I’d say a 20% difference is pretty significant.
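
      For anyone curious what that comparison actually looks like, here's a rough sketch of tallying a detector's hit rate per group, which is the kind of measurement the study describes. The file name, column names, and numbers are made up for illustration; this is not the researchers' actual pipeline.

      ```python
      # Rough sketch: compare a pedestrian detector's hit rate across groups.
      # "detections.csv" and its columns are hypothetical placeholders:
      # one row per labeled pedestrian, with 'group' (e.g. 'adult', 'child',
      # 'light-skinned', 'dark-skinned') and 'detected' ('1' or '0').
      from collections import defaultdict
      import csv

      def detection_rate_by_group(rows):
          hits, totals = defaultdict(int), defaultdict(int)
          for row in rows:
              totals[row["group"]] += 1
              hits[row["group"]] += int(row["detected"])
          return {g: hits[g] / totals[g] for g in totals}

      with open("detections.csv", newline="") as f:  # hypothetical results file
          rates = detection_rate_by_group(csv.DictReader(f))

      # e.g. {'adult': 0.86, 'child': 0.67} would be roughly the gap cited above
      for group, rate in sorted(rates.items()):
          print(f"{group}: {rate:.1%} of labeled pedestrians detected")
      ```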

        • stopthatgirl7 (OP) · 10 months ago

          I would say that's not actually a relevant question at all, but a form of whataboutism, since this is looking only at driverless programs: comparing them against themselves and asking what problems in the programming and training models could produce that difference.
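
          To make "problems in the programming and training models" concrete: the simplest candidate is under-representation in the training labels. Here's a rough sketch of how you might check for that; the annotation file and field names are invented for illustration, not taken from the study.

          ```python
          # Hypothetical audit: how often does each group appear in the training labels?
          # "train_annotations.json" is an invented placeholder: a list of objects,
          # each with a "group" field for the labeled pedestrian.
          import json
          from collections import Counter

          with open("train_annotations.json") as f:
              annotations = json.load(f)

          counts = Counter(a["group"] for a in annotations)
          total = sum(counts.values())

          for group, n in counts.most_common():
              print(f"{group}: {n} labeled pedestrians ({n / total:.1%} of the training set)")
          # If children or dark-skinned pedestrians are only a small slice of the
          # labels, the detector simply has fewer examples to learn those groups from.
          ```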

          • Ganondorf · 10 months ago

            a form of whataboutism

            Agreed. The argument of matching autonomous vehicle perception against human perception should be completely irrelevant. When an autonomous vehicle has that significant a margin of error, who ends up being responsible for the accident? When humans are involved, the driver is responsible. Is the manufacturer liable for every accident caused by an autonomous vehicle? Guaranteed corporations will rally and lobby to make sure that isn't possible. The situations aren't the same, and a huge selling point of autonomous vehicles has always been that they should be the safest way to pilot a vehicle.