Driverless cars worse at detecting children and darker-skinned pedestrians, say scientists. Researchers call for tighter regulations following major age- and race-based discrepancies in AI autonomous systems.

  • Rinox@feddit.it · 1 year ago

    Isn’t that true for humans as well? I know I find it harder to see children due to their small size, and dark-skinned people at night due to, you know, low contrast (especially if they’re wearing dark clothes).

    Human vision be racist and ageist

    PS: but yes, please do improve the algorithms.

    • tony@lemmy.hoyle.me.uk · 1 year ago

      Part of the children problem is distinguishing between ‘small’ and ‘far away’. Humans seem reasonably good at it, but from what I’ve seen AIs aren’t there yet.
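
      To make the ambiguity concrete: under a simple pinhole camera model, projected size is focal length × real height ÷ distance, so a child up close and an adult farther away can occupy exactly the same pixels. A quick illustration (the focal length and heights are made-up numbers):

      ```python
      # Pinhole projection: pixel height = focal_length_px * real_height_m / distance_m
      FOCAL_LENGTH_PX = 1000  # hypothetical camera focal length, in pixels

      def pixel_height(real_height_m: float, distance_m: float) -> float:
          return FOCAL_LENGTH_PX * real_height_m / distance_m

      child = pixel_height(1.2, 20.0)  # 1.2 m child at 20 m
      adult = pixel_height(1.8, 30.0)  # 1.8 m adult at 30 m
      print(child, adult)  # both 60.0 px: size alone can't tell 'small' from 'far away'
      ```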

  • dangblingus@lemmy.dbzer0.com · 1 year ago

    I’m sick of the implication that computer programmers are intentionally or unintentionally adding racial bias to AI systems. As if a massive percentage of software developers in NA aren’t people of color. When can we have the discussion where we talk about how photosensitive technology and contrast ratios work?
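
    For starters, here’s the arithmetic. Michelson contrast is (Lmax − Lmin) / (Lmax + Lmin); plugging in rough luminance values for skin against a dim background shows how fast contrast collapses at night (the numbers below are illustrative guesses, not measurements):

    ```python
    # Michelson contrast: 0 means invisible against the background, 1 is maximal.
    def michelson_contrast(subject_lum: float, background_lum: float) -> float:
        lo, hi = sorted((subject_lum, background_lum))
        return (hi - lo) / (hi + lo)

    BACKGROUND = 5.0  # dim night-time background, arbitrary luminance units
    print(michelson_contrast(60.0, BACKGROUND))  # lighter skin: ~0.85
    print(michelson_contrast(15.0, BACKGROUND))  # darker skin:  ~0.50
    ```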

    • pageflight@lemmy.world · 1 year ago

      There’s still a huge racial disparity in tech workforces. For one example, according to Google’s diversity report (page 66), their tech workforce is 4% Black versus 43% White and 50% Asian. Over the past 9 years (since 2014), that’s an increase from 1.5% to 4% for Black tech workers at Google.

      There’s also plenty of news and research illuminating bias in trained models, from commercial facial recognition sets trained with >80% White faces to Timnit Gebru being fired from Google’s AI Ethics group for insisting on acknowledging bias, and many more.

      I also think saying it’s hard overlooks serious aspects of racial bias. Certainly, a photographic representation of a Black face is going to provide less contrast within the face than for lighter skin. But that’s also ingrained bias. The thing is, people (including software engineers) solve tough problems constantly, have to choose which details to focus on, and rely on our experiences, and our experience is centered around ourselves. Of course racist outcomes and stereotypes are natural, but we can identify the likely harmful outcomes and work to counter them.
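
      One concrete version of “work to counter them” is reweighting the training loss so underrepresented groups aren’t drowned out by the majority class. A minimal sketch, with made-up group labels and counts rather than any real pipeline:

      ```python
      from collections import Counter

      # Hypothetical per-example group annotations in a detection training set.
      groups = ["lighter"] * 800 + ["darker"] * 200

      counts = Counter(groups)
      n, k = len(groups), len(counts)

      # Inverse-frequency weights: each group contributes equally to the loss.
      weights = {g: n / (k * c) for g, c in counts.items()}
      print(weights)  # {'lighter': 0.625, 'darker': 2.5}
      # During training, multiply each example's loss by its group's weight.
      ```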

  • angelsomething@lemmy.one · 1 year ago

    Easy solution is to enforce a buddy system: every black person walking alone at night must be accompanied by a white person. /s

  • OrdinaryAlien@lemm.ee · 1 year ago

    DRIVERLESS CARS: We killed them. We killed them all. They’re dead, every single one of them. And not just the pedestmen, but the pedestwomen and the pedestchildren, too. We slaughtered them like animals. We hate them!

  • 666dollarfootlong@lemmy.world · 1 year ago

    Wouldn’t good driverless cars use radar or lidar or whatever? Seems like the biggest issue here is that darker skin tones are harder for cameras to see.

    • MSids@lemmy.sdf.org · 1 year ago

      Tesla removed the LiDAR from their cars, a step backwards if you ask me.

      Edit: Sorry, radar, not lidar.

    • mint_tamas@lemmy.world · 1 year ago

      I think many driverless car companies insist on only using cameras. I guess lidars/radars are expensive.

  • Tony Bark@pawb.social · 1 year ago

    Maybe if we just, I dunno, funded more mass transit and made it more accessible? Hell, trains are way better at being automated than any single car.

    • Fushuan [he/him]@lemm.ee · 1 year ago

      Yes, but we should also improve tools for detecting kids and dark-skinned people; they aren’t just for driving cars. Efficient, fast, and accurate people-detection and tracking tools can be used for a myriad of other things.

      Imagine a system that tracks the number of people in different sections of a store, or one that counts people going in and out of stores to control how many are inside… There are a lot of tools that already do this, and they work somewhat reliably, but they can be improved, and the models being developed for cars will then be reused. R&D is a good thing.

    • Lucidlethargy@sh.itjust.works · 1 year ago

      The trains in California are trash. I’d love to see good ones, but this isn’t even a thought in the heads of those who run things.

      Dreaming is nice… but reality sucks, and we need to deal with it. Self-driving cars are a wonderful answer, but Tesla is fucking it up for everyone.

      • fresh@sh.itjust.works · 1 year ago

        Strongly disagree. Trains are nice everywhere in the world. There’s no reason they can’t be nice in the US. Cars are trash. Strip malls are trash. Giant parking lots are trash. The sky high cost of cars is trash. The environmental impact of cars is trash. The danger of cars is trash. Car centric urban planning is trash.

        Self-driving cars are safer… than the most dangerous thing ever. But because cars are inherently so dangerous, they are still more dangerous than just about any other mode of transportation.

        Dreaming is nice, but that’s all self-driving cars are right now. I don’t see why we don’t have better dreams.

        • duffman@lemmy.world · 1 year ago

          Chiming in from Seattle: we just built light rail up here, and it’s awful how slow they made it. It has its own track… it’s insane that it’s slower than driving in traffic. But they wanted to serve every neighborhood possible instead of realizing trains are not a last-mile solution unless you build cities specifically around them.

          • SkepticalButOpenMinded@lemmy.ca · 1 year ago

            Reporting from Vancouver, Canada. Our skytrain system is very fast and reliable. Comes every 1-3 minutes. I’ve never heard any complaints.

            I looked this up and I was surprised to learn that the skytrain’s speed is 25-40 km/h (about 15-25 mph) while Seattle’s Link goes 35-55 mph. That sounds very fast for a city transit system! Are you sure it’s slower than a car in traffic, with all the stop lights and rush hour? I’m skeptical, but I’ve never used Link.

            • duffman@lemmy.world · 1 year ago

              Google Maps right now puts the light rail at one hour for what is a 23-minute drive. Last time I rode it, it took over an hour and a half for that same trip, and that’s excluding the time spent waiting for the train.

      • zephyreks@programming.dev · 1 year ago

        Trains in California suck because of government dysfunction across all levels. At the municipal level, you can’t build shit because every city is actually an agglomeration of hundreds of tiny municipalities that all squabble with each other. At the regional level, you get NIMBYism that doesn’t want silly things like trains knocking down property values… And these people have a voice, because democracy I guess (despite there being a far larger group of people that would love to have trains). At the state level, you have complete funding mismanagement and project management malfeasance that makes projects both incredibly expensive and developed with no forethought whatsoever (Caltrain has how many at-grade crossings, again?).

        This isn’t a train problem, it’s a problem with your piss-poor government. At least crime is down, right?

  • ChromeSkull@lemmy.world · 1 year ago

    A single FLIR camera would help massively. They don’t care about colour or height, only temperature.
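
    Roughly, the idea would be: threshold thermal pixels around human surface temperature and flag warm blobs, regardless of visible-light contrast. A toy sketch on a fake frame (NumPy only; a real FLIR pipeline is obviously more involved):

    ```python
    import numpy as np

    # Fake 8x8 thermal frame in °C: cool night scene with one warm "person".
    frame = np.full((8, 8), 10.0)
    frame[2:6, 3:5] = 31.0  # clothed human surface temperature, roughly 28-34 °C

    # Flag pixels in a plausible human temperature band.
    mask = (frame > 25.0) & (frame < 40.0)
    print(mask.sum(), "warm pixels")  # 8 warm pixels -> candidate pedestrian
    ```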

    • UsernameIsTooLon@lemmy.world · 1 year ago

      I could make a warm water balloon in the shape of a human and it would stop the car then. Maybe a combination of various types of technologies? You’d still have to train the model on all kinds of humans, though.
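
      A late-fusion vote is one way that combination tends to be sketched: any single modality earns caution, agreement earns a stop, since braking for a warm balloon is a cheap failure while missing a person is not. A toy example (the detector outputs are invented):

      ```python
      # Toy late fusion over three independent detectors.
      def fuse(camera: bool, thermal: bool, lidar: bool) -> str:
          votes = camera + thermal + lidar  # bools sum as 0/1
          if votes >= 2:
              return "probable pedestrian: stop"
          if votes == 1:
              return "possible obstacle: slow down"
          return "clear"

      print(fuse(camera=False, thermal=True, lidar=True))  # dark-clothed pedestrian at night -> stop
      print(fuse(camera=True, thermal=True, lidar=False))  # the human-shaped warm balloon also stops the car
      ```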

  • Darkassassin07@lemmy.ca · 1 year ago

    This has been the case with pretty much every single piece of computer-vision software to ever exist…

    Darker individuals blend into dark backgrounds better than lighter-skinned individuals, and dark backgrounds are more common than light ones, i.e. the absence of sufficient light is more common than 24/7 well-lit environments.

    Obviously computer vision will struggle more with darker individuals.

    • zephyreks@programming.dev · 1 year ago

      If the computer vision model can’t detect edges around a human-shaped object, that’s usually a dataset issue or a sensor (data collection) issue… And it sure as hell isn’t a sensor issue because humans do the task just fine.

      • WhiteHawk@lemmy.world · 1 year ago

        Do they? People driving at night quite often have a hard time seeing pedestrians wearing dark colors.

      • duffman@lemmy.world · 1 year ago

        And it sure as hell isn’t a sensor issue because humans do the task just fine.

        Sounds like you have never reviewed dash camera video or low light photography.

  • camillaSinensis@reddthat.com · 1 year ago

    I’d assume that’s either due to bias in the training set or poor design choices. The former is already a big problem in facial recognition, and can’t really be fixed unless we update the datasets. With the latter, this could be using things like visible light for classification, where the contrast between target and background won’t necessarily be the same for all skin tones and times of day. Cars aren’t limited by DNA to only grow a specific type of eye, and you can still create training data from things like infrared or LIDAR. In either case, it goes to show how important it is to test for bias in datasets and deal with it before actually deploying anything…
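
    Testing for that bias is mostly bookkeeping: split validation results by annotated skin tone (or age, or lighting) and compare detection rates per group. A minimal sketch with fabricated labels:

    ```python
    from collections import defaultdict

    # (group, was_detected) pairs from a hypothetical validation run.
    results = ([("lighter", True)] * 90 + [("lighter", False)] * 10
             + [("darker", True)] * 70 + [("darker", False)] * 30)

    hits, totals = defaultdict(int), defaultdict(int)
    for group, detected in results:
        totals[group] += 1
        hits[group] += detected

    for group in totals:
        print(group, hits[group] / totals[group])  # lighter 0.9, darker 0.7 -> flag the gap
    ```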

  • Squirrel@thelemmy.club · 1 year ago

    Okay? It’s not like these systems are actually intelligent. Anything different from the majority of cases is going to be at an inherent disadvantage in being detected, right? At the volume of data used for their models, surely it’s just a matter of statistics.

    Maybe I’m wrong (and I’m surely using the wrong terminology), but it seems like that must be the case. It’s not some issue of human racial bias, just a bias based on relative population. Or is my understanding that flawed?

    Mind you, I’m not saying it doesn’t need to be remedied posthaste.
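
    For what it’s worth, that statistics intuition is easy to reproduce with a toy model: fit a detector on data where one group dominates, and the minority group gets missed more often with no human prejudice anywhere in the code. A self-contained sketch (all numbers invented):

    ```python
    import random

    random.seed(0)

    # Synthetic 1-D "appearance" feature: group A centred at 0.0, group B at 1.0.
    def sample(group: str) -> float:
        return random.gauss(0.0 if group == "A" else 1.0, 0.6)

    # Imbalanced training set: 950 As, 50 Bs, all genuinely "person".
    train = [sample("A") for _ in range(950)] + [sample("B") for _ in range(50)]

    # One-class "detector": accept anything close to the mean training feature.
    mean = sum(train) / len(train)
    detect = lambda x: abs(x - mean) < 1.0

    for group in ("A", "B"):
        rate = sum(detect(sample(group)) for _ in range(1000)) / 1000
        print(group, round(rate, 2))  # group B is detected noticeably less often
    ```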

    • hglman@lemmy.ml · 1 year ago

      Yes, the issue is most likely that the data used to teach the systems what people look like is biased towards white men.

  • Dizzy Devil Ducky@lemm.ee · 1 year ago

    Ya know, I am not surprised that even self-driving cars somehow ended up accidentally racist and wanting to murder children. Even though this is a serious issue, it’s still kinda funny in a messed-up way.

  • RobotToaster@infosec.pub · 1 year ago

    The study only used images and an image-recognition system, so it will only be accurate for self-driving systems that operate purely on image recognition. The only one that currently does that is Tesla, AFAIK.