• JoBo@feddit.uk · 1 year ago

    > Should the Cruise car have not started moving if there was a person still on the crosswalk? This whole sad affair raises many questions.

    There are some questions but “should cars start moving while a person is still on the crosswalk?” is surely not one of them.

    • medgremlin@lemmy.sdf.org · 1 year ago

      A different question I have is whether the cars have transponders or other communication devices to automatically call emergency services after an accident. I’m assuming not: such a system would probably generate a lot of junk calls, and I doubt a company that didn’t create an algorithm for handling a pedestrian in a crosswalk spent the time to create one for deciding when to call 911.

      That’s one of the big downsides of these driverless cars: if a human driver had accidentally run over the victim, they could have gotten out of the car to assess the situation, call 911, and offer aid. An empty car can only sit there with its hazard lights on and maybe call for emergency services.

      • Unaware7013@kbin.social · 1 year ago

        The auto-driving company should be required to have something like an OnStar operator available any time the vehicle registers an impact or shock above a certain threshold, and any time physical safety measures are required. Local governments should not have to pay for the externalities created by these ‘disruptive technology’ jerks, especially when literal lives are on the line.
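        The escalation rule being proposed could look something like the sketch below. All of the names and the threshold value are invented for illustration; this is not any company’s actual system.

```python
# Hypothetical sketch of an impact-triggered escalation to a human operator.
# The function name, threshold, and parameters are illustrative, not a real API.

ACCEL_THRESHOLD_G = 4.0  # assumed trigger level; a real system would tune this carefully


def should_escalate(accel_g: float, safety_event: bool) -> bool:
    """Escalate to a live operator on a hard impact or any physical safety event."""
    return accel_g >= ACCEL_THRESHOLD_G or safety_event


# A soft bump alone stays below the threshold...
assert not should_escalate(accel_g=0.8, safety_event=False)
# ...but a hard impact, or any safety intervention, pages a human.
assert should_escalate(accel_g=6.2, safety_event=False)
assert should_escalate(accel_g=0.1, safety_event=True)
```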

        • Chozo@kbin.social · 1 year ago

          Cruise has this. I actually applied for the position after my contract doing the same thing with Waymo ended (but was unfortunately ghosted). They’ve got a team of people who monitor the fleets in real time, mostly helping a “stuck” car by identifying any objects or street signs that have confused the SDC so that it can proceed on its course. But they also have protocols in place for reporting any collisions as soon as they happen. I’m willing to bet that Cruise called emergency services before anybody on the scene did.

    • Chozo@kbin.social · 1 year ago

      Without knowing what type of vehicle the first car was, it’s hard to say how this played out. If it was a van, truck, or something else that could easily have obstructed the Cruise vehicle’s lidar, or if the other vehicle stopped ahead of the crosswalk line, the SDC would’ve had little to no way of knowing that there was anybody in the crosswalk.

      Can’t know for sure unless Cruise releases the video to the public, which they’re unlikely to do until the police do their investigation.

      • SheeEttin@lemmy.world · 1 year ago

        If your vision of the crosswalk is obstructed, you don’t proceed through until it’s unobstructed. That’s true whether it’s radar, lidar, or vision. Truck in the way? Pull up as far as you can safely see, then look and proceed if clear.
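        The "pull up, look, proceed if clear" rule above can be sketched as a tiny decision function. The names and states here are illustrative, not any vendor's actual planner logic.

```python
# Sketch of the occluded-crosswalk rule: an obstructed view is never treated
# as clear, regardless of sensor type. States and names are invented.

def next_action(crosswalk_visible: bool, crosswalk_clear: bool) -> str:
    """Creep forward while the view is blocked; go only on a visible, empty crosswalk."""
    if not crosswalk_visible:
        return "creep"  # pull up only as far as you can safely see
    return "proceed" if crosswalk_clear else "wait"


# A truck blocking the view means "creep forward", never "go".
assert next_action(crosswalk_visible=False, crosswalk_clear=True) == "creep"
assert next_action(crosswalk_visible=True, crosswalk_clear=False) == "wait"
assert next_action(crosswalk_visible=True, crosswalk_clear=True) == "proceed"
```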

    • halcyoncmdr@lemmy.world · 1 year ago

      A person laying on the ground in a crosswalk was likely never considered by the team to include in their training data. Those outlier situations are exactly what real-world data is needed for, and the only way to properly train for most of them is to drive in the real world. The real world isn’t perfect situations and nice lines on fresh asphalt, so while base training in perfect conditions is useful, a system will still miss the exact same situation in a real-world environment with crappy infrastructure.

      I’m not sure what Cruise does with the data collected in real time, but I can see the camera pipeline categorizing a person laying in the crosswalk as something like damage to painted lines or small debris that can be ignored. Other sensors like radar and lidar might have categorized their returns as echoes or false results, again because a person laying in the crosswalk is extremely unlikely. False returns happen all the time with radar and lidar; millions of data points are discarded as outliers or noise, and sometimes that categorization is incorrect.
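      A toy version of the misclassification being speculated about: a naive filter that discards low-height returns as ground clutter. The threshold and field names are invented for the example; real perception stacks are far more sophisticated.

```python
# Illustrative only: a naive height-based ground-clutter filter of the sort
# the comment speculates about. The cutoff value is an invented assumption.

GROUND_CLUTTER_MAX_HEIGHT_M = 0.3  # assumed cutoff for "ignorable" returns


def is_ignorable(return_height_m: float) -> bool:
    """Classify a lidar return cluster as ground clutter by height alone."""
    return return_height_m < GROUND_CLUTTER_MAX_HEIGHT_M


# A standing pedestrian clears the cutoff easily...
assert not is_ignorable(1.7)
# ...but a person lying flat may not, which is exactly the failure mode described.
assert is_ignorable(0.25)
```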

        • Duranie@lemmy.film · 1 year ago

          Not only that, but no matter whether it can identify a person as a person, cars shouldn’t be driving over objects that are child-sized or larger.

          • Chozo@kbin.social · edited · 1 year ago

            These cars won’t even drive over a fire hose laid out across the road (this was literally a test we did regularly when I worked on Waymo’s SDCs). They definitely won’t drive over a person (intentionally).

            Inertia is a factor here; cars can’t stop on a dime. By the time the pedestrian was knocked in front of the SDC, there was likely not enough time to avoid hitting her, just due to physics.
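            A back-of-envelope calculation makes the physics point concrete. The speed, deceleration, and latency values below are assumptions for illustration, not figures from the incident.

```python
# Rough stopping distance: distance covered during detection latency,
# plus braking distance v^2 / (2a). All input values are assumed.

def stopping_distance_m(speed_mps: float, decel_mps2: float, latency_s: float) -> float:
    """Total distance to stop: v * t_latency + v^2 / (2 * a)."""
    return speed_mps * latency_s + speed_mps ** 2 / (2 * decel_mps2)


# ~30 mph (13.4 m/s), hard braking at 8 m/s^2, 0.25 s of system latency:
d = stopping_distance_m(13.4, 8.0, 0.25)
assert 14 < d < 15  # roughly 14.6 m -- a pedestrian knocked down one car length ahead is well inside it
```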

      • JoBo@feddit.uk · 1 year ago

        > A person laying on the ground in a crosswalk was likely never considered by the team to include in their training data

        I didn’t bother reading any further than this. The person was on the crosswalk when both cars started moving. Neither car should have been moving while anyone was still on the crosswalk.

        • MagicShel@programming.dev · 1 year ago

          That was the exact moment I called bullshit as well. You’d damn well better plan for people tripping and falling. It happens all the time, and it’s generally pretty minor unless exacerbated by being run over. This is like saying they didn’t train it on people holding canes or in wheelchairs.

          • JoBo@feddit.uk · 1 year ago

            It’s not about the ability to recognise someone lying in the road (although they obviously do need to be able to recognise something like that).

            She was still walking, upright, on the crosswalk when both cars started moving. No car, driverless or otherwise, should be moving forward just because the lights changed.
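            The rule being argued for here is simple enough to state as code: a green light alone is not permission to move. Function and parameter names below are illustrative.

```python
# Sketch of the gating rule: the light changing is necessary but not
# sufficient; the crosswalk must also be empty. Names are invented.

def may_enter_intersection(light_is_green: bool, pedestrians_in_crosswalk: int) -> bool:
    """Move only when the light is green AND the crosswalk is empty."""
    return light_is_green and pedestrians_in_crosswalk == 0


# Green light, but someone is still crossing: stay put.
assert not may_enter_intersection(True, 1)
assert may_enter_intersection(True, 0)
assert not may_enter_intersection(False, 0)
```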

        • mars296@kbin.social · 1 year ago

          That’s the whole point of their comment. The car did not recognize that anyone was on the crosswalk because it was never trained to look for people laying in the crosswalk.

          • SheeEttin@lemmy.world · 1 year ago

            And that’s fine. But if it’s unable to recognize any object in the road, it’s not fit for purpose. The fact that the object was a person just makes it so much worse.

            • mars296@kbin.social · 1 year ago

              Agreed. I’m not defending Cruise at all. They should have humans in the car if they are testing, or at least a drone-style driver sitting in a room watching a camera feed. I wonder if the car thought there was just a speed bump ahead; some speed bumps are striped similarly to crosswalks. I can see situations where the autopilot can’t determine whether something is a speed bump or a genuine obstruction (either a false positive or a false negative).
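              The false-positive/false-negative trade-off mentioned here is usually resolved by biasing heavily toward braking. A minimal sketch, with an invented threshold:

```python
# Sketch of the asymmetric trade-off: braking for a speed bump is cheap,
# failing to brake for an obstruction is catastrophic. Values are invented.

BRAKE_IF_OBSTACLE_PROB_ABOVE = 0.2  # assumed: err heavily on the side of stopping


def action_for(obstacle_probability: float) -> str:
    """Brake unless the object is almost surely benign (e.g. a striped speed bump)."""
    return "brake" if obstacle_probability > BRAKE_IF_OBSTACLE_PROB_ABOVE else "proceed"


# Even a 30%-confidence obstruction should trigger braking...
assert action_for(0.3) == "brake"
# ...while a clearly identified speed bump (low obstacle probability) does not.
assert action_for(0.05) == "proceed"
```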

      • Nurchu@programming.dev · 1 year ago

        I actually work at one of these AV companies. We definitely have training data on adults and children laying down, and I’d be very, very surprised if Cruise doesn’t, given all the people laying down on the sidewalks in SF. In addition, the lidar/camera data on objects in the road is very clear: you can see the dips and potholes in the road, as well as the raised profile of the painted lines. There’s no way they weren’t tracking the person.

        I could see the prediction stack deciding the pedestrian was clear of the vehicle’s path. Once the initial crash happened, there likely wasn’t enough room to stop in time, even with maximum braking.