cross-posted from: https://fedia.io/m/fuckcars@lemmy.world/t/2201156

In case you were worried about the roads being too safe, you can rest easy knowing that Teslas will be rolling out with unsupervised “Full Self Driving” in a couple of days.

It doesn’t seem to be going great, even in supervised mode. This one couldn’t safely drive down a simple, perfectly straight road in broad daylight :( It veered off the road for no good reason. Glad nobody got badly hurt.

We analyze the onboard camera footage, and try to figure out what went wrong. Turns out, a lot. We also talk through how camera-only autonomous cars work, Tesla’s upcoming autonomous taxi rollout, and how AI hallucinations figure into everything.

  • MushuChupacabra@lemmy.world · 2 months ago

    The obvious reason is that the product is not capable of driving itself.

    That system has no awareness that it’s a motor vehicle.

    It has the capacity to execute instructions.

    It has no capacity to judge if the instructions are reasonable.

    It does not know, understand, or care if it gets you to your destination or if it kills you, or itself.

    No cruelty, just executing instructions.

    • markovs_gun@lemmy.world · 2 months ago

      Look, I think Tesla’s self-driving cars are bad, but this is a crazy viewpoint. If you knew how many systems several orders of magnitude more dangerous than cars are run almost entirely by automated systems operating on extremely simple instructions (nowhere near having “awareness” of, or the ability to judge, anything), you’d be shitting yourself, because by that logic consciousness is required to make good decisions. Chemical plants have been mostly automated since the 90s, running on computers way simpler than anything in a Tesla, and they have way higher potential for disaster if something goes wrong (see the sketch below for the kind of logic involved).

      Self-driving cars have a lot of problems right now, but it’s absolutely insane to say they inherently can’t work because there’s something special about consciousness that they can’t work without. That’s like saying you can’t drive a car without having a soul or some other bullshit like that.
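
      To make the “extremely simple instructions” point concrete, here is a minimal sketch of the kind of threshold-based interlock logic that process automation runs on. Everything in it is hypothetical: the limit, the readings, and the valve behaviour are made up for illustration, not taken from any real plant or controller.

```python
# Hypothetical high-pressure interlock: the entire "decision" is a threshold check.
# The limit and readings below are illustrative, not from any real control system.

HIGH_PRESSURE_LIMIT_KPA = 850.0  # made-up trip point

def interlock_step(pressure_kpa: float, valve_open: bool) -> bool:
    """Return the new feed-valve state: trip (close) if pressure exceeds the limit."""
    if pressure_kpa >= HIGH_PRESSURE_LIMIT_KPA:
        return False           # trip: close the valve
    return valve_open          # otherwise keep the current state (so a trip latches)

# A controller scan cycle just re-evaluates this rule forever.
valve_open = True
for reading_kpa in (700.0, 820.0, 900.0, 760.0):
    valve_open = interlock_step(reading_kpa, valve_open)
    print(f"pressure={reading_kpa:6.1f} kPa -> valve_open={valve_open}")
```

      There is no awareness anywhere in that loop, and plants run safely on exactly this kind of logic, which is the point being made here.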

  • HakFoo@lemmy.sdf.org · 2 months ago

    Read the headline as “flips off car” and was pleased to see they hit a new milestone in mimicking human drivers.

  • boydster@sh.itjust.works · 2 months ago

    There’s a lot wrong with Tesla’s implementation here, so I’m going to zoom in on one thing in particular: it is outright negligent to decide against using LIDAR on something like a car that you want to be autonomous. Maybe if this car had sensors to map out 3D space, that would help it move more successfully through 3D space?
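
    To make the “sensors to map out 3D space” point concrete: a lidar return is just a distance plus two beam angles, which converts straight into a 3D point with no image interpretation involved. A minimal sketch of that conversion (the coordinate convention and sample returns are assumptions for illustration, not anyone’s production pipeline):

```python
import math

def lidar_return_to_xyz(range_m: float, azimuth_rad: float, elevation_rad: float):
    """Convert one lidar return (distance + beam angles) into a 3D point.

    Frame convention assumed here: x forward, y left, z up.
    """
    horizontal = range_m * math.cos(elevation_rad)   # projection onto the ground plane
    x = horizontal * math.cos(azimuth_rad)
    y = horizontal * math.sin(azimuth_rad)
    z = range_m * math.sin(elevation_rad)
    return (x, y, z)

# A full sweep of such returns is a point cloud of directly measured geometry.
sample_returns = [(12.4, 0.00, 0.02), (12.6, 0.01, 0.02), (3.1, -0.50, -0.05)]
point_cloud = [lidar_return_to_xyz(r, az, el) for r, az, el in sample_returns]
print(point_cloud)
```

    The contrast with a camera-only approach is that each of these points is measured directly rather than inferred from pixels.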

    • Ulrich@feddit.org · 2 months ago

      You and I are (mostly) able to safely navigate a vehicle with 3D stereoscopic vision. It’s not a sensor issue; it’s a computation issue.
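
      For reference, the stereoscopic part itself is plain triangulation: depth = focal length × baseline / disparity. The hard part being pointed at here is computing that disparity reliably from messy real-world images, which this toy sketch (with a made-up 1000 px focal length and 30 cm baseline) skips entirely:

```python
def depth_from_disparity(focal_length_px: float, baseline_m: float, disparity_px: float) -> float:
    """Classic stereo triangulation: depth = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("no match or object at infinity: no depth estimate")
    return focal_length_px * baseline_m / disparity_px

# Made-up rig: 1000 px focal length, cameras 0.3 m apart.
for disparity in (50.0, 10.0, 2.0):   # smaller disparity = farther away
    depth = depth_from_disparity(1000.0, 0.3, disparity)
    print(f"disparity {disparity:5.1f} px -> depth {depth:6.1f} m")
```

      Note how, at long range, a disparity error of a pixel or two swings the depth estimate by tens of metres, which is one reason the quality of the matching computation matters so much.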

      • brygphilomena@lemmy.dbzer0.com · 2 months ago

        If I eventually end up in a fully self-driving vehicle, I want it to be better than what you and I can do with our eyes.

        Is it possible to drive with just stereoscopic vision? Yeah. But why is Tesla against BEING BETTER than humans?

      • TwanHE@lemmy.world · 2 months ago

        In theory, maybe, but our brains are basically a supercomputer on steroids when it comes to interpreting and improving the “video feed” our eyes give us.

        Could it be done with just cameras? Probably, some time in the future. But why the fuck wouldn’t you use a depth sensor now, and keep it even in the future as a redundancy?
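
        The redundancy argument in sketch form: a hypothetical cross-check that compares the camera’s depth estimate against a direct range measurement and flags the frame when they disagree badly. The threshold and the fallback behaviour are assumptions for illustration, not any shipping system’s logic:

```python
def fused_distance(camera_estimate_m: float, lidar_range_m: float,
                   max_disagreement_m: float = 2.0):
    """Cross-check vision against a direct range sensor; flag big disagreements."""
    disagreement = abs(camera_estimate_m - lidar_range_m)
    if disagreement > max_disagreement_m:
        # Vision is probably wrong (glare, a mirror, a washed-out lane line, ...):
        # fall back to the directly measured range and report a degraded state.
        return lidar_range_m, "DEGRADED: vision and range sensor disagree"
    return min(camera_estimate_m, lidar_range_m), "OK"

print(fused_distance(camera_estimate_m=45.0, lidar_range_m=12.0))
print(fused_distance(camera_estimate_m=11.5, lidar_range_m=12.0))
```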

      • this_1_is_mine@lemmy.world · 2 months ago

        I can also identify a mirror; a Tesla smashed into one head-on. If you can’t effectively understand the image, then it’s not enough information.