The incident in northern California marked the latest mishap blamed on the electric vehicle company’s Autopilot tech

  • Echo Dot@feddit.uk · 4 months ago

    Now, AI may or may not be overhyped, but Tesla’s self-driving nonsense isn’t AI regardless. It’s just pattern recognition; it is not the neural net everyone assumes it is.

    It really shouldn’t be legal; this tech will never work because it doesn’t include lidar, so it lacks depth perception. Of course humans don’t have lidar either, but we have depth perception built in thanks to billions of years of evolution. Computers, though, don’t do well at computing 3D geometry from stereoscopic vision, and really could do with actual depth information being provided to them.
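
    For concreteness, here is a rough sketch of how depth is usually recovered from a stereo camera pair, assuming rectified images; the file names and camera parameters are made up for the example, and this is generic OpenCV block matching, not anything Tesla actually ships:

    ```python
    import cv2
    import numpy as np

    # Hypothetical camera parameters -- placeholders, not a real calibration.
    FOCAL_PX = 800.0    # focal length in pixels
    BASELINE_M = 0.30   # distance between the two cameras, in metres

    # Placeholder file names for a rectified stereo pair.
    left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
    right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

    # Block matching finds, for each pixel, how far it shifted between the
    # two views (the disparity). Larger disparity means a closer object.
    stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
    disparity = stereo.compute(left, right).astype(np.float32) / 16.0  # fixed-point -> pixels

    # Depth follows from similar triangles: depth = focal * baseline / disparity.
    valid = disparity > 0
    depth_m = np.zeros_like(disparity)
    depth_m[valid] = FOCAL_PX * BASELINE_M / disparity[valid]
    ```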

    If you lack depth perception and higher reasoning skills, for a moment you might actually think that a train driving past you is a road. 3D perception would have told the software that the train’s surface was vertical rather than horizontal, and thus a barrier rather than a driving surface.
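
    To illustrate that last point: once you have actual 3D points for a patch of the scene (from lidar or from a depth map), telling a drivable surface from a vertical barrier is roughly a matter of fitting a plane and checking which way its normal points. The function and threshold below are invented purely for illustration:

    ```python
    import numpy as np

    def surface_is_drivable(points_xyz: np.ndarray, max_tilt_deg: float = 15.0) -> bool:
        """Fit a plane to a patch of 3D points and treat it as drivable
        only if its normal points roughly straight up."""
        centered = points_xyz - points_xyz.mean(axis=0)
        # The plane normal is the right singular vector with the smallest singular value.
        _, _, vt = np.linalg.svd(centered, full_matrices=False)
        normal = vt[-1]
        up = np.array([0.0, 0.0, 1.0])  # assumes z is "up" in this toy frame
        tilt = np.degrees(np.arccos(abs(normal @ up)))
        return tilt < max_tilt_deg

    # A flat road patch passes; the vertical side of a passing train does not.
    road = np.array([[x, y, 0.01 * x] for x in range(5) for y in range(5)], dtype=float)
    wall = np.array([[x, 0.0, z] for x in range(5) for z in range(5)], dtype=float)
    print(surface_is_drivable(road))  # True
    print(surface_is_drivable(wall))  # False
    ```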

    • FishFace@lemmy.world · 4 months ago

      > It’s just pattern recognition; it is not the neural net everyone assumes it is.

      Tesla’s current iteration of self-driving is based on neural networks. Certainly the computer vision is; we have no other way of doing computer vision that works at all well, and according to this article from last year, it’s true for the decision-making too.

      Of course, the whole task of self-driving is “pattern recognition”; neural networks are just one way of achieving that.
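
      As a toy illustration of why “pattern recognition” and “neural network” aren’t opposites, here is a minimal convolutional classifier in PyTorch; the architecture and class labels are invented for the example and bear no relation to Tesla’s actual networks:

      ```python
      import torch
      import torch.nn as nn

      # A toy convolutional network: the network *is* the pattern recogniser.
      class TinyClassifier(nn.Module):
          def __init__(self, num_classes: int = 3):
              super().__init__()
              self.features = nn.Sequential(
                  nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
                  nn.MaxPool2d(2),
                  nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
                  nn.AdaptiveAvgPool2d(1),
              )
              self.head = nn.Linear(32, num_classes)

          def forward(self, x: torch.Tensor) -> torch.Tensor:
              return self.head(self.features(x).flatten(1))

      # One forward pass over a fake camera frame: logits for e.g.
      # {road, vehicle, obstacle} -- purely illustrative class labels.
      frame = torch.randn(1, 3, 64, 64)
      print(TinyClassifier()(frame).shape)  # torch.Size([1, 3])
      ```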