The U.S. government’s road safety agency is again investigating Tesla’s “Full Self-Driving” system, this time after getting reports of crashes in low-visibility conditions, including one that killed a pedestrian.

The National Highway Traffic Safety Administration says in documents that it opened the probe on Thursday after the company reported four crashes in which Teslas entered areas of low visibility, including sun glare, fog and airborne dust.

In addition to the pedestrian’s death, another crash involved an injury, the agency said.

Investigators will look into the ability of “Full Self-Driving” to “detect and respond appropriately to reduced roadway visibility conditions, and if so, the contributing circumstances for these crashes.”

  • Konala Koala@lemmy.world · 13 hours ago

    Every time I hear about a pedestrian being killed by something self-driving, it irks me, and I start wondering why we are pushing this technology at all.

    • Cethin@lemmy.zip · 3 hours ago

      The bad news is that people hitting and killing pedestrians is so common you don’t hear about it. Fuck Musk and all that, but some number of people are always going to get killed. Even an FSD system as close to perfect as possible would still occasionally kill someone at a large enough scale, because there are too many variables to account for. If the numbers are lower than for a human driving, it’s a positive.

      We should be trying to move away from cars though ideally. Fuck electric cars, FSD cars, and all other cars. A bus, train, bike, or whatever else would be safer and better for the environment.

      • beanlink@lemmy.world · 2 hours ago

        Let’s install adaptive headlights to stop blinding people, stop allowing manufacturers to put chrome accents on the rear of vehicles (again, to stop blinding people), or even just make a smaller truck that hauls actual building materials instead of lifted egos.

        NHTSA:

    • PeroBasta@lemmy.world · 5 hours ago

      Because it is generally proven to save lives. You’ll never hear “thanks to the auto-brake system, no one got injured and everything was boring as usual,” but it has happened a lot (including to me personally).

      I don’t like Musk, but in general it’s a good thing to push self-driving cars, IMO. I drive 2 hours per day, and the amount of time I spend watching people do idiotic stuff at the wheel is crazy.

      • DillyDaily@lemmy.world · 3 hours ago

        This is the thing. Musk, and everything his company does in terms of labour and marketing, their whole ethos, is unethical as fuck, and I can’t stand that as a society we are celebrating Tesla.

        But self-driving cars are not inherently bad or dangerous to pursue as a technological advancement.

        Self-driving cars will kill people; they will hit pedestrians and crash into things.

        So do cars driven by humans.

        Human driven cars kill a lot of people.

        Self-driving cars need to be safer than human-driven cars to even consider letting them on the road, but we can’t truly expect a 0% accident rate from self-driving cars in the early days of the technology when we don’t expect that of human-driven cars.

  • rsuri@lemmy.world · 18 hours ago

    Musk has said that humans drive with only eyesight, so cars should be able to drive with just cameras.

    This of course assumes 1) that cameras are just as good as eyes (they’re not) and 2) that the processing of visual data that the human brain does can be replicated by a machine, which seems highly dubious given that we only partially understand how humans process visual data to make decisions.

    Finally, it assumes that the current rate of human-caused crashes is acceptable. Which it isn’t. We tolerate crashes because we can’t improve people without unrealistic expense. In an automated system, if a bit of additional hardware can significantly reduce crashes it’s irrational not to do it.

    • blady_blah@lemmy.world · 15 hours ago

      This is directly a result of Elon’s edict that Tesla cars don’t use lidar. If you aren’t aware, Elon set that as a requirement at the beginning of Tesla’s self-driving project because he didn’t want to spend the money on lidar for all Tesla cars.

      His “first principles” logic is that humans don’t use lidar, therefore self-driving should be achievable without (expensive) enhanced vision tools. While this statement has some modicum of truth, it obviously trades off safety in situations where vision is compromised. Think fog, sunlight shining into your cameras / eyes, or a person running across the street at night wearing all black. There are obvious scenarios where lidar is a massive safety advantage, but Elon decided, for $$, not to have it. This sounds like a direct and obvious outcome of that edict.

      • WoodScientist@lemmy.world · 14 hours ago

        His “first principles” logic is that humans don’t use lidar therefore self driving should be able to be accomplished without (expensive) enhanced vision tools.

        This kind of idiocy is why people tried to build airplanes with flapping wings. Way too many people thought that the best way to create a plane was to just copy what nature did with birds. Nature showed it was possible, so just copy nature.

        • Schadrach@lemmy.sdf.org · 3 hours ago

          To be fair, we achieved flight by copying nature. Once we realized the important part was the shape of a wing more than the flapping.

    • SkyeStarfall@lemmy.blahaj.zone · 18 hours ago

      Also, on a final note…

      Why the fuck would you limit yourself to only human senses when you have the capability to add more of any sense you want??

      If you have the option to add something that humans don’t have, why wouldn’t you? As an example, humans don’t have gps either, but it’s very useful to have in a car

      • xthexder@l.sw0.com · 17 hours ago

        Unfortunately the answer to that is: Elon’s cheap and radar is expensive. Not so expensive that you can’t get it in a base model Civic, though, which just makes it that much more absurd.

      • sue_me_please@awful.systems · 17 hours ago

        Because a global pandemic broke your sensor supply chain and you still want to sell cars with FSD anyway, so cameras-only it is!

      • rsuri@lemmy.world · 18 hours ago

        Well building battlemechs does seem like the obvious next step on Elon’s progression

        • ℍ𝕂-𝟞𝟝@sopuli.xyz · 16 hours ago

          You mean promising to build battlemechs, and fucking around for 5 years while grifting his stock valuation sky-high, then coming forward with a cheap robot that can’t even walk?

    • TheKMAP@lemmynsfw.com · 14 hours ago

      If the camera system + software results in being 1% safer than a human, and a given human can’t afford the lidar version, society is still better off with the human using the camera-based FSD than driving manually. Elon being a piece of shit doesn’t detract from this fact.

      But, yes, a lot of “ifs” in there, and obviously he did this to cut costs or supply chain or blahblah

      Lidar or other tech will be more relevant once we’ve raised the floor (everyone getting the additional safety over manual driving) and other FSDs become more mainstream (competition)

    • Revan343@lemmy.ca · 18 hours ago

      Regarding point number 2, I have no doubt we’ll be able to develop systems that process visual/video data as well as or better than people. I just know we aren’t there yet, and Tesla certainly isn’t.

      I like to come at the argument from the other direction though; humans drive with eyesight because that’s all we have. If I could be equipped with sonar or radar or lidar, of fucking course I’d use it, wouldn’t you?

  • finitebanjo@lemmy.world · 18 hours ago

    Really fucking stupid that we as a society intentionally choose to fuck around and find out rather than find out before we fuck around.

  • demizerone@lemmy.world · 16 hours ago

    I purchased FSD when it was $8k. What a crock of shit. When I sold the car, it was the only thing that gave the car any value past 110k miles, and even then it was worth $1,500 at most.

    • DerArzt@lemmy.world · 18 hours ago

      ~~Tesla~~ Musk: Why would we need lidar? Just use visual cameras

      FTFY

  • FiskFisk33@startrek.website · 1 day ago

    Tesla, which has repeatedly said the system cannot drive itself and human drivers must be ready to intervene at all times.

    how is it legal to label this “full self driving” ?

    • kiku@feddit.org · 16 hours ago

      If customers can’t assume that boneless wings are actually free of bones, then they shouldn’t assume that Full Self Driving can actually drive the car itself.

      The courts made it clear that words don’t matter, and that a company can’t be held liable for you assuming that words have meaning.

    • Chaotic Entropy@feddit.uk · 24 hours ago

      “I freely admit that the refreshing sparkling water I sell is poisonous and should not be consumed.”

      • don@lemm.ee · 22 hours ago

        “But to be clear, although I most certainly know for a fact that the refreshing sparkling water I sell is exceedingly poisonous and should in absolutely no way be consumed by any living (and most dead*) beings, I will nevertheless very heartily encourage you to buy it. What you do with it after is entirely up to you.

        *Exceptions may apply. You might be one.

    • krashmo@lemmy.world · 24 hours ago

      That’s pretty clearly just a disclaimer meant to shield them from legal repercussions. They know people aren’t going to do that.

      • GoodEye8@lemm.ee · 23 hours ago

        Last time I checked, that disclaimer was there because officially Teslas are SAE level 2, which lets them evade the regulations that apply to higher SAE levels, while in practice the Tesla FSD beta functions like SAE level 4.

          • GoodEye8@lemm.ee · 2 hours ago

            That’s what I read in an article, but I don’t think whether they’re level 4 or not really matters. The point is they officially claim to be level 2 but their cars clearly function beyond level 2.

      • FiskFisk33@startrek.website · 20 hours ago

        Legal or not, it’s absolutely bonkers. Safety should be the baseline legal assumption for marketing terms like this, not an optional extra.

  • breadsmasher@lemmy.world · 1 day ago

    Eyes can’t see in low visibility.

    musk “we drive with our eyes, cameras are eyes. we dont need LiDAR”

    FSD kills someone because of low visibility just like with eyes

    musk reaction -

    • flames5123@lemmy.world · 16 hours ago

      The cars used to have radar. But Tesla got rid of it, and even disabled it on older models via updates, because they “only need cameras.”

      Cameras plus radar would have been good enough for almost all conditions…

    • conciselyverbose@sh.itjust.works · 1 day ago

      It’s worse than that, though. Our eyes are significantly better than cameras (with some exceptions at the high end) at adapting to varied lighting conditions, especially rapid changes.

      • jerkface@lemmy.ca · 23 hours ago

        Hard to credit without a source; modern cameras have way more dynamic range than the human eye.

        • magiccupcake@lemmy.world · 23 hours ago

          Not in one exposure. Human eyes are much better with dealing with extremely high contrasts.

          Cameras can be much more sensitive, but at the cost of overexposing brighter regions in an image.

          • conciselyverbose@sh.itjust.works · 22 hours ago

            They’re also pretty noisy in low light and generally take long exposures (a problem with a camera at high speeds) to get sufficient input to see anything in the dark. Especially if you aren’t spending thousands of dollars with massive sensors per camera.

            • jerkface@lemmy.ca · 5 hours ago

              I dunno what cameras you are using, but a standard full-frame sensor with an f/4 lens sees way better in low light than the human eye. If I take a raw image off my camera, there is far more dynamic range than I can see or a monitor can even represent: you can double the brightness at least four times (i.e. 16x brighter) and parts of the image that looked pure black to the eye become perfectly usable. There is so much more dynamic range than the human eye has.
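
              The arithmetic here maps onto photographic stops: each doubling of recorded brightness is one stop of exposure. A minimal check of the “four doublings = 16x” claim (the helper name is ours, not from any camera library):

```python
# Each stop of exposure doubles the recorded brightness, so pushing
# shadows up by n stops multiplies brightness by 2**n.
def stops_to_brightness_ratio(stops: int) -> int:
    return 2 ** stops

# The comment's "double the brightness at least four times (ie 16x brighter)":
print(stops_to_brightness_ratio(4))  # 16
```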

              • conciselyverbose@sh.itjust.works · 5 hours ago

                Do you know what the depth of field at f/4 looks like? It’s not anywhere in the neighborhood of suitable for a car, and it still takes a meaningful exposure length in low light conditions to get a picture at all, which is not suitable for driving at 30mph, let alone actually driving fast.

                That full frame sensor is also on a camera that’s several thousand dollars.

    • Lucidlethargy@sh.itjust.works · 24 hours ago

      He really is a fucking idiot. But so few people can actually call him out… So he just never gets put in his place.

      Imagine your life with unlimited redos. That’s how he lives.

      • III@lemmy.world · 1 day ago

        Correction: older Teslas had radar, and Musk demanded it be removed because it cut into his profits. Not a huge difference, but it does show how much of a shitbag he is.

      • normanwall@lemmy.world · 1 day ago

        Honestly though, I’m a fucking idiot and even I can tell that Lidar might be needed for proper, safe FSD

    • helenslunch@feddit.nl · 1 day ago

      The whole “we drive with our eyes” thing is such bullshit. Humans are terrible drivers. Autonomous driving should be better than humans.

      That goes for OpenPilot too. They actually openly advertise that their software makes the same mistakes as humans, as if it’s some sort of advancement. Like if I could plug Lidar into my brain, I totally would.

    • RandomStickman@fedia.io · 1 day ago

      You’d think “we drive with our eyes, cameras are eyes” is an argument against only using cameras, but what do I know.

    • aramis87@fedia.io · 1 day ago

      What pisses me off about this is that, in conditions of low visibility, the pedestrian can’t even hear the damned thing coming.

      • SmoothLiquidation@lemmy.world · 1 day ago

        I hear electric cars all the time; they are not much quieter than an ICE car. We don’t need to strap lawn mowers to our cars in the name of safety.

        • 1984@lemmy.today · 23 hours ago

          I think they are a lot quieter. I’ve turned around and been surprised to see a car 5 meters away from me. That never happens with fuel cars.

          I think if you are young, maybe there isn’t a big difference since you have perfect hearing. But middle aged people lose quite a bit of that unfortunately.

          • idunnololz@lemmy.world · 23 hours ago

            I’m relatively young and it can still be difficult to hear them especially the ones without a fake engine sound. Add some city noise and they can be completely inaudible.

            • spacesatan@lazysoci.al · 19 hours ago

              ‘city noise’ you mean ICE car noise. We should be trying to reduce noise pollution not compete with it.

              • idunnololz@lemmy.world · 19 hours ago

                It’s not safe for cars to be totally silent when moving, IMO, since I’d imagine pedestrians are more likely to get run over.

  • JIMMERZ@lemm.ee · 21 hours ago

    The worst way to die would be getting hit by a shitbox Tesla. RIP.

  • fluxion@lemmy.world · 1 day ago

    National Highway Traffic Safety Administration is now definitely on Musk’s list of departments to cut if Trump makes him a high-ranking swamp monster

    • lurker8008@lemmy.world · 1 day ago

      Why do you think Musk is dumping so much cash into boosting Trump? The plan all along has been to get kickbacks: stopping the investigations, lawsuits, and regulations against him. Plus subsidies.

      Rich assholes don’t spend money without expectation of ROI

      He knows Democrats will crack down on shady practices so Trump is his best bet.

      • vxx@lemmy.world · 1 day ago

        He’s not just hoping for a kickback; he has been offered a position as secretary of cost-cutting.

        He will be able to directly shut down everything he doesn’t like under the pretense of saving money.

        Trump is literally campaigning on the fact that government positions are up for sale under his admin.

        “I’m going to have Elon Musk — he is dying to do this… We’ll have a new position: secretary of cost-cutting, OK? Elon wants to do that,” the former president said.

    • WalnutLum@lemmy.ml · 16 hours ago

      Alongside the EPA, for constantly getting in the way while he tried to slip his SpaceX flight licenses through the FAA with a wink and a nudge instead of properly following regulations, and the FAA itself, for trying to keep a semblance of legality through the whole process.

  • Justin@lemmy.jlh.name · 1 day ago

    Humans know to drive more carefully in low visibility, and/or to take actions to improve visibility. Muskboxes don’t.

    • Hannes@feddit.org · 1 day ago

      They also decided to only use cameras and visual cues for driving, instead of also using radar, thermal cameras, or something like that.

      It’s designed to be launched asap, not to be safe

      • mindaika@lemmy.dbzer0.com · 1 day ago

        I mean, that’s just good economics. I’m willing to bet someone at Tesla has done the calcs on how many people they can kill before it becomes unprofitable

    • sugar_in_your_tea@sh.itjust.works · 1 day ago

      I’m not so sure. Whenever there’s crappy weather conditions, I see a ton of accidents because so many people just assume they can drive at the posted speed limit safely. In fact, I tend to avoid the highway altogether for the first week or two of snow in my area because so many people get into accidents (the rest of the winter is generally fine).

      So this is likely closer to what a human would do than not.

      • nyan@lemmy.cafe · 22 hours ago

        The question is, is Tesla FSD’s record better, worse, or about the same on average as a human driver under the same conditions? If it’s worse than the average human, it needs to be taken off the road. There are some accident statistics available, but you have to practically use a decoder ring to make sure you’re comparing like to like even when whoever’s providing the numbers has no incentive to fudge them. And I trust Tesla about as far as I could throw a Model 3.

        On the other hand, the average human driver sucks too.
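
        The like-for-like comparison being asked for can be sketched as crashes per million miles, computed separately per visibility condition so the fleets are only compared under the same circumstances. Every figure below is invented purely for illustration:

```python
# Per-condition crash rates; comparing only within the same condition
# avoids the decoder-ring problem of mismatched baselines.
def rate_per_million_miles(crashes: int, miles: float) -> float:
    return crashes / miles * 1_000_000

# Hypothetical data: condition -> (crashes, miles driven)
fsd = {"clear": (10, 50_000_000), "fog": (4, 1_000_000)}
human = {"clear": (900, 3_000_000_000), "fog": (50, 60_000_000)}

for condition in fsd:
    f = rate_per_million_miles(*fsd[condition])
    h = rate_per_million_miles(*human[condition])
    verdict = "worse" if f > h else "better"
    print(f"{condition}: FSD {f:.2f} vs human {h:.2f} per million miles -> FSD {verdict}")
```

With these made-up numbers the system looks better than humans in clear weather and worse in fog, which is exactly the kind of split an aggregate statistic would hide.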

        • sugar_in_your_tea@sh.itjust.works · 22 hours ago

          Yeah, I honestly don’t know. My point is merely that we should have the same standards for FSD vs human driving, at least initially, because they have a lot more potential for improvement than human drivers. If we set the bar too high, we’ll just delay safer transportation.

        • Jrockwar@feddit.uk · 20 hours ago

          You can’t measure this while it has drivers behind the wheel. Even if it made three “pedestrian-killing” mistakes every 10 miles, chances are the driver would catch every mistake and never let it crash.

          But on the other hand, if we measured every time the driver takes over, the number would be artificially high, because we can’t predict the future and drivers are likely to be overcautious, taking over even in circumstances that would have turned out OK.

          The only way to do this IMO is by

          • measuring every driver intervention
          • only letting it be driverless, and marketable as self-driving, when it achieves a very low number of interventions (< 1 per 10,000 miles?)
          • in the meantime, marketing it as “driver assist,” having responsibility fall on the driver, and treating it like the “somewhat advanced” cruise control that it is.
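
          The gate proposed above can be sketched directly; the 1-per-10,000-miles threshold and the function names are the commenter's hypothetical, not any real certification criterion:

```python
# Disengagement-style metric: driver interventions normalized per 10,000 miles.
def interventions_per_10k_miles(interventions: int, miles: float) -> float:
    return interventions / miles * 10_000

# Marketable as "self-driving" only under the threshold; otherwise it stays
# labeled driver assist and responsibility remains with the driver.
def marketable_as_self_driving(interventions: int, miles: float,
                               threshold: float = 1.0) -> bool:
    return interventions_per_10k_miles(interventions, miles) < threshold

print(marketable_as_self_driving(3, 10_000))   # False (3.0 per 10k miles)
print(marketable_as_self_driving(2, 50_000))   # True (0.4 per 10k miles)
```
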
      • III@lemmy.world · 1 day ago

        low visibility, including sun glare, fog and airborne dust

        I also see a ton of accidents when the sun is in the sky or if it is dusty out. \s

        • sugar_in_your_tea@sh.itjust.works · 1 day ago

          Yup, especially at daylight savings time when the sun changes position in the sky abruptly.

          Cameras are probably worse here, but they may be able to make up for it with parallel processing the poor data they get.

    • _bcron@midwest.social · 22 hours ago

      The median driver sure, but the bottom couple percent never miss their exit and tend to do boneheaded shit like swerving into the next lane when there’s a stopped car at a crosswalk. >40,000 US fatalities in 2023. There are probably half a dozen fatalities in the US on any given day by the time the clock strikes 12:01AM on the west coast.

      Edit: some more food for thought as I’ve been pondering:

      FSD may or may not be better than the median driver (maybe this investigation will add to knowledge), but it’s likely better than the worst drivers… But the worst drivers are the most likely to vastly overestimate their competence, which might lead to them actively avoiding the use of any such aids, despite those drivers being the ones who would see the greatest benefit from using them. We might be forever stuck with boneheaded drivers doing boneheaded shit

    • helenslunch@feddit.nl · 1 day ago

      Humans know to drive more carefully in low visibility…Muskboxes don’t.

      They do, actually. It even displays a message on the screen about low visibility.

  • elgordino@fedia.io · 1 day ago

    If anyone was somehow still thinking RoboTaxi is ever going to be a thing: no, it’s not, because of reasons like this.

    • testfactor@lemmy.world · 1 day ago

      It doesn’t have to never hit pedestrians. It just has to hit fewer pedestrians than the average human driver does.

      • ContrarianTrail@lemm.ee · 1 day ago

        Exactly. The current rate is 80 deaths per day in the US alone. Even if we had self-driving cars proven to be 10 times safer than human drivers, we’d still see 8 news articles a day about people dying because of them. Taking this as ‘proof’ that they’re not safe is setting an impossible standard and effectively advocating for 30,000 yearly deaths, as if it’s somehow better to be killed by a human than by a robot.
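
        The arithmetic in this comment checks out; a quick verification of the figures as stated:

```python
# ~80 US road deaths per day, as cited in the comment.
deaths_per_day_human = 80

# A system proven 10x safer would still account for 8 deaths
# (and potentially 8 news stories) every day.
deaths_per_day_10x_safer = deaths_per_day_human / 10

# Annualized, the human-driver figure is roughly the "30,000 yearly deaths".
deaths_per_year_human = deaths_per_day_human * 365

print(deaths_per_day_10x_safer)  # 8.0
print(deaths_per_year_human)     # 29200
```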

        • Dr. Moose@lemmy.world · 16 hours ago

          But they aren’t, and likely never will be.

          And how are we to correct for a lack of safety then? With human drivers you obviously discourage dangerous driving through punishment. Who do you punish in a self-driving car?

          • Billiam@lemmy.world · 1 day ago

            If you get killed by a robot, you can at least die knowing your death was the logical option and not a result of drunk driving, road rage, poor vehicle maintenance, panic, or any other of the dozens of ways humans are bad at decision-making.

        • ano_ba_to@sopuli.xyz · 22 hours ago

          “10 times safer than human drivers”, (except during specific visually difficult conditions which we knowingly can prevent but won’t because it’s 10 times safer than human drivers). In software, if we have replicable conditions that cause the program to fail, we fix those, even though the bug probably won’t kill anyone.

        • III@lemmy.world · 1 day ago

          The problem with this way of thinking is that there are solutions to eliminate accidents even without eliminating self-driving cars. By dismissing the concern you are saying nothing more than it isn’t worth exploring the kinds of improvements that will save lives.

      • elgordino@fedia.io · 1 day ago

        It needs to be way way better than ‘better than average’ if it’s ever going to be accepted by regulators and the public. Without better sensors I don’t believe it will ever make it. Waymo had the right idea here if you ask me.

        • sugar_in_your_tea@sh.itjust.works · 1 day ago

          But why is that the standard? Shouldn’t “equivalent to average” be the standard? Because if self-driving cars can be at least as safe as a human, they can be improved to be much safer, whereas humans won’t improve.

          • medgremlin@midwest.social · 1 day ago

            I’d accept that if the makers of the self-driving cars can be tried for vehicular manslaughter the same way a human would be. Humans carry civil and criminal liability, and at the moment, the companies that produce these things only have nominal civil liability. If Musk can go to prison for his self-driving cars killing people the same way a regular driver would, I’d be willing to lower the standard.

            • sugar_in_your_tea@sh.itjust.works · 1 day ago

              Sure, but humans are only criminally liable if they fail the “reasonable person” standard (i.e. a “reasonable person” would have swerved out of the way, but you were distracted, therefore criminal negligence). So the court would need to prove that the makers of the self-driving system failed the “reasonable person” standard (i.e. a “reasonable person” would have done more testing in more scenarios before selling this product).

              So yeah, I agree that we should make certain positions within companies criminally liable for criminal actions, including negligence.

              • medgremlin@midwest.social
                link
                fedilink
                English
                arrow-up
                4
                ·
                1 day ago

                I think the threshold for proving the “reasonable person” standard for companies should be extremely low. They are a complex organization that is supposed to have internal checks and reviews, so it should be very difficult for them to squirm out of liability. The C-suite should be first on the list for criminal liability so that they have a vested interest in ensuring that their products are actually safe.

                • sugar_in_your_tea@sh.itjust.works
                  link
                  fedilink
                  English
                  arrow-up
                  3
                  ·
                  1 day ago

                  Sure, the “reasonable person” would be a competitor who generally follows standard operating procedures. If they’re lagging behind the industry in safety or something, that’s evidence of criminal negligence.

                  And yes, the C-suite should absolutely be the first to look at, but the problem could very well come from someone in the middle trying to make their department look better than it is and lying to the C-suites. C-suites have a fiduciary responsibility to the shareholders, whereas their reports don’t, so they can have very different motivations.

      • snooggums@lemmy.world
        link
        fedilink
        English
        arrow-up
        5
        ·
        edit-2
        1 day ago

        That is the minimal outcome for an automated safety feature to be an improvement over human drivers.

        But if everyone else is using something you refused to use that would likely have avoided someone’s death, while misnaming your feature to mislead customers, then you are in legal trouble.

        When it comes to automation you need to be far better than humans because there will be a higher level of scrutiny. Kind of like how planes are massively safer than driving on average, but any incident where someone could have died gets a massive amount of attention.

      • dmention7@lemm.ee
        link
        fedilink
        English
        arrow-up
        3
        arrow-down
        1
        ·
        edit-2
        24 hours ago

        It’s a bit reductive to put it in terms of a binary choice between an average human driver and a full AI driver. I’d argue it has to hit fewer pedestrians than a human driver with the full suite of driver assists currently available to be viable.

        Self-driving is purely a convenience factor for personal vehicles and purely an economic factor for taxis and other commercial use. If a human driver assisted by all of the sensing and AI tools available is the safest option, that should be the de facto standard.

      • helenslunch@feddit.nl
        link
        fedilink
        English
        arrow-up
        5
        arrow-down
        3
        ·
        1 day ago

        It does, actually. That’s why robotaxis and self-driving cars in general will never be a thing.

        Society accepts that humans make mistakes, regardless of how careless they’re being at the time. Autonomous vehicles are not allowed the same latitude. A single pedestrian gets killed and we have to get them all off the road.

  • Buffalox@lemmy.world
    link
    fedilink
    English
    arrow-up
    16
    arrow-down
    3
    ·
    edit-2
    1 day ago

    I thought it was illegal to call it full self driving? So I thought Tesla had something new.
    Apparently it’s the moronic ASSISTED full self driving the article is about. So nothing new.
    Tesla does not have a legal full self driving system, so why do articles keep pushing the false narrative, even after it’s been deemed illegal?

      • Buffalox@lemmy.world
        link
        fedilink
        English
        arrow-up
        2
        ·
        24 hours ago

        Absolutely, but that’s what Tesla decided, that or “supervised”, because it’s illegal to call it actual full self driving.
        But an oxymoron is also fitting for Musk. You can even skip the oxy part. 😋

      • notfromhere@lemmy.ml
        link
        fedilink
        English
        arrow-up
        2
        ·
        1 day ago

        100% agree. Who sells assisted full self driving anyway? Tesla’s is supervised which means it drives and the person behind the wheel is liable for its fuckups.

    • notfromhere@lemmy.ml
      link
      fedilink
      English
      arrow-up
      3
      ·
      edit-2
      1 day ago

      Did they change it again? It was FSD Beta, then Supervised, now you’re telling me it’s ASSISTED? Since that’s not in TFA…

      • Buffalox@lemmy.world
        link
        fedilink
        English
        arrow-up
        2
        arrow-down
        1
        ·
        edit-2
        24 hours ago

        IDK I heard assisted, maybe they decided on supervised? The central point is that it’s illegal in some states to call it full self driving, because it’s false advertising.

    • FigMcLargeHuge@sh.itjust.works
      link
      fedilink
      English
      arrow-up
      5
      arrow-down
      2
      ·
      1 day ago

      so why do articles keep pushing the false narrative, even after it’s deemed illegal?

      The same reason that simple quadcopters have been deemed by the press to be called “drones”. You can’t manufacture panic and outrage with an innocuous name.

      • Buffalox@lemmy.world
        link
        fedilink
        English
        arrow-up
        2
        ·
        24 hours ago

        Calling it a drone has nothing to do with how many propellers it has; some drones are jet-driven, some are boats, and some are vehicles.
        A drone is simply an unmanned craft, controlled remotely or by automation.

        https://www.merriam-webster.com/dictionary/drone

        an uncrewed aircraft or vessel guided by remote control or onboard computers

        • FigMcLargeHuge@sh.itjust.works
          link
          fedilink
          English
          arrow-up
          1
          arrow-down
          1
          ·
          23 hours ago

          It sure doesn’t say when that was updated, but for a long period of time the use of “drone” for unmanned aircraft was reserved for military craft that were usually armed and used to kill people. In an attempt to demonize hobby RC use, the press started calling simple quadcopters (and other propeller configurations, if we are being pedantic) drones, rather than what they were normally called by the people using and making them in the hobby. My point still stands: the press likes to change the wording of things and will perpetuate its narrative in order to garner views. Manufacturing fear is part of the tactic, and that is why I replied what I did to the question of why the press continues to push the false narrative of these cars being “self driving”.

    • helenslunch@feddit.nl
      link
      fedilink
      English
      arrow-up
      2
      arrow-down
      1
      ·
      1 day ago

      I thought it was illegal to call it full self driving?

      Courts have already ruled the opposite.

      why do articles keep pushing the false narrative

      Because that’s what it’s called.

          • Buffalox@lemmy.world
            link
            fedilink
            English
            arrow-up
            1
            ·
            20 hours ago

            You realize the Associated Press is responsible in California as well as any other state in the USA, right?

            • helenslunch@feddit.nl
              link
              fedilink
              English
              arrow-up
              1
              arrow-down
              1
              ·
              20 hours ago

              The Associated Press? The news org? Responsible for what?

              Look, we can settle this real quick. Go to Tesla.com and find the spot where it still says “full self driving”. Maybe use a VPN if you’re in California.

              • Buffalox@lemmy.world
                link
                fedilink
                English
                arrow-up
                1
                arrow-down
                1
                ·
                edit-2
                19 hours ago

                Since it’s supposed to support your argument, it’s your job to show the usage, not mine to find it.
                And even if they do, it’s false advertising to call it full self driving.
                And I don’t understand how they have not been sued into oblivion by now.

                • helenslunch@feddit.nl
                  link
                  fedilink
                  English
                  arrow-up
                  1
                  ·
                  19 hours ago

                  I didn’t ask you to find it. I told you where it was. If you want to cover your eyes with blinders, that’s your prerogative.