• @anlumo@feddit.de
    4 • 10 months ago

    As a software developer, I can tell you that’s not how testing works. QA is always trying to come up with weird edge cases to test, but once software is out in the wild with thousands (or more) of real-world users, there’s always going to be something nobody ever tried to test.

    For example, there was a crash where an unmarked truck, exactly the same color as the sky, was sitting 90° sideways across the highway. That’s just not a scenario you’d think of under lab conditions.

      • @abhibeckert@beehaw.org
        3 • edited • 10 months ago

        And a thing blocking the road isn’t exactly unforeseen either.

        Tesla’s system intentionally assumes “a thing blocking the road” is a sensor error.

        They have said that if they didn’t do that, roughly every hour you’d drive past a building, the system would slam on the brakes, and the car would stop in the middle of the road for no reason (and then, probably, the car behind would crash into you).
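
        The trade-off described above can be sketched in a few lines. This is a hypothetical illustration, not Tesla’s actual code: `Detection`, `should_brake`, and the thresholds are all made up. The point is that if a perception stack demands near-certainty before braking for *stationary* objects (to avoid phantom braking at buildings and overpasses), it necessarily ignores some real, hard-to-see stationary hazards too, because the filter can’t tell the two apart:

        ```python
        from dataclasses import dataclass

        @dataclass
        class Detection:
            label: str         # what the classifier thinks it saw
            confidence: float  # 0.0 to 1.0
            is_moving: bool    # stationary objects are the ambiguous case

        def should_brake(d: Detection, stationary_threshold: float = 0.95) -> bool:
            """Brake for moving obstacles at moderate confidence, but demand
            near-certainty for stationary ones, treating the rest as noise."""
            if d.is_moving:
                return d.confidence > 0.5
            return d.confidence >= stationary_threshold

        # A false positive from a building facade is correctly ignored...
        building_glitch = Detection("obstacle", 0.70, is_moving=False)
        # ...but a real sky-colored truck with the same low confidence is too.
        camouflaged_truck = Detection("obstacle", 0.70, is_moving=False)

        assert not should_brake(building_glitch)
        assert not should_brake(camouflaged_truck)  # identical to the filter
        ```

        Raising the threshold reduces phantom braking but misses more real obstacles; lowering it does the reverse. With camera-only sensing, the two cases produce genuinely similar inputs, so no threshold separates them cleanly.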

        The better sensors used by companies like Waymo (lidar, in particular) don’t have that problem: they measure distance directly instead of inferring it from camera images, so they’re far more accurate at detecting obstacles.