Tesla must face fraud suit for claiming its cars could fully drive themselves
  • Aceticon@lemmy.world · 6 months ago

    As I explained to somebody else the other day, software development follows a 90/10 rule: 90% of the work that needs doing is in the last 10% of the result, and these guys have been stuck at the “almost there” stage for years.

    It’s perfectly possible to hack your way through the first, easy 90% of the result, but that software development “method” won’t get you to the 99.999% level of reliability (or however many nines the regulations demand) needed for an FSD system to be certified as autonomous.
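
    To put rough numbers on those nines (my own illustrative arithmetic, not anything from an actual regulator), here’s a quick sketch of how fast the allowed failure budget shrinks with each extra nine:

    ```python
    # Toy arithmetic, not any regulator's actual threshold: how many failures
    # each reliability level tolerates per million operations (trips, decisions,
    # whatever unit you pick).
    for nines in range(2, 7):
        failure_rate = 10 ** -nines          # 99% -> 0.01, 99.999% -> 0.00001
        reliability_pct = 100 * (1 - failure_rate)
        per_million = failure_rate * 1_000_000
        print(f"{nines} nines ({reliability_pct:.4f}%): "
              f"{per_million:g} failures allowed per million operations")
    ```

    Going from 99% to 99.999% means going from 10,000 tolerated failures per million down to 10, and that last factor of a thousand is exactly the part that hacking your way forward doesn’t buy you.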

    So no amount of people showing full self-drive working without problems sometimes, or even most of the time (or, as you say, “practically”), will show that Tesla has the capability of doing the last 10% (which, remember, is most of the work). Meanwhile, the fact that they’ve been stuck at pretty much the current level for years is a good indication that they’re probably down a dead end that will never lead to something with the reliability needed to be certified as an autonomous system.

    Also, in my professional opinion as a very senior software engineer, looking from the outside and judging by many software and UI design choices in their vehicles, they’re unlikely to actually be competent enough to pull it off. They seem to be following a Tech Startup model (and I can tell you from experience in that industry and others that startups are usually amateur hour, every hour of the day, every day of the week, every week of the year, compared to everyone else), hence my mentioning above the possibility that they might have “hacked” (i.e. mainly gone at it by trial and error) their way up the first 90%.