GPT-4 is getting worse over time, not better.
  • Action Bastard@lemmy.world · 1 year ago

    I’m not terribly surprised. A lot of the major leaps we’re seeing now came out of open source development after leaked builds got out. There were all sorts of articles flying around at the time about employees from various AI-focused companies saying they were seeing people solve in hours or days issues they had been attempting to fix for months.

    Then they all freaked the fuck out that it might mean they would lose the AI race and locked down their repos tight as Fort Knox, completely ignoring the fact that a lot of them were barely making ground at all while they kept everything locked up.

    Seems like the simple fact of the matter is that they need more eyes and hands on the tech, but nobody wants to do that because they’re all afraid their competitors will benefit more than they will.

    • goldenbug@kbin.social · 1 year ago

      You point out a very interesting issue. I’m unsure how it ties in to GPT-4 becoming worse at problem solving, though.

      • Action Bastard@lemmy.world · 1 year ago

        I’d wager they’re attempting to replicate or integrate tools developed by the open source community, or ones that came out of the leak of Meta’s Llama source code. The problem is, all of those were largely built on the back of Meta’s work, or were kludged-together solutions from OSS nerds who banged something out for one specific use case, often without many of the protections that would be required of a company that could be held liable for the results of its software once it tries to monetize it.

        Now, the problem is that GPT-4 isn’t built on Meta’s Llama source code, so OpenAI is having to reverse engineer a lot of those useful traits and tools and retrofit them into their pre-existing codebase. They’re obviously hitting technical hurdles somewhere in that process, but I couldn’t say exactly where or why.

      • roguetrick@kbin.social · 1 year ago

        Personally, I think this is just a result of the same pressure that’s driving what Reddit is doing. Interest rates rose, and companies are finding ways to make up the shortfall that accounting is now presenting them with. Reducing compute costs by making your language model less introspective is one way to do that, and it’s less detrimental than raising your prices or firing your key employees.