GPT-4 is getting worse over time, not better.
  • goldenbug@kbin.social · 46 points · 1 year ago
    You point out a very interesting issue, though I’m unsure how it ties in with GPT-4 getting worse at problem solving.

    • Action Bastard@lemmy.world · 46 points · 1 year ago (edited)
      I’d wager they’re attempting to replicate or integrate tools developed by the open-source community, or ones revealed by Meta’s leak of the Llama source code. The problem is, most of those were built on the back of Meta’s work, or were kludged-together solutions from OSS nerds who banged something out for a specific use case, often without the safeguards a company needs when it wants to monetize the software and could be held liable for its results.

      The problem is that Meta’s Llama source code isn’t based on GPT-4, so OpenAI has to reverse-engineer a lot of those useful traits and tools and retrofit them into its pre-existing code. They’re obviously hitting technical hurdles somewhere in that process, but I couldn’t say exactly where or why.

    • roguetrick@kbin.social · 15 points · 1 year ago (edited)
      Personally, I think this is just a result of the same thing that’s driving Reddit’s decisions: interest rates went up, and companies are finding ways to make up the shortfall that accounting is now presenting them with. Cutting compute costs by making your language model less introspective is one way to do that, and it’s less detrimental than raising your prices or firing your key employees.
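
      To put rough numbers on that (every figure below is invented for illustration, not OpenAI’s actual traffic or prices), here’s a back-of-envelope Python sketch of how trimming the average amount of generation per request turns into real savings at scale:

          # Back-of-envelope sketch; every number here is a made-up assumption.
          def monthly_serving_cost(requests_per_day: int,
                                   avg_output_tokens: int,
                                   cost_per_1k_tokens: float) -> float:
              """Rough monthly generation cost, assuming cost scales with output tokens."""
              daily_cost = requests_per_day * (avg_output_tokens / 1000) * cost_per_1k_tokens
              return daily_cost * 30

          # Hypothetical before/after: shorter, less "introspective" answers cut the bill.
          before = monthly_serving_cost(10_000_000, avg_output_tokens=800, cost_per_1k_tokens=0.06)
          after = monthly_serving_cost(10_000_000, avg_output_tokens=400, cost_per_1k_tokens=0.06)
          print(f"before ${before:,.0f}/mo, after ${after:,.0f}/mo, saved ${before - after:,.0f}/mo")

      The exact prices don’t matter; the point is that per-request compute scales linearly into the monthly bill, so quietly making answers cheaper to generate is an obvious lever to pull.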