AI bots hallucinate software packages and devs download them - eviltoast
  • Prandom_returns@lemm.ee · 7 months ago · +1 / −25

    Wow, clever. Did you literally hallucinate this yourself or did you ask your LLM girlfriend for help?

    And by literally, I mean figuratively.

      • Prandom_returns@lemm.ee · 7 months ago · +1 / −18

        I know it’s a big word, but surely you can google what anthropomorphization is? Don’t “ask” an LLM; those things output garbage. Just google it.

        • bbuez@lemmy.world · 7 months ago · +10 / −1

          Watch out those software bugs may start crawling out of your keyboard

        • QuaternionsRock@lemmy.world · 7 months ago · +1

          No fucking shit it’s an anthropomorphization, nothing that can be hosted on GitHub has true human qualities…

          The point is that everyone knows what it means within that context of AI, and using other terminology would only serve to obfuscate your message such that the average person couldn’t understand it as easily.

          Non-living things also don’t have “behavior” (“the way in which someone conducts oneself or behaves”), but, hey, look! People started anthropomorphizing things so much that a second sense got added to the dictionary (“the way in which something functions or operates”).

          It may not be ideal, and it may convince some people that LLMs are more human-like than they really are, but the one thing you haven’t done is suggest an alternative that would convey the meaning as effectively to the masses.

        • Flying Squid@lemmy.world · 7 months ago · +1 / −1

          You call it a large language model, but there are much bigger things, it’s only approximating a human language, and it isn’t a physical model.