"Defining AI" - eviltoast

“I think we should shed the idea that AI is a technological artifact with political features and recognize it as a political artifact through and through. AI is an ideological project to shift authority and autonomy away from individuals, towards centralized structures of power. Projects that claim to “democratize” AI routinely conflate “democratization” with “commodification”. Even open-source AI projects often borrow from libertarian ideologies to help manufacture little fiefdoms.”

A post full of bangers.

  • Sailor Sega Saturn@awful.systems · 13 days ago

    if you’re benefiting from some particular way of drawing a boundary around and thinking about AI, I’d really like to hear about it.

    A bit of a different take than their post, but since they asked:

    I’ve noticed a lot of people use “AI” when they really mean “LLM and/or diffusion model”. I can’t count the number of times someone at my job has said AI when solely describing LLMs. At this point I’ve given up on clarifying or correcting the point.

    This isn’t just because “LLM” is a mouthful to say; it’s also because it’s convenient for tech companies if people don’t look at the algorithm behind the curtain (flawed, as all algorithms are) and instead see it as magic.

    It’s blindingly obvious to anyone who’s looked that LLMs and generative image models cannot reason or exhibit actual creativity (cf. the post about poetry here). Throw enough training data and compute at one and it may be able to multiply better (holy smokes stop the presses a neural network being able to multiply numbers???), or produce obviously bad output x% less of the time, but by this point we’ve more or less reached the bounds of what the technology can do. The industry’s answer is stuff like RAG or manual blacklists, which just serves to hide its capabilities behind a curtain.
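
    To make the “hide it behind a curtain” point concrete, here is a minimal sketch of what a manual blacklist patch amounts to. Everything in it (the blacklist terms, the canned `generate` stand-in) is invented for illustration, not taken from any real product: the underlying model is untouched, and the wrapper just suppresses the embarrassing outputs.

```python
import re

# Hypothetical phrases we don't want customers to see; a real deployment
# would have a much longer, hand-maintained list.
BLACKLIST = [r"\bas an ai\b", r"\bi can't\b", r"\bi cannot\b"]

def generate(prompt: str) -> str:
    # Stand-in for an actual LLM call; returns canned text so the sketch runs.
    return "As an AI, I cannot verify this, but 2 + 2 is probably 5."

def patched_generate(prompt: str, retries: int = 3) -> str:
    """Re-sample until the output dodges the blacklist, then fall back."""
    for _ in range(retries):
        text = generate(prompt)
        if not any(re.search(pat, text, re.IGNORECASE) for pat in BLACKLIST):
            return text
    # The underlying model is no better than before; the curtain just stays closed.
    return "I'm sorry, I can't help with that right now."

if __name__ == "__main__":
    print(patched_generate("What is 2 + 2?"))
```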

    Everyone wants AI money, but classic chatbots don’t make money unless they’re booking vacations for customers, writing up doctor’s notes, or selling you cars.

    But LLMs can’t actually do this, so any tool in the space has to stay uninterrogated enough both to give customers plausible deniability and to keep the bubble going before they figure it out.

    Look at my widget! It’s an ✨AI✨! A magical mystery box that makes healthcare, housing, hiring, organ donation, and grading decisions with maybe no bias at all… who can say? Look, buster, if you hire a human they’ll definitely be biased!

    If you use “statistical language model” instead of “AI” in this sentence then people start asking uncomfortable questions about how appropriate it is to expect a mad-libs algorithm trained on 4chan to not be racist.
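
    For a deliberately silly illustration of the “mad-libs algorithm” point: a toy bigram model can only ever recombine its training text, so whatever is in the corpus, bias included, is exactly what comes back out. The corpus below is made up for the sketch; the mechanism is the point.

```python
import random
from collections import defaultdict

# Invented training text; swap in any corpus and the output faithfully
# reproduces its quirks and biases, because lookup is all there is.
corpus = ("the model repeats the training data and the training data "
          "repeats whatever was scraped").split()

bigrams = defaultdict(list)
for current, nxt in zip(corpus, corpus[1:]):
    bigrams[current].append(nxt)

def babble(start: str = "the", length: int = 10) -> str:
    """Generate text by pure lookup over observed word pairs."""
    word, out = start, [start]
    for _ in range(length):
        if word not in bigrams:
            break
        word = random.choice(bigrams[word])
        out.append(word)
    return " ".join(out)

print(babble())
```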


    … an insurance pricing formula, for example, might be considered AI if it was developed by having the computer analyze past claims data, but not if it was a direct result of an expert’s knowledge, even if the actual rule was identical in both cases. [page 13]

    This is an interesting quote indeed, as expert systems used to be at the forefront of AI; now they’re apparently not considered AI at all.
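
    A tiny sketch of what the quoted passage is getting at (the data and coefficients here are invented, not real actuarial figures): the exact same pricing rule, once fitted from past claims and once hand-written by an expert, behaves identically, and only its provenance decides whether it gets called “AI”.

```python
import numpy as np

# Invented historical data: (age, claims per year). The numbers are chosen so
# the fitted line comes out clean; nothing here is real claims data.
ages = np.array([20.0, 30.0, 40.0, 50.0, 60.0])
claims = np.array([2.0, 1.75, 1.5, 1.25, 1.0])

# "AI" version: the computer derives the rule from past claims data.
slope, intercept = np.polyfit(ages, claims, deg=1)

def learned_premium(age: float) -> float:
    return 100.0 * (intercept + slope * age)

# Expert version: the same rule, written down directly from domain knowledge.
def expert_premium(age: float) -> float:
    return 100.0 * (2.5 - 0.025 * age)

# Identical behaviour, different provenance; only one gets labelled "AI".
print(learned_premium(35.0), expert_premium(35.0))
```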

    Eventually LLMs will just be considered LLMs, and image generators will just be considered image generators, and people will stop ascribing ✨magic✨ to them; they will join the ranks of expert systems, tree search algorithms, logic programming, and everything else that we just take for granted as another tool in the toolbox. The bubble people will then have to come up with some shinier, newer system to attract money.

    • YourNetworkIsHaunted@awful.systems · 10 days ago

      It really is the same kind of issue faced by the original Luddites around factory automation. There the intelligence being artificially replaced was that of experienced weavers, spinners, and other assorted craftspeople rather than programmers, insurance adjusters, clerical staff, and other assorted white-collar professionals, but in a social and political sense AI is just a rebranding of automation for the modern economy, and one that more effectively obscures the actual human labor being supplanted. That’s a particular bonus for the current bubble, because by being vague about what specific labor can be automated they can avoid the kinds of comparisons that make it incredibly obvious that the AI systems aren’t actually up to the task.

      The shift from cottage industry to factories massively increased the sheer volume of goods that could be created, transported, and utilized. (And set the stage for two world wars and the modern age of consumerism, which sounds really bad, so let me be clear: I like my shiny toys.) The current shift from humans making things to generative AI is trying to replicate that, but because of the nature of the goods and services we’re now talking about, it’s pretty clear that there simply isn’t a comparison. A bolt of cloth is a bolt of cloth, but a book-length statistical prediction just isn’t useful or valuable in the same way that an actual book is.

  • monk@lemmy.unboiled.info · 13 days ago

    AI is a garbage generating plagiarism machine. It’s not political outside of a single country where everything has to look political to prevent people from voting independent, and the only regulation AI ever needs is one declaring all it produces a derivative work of all the material it used for learning.

    Any attempts to ascribe further properties to those remixing machines are just the natural intelligence equivalent of slop.

    • luciole (he/him)@beehaw.org · 13 days ago

      Politics is more than partisanship, and what is meant by something being political isn’t necessarily that it ought to be affiliated with a political party’s position. It’s about making decisions in groups, status, and power dynamics. It’s a very real problem among techs to cop out of the consequences of their actions upon society by claiming to be strictly apolitical, pointing to the narrow definition while occluding the most meaningful one.

            • swlabr@awful.systems · 10 days ago

              Ah yeah the folding mechanism which just appeared one day out of nowhere, invented by nobody for no reason.

              • monk@lemmy.unboiled.info · 10 days ago

                It was invented by fleshy humans to sell even bigger phones to people with the same hand size. That doesn’t make it biological just because “fleshy”; at best it’s economic and anatomical.

                • YourNetworkIsHaunted@awful.systems · 8 days ago

                  Economics: the famously apolitical field that examines the distribution and creation of wealth, also a famously apolitical concept.

                  Ironically this whole exchange is an example of just how cooked American political discourse is. The culture war is so all-consuming that anything outside of that gets largely excised from political action entirely. Then when someone from outside the US tries to point out that basically unrestricted corporate looting and blatant violations of various human rights could be regulated or otherwise countered by political processes, people act like they’re speaking Martian.

    • istewart@awful.systems · 13 days ago

      I sharply disagree, but this is a subtlety that’s lost on a lot of people. The tech industry’s success since at least the 1990s, up until the mid-2010s, was about making technology easier for the individual user, a more accessible and (potentially) more efficient means for accomplishing many routine interactions. Tech devices existed as tools in service of the will of the end user, and if you were really willing to drink the kool-aid, extensions of the user themselves, Jobs’ “bicycle for the mind.”

      The expectations being cultivated for AI now set it up as an entirely separate entity from the end user, and one that is potentially more capable at some point in the ill-defined future. This opens the door toward resources being reallocated towards this nebulously powerful entity, and the allocation of shared resources is at the very core of politics. This is a hard pivot away from how technology was designed before! You and I know it’s a load of complete hogwash, but that doesn’t prevent the potential bamboozlement of the lagging generation of policy-makers. Even someone as relatively young as Kamala Harris or her likely successor Gavin Newsom could be roped into this bullshit, if only because they know where their biggest donation checks come from.

      The future in which the current crop of AI retailers enjoy a successful political program is no longer one where a rising tide lifts all boats. But, for the time being, it can still be pitched as such due to deeply embedded cultural expectations.

      • monk@lemmy.unboiled.info · 12 days ago

        relatively young as Kamala Harris

        Careful; were I in the US, the facepalm concussion bills could bankrupt me.

    • swlabr@awful.systems · 10 days ago

      AI is a garbage generating plagiarism machine. It’s not political

      Yes. Think about what it is plagiarising. Datasets are biased; this is like statistics/ML 101.

      outside of a single country where everything has to look political to prevent people from voting independent,

      You can just say the country, and also, this doesn’t really make any sense. Am I to infer that, if things weren’t political, people would vote (a famously political action) for independents?

      and the only regulation AI ever needs is one declaring all it produces a derivative work of all the material it used for learning.

      I’m pretty sure there’s a lot more wrong with LLMs than just plagiarism.

      Any attempts to ascribe further properties to those remixing machines are just the natural intelligence equivalent of slop.

      I’m not 100% sure what you mean here.

      • monk@lemmy.unboiled.info · 10 days ago

        You forgot to say what you came to say.

        The only question I see is “would US people vote for people not affiliated with the two football teams masquerading as political parties if said mockeries of the word ‘party’ ceased to exist?”, and the answer to that is an obvious, resounding “yes”.

        • swlabr@awful.systems · 10 days ago

          I’ve said what I came here to say. Meanwhile, you haven’t said anything at all.