What can an individual do to fight against tech billionaires?
  • TheReturnOfPEB@reddthat.com · 1 day ago (edited)

    For example:

    Get off of Facebook (easy). Don't buy a Tesla or use Starlink (easy). Don't buy on Amazon (difficult but doable). Don't upgrade your iPhone, and don't buy new Apple products (moderate). Don't use ChatGPT (easy).

    • Diplomjodler@lemmy.world · 1 day ago

      Use Linux and open-source software. Contribute to open-source projects. Buy hardware second-hand. Use non-corporate social media. Buy local. Get your stuff fixed instead of throwing it away. Avoid data harvesting where possible.

      • haui@lemmy.giftedmc.com · 1 day ago

        It's as if the Linux hardliners were right all along. Almost as if the people who laughed at our cautionary tales are the ones left holding the bag now.

        I'm not saying "we told you so", but…

    • Skasi@lemmy.world · 1 day ago

      I'd add not using Amazon Prime, Amazon Web Services, and other Amazon services. Not using X, and being critical of SpaceX. Also, stop advertising these things: stop telling your friends about them, maybe even stop talking about them altogether. For some strange reason, sometimes bad press is better than no press.

      • ramble81@lemm.ee · 1 day ago

        My company: "We're going all in on the cloud! Specifically AWS." *proceeds to pay them millions per year*

    • tiredofsametab@fedia.io · 1 day ago

      "or use starlink"

      I don't know if I'd call that easy for some people in very remote areas. Easy for me, easy for you, but not necessarily easy in some cases. Here's hoping a good competitor can get to those places.

    • dasenboy@lemm.ee · 1 day ago

      I use ChatGPT a lot. What's the best non-billionaire-funded LLM? I really need to change to one that doesn't worsen the world…

      • iheartneopets@lemm.ee · 10 hours ago

        That may be hard, seeing as all AIs use ungodly amounts of electricity. So I’d say they all worsen the world.

      • knatschus@discuss.tchncs.de · 1 day ago

        While DeepSeek is billionaire-funded, it should still be better if run locally. I don't think FOSS LLMs are at that level yet.

        • dasenboy@lemm.ee · 17 hours ago

          Thanks, another person mentioned it too. I'm trying it now; hopefully it suits my needs.

      • MajorSauce@sh.itjust.works · 1 day ago

        Try hosting DeepSeek R1 locally. For me the results are similar to ChatGPT, without needing to send any info over the internet.

        LM Studio is a good start.
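
        If it helps, here's a minimal sketch of talking to a locally hosted model from code, assuming LM Studio's local server is running (by default it exposes an OpenAI-compatible API at http://localhost:1234/v1). The model id below is an assumption; use whatever name LM Studio shows for your download.

        ```python
        # Query a model hosted locally by LM Studio; nothing leaves your machine.
        from openai import OpenAI

        client = OpenAI(
            base_url="http://localhost:1234/v1",
            api_key="lm-studio",  # LM Studio doesn't check the key; any string works
        )

        response = client.chat.completions.create(
            model="deepseek-r1-distill-qwen-7b",  # hypothetical id; check your download
            messages=[{"role": "user", "content": "Why does local inference protect privacy?"}],
        )
        print(response.choices[0].message.content)
        ```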

          • shadow@lemmy.sdf.org · 8 hours ago

            Any relatively new gaming PC from the last, what, four years? has enough power to run local LLMs. Maybe not the ginormous 70B behemoth models, but the toned-down ones are pretty damn good, and if you don't mind waiting a few seconds while it thinks, you can run them completely locally, as much as you want, whenever you want. A sketch of how little ceremony that takes is below.
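
            As a rough illustration, here's a sketch using the Ollama Python client (Ollama is one popular local runner; the model tag below is an assumption, so substitute any small quantized model you like).

            ```python
            # Pull a small quantized model and chat with it entirely offline.
            import ollama

            ollama.pull("llama3.2:3b")  # a few GB on disk; fits most consumer GPUs

            reply = ollama.chat(
                model="llama3.2:3b",
                messages=[{"role": "user", "content": "What can you do without internet?"}],
            )
            print(reply["message"]["content"])
            ```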

          • MajorSauce@sh.itjust.works · 15 hours ago (edited)

            You would benefit from some GPU offloading; it considerably speeds up the answers. But at the bare minimum, you only need enough RAM to load the model.
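
            For anyone curious what GPU offloading looks like in practice, here's a minimal sketch using llama-cpp-python, one common local runner (LM Studio exposes the same knob as a "GPU offload" slider). The model path is hypothetical; point it at whatever GGUF file you downloaded.

            ```python
            # Load a quantized model with part of its layers offloaded to the GPU.
            # n_gpu_layers=0 runs entirely on CPU/RAM; higher values trade VRAM for speed.
            from llama_cpp import Llama

            llm = Llama(
                model_path="models/deepseek-r1-7b.Q4_K_M.gguf",  # hypothetical path
                n_gpu_layers=32,  # how many layers to push onto the GPU
                n_ctx=4096,       # context window; bigger needs more memory
            )

            out = llm.create_chat_completion(
                messages=[{"role": "user", "content": "One sentence on GPU offloading."}]
            )
            print(out["choices"][0]["message"]["content"])
            ```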