DeepSeek 'shared user data' with TikTok owner ByteDance - eviltoast
      • BakedCatboy@lemmy.ml · 5 days ago

        If you have a lot of RAM, you can run small models slowly on the CPU. Your integrated graphics likely won’t fit anything useful in its VRAM, so if you really want to run something locally, a few extra sticks of RAM are probably your cheapest option.

        I have 64 GB and I run 8–14B models. 32B is pushing it (it’s just really slow).
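        A quick back-of-the-envelope check supports those numbers. This is a rough sketch, not a real profiler: it assumes 4-bit quantized weights plus a hypothetical 20% fudge factor for KV-cache and runtime overhead, which varies a lot by context length and runtime.

        ```python
        # Rough RAM estimate for running a quantized model on CPU.
        # bits_per_weight=4 assumes a 4-bit quantization; overhead=1.2 is a
        # hypothetical fudge factor for KV-cache and runtime memory.
        def est_ram_gb(params_b, bits_per_weight=4, overhead=1.2):
            """Approximate resident memory in GB for a model with
            params_b billion parameters."""
            weights_gb = params_b * bits_per_weight / 8  # 1e9 params and 1e9 bytes/GB cancel
            return weights_gb * overhead

        for size in (8, 14, 32):
            print(f"{size}B @ 4-bit: ~{est_ram_gb(size):.1f} GB")
        # → 8B ≈ 4.8 GB, 14B ≈ 8.4 GB, 32B ≈ 19.2 GB
        ```

        All three fit in 64 GB, which matches the comment: the 32B model isn’t RAM-limited there, it’s just slow on CPU memory bandwidth.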

    • jonne@infosec.pub · 5 days ago

      Yeah, AI is even being trained on data from the Nazi Steve Huffman’s website.