No way… you’re telling me a free AI is profiting off my data?
Always run AI locally!
Is that feasible for someone with an office PC with integrated graphics? Asking for a friend.
If you have a lot of RAM, you can run small models slowly on the CPU. Your integrated graphics almost certainly won't fit anything useful in its VRAM, so if you really want to run something locally, a couple of extra sticks of RAM is probably your cheapest option.
I have 64 GB and I run 8-14B models. 32B is pushing it (it's just really slow).
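Rough back-of-the-envelope math, assuming ~4-bit quantized weights (common defaults in local runners like llama.cpp/Ollama) plus ~20% overhead for the KV cache and runtime; exact numbers vary by quant and context length:

```python
def approx_model_ram_gb(params_billion, bits_per_weight=4, overhead=1.2):
    """Rough RAM to hold quantized weights, plus ~20% for KV cache/runtime."""
    weight_bytes = params_billion * 1e9 * bits_per_weight / 8
    return weight_bytes / 1e9 * overhead

for b in (8, 14, 32):
    print(f"{b}B model: ~{approx_model_ram_gb(b):.0f} GB RAM")
```

So 8-14B fits comfortably in 64 GB with room for the OS, and even 32B fits; the slowness comes from CPU memory bandwidth, not capacity.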
Yeah, AI is even being trained on data provided by the Nazi Steve Huffman's website.