Ollama - super easy to host local LLM
  • graveyard_bloom@alien.top · 1 year ago

    Ollama is pretty sweet; I’m self-hosting it with 3B models on an old X79 server. I created a neat terminal AI client called “Jeeves Assistant” that makes requests to it over the local network.
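
    For anyone curious how a client like that talks to Ollama, here is a minimal sketch using only the Python standard library. It posts to Ollama’s `/api/generate` endpoint (port 11434 by default). The LAN address and the model tag are assumptions for illustration; “Jeeves Assistant” itself isn’t public, so this is not its actual code.

    ```python
    import json
    import urllib.request

    # Hypothetical LAN address of the Ollama server; adjust to your host.
    OLLAMA_URL = "http://192.168.1.50:11434/api/generate"

    def ask(prompt: str, model: str = "orca-mini:3b") -> str:
        """Send a prompt to Ollama's /api/generate endpoint and return the reply."""
        payload = json.dumps({
            "model": model,   # assumed 3B model tag; use whatever you've pulled
            "prompt": prompt,
            "stream": False,  # request one JSON object instead of a token stream
        }).encode("utf-8")
        req = urllib.request.Request(
            OLLAMA_URL,
            data=payload,
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            return json.loads(resp.read())["response"]

    if __name__ == "__main__":
        print(ask("Why is the sky blue?"))
    ```

    Setting `"stream": False` keeps the client simple; with the default streaming mode you would instead read newline-delimited JSON chunks and print tokens as they arrive, which is what makes a terminal assistant feel responsive.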