Self-hosted LLM

Hello internet users. I have tried GPT4All and like it, but it is very slow on my laptop. I was wondering if anyone here knows of any solutions I could run on my server (Debian 12, AMD CPU, Intel A380 GPU) through a web interface. Has anyone found a good way to do this?

  • Zelyios@lemmy.world · 10 months ago

    OP should try H2OGPT; it is somewhat technical, but the UI makes it easy to configure. You can select from many models and prompt types, and you can even upload your own documents so that the AI uses them when answering. A minimal client-side sketch follows below.
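
    Many self-hosted stacks (H2OGPT included, if I recall correctly) can expose an OpenAI-compatible HTTP API alongside the web UI. Here is a minimal sketch of querying such an endpoint from another machine; this is not H2OGPT's own documented workflow, and the server address, port, API key, and model name are placeholders you would swap for whatever your server actually runs.

    ```python
    # Hedged sketch: query a self-hosted LLM through an OpenAI-compatible
    # endpoint. The base_url, api_key, and model name are placeholders;
    # point them at whatever your local server actually serves.
    from openai import OpenAI

    client = OpenAI(
        base_url="http://your-server:8000/v1",  # placeholder: address of your local server
        api_key="not-needed-for-local",         # many local servers ignore the key entirely
    )

    response = client.chat.completions.create(
        model="local-model",  # placeholder: whichever model the backend has loaded
        messages=[{"role": "user", "content": "Summarize my uploaded notes."}],
    )

    print(response.choices[0].message.content)
    ```

    Day to day you would still use the web UI in the browser; a snippet like this is just a quick way to confirm the server is reachable and answering once it is up.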