NVIDIA GPU on a Docker headless system

Hi there! I have an old PC that I use as a server (running Ubuntu Server) and I would like to add an NVIDIA 1050 to it (for Jellyfin and Guacamole).

In the past I tried to do it and somehow corrupted the system when installing the video drivers; I have always had complications installing NVIDIA drivers on Linux.

Could any of you help me figure out the right way to install the video drivers on a headless system and use them correctly with Docker Compose?

Thanks in advance 😄

  • m12421k@iusearchlinux.fyi · 1 year ago

    Not exactly what you want, but check out the Games on Whales Wolf docs. It's configured for GPU-accelerated X11 apps on Docker. If I recall correctly, one of the documents explains setting up NVIDIA drivers on Docker completely.

    • Trincapinones@lemmy.world (OP) · 1 year ago

      Thanks for the help! I'm trying to install the driver from the official repo but I don't know what version I should use for a headless Docker setup with a 1050.

      I think that's what went wrong last time: I tried to install the driver from the NVIDIA documentation thinking that it would "just work", like on Windows 😓

      Edit: I've solved it and now it works perfectly. Thanks again!
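
      For anyone who lands here with the same question, the headless route through Ubuntu's own repo is roughly the following (the 535 branch is only an example; ubuntu-drivers devices tells you which branch it recommends for your card):

          # Ask Ubuntu which driver branch it recommends for this GPU
          sudo ubuntu-drivers devices

          # Headless packages: kernel module + CLI tools, no X/desktop bits
          sudo apt install nvidia-headless-535 nvidia-utils-535

          # Reboot, then confirm the driver can see the card
          sudo reboot
          nvidia-smi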

    • Trincapinones@lemmy.world (OP) · 1 year ago

      Thank you! Last time I used the official desktop drivers because most guides recommended them. Once I install the NVIDIA Container Toolkit, I can include the GPU in the container runtime and it should work, right?

      • fraydabson@sopuli.xyz · 1 year ago

        I used the desktop drivers as well (on Arch, from the extra repo) for my headless Arch server.

        Regarding the NVIDIA Container Toolkit: once it was installed, I added this to my Jellyfin docker compose:

        deploy:
          resources:
            reservations:
              devices:
                - driver: nvidia
                  capabilities: [gpu]
        

        Then, to confirm, I ran docker exec -it jellyfin nvidia-smi, which responded with my GPU. Note that (for me) the "Processes" part of nvidia-smi comes up blank even when Jellyfin is using it. I can tell it is working from the Jellyfin logs, and when the GPU is not in use that part says "no processes" instead of being blank.

        Edit for formatting, and to add that I believe I also had to add these environment variables to the Jellyfin container (I am using lsio's image):

              - NVIDIA_DRIVER_CAPABILITIES=all
              - NVIDIA_VISIBLE_DEVICES=all
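
        Putting the two snippets together, the relevant parts of the service end up looking roughly like this (image tag, ports and volume paths are just placeholders, not my exact setup):

          services:
            jellyfin:
              image: lscr.io/linuxserver/jellyfin:latest   # lsio image; tag is a placeholder
              environment:
                - NVIDIA_DRIVER_CAPABILITIES=all
                - NVIDIA_VISIBLE_DEVICES=all
              volumes:
                - ./config:/config        # placeholder paths
                - ./media:/data/media
              ports:
                - 8096:8096
              deploy:
                resources:
                  reservations:
                    devices:
                      - driver: nvidia
                        capabilities: [gpu]
              restart: unless-stopped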
        
    • Dandroid@dandroid.app · 1 year ago

      I recently did this and found those instructions to be beyond useless. The repository URIs were all old and dead. Not sure if they have updated the doc since then, but they combined all the deb-based distros into one repo and all the rpm-based distros into another.
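
      At the time of writing, the combined deb repo setup boils down to roughly the following; I'd still double-check it against the current container-toolkit install guide rather than copying it blindly:

        # Add NVIDIA's combined deb repo and signing key
        curl -fsSL https://nvidia.github.io/libnvidia-container/gpgkey | \
          sudo gpg --dearmor -o /usr/share/keyrings/nvidia-container-toolkit-keyring.gpg
        curl -s -L https://nvidia.github.io/libnvidia-container/stable/deb/nvidia-container-toolkit.list | \
          sed 's#deb https://#deb [signed-by=/usr/share/keyrings/nvidia-container-toolkit-keyring.gpg] https://#g' | \
          sudo tee /etc/apt/sources.list.d/nvidia-container-toolkit.list

        # Install the toolkit, register the nvidia runtime with Docker, restart Docker
        sudo apt-get update && sudo apt-get install -y nvidia-container-toolkit
        sudo nvidia-ctk runtime configure --runtime=docker
        sudo systemctl restart docker

        # Smoke test: the toolkit mounts nvidia-smi into the container
        docker run --rm --gpus all ubuntu nvidia-smi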

  • netwren@lemmy.world · 1 year ago

    Not Docker, but you could run k3s and use the NVIDIA GPU Operator to manage installing the video drivers for you on your single-node cluster.
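
    Rough sketch once k3s is up, assuming Helm is installed (chart coordinates per NVIDIA's GPU Operator docs; k3s uses a non-standard containerd config path, so check the operator's k3s notes for any extra values):

      helm repo add nvidia https://helm.ngc.nvidia.com/nvidia
      helm repo update
      helm install --wait gpu-operator -n gpu-operator --create-namespace nvidia/gpu-operator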

      • netwren@lemmy.world · 1 year ago

        It actually was pretty straightforward. Saying this from experience, as I used a TensorRT container image with a 1060 for image clarification.
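
        On the workload side, once the operator is running a pod just requests the GPU as a resource limit; a minimal example along these lines (the TensorRT image tag is a placeholder):

          apiVersion: v1
          kind: Pod
          metadata:
            name: gpu-test
          spec:
            restartPolicy: Never
            containers:
              - name: trt
                image: nvcr.io/nvidia/tensorrt:23.08-py3   # placeholder tag
                command: ["nvidia-smi"]                    # just prove the GPU is visible
                resources:
                  limits:
                    nvidia.com/gpu: 1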

    • notfromhere@lemmy.one · 1 year ago

      Do you have a link to a tutorial on this? I've been thinking about adding my amd64 server with an NVIDIA GPU to my Raspberry Pi k3s cluster.

      • netwren@lemmy.world · 1 year ago

        No, but maybe I should write one up, because it was fairly simple for me to do after some research.