Git-based Workflow for updating containers - eviltoast

TL;DR: I want to keep my containers up to date; currently that means Portainer-based compose files updated by renovate. How do you do it?

Status Quo

I’m hosting a few containers on my Unraid home server for personal use, but I don’t use the Unraid web interface to control them. Instead, I run Portainer CE in a container on the host. Within Portainer I use the “Stacks” feature to define my containers. The stack files (basically docker-compose files) reside in a private GitHub repository. I’ve configured renovate to open pull requests against that repository whenever there are updates for the container images (i.e. newly tagged images).
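For context, each stack file is just an ordinary compose file; everything below (service name, image, port) is a made-up example rather than my actual setup:

```yaml
version: "3.8"
services:
  whoami:
    # Pinned tag, so renovate can propose version bumps via pull request
    image: traefik/whoami:v1.10.1
    restart: unless-stopped
    ports:
      - "8080:80"
```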

Issues

Currently I’m not really satisfied with that workflow. These are the issues I have:

  • It’s not really automatic: I still have to approve the pull requests on GitHub manually, even though I don’t test them before applying
  • I once updated a container whose new version changed the application’s database structure; I had to restore the application data from a backup manually
  • Some containers I use don’t have proper versioning (e.g. only a “latest” image)
  • For some containers renovate doesn’t open pull requests for updates. I think it’s because the images are hosted on GitHub or other registries rather than Docker Hub.
  • Adding new stacks to Portainer is cumbersome: I have to specify the Git repository, the path of the docker-compose file, and credentials every time.
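On the registry point: as far as I know, renovate’s docker-compose manager also follows fully qualified image references, so spelling out the registry host in the compose file may already be enough (the image name below is a made-up example):

```yaml
services:
  app:
    # Registry host written out explicitly (ghcr.io instead of the
    # implicit Docker Hub default) so the update source is unambiguous
    image: ghcr.io/example-org/example-app:1.4.2
```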

Wishlist

What I would like to have:

  • Automatic Updates to my containers (bug fixes, new features, security fixes)
    • Updates should apply automatically except if I pin the image tag/version
  • Before updating a container, it should be shut down and a copy of its application data should be created
  • If the container exits unexpectedly after an update, a rollback should be applied automatically, with a notification to me and no further updates for that container until I re-enable it
  • Container definitions should live in version-controlled text, e.g. docker-compose files in a Git repo
  • The solution should be self-hosted
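A minimal sketch of the backup-then-update-then-rollback idea, assuming a single compose service with a data directory; the service name, paths, and the crude health check are all assumptions, not a finished tool:

```shell
#!/usr/bin/env bash
set -euo pipefail

SERVICE=app                    # assumed compose service name
DATA_DIR=/srv/app-data         # assumed application data path
BACKUP="/srv/backups/${SERVICE}-$(date +%F-%H%M%S).tar.gz"

# 1. Stop the container and snapshot its data before touching the image.
docker compose stop "$SERVICE"
tar -czf "$BACKUP" -C "$(dirname "$DATA_DIR")" "$(basename "$DATA_DIR")"

# 2. Pull the new image and start the service again.
docker compose pull "$SERVICE"
docker compose up -d "$SERVICE"

# 3. Crude health check: is the container still running shortly after start?
sleep 30
if [ -z "$(docker compose ps -q --status running "$SERVICE")" ]; then
  echo "Update of $SERVICE failed, restoring data from $BACKUP" >&2
  docker compose stop "$SERVICE"
  rm -rf "$DATA_DIR"
  tar -xzf "$BACKUP" -C "$(dirname "$DATA_DIR")"
  # Rolling the image itself back would additionally require re-pinning
  # the previous tag in the compose file before starting the service.
  exit 1
fi
echo "Update of $SERVICE looks healthy"
```

This only restores the data, not the image version; a real rollback would also revert the compose file in Git.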

Questions

I’m aware of watchtower, but as far as I can see it only updates the live configuration of the system, so there’s no version control or rollback. What do you folks think? Are my requirements stupid overkill for a home server? How do you keep your container-based applications up to date?

  • Haui@discuss.tchncs.de · 1 year ago

    Nice! For me it’s like 10+ stacks and maybe 15 containers, all managed by changing compose files which I’m constantly improving: adding .env files, moving databases and heavy-workload paths to SSDs instead of HDDs, and so on. It’s insane what you can do.

    Backing it up is less easy for me since I have to dump 3 databases and copy lots of config files. But tar.gz is my friend. :)
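    That dump-then-archive routine can be scripted in a few lines; a sketch assuming one Postgres service in compose, with made-up service, user, and path names:

    ```shell
    #!/usr/bin/env bash
    set -euo pipefail

    STAMP=$(date +%F)
    WORK=$(mktemp -d)

    # Dump the database from inside the container
    # (service, user, and database names are assumptions).
    docker compose exec -T db pg_dump -U appuser appdb > "$WORK/appdb-$STAMP.sql"

    # Collect the config files alongside the dump, then archive everything.
    cp -r /srv/app/config "$WORK/config"
    tar -czf "/srv/backups/app-$STAMP.tar.gz" -C "$WORK" .
    rm -rf "$WORK"
    ```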

    • Bristlerock@kbin.social · 1 year ago

      Yeah, it makes for a nice workflow, doesn’t it? It doesn’t give you the “fully automated” achievement, but it’s not much of a chore. :)

      Have you considered something like borgbackup? It does good deduplication, so you won’t have umpteen copies of unchanged files.

      I use it mostly to back up my daily-driver laptop to my NAS, and the GitLab CE container running on the NAS acts as the equivalent for its local Git repos, which are then straightforward to copy elsewhere. Though I haven’t got it scripting anything like bouncing containers or DB dumps.
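      In case it helps as a starting point, the basic borgbackup loop is quite short; the repository path, archive name, and retention numbers below are just examples:

      ```shell
      # One-time: create an encrypted repository (path is an example).
      borg init --encryption=repokey /mnt/nas/borg-repo

      # Per run: create a deduplicated archive, then prune old ones.
      borg create --stats /mnt/nas/borg-repo::laptop-{now} /home/user
      borg prune --keep-daily 7 --keep-weekly 4 --keep-monthly 6 /mnt/nas/borg-repo
      ```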

      • Haui@discuss.tchncs.de · 1 year ago

        It’s pretty awesome but I think I still need to improve a lot of stuff.

        Sadly, deduplication doesn’t help me much for my docker configs as there are not many files. It would help for my bulk-storage backup, but I think iDrive already has something like this in its scheduling program.