'Touch points' in HA - eviltoast

I was reading the HA roadmap and thinking about the points where everyone (else) interacts with my HA environment. I’ve wanted displays/dashboards for a long time but mostly have either battery powered buttons or smart wall switches. These are good in that I can automate them but with two teenage children we have a lot of variability.

Tell me how everyone else uses HA in your house. Do they love it? Do they see only that buttons ‘do things’? Do they read dashboards and crave data?

  • schizo@forum.uncomfortable.business · 6 points · 1 month ago

    I’ve gone way too far down the automation path.

    All manner of temperature, humidity, occupancy, motion, and air quality sensors drive all sorts of appropriate responses.

    For example, I’ve got a mmwave motion/occupancy sensor in the bathroom, and if there’s no motion/occupancy and the humidity is more than 5% higher than the hallway sensor, then turn on the exhaust fan until it’s not.
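
    In HA YAML that rule maps onto a single automation; a rough sketch (the entity IDs are placeholders rather than real names, and a mirror-image automation or a second trigger would switch the fan back off):

    ```yaml
    # Fan on when the bathroom is unoccupied and noticeably more humid
    # than the hallway. All entity IDs are illustrative placeholders.
    - alias: "Bathroom exhaust fan - humidity spike"
      trigger:
        - platform: template
          value_template: >
            {{ (states('sensor.bathroom_humidity') | float(0))
               - (states('sensor.hallway_humidity') | float(0)) > 5 }}
      condition:
        - condition: state
          entity_id: binary_sensor.bathroom_occupancy
          state: "off"
      action:
        - service: fan.turn_on
          target:
            entity_id: fan.bathroom_exhaust
    ```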

    Or, if the air particulate count in the kitchen is too high, turn on the exhaust fan until it’s not.
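
    The particulate version is just a numeric threshold with a bit of hysteresis; roughly like this, assuming a PM2.5 sensor (the 50/35 µg/m³ cut-offs and entity IDs are made-up placeholders):

    ```yaml
    # Kitchen exhaust fan follows the PM2.5 reading: on above the
    # threshold, off once it has settled back down for a few minutes.
    - alias: "Kitchen exhaust fan - air quality"
      trigger:
        - platform: numeric_state
          entity_id: sensor.kitchen_pm25
          above: 50
          id: dirty
        - platform: numeric_state
          entity_id: sensor.kitchen_pm25
          below: 35
          for: "00:05:00"
          id: clean
      action:
        - choose:
            - conditions:
                - condition: trigger
                  id: dirty
              sequence:
                - service: fan.turn_on
                  target:
                    entity_id: fan.kitchen_exhaust
            - conditions:
                - condition: trigger
                  id: clean
              sequence:
                - service: fan.turn_off
                  target:
                    entity_id: fan.kitchen_exhaust
    ```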

    Or, if the living room is occupied, and the tv is on and playing media, turn the overhead lights off and turn the RGB accent light on very dimly. And if the media is paused or stopped, increase the brightness of the RGB lighting so you can see where you’re walking, and if it stays paused or stopped for more than 10 minutes, turn the main lights back to whatever state they were in before media playback started.
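
    The “put the lights back how they were” part is the fiddly bit; one way to sketch it is a scene snapshot taken when playback starts and restored after ten minutes of pause/stop (entity IDs are placeholders, and the “brighten the accent light while paused” step would be a third automation along the same lines):

    ```yaml
    # On playback: snapshot the current light states, kill the main
    # lights, dim the RGB accent. After 10 minutes of pause/stop:
    # restore the snapshot.
    - alias: "Living room - movie lighting"
      trigger:
        - platform: state
          entity_id: media_player.living_room_tv
          to: "playing"
      action:
        - service: scene.create
          data:
            scene_id: before_media
            snapshot_entities:
              - light.living_room_main
              - light.living_room_rgb
        - service: light.turn_off
          target:
            entity_id: light.living_room_main
        - service: light.turn_on
          target:
            entity_id: light.living_room_rgb
          data:
            brightness_pct: 10

    - alias: "Living room - restore lights after playback"
      trigger:
        - platform: state
          entity_id: media_player.living_room_tv
          to:
            - "paused"
            - "idle"
          for: "00:10:00"
      action:
        - service: scene.turn_on
          target:
            entity_id: scene.before_media
    ```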

    No dashboards though, since the goal is essentially that you don’t have to think about what is going on, because it should Just Work™ and never be something you have to deal with.

    …though, really, I’d say we’re at like 80% successful with that.

    For manual interactions I’ve got a bunch of NFC tags in various places that will trigger the appropriate automation in the case that you either want to do it by hand or it fails to do the needful, plus the app is configured to allow manual control of any device and to trigger specific automations.
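
    For reference, a scanned tag is just another trigger type in HA, so each sticker can be wired straight to an automation, roughly like this (the tag_id is a placeholder for whatever shows up under Settings → Tags after scanning the sticker):

    ```yaml
    # Tapping the sticker toggles the fan regardless of what the
    # sensor-driven automations are doing.
    - alias: "NFC - bathroom fan override"
      trigger:
        - platform: tag
          tag_id: "a1b2c3d4-example-tag-id"
      action:
        - service: fan.toggle
          target:
            entity_id: fan.bathroom_exhaust
    ```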

      • schizo@forum.uncomfortable.business · 4 points · 1 month ago

        Both!

        The native automation is perfectly cromulent for what I want, usually, but there’s a couple of cases where the integrations either don’t exist or don’t return meaningful data.

        FOR EXAMPLE, the video-playback-in-the-living-room thing. Sure, the Roku integration says “something is playing”, but it’s shockingly unreliable: it falls into ‘idle’ status between videos, or sometimes while you’re fast-forwarding, so the automation wasn’t doing exactly what I wanted.

        The Jellyfin API, though, can look at the living room TV user and is spot on about play/pause/stopped status, so I have Node-RED yank that data straight from the API, and it works great.
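
        (For anyone who’d rather skip Node-RED: roughly the same data can be pulled into HA with a plain RESTful sensor pointed at Jellyfin’s /Sessions endpoint. The URL, API key, and user name below are placeholders, and the exact response fields may vary between Jellyfin versions.)

        ```yaml
        # Polls Jellyfin's session list and reduces the living-room
        # user's session to playing / paused / stopped.
        sensor:
          - platform: rest
            name: Living room Jellyfin playback
            resource: "http://jellyfin.local:8096/Sessions?api_key=YOUR_API_KEY"
            scan_interval: 10
            value_template: >
              {% set s = value_json
                   | selectattr('UserName', 'eq', 'livingroomtv')
                   | list | first | default(none) %}
              {% if s is none or s.NowPlayingItem is not defined %}
                stopped
              {% elif s.PlayState.IsPaused %}
                paused
              {% else %}
                playing
              {% endif %}
        ```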

    • Bluesheep@lemmy.world (OP) · 2 points · 1 month ago

      Some cool examples there, I’m going to think about them. I particularly like the walking ones.

      I want to love dashboards. I love the idea of a control centre in each room, but I just can’t get to the point of winning with them.

    • jonne@infosec.pub · 1 point · 1 month ago

      I don’t quite get the NFC tags thing. Does this mean you have a tag on your wall that you tap your phone on? Is this something you prefer doing over opening home assistant on your phone and clicking the appropriate button? Not being critical or anything, I just see it mentioned often but I can’t really conceptualise why it’s a thing, and I’m curious what I might be missing.

      • schizo@forum.uncomfortable.business · 1 point · 1 month ago

        Yeah, I’m just using some cheap NFC stickers from Ali Express.

        The thing is that I don’t use the dashboard: not every action has a dashboard entry and even if there is one, the amount of time it takes to load the app, open the correct dashboard tab, and then click a button is like, 10x the time of ‘tap your phone on the NFC tag, and thing happens’.

        On Android, anyway: iOS makes you endlessly tap ‘Yes, yes, I’m sure I meant to do that, it’s fine, just do it already’ for NFC-triggered actions, while Android just goes ‘boink’ and does it.

        TLDR: it’s way faster than hitting a button on the dashboard.

        • jonne@infosec.pub · 1 point · 1 month ago

          I think I’d still prefer actual buttons if I went that route, but I think I kind of get it. Can this be made to work with a smart watch too?