Looking for suggestions to limit access to a public self-hosted page - eviltoast

I’m looking to open up a site with a login portal to the internet, but I’d like to keep the page from being scanned constantly and to prevent brute-force attempts on the login. I know some solutions already exist, like Fail2Ban, but I’m hoping for something different if it exists.

My thinking is that I’d like to put an IP filter on the page, but that I could “automate” adding IP addresses somehow. I was thinking I could have some sort of authentication server where I could email someone a unique URL that they would click on and provide some kind of information confirming that they’re who they say they are. Once confirmed, the public IP that was used to access the unique URL would be added to a whitelist that would allow access to the login portal.
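The flow described above could be sketched roughly like this, assuming a small helper service holds the tokens and the allowlist in memory (all names and structure here are hypothetical, not from any existing product):

```python
import secrets
import time

# Hypothetical sketch of the "emailed one-time URL -> IP allowlist" idea.
TOKEN_TTL = 3600      # seconds an emailed link stays valid (example value)

pending_tokens = {}   # token -> expiry timestamp
allowed_ips = set()   # public IPs permitted to reach the login portal

def issue_token() -> str:
    """Create a unique token to embed in the URL emailed to the user."""
    token = secrets.token_urlsafe(32)
    pending_tokens[token] = time.time() + TOKEN_TTL
    return token

def redeem_token(token: str, client_ip: str) -> bool:
    """Called when the unique URL is visited; allowlists the visitor's IP."""
    expiry = pending_tokens.pop(token, None)  # single-use: removed on redeem
    if expiry is None or time.time() > expiry:
        return False
    allowed_ips.add(client_ip)
    return True

def is_allowed(client_ip: str) -> bool:
    """Gate the login portal behind the allowlist."""
    return client_ip in allowed_ips
```

A real version would also need the confirmation step (asking the visitor for some identifying information before redeeming) and persistence, but the token-issue/redeem/check split is the core of it.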

Is there a service that exists that could do something like this? I had a quick look at Authelia and SuperTokens, but I’m not sure if that’s what I’m looking for.

  • roomabuzzy@alien.top (OP) · 1 year ago

    Thanks for the insight. I’m not worried about people logging in from different locations to be honest. The access would pretty much just be for the day then that’s it. My main concern (or I guess I could say wish) is that I can leave the site “exposed” to the internet without having a bunch of bots scanning it all the time. So I was hoping for some kind of solution where the site would be completely hidden until someone authenticates themselves. I mentioned IP whitelisting because that’s all I could think of, but maybe there would be another way? Or maybe what I’m asking just isn’t possible?

    • 399ddf95@alien.top · 1 year ago

      > leave the site “exposed” to the internet without having a bunch of bots scanning it all the time

      This is not a thing. Everything exposed to the public internet will be scanned constantly forever.

      You can reduce your attack surface by limiting the software/ports you expose, or by having another service/computer act as a proxy. This is what Cloudflare does as their main business.

      You can get 99% of the benefit you seek by just signing up for the free version of Cloudflare and putting it in front of your webserver. You can configure the firewall on your webserver to accept connections only from Cloudflare’s IP ranges and let Cloudflare deal with the bots.
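      The firewall side of that setup boils down to an IP-range membership check. A sketch using Python’s stdlib `ipaddress` module (the two ranges below are examples only; the authoritative list is published by Cloudflare and should be fetched periodically rather than hardcoded):

```python
import ipaddress

# Example ranges only -- use Cloudflare's published list in practice.
CLOUDFLARE_RANGES = [
    ipaddress.ip_network("173.245.48.0/20"),
    ipaddress.ip_network("103.21.244.0/22"),
]

def from_cloudflare(ip: str) -> bool:
    """True if the connecting IP falls inside one of the listed ranges."""
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in CLOUDFLARE_RANGES)
```

      In practice you would express the same rule in your firewall (iptables, nftables, a cloud security group) rather than in application code, so non-Cloudflare traffic never reaches the webserver at all.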

      Another approach would be to store the data on AWS S3 or similar and hand out time-limited URLs with a short expiration, provided only to approved parties. You wouldn’t even need to run a public server to do this, and access is automatically denied after the time period you specify. (This might make more sense if you’re distributing big files, versus displaying a screen or two full of information.)

      You could also do a web search for “port knocking”, or configure client-side TLS certificates (mutual TLS), which can be difficult to manage but would let you restrict access to clients holding a valid certificate. (Still vulnerable to DDoS, though.)
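      The port-knocking idea is that nothing is open to the public; a client that hits a secret sequence of closed ports in order is then granted access. A toy sketch of just the sequence-tracking logic (a real knockd-style daemon watches firewall logs and inserts rules; port numbers below are examples):

```python
# Toy port-knocking tracker: sequence logic only, no sockets or firewall.
KNOCK_SEQUENCE = [7000, 8000, 9000]  # secret knock sequence (example values)

progress = {}  # client_ip -> how many correct knocks so far

def register_knock(client_ip: str, port: int) -> bool:
    """Return True once the client completes the full sequence in order."""
    step = progress.get(client_ip, 0)
    if port == KNOCK_SEQUENCE[step]:
        step += 1
    else:
        step = 0  # a wrong port resets the sequence
    if step == len(KNOCK_SEQUENCE):
        progress.pop(client_ip, None)
        return True  # real code would now open the firewall for this IP
    progress[client_ip] = step
    return False
```

      To a scanner, every port looks closed; only a client that already knows the sequence ever sees the service, which is close to the “completely hidden until someone authenticates” behaviour asked about above.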