Why are tech news websites suddenly bashing 8GB RAM in Apple devices?

I’m a web developer, but I also do a lot of work with large files being transferred across the network, run some CPU-intensive tasks from time to time, run Docker containers, etc., all on a 2020 M1 MacBook Air with 8GB of RAM.

Well, it’s 2024 now and the thing still screams. So what I don’t understand is: why are there suddenly so many enraged tech news websites bashing the 8GB base RAM?

I get that some people need more than 8GB, but for the cliché web-browsing, email, and social media user, it doesn’t add up to me why anyone is so enraged about this.

  • Prison Mike@links.hackliberty.orgOP · 1 month ago

    I realized recently that my Raspberry Pi 4 has just 4GB of RAM, but while syncing huge files to Storj I’ve noticed the memory doesn’t fill up at all (even with slow spinning hard drives).

    I’m starting to think for most things I do CPU is more important than having tons of memory.

    • AA5B@lemmy.world · 1 month ago

      I also have a Raspberry Pi with 4GB and it handles its load perfectly fine.

      BUT lack of memory is a well-known bottleneck, so when I got a Raspberry Pi 5 with double the processing power, I also doubled the memory to keep it fed. While I haven’t really found a good niche for the new beast yet, if I’m spending money on a faster processor and a faster board, why would I limit it by cheaping out on memory?

      While we know that Apple’s memory is much faster than anyone else’s, we also know the entire system is outstanding. If I spend so much on a system with such high throughput, why would I want to cripple it by cheaping out on memory to save a relatively small cost? It’s not that I really have a need, but I’m paying for a beast, so it had better be able to go beast mode.

    • hperrin@lemmy.world · 1 month ago

      If you’re transferring files over a socket (like through SMB or SFTP), the receiving end usually has a small buffer, something like 64KB. If data is arriving faster than it can be pushed to disk and the buffer fills up, the receiver just pauses the stream. So a file transfer usually won’t use much memory.

      There is some poorly written software that doesn’t do that, though. I ran into a WebDAV server that didn’t do that while I was writing my own server. That’s where you can run into out-of-memory errors.
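
      As a minimal illustration (a hypothetical Python sketch, not code from any of the tools mentioned here), the well-behaved pattern reads and sends one small chunk at a time, so memory use stays around the chunk size no matter how big the file is:

      ```python
      import socket

      CHUNK_SIZE = 64 * 1024  # 64KB, in line with the receive buffer mentioned above

      def send_file(path: str, host: str, port: int) -> None:
          """Stream a file over a TCP socket with a bounded buffer."""
          with open(path, "rb") as f, socket.create_connection((host, port)) as sock:
              while True:
                  chunk = f.read(CHUNK_SIZE)  # only one chunk held in memory at a time
                  if not chunk:
                      break
                  # sendall() blocks when the receiver's buffer is full,
                  # which is the backpressure described above.
                  sock.sendall(chunk)

      # The poorly written pattern instead loads the whole file first:
      #     data = open(path, "rb").read()  # memory use scales with file size
      #     sock.sendall(data)
      ```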

      • Prison Mike@links.hackliberty.orgOP · 1 month ago

        That lines up with what I know about networking, but on the software side I figured it would chew through memory quickly (especially because it’s encrypting everything on the fly).
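
        On-the-fly encryption doesn’t necessarily need much memory either, as long as it’s done chunk by chunk. A rough sketch (assuming AES-CTR via Python’s `cryptography` package; this is only an illustration of the idea, not how Storj’s client actually works):

        ```python
        import os
        from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

        CHUNK_SIZE = 64 * 1024

        def encrypt_stream(src_path: str, dst_path: str, key: bytes) -> None:
            """Encrypt a file chunk by chunk; memory stays ~CHUNK_SIZE regardless of file size.

            key must be 16, 24, or 32 bytes (AES-128/192/256).
            """
            nonce = os.urandom(16)  # CTR mode uses a 16-byte nonce/counter block
            encryptor = Cipher(algorithms.AES(key), modes.CTR(nonce)).encryptor()
            with open(src_path, "rb") as src, open(dst_path, "wb") as dst:
                dst.write(nonce)  # prepend the nonce so the file can be decrypted later
                while chunk := src.read(CHUNK_SIZE):  # only one chunk in memory at a time
                    dst.write(encryptor.update(chunk))
                dst.write(encryptor.finalize())
        ```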