GOG will delete cloud saves over 200MB per game after August 31st
  • FeelzGoodMan420

    Most games never hit anywhere near that, but some large open-world RPGs like Skyrim track the location of every single object in the game world. You can drop a piece of cheese in the bottom-left corner of the map, come back 500 hours later, and it’ll still be there. Now imagine all of the objects you’re buying, selling, and manipulating over those hundreds of hours. Add in a shit ton of script mods and other stuff that may create even more objects, plus all of the quest and interaction data that gets saved, and your saves can easily total multiple gigabytes, with individual files approaching 200MB.

    • smeg@feddit.uk

      It still feels like it should be orders of magnitude less. For example, say each piece of cheese has an ID number that maps to “cheese”, an ID for what area it’s in, three coordinates for where exactly it is, and maybe a few more variables like how much of it you’ve eaten. Each of those variables is probably only a couple of bytes, so each item comes to maybe 20B or so, which means that even if you’d interacted with a million different items, with no compression at all, that’s still only 20MB of save data.
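
      A quick sketch of that estimate in C (the struct fields and sizes here are made up for illustration, not taken from any real engine):

      ```c
      #include <stdint.h>
      #include <stdio.h>

      /* Hypothetical packed record for one tracked world object. */
      #pragma pack(push, 1)
      typedef struct {
          uint32_t item_id;  /* which base object this is ("cheese") */
          uint32_t area_id;  /* which area/cell it currently sits in */
          float    x, y, z;  /* exact position within that area      */
          uint8_t  state;    /* e.g. how much of it has been eaten   */
      } SavedObject;         /* 21 bytes when packed                 */
      #pragma pack(pop)

      int main(void) {
          /* 1,000,000 touched objects * 21 bytes ~= 21MB, matching
           * the back-of-the-envelope estimate above. */
          printf("per object: %zu bytes\n", sizeof(SavedObject));
          printf("1M objects: ~%zu MB\n",
                 sizeof(SavedObject) * 1000000u / 1000000u);
          return 0;
      }
      ```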

      • vithigar@lemmy.ca

        Bold of you to assume the data in save files is packed binary and not something like JSON, where { "x": 13872, "y": -17312, "z": -20170 } requires 40 bytes of storage.
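
        For a rough comparison, measuring the exact JSON string above against the same three values stored as packed 32-bit integers:

        ```c
        #include <stdint.h>
        #include <stdio.h>
        #include <string.h>

        int main(void) {
            /* The textual encoding quoted above... */
            const char *json = "{ \"x\": 13872, \"y\": -17312, \"z\": -20170 }";
            /* ...versus the same three values packed as binary. */
            int32_t packed[3] = { 13872, -17312, -20170 };

            printf("JSON:   %zu bytes\n", strlen(json));   /* 40 bytes */
            printf("binary: %zu bytes\n", sizeof(packed)); /* 12 bytes */
            return 0;
        }
        ```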

        • addie@feddit.uk

          Agreed. JSON:

          • solves the ‘versioning’ problem, where the data fields change after an update. That’s a nightmare with packed binary; you have to write so much code to handle it.
          • makes debugging persistence issues easy for developers.
          • has very fast libraries for reading and writing it.
          • actually compresses pretty damn well; you can hand the compress-and-write off to a background thread once the fast serialisation is done, anyway.

          For saving games, JSON+gzip is such a good combination that I’d probably never consider anything else.
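
          A minimal sketch of that combination using zlib’s gzip stream API (the save fields and file name here are invented; a real game would serialise its actual state rather than snprintf a literal):

          ```c
          #include <stdio.h>
          #include <zlib.h>  /* link with -lz */

          int main(void) {
              /* Hypothetical save state. A "version" field is what makes
               * the update problem tractable: old saves simply lack the
               * new keys and the loader fills in defaults, instead of
               * mis-reading shifted offsets in a packed binary blob. */
              char json[256];
              int len = snprintf(json, sizeof json,
                  "{ \"version\": 3, \"player\": "
                  "{ \"x\": 13872, \"y\": -17312, \"z\": -20170 } }");

              /* gzwrite produces a .gz stream directly; this part can be
               * handed to a background thread once the string is built. */
              gzFile out = gzopen("save.json.gz", "wb");
              if (!out) return 1;
              gzwrite(out, json, (unsigned)len);
              gzclose(out);
              return 0;
          }
          ```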

        • smeg@feddit.uk

          That’s excusable in My First Game™, but surely a professional AAAAA game would never cut corners and code something so lazily, eh?

          • vithigar@lemmy.ca

            It’s not really laziness. Storing as JSON solves or prevents a lot of problems you could run into with something bespoke and “optimally packed”, you just have the tradeoff of needing more storage for it. Even then, the increased storage can be largely mitigated with compression. JSON compresses very well.

            The problem is usually what they’re storing, not how they’re storing it. For example, The Witcher (the first one) has ~20MB save files. These are mostly a bespoke packed binary format, but they contain things like raw strings of item descriptions in multiple localisations and complete descriptors of game quests; things that should just be ID values pointing to that data in the game files. It also leads with like… 13KB of zero-padding for some reason.
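
            As a toy illustration of the ID-vs-raw-string point (everything here is hypothetical, not The Witcher’s actual format), the game’s static data already ships the descriptions, so the save only needs a reference:

            ```c
            #include <stdint.h>
            #include <stdio.h>

            /* Static game data shipped with the game; the save never
             * needs to repeat any of this (entries made up). */
            static const char *DESCRIPTIONS[] = {
                "A wedge of hard cheese.",
                "A rusty sword of questionable provenance.",
            };

            /* What the save should store per carried item: a reference
             * into the game's own tables, not the localised text. */
            typedef struct {
                uint32_t item_id;
                uint16_t count;
            } InventoryEntry;

            int main(void) {
                InventoryEntry loaded = { .item_id = 1, .count = 1 };
                /* At load time the description is looked up in the game
                 * files rather than deserialised from the save. */
                printf("%u x %s\n", (unsigned)loaded.count,
                       DESCRIPTIONS[loaded.item_id]);
                return 0;
            }
            ```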

      • tehevilone@lemmy.world

        Save bloat is more often related to excess values not being properly discarded by the engine, if I remember right. So it’s not that the objects themselves take up a lot of space, but that the leftover data gets baked into the save and can end up multiplying if the same scripts/references/functions get called frequently.

        It was a lot worse with Skyrim’s original engine, and got better in Fallout 4 and Skyrim SE. The worst bloat happens with heavy modlists, of course, as they’re most likely to have poor data management in some mod.

        • smeg@feddit.uk

          Aha, so unexpectedly it’s bad/inefficient code that’s ultimately to blame

          • iegod@lemm.ee

            I wouldn’t say bad, but inefficient might be fair. Unoptimized I think is more representative.

          • tehevilone@lemmy.world

            Inefficient/unoptimized would be an accurate description. It’s important to add, for Bethesda games specifically, that the save includes all changes to objects, even ones the player never interacted with directly (e.g. physics interactions, explosions moving things, NPCs bumping stuff around), and it also includes all NPC changes. The master files (ESMs) get loaded first, then the save applies the changes it has baked in on top of those databases. So when you load a save that has travelled the world and pulled a lot of things into save memory, the engine has to sit there and reconcile all those changes with the ESMs, which can add up quick if you’re playing modded.
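
            A very simplified sketch of that reconcile step (nothing here reflects the engine’s actual record formats): base records come from the master files, and the save supplies a table of overrides that gets patched in at load.

            ```c
            #include <stddef.h>
            #include <stdint.h>
            #include <stdio.h>

            /* Base record as defined in a master file (hypothetical). */
            typedef struct { uint32_t form_id; float x, y, z; } Record;

            static Record base[] = {   /* loaded from the ESMs */
                { 0x0001, 10.0f, 0.0f, 5.0f },
                { 0x0002, -3.0f, 2.0f, 8.0f },
            };

            int main(void) {
                /* Changes baked into the save: something bumped 0x0002. */
                Record saved[] = { { 0x0002, -3.5f, 2.0f, 8.1f } };

                /* Reconciliation: patch base data with saved overrides.
                 * With heavy mods this table grows, which is the
                 * "adds up quick" cost described above. */
                for (size_t i = 0; i < sizeof saved / sizeof *saved; i++)
                    for (size_t j = 0; j < sizeof base / sizeof *base; j++)
                        if (base[j].form_id == saved[i].form_id)
                            base[j] = saved[i];

                printf("0x%04x now at (%.1f, %.1f, %.1f)\n",
                       (unsigned)base[1].form_id,
                       base[1].x, base[1].y, base[1].z);
                return 0;
            }
            ```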

      • wax@feddit.nu

        Each object also needs its orientation, and possibly velocity and angular rates too.

        • smeg@feddit.uk

          Yeah, that’s why I rounded up a bit. But even if there’s triple the amount of cheese data, a million cheeses is still only 60MB.
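
          Extending the earlier hypothetical record with those extra fields lands in the same ballpark:

          ```c
          #include <stdint.h>
          #include <stdio.h>

          /* The earlier made-up record plus orientation, velocity,
           * and angular rates. */
          #pragma pack(push, 1)
          typedef struct {
              uint32_t item_id, area_id;
              float    x, y, z;         /* position           */
              float    qx, qy, qz, qw;  /* orientation (quat) */
              float    vx, vy, vz;      /* linear velocity    */
              float    wx, wy, wz;      /* angular rates      */
              uint8_t  state;
          } SavedObjectFull;            /* 61 bytes packed    */
          #pragma pack(pop)

          int main(void) {
              /* ~61 bytes * 1,000,000 cheeses ~= 61MB: the same order
               * of magnitude as the 60MB estimate above. */
              printf("per object: %zu bytes\n", sizeof(SavedObjectFull));
              return 0;
          }
          ```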