Let’s be honest: I only use Java for Minecraft, so that’s the only thing I’ve debugged it with. But every version, server or client, on every launcher: all of them use double (or more) the RAM I allocate. In-game, the correctly allocated amount shows as used, but on my system double or more is actually taken. As a result my other apps don’t get enough memory and crash, while the game suffers as well.
I’m not wise enough to know what logs or versions or whatever I should post here as a cry for help, but I’ll update this with anything that helps, just tell me. I have no idea how to approach the problem. One idea I have is to run a non-Minecraft Java application, but who has (or knows about) one of those?
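If anyone wants to try the same test without Minecraft, a trivial program would do. Here’s a minimal sketch (the class name is made up): it prints the heap ceiling that -Xmx enforces, then waits so you can compare it against the process’s actual memory use in btop or the system monitor:

```java
// Minimal sketch of a non-Minecraft Java test app (hypothetical class name):
// prints the heap limits the JVM enforces, then waits so the process
// stays alive while you check its resident memory in btop.
public class HeapVsProcess {
    public static void main(String[] args) throws Exception {
        Runtime rt = Runtime.getRuntime();
        long mb = 1024 * 1024;
        System.out.println("Heap max (-Xmx): " + rt.maxMemory() / mb + " MB");
        System.out.println("Heap committed:  " + rt.totalMemory() / mb + " MB");
        System.out.println("Heap used:       " + (rt.totalMemory() - rt.freeMemory()) / mb + " MB");
        System.out.println("PID to watch:    " + ProcessHandle.current().pid());
        System.in.read(); // keep the process alive while you check btop
    }
}
```

Run it with `java -Xmx512m HeapVsProcess`: even this tiny program’s resident memory will be noticeably bigger than the heap cap, because the JVM itself needs memory outside the heap.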
@jrgd@lemm.ee’s request:
- launch arguments: `-Xms512m -Xmx1096m -Duser.language=en` (kept this low on purpose so the difference shows clearly; I also have a modpack I give 8 GB to and it uses way more as well, IIRC around 12 GB)
- game version: 1.18.2
- total system memory: 32 GB
- memory used by the game: I’m using KDE’s default system monitor, but here’s btop as well:
This test was at max render distance with 1 GB of RAM allocated. It crashed, of course, but it crashed at almost 4 GB. What the hell, that’s four times as much!
I’m on arch (btw) (sry)
When you set the memory allocation for Minecraft, you’re really only telling the JVM how much memory its garbage-collected heap may use. That limit doesn’t cover anything outside the heap: the JVM itself, OpenGL resources, and everything else that involves native code, system libraries, and drivers.
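If you want to see where the extra memory actually goes, HotSpot has a built-in Native Memory Tracking feature. The flags and the `jcmd` tool below are real; the jar name and PID are placeholders:

```
# Start the JVM with Native Memory Tracking enabled (adds a little overhead):
java -XX:NativeMemoryTracking=summary -Xms512m -Xmx1096m -jar <your-minecraft-jar>
# In another terminal, ask the running JVM for a breakdown by category:
jcmd <pid> VM.native_memory summary
```

The summary splits usage into heap, metaspace (class data), thread stacks, GC bookkeeping, JIT code cache, and so on. Note that NMT only covers the JVM’s own allocations; whatever native libraries like LWJGL or the graphics driver allocate on their own won’t appear there, and that’s exactly the part the allocation setting can’t control.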
If you have an integrated GPU, all the textures that would normally be sent to a dedicated GPU’s VRAM may also live in your regular RAM, since integrated graphics use unified memory. That can inflate the amount of memory Java appears to use.
A browser, for example, might not use much JavaScript memory, a couple of MB maybe, but the tab itself uses a ton more because of the renderer, assets, and CSS effects.