You need to identify which process(es) are eating up the memory.
For openHAB this can be done by disabling the bindings one by one.
There are several threads about it.
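As a generic first step (not openHAB-specific), you can list the top memory consumers on the box to confirm it is really the Java process:

```shell
# List the processes with the highest resident memory (RSS) first.
# Column 4 is %MEM, column 6 is RSS in KiB; --sort works on procps-ng (most Linux distros).
ps aux --sort=-rss | head -n 6
```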
Since OH3 is the only service running, I will disable bindings or items one at a time and check which one could be causing the issue.
Update:
After disabling my Synology Binding it seems to be stable now. I will monitor it:
```
235 │ Active │ 80 │ 3.3.0.202207031140 │ openHAB Add-ons :: Bundles :: Synology Surveillance Station Binding
```
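For reference, this is how a binding can be stopped from the openHAB (Karaf) console (`openhab-cli console`); the bundle ID 235 matches the listing above and will differ on other installs:

```
# Inside the openHAB console; 235 is the Synology binding's ID from bundle:list.
bundle:list | grep -i synology   # find the bundle ID and its state
bundle:stop 235                  # stop the bundle; bundle:start 235 re-enables it
```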
Hi Simon,
I’m unaware of any memory issues, but I admit memory profiling is not my area of expertise. There are no libraries in use that I would suspect. If you have any debug output from this binding and/or core, please PM it to me.
BR
Pav
I have 6 GB of RAM assigned to this VM.
I set

```
EXTRA_JAVA_OPTS="-Xms2048m -Xmx4096m"
```

and also tried

```
EXTRA_JAVA_OPTS="-Xms128m -Xmx512m"
```

without any change.
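To check whether the running JVM actually picked up these flags, something like the following can help (the file path is the Debian/apt-package default and is an assumption about your install):

```shell
# Where the Debian/apt packages read the JVM options from; adjust the path
# for other install types.
grep EXTRA_JAVA_OPTS /etc/default/openhab 2>/dev/null || echo "file not found"

# Confirm which -Xms/-Xmx values the *running* JVM actually got.
ps -ef | grep -oE -e '-Xm[sx][0-9]+[mg]' | sort -u
```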
You should not give openHAB much more memory than it actually needs. 4 GB sounds way too high to me; you are probably fine with an Xmx of 1-2 GB.
It’s the Linux kernel (the OOM killer) that killed the Java process; it’s not the JVM reporting that it ran out of heap space.
Xmx limits only the heap, but the heap is not the only memory Java needs (there is also metaspace, thread stacks, native buffers, and so on), so the overall memory usage of openHAB can be, for example, 5 GB when you set a 4 GB heap. And there are probably a lot of other processes on your Proxmox host that also need memory, so the Linux kernel killed the process that was using the most.
Maybe try out -Xmx2g, depending on your needs.
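You can usually confirm an OOM kill in the kernel log (reading `dmesg` needs root on some distros; `journalctl -k` is an alternative on systemd machines):

```shell
# Look for OOM-killer entries in the kernel ring buffer.
dmesg 2>/dev/null | grep -iE 'out of memory|oom-kill|killed process' \
    || echo "no OOM events in the ring buffer (or dmesg not readable)"
```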
Java is a memory hog: it will keep consuming the memory you give it until it gets close to the heap limit, and only then does it start collecting garbage aggressively and freeing memory it no longer needs. As far as I know, the JVM garbage collector only watches its own process’s heap and doesn’t care whether the Linux system as a whole is already under memory pressure.
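If you want to watch this behaviour instead of guessing, GC activity can be logged; `-Xlog:gc` is the unified-logging flag available since Java 9 (openHAB 3 runs on Java 11), and the heap sizes and log path below are only example values, not recommendations:

```shell
# Example only: heap sizes and log path are placeholders.
# -Xlog:gc writes one line per collection (heap before/after, pause time).
EXTRA_JAVA_OPTS="-Xms512m -Xmx2048m -Xlog:gc:file=/var/log/openhab/gc.log"
```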
I have 16 GB on the Proxmox server and only two VMs with 6 GB and 4 GB. I will reduce the RAM to 4 GB per system and check whether the error shows up again.
Thanks for your reply!