memory leaks in 3.1_beta4 #3218
Comments
neofutur uploaded file valgrind find memleaks
vexed commented Well, looking at the valgrind log it shows
Yeah, we have some leaks, but nowhere near 1.2 GB, so I assume the issue is most likely with your video drivers.
Safety0ff changed _comment0 (field not transferred by tractive)
neofutur commented
this valgrind file is just a small example, from running wz2100 for 2 minutes without even creating a game! Try it with a complete session, creating a game and playing it for an hour, and you'll find a 1 GB leak. My computer is not powerful enough to run a complete valgrind session including playing a real game, so I just attached a small file, a 2-minute valgrind debug.
cybersphinx changed status from |
cybersphinx changed resolution from `` to |
cybersphinx commented Can't reproduce this. I did a little plot here (possibly not exact, but if anything it's exaggerated), and the memory usage of Warzone and all its libraries didn't increase by that much during an hour-long autogame. The jump at the beginning is where it loads textures; then it gradually increases, but not by alarming amounts. The jumps at the end come from some saving and loading. Probably a problem of the drivers/libraries Warzone uses, not the game directly.
cybersphinx uploaded file |
crass changed status from |
crass changed resolution from |
crass commented I have verified that it uses a ton of memory; after playing for a while, the RAM usage goes through the roof!
T_X commented Could you rerun valgrind with "--leak-check=full"? That should help find out which memory leaks are warzone2100's fault and which come from buggy/outdated libraries or drivers. But I agree with vexed: even if they were all Warzone's fault, they would not add up to 1.2 GB. @crass: A valgrind output, or at least a graph like the one cybersphinx created, would be helpful.
cybersphinx commented Made another graph of a netgame (4 nexus AIs autogame on rush), with
and then
in gnuplot. Still no abnormal increase in memory usage. The graph is not very useful though, unless you can correlate memory usage with in-game events. |
cybersphinx uploaded file |
vexed changed priority from |
vexed changed operating_system (field not transferred by tractive)
vexed changed milestone from |
vexed commented Hmm, after hooking the allocators and doing stack dumps on each allocation, I can confirm that we are using gobs of memory, and if you play long enough, it will hit the pagefile while it swaps out the memory blocks. Lots of these are caused by the new NET* routines that don't clean up after themselves. I have already eliminated lots of leaks, from the effects to the UTF command-line conversion to (most of) the view data, but the NET* ones are still an issue.
vexed commented Patches are being done in #3395 |
Safety0ff commented Replying to Warzone2100/old-trac-import#3218 (comment:6):
Safety0ff uploaded file |
Per changed status from |
Per changed resolution from `` to |
keyword_memory_leak_valgrind
resolution_duplicate
type_bug
by neofutur: On a small map, in an MP 4p game, warzone2100 is reaching more than 1200 MB of RAM usage; is this normal?
top information after the end of a 4p MP game :
PID USER PR NI VIRT RES SHR S %CPU %MEM TIME+ COMMAND
8523 neofutur 20 0 1158m 277m 3236 D 1 56.9 56:59.24 warzone2100
in my case it's swapping, but I can end the game; when the game ends, wz has a problem freeing RAM and swap
10 mins after the end of the game it's still using 1200 MB of memory
I have to kill -9 every time I end a game or the memory will never be freed
kill -9 frees it immediately
it's beta4 built with nodebug (I had the same problem before building with nodebug) and launched with ./warzone2100 --nosound --noshadows --window --resolution=800x600, with 128 for textures
I waited up to 30 mins to see if it finally frees the memory, but no, it never frees it
a strange thing is that strace -p 8523 (the wz2100 process number) shows nothing; the process does nothing at this time, when it should be freeing the memory
trying to use valgrind to find memory leaks, I found many like:
Issue migrated from trac:3218 at 2022-04-16 08:58:09 -0700