Anyone got any tips for testing memory leaks and utilisation on a web app? We have a suspicion one of our projects is not as efficient as it could be, but I have no experience with testing any of this stuff
Hi … I am definitely no expert on this particular topic. Assuming it's a server-side application, do you have access to the servers hosting it? Could you use the built-in OS tools for tracking processes and memory usage as a first step to check what it's actually using? That would cover your basic "does it leak" checks. After that, I'm afraid I'd have to hand over to someone more familiar with testing memory leaks on web-based stuff.
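To make that basic "does it leak" check a bit more concrete: the OS tools will show overall process growth, and if the backend happens to be Python you can also look inside the process with the standard library's tracemalloc module. A toy sketch, where handle_request and its deliberate leak are made up purely for illustration:

```python
import tracemalloc

# Hypothetical stand-in for application code that forgets to release data.
_cache = []

def handle_request(payload):
    _cache.append(payload * 100)  # grows forever: a deliberate leak

tracemalloc.start()
snapshot_before = tracemalloc.take_snapshot()

for _ in range(1000):
    handle_request("x")

snapshot_after = tracemalloc.take_snapshot()

# Compare snapshots to see which lines accumulated memory between them.
top = snapshot_after.compare_to(snapshot_before, "lineno")
for stat in top[:3]:
    print(stat)
```

The biggest positive size_diff entries point at the source lines doing the leaking, which is often enough to narrow down a suspect.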
There is a reasonably interesting article here that might help.
This is also worth a read…https://developers.google.com/web/tools/chrome-devtools/memory-problems/
This is more than I know at the mo, so thanks! Will do some reading
There are tools for monitoring things like memory usage for applications. On Windows there is a tool called PerfMon built into the operating system. On Windows 10,
- Right-click on the taskbar at the bottom of the screen
- Select Task Manager
- At the bottom of the window select More Details
- Go to the Performance tab
- At the bottom of this tab select Open Resource Monitor
Here you can select which application you want to monitor and you should be able to select what sort of things you want to monitor (memory, CPU usage, network, etc.).
If you want to automate this, you can run PerfMon as a standalone application. To run it,
- Right-click on the Start Menu
- Select Run
- Enter perfmon.msc
In here you can create Data Collector Sets to collect the metrics. If you open Data Collector Sets and right-click on User Defined, you can define your own set to collect, then leave it running. The help for PerfMon doesn't seem to work on Windows 10 (I think it stopped working in Windows 7), so search the internet for "perfmon" or "perfmon command line" to learn more about it.
If you want something similar for, say, macOS, you'd want to look at Activity Monitor: just go to Spotlight and enter "Activity Monitor". For a Linux system, search for "linux performance monitoring tools". You'll find a number of command-line tools, each monitoring a different thing. For example, top or vmstat for memory, netstat for networking, etc.
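On Linux you can also script the "is memory steadily climbing" check yourself by sampling a process's resident set size from /proc. A rough sketch; it watches its own PID so it runs standalone, but in practice you'd point it at the web server's PID and sample far less often over a much longer window:

```python
import os
import time

def rss_kib(pid):
    """Read a process's resident set size (VmRSS) from /proc, in KiB (Linux only)."""
    with open(f"/proc/{pid}/status") as f:
        for line in f:
            if line.startswith("VmRSS:"):
                return int(line.split()[1])
    return None  # process gone, or field missing

pid = os.getpid()  # replace with the server process's PID in practice
samples = []
for _ in range(5):
    samples.append(rss_kib(pid))
    time.sleep(0.1)  # every few seconds/minutes for a real run

print(samples)  # numbers that climb steadily over a long run suggest a leak
```

Logging the samples to a file with timestamps gives you the same kind of trend line a Data Collector Set would.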
All of this work would be done on the web application server if you suspect the application on the server is leaking memory.
As you can see, for a web application there are many places where you can improve performance. The first step is to measure the current performance at each stage. Next is determining where the improvements can be made. You also want to measure individual 'things': page load might be slow because of the database, web server, application server, load balancer, network configuration, routing tables, etc. If you don't know which one is the bottleneck, it will be impossible to fix.
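As a tiny illustration of measuring one 'thing' in isolation, this sketch times just the HTTP round trip. The server here is a throwaway local stand-in so the example runs anywhere, not your real stack; you'd repeat the same idea separately for the database, application server, load balancer, and so on:

```python
import http.server
import threading
import time
import urllib.request

# Throwaway local server so the example is self-contained; in practice you
# would hit the real application and time each component separately.
server = http.server.HTTPServer(("127.0.0.1", 0),
                                http.server.SimpleHTTPRequestHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_port}/"

start = time.perf_counter()
with urllib.request.urlopen(url) as resp:
    resp.read()
elapsed = time.perf_counter() - start
print(f"HTTP round trip took {elapsed * 1000:.1f} ms")
server.shutdown()
```

Timing each leg on its own is what tells you whether the bottleneck is the network hop, the server, or something behind it.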
As for utilization, this is kind of vague: because there are so many parts to a web application, which utilization are you trying to measure? The tools mentioned above can help measure utilization as well.
I'd arrange to run a soak test in a load testing tool. E.g. I use Visual Studio, but something like JMeter would do much the same.
Develop/record a script to emulate the user journey you want to monitor. Run the test with a low number of virtual users (even a single user may be enough) for an extended period of time (typically 24 hours or so). Monitor resources like memory and/or handles. This should show up any resources not being released.
Typically I’d schedule such a test to run overnight, or ideally over a (bank holiday) weekend. You can probably exaggerate the resource leak by using more VUs (just don’t overdo it as this isn’t meant to be a load/stress test).
I've probably oversimplified here, but that's the gist.
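For what it's worth, that gist looks something like this in plain Python. It's a toy: a real soak test would use JMeter or similar, point at the actual application rather than this throwaway local server, and run for hours rather than seconds, with memory/handles on the server being monitored the whole time:

```python
import http.server
import threading
import time
import urllib.request

# Throwaway local target so the sketch is self-contained; a real soak test
# points at the application under test instead.
server = http.server.HTTPServer(("127.0.0.1", 0),
                                http.server.SimpleHTTPRequestHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_port}/"

VIRTUAL_USERS = 3     # keep this low: it's a soak test, not a stress test
DURATION_SECONDS = 2  # hours (overnight, or a long weekend) for a real soak
counts = []           # journeys completed per virtual user

def user_journey():
    done = 0
    deadline = time.monotonic() + DURATION_SECONDS
    while time.monotonic() < deadline:
        with urllib.request.urlopen(url) as resp:  # the recorded "journey"
            resp.read()
        done += 1
        time.sleep(0.05)  # pacing between journeys
    counts.append(done)

threads = [threading.Thread(target=user_journey) for _ in range(VIRTUAL_USERS)]
for t in threads:
    t.start()
for t in threads:
    t.join()
server.shutdown()
print(f"{sum(counts)} journeys completed by {VIRTUAL_USERS} virtual users")
```

While a loop like this runs, the resource monitoring from the earlier replies is what actually shows up anything not being released.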