Performance Testing: What to Consider Before Starting?

When I started performance testing for a product I was working on, I remember thinking “well this is slow”. But I didn’t know where to start.

I used Lighthouse to give me an idea of what exactly was slow, the low-hanging fruit so to speak.

When it came to running JMeter scripts, though, I wasn’t sure exactly what the goal should be, and neither was the team.

What are some things you’d consider before starting with the performance tool? What questions would you ask to help guide your approach?


I’d suggest starting with the question(s) you’d like to have answered, which for me has usually been some combination of:

  • Does this app feel fast enough to end users for a pleasant and productive experience?
  • Can the system handle the level of demand we expect during a major event (e.g. Black Friday)?
  • Are the system’s invariants maintained when it is pushed near its saturation point? Does it lose or corrupt data? Does it remain secure?
  • How much CPU/RAM/other resource do I need per n users in order to provide a good level of service?

If you’re focussed on the first of these, you’d probably end up spending most of your time in tools like Lighthouse and the Chrome performance tab. I’ve found it helpful in the past to try to figure out benchmarks from other, similar systems to determine what you should consider “good” performance. E.g. if you’re building an eCommerce website, you’ll want to ensure you’re at least in the same latency ballpark as your competitors for product page loads, image rendering, search etc.


Non-functional requirements: these are something nobody seems to know when you ask about them. Before you start performance testing, you have to make clear what the requirements are. If you run a test and your API takes 100ms with 100 users, you still don’t know whether that’s good enough or not.
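To make that point concrete, here is a minimal sketch of turning an agreed NFR into a pass/fail check. The 100ms target, the percentile choice, and all sample numbers are assumptions for illustration, not from any real system:

```python
import math

# Hypothetical NFR: 95th-percentile response time under 100 ms.
# All numbers here are illustrative, not from a real system.

def p95(samples_ms):
    """Nearest-rank 95th percentile of a list of latency samples (ms)."""
    ordered = sorted(samples_ms)
    index = math.ceil(0.95 * len(ordered)) - 1
    return ordered[index]

def meets_nfr(samples_ms, target_ms=100):
    """True if the 95th percentile is within the agreed target."""
    return p95(samples_ms) <= target_ms

# 90 fast responses and 10 slow ones: the slow tail fails the NFR.
samples = [80] * 90 + [120] * 10
print(meets_nfr(samples))  # False: p95 is 120 ms, over the 100 ms target
```

The point being: without an agreed target and percentile, that `meets_nfr` check can’t even be written, so “100ms with 100 users” on its own tells you nothing.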

People often ask ‘can you performance test this for us?’, but it’s never clear what kind of performance test they want, so clear that up before you start writing anything. Do they want a stress test, load, endurance, peak, volume, etc.?
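The difference between those test types mostly comes down to the shape of the load over time. A rough sketch (all shapes and numbers are assumed, purely to illustrate the distinction):

```python
# Illustrative load profiles: virtual users as a function of time (minutes).
# The shapes and numbers are assumptions for demonstration only.

def load_profile(t, steady=100):
    """Load test: hold the expected steady-state user count."""
    return steady

def stress_profile(t, step=50, every=5):
    """Stress test: keep adding users until something breaks."""
    return step * (t // every + 1)

def peak_profile(t, base=100, spike=500, spike_start=10, spike_len=2):
    """Peak/spike test: a sudden burst on top of normal load."""
    return spike if spike_start <= t < spike_start + spike_len else base

for t in range(0, 20, 5):
    print(t, load_profile(t), stress_profile(t), peak_profile(t))
```

An endurance (soak) test is just the load profile held for hours or days; a volume test varies the data size rather than the user count. Until the requester picks one of these, you can’t choose a tool configuration.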

Kind regards


NFRs sometimes feel like a chicken-and-egg scenario: in most cases a business or procurement team will have little idea of what a sensible target is, and may not even have a mock-up of the app from which to work out the number of scenarios to cover. The literature always covers something easy like ‘a login’ or ‘buying a product from an online store’!

Performance testing is one scenario that needs more planning and wider communication than almost any other; maybe beta testing (it’s a whole event for some companies) comes close, if I recall.

Go through the requirements doc and find:

  • What is the expected response time for the end user?
  • What is the peak (max) load expected over a defined longer duration (e.g. 1 year, 2 years, 5 years)? This will decide what your max-load test should be and the pattern by which the user base is growing.
    This will give shape to your performance code.

Few more:

  • Do we have the right monitoring and logging in place to analyse the app logs, system health, client-side data, and geo-location variance (where applicable)?
  • Do we have dependent systems? When we push load during testing, will they be able to handle the requests? What’s the alternative? Maybe mocks?
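On the mocks point: a stub for a dependent service can be very small. A minimal sketch using only the standard library (the endpoint and canned payload are invented for illustration):

```python
# Minimal stub for a dependent service, so a load test does not hammer
# the real downstream system. The canned JSON payload is made up.
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

class StubHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = b'{"status": "ok"}'  # canned response, no real work done
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the console quiet during a load run

def start_stub(port=0):
    """Start the stub on a background thread; return the bound port."""
    server = HTTPServer(("127.0.0.1", port), StubHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server.server_address[1]
```

Point the system under test at `http://127.0.0.1:<port>` instead of the real dependency; that way the load test measures your system, not the downstream one.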