What's missing from current automation tools?

To anyone who has experience with automated browser testing:

What’s missing from the current tools?
What could be done better? For example, action recorders?

I built a no-code browser automation tool for myself and a few of my customers and I’m trying to validate if there’s an opportunity to take it further.

It’s at firelab.io if anyone is interested in taking a look; it’s really just an MVP right now.


A comparison of previous runs, built into the reporting section of the tool.
So you can compare run 1 to run 2, or to run 26, against a baseline and see what percentage passed/failed over time.

That’s been missing quite often for me.
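A minimal sketch of the cross-run comparison described above, assuming each run's results are stored as a mapping from test name to pass/fail (the data structures and function names here are hypothetical, not firelab.io's actual API):

```python
# Sketch: compare pass/fail rates across test runs against a baseline run.
# All names and data structures here are illustrative assumptions.

def pass_rate(results):
    """Percentage of passing tests in one run; results maps test name -> bool."""
    if not results:
        return 0.0
    return 100.0 * sum(results.values()) / len(results)

def regressions(baseline, run):
    """Tests that passed in the baseline but failed in this run."""
    return sorted(
        name for name, passed in baseline.items()
        if passed and run.get(name) is False
    )

# Example history: run 1 is the baseline, run 2 has one regression.
runs = {
    1: {"login": True, "checkout": True, "search": True},
    2: {"login": True, "checkout": False, "search": True},
}

baseline = runs[1]
for run_id, results in runs.items():
    print(f"run {run_id}: {pass_rate(results):.0f}% passed, "
          f"regressions vs baseline: {regressions(baseline, results)}")
```

The key design choice is comparing against a fixed baseline rather than only the previous run, so a slow drift in pass rate is still visible.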


@kylegawley A few points from my side:

  1. Reporting: clear data that points to either an automation failure or an application failure
  2. The same script should work across browsers
  3. Easy for new team members to pick up
  4. Debugging failed test cases should be easier

Comparing test-run results with enough granularity to help you find the anomaly’s starting point (date, and possibly time) also gets my vote. It all depends on whether you’re positioning this as a QA tool or as an “automation” tool, though; the requirements and the actual scope get much easier in the latter use case.
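Finding that anomaly starting point can be sketched as a scan over timestamped run history for the first run in which a given test failed (the history format and function name below are hypothetical assumptions, not any tool's real API):

```python
# Sketch: locate the run where a test first started failing.
# `history` is a hypothetical list of (timestamp, results) pairs,
# where results maps test name -> passed (bool).
from datetime import datetime

history = [
    (datetime(2023, 5, 1, 9, 0), {"checkout": True}),
    (datetime(2023, 5, 2, 9, 0), {"checkout": True}),
    (datetime(2023, 5, 3, 9, 0), {"checkout": False}),
    (datetime(2023, 5, 4, 9, 0), {"checkout": False}),
]

def first_failure(history, test):
    """Timestamp of the earliest run in which `test` failed, or None."""
    for ts, results in history:
        if results.get(test) is False:
            return ts
    return None

print(first_failure(history, "checkout"))  # earliest failing run's timestamp
```

The finer the run granularity (per-commit or per-hour rather than per-day), the more precisely this pinpoints when the breakage started.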

Thanks Conrad, I’ll be focusing on the latter, at least initially.

Thanks Kristof! It seems this is a common thing people mention :slight_smile:
