Identifying Key Requirements: What Factors Guide Your Tool Selection?

Selecting the right tools for our projects involves a complex evaluation of various requirements. I’m interested to hear about the range of requirements you consider important when choosing tools.

How to Contribute:

  1. Listing Key Requirements: What requirements do you prioritize when selecting a tool? This could include aspects like specific functionalities, cost, user experience, community support, compatibility with existing systems, and more.
  2. Impact on Tool Selection: How do these requirements influence your decision-making process?

Why Contribute:

  • Sharing your list of requirements can help others broaden their perspective on what to consider when selecting tools.
  • Every project has unique needs, and discussing these can help us appreciate the vast scope of what makes a tool useful.

Whether it’s alignment with project goals, budget constraints, or ease of integration, each factor is a vital piece in the puzzle of selecting the right tool for the job.

I’m eager to learn about the varied requirements you consider in your tool selection process and how these shape your decisions!

7 Likes

A disclaimer: I have never actually been in the described situation (I have never evaluated any tools), but hypothetically, here are some considerations/requirements:

  • Is the tool’s user base big? Does it have a subreddit or a Stack Overflow tag? This is probably what was meant by “Community”, though.
  • Does it scale? Do we need it to scale? Will we hit a performance wall when the test base grows sufficiently?
  • Is it easy to maintain the test base, expand and refactor it? Although probably this isn’t restricted as much by the tech as by the conventions and frameworks.
  • When was the tool most recently updated? This is especially important for free tools.
  • Is it easily automatable itself? Does it expose useful interfaces and provide nice, machine-readable outputs so we can integrate it into pipelines? (A rough sketch of what I mean follows.)
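To make that last point concrete, here is a rough, hypothetical sketch (TypeScript/Node). The tool name `some-test-tool` and the `results.json` format are invented; the point is just that a CLI plus a machine-readable report is what makes a tool easy to wire into a pipeline.

```typescript
// Hypothetical pipeline step: run an imaginary test tool via its CLI and gate
// the build on its machine-readable output. Tool name and report format are
// made up for illustration.
import { execSync } from 'node:child_process';
import { readFileSync } from 'node:fs';

// 1. The tool can be driven from the command line, so a CI step can invoke it.
execSync('some-test-tool run --reporter json --out results.json', { stdio: 'inherit' });

// 2. Its output can be consumed programmatically, e.g. to fail the build or
//    push a summary to a dashboard.
const summary = JSON.parse(readFileSync('results.json', 'utf8'));
console.log(`passed: ${summary.passed}, failed: ${summary.failed}`);
process.exit(summary.failed > 0 ? 1 : 0);
```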
9 Likes
  • What is being automated/cost can be expanded into: do we need two different tools (maybe cheaper ones) that can do the same job? For example, automated screenshot tests are different to E2E or unit ones.
  • Experience is about what we currently have, but what about future developers/testers? Picking an obscure test framework because it’s “new and cool” might not work if the people looking for new roles have never used it.
  • The above is also linked to documentation/community: older, more established software might have an edge when it comes to learning about it, but it might be end of life, whereas a new product may be difficult to integrate or not have much assistance from its development team (which may be small).
  • Will it work with what we’ve got? Slotting it into the pipelines shouldn’t be too painful.
10 Likes

Thanks both for sharing your thoughts on selecting tools.

@literallyme thanks for sharing your considerations. I’d like to prod you a bit more on scale. What do you mean by that? Scale in the number of tests/checks? Scale in usage? I think that’s an interesting criterion to go with.

@kelly.kenyon I really like your phrase “Experience is about what we currently have”; it speaks to what is happening right now in your team as an indicator of direction.

I think everything that you have both shared is valid and goes to show how different factors impact decision making. It’s a lot more nuanced than ‘tool A is good’.

I encourage everyone to keep sharing their thoughts; your viewpoints are what make the community a rich resource.

2 Likes

So here’s my list of things I was able to come up with in 10 minutes:

  • Reporting: What reporting do I want/ need? Is this supported by the tool I am evaluating?
  • Integration with other tools in our belt (eg test management and/ or requirement management tools)
  • Can the tool be integrated into my CI/CD environment? (See the rough sketch after this list.)
  • Maintenance effort (in terms of installing updates/ fixes and alike, of the tool itself): How much is it? Who will do this in my team/ organization?
  • Maintenance effort (in terms of script maintenance): How easy is it to adapt automated checks to changes in my AUT?
  • Flakiness/stability: How reliable is the script execution? Also with regard to the specifics of my AUT/general system environment.
  • Use a multipurpose tool for different kinds of testing/ automation, or one specific tool per kind of activity? Heavily depends on the different tools and my specific needs.
  • Willingness of the programmers to also use the tool (very often they are more in favour of code-heavy tools/ frameworks)
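As a purely illustrative sketch (assuming Playwright as the example tool; other tools have their own equivalents), here is how several of these criteria show up directly in configuration: a CI-friendly report format, retries as a partial answer to flakiness, and tracing to reduce the effort of debugging failing scripts.

```typescript
// playwright.config.ts - minimal sketch showing where some selection criteria
// become concrete configuration decisions.
import { defineConfig } from '@playwright/test';

export default defineConfig({
  // Reporting: emit JUnit XML so the CI/CD server can ingest the results.
  reporter: [['junit', { outputFile: 'results/junit.xml' }], ['list']],
  // Flakiness/stability: retry failed checks in CI, where they are flagged as
  // "flaky" rather than silently passing.
  retries: process.env.CI ? 2 : 0,
  // Script maintenance/debugging: record a trace on the first retry to ease
  // failure analysis.
  use: { trace: 'on-first-retry' },
});
```

If a candidate tool makes any of these awkward, that tells you something before you have written a single check.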
9 Likes

What a helpful list, @christianbaumann.

I think you’ve created a heuristic. Perhaps something like this?

  • Reporting
  • Interoperability
  • CI/CD integration
  • Maintenance
  • Script Maintenance
  • Flakiness
  • Purpose
  • Willingness

RICMSFPW

Or perhaps FRIMSCPW is more memorable. :smile:

2 Likes

Awesome idea @simon_tomes, thanks for sharing!

For a heuristic we should also include the ones listed in the course, otherwise it would be pretty incomplete.

And let’s also see what others come up with; I’m pretty sure there will be a lot of other great contributions (that I didn’t think of) that are worth incorporating.

1 Like

Some ideas I came up with (not limited to these, and without duplicating the above) are:

  • Training/Documentation - How easy will it be to train team members to use the tool? Open source will likely have fewer materials than out-of-the-box commercial tools (in a lot of cases).
  • Compatibility with our clients for project handover: what tooling do they use internally, and does it match what we would like to use?
3 Likes

Repeating some of the great suggestions above, I came up with…

  • Developer buy in - will the developers use the tool too?
  • Pipeline integration - will it be easy to integrate into the team’s pipelines?
  • Close to Application Code - does the tool complement libraries already used?
  • Language - is it the same or similar family of languages that the organisation uses?
  • Maintenance - who will maintain these tests and how easy is it?
  • Updates - is the tool kept up to date and how often? Are there any risks to using it?
  • Wider Test Strategy - could multiple teams use it? Or is it just for your team?
  • Handover - if you were to leave the team and hand it over to someone new, how easy is it to do so?
  • Type of Test - can the tool support different types of test? Component and End to End for example. Do you need that?
  • Assertions - will the tool that drives the test need a separate assertion library?
  • Hosting - if the tool needs a driver or a grid, where will it be hosted? How easy is that?
  • Purpose - is it to find bugs, validate builds and/or describe the functionality? Which tools meet your specific purpose (or purposes)?
  • Authentication - does your app’s authentication make tests easier or harder? If you use OTP, for example.
  • Readiness - is your app stable enough yet to cope with test automation tooling?
  • Tactical or Strategic - is the tool just there to cope with a particular scenario (like when I had a team whose app didn’t compile in live, so I implemented basic build checks) or part of a wider endeavour?
  • Level - is the tool aiming at the right level? You can use Cypress to perform API checks and E2E web tests, but should you focus on the cheaper API checks for faster feedback? (See the sketch below.)
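To illustrate that last point, here is a rough Cypress sketch contrasting a cheap API check with a heavier E2E check of the same behaviour; the endpoint, payload and selectors are hypothetical.

```typescript
// Illustrative only: the same behaviour exercised at two different levels.
describe('order creation', () => {
  it('API: creates an order (fast, focused feedback)', () => {
    cy.request('POST', '/api/orders', { sku: 'ABC-123', qty: 1 })
      .its('status')
      .should('eq', 201);
  });

  it('E2E: creates an order through the UI (slower, broader coverage)', () => {
    cy.visit('/shop');
    cy.get('[data-test="add-to-basket"]').click();
    cy.get('[data-test="checkout"]').click();
    cy.contains('Order confirmed');
  });
});
```

The API check gives faster feedback; the E2E check buys broader coverage at the cost of speed and stability.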

Phew! Don’t make me think. :slight_smile:

6 Likes

That’s a great list Ash!

I’m not sure I understand what you mean by “Close to Application Code - does the tool complement libraries already used?”
Could you please elaborate and/ or give an example?
Many thanks!

1 Like

Hey Christian

Good question, let me elaborate a bit.

I think if tests live alongside the application source code, they have a better chance of staying up to date and of getting developers involved in their creation and maintenance. You don’t need to dip in and out of multiple repos this way.

Also, if they are ‘close to the application code’ you can use the same libraries that your application uses (for HTTP requests, DB connections, etc.) and utils written for other types of test (unit, component and integration) performed a bit lower down.
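For example (module paths and names are hypothetical, and I’m assuming a Jest/Vitest-style runner for `test` and `expect`):

```typescript
// Hypothetical test that lives next to the application code and reuses the
// app's own HTTP wrapper and test-data factory, instead of duplicating that
// plumbing in a separate automation repo.
import { apiClient } from '../src/lib/apiClient';          // the app's own HTTP client
import { buildOrder } from '../src/testUtils/factories';   // shared test-data builder

test('creates an order via the app\'s own client', async () => {
  const order = buildOrder({ qty: 2 });
  const response = await apiClient.post('/orders', order);
  expect(response.status).toBe(201);
});
```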

I’m a big fan of working closely with devs on test automation; having separate tools, libraries and utils in a separate repo means a divide in the team, in my experience.

Ash :smile:

3 Likes

Thanks for the clarification!

And I totally agree with all of your points!

My list of requirements:

  • Goal to be achieved
    – Continuous Integration?
    – Load test/Performance test/function test, etc.
  • Features supported
  • Protocols supported
  • System requirements
  • Speed of execution
  • Reuse of scripts from other tools and vice versa
  • Size of the community of users
  • Feedback from other users
  • Documentation
  • Support from the tool developer
  • Support from community of users
  • Level of experience
    – of the team that is meant to set it all up
    – and use it
  • Ease of usage
    – Scripting
    – Executing
    – Changing sequence, etc.
    – Modifying parameters
    – Saving of logs/results/analysis
  • Cost
    – Short term
    – Long term
    – Licensing
  • Ease of maintenance
3 Likes

I like the aspects you bring up.
On the other hand, I like to use separate/independent tools for my tests to bring an “outsider’s” view on things.

Hi Sunitha

I really like your list, especially the deeper list of ease of usage.

With regard to the outsider’s view: it’s often a balance the tester has to strike between critical distance and social closeness.

In my experience, the benefits of the outsider’s view start to wane quickly, whereas social closeness with the team means your automation has a better chance of engaging the whole team. I speak only from my own experience here, of course, but it’s telling that every DORA State of DevOps report finds that the best teams (and the quality of what they build) correlate strongly with sharing the testing load, with developers taking the lead on automated testing.

Ash

2 Likes

Ash,
Thank you so much for sharing your experience.
That helps me immensely, because we are starting to automate our tests now and I can make good use of such input. :+1:

Cost:

  • Training and time to implement automation - time costs money
  • Consider scalability costs, such as potential increases in licensing fees or infrastructure requirements as your project grows.
  • Ensure ongoing support and updates are feasible
  • Vendor lock-in risks and associated costs.
  • Do we need to hire anyone to get us started or to help with training?

Experience:

  • What languages are currently being used - does it make sense to learn a new one?
  • Who needs proficiency with the tool?

What We Are Automating:

  • Considerations for data privacy, security, and regulatory compliance.
  • Ensuring the tool aligns with the project’s specific automation needs.

Integration:

  • Check the types of systems the tool must integrate with (e.g., databases, APIs, third-party services).
  • Assess the ease of customisation and extensibility for future integration needs.

Maintenance:

  • Consider ongoing support from the vendor or community for bug fixes and security updates, especially for open-source/new/old tools.
  • Evaluate the scalability of the maintenance process as usage grows.
  • How easy will it be to maintain clear, well-documented code/scripts for ease of maintenance and troubleshooting?
4 Likes

Not sure if it has been mentioned already: if we are talking about UI automation tools, some of my clients also want to consider no-code or low-code options, so that the business analysts or manual testers can also create UI automation test cases (with support from automation engineers to make sure the test cases have good test case design and can actually run).
“LogiGear - here to help shape the future of test design”

1 Like

Here are a few requirements I came up with:

  1. Extensibility and/or compatibility with other tools: We don’t want to get stuck if we need capabilities beyond what the tool offers by default.
  2. Workflow: The tool must be (sufficiently) smooth to work with. Reusability is an important part of this, as a big part of the appeal of automation is the reduction of repetitive work.
  3. Type of interface: I personally prefer to build my own tools using an appropriate stack of libraries, but a GUI may be preferable if we want to involve people without programming experience.
2 Likes

Here is what I came up with:

  • Maintainability - Is the tool itself easy to maintain, and is any code written with it easy to maintain, from the perspective of both a tester and a developer?
  • Compatibility - Is the tool compatible with the hardware/software being used for development? Is it also compatible with end-user systems? Does it match up with client requirements?
  • Client Requirements - Is it even suitable for the client based on the brief the team have been given? If the client needs to use it, can they use it effectively?
  • Reporting - How often do we need to report test results? Does the client need frequent reporting?
  • Documentation - Is the documentation comprehensive enough so the tool can be used without requiring intensive training?
  • Security - How easy is it to ensure a well-secured system for the project? In some sectors (e.g. government, military, finance) this will be extremely important, to make sure that the automation can still be carried out despite strict security requirements. In fact, automation can also be used to test system security for protection against cyber attacks.
  • Updates - How often does the tool need to be updated? How easily can the tool be updated? Button click? Terminal?
2 Likes