Ideas on test case upkeep

Hey all, new user here.
I’m searching for ideas on updating manual test cases once they become outdated.

I have my team write test cases early in the development process, and inevitably the GUI gets changed during coding, so our screenshots get outdated. We rely heavily on screenshots since our software is highly complex (aerospace & military engineering modelling).

Recently we had an issue where a small change to the GUI had a huge impact on all of our tests for a particular new feature, so it will take a fair number of hours to update them.

There’s probably no way to update them other than manually, but maybe there are some ideas floating around on how to overcome this sort of thing, and perhaps on how to write our tests in a way that minimises the work needed to update them.

I’ve worked in many different companies over the last 14 years and each one of them writes tests differently to suit its needs, so of course it’s very company/project specific.

We currently use Xray (Jira plugin) for our test cases/test cycles.

We also do a peer review after each test is created, and I’m curious how everyone else’s process works.

Thanks in advance for any tips, and hopefully we can keep this discussion open as I’m keen to hear about your processes/workflows.


You mentioned that you use screenshots in your test cases to compare the expected and actual state of the system during testing. Usually I would recommend writing test cases in a way that lets testers adapt to UI changes during testing, which typically means being less specific in the test case descriptions and relying less on (full) screenshots. But if this is not possible, have you considered automating the process of updating the screenshots?

E.g. for our product Testmo we automatically generate all our product screenshots from code, which makes them very easy to update for new product versions. I wrote an article about this before here (the article is focused on updating website/marketing screenshots, but generating screenshots for test cases would work the same):


Maybe think about how you word your test cases: are you so specific in your steps that you reference implementation details? If that’s the case, it might be worth having your tests focus more on generally describing the behaviour of your SUT (the logic, not the names and colours of buttons, for example).


A possible solution is to apply DRY, or Don’t Repeat Yourself: a piece of code that is used frequently is put into a separate method or function. This saves some lines of code, although the updates still need to be done manually.

Another possible solution is auto-healing: automated tests are updated automatically, where possible. A company like Mabl offers a product with this feature.


How often do you repeat the test cases? If it isn’t very often, then I would either mark them as inactive/disabled, lob them in an archive folder, or get rid of them. If you have test cases that are only ever likely to be run once, perhaps including them in a test framework that requires them to be updated isn’t what you want. Consider a way to treat “one-off” tests as disposable.

We had a scenario where our test suite had >1000 manual test cases, of which we had faith that maybe 100 were actually up to date, because those were the ones we actually used. Our solution was to keep the “useful” test cases together; everything else, which we were keeping for reasons I never understood, went into an archive folder.

We eventually solved this completely by abandoning traditional test cases. If there’s nothing there, it can’t get out of date, right?

One final thing for what I think is your specific case: how unique are the screenshots? i.e. does the same screenshot show up in multiple test cases? How about creating a “screenshot repository” on a wiki or some other tool? Then you can reference it in each test case and update it in one source of truth.


This actually sounds like a genius idea and I’m happy to explore it. I would assume I’d need to migrate all my tests across to this platform though, right? I’m not sure if we’d have the ability at this stage to do that. I’m going to look more into this though.

It’s something I’ve thought about, being less specific and not over detailing the steps. Although, the screenshots are still an issue here.
I’ve also thought about capping tests at a certain number of steps, say 20.
One of my testers had created a test with over 60 steps, and there’s obviously going to be a domino effect here once one of the first steps/screenshots is out of date… eventually almost all of them fall.


I’m with @mirza that these sound overly detailed. Unless you’re outsourcing or have a ton of churn, I would start shifting towards outlines and checklists, with the must-haves clearly spelled out in descriptions rather than screenshots.

This also frees up time and energy to focus on helping your testers develop, so that they can build stronger expectations around UX, maintain a consistent look and feel, and probably do some side-trips/exploratory-ish testing while they’re verifying the items on the checklists. Essentially, empower your testers rather than having folks rely on rote test cases/scripts. If they are just rote scripts, then you might as well automate them.


You wouldn’t need a specific tool for this. We use Selenium in our example, but any test automation tool should do, as most test automation tools can take screenshots.
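As a rough sketch of what that could look like with Selenium, assuming the product UI is reachable in a browser (the screen names, URL paths, and directory layout below are invented for illustration, not from any real product):

```python
# Hypothetical sketch: regenerating test-case screenshots from code.
import os

# One entry per screenshot used in the test cases.
SCREENS = [
    ("login", "/login"),
    ("model-editor", "/models/editor"),
    ("results-view", "/results"),
]


def screenshot_path(name, version):
    """Versioned filename, so screenshots for old releases are kept."""
    return f"screenshots/v{version}/{name}.png"


def capture_all(base_url, version):
    # Selenium is imported here so the helpers above work without it.
    from selenium import webdriver

    driver = webdriver.Chrome()
    try:
        for name, path in SCREENS:
            driver.get(base_url + path)
            target = screenshot_path(name, version)
            os.makedirs(os.path.dirname(target), exist_ok=True)
            driver.save_screenshot(target)
    finally:
        driver.quit()
```

Re-running capture_all after a UI change regenerates every screenshot in one go; the approach is the same whether the images end up on marketing pages or attached to Xray test steps.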



Is that 20-60 steps that are all necessary in order to test a single thing? In that case, consider preconditions to cut them down. Alternatively, if it’s 20-60 steps covering a range of things within a feature, definitely look at breaking them down. My team were really bad for writing test cases that would have you running the software, performing all aspects of CRUD with validation testing, then viewing the end results, i.e. a dozen test cases in one.


Yeah tell me about it!
We have 3 tests that are over 60 steps; they definitely need breaking down into smaller tests. It’s something we’ll work on going into the next release.
