Testing the removal of features

Hello,

I’m very much accustomed to testing software that is built up, features being added. I now find myself tasked with testing a piece of software that will have features removed.

My first thought is of course, run the relevant regression tests. But is there more to it? Have any of you tested a lot of removing of features?

Apologies for this being a bit wooly, I may be reading too much into it! All advice welcome.

TIA :slight_smile:

3 Likes

Sure, if the things you care about keep working and the feature is no longer accessible, that’s all you care about from a testing perspective.

Obviously, anywhere with integrations with that feature is higher risk, e.g. have the links and buttons been removed from all locations?

Devs have a bigger job, cleaning up any dead code to keep code clean and maintainable.

On a practical note, the biggest thing we find is the assumption that a feature can be removed at all. There is always one or more customers relying on it, and it’s very hard to remove things without complaints being made. So be very careful to check the feature isn’t being used, and that you have sent clear comms out to customers with lots of notice.

3 Likes

In my opinion, regression testing is the key here. The software should keep working as usual except for the removed features. Something important - considering the scope of the regression - is providing early feedback, in order to save time and costs.

As @crunchy mentioned, there will be features with a higher risk than others because they could be more closely related to the removed functionality. You might expect more potential issues there, so you could think about prioritizing these risky features first, considering you will need to retest some things if failures are found.

I know it sounds trivial, but it might not be so easy in practice, so this is something that should be properly addressed, explored and planned.

2 Likes

In the past I always tested on data. Basically there are two categories:

  1. Actual data
    This data contains information which must be processed. For example, what does my order contain, and what is my contact information, such as my email address?
  2. Process data
    This data is added to simplify the handling of the actual data. If an order picker has collected all items for an order, then it can be sent to the delivery service provider. In this case the status of the order is ready for delivery.
    Notice that some process data is not always visible, e.g. an internal customer identification number.

During a regression test there is a product risk of focusing on data which has not been processed much. I might enter an order and cancel it immediately. But in production, orders can be modified multiple times using different features, which can add process data. In turn this data can be used by other features.

Deleting a feature might lead to different process data which might have negative consequences for the system.
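To make the two categories a bit more concrete, here is a rough sketch; the struct and field names are just invented for illustration, not taken from any real system:

```c
#include <stdio.h>

/* Process data: a status added by the system to manage handling of the order. */
typedef enum {
    ORDER_ENTERED,
    ORDER_PICKED,
    ORDER_READY_FOR_DELIVERY,
    ORDER_CANCELLED
} order_status;

typedef struct {
    /* Actual data: information which must be processed. */
    char customer_email[64];
    int  item_count;

    /* Process data: added to simplify handling of the actual data.
       Not always visible to the user, like an internal customer id. */
    order_status status;
    long internal_customer_id;
} order;

int main(void) {
    order o = { "jane@example.com", 3, ORDER_READY_FOR_DELIVERY, 100042 };
    printf("order for %s has status %d\n", o.customer_email, (int)o.status);
    return 0;
}
```

The point is that a removed feature may be the only thing that ever wrote a particular piece of process data, and other features may still read it.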

My black box approach is to test the system using production data. With the current privacy laws this can be quite tricky. A solution is a deployment test before going live.

An extra test is to

  • use some features of the current system based on a product risk analysis.
  • update the system
  • use the updated system to process the migrated data.

A white box approach is to model the state transition diagram of the data. This can require quite some time.

I also suggest involving the users to explore the consequences. Some users might have workarounds which need some of the features which are to be deleted.

2 Likes

Seems like everyone is mentioning the regression, but I’d point out that having some tests to verify the removal is also important. Better to make sure it’s gone, and not leave it as some vestigial “feature” that customers start relying upon and scream about when it’s gone.

3 Likes

Complementing @ernie’s point: “customers start relying upon and scream when it’s gone”

Depending on the feature, your customers may be able to exploit it to keep taking some advantage of it (a voucher-like feature, for instance).

Your exploration has to cover not only other features and the paths users usually took to the removed feature, but also attempts to keep using the feature in a sneaky way.

I’m assuming you are talking about feature hiding, not removal. Feature removal can happen due to a desire by the business to go advert-supported. Just don’t do that, it’s expensive to the brand image. Or are you talking about a product re-write? If you are talking about removing or hiding or licensing, then you really want to have a thing called feature flags.

Basically, a feature flag file is a digitally signed blob (a license of sorts) that enables or disables each feature. You create a tool to output a flag file, the app reads it and then enables or disables features accordingly. That way you have one binary and you don’t have to rebuild it, you don’t even have to re-install it, which saves a load of time. By being able to generate this signed flag file, you can exercise all the possible sensible or even semantically legal combinations of features, just by copying this file and having the app under test read it. The file needs to be signed to prevent customers hacking the revenue system. For more detail, just google “feature flags”.
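A rough sketch of the idea in C; the feature names, the file layout and the signature check are all placeholders I made up, not a real implementation (a real one would verify a proper cryptographic signature before trusting the flags):

```c
#include <stdbool.h>
#include <stdint.h>
#include <stdio.h>

/* Hypothetical feature bits; the real set would come from the product. */
enum {
    FEATURE_REPORTING = 1u << 0,
    FEATURE_EXPORT    = 1u << 1,
    FEATURE_VOUCHERS  = 1u << 2
};

typedef struct {
    uint32_t enabled_features;  /* bitmask read from the flag file */
    uint8_t  signature[64];     /* placeholder for the digital signature */
} feature_flags;

/* Placeholder: a real implementation would verify the signature here. */
static bool flags_are_valid(const feature_flags *flags) {
    (void)flags;
    return true;
}

static bool feature_enabled(const feature_flags *flags, uint32_t feature) {
    return flags_are_valid(flags) && (flags->enabled_features & feature) != 0;
}

int main(void) {
    feature_flags flags = { FEATURE_REPORTING | FEATURE_EXPORT, {0} };

    if (feature_enabled(&flags, FEATURE_VOUCHERS))
        printf("vouchers enabled\n");
    else
        printf("vouchers disabled\n");  /* disabled without rebuilding the binary */
    return 0;
}
```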

2 Likes

Back in the old days this was done by having a config file for each module. The feature was enabled/disabled via a #define and the code was littered with #ifdefs.
Not ideal if you have many different features!
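Something like this, with made-up feature names:

```c
/* "features.h" inlined here: one #define per feature, set per build/module. */
#define FEATURE_REPORTING 1
/* FEATURE_EXPORT deliberately left undefined: that feature is compiled out. */

#include <stdio.h>

int main(void) {
#ifdef FEATURE_REPORTING
    printf("reporting enabled\n");
#endif
#ifdef FEATURE_EXPORT
    printf("export enabled\n");   /* this code no longer exists in the binary */
#endif
    return 0;
}
```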

1 Like

I recall those days with fondness, back when memory was tight too.

Oh, there is the option to instrument your app with analytics, but it’s probably too late for that now. All of this depends on how much people are paying you for the app: is the license annual, monthly or just perpetual? Is there a hosted component to the service? A lot of context can help you decide. Do tell us more @stacey-sloth

1 Like

Thanks very much for your advice everyone!

I’m talking about embedded software, so it’s a one-off fee when the customer buys the device. The reason for feature removal is we’re creating a new bit of software based on bits of the old one, but instead of having a ‘big bang’ we want to test it as each feature is removed.

Well, not that many of us have experience of provisioning constrained devices to save on BOM and update costs, and the way you want to do that is going to be very different. LOL.

I’m still unclear how you provision devices with fewer features without changing the peripherals and interfaces at the same time. Or are you merely chasing down BOM costs, or perhaps a combination of the cost to remotely manage devices and other concerns? I’m a big fan of “context” lately, and I know it’s hard to describe a problem without leaking many trade secrets or strategies. Please do tell as much as you feel comfortable Stacey.

Do you not just test the features that are still required?

Let’s say you have features A, B and C.

You remove B but still require A and C. Therefore you only need to test for these two.

Or am I missing something?

There will be a few semantic rules around which feature combinations are legit. Put them into a table and do table-driven testing, which is made easier if you do not need to re-provision.
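A rough sketch of what such a table-driven test could look like; features A, B and C as bit flags and the system check are invented for illustration, with B as the removed feature:

```c
#include <stdbool.h>
#include <stdio.h>

/* Hypothetical features A, B and C as bit flags; B is the one being removed. */
enum { FEATURE_A = 1, FEATURE_B = 2, FEATURE_C = 4 };

/* One row per feature combination we want to exercise. */
typedef struct {
    unsigned combo;     /* features provisioned for this test case */
    bool     is_legit;  /* does the product's semantics allow this combination? */
} combo_case;

static const combo_case cases[] = {
    { FEATURE_A | FEATURE_C,             true  },  /* B removed, A and C remain */
    { FEATURE_A | FEATURE_B | FEATURE_C, false },  /* B must no longer be offered */
    { FEATURE_C,                         true  },
};

/* Placeholder for the real check: configure the system with `combo`
   and observe whether it accepts that combination. */
static bool system_accepts(unsigned combo) {
    return (combo & FEATURE_B) == 0;   /* stand-in rule: anything with B is rejected */
}

int main(void) {
    for (size_t i = 0; i < sizeof cases / sizeof cases[0]; ++i) {
        bool ok = (system_accepts(cases[i].combo) == cases[i].is_legit);
        printf("case %zu: %s\n", i, ok ? "PASS" : "FAIL");
    }
    return 0;
}
```

Adding or removing a feature then mostly means editing rows in the table rather than rewriting test logic.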

Following on from Peter’s comment on features A, B and C: are the features standalone, or do they interact, as in feature A leads to feature B, feature C involves feature B (as a prerequisite), etc., when you remove feature B?

If there is interaction, then the tests themselves likely need some kind of update. You can find that out by doing an analysis of the test suites you have, or by just running all the tests, seeing which fail, and debugging and fixing from there.

If there is no interaction/relation, then it is easier to figure out which tests to run and which not. And in this case you can also run the tests for removed feature B and verify that they fail due to the feature removal (or you could choose to update the logic of these tests to validate the absence of the feature, but it is easier to just “tag” the tests as being for feature B, which was removed, and leave the logic as-is in case you reinstate it in future).
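If you do go down the “validate absence” route, here is a minimal sketch of what one such check could look like; the return codes and the feature B entry point are entirely made up:

```c
#include <assert.h>
#include <stdio.h>

/* Hypothetical return codes from the system under test. */
#define RESULT_OK            0
#define RESULT_NOT_SUPPORTED (-1)

/* Stand-in for the entry point that used to reach feature B.
   After removal it is expected to report "not supported". */
static int invoke_feature_b(void) {
    return RESULT_NOT_SUPPORTED;
}

/* A test "tagged" for removed feature B, with its expectation flipped
   to assert that the feature really is gone. */
static void test_feature_b_is_absent(void) {
    assert(invoke_feature_b() == RESULT_NOT_SUPPORTED);
    printf("feature B is absent: PASS\n");
}

int main(void) {
    test_feature_b_is_absent();
    return 0;
}
```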

2 Likes