As I start on another test plan and put together a plan on a page, I began to wonder: compared to development time, what percentage are you spending on testing? A lot depends, I guess, on the level of TDD, BDD etc. Does the percentage vary much between us all? I have worked in various industries and methodologies, and the ratio of testing to development has varied from 25% to 400%.
This is key: currently we are doing a lot of acceptance testing. This is based on an insurance product and covers the full quote journey, from the software house (SWH) front-end system (as a broker) right through to the back-office systems internally.
What the business sees is the internal developers on, say, the iSeries back end spending something in the region of 5-10 days of development time. So they assume that testing this will be equally speedy. Sadly it’s never the case… and they don’t see the SWH integration development done by the external parties.
The quote journey for a single product may touch upwards of 11 SWHs, many with dual platforms. This leads to a higher (often much higher) ratio of testing to development, mainly because development seems to be a one-to-one relationship, but once we hit test it becomes one-to-many.
This of course leads to the age-old questions: “why does testing take so long?”, “why is it stuck in testing again?”, “why can’t testing be faster?” etc. Which, as you can imagine, leads to a continual battle not only to defend the position of testing but to justify our work to the business.
So to answer your question of what percentage we spend testing versus developing… too much, if the business is to be listened to.