I’d like to get your reaction to this excellent quote from Jitesh Gosai (@jitgo). It’s about quality engineering.
Quality engineering is more than testing earlier in the software life cycle. It’s about looking at all the facets of software engineering. From delivering the product to the processes we use to build it and the people involved. It’s about taking a holistic approach to quality and understanding how quality is created, maintained and lost throughout the software life cycle. It is then, using this insight, we build quality at the source.
What do you think? What examples can you share where you’ve demonstrated the value of quality engineering? Where have you led the way where others might have been sceptical?
Totally agree. There are so many changes I’ve instigated or supported in my organisation over my 7 years with them that are less about testing and more about quality engineering. I’ll try to bullet them; happy to go deeper on any if needed:
Early on, changing the “Test Team” to be known as “Quality Assurance”, as I knew we stood for more than just testing. Only recently have I realised that “Quality Engineering” would have been more accurate, but it was at least a step in the right direction in changing our identity and expectations.
Introduced a Quality Gate step to our release process - a 15-minute chat with all stakeholders prior to deployment to ensure we are confident we built and tested the right things, the customer knows what’s coming, and we’re ready to support it. The ultimate goal being that we shouldn’t need a quality gate at all.
Saw a pattern in production incidents where the root cause was a misunderstanding of the requirements. So I supported the introduction of refinement sessions where we discuss requirements not just to make sure we understand them, but to be confident they fulfil the customer’s expectations.
Introduced a managers’ delivery planning session to ensure that the product roadmap beyond the current and next sprint is tangible - i.e. we know where we’re going. With that, we can proactively support the teams by being prepared for the direction of travel and become more predictable in planning what’s next.
Introduced KPIs to measure engineering effectiveness and objectively highlight areas for improvement. That’s been the one greeted with the most scepticism. Even though they were built with the teams’ involvement - being open to their effectiveness being challenged, evolving them together, etc. - to this day they are still tagged as “Gary’s stats”, they get dropped from management agendas if there is too much on, and so on. I remain open to discussing and changing them, but the simple fact that I’m the one who puts them together means they’re seen as my responsibility, not ours. So I’m always trying to learn lessons on engaging with the wider organisation so we all feel responsible for quality outcomes.
After 20+ years in the world of QA, I am still surprised that what I am about to say is not common practice (even in an Agile way of working)…
I have worked at several organisations where we would be due to receive a system or part of a system, and we would have no sight of the supplier’s test coverage or detailed results until almost handover (before commencing SIT or UAT). We would only get high level test progress reports and a final report (not overly detailed).
I would ask the PMs, RMs or product owners if I could request some detailed information from the supplier on their coverage and results intermittently as development progressed. The response would normally be “Why? I doubt the contract says this information is required of them (apart from a final report).”
My response would be:
If we see the coverage, it may highlight a misunderstanding in the requirements/stories.
If the coverage is lacking in some areas, we can highlight that to the supplier, or we may need to put more effort into those areas when we do our own testing.
If the information takes ages to arrive or is not clear, it may indicate unstructured testing.
If they point-blank refuse to provide the information without a valid reason (e.g. it’s not in the contract), again it may mean their testing is unstructured.
Reaching out to the suppliers to request this information on each project has saved a considerable amount of time in rework, defects being raised, etc.
Those PMs, RMs, and product owners now make it clear that this information will be required on the projects they work on.
Oh crikey, probably too many examples to list them all here specifically.
But broadly, I see that the value has been appreciated when other engineers pick up the practices I’ve coached them in - whether that’s starting to shift left, thinking beyond explicit asks (documented requirements / ACs) in terms of quality, or using feedback loops across the lifecycle of work.
After recently starting my new role as a QA Engineering Manager, I had a sudden realisation: I have been doing this “stuff” for years with my previous organisation. There are tonnes of things I put in place in my 10 years with them that I now consider to be engineering. One example that was not popular was taking our customer’s testing and recreating it in our automation framework. This reduced the customer’s UAT time, reduced defects raised (as we found them first) and sped up time to market - taking a process averaging 150 days down to just 3, meaning they could move to monthly releases!
The long and short of this has led me to one idea: I am not here just to test, I am here to add value to someone or some team. By adding value, I am adding quality not just to the product but to the whole SDLC. It’s not simple, as there are many “battles” to have, but picking up one thing at a time is a start. My current example: I am working with a team who down tools for 6 days to UAT test a release from a 3rd party. Preventing them from having to do that is a massive value-adding activity for them - not to mention we gain some QA Engineering supporters as a department (we are a pretty new concept to the organisation).