This Week in Testing, I jumped on stage with Simon Tomes & Aj Wilson to talk about my efforts to establish a test review process in line with 8 engineering principles:
- We Deliver Value to Customers Frequently and Reliably
- We Care About Quality
- We Document Our Systems
- We Build Observable Systems
- We Automate Tests
- We Value User Experience
- We Build Secure and Available Systems
- We Build Performant and Scalable Systems
Each of these principles covers factors that can improve quality and/or testability, for example…
- We work in test environments representative of live (managed through Infrastructure as Code) to simulate real-world scenarios
- Code is easy to work with and has few areas of technical debt
- Test cases document the business logic on which they rely (see the sketch after this list)
- Logs can be used to identify the steps to reproduce an issue
- Automated testing targets the appropriate areas & layers of the system
- Accessibility is always considered
- Security scans are performed (and actioned!)
- Performance is tested with realistic data volumes
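To make one of these tangible, here's a minimal sketch of the "test cases document the business logic" idea, written in Python with pytest. The `calculate_discount` function, its behaviour, and the "BR-42" rule reference are all hypothetical, invented purely for illustration:

```python
# Hypothetical implementation under test, assumed for illustration only.
def calculate_discount(order_total: float, is_loyalty_member: bool) -> float:
    """Loyalty members get a 10% discount on orders over £100."""
    if is_loyalty_member and order_total > 100:
        return round(order_total * 0.10, 2)
    return 0.0


class TestLoyaltyDiscount:
    """Business rule BR-42 (hypothetical reference): loyalty members
    receive a 10% discount on orders over £100."""

    def test_loyalty_member_over_threshold_gets_discount(self):
        # Documents the happy path of the business rule directly.
        assert calculate_discount(order_total=150.00, is_loyalty_member=True) == 15.00

    def test_non_member_over_threshold_gets_no_discount(self):
        # The rule applies to loyalty members only.
        assert calculate_discount(order_total=150.00, is_loyalty_member=False) == 0.0

    def test_member_at_threshold_gets_no_discount(self):
        # Boundary: the rule says *over* £100, so exactly £100 is excluded.
        assert calculate_discount(order_total=100.00, is_loyalty_member=True) == 0.0
```

The point is that a reviewer reading these tests can recover the business rule (and its boundaries) without digging through tickets, which is exactly the kind of thing a test review check-in can probe for.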
The idea is that a more experienced tester has a regular check-in with a project's lead tester and rubber-ducks the state of testing on the project. This discussion should surface areas for improvement, in line with the engineering principles.
So I've got to ask: how do you review in-flight projects?
For my own context, I work at a small consultancy where we often have Test Engineers working on their own alongside a development team of 2-4 software engineers. I am looking to embed the test review process as a way to support the Lead Test Engineer on each project. The engineering principles outlined above are organization-wide and already in place. We have other check-ins with testers to support their wellbeing (line manager) and professional growth (career coach), so this process need only look at the technical and testing health of their project (in my opinion, but I am happy to be challenged on this)!