I am assigned to a project to build a new product for triggering alerts. I am the only QA on the team. The project is fairly new, without any QA process as of now. I need input from the community on: 1. What questions should I be asking the client team? 2. How can I get started with a QA process? Please help.
Are you completely new to the company also?
Is there already a standard test plan or a set of best practices within the company?
Well, the list is pretty long, but I would at least start with:
- What’s the purpose of the alert system, and which channels does it use (email, SMS, push notifications, …)?
- Ask anything about the product itself: who are the stakeholders, the end users, …
- You could play the Headline Game to find out what they absolutely don’t want to see
- Then start with all functional requirements
- Then go for non-functional requirements
It’s kind of a broad question
- Might be interesting to start some sort of test plan
- Learn about the product & identify priorities
- Define your scope (automation, functional/non-functional, API testing, UI testing, …)
- Select some tools that might help (bug tracking, automation, test case mgmt)
- Define a test strategy
- Set up test cases (a tiny smoke-test sketch follows this answer)
- Introduce shift left, the Three Amigos, etc.
- Set up some reporting/dashboarding for the bugs
Just some ideas; I don’t know how big the “triggering alerts” project is, so you’ll just have to adjust this to whatever works for you.
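On the “set up test cases” bullet: if the product exposes an HTTP API for triggering alerts (an assumption on my part; the endpoint and payload below are hypothetical), a first automated smoke test can be tiny, e.g. with pytest and requests:

```python
# Minimal smoke test for a hypothetical alert-trigger endpoint.
# Assumes: pip install pytest requests, plus a dev/staging API whose
# base URL is supplied via the ALERT_API_URL environment variable.
import os

import requests

ALERT_API_URL = os.environ.get("ALERT_API_URL", "http://localhost:8080")


def test_trigger_alert_is_accepted():
    """Triggering a basic alert should be accepted by the API."""
    payload = {
        "channel": "email",                  # hypothetical field names
        "recipient": "qa-test@example.com",
        "message": "smoke-test alert",
    }
    resp = requests.post(f"{ALERT_API_URL}/alerts", json=payload, timeout=10)
    assert resp.status_code in (200, 201, 202)
```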
I find in these types of discussions, QA are the experts at asking those simple questions whose answers often get assumed. So for your point one, I love being the person “in the room” who asks the obvious questions in the simplest way:
- Why is this product needed?
- Who is going to use it?
- How many transactions and users does this need to handle?
- What is the planned architecture?
- How will we know this product has achieved our and the customers’ goals?
- How do we see this product being managed in production?
- What considerations are we giving in the architecture for testability?
That’s a sample; obviously, there could be more contextual questions you’d ask based on any domain knowledge you already have.
The answers to those questions will drive your considerations around the process. I say “the” process because you need to think of it more as a holistic process than just QA. So I think the answer to question 2 is, take that information, work with the team and find out what process fits to get the product built the most efficient way for everyone.
From my experience, you don’t often get straight answers to all those questions, but the cool thing about that is you are actually beginning to influence the design before dev moves too far forward.
I would try to write down a relatively short but quite general list of questions for such cases (some might not be relevant to every situation). You need to ask the client, the team, management, yourself, etc. As usual, the context really matters; I’m trying to build the list around some general issues teams/companies often have, and around good engineering and quality engineering practices. You may ask somewhat different questions, but it’s important to cover the points mentioned in my list:
- What problem does this product solve, and what are the key business success criteria?
- What are the critical risks if this product fails in production?
- Are functional, non-functional, security, and performance requirements clearly defined and documented?
- How are requirement changes managed, and who owns them?
- What development processes will be used, and how is QA represented in them?
- What is the Definition of Done (DoD) for features, from both dev and QA perspectives?
- What are the high-risk areas, and how will they be prioritized in testing?
- Should we keep test documentation short, minimal, and simple, or do we need detailed artifacts to improve efficiency and/or quality?
- How will regression testing be handled, and should it be automated?
- What are the automation priorities and approaches (unit, API, contract, UI, E2E, etc.), and how are they implemented in CI/CD? (See the sketch after this list.)
- What automation frameworks, tools, and strategies will be used?
- How many staging environments are available (and is that enough?), and how close are they to production?
- How will test data be generated and maintained?
- How will security testing be integrated (e.g., static analysis, pentesting, automated security scans)?
- What are the performance expectations, and how will performance testing be conducted?
- What monitoring, logging, and alerting systems will be in place for fast issue detection?
- How will tracing and debugging be implemented across the system/microservices?
- What are the release processes, rollback strategies, and deployment automation mechanisms?
- Can features be released independently without breaking integrations?
- How do we ensure QA has a voice in product and technical discussions and can drive continuous improvement?
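To make the regression/automation questions concrete: for an alerting product, duplicate and missed alerts are the classic failure modes, so one of the first CI-friendly API checks I’d sketch is a deduplication test. Everything below is an assumption to replace with your real contract: the endpoint paths, the dedup_key field, and even the idea that the service suppresses duplicates at all.

```python
# Sketch of an API-level regression test that could run in CI/CD.
# Hypothetical contract: POST /alerts accepts a "dedup_key", and
# GET /alerts?dedup_key=... lists the alerts created for that key.
import os
import uuid

import requests

BASE = os.environ.get("ALERT_API_URL", "http://localhost:8080")


def test_duplicate_trigger_is_suppressed():
    """Sending the same alert twice should not deliver it twice."""
    key = f"qa-{uuid.uuid4()}"  # unique per test run
    payload = {"channel": "sms", "message": "dup check", "dedup_key": key}

    first = requests.post(f"{BASE}/alerts", json=payload, timeout=10)
    second = requests.post(f"{BASE}/alerts", json=payload, timeout=10)
    assert first.status_code in (200, 201, 202)
    assert second.status_code in (200, 202, 409)  # accepted or rejected as a dup

    # Only one alert should exist for this dedup key.
    listed = requests.get(f"{BASE}/alerts", params={"dedup_key": key}, timeout=10)
    assert listed.status_code == 200
    assert len(listed.json()) == 1
```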
If there is any documentation for the project/product, I would spend some time doing an in-depth dive. Try to become an SME. If none exists, identify the SMEs who do exist and are willing to talk, and then take them out for lunch (or coffee, beer, pie - whatever works).
Edit: make sure the documentation isn’t 5 years old and hopelessly outdated (has happened to me more times than I can count).
Did you ask AI? Below is from ChatGPT. Use these as a baseline and ask experts further questions to get more input.
You’re in a critical position to shape the QA process from the ground up. Here’s how you can approach it:
1. Questions to Ask the Client Team
Understanding the client’s expectations and requirements is key. Here are some questions to guide your discussions:
Product Understanding
- What types of alerts will the system generate? (e.g., email, SMS, push notifications)
- What are the primary use cases for these alerts?
- Are there any existing systems we need to integrate with?
- What are the expected volumes of alerts? (e.g., thousands per day, real-time processing, batch processing)
- What kind of alert customization or filtering will users have?
- What are the key success criteria for this product?
Functional & Non-Functional Expectations
- Are there specific performance expectations? (e.g., latency of alerts)
- What are the critical failures we must avoid? (e.g., missed alerts, duplicate alerts)
- Are there SLAs (Service Level Agreements) defined for alert delivery?
- What kind of logging and monitoring should be in place for failures?
Security & Compliance
- Are there any compliance requirements? (e.g., GDPR, SOC2)
- What data is included in the alerts? Are there any data sensitivity concerns?
- What authentication/authorization mechanisms are needed for alert access?
Testing & QA Expectations
- What environments are available for testing? (e.g., dev, staging, production-like)
- Are there any existing test data or test cases available?
- Do you have any expectations regarding automated testing?
- How will we measure the quality of this product? (e.g., error rate, delivery success rate)
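On that last question: once delivery records or logs exist, a metric like delivery success rate is cheap to compute. Here is a rough sketch with an invented record format; map it to whatever your alert service actually logs:

```python
# Rough sketch: computing a delivery success rate from delivery records.
# The record format is invented; adapt it to your real logs/storage.
from typing import Iterable, Mapping


def delivery_success_rate(records: Iterable[Mapping]) -> float:
    """Fraction of attempted alerts that were actually delivered."""
    attempted = delivered = 0
    for rec in records:
        attempted += 1
        if rec.get("status") == "delivered":
            delivered += 1
    return delivered / attempted if attempted else 0.0


if __name__ == "__main__":
    sample = [
        {"id": 1, "status": "delivered"},
        {"id": 2, "status": "delivered"},
        {"id": 3, "status": "failed"},  # e.g. SMS gateway timeout
    ]
    print(f"delivery success rate: {delivery_success_rate(sample):.1%}")  # 66.7%
```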
2. Getting Started with QA Process
Since there’s no existing QA process, you’ll need to introduce a structured approach. Here’s a roadmap:
Phase 1: Understanding & Planning
- Gain access to requirements, wireframes, or prototypes.
- Identify core functionalities and define high-level test scenarios.
- Understand the release process and deployment strategy.
Phase 2: Setting Up Basic QA Foundations
- Define a test strategy (e.g., manual vs. automation).
- Start with exploratory testing to identify potential risks early.
- Document test cases based on requirements and user workflows.
- Set up test environments and ensure access to relevant tools.
Phase 3: Building a Scalable QA Process
- Introduce test automation where it makes sense (e.g., API testing for alert triggers; see the sketch after this list).
- Define key QA metrics (e.g., defect leakage rate, test coverage).
- Establish a reporting structure for defects, test execution, and quality risks.
- Collaborate with developers on unit testing and CI/CD integration.
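For the API-automation bullet above, a parametrized check across alert channels is a cheap way to start; the endpoint, field names, and channel list are assumptions to swap for the real contract:

```python
# Sketch: one parametrized API test covering each alert channel.
import os

import pytest
import requests

BASE = os.environ.get("ALERT_API_URL", "http://localhost:8080")


@pytest.mark.parametrize("channel", ["email", "sms", "push"])
def test_trigger_accepted_per_channel(channel):
    """Each supported channel should accept a basic trigger request."""
    payload = {"channel": channel, "message": f"{channel} check"}
    resp = requests.post(f"{BASE}/alerts", json=payload, timeout=10)
    assert resp.status_code in (200, 201, 202)
```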
Phase 4: Continuous Improvement
- Regularly refine test cases based on changes in requirements.
- Introduce load and performance testing if scalability is a concern (a rough sketch follows this list).
- Establish regression testing cycles before major releases.
- Gather feedback from stakeholders and users to improve test coverage.
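For the load-testing bullet, here is a very rough sketch that fires concurrent alert triggers and reports latency percentiles. For real load testing you would reach for a dedicated tool (Locust, k6, JMeter); the endpoint and payload are again assumptions:

```python
# Very rough load sketch: N concurrent triggers, then latency percentiles.
import os
import statistics
import time
from concurrent.futures import ThreadPoolExecutor

import requests

BASE = os.environ.get("ALERT_API_URL", "http://localhost:8080")


def trigger_once(i: int) -> float:
    """Trigger one alert and return the request latency in seconds."""
    start = time.perf_counter()
    requests.post(
        f"{BASE}/alerts",
        json={"channel": "email", "message": f"load test {i}"},
        timeout=30,
    )
    return time.perf_counter() - start


if __name__ == "__main__":
    with ThreadPoolExecutor(max_workers=20) as pool:
        latencies = sorted(pool.map(trigger_once, range(200)))
    print(f"p50: {statistics.median(latencies):.3f}s")
    print(f"p95: {latencies[int(len(latencies) * 0.95)]:.3f}s")
```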
These are some of the questions with which you can start:
- What is this project?
- For which client/customer is the project being developed?
- Is the client the end user, or will someone else be?
- In which country will this application be used?
- What legal and regulatory requirements need to be considered while testing the app?
- Who will be on the testing team?
- What are the expectations of me as a tester on the project?
- What is the expected time duration for testing?
- What kind of testing is expected: manual, automated, or hybrid?
- Are the design and PRD ready?
- Have the stakeholders signed off on the design and PRD?
- What kinds of testing are to be performed? For example, will accessibility testing also be part of the scope?
Depending on the answers to these questions, you can plan your testing strategy accordingly and work on a test plan.
However, since you are planning to start a project from scratch, I would suggest beginning with a RACI document, as it makes it transparent who is assigned which role and what is expected of everyone.
You can also review the skills of your testing team members and plan ticket allocation accordingly; for example, some people are good at API testing, some at UI testing.
If you are planning for automation, that needs another brainstorming session with the team about which tools and frameworks to choose, depending on what your team members are familiar with and how quickly they can upskill.
You can also start making flow-charts and mind-maps for the project; they will help you get a better understanding of it.
Wow, so many great answers from all of you!
I’m also a solo tester and QA on my team, and I often think that I must already know these things, but here I see that it’s great to ask these questions.
Thanks