What Elements of a QA Program Are Suitable to a Startup?

I work for a company that is technically a startup, with about 12 employees. I was hired as a DevOps Engineer, but they also want me to "take the ball and run with QA" despite my lack of actual QA experience. I have been searching for and reading as many QA resources as I can find, but a lot of what I'm reading either requires a team rather than just me, or is overkill for our small company.

We have four products that require testing:

  • A custom Linux filesystem
  • A CLI server and client program
  • A NodeJS/React web application that communicates with the CLI server's API
  • A custom S3 object storage solution

Each of the four projects is managed by one or two people who do things their own way, and it's really up to me to accommodate the way they want to work rather than trying to bend them to my will. Because of this, I can't just create one standard workflow, automate it, and apply it to all of the projects. I need to start small, building up a QA program without trying to change too much too quickly.

Based on what Iā€™ve read, Iā€™ve only been able to come up with the following steps I think would be relevant to a startup:

  1. Create a simple list of QA responsibilities so the rest of the company knows what to expect (and what not to expect) from QA. For example:

    • Test for regressions
    • Detect and report defects
    • Test in accordance with each project's quality expectations
    • Provide feedback to developers
  2. Work with each project owner to come up with a list of 5 or so expectations of "quality" for their project. Some examples:

    • No regressions
    • All inputs validated
    • Meet defined minimum performance goals
  3. Based on the expectations, create a traceability matrix for each product, with the goal that a product isn't a release candidate unless all the tests in the matrix have passed.

  4. Once I have a handle on the above, start automating some of the repetitive steps.
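To make step 3 concrete, here is a minimal sketch (my illustration, not an established format) of a traceability matrix as plain data, with a release-candidate gate over it. The requirement IDs and test names are made up:

```python
# Sketch of a requirements traceability matrix (RTM) as plain data.
# Requirement IDs and test names are hypothetical placeholders.
rtm = {
    "REQ-001: No regressions":            {"test_regression_suite": "pass"},
    "REQ-002: All inputs validated":      {"test_cli_input_validation": "pass",
                                           "test_api_input_validation": "fail"},
    "REQ-003: Minimum performance goals": {"test_p95_latency": "pass"},
}

def is_release_candidate(matrix):
    """A build is a release candidate only if every mapped test passed."""
    return all(result == "pass"
               for tests in matrix.values()
               for result in tests.values())

print(is_release_candidate(rtm))  # False: one validation test failed
```

Even a spreadsheet with the same shape (requirements down the side, tests across the top) works at this scale; the point is that "release candidate" becomes a mechanical check rather than a judgment call.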

That's pretty much all I've been able to come up with. I think it's a good start, but I feel like I might be missing a lot. It's one of those situations where I don't know what I don't know, so I'm hoping I could get some feedback from folks who do this for a living.

Thanks in advance!


If you are part of a startup and want to set up QA, you can consider the following suggestions:

  1. Choose the right tool to maintain your test cases.
  • Write test cases according to your product requirements, including use cases, test plans, and scenarios.
  2. Set up a bug management system.
  • Keep the bug base and the related test cases together to support regression testing.
  3. Use performance testing tools like JMeter.
  • For load testing or front-end performance checks, JMeter is handy, and simple scripts to simulate user load can be created.
  4. Keep Confluence pages where you can document your findings for future reference.
  5. Reports of your work in each sprint, plus retrospective documentation, will help you improve future tasks.
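JMeter is the tool suggested in item 3; purely as an illustration of what a minimal load check measures, here is a Python sketch that fires concurrent requests against a stubbed request function and reports latency statistics (the 10 ms "server work" and all numbers are invented; swap in a real HTTP client in practice):

```python
import statistics
import time
from concurrent.futures import ThreadPoolExecutor

def fake_request():
    """Stand-in for an HTTP call; replace with a real client in practice."""
    start = time.perf_counter()
    time.sleep(0.01)  # simulate ~10 ms of server work
    return time.perf_counter() - start

def load_test(workers=10, requests=50):
    """Fire `requests` calls across `workers` threads, return latency stats."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        latencies = list(pool.map(lambda _: fake_request(), range(requests)))
    return {
        "count": len(latencies),
        "mean_s": statistics.mean(latencies),
        "p95_s": statistics.quantiles(latencies, n=20)[-1],  # ~95th percentile
    }

stats = load_test()
print(stats)
```

A real JMeter test plan does the same thing at much larger scale, but a throwaway script like this is often enough to catch gross regressions in a small team.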

I hope this information helps streamline your processes.


Don't overlook running end-to-end tests that replicate the final user workflow. You'll need to do this at some stage anyway - otherwise, the first time some bugs manifest themselves will be in a sales demo, and believe me, that's not the right time to find bugs!
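One way to keep an end-to-end test honest about the user workflow is to script it as ordered steps and fail fast, the way a real user would get stuck. A minimal sketch (the step names are hypothetical placeholders; each would drive the real product via CLI calls or API requests instead of returning True):

```python
# Sketch of an end-to-end test mirroring a user workflow.
# Step names are hypothetical; each stub stands in for real product actions.

def step_upload_file():      return True
def step_list_objects():     return True
def step_download_file():    return True
def step_verify_contents():  return True

WORKFLOW = [
    ("upload a file",        step_upload_file),
    ("list stored objects",  step_list_objects),
    ("download the file",    step_download_file),
    ("verify file contents", step_verify_contents),
]

def run_workflow(steps):
    """Run steps in user order; stop at the first failure, like a user would."""
    for name, step in steps:
        if not step():
            return f"FAILED at: {name}"
    return "PASSED"

print(run_workflow(WORKFLOW))  # PASSED
```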


A couple of things.

An outlier first thought: here is a way of looking at things that could align with your current role. In DevOps, a lot of your focus is on accelerating the team and the product as a whole. Consider both QA and testing as having the exact same goal, but with a bias towards maintaining acceptable quality along the way. That may help you align your thinking and your application of both, since you are already familiar with the common goal.

First, the 12 employees. This is a small group that you can engage to establish what problems you are trying to solve, and then work together on covering those things. The most agile teams I have worked with never mentioned the word "agile" nor needed heavy process; they just talked regularly with each other and got things done.

You're DevOps but want to run with QA. You will need to define what QA means in that context.

Are they looking for, say, a quality management plan for what happens within the DevOps cycle? That would normally include, at a minimum, business goals, quality assurance, testing itself, and how lessons are learned and acted upon. Here it is mostly a matter of talking with the team, agreeing on what they believe are good practices, and providing a plan that supports them in choosing those practices, applying them well, and learning from things when chances arise. For example, the plan may include code reviews and retros; you trust the team on how these are done, but you guide them to apply good practices in general.

As a startup you want to encourage good development practices from day 1. Things like code review, and developers writing automated coverage themselves, can save so much time and effort later when features get out of control.

Now, if by QA you actually mean testing, then that's a different thing. Good developer practices are still key to long-term efficiency, but you may also benefit from having an actual professional tester on board.

Bringing in a tester before good developer practices are in place carries a risk: developers will want to hand over things to the tester that they really should not. Both testing and automation are examples of things I have seen developers want to pass over as soon as they can; once they do that, they may as well pass on quality ownership too as they slide down that dysfunction slope.

For me, with a small group, I'd focus on good developer practices first, with testing support focused on risks that either need deeper investigation or, primarily, on the very human fallibility of developers, i.e. catching the things they may naturally miss.

In all of the above, a list of potential risks can be a great discussion point to guide your next steps.

I would not bother with test cases or test case repositories at the moment; they carry a lot of waste and often only become needed when you have got the basics wrong in the early stages of product development.


Oh, welcome to the MOT community James. Really cool choice of handle there, hope you do find the community builds confidence in your QA project choices.

Hmmm. Small engineering team. And a startup.

Until you are anywhere near getting anything out to a "customer", QA is very different from QA once you are shipping. So there is that: you might be coming from a base where QA was only concerned with finding the easy and glaring communication gaps between four teams, to "OK, so now we have a product". There is a lot of shift to deal with. Hence, shift gently.

I would try to pare it back a bit. Quality is everyone's responsibility; in a small company I would emphasize that, and take steps to make it easier for each mini-team to do their own testing by providing them with great CI/CD tooling. I would also not get fussed with dashboards; they just cost time in an ecosystem that will likely be fast moving.

Assume that things will change and break often. To this end, I would only build a few basic E2E tests, and design them in such a way that each team's components can be upgraded or downgraded (to eliminate integration/merge/feature-branch pain) and can all be deployed into staging environments. Environments cost money to build, but I would focus on making them as performant, and as useful to each team's needs, as possible within budget. One of these environments might be your stress/performance environment.

Write some tests, but try to make it the devs' responsibility to monitor and maintain them. Let them "own" their own quality bar; they can then add as little or as much other dashboard or pipeline fluff as they require to hit their targets. Work out what deployment tool works for you, and automate the ####t out of it until it makes all the teams happy.


Thanks for this recommendation. Current plan is to perform end-to-end tests against the "main" branch every morning, with the goal of automating them as I go.

From @andrewkelly2555

An outlier first thought: here is a way of looking at things that could align with your current role. In DevOps, a lot of your focus is on accelerating the team and the product as a whole. Consider both QA and testing as having the exact same goal, but with a bias towards maintaining acceptable quality along the way. That may help you align your thinking and your application of both, since you are already familiar with the common goal.

This description of QA from a DevOps perspective is exactly what I needed. I had a discussion with the interested parties, and the developers will ensure unit test coverage. I've come up with a draft Requirements Traceability Matrix (RTM). It's a draft because I am not personally aware of any official requirements, so I've posted it for the input of anyone who wants to provide it. I've mapped those requirements to test cases that exercise the basic functionality covering each requirement. The entire RTM will be treated as the "master" end-to-end test, which I'll run against the "main" branch every day and provide the results to the team. The requirement that I run the end-to-end test by hand is my punishment for not yet having automated it.
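For that daily RTM run, a tiny report generator can collapse per-test results into a per-requirement pass/fail summary for the team. A rough sketch (requirement and test names are placeholders):

```python
# Sketch of a daily RTM run summary; requirements and tests are placeholders.
results = [
    ("REQ-001 server accepts valid uploads", "test_upload_ok",    True),
    ("REQ-001 server accepts valid uploads", "test_upload_large", True),
    ("REQ-002 client rejects bad arguments", "test_cli_bad_args", False),
]

def summarize(run):
    """Collapse per-test results into a per-requirement pass/fail report."""
    by_req = {}
    for req, _test, passed in run:
        # A requirement passes only if every one of its tests passed.
        by_req[req] = by_req.get(req, True) and passed
    lines = [f"{'PASS' if ok else 'FAIL'}  {req}"
             for req, ok in sorted(by_req.items())]
    failed = sum(1 for ok in by_req.values() if not ok)
    lines.append(f"{len(by_req) - failed}/{len(by_req)} requirements passing")
    return "\n".join(lines)

print(summarize(results))
```

While the run itself is still manual, the report at least stays uniform from day to day, which makes trends visible to leadership.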

That gives me enough confidence to tell leadership that a potential release is expected to at least meet every requirement described in the RTM. Does this sound like a good start?

We have different projects with different project owners, who are also the lead developers for those projects. They're not traditional PMs, so there's sporadic Jira usage, or none. Everyone has their own way of doing things. It's not a complaint; it's actually what makes the culture and products so successful, and I don't mind taking on the challenge of figuring out how to provide important information without burdening the SWEs.


Yes, it does sound like a good plan. I would want to be in your shoes, or on your team, myself. Just be sure to pace yourself, and treat everything as a serious experiment that you intend to get some feedback or outcome from. I long for the joy of startup energy.

Thanks for helping with this. Very useful and interesting.
