Technical tests for testers?

The jobs market in Belfast is pretty busy right now. A lot of companies are looking for testers. A lot of those companies are also adding technical tests to their interview process.

I think technical tests are becoming a lot more common, and not just in Belfast. In your experience of hiring testers, what kinds of technical tests have you set? What were you hoping to achieve by setting them?

I’ll start off with my own example; it’s not a test per se. I used the idea behind my blog post Dynamic Tables in Automation. The application I used to work on was full of dynamic tables. I wasn’t necessarily interested in the candidate’s automation experience, but more in the thought process they applied to find something in a table. It was more a pairing exercise: they could keep asking me questions, everything except “what should I do?”.

I would start with an example table from w3schools. I would let them know that the table headers would always be the same but the contents would change. They would need to think about writing some code to search the table. I would then ask them something like “If you wanted to find out how many companies there were in the table in Germany, what would you do?”.

The answers to this really varied, and it was an interesting way to get an insight into the candidate’s thought process.
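For illustration, here’s one way a candidate might sketch that search in Python using only the standard library. The table below is a cut-down stand-in for the w3schools Customers table (same Company / Contact / Country headers), not the real page:

```python
from html.parser import HTMLParser

# Minimal parser that flattens an HTML table into a list of rows,
# where each row is a list of cell strings.
class TableParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.rows = []
        self._row = None
        self._in_cell = False

    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self._row = []
        elif tag in ("td", "th"):
            self._in_cell = True
            self._row.append("")

    def handle_endtag(self, tag):
        if tag == "tr" and self._row:
            self.rows.append(self._row)
            self._row = None
        elif tag in ("td", "th"):
            self._in_cell = False

    def handle_data(self, data):
        if self._in_cell:
            self._row[-1] += data.strip()

def count_by_column(html, column, value):
    """Count body rows whose cell under `column` equals `value`.

    Keyed off the header text because, as in the exercise, the
    headers are stable while the row contents change.
    """
    parser = TableParser()
    parser.feed(html)
    header, *body = parser.rows
    idx = header.index(column)
    return sum(1 for row in body if row[idx] == value)

# Illustrative data only - the real w3schools table has more rows.
SAMPLE = """
<table>
  <tr><th>Company</th><th>Contact</th><th>Country</th></tr>
  <tr><td>Alfreds Futterkiste</td><td>Maria Anders</td><td>Germany</td></tr>
  <tr><td>Centro comercial</td><td>Francisco Chang</td><td>Mexico</td></tr>
  <tr><td>Ernst Handel</td><td>Roland Mendel</td><td>Austria</td></tr>
</table>
"""

print(count_by_column(SAMPLE, "Country", "Germany"))  # prints 1
```

The interesting part for the interviewer isn’t the parser, it’s whether the candidate reaches for the header names rather than hard-coded column positions.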


As one of those currently hiring in Belfast and struggling to find good candidates, this is a very interesting topic for me. That being said, I’m fairly happy with the technical approach I’ve taken to the interview / testing process.

For me, it all began with defining what exactly I meant by ‘technical’ - did I mean the ability to code automation frameworks, create them from scratch, utilise existing ones, make good use of CI/CD, get involved in DevOps, or really just fully understand the domain of testing and how to think critically? I settled on the latter. The former (automation, frameworks, DevOps etc.) can be taught fairly easily. The latter, not so much.

So, the test approach I have in our recruitment process homes in on the critical thinking of a candidate. You have an hour. Here’s a well-known thing. Come up with some tests for it in whatever approach you want - the candidate is free to write an automated script if they want, a Gherkin-based feature file is fine, or even scrawled notes on what they would test. I frankly don’t care; what I’m looking for is their approach to testing. That’s the technical skill I’m looking for.


I really like technical tests for testers at interviews, but I also think these tests can be really great exercises to develop your skills and knowledge even if you aren’t applying for jobs.

“Greatness comes from practising; applying the theory over and over again, using feedback to get better every time”

Below is an assignment I was given after applying for a mid-level tester role, and it’s an exercise (kata) I like to play around with every now and then. It’s interesting because each time I try the assignment my approach differs slightly, and I try to test using different techniques, tools and languages, so I find it a great learning exercise.

QA Engineer assignment

Please access the following sample application -

  1. Create a series of manual test cases that cover the CRUD operations plus the edge cases. Make sure you give detailed instructions for each test case (preconditions, steps, expected results). You can use any format you want.

  2. Write scripts that would automate the manual test cases that you see fit to be included in a regression test set. Please use any of the below programming languages:

  • JavaScript (preferred)
  • Java (preferred)
  • Python
  • Ruby
    (Please avoid frameworks that only record test cases.)
  3. When the assessment is completed, please push the file containing the manual test cases and the automation project to GitHub.
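Since the sample application link isn’t reproduced above, here’s a hedged sketch of what step 2 of the assignment might look like in Python. The `RecordStore` class below is a hypothetical in-memory stand-in for the application; against the real app you would drive its UI or API instead, but the test structure (one regression check per CRUD operation, plus an edge case) stays the same:

```python
import unittest

# Hypothetical stand-in for the (unspecified) sample application:
# a minimal in-memory CRUD store.
class RecordStore:
    def __init__(self):
        self._records = {}
        self._next_id = 1

    def create(self, data):
        record_id = self._next_id
        self._next_id += 1
        self._records[record_id] = dict(data)
        return record_id

    def read(self, record_id):
        return self._records[record_id]

    def update(self, record_id, data):
        self._records[record_id].update(data)

    def delete(self, record_id):
        del self._records[record_id]

class CrudRegressionTests(unittest.TestCase):
    """One check per CRUD operation, plus one edge case."""

    def setUp(self):
        self.store = RecordStore()

    def test_create_then_read(self):
        rid = self.store.create({"name": "Ada"})
        self.assertEqual(self.store.read(rid)["name"], "Ada")

    def test_update_changes_stored_value(self):
        rid = self.store.create({"name": "Ada"})
        self.store.update(rid, {"name": "Grace"})
        self.assertEqual(self.store.read(rid)["name"], "Grace")

    def test_delete_removes_record(self):
        rid = self.store.create({"name": "Ada"})
        self.store.delete(rid)
        with self.assertRaises(KeyError):
            self.store.read(rid)

    def test_reading_missing_record_is_an_error(self):
        # Edge case: an id that was never created.
        with self.assertRaises(KeyError):
            self.store.read(999)

if __name__ == "__main__":
    unittest.main()
```

The assignment’s note about avoiding record-and-playback frameworks points the same way: hand-written checks like these make the candidate’s reasoning about what belongs in a regression set visible.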

I think it would be a great group learning exercise if others tried the assignment and shared their results so we could discuss the different approaches we all took.

This was my attempt at the above assignment.


Similar to Neill’s thoughts, I’d be asking what they really mean by ‘technical’. Maybe a decade ago, when the idea of SDETs was on the rise, it was taken to mean ‘can code and build automation frameworks’, but these days, in many companies, I think that is no longer the case.

I’ve recently switched to having a preference for highly technical investigative testers but I would not be asking them to code anything or build automated checks as part of a recruitment process.

I would want to assess whether they can hold technical discussions, utilise technical tools as part of their testing investigations, and provide solid technical information when they communicate issues, alongside being able to leverage their own technical awareness to see risks for investigation that they might not otherwise spot.

The emphasis on the technical side, in addition to the core investigative testing skills, is something I am still working out, so I’ll keep an eye on this discussion for ideas.


In my opinion, the most critical technical skill for a tester is the ability to understand how the system’s components work together and where potential problems can arise.

This can be assessed in an interview in various ways.

First of all, the tester should be able to read and understand basic code. You can offer a sample portion of code - one or two classes - and then ask the tester to find the bugs (which were intentionally left there :slight_smile: ) and vulnerabilities in the code. Of course, the tester should express their thoughts out loud while analysing the code.
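To make that concrete, here is the kind of small, deliberately buggy snippet (in Python here, but any language the team uses would do) that could be handed to a candidate. The bugs are planted on purpose and marked in comments for the interviewer; a strong candidate should spot both while thinking aloud:

```python
# Interview artifact: a shopping basket with two planted bugs.
class Basket:
    def __init__(self):
        self.items = []  # list of (name, unit_price, quantity)

    def add(self, name, unit_price, quantity=1):
        self.items.append((name, unit_price, quantity))

    def total(self):
        total = 0
        for name, unit_price, quantity in self.items:
            total += unit_price  # bug 1: quantity is ignored
        return total

    def apply_discount(self, percent):
        # bug 2: no validation - percent > 100 yields a negative total
        return self.total() * (1 - percent / 100)

basket = Basket()
basket.add("pen", 2.0, quantity=3)
print(basket.total())              # prints 2.0, but should be 6.0
print(basket.apply_discount(150))  # prints -1.0 - a negative price!
```

What you learn isn’t just whether they find the bugs, but how: do they trace the data through `total()`, or do they immediately try boundary values like `percent=0` and `percent=150`?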

Secondly, the tester should be able to create basic test automation scripts at the UI / API level. The assessment doesn’t need to be hard, but it should demonstrate the whole spectrum of the candidate’s programming skills. @vivrichards’ example is really good for this.

Thirdly, general tools knowledge - code review tools, Git / SVN, etc.

For more senior testers, it’s expected that they will be aware of the different types of system architecture.

Personally, I think that asking software testers to implement algorithms is generally not applicable, unless your software depends entirely on highly efficient computing and performance.


I kind of agree and I kind of don’t :stuck_out_tongue:

I think it depends a lot on what the tester is going to be doing on a day-to-day level. I’m not too worried if a tester can’t understand what’s happening in the code, as long as they can understand the flow through the application.

If a tester can expertly deduce the flow through an application and/or process and see where potential issues are at that level, then I’d be very happy with them joining my team. Customers don’t see code. Customers don’t see how components of an overall system fit together; customers see one application / process / cloud system and handle it from there. I’d want my testers to be able to do that. If they can do more, then great.

What @andrewkelly2555 said nailed it for me - I’d like to assess whether they can hold their own in a technical discussion and understand the tools that the job spec calls for.


In my previous comment I concentrated on describing the technical “type” of tester. Of course, in order to get the most value from the testing team, you also need testers with a customer view and deep domain knowledge. And that kind of tester may still have some coding skills - at least enough to run an automation suite and interpret test results.

I’d like your answer multiple times over if I could, Neill. If you’re not hiring someone to build an automation framework from nothing, you don’t need to know how well they can code (or in which language). You need to know if they can look through a situation and find potential weaknesses or inconsistencies. You need to know if their reaction to something unexpected is to create a bug report immediately (not good) or to dig deeper, trace as much information as they can about the issue, and make a reasoned decision about whether or not what they found should be reported. Those are the technical skills that most testers, even automation testers, use far more than actual coding skills.


I had to do QA interviews before and came up with some options that I blogged about, you might find them useful (or not). These are more geared to be part technical in terms of coding skill capability, part investigative/debugging capability, part architectural design/modeling skill/understanding, part creative thinking. It’s nice to find well rounded tech testers.

Mark Siemers’ talk Refactoring the Technical Interview is very interesting: he shows better ways to prepare questions in order to filter for candidates who can really jump onto the team, by showing them real code from the company’s codebase and focusing on refactoring and understanding.
The examples in this thread seem very aligned with this, but unfortunately many people just go with Cracking the Coding Interview-style questions.

I don’t believe in technical tests for testers; I think it’s far more effective to discover how the candidate thinks.
Describe an impossible mission to accomplish (for example, a problem with no solution) and let them speak their mind. A tester should think outside the box by their very nature; they should come up with many different, interesting, impossible… solutions to the mission.
They should amaze, surprise and shock you in many ways.

Deep down, this question is the same as the “classic” one: attitude vs ability (and I prefer the former over the latter).