Software Engineering in Automation

Speaking to interview candidates on the phone, or reviewing coding test solutions, I am struck by the disparity between the practices we expect of our fellow product engineers and those we follow for test framework code.

I had a look over the courses and material here on MoT and there isn’t much guidance or talk on the subject.
https://www.ministryoftesting.com/dojo/lessons/extra-extra-automation-declared-software-paul-grizzaffi?s_id=40208 – seems, from the description, to touch upon the subject.

Quote from a recent call: “unit tests are for developers. It is not industry standard to test our [the test framework] code”

What do others think?

Do you unit test your core framework code? Is it in version control? Do you use feature branches, code reviews, etc… to make changes?

If no to any of these, why not?

2 Likes

My general advice is that test code should be part of the production codebase, and if that is not possible, at least treated equally. As an example:

Let’s say you have versioning and releases for your production code: 1.0, 1.1 and 2.0. You need test code versioning and releases that are either 1.0, 1.1 and 2.0 as well, or something like TC1 -> 1.0, 1.1 and TC2 -> 2.0. The simple reason is that if you change the code under test between 1 and 2, your tests need to change too. If you hit a critical issue in version 1.1 and want to fix and test it, you have a problem if your tests do not map to your code.

Then there is testing the test code. I would say that your production code is in most cases a good enough test for your test code. I.e. if your test code misses bugs, or reports issues that are not issues, that is equivalent to a test failing. That being said, there is a test adequacy technique called Mutation Testing that could be added to your tests and that can help uncover where your test code does not cover the production code: https://en.wikipedia.org/wiki/Mutation_testing
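To illustrate the idea with a toy example (hand-rolled here; real mutation tools such as mutmut or PIT automate the mutant generation): a mutant that survives your test suite points at an assertion you are missing.

```python
def is_adult(age):
    """The code under test."""
    return age >= 18

def is_adult_mutant(age):
    """A hand-made mutant: the >= operator flipped to >."""
    return age > 18

def weak_suite(fn):
    """A suite that never probes the boundary value."""
    return fn(30) is True and fn(5) is False

def strong_suite(fn):
    """Adds the boundary check, which 'kills' the mutant."""
    return weak_suite(fn) and fn(18) is True
```

The weak suite passes against both the original and the mutant, so the mutant survives; the strong suite fails against the mutant, telling you the boundary is now covered.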

Also, if you are working more in the end-to-end space, you probably should have some unit tests for your test code, since end-to-end tests typically have more logic in them that you could test. The same goes for all helper functions, basically, just to save time when people change shared code in ways that might alter the quality of your tests.
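For example, a small shared helper plus its unit test might look like this (everything here is a hypothetical sketch, not code from the thread):

```python
import unittest

def build_query_string(params):
    """Hypothetical shared helper used by many end-to-end tests."""
    return "&".join(f"{key}={value}" for key, value in sorted(params.items()))

class BuildQueryStringTest(unittest.TestCase):
    """Guards the helper so changes to shared code don't silently skew the
    end-to-end tests that rely on it."""

    def test_keys_are_sorted(self):
        self.assertEqual(build_query_string({"b": 2, "a": 1}), "a=1&b=2")

    def test_empty_params(self):
        self.assertEqual(build_query_string({}), "")
```

Run with `python -m unittest` as part of the same build as everything else.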

Finally, how to hire these people. You basically want a person who is 30% tester and 70% developer, or something like that. And in my experience it’s easier to find someone who is 30% tester and 0% developer, which I think is the reason for this topic.

Great topic!

Some interesting points, and nice to hear from another person in favour of more software engineering in automation. I am not sure I quite agree that the production code is good enough to test the test code, but I guess it depends on what code you mean.

As for hiring, sadly it is difficult. It was the spark for the topic, but not the reason. I just feel software engineering practices for automation are not really given much space. There is a lot on how to get started in coding for testers, but not so much for those who already have experience, or on promoting good software engineering in the domain.

Interesting point on percentages. I would count myself as 100% tester and when I write code for testing (tools, frameworks, etc…), I do so as a 100% software engineer. But maybe you meant something different.

Thanks for sharing your thoughts. It sometimes feels lonely as a Software Engineer in Test. Stuck between worlds :slight_smile:

1 Like

What I mean by the percentages is how much of each domain you know. I.e. test strategies, test techniques, test methods and test tools vs. design patterns, software architecture, version management, build tools, development tools and so on. And of course ideally you should know it all. But given limited time and energy, I would look for people who have a little more than a basic understanding of test and a very sound understanding of software engineering practices.

You may feel lonely now, but you are part of a growing field.

1 Like


There is no clear separation of test and production code at my company.

Where possible, end-to-end tests, both UI (Selenium) and API, are written in the same language (C#) as the code they are testing. They are checked in to source control via pull request review by the developers writing the production code.

Where frameworks have been written to support testing, e.g. for case generation, unit tests are written to check functionality. Given how rarely this code changes, these unit tests are generally not run as part of builds, although I’m starting to think they should be as my team grows.
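As an illustration of the case generation idea (all names invented here, and Python rather than C# for brevity), a generator like this is exactly the kind of framework code that rewards a few unit tests:

```python
import itertools
import unittest

def generate_cases(browsers, locales):
    """Hypothetical case generator: one test case per browser/locale pair."""
    return [
        {"browser": browser, "locale": locale}
        for browser, locale in itertools.product(browsers, locales)
    ]

class GenerateCasesTest(unittest.TestCase):
    """Cheap insurance against someone quietly dropping half the matrix."""

    def test_full_product(self):
        cases = generate_cases(["chrome", "firefox"], ["en", "de"])
        self.assertEqual(len(cases), 4)
        self.assertIn({"browser": "firefox", "locale": "de"}, cases)

    def test_empty_dimension_yields_no_cases(self):
        self.assertEqual(generate_cases([], ["en"]), [])
```

A bug here silently shrinks or skews every suite built on top of it, which is why running these in the build matters.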

The key point for me is that developers are responsible for the general maintenance of tests: if a functional change breaks tests, modifying the tests is the responsibility of the developer concerned, unless the change is significant enough to specify a separate system test change backlog item.

We only have a handful of additional non-C# (Postman) tests running as smoke tests.

For me, minimising the friction for running and fixing tests during development is important.

I have always set myself the goal of having my test code at least as well engineered as the code it is testing.

2 Likes

Great to hear, Alexander Dunn.

The key point is a great one, and something we do too. There should be no separation of duties. If tests break, or need adding, updating or deleting, the product developer should do so as part of the same ticket of work.

For framework unit tests, we trigger the tests whenever the code changes; no remembering to run tests. But it’s great that you have some. It’s a practice to be encouraged in software engineering for testing.

I presume you are the same Alexander Dunn I met a year ago, if so then no surprise at awesome practices.

I’m going to page @friendlytester for this discussion because I know he has some stories to tell of interviewing people who build automation frameworks who are unable to articulate principles like abstraction, encapsulation, polymorphism, etc. That’s why we created a free course on our AiT site to help individuals brush up on these terms: https://automationintesting.com/programming/course/

It’s important that we are able to understand, implement and share these concepts with our teams to help increase awareness and support for automation efforts as they should always be owned by the team.

My hope is that this is something the Dojo can help with. I am actively looking for people to teach others not just how to use a tool, but the principles to consider when using it and putting a name to activities that some individuals already do without realising.

Some good resources on the free course link, thanks. Though again, not so much from the code quality angle. I would maybe expect to see something on unit tests, mocking, TDD, etc… rather than, say, Big-O notation.
I have heard many interesting interpretations of what TDD is from people who have it on their CV.

I liked the comment:

diving straight into a language or tools/frameworks, googling and stackoverflowing their way to something that ‘works’,

That is sadly true of how the page object model is used by many.
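For what it’s worth, the pattern itself can be applied with care. A minimal, hypothetical sketch (FakeDriver and all the names here are invented) of a page object that keeps selectors out of the test scripts and is itself unit-testable:

```python
class FakeDriver:
    """Stand-in for a real WebDriver; records the actions a page object makes,
    so the page object can be tested without a browser."""
    def __init__(self):
        self.actions = []

    def type(self, selector, text):
        self.actions.append(("type", selector, text))

    def click(self, selector):
        self.actions.append(("click", selector))

class LoginPage:
    """Locators and behaviour live in one place (DRY); the test scripts
    never touch a selector directly."""
    USERNAME = "#username"
    PASSWORD = "#password"
    SUBMIT = "#login-button"

    def __init__(self, driver):
        self._driver = driver

    def log_in(self, user, password):
        self._driver.type(self.USERNAME, user)
        self._driver.type(self.PASSWORD, password)
        self._driver.click(self.SUBMIT)

# A test of the page object itself: no browser, just recorded actions.
driver = FakeDriver()
LoginPage(driver).log_in("jack", "secret")
```

The point is not the fake driver; it is that a page object with a single home for each selector survives UI changes, where googled-together scripts do not.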

There seems to be a lot of material and courses on how to get started in automation, but it would be lovely to see more focus on helping automation testers embrace software engineering practices, e.g. SOLID principles, DRY, unit testing your code, code reviews, naming, etc…

If we are building tools or frameworks that help us give the green light to releasing our production code, then it needs to be at a quality level at least equal to that of the production code.

1 Like

Oh and I don’t mean to sound highly critical or holier-than-thou. I have written many tools, frameworks, etc… that would fail to meet these standards.

I have been lucky enough to soak up these practices and beliefs from working with supportive, excellent production software engineers, which I appreciate not everyone has access to; hence my original statement for the thread on the lack of material in the automation space.

Well, from the get-go, I always try to keep the framework simple. Any overly complex code, if not needed, should be avoided. When you have a big team and lots of people working on the same repo, following a good branching strategy and doing static analysis with something like SonarQube should be enough to keep code smells, bugs and other unwanted stuff out of the project.
When you start testing the code that tests…all hope is lost.

Well, I won’t wax lyrical about the many benefits of unit testing when writing any software; there are plenty of books, blogs, etc… on that subject.

It does seem to be the engineering practice I have the most conversations about, with developers justifying why their code doesn’t need them. But I haven’t lost hope…

Coincidentally, this week I am rereading Chapter 9 from Clean Code for an internal reading group and love the bit:

Test code is just as important as production code. It is not a second-class citizen

Although I am 100% with you on the KISS principle; over-engineered solutions are a major risk to quality…

1 Like

I really have to disagree here. The only way to avoid writing any tests is to build your framework entirely from published libraries, and I am sure you wouldn’t even consider using them unless the maintainers have written and run tests.

The same applies where you have to write your own, typically a test case provider. The alternative often ends up as unmaintainable copy-paste test code that is a nightmare to fix when the original functionality changes.
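As a sketch of that alternative (hypothetical names, Python’s built-in unittest): a data-driven provider keeps one copy of the logic and a plain table of scenarios, so a functional change means editing rows, not hunting down copies.

```python
import unittest

def cart_total(prices):
    """Hypothetical code under test."""
    return sum(prices)

# One table of scenarios: adding a case is a one-line change,
# not a copy-pasted test method that rots when the code changes.
CASES = [
    ("empty cart", [], 0),
    ("single item", [5], 5),
    ("several items", [5, 3, 2], 10),
]

class CartTotalTest(unittest.TestCase):
    def test_totals(self):
        for name, prices, expected in CASES:
            with self.subTest(name):
                self.assertEqual(cart_total(prices), expected)
```

Each row reports as its own sub-test, so one failing scenario does not hide the others.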

Oh and hi Jack, yes it is the same Alexander. I didn’t realise it was you when I first answered.

You won’t avoid errors by using published libraries either. What I’m saying is that there is a line between test code and having another app to test. Keeping the frameworks simple and tidy, with some static analysis and good pull request practices, should be enough to avoid having testception within the automation scripts.

I wasn’t meaning to suggest that any libraries are bug-free, simply that you are using tested test code whenever you use one.

When testing complex systems at the system level, it is not always possible to avoid complexity.

Perhaps it is a matter of language, but I would not recognise what we are discussing as automation scripts.

A script provides a description of a single test scenario, something I cannot imagine requiring a unit test.
A framework should provide the ability to reuse the same code for testing multiple scenarios. The most complex parts may benefit from being written with TDD, and I see no downside to keeping those tests in source control. I will happily trade a little complexity for the ability to reduce duplication and enhance maintainability.

2 Likes