CS and SoftEng degrees without Dedicated Modules on Testing / QA

I’ve mentioned this before on Twitter, but thought I’d mention it here too. So many CS and Software Engineering degrees don’t have any dedicated modules for testing and QA, despite the fact that most of their grads go into industry as devs and other techies, and every development team either works with testers or does some of its own testing. The more prestigious the university (and the more sought after its grads), the less interest there is in software testing as part of the curriculum.

As an example, the undergrad CS tripos for 2016-2017 at one of the world’s super-elite universities, Cambridge, has no dedicated module for QA and the only mention of it is in a single Software Engineering module. The closest there is to a QA module is third year Hoare Logic and Model Checking!


However, you can still take modules such as Complexity Theory, Unix Tools and Denotational Semantics.

Does anyone else find this odd and very worrying?


It leaves the door open for organisations such as ISTQB to corner the market on software quality and training, which may be concerning.

But as someone who’s aiming to study for a master’s degree in the next few years, I’ve been looking at Software Engineering degrees with a strong QA element. The course leaders I’ve spoken to have all been welcoming to the idea of a research thesis in software testing.

I would like to see dedicated software testing degrees one day, indeed I’d like to have a hand in designing and teaching them. Though it remains a dream for now, it’s good to have something to aim for in life at least :slight_smile:


Testing is only one element missing from degree courses, and just one symptom of the limited value those courses deliver.

Even though the courses do lean towards programming, there are many aspects of modern software development that are missing entirely. I was so fed up with graduates without the right skills that I started offering apprenticeships as an alternative. These apprenticeships were for 16-17 year olds, not the re-branding of interns as apprentices that has gone on elsewhere.

Our first batch was so successful that I was asked to contribute to a government report on STEM education within the UK. One conclusion of the report was that the average STEM degree was at least a decade out of date with industry practice.

There has been a response (a slow one) to this problem in the UK with the introduction of degree apprenticeships and new apprenticeship subject areas and curricula. This even includes software testing, but not enough people from the testing community participated in setting the curriculum, so it ended up a little too similar to ISTQB.


I used to see it as odd and worrying. I even asked how could we get it into the curriculum.

Since that post I’ve thought a lot about what I learned in the first year of my CS modules vs what I used every day when I was a developer. I’ve also spoken to the people who continued on with CS modules past first year (I moved to pure science). They said that testing was taught as part of their curriculum but largely they don’t use the teachings now.

I know some universities are looking to integrate it into their degrees, but having spoken to people who have come out of these degrees, I wonder about the format it would be taught in. Many don’t use the development teachings at all, because the material was outdated by the time they graduated. By the time students graduate, will the testing teachings be outdated too? Since testing doesn’t really have a best practice (I use this term loosely but may hang myself with it), who will decide what the university should teach? Is it good or bad if the teachings vary wildly between universities?

I agree that it should be at least mentioned and not with the “sure testers will fix your mess” type of approach. I’m just not sure of the usefulness of it in a university setting over perhaps an apprenticeship or learning on the job.


I do have a hypothesis on why universities are generally bad at teaching in relation to industry: staff are often measured on how many papers they produce from research, not just their teaching. For many staff, their primary interest is research and they have to do teaching as part of the job. The two elements are often conflicting requirements.

The University of Surrey recently offered some guest lecture slots to me and @mwinteringham, but since accepting, it’s disappeared down an admin black hole.


That sounds about right @alan.parkinson. Add to this the tendency of the research to go into areas that have little if anything to do with current industry practice, and you’ve got people teaching who are way out of sync with employment practice.

Throw a little bums on seats funding into the mix, and you’ve got a recipe for seriously sub-optimal education regardless of the field, but particularly in fields where current knowledge and practice shifts as rapidly as it does in software development and in testing.


To be fair to the universities and playing a bit of Devil’s Advocate, they would argue that they teach academic CS that focuses on base principles and fundamentals - discrete maths, data structures and algorithms, compiler theory, computer architecture, type theory, automata and state machines etc. Their focus has traditionally been on presenting an academic discipline (like Physics or Pure Maths) over “job training” as such.

The reason why this looks different from what we may call current industry practice is that the above have been successively abstracted from and made less relevant to us with frameworks and high level languages.


I completely agree @paulmaxwellwalters. The problem as I see it is that many of the people taking the CS degrees expect to learn current industry practice and receive what could be considered “job training”. The way so many job postings require a CS degree doesn’t help.

The unfortunate result is that the principles become divorced from the practice, which in turn leads to inaccurate ideas about what software development is and what testing is.

And I stand by my comment about bums on seats funding. I don’t know how to fix the funding issues in higher education without making things worse, but funding universities and colleges based on the number of students just encourages sub-par education. That, however, is a rant for another day.

It’s not just job postings causing the problem; schools and government are very much marketing degrees as job training and a ‘must have’. Certainly in the UK, the destruction of vocational-technical qualifications since 2000 has left degrees as the only option for people to get into the industry. Luckily, in the last 4 years they have realised the damage and are starting to reverse the situation, but I think we are decades away from fixing that problem.

I don’t actually disagree with @paulmaxwellwalters’ Devil’s Advocate position of teaching “base principles and fundamentals”, but the academics have become boiled frogs… the market and the expectations of industry, customers (students) and government have shifted since 1994, but the way they work and teach hasn’t kept pace.


I agree with you both, @katepaulk and @alan.parkinson. The lack of a respected alternative to CS and SE degrees as entry routes into IT (although modern and degree apprenticeships may start to change this), the mixed response to coding bootcamps and the tendency for some companies to fall back to a default recruitment requirement of a CS degree allows university CS departments to rest on their laurels. IT is very unusual in that the real experts in the field are not academics linked to universities, as you would have in History or Physics, but practitioners in companies using cutting-edge technologies and processes. Academics in IT get cloistered in ivory towers very quickly.

However, @alan.parkinson, I would say that it isn’t necessarily the case that degrees are seen as the only option for getting into IT. IT is very unusual in that a large number of its practitioners were self-taught or came in via industry certs. It isn’t as easy as in the past, but it is still possible to get an entry-level dev job somewhere on self-teaching and a strong GitHub profile. That would never happen in, say, Law or Finance.


@paulmaxwellwalters I understand there are alternatives (we run apprenticeships), but what’s promoted at school, and what the majority of companies’ hiring policies say, is that you need a degree. It takes a brave soul to go against that kind of grain.

I think you are right up to a point, and I also think the pendulum will shift towards more testers needing CS degrees in the future. However, the one thing that makes IT unique is that, unlike most other professions, the cost of self-learning and of proving your skills is really cheap.

It is still quite possible to sneak into a programming job through networking, a good GitHub portfolio and a few decent performances in hackathons. IT also has great diversity in company size, budget and need. A poor startup may be willing, or forced, to take a chance on someone with raw skills who wouldn’t get past the degree filter at a Big 4 firm or consultancy.


Perhaps, Paul, that’s how startups grow and evolve - by casting the net wider and taking in all sorts of people where the company perceives talent no matter what it looks like. Having set ideas about candidate profiles, fitting square pegs into square holes and recruitment run by HR templates rather than managers acting on hunches might just be a sign that a company has grown too successful to be truly innovative any more.