I used to see it as odd and worrying. I even asked how we could get it into the curriculum.
Since that post I’ve thought a lot about what I learned in the first year of my CS modules versus what I used every day when I was a developer. I’ve also spoken to people who continued with CS modules past first year (I moved to pure science). They said testing was taught as part of their curriculum, but they largely don’t use those teachings now.
I know some universities are looking to integrate it into their degrees, but having spoken to people who have come out of these degrees, I wonder about the format it would be taught in. Many don’t use the development teachings at all, because the material was already outdated by the time they graduated. By the time students graduate, will the testing teachings be outdated too? Since testing doesn’t really have a best practice (I use this term loosely but may hang myself with it), who decides what a university should teach? Is it good or bad if the teachings vary wildly between universities?
I agree that it should at least be mentioned, and not with the “sure, testers will fix your mess” type of approach. I’m just not sure how useful it would be in a university setting compared with an apprenticeship or learning on the job.