In my view there is no such thing as “best” practices. There are a lot of good practices, some of which contradict each other, because what is best for one situation and type of software is not necessarily best for a different situation.
For instance, the best practices for testing embedded software in an MRI system are going to be completely different from the best practices for testing casual games played in a browser. These will be different again from the best practices for testing social media sites. And so forth.
What is best for a particular application depends on what the application does; whether it serves a heavily regulated industry; whether anyone's life depends on it working correctly; customer and user expectations of the application (payroll software has a much higher requirement for accuracy than casual game software); and company culture - as well as many other factors.
Trying to eliminate all bugs might be feasible - or even required - for software that monitors the release of controlled medication to a patient. For casual games, it's not worth trying; it's better to focus on eliminating anything that interrupts the flow of the game, plus as many of the more annoying remaining bugs as possible, because otherwise a competitor will get ahead.