At most companies I’ve worked at, we needed some form of production testing, or we ended up dumping prod data into lower environments. It isn’t the only source, but real usage data, the kind that lives in prod, is a major source of bugs and missed test cases. Staging environments are useful because they should mirror production, but in practice that almost always means the infrastructure only. I’ve rarely seen high-quality, realistic data in staging unless it was dumped from prod.
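If you do copy prod data down, the part worth automating is scrubbing sensitive fields on the way. Here’s a minimal sketch in Python, assuming rows come through as dicts and using a hypothetical `PII_FIELDS` set you’d adjust for your schema; deterministic hashing keeps uniqueness and join behavior close to what prod has:

```python
import hashlib

# Hypothetical PII columns; adjust for your actual schema.
PII_FIELDS = {"email", "full_name", "phone"}

def pseudonymize(value: str) -> str:
    """Deterministic hash so uniqueness and joins still behave like prod."""
    return "redacted_" + hashlib.sha256(value.encode("utf-8")).hexdigest()[:12]

def scrub_row(row: dict) -> dict:
    """Copy a row, replacing PII values and leaving everything else intact."""
    return {
        key: pseudonymize(str(val)) if key in PII_FIELDS and val is not None else val
        for key, val in row.items()
    }

prod_row = {"id": 42, "email": "jane@corp.example", "plan": "enterprise"}
print(scrub_row(prod_row))  # {'id': 42, 'email': 'redacted_...', 'plan': 'enterprise'}
```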
Ideally the system design wouldn’t let data get into bad states at all. I’ll be honest and say it’s rare for developers (I say this as a senior dev) to even think about preventing bad states, let alone actually pull it off.
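To make that concrete, one common pattern is rejecting invalid combinations at construction time, so a bad state can’t even exist long enough to get persisted. A minimal sketch with a hypothetical `Order` type, where “shipped with no tracking number” is the bad state:

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class OrderStatus(Enum):
    PENDING = "pending"
    SHIPPED = "shipped"

@dataclass(frozen=True)
class Order:
    id: int
    status: OrderStatus
    tracking_number: Optional[str] = None

    def __post_init__(self) -> None:
        # Reject the bad state at construction time, before it can be saved anywhere.
        if self.status is OrderStatus.SHIPPED and not self.tracking_number:
            raise ValueError("a shipped order must have a tracking number")

Order(id=1, status=OrderStatus.SHIPPED, tracking_number="1Z999")  # fine
# Order(id=2, status=OrderStatus.SHIPPED)                         # raises ValueError
```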
* Test locally, test in dev, in staging, in production. Run it before you push up changes (see the hook sketch after this list). Pull down PRs and run them.
* There isn’t a one-size-fits-all answer, unfortunately. Every codebase has different testing needs, and every change carries different risks. Getting better requires missing things, feeling the pain, and developing your gut.
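The “run it before you push” habit is easy to automate with a git pre-push hook. A minimal sketch, assuming pytest is your test runner (swap in whatever your project actually uses); save it as `.git/hooks/pre-push` and make it executable:

```python
#!/usr/bin/env python3
# Hypothetical .git/hooks/pre-push: run the suite before anything leaves your machine.
import subprocess
import sys

# pytest is an assumed runner here; replace with your project's test command.
result = subprocess.run(["pytest", "--quiet"])
if result.returncode != 0:
    print("tests failed; push aborted (bypass with `git push --no-verify` if you must)")
sys.exit(result.returncode)
```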