One thing I should add, and haven't yet: I'm looking at this from the context of the employer here, not the new hire.
The main caveat, then, is that not every tester is me. I worked very hard to get very good at evaluating software quickly. Some testers have been told that following boring instructions or writing simplistic checks is a valid substitute for what I consider testing, and those people may struggle more to learn quickly or to apply what they've learned effectively. Not everyone considers themselves responsible for product learning, and those who don't may need more hand-holding. So if you're looking at me as an example, I'm an example of one particular kind of tester, and ymmv.
However, since you're taking the time to make your position explicit, I should ask why you're asking about learning applications. Are you trying to make on-boarding easier or quicker?
How much time are you happy to spend hearing people drone on and on, compared to how much time you want to sit there and do some exploring yourself (provided you have access to the right artefacts)?
Interesting people talking about interesting things? Or otherwise? It depends on the people, and on the substance and quality of the droning. My job is to take whatever I can to learn in a useful and efficient way. If someone talking to me about the domain is helping that, then I want to hear it. Again, that's just my answer; everyone's different.
I.e. if you have some brief explanation and then detailed test cases, that might be enough to get you started at least
If I have them or if any new tester has them?
I despise written test cases, and because of the kind of tester I am, I'm hired to be very good at exploring, learning and reporting. I don't accept being told what to do by people who don't trust me to be good at my job, and I've never worked for any company that insists I must. Test cases restrict freedom of learning; if you want to guide learning, I recommend charters with checklists. I have written a great deal about test cases on MoT in the last few years, so feel free to check my history if you want to know why I think they're detrimental and unnecessarily cruel to testers, or I can talk about them with you if you have specific questions for me.
If I can explore a product and gain, through previous experience of software and of living in society, some understanding of how I think it achieves what it's supposed to achieve (and I understand what it's supposed to achieve in a general sense), then I don't need instruction; I can just learn. Others might need more guidance, and for them I recommend setting tasks. "Create an admin user" is a perfectly okay task, and if it needs detailed instructions to achieve, perhaps that's an indicator there's a problem with your software. If your software is convoluted and obtuse, then you'll have to take the hit on the costs that come with that, including costs to testability and to learning for new staff. You may be better off pairing the new tester with someone who can show them how to achieve their tasks (which also makes the task list much easier to maintain).
For example - with one of the applications I work with, it could take perhaps a month at least just to skim across the topics a bit.
The topics of the domain, or the topics of the application itself? Because domain learning is part of what can make testing harder.
I assume that you're investigating how to on-board new testers, either because you haven't done so before or because you're looking to make it better, easier or faster. I didn't know how to sum that up, so I wrote a 2000-word article draft about it and submitted it to MoT, and I'll link it here if it gets published.