This seems like a vibrant community I've stumbled across, so I figured I'd repost a thread I posted in another QA forum a long time back for additional input:
How do you structure and document your test cases for functional testing? Just wanted to get to know how QA people in the field generally do this.
Let me know if this is better posted in the lobby/general discussion area.
Here’s how we do it:
A high-level feature area to test. We're in the telecom field, so take SIP/VoIP phones as an example. We then document a set of test cases for that area, effectively a test plan, or more accurately, a set of test cases for SIP phones. We document our tests in Excel spreadsheets. Personally, I prefer Word documents with tables over Excel, but company practice is to use Excel. I have a colleague who used to do his tests, along with the test plan documentation, all in Adobe FrameMaker and publish the output to PDF (what a nightmare for a newbie unfamiliar with FrameMaker to maintain).
Sometimes the feature area breaks down into further categories, so we use multiple worksheets within the Excel spreadsheet. An example would be different SIP phone models, like the Aastra 9133i and Aastra 480i. We can have both shared and model-specific tests for each phone type and group them all in a single Excel spreadsheet, separated by worksheet.
Finally, we break down the test case definition into the following tabular spreadsheet format (presented as CSV here):
Test Category (or keyword/grouping),Test Case ID,Test Name/Description,Test Procedure,Expected Result,Status,Defect ID,Comments,Test Scope,Automation flag
Test category represents a feature group to test like call transfers, or putting calls on hold, etc.
Status is for pass, fail, blocked, skipped, etc.
Comments for additional info not put in the other columns.
Test scope defines whether a test case is for basic acceptance testing, regression testing, etc.
Automation flag indicates whether test is automated (yes/no).
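To make the layout concrete, here's a minimal sketch using Python's csv module that emits the template header plus one sample row. The test category, ID, and all field values are made up for illustration; they aren't from an actual test plan.

```python
import csv
import io

# Column layout of the test case template described above.
COLUMNS = [
    "Test Category", "Test Case ID", "Test Name/Description",
    "Test Procedure", "Expected Result", "Status", "Defect ID",
    "Comments", "Test Scope", "Automation flag",
]

# A hypothetical sample row: a call-transfer test for a SIP phone.
sample_row = {
    "Test Category": "Call Transfer",
    "Test Case ID": "XFER-001",
    "Test Name/Description": "Blind transfer to an internal extension",
    "Test Procedure": ("1. Call phone B from phone A. 2. Answer on B. "
                       "3. On B, blind-transfer the call to phone C."),
    "Expected Result": "C rings; A and C are connected after C answers.",
    "Status": "Pass",
    "Defect ID": "",
    "Comments": "",
    "Test Scope": "Regression",
    "Automation flag": "No",
}

# Write the header and the row out as CSV.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=COLUMNS)
writer.writeheader()
writer.writerow(sample_row)
print(buf.getvalue())
```

In a real setup you'd of course keep this in the spreadsheet itself; the point is just that each row is self-describing, so the CSV round-trips cleanly between tools.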
The test procedure column includes parameters indicating whether there is a matching automated test script, the preconditions and postconditions for executing the test case, and the detailed steps to execute the test.
In hindsight, for test management and usability (in both execution and maintenance) across testers, it seems best to keep test procedures generic with respect to test data values and let the tester define the test data as needed; we could then use the same data/configuration or different ones from run to run. But for automation and reproducibility, it is best to use the exact same configuration every time (except for randomized data testing).
So I figure it is better to modify the template and add a new column for the automated test procedure. That column would spell out which test script to run and the matching test data to use, and describe what the automated script actually does with that data (so the tester doesn't have to open the script to figure out what it does). It would also note the automation's preconditions and postconditions, which may differ from the manual version's. The original test procedure column would then hold the more generic version for manual execution.
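The split between a generic manual procedure and a fully pinned automated one can be sketched like this. The extensions, phone models, and placeholder names are all hypothetical; the point is only that one procedure template serves both columns:

```python
# A generic manual procedure leaves test data as placeholders for the
# tester to fill in; the automated version pins exact values for
# reproducibility. (All values here are made up for illustration.)

procedure_template = (
    "1. From phone {caller}, dial phone {callee}.\n"
    "2. Answer on {callee} and put the call on hold.\n"
    "3. Resume the call and verify two-way audio."
)

# Manual execution: the tester picks whatever phones are handy.
tester_data = {"caller": "any 9133i", "callee": "any 480i"}

# Automated execution: the script always uses the same pinned config.
automation_data = {"caller": "ext 2001 (9133i)", "callee": "ext 2002 (480i)"}

manual_run = procedure_template.format(**tester_data)
automated_run = procedure_template.format(**automation_data)

print("Manual run:\n" + manual_run)
print("\nAutomated run:\n" + automated_run)
```

Keeping one template with two data bindings also means a wording fix to the steps only has to be made in one place.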
Let me know your thoughts on our approach (the good, bad, and ugly). And the approach your company takes.