I’m looking to find out a bit more about framework documentation, specifically:
What details do you put in framework documentation
Documentation can sometimes feel unnecessary, but with the right balance we can share valuable details without drowning in writing. So, what do you add to your documentation?
I would think that, in general, it depends on how complex or simple the framework is.
What might a user need to know to use the framework?
Is it straightforward and intuitive to use, or will users need help on how to use the framework?
Steps to install, set up, and use the framework, plus a quickstart tutorial
Would the framework need an FAQ?
What features would you want to highlight about the framework that aren’t obvious? Or features that might not always be used but are good to have for the 10-20% of use cases?
Example code to highlight use of a specific feature of the framework. It is not always intuitive how a feature should be used.
Assuming that by framework you mean something that adds generic functionality to one or more tools AND that each source file is well-written (and therefore easy to read and maintain by itself):
For maintainers:
A picture of the architecture / design. At least a high-level one and additional ones if certain parts are complex enough to warrant it.
What important decisions were made and why? (For what the picture shows but also other things, such as what design pattern(s) are being used.)
For users:
How to set it up in your project (= a README file).
Code examples that show representative usage if that seems useful.
Normally nothing that only concerns a single file. Documentation on that should go in that file.
I’ll be specific to my test automation repositories and how I have them organized. I have found that having a good readme/documentation saves me a lot of time on training, as all the information is there for anyone who puts in the time to get started and learn.
Overview
A summary of what the repo is, explaining its purpose. In this section you could add an image or diagram of how it’s used in the Software Delivery Life Cycle, explaining where it runs automatically, at which steps, and which sets of tests are run.
Getting started
Include links to the documentation of the libraries or tools you are using, e.g. Selenium documentation for the ____ coding language.
Include links to tutorials on how to use the tool (this could be an internal video recording or a good tutorial online).
Installation
Commands to install all dependencies (include Windows, macOS, and Linux, depending on what types of workstations your team uses). This can also include installing any webdrivers or other dependencies.
Running Tests Locally
How to run the tests. This should include the different ways to run tests.
A single test
A group of tests
Tests in debug or headed mode, etc
Tests against different environments
Walk through any Docker configurations necessary for emulating a CI run locally
Running Tests in CI (cloud)
How you kick off a run in the cloud
Where do you go to see reports, failures, videos, screenshots, network traces, etc
Instructions on who is responsible for monitoring failures (this could be a rotation, a team, or a specific individual).
Writing Tests
Walk through how to write a test and what tools, utilities, and libraries are available (faker, etc.)
Cover how to set up before and after hooks within tests, and the best practices for them
Discuss authentication and the state in which a test should start (logged in? as a user or an admin? logged out?)
How information is abstracted away from the test, such as with Page Objects, and how they work (see the sketch after this list).
Include the specific helper methods or data factories available to contributors for getting data into a good state.
Include how to run any linters, formatters, or pre-commit hooks and how they work.
Include any global setup or teardown steps and configuration files (and where to find them)
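Since Page Objects trip up a lot of new contributors, a minimal sketch can help here. This one assumes C# with Selenium WebDriver; LoginPage and the data-test-id values are hypothetical names for illustration:

```csharp
// Minimal Page Object sketch (C# + Selenium WebDriver).
// LoginPage and the data-test-id values are hypothetical.
using OpenQA.Selenium;

public class LoginPage
{
    private readonly IWebDriver _driver;

    public LoginPage(IWebDriver driver) => _driver = driver;

    // Locators live here, not in the tests, so a UI change only touches this file.
    private IWebElement Username => _driver.FindElement(By.CssSelector("[data-test-id='username']"));
    private IWebElement Password => _driver.FindElement(By.CssSelector("[data-test-id='password']"));
    private IWebElement Submit => _driver.FindElement(By.CssSelector("[data-test-id='submit']"));

    // Tests call one intention-revealing method instead of raw element actions.
    public void LogIn(string user, string password)
    {
        Username.SendKeys(user);
        Password.SendKeys(password);
        Submit.Click();
    }
}
```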
Writing Tests Best Practices
Include specific framework best practices established by the team.
Best way to locate UI elements (for example, by data-test-id), or whatever the standard is within the codebase
Interacting with shadow DOM or iframe elements
How to chain elements if there is an established pattern
How to organize your test file including spacing preferences
Timeouts within tests, and how to override them if needed
Discuss how to track coverage/organization/tagging of tests (smoke, happy, admin, etc)
It’s worth showing an example of a good spec in this section as a starting point for new contributors (a minimal sketch follows this list).
Description of how best to debug your code while writing or troubleshooting an issue.
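For instance, a sketch of what such an example spec might look like, assuming xUnit and Selenium in C#. The URL, selectors, and assertion are made up; adapt to your own stack:

```csharp
// Sketch of an example spec (xUnit + Selenium, C#).
// All names and values here are hypothetical placeholders.
using System;
using OpenQA.Selenium;
using OpenQA.Selenium.Chrome;
using Xunit;

public class LoginTests : IDisposable
{
    // xUnit's before/after hooks: the constructor runs before each test,
    // Dispose runs after each test.
    private readonly IWebDriver _driver = new ChromeDriver();

    [Fact]
    public void ValidUser_CanLogIn()
    {
        // Arrange: locate elements via the agreed data-test-id convention.
        _driver.Navigate().GoToUrl("https://example.test/login");
        _driver.FindElement(By.CssSelector("[data-test-id='username']")).SendKeys("demo");
        _driver.FindElement(By.CssSelector("[data-test-id='password']")).SendKeys("secret");

        // Act
        _driver.FindElement(By.CssSelector("[data-test-id='submit']")).Click();

        // Assert
        Assert.Contains("Dashboard", _driver.Title);
    }

    public void Dispose() => _driver.Quit();
}
```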
Visual Testing (or any other type of testing your framework is doing)
Include how to update snapshots locally and for CI runs
One of my tasks is to help colleagues get into automation, with general overviews of what we are doing and more specifics on the how-to.
There is also a lot of involvement with Developers so documentation also considers them.
A lot of what I do is coded in-house so documentation is paramount.
The applications worked with are complex in what they do, but PC-based, so testing is very much focused on the results the applications produce.
I would go more for videos as documentation than written work, so videos are backed up with documents detailing best practices, best contacts, links, next steps, etc.
I look to ensure each video is 30 minutes max. Shorter is better, so I try to average 10 minutes.
General:
An overview of the Regression Suite and the components therein
Details of TeamCity and Octopus configuration.
Details of resources: .NET and SQL versions, any NuGet packages used, VMs, etc.
Technologies and tools used and why these are used: XUnit, FluentAssertions, Entity Framework, Linq, etc.
Basics of automation code, e.g. the 3 As (Arrange, Act, Assert) and what this means (see the sketch after this list).
How to run tests from TC and VS, and how Test Explorer works in VS
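A minimal sketch of what the 3 As look like in an xUnit test; Calculator is a hypothetical class used purely for illustration:

```csharp
// The 3 As (Arrange, Act, Assert) in a minimal xUnit test.
using Xunit;

public class Calculator
{
    public int Add(int a, int b) => a + b;
}

public class CalculatorTests
{
    [Fact]
    public void Add_TwoNumbers_ReturnsSum()
    {
        // Arrange: set up the object and inputs under test.
        var calculator = new Calculator();

        // Act: perform the single action being verified.
        var result = calculator.Add(2, 3);

        // Assert: verify the outcome.
        Assert.Equal(5, result);
    }
}
```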
Specifics: This is more code-based, covering the how-to.
How to create a new test.
Using XUnit as the test framework, detailing how to use ClassFixture for injection and TheoryData for parameterising test data (see the sketch after this list)
Creating test data to pass to tests: from a file, a database, or in-memory
Calling methods to execute an action
Using FluentAssertions, especially more complex parts such as BeEquivalentTo with options to exclude members
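To make those points concrete, here is a sketch combining them; DatabaseFixture, Order, and all the values are hypothetical:

```csharp
// Sketch: ClassFixture for injection, TheoryData for parameterised data,
// and FluentAssertions' BeEquivalentTo with an exclusion.
// DatabaseFixture and Order are hypothetical names.
using System;
using FluentAssertions;
using Xunit;

public class DatabaseFixture : IDisposable
{
    public DatabaseFixture() { /* open connection, seed data */ }
    public void Dispose() { /* tear everything down */ }
}

public class Order
{
    public int Quantity { get; set; }
    public decimal Total { get; set; }
    public DateTime CreatedAt { get; set; } = DateTime.UtcNow;
}

public class OrderTests : IClassFixture<DatabaseFixture>
{
    private readonly DatabaseFixture _db;

    // xUnit injects the fixture once per test class.
    public OrderTests(DatabaseFixture db) => _db = db;

    // In-memory test data; the same shape could be loaded from a file or database.
    public static TheoryData<int, decimal> Orders => new()
    {
        { 1, 9.99m },
        { 2, 19.98m },
    };

    [Theory]
    [MemberData(nameof(Orders))]
    public void Totals_Are_Calculated(int quantity, decimal expectedTotal)
    {
        var actual = new Order { Quantity = quantity, Total = quantity * 9.99m };
        var expected = new Order { Quantity = quantity, Total = expectedTotal };

        // Structural comparison, ignoring the volatile timestamp.
        actual.Should().BeEquivalentTo(expected, options => options.Excluding(o => o.CreatedAt));
    }
}
```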
Developers
This would be more about the abstracted code, such as the Entity Framework layer, the Linq queries, extension methods, etc. Basically, code that is hidden from those who will only write test classes and methods (a small sketch below).
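A tiny sketch of the kind of code that stays hidden: an extension method wrapping a Linq query. TestDbContext and Invoice are hypothetical; in the real code TestDbContext would be an Entity Framework DbContext:

```csharp
// Sketch of abstracted code hidden from test writers: an extension method
// wrapping a Linq query. TestDbContext and Invoice are hypothetical stand-ins.
using System.Linq;

public class Invoice
{
    public int CustomerId { get; set; }
    public bool IsPaid { get; set; }
}

public class TestDbContext
{
    public IQueryable<Invoice> Invoices { get; set; } =
        Enumerable.Empty<Invoice>().AsQueryable();
}

public static class InvoiceQueries
{
    // Test writers just call context.UnpaidInvoicesFor(id) and never see the Linq.
    public static IQueryable<Invoice> UnpaidInvoicesFor(this TestDbContext context, int customerId) =>
        context.Invoices.Where(i => i.CustomerId == customerId && !i.IsPaid);
}
```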
Activity 2.6.1: Capture Information That Is Useful In Documentation
Butch et al. put together quite the comprehensive list, so I’ll just add what I think might be useful on top of that:
Test naming - have an example of naming conventions, as this can get really messy (see the sketch at the end of this list).
If you are using Gherkin syntax, for example, give good examples of step naming, like keeping ‘and I click’-style steps to a minimum.
Complements exploratory testing - I think it’s a good opportunity to add a little note saying this suite complements the testing done by humans, not replaces it.
Any meta testing - whether any scanners run for accessibility, performance, etc. as part of page loads.
People and teams involved - maybe a note for anyone picking it up regarding contributors and teams who use the framework. Nice to help people out I think.
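On the test naming point above, a sketch of one common convention (MethodUnderTest_Scenario_ExpectedResult), with hypothetical names:

```csharp
// Hypothetical naming convention: MethodUnderTest_Scenario_ExpectedResult.
using Xunit;

public class CheckoutTests
{
    [Fact]
    public void ApplyDiscount_ValidCode_ReducesTotal() { /* ... */ }

    [Fact]
    public void ApplyDiscount_ExpiredCode_ReturnsError() { /* ... */ }
}
```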