We have a user documentation review activity for every release where the OpenAPI YAML gets validated for its content. We check:
does it have enough data?
is the data correct?
does it have enough examples?
We also share a Postman collection with the devs, which makes it easier for them to get started.
For the actual API responses, especially the error cases, we can check whether the messages are clear and specific.
“The request could not be processed due to an error in this param” is better than “An error occurred”.
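To make that concrete, here is a minimal sketch (the function name, field names, and wording are all hypothetical, not from any real API) of an error body that names the offending parameter instead of failing generically:

```python
# Hypothetical sketch: an error that names the offending parameter is
# far more actionable for the consumer than "An error occurred".

def validation_error(param, reason):
    """Build a specific, client-friendly error body (illustrative shape)."""
    return {
        "status": 400,
        "error": "invalid_parameter",
        "param": param,
        "message": f"The request could not be processed: '{param}' {reason}.",
    }

# Generic vs. specific, side by side:
generic = {"status": 400, "message": "An error occurred"}
specific = validation_error("date_from", "must be an ISO-8601 date (YYYY-MM-DD)")
```

A tester reviewing responses can then check that every error case tells the consumer *which* input to fix, not just that something went wrong.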
Thank you very much for sharing your approach. I haven’t defined UX for APIs yet. I’m very curious to find out what others are doing so I can up my game!
I think it’s difficult to think of UX in isolation from the integrated system. Maybe I am better off thinking about the “Developer Experience”.
But then, I think API and service design can impact how users interact with the downstream products.
For example, a lack of good paging or filtering might cause poor UX, because you end up with the client pulling way more than it needs and processing locally.
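The anti-pattern above can be sketched in a few lines; the function and parameter names here are stand-ins for hypothetical endpoints, not real ones:

```python
# Without server-side paging, the client fetches everything and filters
# locally; with a limit/offset, only what the UI needs crosses the wire.

def fetch_all(records):
    """Stands in for GET /customers with no paging support."""
    return list(records)

def fetch_page(records, limit=100, offset=0):
    """Stands in for GET /customers?limit=...&offset=..."""
    return list(records)[offset:offset + limit]

customers = range(10_000)
transferred_bad = len(fetch_all(customers))    # the whole data set
transferred_good = len(fetch_page(customers))  # just one screenful
```

The difference in rows transferred is exactly the waste the end user experiences as a slow, heavy client.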
We design APIs before they are created and we adjust accordingly.
We also groom them; some examples of what we look at:
if we need a PATCH when a PUT isn’t sufficient
HTTP codes (we have standards for them), including error handling
the resources of an API are very important; they should always be plural nouns, so for example /user/jeff should be /users/jeff
we validate the design and order of the values in the JSON, and that date values are consistent
APIs should be easy to read, and it should be clear what every value does/stands for
we check if we need to limit GET requests; let’s say you are retrieving all customers but you only want to see the first 100. Do we need it? Do we need to implement it?
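The last check above, capping an otherwise unbounded GET, could look like this. The default and maximum values here are assumptions for illustration, not the team’s actual standards:

```python
# Sketch of a server-side cap for e.g. GET /customers?limit=...
# DEFAULT_LIMIT / MAX_LIMIT are assumed values, not real standards.

DEFAULT_LIMIT = 100
MAX_LIMIT = 1000

def clamp_limit(requested=None):
    """Return a safe page size: default when omitted, capped when excessive."""
    if requested is None:
        return DEFAULT_LIMIT
    return max(1, min(requested, MAX_LIMIT))
```

Deciding those two numbers during design review is exactly the kind of question the grooming surfaces before any code exists.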
If it’s in the design phase, we just run it like a grooming session: it should be this and that.
After the design of the API has been made, it’s basically like a “code review”. (Most of the time it’s like this.)
Where I’m at right now, the BAs and/or developers work on the API documentation. Since I’m the only tester in the team, they always ask me to review the docs and share my input. A well-documented API is easier to test, and having understandable API documentation helps whoever is consuming the API.
Do you, or your BAs, ever get feedback from customers on the documentation / API integration experience (before or after developing the API)?
Yes, we do. The project is about developing APIs which insurance brokers can consume to get the data required to re-sell insurance to their customers, so based on what they need (and the feedback we keep getting from these brokers) we make new endpoints or adjust existing ones to accommodate their needs. Before an API officially goes to prod, they usually have some remarks or change requests, which they send to one of the BAs, who kind of acts as a proxy in this situation.
One of the more interesting things I’ve done as a tester is to take an API and create Swagger documentation for it as part of the testing - complete with examples.
That was a fun learning experience. There are still aspects of using Swagger with a .NET project that I’m not sure of (like adding links to static web pages, and how to set up authentication so that users only see the parts of the API they’re allowed to access), but it was definitely rewarding.
We’ve done it with Postman too, and you can easily publish the APIs as docs that everyone on your team can check out. On the Postman website you can view the docs and switch the environment (QA/DEV/ACC/Prod), and all the data changes with it, live in the docs.
From a testing perspective, I’ve loved working on projects that use the JSON:API spec in their design: https://jsonapi.org/ It looks like it confers some additional benefits, but I primarily liked it for its readability.
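To show what that readability buys you in tests, here is a minimal document following the JSON:API top-level shape (the article title and URL are illustrative, borrowed in spirit from the spec’s own examples):

```python
# A minimal JSON:API-style document (per https://jsonapi.org/):
# resources always carry the same "type" / "id" / "attributes" skeleton.

doc = {
    "data": [
        {
            "type": "articles",
            "id": "1",
            "attributes": {"title": "An example article"},
        },
    ],
    "links": {"self": "http://example.com/articles"},
}

# Because every resource object shares one skeleton, test assertions
# stay uniform across endpoints:
for resource in doc["data"]:
    assert {"type", "id", "attributes"} <= resource.keys()
```

That uniform shape is why integration tests against JSON:API services tend to be short and repeatable.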
I second the people mentioning error handling/messaging. That goes such a long way towards making integration testing doable and efficient.