Hello folks,
Before you start with API automated testing, do you create test cases for this? If so, how do you create API test cases?
Thanks,
Rafael
Hello!
Each test is composed of test actions. These are the individual actions a test needs to take per API test flow. For each API request, the test would need to take the following actions (see the sketch after the list):
1. Verify correct HTTP status code. For example, creating a resource should return 201 CREATED and unpermitted requests should return 403 FORBIDDEN, etc.
2. Verify response payload. Check valid JSON body and correct field names, types, and values — including in error responses.
3. Verify response headers. HTTP server headers have implications on both security and performance.
4. Verify correct application state. This is optional and applies mainly to manual testing, or when a UI or another interface can be easily inspected.
5. Verify basic performance sanity. If an operation was completed successfully but took an unreasonable amount of time, the test fails.
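To make that concrete, here is a minimal sketch of those checks using REST Assured (mentioned elsewhere in this thread). The base URI, the /items endpoint, the field names, and the two-second budget are all assumptions for illustration, not a definitive implementation:
// java
import static io.restassured.RestAssured.given;
import static org.hamcrest.Matchers.containsString;
import static org.hamcrest.Matchers.equalTo;
import static org.hamcrest.Matchers.lessThan;

import java.util.concurrent.TimeUnit;
import org.junit.jupiter.api.Test;

public class CreateItemTest {

    @Test
    public void createItemChecksStatusPayloadHeadersAndTiming() {
        given()
            .baseUri("https://api.example.com")   // assumed base URI
            .contentType("application/json")
            .body("{\"name\": \"widget\"}")       // assumed request payload
        .when()
            .post("/items")                       // assumed endpoint
        .then()
            .statusCode(201)                                              // 1. correct HTTP status code
            .body("name", equalTo("widget"))                              // 2. response payload field and value
            .header("Content-Type", containsString("application/json"))  // 3. response headers
            .time(lessThan(2L), TimeUnit.SECONDS);                        // 5. basic performance sanity
    }
}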
Regarding test scenarios (see the negative-test sketch after the list):
Basic positive tests (happy paths)
Extended positive testing with optional parameters
Negative testing with valid input (e.g. trying to register a username that already exists)
Negative testing with invalid input (e.g. a null username)
Destructive testing (sending a huge payload body in an attempt to overflow the system).
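To illustrate the two negative flavours, here is a hedged sketch with REST Assured. The /users endpoint, the field names, the expected status codes, and the pre-seeded "existing_user" account are assumptions about the system under test:
// java
import static io.restassured.RestAssured.given;
import static org.hamcrest.Matchers.containsString;

import org.junit.jupiter.api.Test;

public class CreateUserNegativeTest {

    @Test
    public void nullUsernameIsRejectedWithUsefulError() {
        // Negative test with invalid input: the username is null.
        given()
            .baseUri("https://api.example.com")
            .contentType("application/json")
            .body("{\"username\": null, \"email\": \"test@example.com\"}")
        .when()
            .post("/users")
        .then()
            .statusCode(400)                              // invalid input should be rejected
            .body("error", containsString("username"));   // error message should point at the bad field
    }

    @Test
    public void duplicateUsernameIsRejected() {
        // Negative test with valid input: the payload is well-formed,
        // but the username is assumed to already exist.
        given()
            .baseUri("https://api.example.com")
            .contentType("application/json")
            .body("{\"username\": \"existing_user\", \"email\": \"other@example.com\"}")
        .when()
            .post("/users")
        .then()
            .statusCode(409);                             // conflict for an already-taken username
    }
}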
Hi,
I still use the Gherkin syntax to write the API tests, for example:
Given I send a POST request to an X endpoint
When I get a status 201
Then I parse the response and store the values
This is helpful for capturing all the different acceptance criteria related to a story or scenario, and I can reuse the same steps to write the Java Cucumber automation.
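For anyone curious what those steps might map to in code, below is a rough sketch of Java Cucumber step definitions backed by REST Assured. The endpoint, the payload, and the way state is shared between steps are assumptions for illustration, not Chitra's actual implementation:
// java
import static io.restassured.RestAssured.given;
import static org.junit.jupiter.api.Assertions.assertEquals;

import io.cucumber.java.en.Given;
import io.cucumber.java.en.Then;
import io.cucumber.java.en.When;
import io.restassured.response.Response;

public class CreateResourceSteps {

    private Response response;
    private String createdId;

    @Given("I send a POST request to an X endpoint")
    public void iSendAPostRequest() {
        response = given()
                .baseUri("https://api.example.com")   // assumed base URI
                .contentType("application/json")
                .body("{\"name\": \"example\"}")      // assumed payload
                .post("/x");                          // assumed endpoint
    }

    @When("I get a status 201")
    public void iGetAStatus201() {
        assertEquals(201, response.getStatusCode());
    }

    @Then("I parse the response and store the values")
    public void iParseTheResponseAndStoreTheValues() {
        createdId = response.jsonPath().getString("id");  // stored for reuse in later steps
    }
}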
Regards
Chitra
Yes, I think Gherkin can apply to API testing quite nicely. Since we provide APIs to our products, which our customers then use, I think in terms of a hypothetical application (and by extension, the developer I may never meet) as the actor in this type of scenario, so I tend to write about that in the third person.
Yes, I think it’s important to write a test on paper or as a test case before starting to code.
I have recently tried validating API responses by generating a JSON schema and validating against that.
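For reference, REST Assured ships a json-schema-validator module that can do this kind of check. Below is a minimal sketch; the endpoint and the item-schema.json file (generated beforehand and placed on the test classpath) are assumptions:
// java
import static io.restassured.RestAssured.given;
import static io.restassured.module.jsv.JsonSchemaValidator.matchesJsonSchemaInClasspath;

import org.junit.jupiter.api.Test;

public class ItemSchemaTest {

    @Test
    public void itemResponseMatchesGeneratedSchema() {
        given()
            .baseUri("https://api.example.com")   // assumed base URI
        .when()
            .get("/items/42")                     // assumed endpoint
        .then()
            .statusCode(200)
            // item-schema.json would live under src/test/resources
            .body(matchesJsonSchemaInClasspath("item-schema.json"));
    }
}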
I’m an API tester and as said above I will write Gherkin scenarios with my PO and a developer (a 3 amigos session) to flesh out how the API should behave - ideally before development begins. They can be refined as you go along and new information is uncovered, but it helps everyone get on the same page right off the bat.
Then automate with Cucumber and Rest Assured.
@stevecager I agree, because APIs can be complicated for non-technical people to understand.
So writing scenarios in the Gherkin language makes them more accessible to the broader business.
Definitely! APIs are difficult to demo to business folks, as usually it’s just a JSON response in Postman on the screen. They are for systems to talk to each other, so not really meant for human eyes, but it’s worth spending the time helping your PO understand, as they can often spot business-related things a dev or tester won’t spot.
Essentially I test:
The headers that are passed in, are they mandatory or optional? What if I pass an invalid header, or miss off a mandatory one?
Test the request body. Again what happens if no body is sent, or an invalid body, invalid fields, invalid values?
What happens if I try an operation that’s not allowed? If I’ve built a GET API, what if someone tries to PUT, POST, or DELETE? (See the sketch after this list.)
For all of the above, do I get the correct status codes and error messages that would be easy for a consumer to understand? You want them to know what they are doing wrong so they don’t ring you up and get you to fix it for them.
Does the API spec detail all of the above, and does the API behaviour match what the spec says?
Does your API response always contain the mandatory fields that it needs to? Are there edge cases with your data that cause the API to behave weirdly?
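Here is the sketch referred to above: a hedged REST Assured example covering the disallowed-operation check and the status-code/error-message check. The read-only /reports endpoint, the Allow header, and the error field are assumptions about the API under test:
// java
import static io.restassured.RestAssured.given;
import static org.hamcrest.Matchers.containsString;
import static org.hamcrest.Matchers.emptyOrNullString;
import static org.hamcrest.Matchers.not;

import org.junit.jupiter.api.Test;

public class MethodNotAllowedTest {

    @Test
    public void deleteOnReadOnlyEndpointReturns405WithClearError() {
        given()
            .baseUri("https://api.example.com")       // assumed base URI
        .when()
            .delete("/reports/42")                    // endpoint assumed to support GET only
        .then()
            .statusCode(405)                          // correct status code for a disallowed operation
            .header("Allow", containsString("GET"))   // tell the consumer what is permitted
            .body("error", not(emptyOrNullString())); // an error message the consumer can act on
    }
}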
I like to create Request and Response classes for each API call. Importantly, wherever possible I try to make sure that the request class can be instantiated with minimal input, using random but VALID data for anything that can be randomised if not specified. Then I create fluent methods to override the request for negative testing. A test then looks something like:
// java
NewRestItemRequest request = new NewRestItemRequest(args);
NewRestItemResponse response = request.withName("AnInvalidName").getResponse();
assert !response.getSuccess() : "Invalid name was allowed";
From this point I can modify the request base class to do things like adding & removing JSON properties so that I can very quickly create many test cases in very few lines of code.
Then, assuming I’ve also coded for the happy path scenario (in this example, to create an item), I have two lines of code that I can reuse elsewhere in other tests that require an item to be present, without having to create that item through the UI or the DB.
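To illustrate the pattern (not the poster's actual code), here is a rough sketch of what such a request class might look like, shown with a no-argument constructor for simplicity. The endpoint, the default field values, and the REST Assured plumbing are all assumptions:
// java
import static io.restassured.RestAssured.given;

import io.restassured.response.Response;
import java.util.HashMap;
import java.util.Map;
import java.util.UUID;

public class NewRestItemRequest {

    private final Map<String, Object> payload = new HashMap<>();

    public NewRestItemRequest() {
        // Minimal input: random but VALID defaults for anything not specified.
        payload.put("name", "item-" + UUID.randomUUID());
        payload.put("quantity", 1);
    }

    // Fluent overrides for negative testing.
    public NewRestItemRequest withName(String name) {
        payload.put("name", name);
        return this;
    }

    public NewRestItemRequest withoutField(String field) {
        payload.remove(field);   // drop a JSON property entirely
        return this;
    }

    public NewRestItemResponse getResponse() {
        // Serialising the Map as JSON assumes a mapper (e.g. Jackson) on the test classpath.
        Response response = given()
                .baseUri("https://api.example.com")   // assumed base URI
                .contentType("application/json")
                .body(payload)
                .post("/items");                      // assumed endpoint
        return new NewRestItemResponse(response);
    }
}

class NewRestItemResponse {

    private final Response response;

    NewRestItemResponse(Response response) {
        this.response = response;
    }

    public boolean getSuccess() {
        return response.getStatusCode() == 201;
    }
}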