API automated cloud tests

Hi,

I’ve got a question about a test automation idea a friend of mine and I have.

We want to build a platform that runs tests against production APIs on a schedule (every minute/hour/day) and sends notifications if something breaks. The tests are just a series of requests with variables passed between them (e.g. list all users, then get the details of the first returned user, and expect the result to contain an email field and a 200 status).

The tests are written in YAML, with additional features like inline shell commands and templating.
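
To make the idea concrete, here's a minimal sketch of what the users example above could look like in that YAML format; the syntax (`steps`, `expect`, `save`, the `{{ … }}` placeholders, and `api.example.com`) is made up for illustration, not the actual format:

```yaml
# Hypothetical sketch: list all users, then fetch the first user's details.
name: users-smoke-test
schedule: every 5m          # run on a fixed interval, per the idea above
steps:
  - name: list-users
    request:
      method: GET
      url: https://api.example.com/users
    expect:
      status: 200
    save:
      first_user_id: response.body[0].id   # pass a value to the next step

  - name: get-first-user
    request:
      method: GET
      url: https://api.example.com/users/{{ first_user_id }}
    expect:
      status: 200
      body:
        has_field: email                    # result must contain an email field
```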

You can update the tests from the CI/CD pipeline when you release, so that they stay in sync with the code running in production.

The tests can also be run locally, e.g. during development.

The question:

  • Does this sound like something you/your team needs? I.e. will it solve any problems you have?
    • If yes, what part do you like the most?
    • If not, how can we make it better?

Thanks for reading! :slight_smile:

How would it be different from Postman or the HTTP libraries available in any major programming language?

It’s not different by a large margin:

  1. It isn’t an HTTP client, so it’s not meant for just making one-off requests.
  2. It’s declarative, so you don’t have to write JavaScript for your tests.
  3. Mostly, it’s a hosted service, so instead of running the tests yourself in your data centers (or public cloud), it runs them for you.

How granular is it in making requests & validating responses? What can you send & validate? HTTP headers, status codes, request/response payloads, dealing with JSON vs. XML vs. plain-text payloads? File upload/download? Redirects, HTTP (basic) authentication, cookies?

You can send anything you can write in plain text, so XML, JSON, GraphQL.

It can validate headers, status codes, and plain-text & JSON payloads (with the option to validate just the schema, not any primitive values); we’re planning to add gRPC & GraphQL.

It doesn’t support file upload/download yet, but it wouldn’t be hard to add (not sure how that’ll fit in the cloud version, though).

Redirects aren’t followed.

You can inline shell commands inside any field, so basic auth is something like `Authorization: Basic {% echo -n "$USER:$PASS" | base64 %}`. You can also save values a request has returned, so you can capture a header from one response, provide it in subsequent requests, and handle cookies that way.
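
To illustrate, here's a hypothetical sketch combining that inline shell templating with response capture for cookies; apart from the `{% … %}` syntax mentioned above, the field names and URLs are assumed for illustration:

```yaml
# Hypothetical sketch: shell templating for basic auth, plus saving a
# response header and replaying it as a cookie in the next request.
steps:
  - name: login
    request:
      method: POST
      url: https://api.example.com/login
      headers:
        Authorization: Basic {% echo -n "$USER:$PASS" | base64 %}
    expect:
      status: 200
    save:
      session_cookie: response.headers.Set-Cookie

  - name: whoami
    request:
      method: GET
      url: https://api.example.com/me
      headers:
        Cookie: "{{ session_cookie }}"
    expect:
      status: 200
```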

For file upload/download, I can imagine two scenarios:

  • Mimic the local-run design, with a proxy to the cloud, however your framework works: file uploads are proxied from localhost to your cloud, and the cloud then uploads the file with the actual API request made on the cloud side (not locally). Similarly for downloads: when the response data is binary rather than text (by MIME type), the cloud forwards it to the local client as binary content (application/octet-stream, etc.) for the client to process accordingly, usually as a file download (when made through a browser). That’s typically the case for file downloads; for text-based content, it’s fine to handle it as text and add an extra step to save the text to a file instead of an inline download. This approach, at least for uploads, is roughly how Selenium Grid supports file uploads for the driver bindings that support it: behind the scenes, the file is sent from the local machine running the test to the grid node as base64-encoded text, and the grid node then uploads it to the target website/application.

  • Have the cloud version implement file upload/download differently: provide cloud storage for hosting the test input files used in uploads, and for storing downloaded files that are later fetched to the customer’s machine. You’d have a separate access layer/API for uploading to and downloading from this storage, and the test framework/service would be passed a path reference into the cloud storage for uploads and downloads. That means the user has an extra step to provision the upload files, and an extra step to fetch a copy of downloaded files, compared with running locally; roughly like the sketch below.
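
A hypothetical sketch of how that second scenario might surface in a test definition; the `storage://` scheme, the `upload`/`save` fields, and the URLs are all made up for illustration:

```yaml
# Hypothetical sketch: reference files in the service's cloud storage
# instead of the local filesystem when running in the hosted version.
steps:
  - name: upload-avatar
    request:
      method: POST
      url: https://api.example.com/users/42/avatar
      upload:
        file: storage://test-inputs/avatar.png    # provisioned beforehand via a storage API
    expect:
      status: 201

  - name: download-report
    request:
      method: GET
      url: https://api.example.com/reports/latest
    expect:
      status: 200
    save:
      file: storage://test-outputs/report.pdf     # copy down to the local machine later
```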

Hell yeah, it’s just monitoring => https://learning.postman.com/docs/designing-and-developing-your-api/monitoring-your-api/intro-monitors/

A lot of companies do this as well, but they only fire the GET requests themselves.
If you want to test PUT/PATCH/POST/DELETE too, I would advise adding a custom header, or a specific user in your production data, that indicates it’s a test (something that is often required in bug bounty programs).

You can then even run your full API test suite against your PROD environment & clean everything up afterwards using that specific alias/user account/header.
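
For instance, a hypothetical test step in the YAML format discussed above that tags its write with a marker header and a dedicated test alias so the cleanup can find the data it created; the header name and all fields are illustrative:

```yaml
# Hypothetical sketch: mark a production write as synthetic so the data
# it creates can be identified and cleaned up after the run.
steps:
  - name: create-test-user
    request:
      method: POST
      url: https://api.example.com/users
      headers:
        X-Synthetic-Test: "true"            # marker header; name is made up
      body:
        email: monitoring+test@example.com  # dedicated test alias
    expect:
      status: 201
```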