How to automate a Data Analytics Dashboard

Hi all,
I am a manual tester looking to advance into automation in my current job. My company develops products and services for the manufacturing sector.
The product is basically a data analytics dashboard website: it takes values from manufacturing machinery units, runs calculations on them, and displays the calculated data in the form of graphs.
The backend is built on the Java Spring Framework and the GUI is written in JavaScript.
I wanted to know where I should start automating the data analytics dashboard website and its features. Should I begin by automating the backend or the frontend?
All suggestions are welcome.
I have learned the basics of Selenium with Java, but I want to know an approach to automating a dashboard.

Thanks

3 Likes

I would suggest going step by step in this order:

  • Do tests at the backend level first to understand and check what data your dashboard gets
      ◦ Understand the API
      ◦ Send calls to the API and then check that the returned data is shown correctly
  • Then do some frontend testing (see the sketch after this list)
      ◦ Check that the dashboard charts are there
      ◦ Check that the data is shown correctly (this would need to be in relation to the data that you sent from the backend)
  • Do visual testing
      ◦ For this, check for example Test Automation University from Applitools to learn how to do visual testing
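
For the frontend part, a minimal sketch of those checks in Selenium with Java could look like this (the URL, locators, and expected value are hypothetical placeholders; the expected value would come from your earlier API call):

```java
import java.time.Duration;

import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.WebElement;
import org.openqa.selenium.chrome.ChromeDriver;
import org.openqa.selenium.support.ui.ExpectedConditions;
import org.openqa.selenium.support.ui.WebDriverWait;

public class DashboardSmokeTest {
    public static void main(String[] args) {
        WebDriver driver = new ChromeDriver();
        try {
            driver.get("https://your-dashboard.example.com"); // hypothetical URL
            WebDriverWait wait = new WebDriverWait(driver, Duration.ofSeconds(10));

            // 1. Check that the chart container is rendered at all.
            WebElement chart = wait.until(ExpectedConditions.visibilityOfElementLocated(
                    By.cssSelector(".kpi-chart"))); // hypothetical locator
            System.out.println("Chart displayed: " + chart.isDisplayed());

            // 2. Check that a displayed value matches what the backend returned.
            String expectedFromApi = "42.5"; // would come from the backend check
            WebElement value = driver.findElement(By.cssSelector(".kpi-value")); // hypothetical
            if (!value.getText().contains(expectedFromApi)) {
                throw new AssertionError("Dashboard shows '" + value.getText()
                        + "' but the API returned " + expectedFromApi);
            }
        } finally {
            driver.quit();
        }
    }
}
```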

Hope it helps as a starting point

5 Likes

Thanks a lot @restertest :slight_smile:. These steps can help me start testing the dashboard right away.

@restertest I need one more bit of help. Could you please suggest which tool I can use to test at the backend level (to test API calls)?

It depends on what you want to do:

  • Postman or Insomnia if you want to do exploratory testing on the API
  • If you want to make some tests to add to your build, then maybe something like REST Assured, since you mentioned you have Java experience (see the sketch below)
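
As a rough idea, a REST Assured check could look like this (the host, endpoint, parameter, and JSON fields are hypothetical; swap in whatever your dashboard API actually exposes):

```java
import static io.restassured.RestAssured.given;
import static org.hamcrest.Matchers.equalTo;
import static org.hamcrest.Matchers.notNullValue;

import org.junit.jupiter.api.Test;

public class KpiApiTest {

    @Test
    void kpiEndpointReturnsCalculatedValue() {
        given()
            .baseUri("https://your-dashboard.example.com") // hypothetical host
            .queryParam("machine", "press-1")              // hypothetical parameter
        .when()
            .get("/api/kpis")                              // hypothetical endpoint
        .then()
            .statusCode(200)
            .body("machine", equalTo("press-1"))           // hypothetical JSON fields
            .body("value", notNullValue());
    }
}
```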

There are a lot of other tools out there, but the important part is to start and to understand what to test and how to test it.

Just for fun, you may also want to do the 30 Days of API Testing challenge from the Ministry of Testing.

It is fun and you will learn nice things in a non-boring way.

2 Likes

Thanks a lot @restertest :smiley:
30 Days of API Testing would be a great start for me to learn about APIs. Thanks for sharing the link.

Hmm, I wouldn’t jump straight in.

Have a look at this: Setting a Foundation for Successful Test Automation

You’ll need to be clear why you want to automate anything. Are you doing it to learn about automation or is there a particular problem you need this to help with?

It’s worth having a discussion with your team as well, if possible. For something like this, it’s good to understand what’s happening at the different layers: which aspects are good candidates for automation, where the quick wins are, and what the challenges are.

Check what’s being unit tested, if anything. How is the data analysed: is it based on rules, filters, database functions, APIs? Where does that logic live?

Calculations are a great candidate for unit tests (see the sketch below).
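
To illustrate, a unit test for a KPI-style calculation might look like this (KpiCalculator and its formula are hypothetical stand-ins for wherever the calculation logic actually lives):

```java
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.Assertions.assertThrows;

import org.junit.jupiter.api.Test;

public class KpiCalculatorTest {

    // Hypothetical class under test: a KPI defined as actual output over target, as a percentage.
    static class KpiCalculator {
        double efficiency(double actualOutput, double targetOutput) {
            if (targetOutput == 0) {
                throw new IllegalArgumentException("target must be non-zero");
            }
            return actualOutput / targetOutput * 100.0;
        }
    }

    @Test
    void efficiencyIsActualOverTargetAsPercentage() {
        assertEquals(80.0, new KpiCalculator().efficiency(40.0, 50.0), 0.0001);
    }

    @Test
    void zeroTargetIsRejected() {
        assertThrows(IllegalArgumentException.class,
                () -> new KpiCalculator().efficiency(40.0, 0.0));
    }
}
```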

Do you have a pipeline?

It would be good to know more about your system architecture, what your release process is, etc.

2 Likes

I’d echo what both @restertest and @rforjoe said. As in: the plan for automating things that restertest suggests seems good, assuming that it’s the right area to automate (as rforjoe asks).

To build on what rforjoe says, I’d suggest you look at how the data is created in the first place. Is there some kind of data preparation pipeline (also known as data engineering, ETL, etc.) that processes the data so that it ends up, e.g., in a database available for the backend to query and send to the frontend?

I.e. it could be that the front end is displaying data nicely, and the backend is fetching the right data correctly, but the data itself is bad because of bugs in the data engineering pipeline.

Where is the biggest risk? Where is the biggest value to the end user? Where is the cost lowest for you as a tester? As rforjoe says, look at what tests already exist, e.g. unit tests, to see what makes sense for you to add.

If you decide to test a data engineering pipeline, here’s something I wrote based on doing this in the past: Testing data preparation for a BI database – Random Tech Thoughts

2 Likes

@rforjoe I agree with all the points you mentioned, but I would also like to explain why I want to automate my company’s dashboard.
I want to learn automation; that is one of the reasons for trying to automate my company’s product. Thank you for providing this link: Setting a Foundation for Successful Test Automation.
Another reason is that the risk area in my product is the data part. That is where many bugs are discovered, and as the release build grows, testing the data part gets more complicated. That’s why I was wondering whether any automation could be applied to that part of the software.

My company has very limited resources, with developers being busy all day. While I have tried discussing the different layers and the data pipeline of the product with the development team, I never get clear answers from them.

The only thing I can say is that the data on the dashboard is calculated using logic/formulas that we define in the admin area. We call the results KPIs (Key Performance Indicators), and the logic for them is provided by the client.

After defining that logic, the calculated KPI is visible on the dashboard. The logic consists of a parameter on which the KPI formula is defined. The parameter data for each timestamp arrives through a RabbitMQ queue before being written to the database, since the MySQL database wouldn’t be able to handle the continuous stream of data captured from the manufacturing machines. This is all I know about the data that is shown on the dashboard.
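
One idea for exercising that path in a test would be to publish a reading with a known value into the queue and then check the KPI the dashboard shows for it. A minimal sketch with the RabbitMQ Java client (the host, queue name, and payload format are hypothetical guesses at the setup described above):

```java
import java.nio.charset.StandardCharsets;

import com.rabbitmq.client.Channel;
import com.rabbitmq.client.Connection;
import com.rabbitmq.client.ConnectionFactory;

public class PublishTestReading {
    public static void main(String[] args) throws Exception {
        ConnectionFactory factory = new ConnectionFactory();
        factory.setHost("localhost"); // hypothetical broker host

        try (Connection connection = factory.newConnection();
             Channel channel = connection.createChannel()) {

            String queue = "machine-readings"; // hypothetical queue name
            channel.queueDeclare(queue, true, false, false, null);

            // Hypothetical payload: one parameter value with a timestamp.
            String body = "{\"machine\":\"press-1\",\"param\":\"temperature\","
                    + "\"value\":72.4,\"ts\":\"2024-01-01T00:00:00Z\"}";
            channel.basicPublish("", queue, null, body.getBytes(StandardCharsets.UTF_8));

            System.out.println("Published a known reading; now the KPI on the "
                    + "dashboard can be checked against the expected calculation.");
        }
    }
}
```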

@bobs I need to read up on ETL; I’m not very familiar with it. Also, thanks for the link you provided on testing a data engineering pipeline. I will definitely look into it. Thanks.

There’s a big variety of tools that might be being used at your company to prepare the data. It might be conventional code, e.g. C# or Java, in which case it might not have an obvious pipeline structure, but hopefully the normal testing practices are being applied to it.

If it’s more pipeline-y, then it could be stored procedures in a database, a specialist on-premises bit of software like Microsoft SQL Server Integration Services (SSIS) or Talend, or something more exotic like Apache Spark. Or it could be something more cloud-based, like Azure Data Factory.

I’m saying all this not because I want to confuse you, but to help you identify what tools might be in use at your company, and so the appropriate ways to test them.

At my current place we use a combination of SSIS and stored procedures. The bulk of the business logic is in stored procedures, and SSIS is the control logic that strings them together. This means that the business logic can be unit tested using tSQLt, leaving less to do at system test time (where it’s harder).
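
If the business logic at your place lives in stored procedures but tSQLt isn’t an option (it’s SQL Server only, and you mentioned MySQL), a different approach is to call the procedure from a plain JUnit test over JDBC. A sketch, where the connection details, procedure name, and expected value are all hypothetical:

```java
import static org.junit.jupiter.api.Assertions.assertEquals;

import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Types;

import org.junit.jupiter.api.Test;

public class CalcKpiProcedureTest {

    @Test
    void calcKpiReturnsExpectedValue() throws Exception {
        // Hypothetical JDBC URL; point it at a disposable test database.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:mysql://localhost:3306/testdb", "test", "test");
             CallableStatement stmt = conn.prepareCall("{call calc_kpi(?, ?)}")) {

            stmt.setDouble(1, 40.0);                   // hypothetical input parameter
            stmt.registerOutParameter(2, Types.DOUBLE);
            stmt.execute();

            assertEquals(80.0, stmt.getDouble(2), 0.0001); // hypothetical expected KPI
        }
    }
}
```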

Also, it might be referred to as ELT rather than ETL (the T and L swapped around). ETL means Extract, Transform, Load (i.e. read, process, write). ELT is Extract, Load, Transform. (Again, just to help you work out what your organisation is using.)

ETL is where data is read from somewhere, processed, and only then written to where it will be used, e.g. by a dashboard. The data is in a good form before the dashboard reads it, which means there’s less work for the dashboard to do. However, it also means the dashboard can’t change until the ETL has changed to prepare data in the new way.

In ELT, the data is made available to e.g. dashboards in an earlier state, so the dashboard has more work to do, but it’s more able to change without worrying about what upstream processing / teams are doing.
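
To make the difference concrete, here’s a tiny Java sketch with in-memory stand-ins (all the names and the formula are hypothetical; the only point is where the transform sits relative to the store the dashboard reads):

```java
import java.util.List;

public class EtlVsElt {

    record RawReading(String machine, double value) {}
    record Kpi(String machine, double kpi) {}

    // "Transform": the business logic, e.g. a KPI formula (hypothetical).
    static Kpi toKpi(RawReading r) {
        return new Kpi(r.machine(), r.value() * 1.5);
    }

    public static void main(String[] args) {
        List<RawReading> source = List.of(new RawReading("press-1", 10.0));

        // ETL: transform first, then load; the dashboard reads ready-made KPIs.
        List<Kpi> etlStore = source.stream().map(EtlVsElt::toKpi).toList();

        // ELT: load the raw readings as-is; the transform runs later,
        // closer to the dashboard (e.g. in a view or at query time).
        List<RawReading> eltStore = List.copyOf(source);
        List<Kpi> computedAtQueryTime = eltStore.stream().map(EtlVsElt::toKpi).toList();

        System.out.println("ETL store: " + etlStore);
        System.out.println("ELT query result: " + computedAtQueryTime);
    }
}
```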