Blog

From Exhausting API Regression to Efficient Automation with Postman

by: Mamta Joshi

What do you do when you have to perform API regression testing manually and are required to re-run the APIs multiple times just to make sure the outcomes are still correct?

This is a common issue faced by many quality engineers, and Meera’s story is no different. So, let us introduce you to Meera, a dedicated and passionate Senior QA Lead at Testing Mavens. When she started working on an API testing project for one of our clients, Meera discovered that they had no automation framework, and their test automation efficiency was too low for the repetitive regression testing the project demanded.

Whenever a new feature was added to an API, it took many iterations of repetitive regression testing, which became a very time-consuming and tedious problem for Meera. To compensate for these challenges, the team stretched their working hours and worked weekends to deliver successful releases. Even with the extended turnaround time, it was getting hard to fix bugs found at the later stages of regression testing, and in some instances releases had to be postponed at the last minute.

Since these issues had started hindering releases, she raised them with the client and asked whether the team could build an automation framework, but the client was not interested in creating a new one. They did, however, want to reduce the testing time and asked us to explore options that could shorten it without much additional cost. At this stage she was bewildered: how was she going to perform manual regression testing on all those APIs in so little time, without any automation framework?

Meera started looking for ways to help the client with minimum time and effort, and began exploring Postman, the platform the team was already using for manual API testing. She found that Postman provides scripting capabilities and a Collection Runner that can execute all the APIs in a collection automatically. She leveraged Postman’s scripting to pass data between interlinked APIs and to test the fields in each API response against the requirements.
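As a rough illustration of this technique, a script like the following could go in a request’s "Tests" tab in Postman. It runs inside Postman’s JavaScript sandbox (not standalone Node.js), and the field names such as `userId` are illustrative assumptions, not details from the original project:

```javascript
// Sketch of a Postman "Tests" tab script. Assumes the response body
// is JSON with hypothetical fields `userId` and `status`.
const body = pm.response.json();

pm.test('Status code is 200', () => {
  pm.response.to.have.status(200);
});

pm.test('Response has the expected fields', () => {
  pm.expect(body).to.have.property('userId');
  pm.expect(body.status).to.eql('active');
});

// Pass a value to the next, interlinked request in the collection,
// which can then reference it as {{userId}}.
pm.collectionVariables.set('userId', body.userId);
```

When the collection runs in the Collection Runner, each request’s tests execute in order, and variables set this way let later requests consume values produced by earlier ones.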

The Collection Runner and scripting solved the execution problem; however, a new question arose:

How do we share our results with the stakeholders?

To resolve this, Meera used Newman, the command-line Collection Runner for Postman. It lets you run a Postman Collection directly from the command line, and it is also built as a library, so it can be used programmatically. She leveraged this, and with a few lines of Node.js code the team was able to share custom reports as CSV files. Now anyone can see the results; even someone on the client’s side can use Newman to trigger the regression tests and view the results. The cherry on top is that, while the scripts were initially limited to regression, the client is now using them for other testing purposes, such as creating test data for testing the reports engine.

After the scripts were developed, the execution time of regression testing dropped to half an hour, a huge benefit to the client as well as to the team. The whole team showed sportsmanship and channeled the “good” and “not so good” conversations into an opportunity to build something for the client that was not written into the proposal, showcasing Testing Mavens as the best software testing company.

Have you ever been stuck in a similar situation?

How did you overcome it?

We serve as the best QA testing company and have a dedicated strategy for circumstances like these, and conscious communication is a big part of it. It is integral throughout the requirements-gathering phase, and yes, like others, the Testing Mavens team was a bit cautious while pitching this scripting idea to the client, but several driving factors led to the successful build of this project.

Key driving factors that led to a successful project:

  • Conscious communication is one such factor. As we stand as the best testing services company in the testing sphere, this is crucial for us. Our ability to communicate clearly and compassionately with ourselves as well as with others made us go above and beyond. Empathy with our clients and within the team made all the difference.
  • Quality listening and patiently working on client feedback are the driving forces behind such success and happy clients.
  • Knowing what service or product you can offer: product knowledge is an essential sales skill. Understanding your product’s features allows you to present its benefits accurately and persuasively. Clients respond to an enthusiastic team who are passionate about their products and eager to share the benefits.

After the successful deployment of this scripting, we can proudly stand as a Test Advisory and Consulting firm: we custom-build test automation frameworks that reduce manual effort, maximize efficiency, and accelerate development.

Our solutions are tailored to your needs, including product and tool evaluation and framework implementation. Testing Mavens has the best test automation efficiency and serves as the best software testing and quality assurance company in the testing space.

Relevant posts

Fluent CI/CD Pipeline – Not just a pipe dream any more thanks to GitHub actions

Automate, customize, and execute your software development workflows right in your repository with GitHub Actions. You can discover, create, and share actions to perform any job you’d like, including CI/CD, and combine actions in a completely customized workflow.

Automating data validation checks in Database Testing

Validating data in tables becomes very cumbersome if we do it manually. Apart from the longer duration, ensuring the accuracy of the data being validated is the biggest challenge. The best way to tackle this scenario is to automate the task, which ensures data integrity is intact, duplicates are omitted, and the job is completed in a fraction of the time it used to take.