Prolifics UK implemented a MuleSoft integration solution for our client, connecting in-house and customer applications and providing a common layer to build upon. This took the form of a set of reusable custom APIs, which were delivered and functionally tested.
From a non-functional perspective, there was a need to ensure that the Warehouse Management System (WMS) and integration layer could respond in under one second while handling a load of 700 order requests per minute, with no data loss. At a granular level, our client needed to understand the capacity of each individual API, as well as of the system as a whole, end to end.
The scope for performance measurement covered the API endpoints, the capacity of the Mule Experience, Process and System APIs, and the containers hosting them.
The initial brief was that only the MuleSoft APIs would require testing. However, after detailed planning, it was agreed to expand the scope to include end-to-end performance testing of all Inbound and Outbound APIs, including the WMS and the retail customer’s applications.
Our Solution:
Prolifics Testing engaged with Clipper and one of their retail customers on a performance project to understand the range, load and volume per API. To test the system and each of the Inbound and Outbound APIs comprehensively from an end-to-end standpoint, we used Apache JMeter with a custom framework.
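As an illustration of the kind of load modelling this involved (the thread count and pacing values below are assumptions for the sketch, not the project's actual settings), a target of 700 requests per minute can be translated into per-thread pacing as follows; in JMeter, a target like this is typically enforced with a timer such as the Constant Throughput Timer.

```java
// Minimal sketch: converting a throughput target into per-thread pacing.
// The number of virtual users is an assumed value for illustration only.
public class LoadModel {
    public static void main(String[] args) {
        int targetPerMinute = 700;        // required order volume from the brief
        int virtualUsers = 20;            // assumed number of concurrent threads

        double targetPerSecond = targetPerMinute / 60.0;           // ~11.7 requests/s overall
        double perThreadPerSecond = targetPerSecond / virtualUsers;
        double pacingMillis = 1000.0 / perThreadPerSecond;         // gap between iterations per thread

        System.out.printf("Overall target: %.1f requests/s%n", targetPerSecond);
        System.out.printf("Per-thread pacing: ~%.0f ms between requests%n", pacingMillis);
    }
}
```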
Each API was scripted individually and then combined into a model simulating real-world activity across the interconnected applications. The scripts covered unique fields, security headers, content types and many other parameters identified during the planning phase.
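To give a flavour of what one parameterised request looked like (the endpoint, payload fields and header values here are placeholders rather than the client's actual APIs), a single inbound order call can be sketched as:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.UUID;

// Sketch of one parameterised API call; the endpoint, payload fields and
// header values are placeholders standing in for those identified in planning.
public class OrderRequestExample {
    public static void main(String[] args) throws Exception {
        String orderRef = UUID.randomUUID().toString();   // unique field per request

        String payload = """
            {"orderReference": "%s", "lines": [{"sku": "SKU-001", "qty": 2}]}
            """.formatted(orderRef);

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://api.example.com/inbound/orders"))  // placeholder endpoint
                .header("Authorization", "Bearer <token>")                  // security header
                .header("Content-Type", "application/json")                 // content type
                .POST(HttpRequest.BodyPublishers.ofString(payload))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());

        System.out.println(response.statusCode() + " for order " + orderRef);
    }
}
```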
Initially, our tests focused on splitting the APIs into logical groups, noting interdependencies and whether each was inbound or outbound. During the inbound test, the MuleSoft APIs (Experience, Process and System) were monitored closely using the MuleSoft Anypoint Platform. The WMS and database server resources were also monitored during test execution, to ensure all API requests were reaching their intended destination with no loss of data or transactions.
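One simple way to confirm that nothing was lost, sketched below with a hypothetical PostgreSQL WMS database and table, is to reconcile the number of requests the load tool reports as sent against the number of orders that actually landed in the WMS:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

// Post-test reconciliation sketch. The connection details, table name and
// sent-request count are placeholders; it assumes a PostgreSQL WMS database
// and requires the PostgreSQL JDBC driver on the classpath.
public class DataLossCheck {
    public static void main(String[] args) throws Exception {
        long requestsSent = 42_000;   // e.g. taken from the JMeter results file

        try (Connection conn = DriverManager.getConnection(
                     "jdbc:postgresql://wms-db.example.com:5432/wms", "readonly", "secret");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery(
                     "SELECT COUNT(*) FROM inbound_orders WHERE created_at >= NOW() - INTERVAL '1 hour'")) {

            rs.next();
            long ordersStored = rs.getLong(1);
            System.out.printf("Sent: %d, stored: %d, missing: %d%n",
                    requestsSent, ordersStored, requestsSent - ordersStored);
        }
    }
}
```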
Result:
With the Inbound and Outbound APIs running together, our team were able to accurately simulate the expected load volumes and measure the capacity of the MuleSoft platform, the WMS and the customer applications.
End-to-end performance testing revealed several problems, which were rectified during the testing window and successfully re-tested. Issues identified included the following:
- Infrastructure limitations meant that a vital API could not handle more than 350 requests per minute, around half the expected level of load.
- Data loss was observed during test execution; this was traced to excessive payloads exceeding Mule's maximum payload size configuration, allowing the team to make the necessary changes.
Once the required fixes and configuration changes were made, the end-to-end response time was verified at under one second for the planned volume of 700 order requests per minute.
Our team designed the performance framework so that, once these tests had been completed, it could continue to be used as a re-usable performance regression pack. With some minor adjustments, it can also be re-purposed for our client’s other customers, providing additional value.