Prolifics Testing managed the overall test programme for a release of Race for Life, comprising a comprehensive update to the web front-end and a new internally developed call centre management system - the largest and most significant bespoke development ever carried out by the charity.
We deployed a Test Manager, responsible for developing a high-level test plan for the programme that defined the standards, scope and responsibilities of each test phase, together with an estimate of the effort each phase would require.
A risk-based approach was taken, with test phases structured to fit the hybrid methodology in use: part Agile, part V-Model.
Test management throughout the programme included defect analysis and management, and reporting to stakeholders, third parties and Subject Matter Experts across all areas and levels of the business.
Our consultants were deployed at numerous points during the programme, covering System Testing (both Waterfall and Agile), Integration, End-to-End, Device Compatibility, Performance, User Acceptance and Security Testing.
Performance testing
Performance tests were executed on the integrated application, with load injected from the AWS cloud. The tests covered normal, peak and soak scenarios, simulating access from a range of device types. They identified several significant concurrency and load-related issues, in both the application code and the infrastructure, which needed to be resolved before launch.
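The case study does not name the load-injection tooling, so the following is only an illustrative sketch of the profile described (normal, peak and soak runs driven from the cloud), written in Python for the open-source Locust tool; the host, endpoints and payloads are all hypothetical.

    from locust import HttpUser, task, between

    class RaceForLifeVisitor(HttpUser):
        # Hypothetical host and endpoints - the real application URLs are not given.
        host = "https://example.org"
        wait_time = between(2, 10)  # simulated think time between user actions

        @task(3)
        def browse_events(self):
            self.client.get("/events")

        @task(1)
        def register(self):
            self.client.post("/register", json={"event_id": 1, "entry_type": "5k"})

    # Normal, peak and soak runs differ only in user count and duration, e.g.:
    #   locust -f loadtest.py --headless --users 500  --spawn-rate 10 --run-time 1h   (normal)
    #   locust -f loadtest.py --headless --users 5000 --spawn-rate 50 --run-time 30m  (peak)
    #   locust -f loadtest.py --headless --users 500  --spawn-rate 10 --run-time 12h  (soak)

Access from different device types could be approximated in a setup like this by varying the User-Agent header per simulated user class.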
Automation Framework
As part of this programme, a test automation framework was designed and delivered using Micro Focus UFT, covering automated overnight testing of all key business scenarios and every combination of race entry type and merchandise purchase. The tests were integrated into Jenkins so that the full pack ran automatically whenever a change was made.
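The UFT scripts themselves are proprietary and not shown here; as a language-neutral sketch of the combination coverage described, the same matrix of entry types and merchandise purchases could be expressed in Python with pytest. The entry types, merchandise options and journey stub below are all hypothetical.

    import itertools
    from dataclasses import dataclass
    from typing import Optional

    import pytest

    ENTRY_TYPES = ["5k", "10k", "pretty_muddy"]  # hypothetical entry types
    MERCHANDISE = [None, "t_shirt", "medal"]     # hypothetical merchandise options

    @dataclass
    class Order:
        entry_type: str
        merchandise: Optional[str]
        confirmed: bool

    def submit_entry(entry_type, merchandise):
        # Stand-in for the end-to-end journey the UFT scripts drove against
        # the web front-end; here it simply records the combination under test.
        return Order(entry_type, merchandise, confirmed=True)

    # One test case per (entry type, merchandise) combination.
    @pytest.mark.parametrize("entry_type,merch", itertools.product(ENTRY_TYPES, MERCHANDISE))
    def test_entry_and_merchandise_combination(entry_type, merch):
        order = submit_entry(entry_type, merch)
        assert order.confirmed

A Jenkins job pointed at a pack like this and triggered on every commit gives the run-on-change behaviour described above.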
Initial results
The automated runs gave significantly increased confidence that a change had introduced no regressions, and highlighted problems for investigation where it had. The test pack was also run against a range of browsers, providing a compatibility check on every change.
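The cross-browser tooling is likewise not named; one common way to run the same pack across several browsers is to parametrise a test fixture, sketched here in Python with pytest and Selenium. The browser list and URL are hypothetical.

    import pytest
    from selenium import webdriver

    BROWSERS = {  # hypothetical browser matrix
        "chrome": webdriver.Chrome,
        "firefox": webdriver.Firefox,
    }

    @pytest.fixture(params=list(BROWSERS))
    def browser(request):
        driver = BROWSERS[request.param]()  # each test runs once per browser
        yield driver
        driver.quit()

    def test_homepage_loads(browser):
        browser.get("https://example.org")  # placeholder URL
        assert browser.title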
User Acceptance and Device Compatibility Testing
Next, a User Acceptance Testing phase was initiated, in which tests and data were specified by each business area while execution and results were managed centrally. A series of mobile compatibility tests was then executed against a matrix of devices to identify and eliminate device-specific problems.
Programme results
The outcome of this programme of testing, which lasted around nine months and involved ten consultants at different stages, was that the campaign launched successfully, to major fanfare and promotion, ahead of the anticipated spike in registrations once the season commenced.