Client
Our client is a US-based company that specializes in creating innovative, interoperable, and highly customized digital platforms for customers in the insurance domain. Their mission is to provide efficient, cost-effective software for collecting and distributing information for life insurance applications, and they build products that simplify the process of obtaining insurance for end users. They needed a team to maintain software quality excellence, and that is how they found Elinext.
Project Description
Our client needed a development team large and capable enough to handle the full workload connected with their digital platforms.
They wanted to outsource part of the work to an experienced software engineering team, and they approached Elinext with a request to oversee their platform’s quality assurance. Their main goal was to delegate that part of the development process to experienced professionals. Elinext’s tasks are to ensure that the platform is bug-free, compliant with industry standards, and a high-quality product overall.
Challenges
The main challenge was ensuring that our client’s digital platform for the insurance industry was free of defects, met the industry’s regulatory requirements, and delivered a seamless user experience. The project is large: the client company has over 20 years of industry experience, most of which have involved our cooperation, and they need continuous QA services. Their project demands rigorous testing across multiple devices, browsers, and scenarios to prevent potential issues from reaching end users. The business objectives given to Elinext were:
- Ensure the platform’s functionality is reliable and error-free.
- Achieve compliance with industry standards.
- Increase user satisfaction by delivering a stable, high-quality product.
Process
Our collaboration with the client, who acts as the product owner, is close. We hold regular sync meetings and provide detailed bug reports. Both sides consider the communication smooth, and any issues that arise are quickly resolved.
The approach from our team included the following:
- Comprehensive test planning and strategy development, including risk-based testing.
- Manual and automated testing to cover all functional and non-functional requirements.
- Cross-browser and cross-device testing to ensure consistent user experience across platforms.
- Performance testing to ensure the platform handles high loads without degradation.
Testing delivered by our team follows the same four-stage scenario:
Stage 1: Test Planning and Strategy Development - Developing a comprehensive test plan based on project requirements.
Stage 2: Test Execution - Performing manual and automated testing across different stages of the project.
Stage 3: Bug Reporting and Resolution - Collaborating with the development team to ensure quick resolution of issues.
Stage 4: Final Validation and Sign-off - Conducting final testing to ensure all issues were resolved before the product launch.
Our client is delighted with our team and the processes we’ve embraced and is planning to use our QA team for all of their future web projects.
Solution
As we provide full-scale QA services for this client, it is difficult to pinpoint a single solution to the challenges presented. However, we can describe the types of tests we conduct and the frameworks we use in the process.
The QA team implemented a robust and flexible automation framework using the Playwright testing framework, designed to address the complex needs of testing a monolithic application.
The solution was built around the following key components:
1. Playwright Framework Adoption
Cross-Browser Testing
Playwright's ability to run tests across multiple browsers and environments was leveraged to ensure compatibility with Chrome and Edge on both macOS and Windows.
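As a minimal sketch of how such a setup can be expressed in a Playwright configuration (the base URL below is a hypothetical placeholder, not the client's real environment):

```typescript
// playwright.config.ts -- a minimal cross-browser setup sketch
import { defineConfig, devices } from '@playwright/test';

export default defineConfig({
  testDir: './tests',
  use: {
    baseURL: 'https://staging.example.com', // hypothetical environment
  },
  projects: [
    // Chromium-based Chrome, run on both macOS and Windows CI agents
    { name: 'chromium', use: { ...devices['Desktop Chrome'] } },
    // Branded Microsoft Edge, selected via Playwright's "msedge" channel
    { name: 'edge', use: { ...devices['Desktop Edge'], channel: 'msedge' } },
  ],
});
```

Defining one project per browser lets the same test suite run unchanged against every target, with the operating-system spread handled by the CI agents that execute each project.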
API Testing
The framework was designed to validate API services, focusing on data input/output validation, performance checks, and message consistency.
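A hedged sketch of what such an API-level check looks like with Playwright's built-in request fixture; the /api/cases endpoint and payload are hypothetical stand-ins for the platform's real services:

```typescript
import { test, expect } from '@playwright/test';

test('case service returns consistent data for a submitted application', async ({ request }) => {
  // The request fixture resolves relative paths against the configured baseURL
  const response = await request.post('/api/cases', {
    data: { applicantId: 'A-1001', product: 'term-life' }, // hypothetical payload
  });
  expect(response.ok()).toBeTruthy();

  const body = await response.json();
  // Output validation: the service should echo the input and assign a status
  expect(body.applicantId).toBe('A-1001');
  expect(['received', 'in-review']).toContain(body.status);
});
```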
2. Test Architecture
Test Pyramid Model
The architecture followed the test pyramid model, prioritizing unit tests for individual components, integration tests for component interactions, and end-to-end tests for full application workflows.
Unit Tests
Focused on ensuring that individual functions and components work correctly in isolation.
Integration Tests
Validated that different components and services work together as expected, with a particular focus on API interactions and data validation.
End-to-End Tests
Simulated real user interactions to validate entire workflows, providing the highest level of confidence in the application's reliability.
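To make the base of the pyramid concrete, here is a minimal illustration of a unit-level test of a pure helper function, runnable with Playwright's own test runner; the helper itself is hypothetical:

```typescript
import { test, expect } from '@playwright/test';

// roundPremium is a hypothetical pure helper, shown only to illustrate
// testing a single function in isolation.
function roundPremium(amount: number): number {
  return Math.round(amount * 100) / 100;
}

test('roundPremium rounds to whole cents @unit', () => {
  expect(roundPremium(10.456)).toBe(10.46);
});
```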
3. Data Object Model
A robust data object model was constructed to support flexible and reliable test data management. This model allowed the team to handle complex data scenarios, such as varying vendor-specific data inputs and outputs.
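A simplified sketch of the idea behind such a model: test data is described once as typed objects, and vendor-specific variations are derived from a shared base. The names and fields below are illustrative, not the client's actual schema:

```typescript
interface CaseData {
  vendor: string;
  applicant: { firstName: string; lastName: string; dateOfBirth: string };
  coverageAmount: number;
}

// A shared baseline case used by most tests
const baseCase: CaseData = {
  vendor: 'default',
  applicant: { firstName: 'Jane', lastName: 'Doe', dateOfBirth: '1985-04-12' },
  coverageAmount: 250_000,
};

// Vendor-specific overrides are merged onto the base object, so adding a new
// vendor only requires declaring what actually differs.
export function caseFor(vendor: string, overrides: Partial<CaseData> = {}): CaseData {
  return { ...baseCase, ...overrides, vendor };
}

const vendorACase = caseFor('vendor-a', { coverageAmount: 500_000 });
```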
4. Automation Framework Features
Reporting and Analysis
The framework included advanced reporting features, allowing test results to be grouped by vendors, functions, and priorities (e.g., regression, smoke tests).
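One common way to support this kind of grouping is to embed tags in test titles; the vendor and priority tags below are illustrative:

```typescript
import { test, expect } from '@playwright/test';

// Tags in the title let results be filtered and grouped by vendor and priority
test('new case form loads for vendor-a @smoke @vendor-a', async ({ page }) => {
  await page.goto('/cases/new');
  await expect(page.getByRole('heading', { name: 'New Case' })).toBeVisible();
});
```

A targeted run such as `npx playwright test --grep @smoke` then executes only the smoke suite, and the same tags can drive the grouping logic in custom reports.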
PDF Comparison Function
A specialized function was developed to compare PDF outputs, ensuring that data was accurately reflected in the generated documents.
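One possible shape for such a helper, sketched here with the open-source pdf-parse package (an assumption for illustration; the actual implementation is not public). It extracts the text of both documents and reports expected lines missing from the generated one:

```typescript
import { readFileSync } from 'node:fs';
import pdf from 'pdf-parse';

async function extractText(path: string): Promise<string> {
  const data = await pdf(readFileSync(path));
  return data.text;
}

// Returns the expected lines that are absent from the actual PDF;
// an empty array means the generated document matches.
export async function comparePdfText(expectedPath: string, actualPath: string): Promise<string[]> {
  const expected = await extractText(expectedPath);
  const actualLines = new Set((await extractText(actualPath)).split('\n'));
  return expected.split('\n').filter(line => line.trim() && !actualLines.has(line));
}
```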
CI/CD Integration
The framework was integrated with GitLab CI/CD pipelines, enabling automated test execution with every code change, thus ensuring continuous quality assurance.
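A minimal sketch of what such a pipeline job could look like; the image tag and script names are illustrative, not the client's actual configuration:

```yaml
# .gitlab-ci.yml -- run the Playwright suite on every code change
playwright-tests:
  stage: test
  image: mcr.microsoft.com/playwright:v1.44.0-jammy  # official Playwright image
  script:
    - npm ci
    - npx playwright test
  artifacts:
    when: always          # keep reports even when tests fail
    paths:
      - playwright-report/
```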
Page Object Model
For UI testing, a base Page Object Model was created to streamline the creation and maintenance of test cases across different pages and components.
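A minimal sketch of such a base page object and one derived page; the routes, labels, and selectors are illustrative:

```typescript
import { type Page, type Locator } from '@playwright/test';

// Shared navigation logic lives in the base class
export abstract class BasePage {
  constructor(protected readonly page: Page, private readonly path: string) {}

  async goto(): Promise<void> {
    await this.page.goto(this.path);
  }
}

// Each concrete page encapsulates its own locators and actions
export class NewCasePage extends BasePage {
  readonly applicantName: Locator;
  readonly submitButton: Locator;

  constructor(page: Page) {
    super(page, '/cases/new');
    this.applicantName = page.getByLabel('Applicant name');
    this.submitButton = page.getByRole('button', { name: 'Submit' });
  }

  async createCase(name: string): Promise<void> {
    await this.applicantName.fill(name);
    await this.submitButton.click();
  }
}
```

Centralizing locators this way means a UI change is fixed in one page class rather than in every test that touches the page.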
5. Initial Test Case Implementation
Business/Workflow Testing
Test cases were designed to cover the full business workflow, from creating new cases to verifying the status updates and final data outputs.
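A hedged end-to-end sketch of the create-case-then-verify-status workflow, reusing the page object sketched earlier; the routes, test IDs, and status text are assumptions:

```typescript
import { test, expect } from '@playwright/test';
import { NewCasePage } from './pages/new-case-page'; // the page object sketched above

test('new case moves to "In Review" after submission', async ({ page }) => {
  const newCase = new NewCasePage(page);
  await newCase.goto();
  await newCase.createCase('Jane Doe');

  // Verify the status update that downstream steps depend on
  await expect(page.getByTestId('case-status')).toHaveText('In Review');
});
```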
Integration Testing
Focused on validating API services, ensuring data accuracy, and maintaining database consistency.
6. PDF Validation
Test cases were developed to validate that data was correctly represented in PDFs generated by the application.
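Such a test case might tie the PDF comparison helper sketched earlier into the suite like this; the fixture paths are illustrative:

```typescript
import { test, expect } from '@playwright/test';
import { comparePdfText } from './helpers/compare-pdf'; // the helper sketched above

test('generated policy PDF contains all expected fields @regression', async () => {
  const missing = await comparePdfText('fixtures/expected-policy.pdf', 'output/policy.pdf');
  // An empty array means every expected line appears in the generated document
  expect(missing).toEqual([]);
});
```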
7. Continuous Improvement and Adaptation
The QA team continuously built and updated the data object model to adapt to new vendors and application versions. They also enhanced performance testing capabilities and revisited API testing to ensure ongoing reliability.