QA & Test Automation for STRABAG Data Platform
Client:
STRABAG
Country:

Type of service:
Manual testing,
Automated testing
Background:
The client is STRABAG, one of the five largest construction companies in Europe. It has been providing construction services for more than 180 years and delivers over 15,000 projects worldwide per year.
Scope:
STRABAG is a multi-project company. The client's product required full-cycle testing and was already live when the QA engineer provided by Bergmann Infotech joined the team.
The task was to provide a broad scope of testing services for the client's internal data storage and analytics platform. Since the platform was already live and had thousands of active internal users, it was critical to align the QA expert's work with the existing policies: to cause zero interference while delivering the maximum positive impact from day one.
The client was looking for a QA service provider that would not just perform functional web testing, but would also do the following:
- Review, analyze, and test the product design, its functional requirements, and its integration requirements.
- Execute full regression tests before every release.
- Become part of the development team for better defect analysis throughout the entire defect life cycle.
- Investigate and report defects; verify bug fixes.
- Prepare test documentation in the wide variety of document types required by the internal product development standards.
- Develop an automation framework to optimize regression testing and make the tests part of the CI pipeline.
Solution:
The QA engineer joined the development team during the development stage. From the first day, two main goals were set:
- to test the product thoroughly
- to identify opportunities to improve the testing process at the same time.
Before proposing or making any changes to the QA process, the QA engineer became thoroughly familiar with the product in several ways: executing tests, reading all available documentation, and running exploratory testing.
The first important suggestions, made once a clear picture of the testing process had formed, included the following:
- To set up a stable testing environment that is as production-like as possible.
- Since the platform is a data storage product, to analyze which data is most used in production and to include this data analysis in the product requirements.
- Based on the results of that data analysis, to plan testing activities that cover the processing of the required data types on the platform.
- To prepare a protected set of test data.
- To test the product designs before development starts, in order to understand the planned functionality early and to find possible omissions, conflicts, and defects as early as possible.
- To run smoke testing on production as soon as possible after every release, in a way that is seamless for active users (a sketch of such a check follows this list).
- To automate the quality status reports.
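As an illustration of the production smoke-testing suggestion, here is a minimal sketch of a post-release check using Rest Assured from the project's tech stack. The base URI and the /api/health endpoint are hypothetical placeholders, not the client's real ones; a read-only GET keeps the check invisible to active users.

```java
import static io.restassured.RestAssured.given;
import static org.hamcrest.Matchers.equalTo;

import org.junit.jupiter.api.Tag;
import org.junit.jupiter.api.Test;

// Post-release smoke check; tagged so CI can run it right after deployment.
@Tag("smoke")
class ProdSmokeTest {

    @Test
    void platformIsUpAfterRelease() {
        given()
            .baseUri("https://platform.example.internal") // hypothetical prod URL
        .when()
            .get("/api/health")                           // hypothetical endpoint
        .then()
            .statusCode(200)                              // service responds
            .body("status", equalTo("UP"));               // and reports healthy
    }
}
```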
The first scope of work, in the early stage, was a combination of
- testing the product across all required testing activities
- setting up the test documentation
- actively implementing the test process improvements.
The second scope of work became possible once the test process had been improved and stabilized: developing test automation using Java, Rest Assured, and the Selenide framework.
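For illustration, here is a minimal sketch of what a UI regression check in this stack can look like, using Selenide with JUnit. The URL, locators, and expected values are hypothetical placeholders, not the client's actual platform.

```java
import static com.codeborne.selenide.Condition.text;
import static com.codeborne.selenide.Condition.visible;
import static com.codeborne.selenide.Selenide.$;
import static com.codeborne.selenide.Selenide.open;

import org.junit.jupiter.api.Test;

class DatasetSearchTest {

    @Test
    void searchReturnsMatchingDataset() {
        open("https://platform.example.internal/datasets"); // hypothetical URL
        // Type a query into the search field and submit it.
        $("#search-input").setValue("monthly-report").pressEnter();
        // The first result row should appear and contain the query term.
        $(".result-row").shouldBe(visible)
                        .shouldHave(text("monthly-report"));
    }
}
```

Selenide manages the browser lifecycle and waits for elements on its own, which keeps such checks short and straightforward to run both locally and in the CI pipeline.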
Testing services continue to be provided, covering all the requested types of testing within the agreed timeline:
- Functional testing: to verify the product behavior against the requirements.
- Performance testing: to verify that the product meets its load and response-time targets.
- UX testing: to validate the interface and the related user experience.
- Compatibility testing: to run cross-browser and cross-platform tests.
- Integration testing: to check the stability and correctness of communication between the platform's modules.
- End-to-end (E2E) testing: detailed, system-level testing based on use cases and user behavior of the product under test.
- Confirmation testing: to retest fixed functionality and verify that previously reported defects are resolved.
- Smoke testing: to check the basic functionality after every release.
- Regression testing: to run a planned, clearly defined scope of tests between development and deployment, providing a reliable status of product quality and release readiness.
Results:
- Every product release candidate is tested on pre-defined browsers and platforms before the release.
- Detected problems are logged, prioritized, and quickly communicated to the developers. Only the test management and bug reporting tools are used for this; there are no side channels for tracking or monitoring defects.
- Constant, clear, and fast communication is established between the QA engineer, the developers, and the product managers.
- The quality status reports are automated using test management and reporting tools.
- A reliable, constantly maintained library of test documentation exists and is actively used.
- The QA engineer not only participates actively in testing activities, but also supports other QA engineers working on and joining the project, reviews their autotests, and monitors the quality gate parameters.
It is also worth highlighting how much the automation delivered by the QA engineer improved the testing process overall:
- Full manual regression is increasingly replaced by automated test runs; manual effort is now required only for new or not-yet-automated cases.
- The time and cost saved by the implemented automation are significant. For example, just one of the platform's modules used to require 32 hours per month of manual work to execute regression and end-to-end tests, roughly 384 hours per year for that module alone; after automation, the tests run automatically every night with zero manual work required (see the sketch after this list).
- Automation also made the tests more stable and increased test coverage, which is now clearly measured.
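A minimal sketch of how such nightly runs can be wired up, assuming regression tests are tagged and the nightly CI job filters on that tag; the class, steps, and assertion below are illustrative placeholders. The Allure steps tie each run into the automated quality status reports mentioned above.

```java
import io.qameta.allure.Allure;

import org.junit.jupiter.api.Assertions;
import org.junit.jupiter.api.Tag;
import org.junit.jupiter.api.Test;

// Tagged "regression" so the nightly CI job can select it via JUnit tag filtering.
@Tag("regression")
class NightlyUploadRegressionTest {

    @Test
    void uploadedFileAppearsInListing() {
        Allure.step("Upload a sample file through the API");         // recorded in the Allure report
        Allure.step("Open the listing and check the file is shown"); // recorded in the Allure report
        Assertions.assertTrue(true); // placeholder for the real end-to-end assertion
    }
}
```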
Domain:
- Data Storage
- Big Data
Testing services:
Full testing process including:
- Manual functional web testing
- UI / UX testing
- API testing
- Automated testing (Java + Selenide)
- Test documentation development
Tech Stack:
Java
Gradle
Rest Assured
Allure Report
Selenide
Bug Tracker:
Jira
Test Management system:
Zephyr
Author:
Alena Badzilouskaya
We take care of finding the right solution for you.
Tell us about your needs, and we will call you for an initial chat after receiving your request.