It’s safe to say that the UI is a critical part of any application’s development. The way an app looks and feels has a massive impact on its reception among the target users.
That’s why development teams need to test the UI they build. But the task is hard to accomplish. In fact, most companies set up specialized QA teams whose primary task is testing new versions of their applications. Unit tests and integration tests are helpful and provide strong assurances about the functionality of the code, but they can’t ensure that users will see exactly what they expect. Changing the color palette or moving elements around the interface makes the app unfamiliar. Moreover, if the site looks as if it were about to fall apart, or if critical data isn’t displayed correctly (for example, in fintech products), it can have a detrimental effect on the user experience and even cause customers to lose trust in the brand.
That’s where regression testing comes in. To help development teams track all visual changes and compare them to previous app versions, our experts have developed a new tool that uses powerful Machine Learning algorithms. QAcheck.dev helps teams to avoid unexpected style changes and streamline the UI design process.
Read on to find out how cutting-edge Machine Learning technologies can help make regression testing easier in UI development.
What is UI regression testing?
UI regression testing compares a screenshot of a page to a specified baseline of what it should look like, and outputs a diff image when the two don’t match.
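The core comparison step can be sketched in a few lines. Here’s a minimal example using Pillow’s `ImageChops.difference` – an illustration of the general technique, not QAcheck.dev’s actual implementation:

```python
# Minimal sketch of screenshot-vs-baseline comparison using Pillow.
from PIL import Image, ImageChops

def diff_screenshots(baseline: Image.Image, current: Image.Image):
    """Return (changed, diff_image); changed is True when any pixel differs."""
    diff = ImageChops.difference(baseline.convert("RGB"), current.convert("RGB"))
    return diff.getbbox() is not None, diff

# Example with two in-memory images, identical except for one red pixel.
base = Image.new("RGB", (100, 100), "white")
new = base.copy()
new.putpixel((10, 10), (255, 0, 0))

changed, diff = diff_screenshots(base, new)
print(changed)  # True: the diff image's bounding box covers the changed pixel
```

In practice, the diff image would be saved next to the screenshots so testers can see at a glance where the pixels moved.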
When a difference is found during testing, teams can take various steps: for example, send a message to the responsible team, trigger a PagerDuty alert, or set up their CI/CD pipeline so that the build fails.
But here’s the problem: full regression testing requires a lot of time and effort, especially when teams want to make sure that nothing has been broken by their changes, or when they need to run full regression tests on data-sensitive applications such as fintech products. Ensuring that both the data and the UI layer are correct, and preventing unexpected bugs and regressions, is challenging.
Teams can automate the process using the classic approach of covering the application with test cases and scenarios. But achieving 100% coverage is next to impossible, and performing full regression testing manually isn’t an option either: spending a few workdays on it before each release would be a waste of resources.
We decided to bring a breath of fresh air to UI regression testing by developing a tool that takes advantage of the latest data science innovations and automates the process end to end.
But first, let’s see what QA automation is all about.
QA testing automation – definition and benefits
Automated testing relies on tools that execute scripted sequences to confirm that software functions appropriately and meets specific requirements before being released into production. Its main benefit is that it removes the manual effort involved in testing and moves it into scripts. For example, if unit testing consumes a lot of the QA team’s resources, it’s a viable candidate for automation.
Automated testing tools not only execute tests but also report results and compare outcomes with previous test runs. A team can run automated tests – for example, for UI screenshot testing – at any time and easily fit them into a CI/CD pipeline.
All in all, automated testing makes QA teams more efficient and improves testing accuracy, reporting, coverage, bug detection, and resource usage.
QAcheck.dev – a data science-powered solution
At first, our team was looking for a way to take screenshots, both to create a baseline set and, later on, during the regression run. The goal was to view the app exactly as different users do. To accomplish that, the team took advantage of PhantomJS and the Selenium framework with Facebook’s webdriver, using the same Chromedriver as the end user. To mimic a larger variety of browsers, the team used the BrowserStack platform and several scripts that produced a full set of screenshots of different application pages. Once the baseline screenshots were ready, the team could perform the regression.
To create a new set of application screenshots, QAcheck.dev uses the same procedure. Teams can then compare the new screenshots with the baseline ones using any image diff tool that allows pixel-to-pixel comparison.
The result is a list of screens where differences were found, down to single pixels. Testers then need to review these screens. If the change was intentional – because the development team introduced new functionality – it needs to be tested. If the change wasn’t intended, you’re dealing with a bug that needs to be fixed before the UI reaches the production environment.
QAcheck.dev also allows choosing items to be deliberately ignored when the tool compares screenshots. That can be useful, for example, for a displayed clock, which will almost always change.
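One possible way to implement such ignore regions is to paint the same mask over both images before diffing, so that anything inside the region can never trigger a difference. The region coordinates below are invented for the example:

```python
# Excluding volatile regions (like a displayed clock) from the comparison.
from PIL import Image, ImageChops, ImageDraw

IGNORED_REGIONS = [(300, 0, 400, 30)]  # e.g. a header clock, as (l, t, r, b)

def masked_diff(baseline: Image.Image, current: Image.Image) -> bool:
    """Return True when the images differ outside the ignored regions."""
    a, b = baseline.convert("RGB").copy(), current.convert("RGB").copy()
    for img in (a, b):
        draw = ImageDraw.Draw(img)
        for box in IGNORED_REGIONS:
            draw.rectangle(box, fill="black")  # identical fill hides the region
    return ImageChops.difference(a, b).getbbox() is not None

base = Image.new("RGB", (400, 200), "white")
new = base.copy()
ImageDraw.Draw(new).text((310, 5), "12:34", fill="black")  # only the clock changed

print(masked_diff(base, new))  # False: the change sits inside an ignored region
```

A change anywhere outside the masked rectangle would still be reported as usual.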
Adding data science to the mix
The MVP of QAcheck.dev was based on a sitemap and Selenium-based crawlers that can reach different parts of the app. For the parts that are hard to reach with classical crawling (for example, multi-step forms), the team had to record or write automated scripts with tools like Selenium or Cucumber. Moreover, the team aimed to automate the creation of Selenium scripts covering test cases and scenarios that aren’t reachable from the sitemap itself.
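The sitemap-driven part of the crawl can be reduced to pulling page URLs out of a standard sitemap.xml so that each one can be screenshotted. The XML string below stands in for a fetched file; QAcheck.dev’s actual crawler is more involved:

```python
# Rough sketch: extract page URLs from a standard sitemap.xml.
import xml.etree.ElementTree as ET

SITEMAP_XML = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/pricing</loc></url>
</urlset>"""

def pages_from_sitemap(xml_text: str) -> list[str]:
    """Return every <loc> URL listed in the sitemap."""
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.findall("sm:url/sm:loc", ns)]

print(pages_from_sitemap(SITEMAP_XML))
# ['https://example.com/', 'https://example.com/pricing']
```

Each URL in the returned list then becomes one entry in the screenshot run – which is exactly why pages hidden behind multi-step interactions need scripted scenarios instead.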
Machine Learning implementation
Extracting more data from the image comparison yields a more accurate description of bugs. QAcheck.dev offers automated image recognition: the tool can track exactly what has changed and generate the right bug ticket.
For example, QAcheck.dev creates a difference map that displays all the pixels that have changed between the two app versions (as a map or a short summary). Moreover, instead of reporting a raw pixel-to-pixel change, the solution shows which abstract part of the interface has changed (for example, the login button was moved to the upper-left corner and shrunk by 20%). QAcheck.dev uses deep learning image captioning – Recurrent Neural Networks (LSTMs) combined with Convolutional Networks – complemented by more standard programming approaches and image processing techniques.
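To make the “difference map” idea concrete, here is a heavily simplified stand-in using only Pillow: split the diff into a coarse grid and report which cells changed. This is a toy localization step, not the CNN+LSTM captioning the tool actually uses:

```python
# Toy "difference map": which grid cells of the screenshot changed?
from PIL import Image, ImageChops

def changed_cells(baseline: Image.Image, current: Image.Image, grid: int = 4):
    """Return (row, col) grid cells containing at least one changed pixel."""
    diff = ImageChops.difference(baseline.convert("RGB"), current.convert("RGB"))
    w, h = diff.size
    cw, ch = w // grid, h // grid
    cells = []
    for row in range(grid):
        for col in range(grid):
            box = (col * cw, row * ch, (col + 1) * cw, (row + 1) * ch)
            if diff.crop(box).getbbox() is not None:
                cells.append((row, col))
    return cells

base = Image.new("RGB", (80, 80), "white")
new = base.copy()
new.paste((255, 0, 0), (60, 0, 80, 20))  # simulate a moved button, top-right

print(changed_cells(base, new))  # [(0, 3)]: only the top-right cell changed
```

A captioning model takes this localization much further, mapping the changed region back to a named UI element and describing the change in words.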
Taking UI regression testing to the next level
QAcheck.dev brings a brand-new approach to UI regression testing, giving its users a simple and resource-efficient way to test the interfaces of their digital products before a new release. By automating the process, QA teams save time and can dedicate their attention to mission-critical improvements instead of time-consuming manual testing.