QA metrics report overview: Which metrics testers measure

Anton Bravin
Updated on Sep 9, 2024

Do testers and software developers need to measure QA metrics? How do professionals across the industry perceive them? What are the crucial metrics, and what value do they offer to QA and project teams? Zebrunner conducted research into the metrics QA engineers measure to find answers to these questions.

Why do we do this?

As developers of software products in the quality assurance domain, we at Zebrunner continually explore trends and challenges in our professional field. In theory, QA metrics seem important mostly to meticulous professionals rather than to every tester. We already have product metrics and general KPIs, so do we really need dedicated metrics for the QA department? On the one hand, they appear to be a crucial aspect of testers' work. On the other hand, QA metrics demand additional time and human resources, as well as dedicated software. We analyzed the responses of 98 professionals and gathered a range of insights about QA metrics.

GET THE REPORT

Having been active in the QA market since November 2013, we've observed a shift in how IT companies perceive the entire QA process. What was once affordable mostly for large corporations is now a mandatory part of the SDLC for all software developers. Forecasts support this observation: the software testing market is growing significantly, estimated at $45 billion in 2022 and projected to reach $90 billion by 2032.

Studying crucial QA metrics lets us uncover the primary purpose of QA reporting for different software companies. These metrics serve as indicators of the success or failure of software product development, and the reporting tools testers use show which QA solutions they rely on to evaluate that success. Understanding the value of measuring QA metrics explains the advantages QA teams derive from this practice. Although it demands time and human resources, many companies have already integrated it into their workflows, so we explore the reasons behind adopting this approach.

Demography: professionals, team size, and market segments

We sought respondents for our research through our network and professional QA communities. Among them, 49% are manual QA engineers, 29% are test automation engineers, and the rest include QA team leaders, developers, and others. By market segment, 36% of respondents work in technology and electronics, 26% in financial services and banking, 16% in retail and e-commerce, and the rest in consumer goods, healthcare, media and entertainment, and education and e-learning.

Regarding team size, our research revealed that most companies where respondents work prefer small departments with 1-5 QA team members.

The role of testing in overcoming software development challenges

Here, the research found that challenges related to minimizing risks and costs are considered less important. QA teams tend to underestimate test reporting capabilities and focus mostly on basic metrics. For example, 73.9% of respondents identified improving software reliability and stability as a key challenge in which testing plays a crucial role.

QA reporting insights

Practice and stakeholders

Although QA teams tend to underestimate reporting capabilities, reports are crucial for 80% of respondents. This indicates that basic test results alone are not enough and that teams need QA analytics to drive future improvements.

QA reports hold significance for various stakeholder groups, with 68.8% of respondents indicating their importance for project and department teams. Additionally, 43.8% of respondents named senior management as a stakeholder in QA reports.

Essential metrics

The research shows that most companies value QA metrics connected with routine QA processes and short-term results, while long-term results, opportunities to increase efficiency, and less obvious insights matter less to them. Test execution metrics are crucial for 81.3% of respondents, but test automation metrics are important for only 40%.

Time to prepare

Regarding the time to prepare a QA report, 27% of respondents use automatic report generation and get a report within 2-5 minutes, while 55% spend 1-2 hours on report preparation. However, a notable share of respondents – 17% – spend from 2 to 8 hours or even more preparing QA reports, which represents a significant waste of time and human resources.

Tools for reporting

Exploring the landscape of tools our respondents use to measure QA metrics, we found that 42% of those who have adopted the practice of preparing QA reports don't use dedicated tools for the task. This suggests that organizations may undervalue the benefits and features that QA reporting solutions offer.

Delving into QA reporting, we found that companies use software testing tools, issue and project tracking software, testing frameworks, application lifecycle management tools, and more to calculate QA metrics. This indicates a variety of reporting requirements and needs, as well as differing attitudes toward QA reporting within a company or QA department.

QA metrics values

The key QA metric values for the team and project mirror the results for software development challenges and essential metrics: companies see the value of QA metrics in the quantitative domain, while advanced reporting capabilities such as data-driven support for decision-making are rarely used. 63% of respondents said that quantifying testing effort is valuable for them, whereas data-driven support for decision-making is important for only 26.1%.

Takeaways and conclusions

Team size and QA practice. Smaller teams are prevalent, with 51% of respondents working in teams of 1-5 people. Despite their size, these teams prioritize QA reporting, emphasizing a commitment to quality.

Diverse tool landscape. QA engineers employ a wide range of tools, from software testing tools and issue and project tracking software to spreadsheets and application lifecycle management tools. This variety reflects differing reporting needs and demonstrates QA engineers' flexibility and willingness to adopt different technologies.

Data-driven decision making. Only 26% of respondents prioritize QA metrics for enhancing data-driven decision-making. This suggests that QA teams may be unwilling to use, or unaware of, the advanced capabilities offered by QA reporting tools.

Many-sided QA value. QA metrics show multi-dimensional value, aiding in quantifying efforts, assessing team performance, and achieving project transparency beyond just finding defects.

GET THE REPORT

About the author

Anton Bravin

Product Marketing Manager

8 years on a mission to bridge the gap between cutting-edge products and the audiences who need them most. Anton speaks the languages of developers and customers, allowing him to craft compelling narratives showcasing true value.