A Better Way of Reporting Performance Test Results

Powerful reporting of test results is one of the holy grails of our profession. If done correctly, it improves the project's quality and helps us focus on the actual issues. But if done badly, it adds confusion and reduces the value that testers bring.

Reporting the results of functional tests is relatively simple because these tests have a clear pass or fail result. Reporting the results of performance testing is much more nuanced.

Let's start with a definition: For the purpose of this guide, I use the term performance test to mean any test that performs a measurement, where a range of numerical values is considered an acceptable result. It may be a measurement of power consumption, the number of users a website serves in parallel, the rate at which data can be read from a disk, and so on: any measurement of a nonfunctional requirement.

The first challenge in performance testing is deciding what counts as a "pass." Often this is neglected in the requirements definition stage. I've seen many requirements that read something like, "Data extraction time from the database shall be under 10 ms," or "The rate of processing a movie file shall be at least 100 frames per second (fps)." Such requirements are incomplete, since they do not include the actual target we want to hit. We only know the worst result we are willing to tolerate and still approve the product. There are two issues here.

First, let's assume I conducted a test and found that video file processing is done at a rate of 101 fps (recall that the requirement was "at least 100 fps"). Sounds great, right? But does this mean we are close to the edge (that is, the product barely meets the requirement), or that everything is fine? If the requirement were well defined, it would have contained both a goal and a minimum: for instance, goal: 120 fps; minimum: 100 fps. With such a requirement, a result of 101 fps clearly indicates the product barely meets the requirement.

Second, when a test fails slightly (e.g., 99 fps), the product manager is under pressure to be "flexible" and accept the product as is. How often have we heard, "True, we are below the minimum, but we are nearly passing, so we can call it fine"? If the full requirement were available (goal: 120 fps), it would be clear how far the results are from the target and that the product has a real problem.
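The two-value requirement described above is easy to express in code. The following is a minimal sketch (the `classify` helper and its verdict names are my own invention, not from any standard) of evaluating a higher-is-better measurement against both a minimum and a goal:

```python
# Hypothetical helper: classify a measured value against a requirement
# that specifies both a minimum (worst tolerable) and a goal (target).
def classify(value, minimum, goal):
    """Return a verdict for a higher-is-better metric."""
    if value < minimum:
        return "fail"
    if value >= goal:
        return "pass"
    # Between minimum and goal: passing, but close to the edge.
    return "marginal"

print(classify(101, minimum=100, goal=120))  # marginal
print(classify(99,  minimum=100, goal=120))  # fail
print(classify(125, minimum=100, goal=120))  # pass
```

With only a single threshold, the first two cases would have collapsed into a bare pass/fail; the three-way verdict is what surfaces the "barely passing" situation.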

For the sake of completeness, I will mention that a nonfunctional requirement should specify not just a target and a minimum, but also the test method, because the test method influences the results. For example, when measuring CPU utilization, the results can vary significantly depending on how we perform the measurement. Do we report the maximum value measured? Over how long a period? Do we average samples? How many samples per second? What's running on the CPU in parallel with our test?
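To make the point concrete, here is a small illustration with made-up utilization samples (the numbers are invented for the example): the same measurement session yields very different headline figures depending on whether the method says "maximum" or "average."

```python
# Illustrative only: fabricated CPU utilization samples (%, one per second).
samples = [12, 15, 95, 14, 13, 16, 90, 15]

maximum = max(samples)                 # worst single sample
average = sum(samples) / len(samples)  # mean over the whole window

print(f"max: {maximum}%")      # max: 95%  -> looks like a problem
print(f"avg: {average:.1f}%")  # avg: 33.8% -> looks fine
```

A requirement that merely says "CPU utilization shall be under 50%" passes or fails depending on which of these two numbers the tester happens to report.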

In theory, reporting performance test results should not be an issue at all: just present the results and indicate a pass or fail. But again, we do not just want to know the result; we want an idea of how the result relates to the target. Crafting a report that is not overly complex but still delivers a comprehensive picture of the status is a balancing act.

We can use a table:

But because most products have many performance requirements, we will end up with a huge table filled with numbers, and it will be hard to see at a glance where there's a problem. We could use color to improve readability:

But this raises more questions. Does it make sense that frame processing rate and CPU utilization get the same color code? One is barely passing, while the other is well within the acceptable range. So perhaps we color frame processing in red? But then what color would we use for an outright failure? And how close to the minimum can a result be and still count as green before it should become yellow? Not to mention the difficulties this creates for color-blind readers.
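One way to answer the "same color code" question is to color by the result's relative position between the minimum and the goal, rather than by pass/fail alone. This is a sketch of that idea; the `status` helper and its 0.25/0.75 thresholds are arbitrary choices of mine, not a standard:

```python
# Sketch of position-based coloring: compute where a result sits within
# [minimum, goal] and pick a color band from that fraction.
# The 0.25 and 0.75 cutoffs below are arbitrary, for illustration only.
def status(value, minimum, goal):
    position = (value - minimum) / (goal - minimum)  # 0 = minimum, 1 = goal
    if position < 0:
        return "red (fail)"
    if position < 0.25:
        return "red (barely passing)"
    if position < 0.75:
        return "yellow"
    return "green"

print(status(101, 100, 120))  # red (barely passing)
print(status(70, 0, 100))     # yellow
print(status(95, 100, 120))   # red (fail)
```

Of course, this still leans on color alone, which is exactly the accessibility problem mentioned above; the lab-report format below avoids it.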

I was thinking about this issue when my doctor sent me for my yearly blood test, which I do meticulously (about every three years). Anyhow, the results from the laboratory included a list of dozens of numbers displayed in this format:

Even though I'm not a physician, I could tell immediately which results were fine, which were marginal, and which were something I should discuss with my doctor.

A light bulb went on in my head: Why not use this method for reporting performance tests? I took a few data points and experimented with PowerPoint:

Notice that I still use colors, but the axis explains the color choice and shows where higher is better and where lower is better in a color-independent way. The reader can clearly see the position of each measurement inside the allowed range; the colors serve mainly to focus attention where there is trouble. Creating such a report takes some time, but it could be automated.
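The lab-report idea can even be automated in plain text, with no color at all. Below is a minimal sketch (the `range_bar` helper is hypothetical, and the metric names and limits are invented examples) that draws each result as a marker inside its allowed range; note that a lower-is-better metric is handled simply by giving the minimum and goal in the opposite order:

```python
# Render a measurement as a marker inside its [minimum, goal] range.
# Works for lower-is-better metrics too: pass the limits in either order.
def range_bar(name, value, minimum, goal, width=30):
    lo, hi = min(minimum, goal), max(minimum, goal)
    pos = round((value - lo) / (hi - lo) * (width - 1))
    pos = max(0, min(width - 1, pos))  # clamp out-of-range results
    bar = ["-"] * width
    bar[pos] = "*"
    return f"{name:22s} {lo:>6} [{''.join(bar)}] {hi:<6} = {value}"

print(range_bar("frame rate (fps)", 101, 100, 120))   # marker near the left edge
print(range_bar("CPU utilization (%)", 55, 80, 40))   # marker comfortably inside
```

Position, not color, carries the information here, so the report stays readable for color-blind readers and in black-and-white printouts.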

I haven't yet seen this idea implemented in a real project--I am still working on that--but if you do use this idea, I'd be delighted to learn about your experience and the response from your organization.
