
What is Performance Testing?


Overview

Performance testing is a non-functional software testing technique that determines how an application’s stability, speed, scalability, and responsiveness hold up under a given workload. It is a key step in ensuring software quality, but it is often treated as an afterthought: performed in isolation, started only after functional testing is complete, and in many cases, after the code is ready for release.

The goals of performance testing include evaluating application output, processing speed, data transfer velocity, network bandwidth usage, maximum concurrent users, memory utilization, workload efficiency, and command response times.
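As a minimal illustration of two of these metrics, the sketch below times a handful of sequential requests against a placeholder endpoint and reports average response time and throughput. The URL and request count are hypothetical, and a real test would use a dedicated load-testing tool rather than a hand-rolled loop.

    # Minimal sketch (not tied to any particular tool): measure average response
    # time and throughput for a small batch of sequential requests.
    import time
    import urllib.request

    URL = "https://example.com/"   # hypothetical test endpoint
    REQUESTS = 20                  # illustrative request count

    durations = []
    start = time.perf_counter()
    for _ in range(REQUESTS):
        t0 = time.perf_counter()
        with urllib.request.urlopen(URL, timeout=10) as resp:
            resp.read()
        durations.append(time.perf_counter() - t0)
    elapsed = time.perf_counter() - start

    print(f"average response time: {sum(durations) / len(durations):.3f} s")
    print(f"throughput: {REQUESTS / elapsed:.1f} requests/s")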

Best practices for implementing Performance Testing

Learn how to adopt a combined "shift left" and "shift right" performance engineering approach to build a highly productive software development organization.


Reasons for performance testing

Organizations run performance testing for at least one of the following reasons:

  • To determine whether the application satisfies performance requirements (for instance, that the system should handle up to 1,000 concurrent users); a rough check of this kind is sketched after this list.
  • To locate computing bottlenecks within an application.
  • To establish whether the performance levels claimed by a software vendor are indeed true.
  • To compare two or more systems and identify the one that performs best.
  • To measure stability under peak traffic events.
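As a rough sketch of the first reason above, the snippet below fires a burst of concurrent requests at a placeholder URL and counts the successful responses. It is illustrative only: the endpoint and user count are hypothetical, and dedicated load-testing tools ramp up virtual users far more realistically than a single simultaneous burst.

    # Rough sketch: can the system serve a burst of concurrent users without errors?
    # URL and user count are hypothetical placeholders.
    import concurrent.futures
    import urllib.request

    URL = "https://example.com/"    # hypothetical endpoint
    CONCURRENT_USERS = 100          # scale toward the stated requirement (e.g., 1,000)

    def one_user(_):
        try:
            with urllib.request.urlopen(URL, timeout=10) as resp:
                return resp.status == 200
        except Exception:
            return False

    with concurrent.futures.ThreadPoolExecutor(max_workers=CONCURRENT_USERS) as pool:
        results = list(pool.map(one_user, range(CONCURRENT_USERS)))

    print(f"successful responses: {sum(results)}/{CONCURRENT_USERS}")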

How to do performance testing

The specific steps of performance testing vary from one organization and application to the next, depending on which performance indicators the business considers most important. Nevertheless, the general goals of performance testing are largely the same across the board, so most testing plans follow a similar workflow.

Identify the test environment and tools

Identify the production environment, the testing environment, and the testing tools at your disposal. Document the hardware, software, infrastructure specifications, and configurations in both the test and production environments to ensure consistency between them. Some performance testing may occur in the production environment, but only with rigorous safeguards that prevent the testing from disrupting production operations.

Define acceptable performance criteria

Determine the constraints, goals, and thresholds that will demonstrate test success. The major criteria will be derived directly from the project specifications, but testers should also be empowered to define a broader set of tests and benchmarks.
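One way to make such criteria concrete is to encode them as explicit thresholds that a results script can check automatically. The sketch below assumes response-time samples and error counts are already available from a test run; the threshold values are illustrative, not requirements taken from this article.

    # Illustrative only: encode acceptance criteria as explicit thresholds and
    # evaluate a set of response-time samples against them.
    import statistics

    THRESHOLDS = {
        "p95_response_time_s": 0.500,   # 95th-percentile response time (example value)
        "error_rate": 0.01,             # at most 1% failed requests (example value)
    }

    def evaluate(durations_s, errors, total):
        p95 = statistics.quantiles(durations_s, n=20)[18]  # 95th percentile
        return {
            "p95_response_time_s": p95 <= THRESHOLDS["p95_response_time_s"],
            "error_rate": (errors / total) <= THRESHOLDS["error_rate"],
        }

    sample = [0.12, 0.30, 0.45, 0.28, 0.51, 0.33, 0.41, 0.25, 0.38, 0.47]
    print(evaluate(sample, errors=0, total=len(sample)))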

Plan and design tests

Consider how widely usage is likely to vary, then create test scenarios that cover all feasible use cases. Design the tests accordingly and outline the metrics to be captured.
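One way to keep the designed scenarios explicit and repeatable is to capture them as data alongside the metrics to collect. The scenario names, user counts, durations, and endpoints below are purely illustrative placeholders.

    # Illustrative scenario definitions; names, numbers, and endpoints are placeholders.
    from dataclasses import dataclass, field

    @dataclass
    class Scenario:
        name: str
        virtual_users: int
        duration_s: int
        endpoints: list
        metrics: list = field(default_factory=lambda: ["response_time", "error_rate", "throughput"])

    SCENARIOS = [
        Scenario("typical weekday traffic", virtual_users=200, duration_s=600,
                 endpoints=["/login", "/search", "/checkout"]),
        Scenario("seasonal peak", virtual_users=1000, duration_s=1800,
                 endpoints=["/search", "/checkout"]),
    ]

    for s in SCENARIOS:
        print(f"{s.name}: {s.virtual_users} users for {s.duration_s}s on {s.endpoints}")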

Prepare test environment and tools

Configure the testing environment before you execute the performance tests, and have your testing tools set up and ready.

Run the performance tests

Execute the tests. Capture and monitor the results.
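However the tests are driven, capturing the raw samples rather than only summary numbers makes later analysis and retesting easier. The sketch below assumes results arrive as simple (timestamp, duration, success) tuples; the file name and sample values are placeholders.

    # Illustrative: persist raw test samples to CSV so they can be re-analyzed later.
    import csv
    import time

    def record_results(samples, path="results.csv"):
        """samples: iterable of (unix_timestamp, duration_s, success) tuples."""
        with open(path, "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(["timestamp", "duration_s", "success"])
            writer.writerows(samples)

    # Placeholder data only:
    record_results([(time.time(), 0.21, True), (time.time(), 0.35, True), (time.time(), 1.02, False)])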

Resolve and retest

Consolidate and analyze the test results, and share the findings with the project team. Fine-tune the application by resolving the performance shortcomings identified, then repeat the tests to confirm that each problem has been conclusively eliminated.
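A minimal sketch of the retest check, assuming a p95 response-time and error-rate target: compare each flagged metric against the baseline run and confirm it now meets its target. All values below are illustrative placeholders.

    # Illustrative retest check: confirm each flagged metric now meets its target
    # and has improved relative to the baseline run.
    BASELINE = {"p95_response_time_s": 0.82, "error_rate": 0.03}
    RETEST   = {"p95_response_time_s": 0.41, "error_rate": 0.005}
    TARGETS  = {"p95_response_time_s": 0.50, "error_rate": 0.01}

    for metric, target in TARGETS.items():
        resolved = RETEST[metric] <= target and RETEST[metric] < BASELINE[metric]
        print(f"{metric}: {'resolved' if resolved else 'still failing'}")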


Tips for performance testing

Create a testing environment that mirrors the production ecosystem as closely as possible. Without that, the test results may not be an accurate representation of the application’s performance when it goes live.

  • Separate the performance testing environment from the UAT environment.
  • Identify test tools that best automate your performance testing plan.
  • Run tests several times to obtain an accurate measure of the application’s performance. If you are running a load test, for instance, run the same test multiple times to determine whether the outcome is consistent before you mark the performance as acceptable or unacceptable; a sketch of this consistency check follows this list.
  • Do not make changes to the testing environment between tests.
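Following on from the repeated-runs tip above, a simple consistency check is to compare a headline metric across identical runs before accepting the result. The per-run values below are illustrative placeholders.

    # Illustrative: check whether a headline metric is stable across repeated runs.
    import statistics

    p95_per_run_s = [0.48, 0.51, 0.47, 0.95, 0.49]  # p95 latency from five identical runs (placeholder values)

    mean = statistics.mean(p95_per_run_s)
    spread = statistics.stdev(p95_per_run_s)
    print(f"mean p95: {mean:.2f} s, standard deviation: {spread:.2f} s")
    if spread / mean > 0.10:
        print("results vary by more than 10%; investigate before accepting them")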

What is the difference between performance testing and performance engineering?

Performance testing and performance engineering are two closely related yet distinct terms. Performance testing is a subset of performance engineering and is primarily concerned with gauging the current performance of an application under certain loads.

To meet the demands of rapid application delivery, modern software teams need a more evolved approach that goes beyond traditional performance testing and includes end-to-end, integrated performance engineering. Performance engineering is the testing and tuning of software to attain a defined performance goal. It occurs much earlier in the software development process and seeks to proactively prevent performance problems from the outset.


What are performance testing tools, and how can OpenText help?

Since performance testing seeks to establish how well a system runs when subjected to different workloads, it’s difficult to execute such tests efficiently without automated testing tools. Testing tools vary in their capability, scope, sophistication, and degree of automation. Find out how OpenText Testing Solutions can take the effectiveness of your performance testing to the next level.

Related products

OpenText™ Professional Performance Engineering

Deliver a flawless customer experience with project-based testing

OpenText™ Enterprise Performance Engineering

Foster collaboration and improve application reliability with testing

OpenText™ Core Performance Engineering

Ensure application quality with scalable, cloud-based testing

OpenText™ Service Virtualization

Speed up software testing with realistic API and service simulations
