Testing Terminology
What we mean when we say...
Quality - Value to some person. - Jerry Weinberg
Bug - Anything that threatens the value of the product. Something that bugs someone whose opinion matters. - James Bach
Software Testing - An empirical, technical investigation conducted to provide stakeholders with information about the quality of the product or service under test. - Cem Kaner
Exploratory Testing:
An interactive process of simultaneous learning, test design, and test execution. - James Bach
A style of software testing that emphasizes the personal freedom and responsibility of the individual tester to continually optimize the value of his or her work by treating test-related learning, test design, test execution and test results interpretation as mutually supportive activities that run in parallel throughout the project. - Cem Kaner
Heuristic Testing - An approach to test design that employs heuristics to enable rapid development of test cases. - James Bach
Risk-Based Testing - Any testing organized to explore specific product risks. - James Bach
Test Design - The process of creating tests. - James Bach
Test Execution - The process of configuring, operating, and observing a product for the purpose of evaluating it. - James Bach
Test Logistics - The set of ideas that guide how resources are applied to fulfill the test strategy. - James Bach
Test Strategy - The set of ideas that guide test design. - James Bach
Software Performance Testing - An empirical technical investigation conducted to provide stakeholders with information about the quality of the product or service under test with regard to speed, scalability and/or stability characteristics. - Scott Barber
Software Performance Investigation - A deliberate data-collection and data-interpretation activity typically focused on data related to speed, scalability, and/or stability of the product under test. The collected data are primarily used to assess hypotheses about the root cause of one or more observed performance issues. - Scott Barber
Software Performance Validation - A deliberate activity that compares speed, scalability and/or stability characteristics of the product under test to the expectations of representative users of the product. - Scott Barber
Software Performance Requirements - Performance-related characteristics of the product under test that must be met in order for the product to be released. Performance requirements are mandated via legal contract or service level agreement. - Scott Barber
Software Performance Goals - Performance-related characteristics of the product under test that are desired to be met prior to product release, but which are not strictly mandatory. - Scott Barber
Performance Testing Objective - Information to be collected through the process of performance testing that is anticipated to have value in determining or improving the quality of the product, but is not necessarily quantitative or directly related to a performance requirement, goal or stated Quality of Service. - Scott Barber
User Community Model - A model that enhances the application usage profile(s) by adding the distribution of activities, hourly usage volumes and other variables necessary to design realistic performance tests. - Scott Barber
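To make the idea concrete, a user community model can be thought of as an activity mix layered on top of hourly usage volumes. The following Python sketch is purely illustrative; the activity names, percentages and session counts are invented examples, not part of any formal notation.

```python
# Illustrative sketch of a user community model: an activity mix
# (fractions of sessions) combined with hourly usage volumes.
# All names and numbers below are hypothetical.

usage_profile = {            # fraction of sessions performing each activity
    "browse_catalog": 0.60,
    "search": 0.25,
    "checkout": 0.15,
}

hourly_sessions = {9: 400, 12: 1000, 15: 700, 20: 250}  # sessions per hour

def expected_activity_volume(hour):
    """Expected number of sessions performing each activity in a given hour."""
    total = hourly_sessions.get(hour, 0)
    return {activity: round(total * share)
            for activity, share in usage_profile.items()}

print(expected_activity_volume(12))
# {'browse_catalog': 600, 'search': 250, 'checkout': 150}
```

A performance test designed from such a model would then drive, for the noon hour, roughly 600 browsing sessions, 250 searches and 150 checkouts rather than a single uniform workload.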
What we hear when other people say...
Load Test - A performance test focused on determining or validating performance characteristics of the product under test when subjected to workload models and load volumes anticipated during production operations. - Scott Barber
Stress Test - A performance test focused on determining or validating performance characteristics of the product under test when subjected to workload models and load volumes beyond those anticipated during production operations. Stress tests may also subject the product under test to other stressful conditions, such as limited memory, insufficient disk space or server failure. - Scott Barber
Spike Test - A performance test focused on determining or validating performance characteristics of the product under test when subjected to workload models and load volumes that repeatedly increase beyond anticipated production operations for short periods of time. Spike testing is a subset of stress testing. - Scott Barber
Endurance Test - A performance test focused on determining or validating performance characteristics of the product under test when subjected to workload models and load volumes anticipated during production operations over an extended period of time. Endurance testing is a subset of load testing. - Scott Barber
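The four test types above differ mainly in the shape and duration of the workload relative to the anticipated production peak. The sketch below is a hypothetical illustration of that distinction; the user counts and timings are invented for the example.

```python
# Hypothetical load shapes distinguishing load, stress, spike and
# endurance tests. All numbers are invented for illustration.

ANTICIPATED_PEAK = 500  # virtual users expected during production operations

def target_users(test_type, minute):
    """Target concurrent virtual users at a given minute of the test."""
    if test_type == "load":        # ramp up to, but not past, the peak
        return min(ANTICIPATED_PEAK, 50 * (1 + minute // 10))
    if test_type == "stress":      # keep ramping beyond the anticipated peak
        return ANTICIPATED_PEAK + 50 * (minute // 10)
    if test_type == "spike":       # brief bursts far above the peak
        in_spike = (minute % 30) < 5
        return ANTICIPATED_PEAK * 3 if in_spike else ANTICIPATED_PEAK // 2
    if test_type == "endurance":   # production-level load held for hours
        return ANTICIPATED_PEAK
    raise ValueError(f"unknown test type: {test_type}")
```

For example, the spike profile repeatedly jumps to three times the anticipated peak for five minutes out of every thirty, while the endurance profile simply holds the anticipated load for an extended period.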
Application Speed - Characteristics of the product under test related to the overall speed of response of the product, or of one of its sub-systems, to a user-initiated activity. - Scott Barber
Application Scalability - Characteristics of the product under test related to the number of users the product can support. These characteristics or Qualities of Service may be related to user load, network or data capacity, and/or product failure modes related to the product's inability to scale beyond a particular level. - Scott Barber
Application Stability - Characteristics of the product under test related to the product's overall reliability, robustness, functional and data integrity, availability and/or consistency of responsiveness under a variety of expected and unexpected conditions. - Scott Barber
Application Usage Profile - One or more descriptions of how the product under test is, or is anticipated to be, used during production operations. Usage profiles are typically expressed in terms of business activities and usage scenarios. - Scott Barber
Standard Testing Terms?
None of the software testing glossaries that have been proposed as an industry standard have achieved widespread acceptance by testers or development organizations.
At PerfTestPlus, we have chosen to use the terms listed on the top half of this page based on their general usefulness.
The terms listed on the bottom half of this page are not terms that PerfTestPlus uses, but rather terms that are so commonly used (and misused) that we find it valuable to publicize what we hear when others use them.
Each definition is credited to the individual who either created or popularized it.