Articles, Columns and Papers
Performance & Software Testing Books:
Performance Testing Guidance for Web Applications, Microsoft patterns & practices -- by: J.D. Meier, Scott Barber, Carlos Farre, Prashant Bansode, and Dennis Rea
(Also available as a free .pdf download.)
Web Load Testing for Dummies, Compuware special edition -- by: Scott Barber and Colin Mason
Beautiful Testing, O'Reilly Media -- Edited by: Tim Riley, Adam Goucher
Chapter 4: Collaboration Is the Cornerstone of Beautiful Performance Testing by: Scott Barber
How to Reduce the Cost of Software Testing, CRC Press -- Edited by: Matthew Heusser, Govind Kulkarni
Chapter 16: Rightsizing the Cost of Testing: Tips for Executives by: Scott Barber
Improving .NET Application Performance and Scalability, Microsoft patterns & practices -- Foreword and select content contributed by: Scott Barber
Recent Articles and Papers:
Rightsizing the Cost of Testing: Tips for Executives; Chapter 16, written for How to Reduce the Cost of Software Testing; CRC Press, 2011.
Right Click -> View Source and other Tips for Performance Testing the Front End; Tips for Performance Testers based on High Performance Web Sites: Essential Knowledge for Front-End Engineers by Steve Souders, O’Reilly, 2007
High Performance Testing; written for LogiGear's newsletter in 2007, this article first appeared in Better Software, May/June 2005.
NASA's Anomaly: A Lesson for Software Testing; written for LogiGear's newsletter in 2007, this is an adaptation of an article Scott wrote for Software Test & Performance in September, 2005
An Explanation of Performance Testing on an Agile Team (Part 1 of 2); written for LogiGear's newsletter in 2007, this is an adaptation of some of Scott's contributions to Microsoft's patterns & practices: Performance Testing Guidance
An Explanation of Performance Testing on an Agile Team (Part 2 of 2); written for LogiGear's newsletter in 2007, this is an adaptation of some of Scott's contributions to Microsoft's patterns & practices: Performance Testing Guidance
Performance Testing Plus: Do the Math!; written for LogiGear's newsletter in 2007, this is an adaptation of an article Scott wrote for Software Test & Performance in October, 2006
Investigation vs. Validation; written for LogiGear's newsletter in 2007, this is an adaptation of an article Scott wrote for Software Test & Performance in September, 2005
Introducing the Captain of your Special Teams... The Performance Test Lead; written in support of the EuroSTAR 2006 Keynote of the same title
SOA Driven Testing?; written in support of a webinar by SQE, May 2006
How Fast Does a Website Need To Be?; Ongoing Research
User Community Modeling Language (UCML™) v1.1; Visual Modeling Technique (Visio Template | SmartDraw Template)
Creating Effective Load Models for Performance Testing with Incomplete Empirical Data; Sixth IEEE International Workshop on Web Site Evolution (WSE'04)
SearchSoftwareQuality.com:
Peak Performance monthly columns:
Use "SCORN" to test the front end of a website for performance
Don't mistake user acceptance testing for acceptance testing
"Ask the Expert" Answers:
User acceptance testing that satisfies users and requirements
Software testing tools: How to interpret results from OpenSTA
What to do when the test environment doesn't match production
Better Software Magazine:
Feature Articles:
Hurry Up and Wait: When Industry Standards Don't Apply; Better Software, June 2007
Tester PI: Performance Investigator; Better Software, March 2006
High Performance Testing; Better Software, May/June 2005
Software Test & Performance Magazine:
Peak Performance monthly columns:
Feature Articles:
How to Identify the Usual Performance Suspects; Software Test & Performance, May 2005
Diagnosing Symptoms and Solving Problems; Software Test & Performance, July 2005
Commissioned Papers:
Get performance requirements right - think like a user; commissioned by and co-branded with Compuware, January 2007
User Experience, not Metrics Series
This is Scott's first series of articles, which opens by asking: How many times have you surfed to a web site to accomplish a task, only to give up and go to a different web site because the home page took too long to download? "46% of consumers will leave a preferred site if they experience technical or performance problems." (Juniper Communications) In other words, "If your web site is slow, your customers will go!" This is a simple concept that all Internet users are familiar with. When this happens, is your first thought, "Gee, I wonder what the throughput of the web server is?" Certainly not. Instead, you think, "Man, this is SLOW! I don't have time for this. I'll just find it somewhere else." Now consider this: what if it were YOUR web site that people were leaving because of performance?
Face it, users don't care what your throughput, bandwidth, or hits-per-second metrics prove or don't prove; they want a positive user experience. There are a variety of books on the market that discuss how to engineer maximum performance, and even more that focus on making a web site intuitive, graphically pleasing, and easy to navigate. The benefits of speed are discussed, but how does one truly predict and tune an application for an optimized user experience? One must test the user experience firsthand! There are two ways to accomplish this. One could release a web site straight into production, collect data, and tune the system there, with the great hope that the site doesn't crash or isn't painfully slow. The wiser choice, however, is to simulate actual multi-user activity, tune the application, and repeat until the system is tuned, before placing the site into production. It sounds like a simple choice, but how does one simulate actual multi-user activity accurately? That is the question this series of articles attempts to answer.
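As a rough illustration of that idea (and not the method or tooling from the series itself), the short Python sketch below runs a handful of concurrent "virtual users" against a placeholder URL and reports basic response-time statistics. The target URL, user count, request count, and think time are all assumptions chosen for the example.

# Rough sketch only: a few concurrent "virtual users" hitting a placeholder URL,
# to make the multi-user simulation idea concrete. TARGET_URL, VIRTUAL_USERS,
# REQUESTS_PER_USER, and the think time are illustrative assumptions.
import statistics
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

TARGET_URL = "http://example.com/"   # placeholder site under test
VIRTUAL_USERS = 10                   # concurrent simulated users
REQUESTS_PER_USER = 5

def simulate_user(user_id):
    """One simulated user: request the page several times, recording each response time."""
    timings = []
    for _ in range(REQUESTS_PER_USER):
        start = time.perf_counter()
        with urllib.request.urlopen(TARGET_URL) as response:
            response.read()
        timings.append(time.perf_counter() - start)
        time.sleep(1.0)  # crude fixed "think time" between a user's requests
    return timings

if __name__ == "__main__":
    with ThreadPoolExecutor(max_workers=VIRTUAL_USERS) as pool:
        per_user = list(pool.map(simulate_user, range(VIRTUAL_USERS)))
    all_timings = sorted(t for user in per_user for t in user)
    print(f"requests completed:   {len(all_timings)}")
    print(f"median response time: {statistics.median(all_timings):.3f}s")
    print(f"95th percentile:      {statistics.quantiles(all_timings, n=20)[18]:.3f}s")

A realistic load simulation would go further and model the mix of user activities, ramp-up, and think-time distributions; that kind of workload modeling is what the UCML technique listed above is intended to capture.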
Beyond Performance Testing Series
This is a companion series to the User Experience, not Metrics series. It addresses what happens after those initial results are collected, the part that takes a human brain to accomplish. We will explore what the results mean and what can be done to improve them, taking the next step beyond simply testing to identify specific, fixable issues: the issues that undermine end-user experience, scalability, and confidence in our software applications.
Performance Testing and Analysis is the discipline dedicated to optimizing the most important application performance trait: user experience. In this series of articles, we will explore the performance engineering activities that lie beyond performance testing. We will examine the process by which software is iteratively tested, using Rational Suite TestStudio, and tuned with the intent of achieving desired performance, following an industry-leading performance engineering methodology that complements the Rational Unified Process. This first article introduces the high-level concepts used throughout the series and gives an overview of the articles that follow.
Part 8: Modifying Tests to Focus on Failure or Bottleneck Resolution
Part 9: Pinpointing the Architectural Tier of the Failure or Bottleneck
Part 10: Creating a Test to Exploit the Failure or Bottleneck
Technical Articles Written for IBM-Rational
Automated Testing for Embedded Devices: As the number of new applications being developed for wireless/embedded devices such as PDAs, pagers, and cell phones increases, so does the demand for tools to automate the testing process on these new platforms. Through several recent consulting engagements, we have had the opportunity to pioneer the use of Rational TestStudio to automate functional (or GUI) and performance testing of new applications developed to run on a variety of embedded devices. This automation is made possible by the use of emulators - the same emulators used by developers of applications for embedded devices. In this article, we're going to show you how to use Rational TestStudio to record and play back test scripts against emulators.
Now Available
Web Load Testing for Dummies -- Lead Author: Scott Barber
Performance Testing Guidance for Web Applications
Lead Contributing Author: Scott Barber
Refining by Writing
Being able to do a thing is a far cry from being able to explain it in writing. Explaining a task in writing requires that the author have a deep understanding of every step of the process, and it often leads the author to significantly enhance their own understanding of the task. This is one reason why PerfTestPlus encourages its employees to publish their best ideas.