Definition:
In software engineering, performance testing is the process of measuring a system's (application, network, database, or device) speed (responsiveness), stability, and scalability under a particular workload.
Core Performance Testing Activities:
1. Identify the Test Environment
2. Identify Performance Acceptance Criteria
3. Plan and Design Tests
4. Configure the Test Environment
5. Implement the Test Design
6. Execute the Test
7. Analyze Results, Report, and Retest
1. Identify the Test Environment:
Identify the physical test
environment. The physical environment includes hardware, software, and network
configurations. Having a thorough understanding of the entire test environment
at the outset enables more efficient test design and planning and helps you
identify testing challenges early in the project.
2. Identify Performance Acceptance Criteria:
Identify the response time, throughput, and resource utilization goals and constraints. In general:
· Response time is a user concern.
· Throughput is a business concern.
· Resource utilization is a system concern.
Additionally, identify project success
criteria that may not be captured by those goals and constraints; for example,
using performance tests to evaluate what combination of configuration settings
will result in the most desirable performance characteristics.
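Acceptance criteria of this kind can be made machine-checkable. Below is a minimal sketch; the metric names and threshold values are hypothetical, chosen only to illustrate the response time / throughput / utilization split described above.

```python
# Hypothetical acceptance criteria, one per concern described above.
acceptance_criteria = {
    "response_time_p95_ms": 500,   # user concern: 95th-percentile response time
    "throughput_rps": 100,         # business concern: requests per second
    "cpu_utilization_pct": 80,     # system concern: peak CPU utilization
}

def evaluate(measured: dict, criteria: dict) -> list:
    """Return (metric, measured, limit) tuples for every violated criterion."""
    violations = []
    for metric, limit in criteria.items():
        value = measured.get(metric)
        if value is None:
            continue
        if metric == "throughput_rps":
            # Throughput is a floor: the system must achieve at least this rate.
            if value < limit:
                violations.append((metric, value, limit))
        else:
            # Response time and utilization are ceilings: stay at or below.
            if value > limit:
                violations.append((metric, value, limit))
    return violations

measured = {"response_time_p95_ms": 420, "throughput_rps": 85, "cpu_utilization_pct": 72}
print(evaluate(measured, acceptance_criteria))
```

Here the run fails only on throughput, so the report would flag that one metric for retesting.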
3. Plan and Design Tests:
Identify key scenarios, determine variability
among representative users and how to simulate that variability, define test
data, and establish metrics to be collected. Consolidate this information into
one or more models of system usage to be implemented, executed, and
analyzed.
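One way to capture such a usage model is as a weighted scenario mix. The sketch below assumes a hypothetical three-scenario mix; the scenario names and weights are illustrative, not from any real system.

```python
import random

# Hypothetical usage model: what fraction of simulated users runs each
# key scenario. The weights should come from production traffic analysis.
scenario_mix = {
    "browse_catalog": 0.60,
    "search": 0.25,
    "checkout": 0.15,
}

def sample_scenarios(n_users: int, seed: int = 42) -> list:
    """Assign one scenario per simulated user according to the usage model.
    A fixed seed keeps the generated workload reproducible across runs."""
    rng = random.Random(seed)
    names = list(scenario_mix)
    weights = list(scenario_mix.values())
    return rng.choices(names, weights=weights, k=n_users)

sessions = sample_scenarios(1000)
print({name: sessions.count(name) for name in scenario_mix})
```

Each simulated user would then execute the script for its assigned scenario, so the load reflects the modeled variability rather than a single uniform path.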
4. Configure the Test Environment:
Prepare the test environment, tools, and
resources necessary to execute each strategy as features and components become
available for test. Ensure that the test environment is instrumented for
resource monitoring as necessary.
5. Implement the Test Design:
Develop the performance tests in accordance
with the test design.
6. Execute the Test:
Run and monitor your
tests. Validate the tests, test data, and results collection. Execute validated
tests for analysis while monitoring the test and the test environment.
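In tool terms, execution means driving concurrent virtual users against the system while recording timings. The sketch below shows the bare mechanics with a stand-in transaction; in a real test the placeholder would be an actual request to the system under test.

```python
import time
import statistics
from concurrent.futures import ThreadPoolExecutor

def transaction() -> float:
    """Stand-in for one request to the system under test; returns its latency."""
    start = time.perf_counter()
    time.sleep(0.01)  # placeholder for real work (e.g. an HTTP request)
    return time.perf_counter() - start

def run_load(concurrent_users: int, iterations: int) -> list:
    """Run `iterations` transactions across `concurrent_users` workers
    and return the observed latencies in seconds."""
    with ThreadPoolExecutor(max_workers=concurrent_users) as pool:
        futures = [pool.submit(transaction) for _ in range(iterations)]
        return [f.result() for f in futures]

latencies = run_load(concurrent_users=5, iterations=50)
print(f"samples={len(latencies)} mean={statistics.mean(latencies) * 1000:.1f}ms")
```

Tools such as JMeter or LoadRunner do the same job at scale, adding ramp-up control, distributed load generation, and environment monitoring.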
7. Analyze Results, Report, and Retest:
Consolidate and share results data. Analyze
the data both individually and as a cross-functional team. Re-prioritize the
remaining tests and re-execute them as needed. When all of the metric values
are within accepted limits, none of the set thresholds have been violated, and
all of the desired information has been collected, you have finished testing
that particular scenario on that particular configuration.
A few examples of why performance testing is conducted:
· Validating that the application performs properly
· Validating that the application conforms to the performance needs of the business
· Finding, analysing, and helping fix performance problems
· Validating that the hardware for the application is adequate
· Doing capacity planning for future demand of the application
There are many different performance testing tools, and every vendor claims theirs is the best. Here are my picks.
Commercial Performance Testing Tools:
HP Performance Center
HP LoadRunner
Neotys NeoLoad
Open Source Performance Testing Tools:
Apache JMeter