URLs for Industry Standard Benchmarks
Audience: All
Date: February 4, 2003
Here are several URLs containing industry standard benchmarks.
ISV Benchmarks
Oracle Applications 11.5.3: number of users, response time (http://www.oracle.com/apps_benchmark/html/index.html?results.html)
SAP S&D 3 Tier, steps per hour (http://www.sap.com/benchmark)
SAP ATO 3 Tier, throughput per hour
PeopleSoft 8 GL: lines per hour (http://www.peoplesoft.com)
PeopleSoft 8 Financials: number of users
CPU Intensive
SPEC: http://www.spec.org
LINPACK: http://www.netlib.no/netlib/benchmark/performance.ps
Data Warehouse
TPC-H: http://www.tpc.org
Transaction Database
TPC-C: transactions per minute (http://www.tpc.org)
Java
SPECjbb2000 (Java Business Benchmark): operations per second (http://www.spec.org)
Graphics
Pro/E: http://www.proe.com
Web
SPECweb: http://www.spec.org
Current and Historic pSeries & RS/6000
Miscellaneous: http://www.ibm.com/servers/eserver/pseries/hardware/system_perf.pdf
Relative Hardware Performance by Vendor
IDEAS International: http://www.ideasinternational.com/benchmark/bench.html
I've found that you have to be careful when using benchmarks. Here are a few
guidelines that I use:
- The best benchmark is running your own application on the same server
model you intend to use in production.
- If you can't run your own benchmark, select a published benchmark
that reflects your workload. Don't assume that a fast CPU means fast I/O.
- Benchmarks are better used for relative comparisons, not absolute sizings
(see the first sketch after this list). There are several reasons for this.
First, benchmarks are so highly tuned that you will probably never achieve
the benchmark results in production. Second, the motivation for many ISV
benchmarks is to show the maximum number of users. As a consequence, ISV
benchmark workloads tend to be "light" relative to production workloads.
Third, every company customizes its software differently. I've seen the same
application on the same server with the same number of users behave totally
differently at different companies due to customization.
- For sizings, use ISV recommendations. Use references when possible.
- Finally, in many cases it may be cheaper to just oversize the server than
to run a benchmark. The cost of doing a benchmark may exceed the cost of the
hardware; for example, the last benchmark I ran cost $200,000. Also, from a
business perspective, if the application you intend to run saves $1M per
month, it's almost a certainty that oversizing is cheaper than delaying the
rollout to run a benchmark (the second sketch below walks through the
arithmetic). You can recover excess capacity later with either WLM or
partitioning.
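
To make the relative-comparison point concrete, here is a minimal Python
sketch. Every number in it is hypothetical, including the 60% derating
factor; substitute your own published results and your own experience.

  # Rough sizing from published benchmark results.
  # All numbers are hypothetical and for illustration only.
  current_tpm = 50_000      # published tpmC of the server you run today (hypothetical)
  candidate_tpm = 120_000   # published tpmC of the candidate server (hypothetical)
  derate = 0.60             # assume production achieves ~60% of a tuned benchmark

  # Relative comparison: fairly safe, since heavy tuning inflates both numbers.
  print(f"Candidate is roughly {candidate_tpm / current_tpm:.1f}x the current server")

  # Absolute sizing: much shakier; at a minimum, derate the published figure.
  print(f"Plan on about {candidate_tpm * derate:,.0f} usable tpm, not {candidate_tpm:,.0f}")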
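
And for the oversize-versus-benchmark tradeoff, a back-of-the-envelope
comparison, again in Python. Only the $200,000 benchmark cost comes from my
example above; the delay, savings, and hardware premium are made-up
placeholders you would replace with your own figures.

  # Oversize vs. benchmark: a simple break-even check.
  benchmark_cost = 200_000       # what the benchmark itself costs (from the example above)
  delay_months = 2               # hypothetical delay while the benchmark is run
  monthly_savings = 1_000_000    # what the application saves per month once live
  oversize_premium = 150_000     # hypothetical extra cost of a larger server

  benchmark_path = benchmark_cost + delay_months * monthly_savings
  print(f"Delay rollout and benchmark: ${benchmark_path:,}")
  print(f"Oversize and roll out now:   ${oversize_premium:,}")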
Bruce Spencer,
baspence@us.ibm.com