Examples of experiment design for analysis, simulation, and measurement

[Japanese]

This page contains several experiment design examples for analysis, simulation, and measurement, following the guidelines given in [Jain91]. For any performance analysis, I strongly recommend carefully reading [Jain91]. The following guidelines are excerpted from [Jain91].

Common Mistakes in Performance Evaluation

1. No Goals
2. Biased Goals
3. Unsystematic Approach
4. Analysis without Understanding the Problem
5. Incorrect Performance Metrics
6. Unrepresentative Workload
7. Wrong Evaluation Technique
8. Overlooking Important Parameters
9. Ignoring Significant Factors
10. Inappropriate Experimental Design
11. Inappropriate Level of Detail
12. No Analysis
13. Erroneous Analysis
14. No Sensitivity Analysis
15. Ignoring Errors
16. Improper Treatment of Outliers
17. Assuming No Change in the Future
18. Ignoring Variability
19. Too Complex Analysis
20. Improper Presentation of Results
21. Ignoring Social Aspects
22. Omitting Assumptions

A Systematic Approach to Performance Evaluation

1. State Goals and Define the System
2. List Services and Outcomes
3. Select Metrics
4. List Parameters
5. Select Factors to Study
6. Select Evaluation Technique
7. Select Workload
8. Design Experiments
9. Analyze and Interpret Data
10. Present Results
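Step 8, "Design Experiments", often amounts to choosing factor levels and enumerating the runs to perform (e.g., a full factorial design, as discussed in [Jain91]). The sketch below illustrates this with a small Python helper; the factor names and levels are hypothetical examples, not taken from any particular study on this page.

```python
from itertools import product

def full_factorial(factors):
    """Enumerate every combination of factor levels (a full factorial design)."""
    names = list(factors)
    return [dict(zip(names, levels)) for levels in product(*factors.values())]

# Hypothetical factors for a network simulation study (illustrative only).
factors = {
    "packet_size": [64, 512, 1500],   # bytes
    "link_delay": [10, 100],          # ms
    "queue_policy": ["FIFO", "RED"],
}

runs = full_factorial(factors)
print(len(runs))  # 3 levels x 2 levels x 2 levels = 12 experiment runs
```

When a full factorial design requires too many runs, [Jain91] also covers fractional factorial and 2^k designs that study the most important effects with fewer experiments.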

Experiment Design Examples

Following the above guidelines, "A Systematic Approach to Performance Evaluation", I have designed several REAL experiments in my research. Performance analysis results based on these experiment designs will soon be available on my publication page.

References

[Jain91] R. Jain, The Art of Computer Systems Performance Analysis: Techniques for Experimental Design, Measurement, Simulation, and Modeling, Wiley-Interscience, 1991.
Hiroyuki Ohsaki (ohsaki[atmark]lsnl.jp)