HOWTO get ns-3 to detect steady-state times in your data

From Nsnam
Revision as of 22:34, 20 June 2011 by Watrous (Talk | contribs) (Created page with "= SAFE Framework = == Steady-state detectors == The architecture of SAFE provides for the use of mechanisms to determine when metrics estimated through simulation have reached ...")


SAFE Framework

Steady-state detectors

The architecture of SAFE provides for the use of mechanisms to determine when metrics estimated through simulation have reached steady state. Although well known, the initialization bias problem has not been addressed satisfactorily in the domain of network simulation: few published studies have taken steps to exclude samples generated during model transients from their statistical analysis. Among the researchers who have identified the relevance of initialization bias in network simulation, Perrone, Yuan, and Nicol [2003] (http://redmine.eg.bucknell.edu/safe/attachments/16/perrone2003.pdf) report the significance of excluding transient samples from the computation of statistical estimators for metrics of interest.

SAFE enables one to hook up a source of samples to an analysis module, which determines on its own whether the sample data is still in the transient phase or has reached steady state.
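The hookup described above can be sketched as follows. Every class, method, and parameter name here is a hypothetical illustration of the idea, not actual SAFE or ns-3 API; the steadiness test (comparing the means of two consecutive sliding windows) is likewise a placeholder for whichever algorithm the literature search selects.

```python
# Hypothetical sketch of a SAFE-style detector: a sample source pushes
# values in, and the analysis module asks whether the data still looks
# transient. Not actual SAFE API.

class SteadyStateDetector:
    """Consumes samples one at a time and reports whether the observed
    process still appears to be in its transient phase."""

    def __init__(self, window=50, tolerance=0.05):
        self.samples = []
        self.window = window        # size of each moving window compared
        self.tolerance = tolerance  # relative change considered "steady"

    def add_sample(self, value):
        self.samples.append(value)

    def in_transient(self):
        # Not enough data yet to compare two full windows.
        if len(self.samples) < 2 * self.window:
            return True
        recent = self.samples[-self.window:]
        earlier = self.samples[-2 * self.window:-self.window]
        m_recent = sum(recent) / self.window
        m_earlier = sum(earlier) / self.window
        if m_earlier == 0:
            return m_recent != 0
        # Declare steady state once consecutive window means agree
        # to within the relative tolerance.
        return abs(m_recent - m_earlier) / abs(m_earlier) > self.tolerance
```

In use, a simulation would call `add_sample()` for each observation (e.g. a per-packet delay) and poll `in_transient()` to decide when to start accumulating statistics.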

The development of steady-state detectors for SAFE draws on a broad range of publications and will culminate in the identification of a sound data analysis methodology. SAFE will use the information provided by steady-state detectors to delete transient samples from the collected data, thereby avoiding initialization bias.
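The data-deletion step itself is simple once a detector supplies a truncation point: samples before that point are discarded so only steady-state data enters the estimators. The helper below is an illustrative sketch, not part of SAFE.

```python
# Hypothetical helper: remove the warm-up prefix identified by a
# steady-state detector before computing statistics.

def delete_transient(samples, truncation_point):
    """Return the samples with the warm-up prefix removed."""
    if truncation_point >= len(samples):
        raise ValueError("truncation point leaves no samples")
    return samples[truncation_point:]

# Example: a ramp-up transient followed by a steady phase. The mean over
# all samples is biased low; the truncated mean is not.
data = [0.1, 0.4, 0.7, 0.9, 1.0, 1.0, 1.0, 1.0]
steady = delete_transient(data, 4)
```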

Current Stage of Development

This work is at an investigation stage. We are currently going through the literature to identify algorithms that will be suitable for use in SAFE steady-state detectors.

Projected Milestones

  • Literature search and evaluation: June 8-14, 2011
  • First implementation: June 15-30, 2011
  • Evaluation of implementation against synthetic data generator: July 1-20, 2011

Rejected Algorithms

  • Robinson, Stewart (2002). A Statistical Process Control Approach for Estimating the Warm-Up Period. Proceedings of the 2002 Winter Simulation Conference.
  • Summary: Two-stage method:
      • Stage 1: uses multiple replications.
      • Stage 2: uses a single seed.
    • Advantages: Works with a single seed value in Stage 2.
    • Disadvantages: Requires visual inspection of final results and initial preliminary runs from Stage 1.
    • Note: Has equations for batch mean method, which works for single seeds.
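The batch-means idea referenced in the notes above can be sketched generically: a single long run is split into contiguous batches, and the batch means are treated as approximately independent observations for variance estimation. This is an illustrative textbook formulation, not Robinson's exact equations; function names and the caller-supplied t quantile are assumptions.

```python
# Generic batch-means sketch for single-seed output analysis.
import math

def batch_means(samples, num_batches):
    """Split samples into num_batches equal contiguous batches and
    return the per-batch means (leftover samples are dropped)."""
    batch_size = len(samples) // num_batches
    if batch_size == 0:
        raise ValueError("not enough samples for the requested batches")
    return [
        sum(samples[i * batch_size:(i + 1) * batch_size]) / batch_size
        for i in range(num_batches)
    ]

def batch_means_ci_halfwidth(samples, num_batches, t_value):
    """Half-width of a confidence interval for the mean built on the
    batch means; t_value is the Student-t quantile supplied by the
    caller for num_batches - 1 degrees of freedom."""
    means = batch_means(samples, num_batches)
    grand = sum(means) / num_batches
    var = sum((m - grand) ** 2 for m in means) / (num_batches - 1)
    return t_value * math.sqrt(var / num_batches)
```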

Algorithms to Try

  • Robinson, Stewart (2005). Automated Analysis of Simulation Output Data. Proceedings of the 2005 Winter Simulation Conference.
  • Summary: Paper describes an Excel-based, semi-automated tool that calls SIMUL8.
    • Advantages: Automated methods. Describes a batch method that works with single seed values.
    • Disadvantages: Based on Excel and SIMUL8, which means it cannot be called from ns-3.
    • Note: Has equations for an automated version of Welch's visual method.
    • Note: Describes how to automate the MSER-5 (White et al. [2000]) method.
      • White, K.P., M.J. Cobb and S.C. Spratt. 2000. A Comparison of Five Steady-State Truncation Heuristics for Simulation. Proceedings of the 2000 Winter Simulation Conference (J.A. Joines, R.R. Barton, K. Kang and P.A. Fishwick, eds.). IEEE, Piscataway, NJ, 755-760.
    • Note: Describes how to automate the batch mean method.
    • Note: See Robinson (2002) in rejected algorithms for equations for batch mean method.
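For reference, the MSER-5 heuristic mentioned above can be sketched as follows: raw observations are grouped into non-overlapping batches of 5, and the truncation point is the batch index that minimizes the squared standard error of the truncated mean, with the search conventionally limited to the first half of the run so at least half the data is retained. The function name and structure are illustrative, not SAFE API.

```python
# Sketch of the MSER-5 truncation heuristic (White et al. 2000).

def mser5_truncation(samples):
    """Return the truncation point (in raw-sample units) chosen by
    the MSER-5 heuristic."""
    # Batch the raw samples into non-overlapping batches of 5.
    n_batches = len(samples) // 5
    means = [sum(samples[5 * j:5 * j + 5]) / 5 for j in range(n_batches)]

    best_d, best_stat = 0, float("inf")
    # Restrict the search to the first half of the batches so that at
    # least half of the run is retained.
    for d in range(n_batches // 2):
        kept = means[d:]
        m = len(kept)
        avg = sum(kept) / m
        # MSER statistic: squared standard error of the truncated mean.
        stat = sum((y - avg) ** 2 for y in kept) / (m * m)
        if stat < best_stat:
            best_d, best_stat = d, stat
    return 5 * best_d  # convert batch index back to raw samples
```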