GSoC2018: A Simulation Execution Manager for ns-3

Latest revision as of 08:55, 3 August 2018

Return to GSoC 2018 Accepted Projects page.

Project overview

  • Project name: A Simulation Execution Manager for ns-3
  • Abstract: This GSoC project will develop a Python library to automate the execution of ns-3 simulation scripts and the management of their results.
  • Proposal: PDF (https://github.com/DvdMgr/ns-3-execution-manager/raw/master/docs/GSoC%20proposal.pdf)
  • Code: GitHub repository (https://github.com/DvdMgr/sem)
  • Documentation: ReadTheDocs (https://simulationexecutionmanager.readthedocs.io)
  • About me: I am a first-year PhD student at the University of Padova, Italy, working under the supervision of Prof. Michele Zorzi. My research interests include large-scale IoT network simulation and medium access layer protocol design.

Project Timeline

Week 1 (May 14 - May 19)

Week 1 was used to set up the project and to define the code structure needed to reach Milestone 1. For now, the code is maintained in src/stats/utils.

Summary:

  • Created a basic Python project using pipenv;
  • Defined an initial set of classes and functions to outline the code structure (code available in the sem folder: https://github.com/DvdMgr/ns-3-dev-gsoc/tree/feature/api-specs/src/stats/utils/sem);
  • A detailed report can be found in the dev.org file in the project repository (https://github.com/DvdMgr/ns-3-dev-gsoc/blob/feature/api-specs/src/stats/utils/dev.org).

Week 2 (May 21 - May 25)

Week 2 was dedicated to a first implementation of the database management structures.

Summary:

  • Moved the project to its own GitHub repository (https://github.com/DvdMgr/sem), outside ns-3-dev;
  • Created a documentation page, available on ReadTheDocs (https://simulationexecutionmanager.readthedocs.io);
  • Implemented database creation and management, and insertion and querying of results;
  • Code is available in the project repository, under the tag gsoc-week2.
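
The database functionality described above can be sketched roughly as follows. This is an illustrative example only, not sem's actual implementation (sem's real backend and schema may differ): each row stores one simulation result, keyed by the JSON-encoded parameter combination and the repetition index.

```python
import json
import sqlite3

# Illustrative sketch only: sem's real database layer may use a different
# backend and schema. Each row stores one simulation result, keyed by the
# JSON-encoded parameter combination and the repetition index.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE results (
    params TEXT,    -- JSON-encoded parameter combination
    run INTEGER,    -- repetition index for this combination
    stdout TEXT     -- captured output of the ns-3 script
)""")

def insert_result(params, run, stdout):
    db.execute("INSERT INTO results VALUES (?, ?, ?)",
               (json.dumps(params, sort_keys=True), run, stdout))

def get_results(params):
    """Return all stored stdouts for a given parameter combination."""
    key = json.dumps(params, sort_keys=True)
    rows = db.execute("SELECT stdout FROM results WHERE params = ?", (key,))
    return [r[0] for r in rows]

insert_result({"nWifi": 3, "distance": 5}, run=0, stdout="Throughput: 12.1")
insert_result({"nWifi": 3, "distance": 5}, run=1, stdout="Throughput: 11.8")
print(get_results({"nWifi": 3, "distance": 5}))
```

Serializing the parameter dictionary with sort_keys=True makes the key independent of insertion order, so the same combination always maps to the same rows.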

Week 3 (May 28 - June 1)

Week 3 saw the implementation of simulation running facilities in SEM.

Summary:

  • Interfaced with the waf libraries to programmatically get information about the required libraries and the script executable: waf is no longer needed to run simulations from the sem library;
  • Created an ns-3 git submodule to facilitate example execution;
  • Stdout and files created by ns-3 scripts are now saved in the database;
  • Added code to perform sequential simulations;
  • Added a simple ParallelRunner class that uses threads to spawn multiple simulations in parallel;
  • Extended the documentation;
  • Added a function that takes the contents of the database into account before running simulations, in order to compute the list of parameter combinations and the repetitions still needed for each.
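
The two running facilities above, expanding parameter ranges into combinations and executing them in parallel on threads, can be sketched as below. This is a hedged illustration, not sem's actual code: the function names and the placeholder run_simulation are made up for the example.

```python
import itertools
from concurrent.futures import ThreadPoolExecutor

# Illustrative sketch, not sem's actual implementation: expand lists of
# parameter values into the full grid of combinations, then run a
# (placeholder) simulation function for each one on a thread pool.
def list_param_combinations(param_ranges):
    keys = list(param_ranges)
    for values in itertools.product(*(param_ranges[k] for k in keys)):
        yield dict(zip(keys, values))

def run_simulation(params):
    # A real runner would launch the ns-3 executable here and capture
    # its stdout; we just return the parameters that were used.
    return params

param_ranges = {"nWifi": [1, 2], "distance": [5, 10]}
combinations = list(list_param_combinations(param_ranges))

# A ParallelRunner-style executor: one worker thread per concurrent job.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(run_simulation, combinations))

print(len(results))  # 4 combinations
```

ThreadPoolExecutor.map preserves the input order, so results line up with the parameter combinations that produced them.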

Week 4 (June 4 - June 8)

Week 4 was dedicated to writing code to allow users to export simulation results.

Summary:

  • Added a check that the ns-3 repository is at the right commit before running simulations;
  • Added numpy and xarray exports;
  • Progress bars were added to indicate the advancement in compilation and simulation running.
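
The numpy export can be illustrated with a small sketch; the metric values, parameter names and array layout here are invented for the example, and sem's actual exporters (numpy and xarray) may structure the data differently.

```python
import numpy as np

# Illustrative sketch: arrange one scalar metric per (parameter value,
# repetition) into a numpy array whose axes follow the parameter space.
distances = [5, 10, 20]
runs = 2

# results[d] holds a fake throughput for each repetition at distance d.
results = {5: [12.1, 11.8], 10: [9.3, 9.6], 20: [5.2, 5.0]}

# Shape: (len(distances), runs); axis 0 follows `distances`, axis 1 the run.
array = np.array([[results[d][r] for r in range(runs)] for d in distances])
print(array.shape)
print(array.mean(axis=1))  # average over repetitions, one value per distance
```

An xarray export adds labels on top of this: the same array with named dimensions and coordinate values, so results can be sliced by parameter value instead of by index.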

Week 5 (June 11 - June 15)

Week 5 was dedicated to writing tests and documentation.

Summary:

  • Set up pytest/ns-3 test facilities;
  • Wrote tests for database, campaign manager and runner;
  • Rewrote the Getting Started section of the documentation to show how to go from a vanilla ns-3 installation to plotting results;
  • Added example plots to the wifi-plotting-xarray.py example.

Week 6 (June 18 - June 22)

Week 6 was mainly focused on adding support for running ns-3 simulations on cluster architectures.

Summary:

  • Added the GridRunner class, which allows users to run simulations on any DRMAA-compatible cluster architecture;
  • Output files produced by simulations can now be easily accessed through dedicated functions.
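
A dedicated accessor for simulation output files could look like the sketch below. The function name and the one-directory-per-result layout are assumptions made for the example; sem's real accessors may expose richer objects than a plain dict.

```python
import os
import tempfile

# Illustrative sketch: each simulation gets its own result directory; a
# dedicated function maps the files it produced to their contents.
def get_result_files(result_dir):
    files = {}
    for name in os.listdir(result_dir):
        path = os.path.join(result_dir, name)
        if os.path.isfile(path):
            with open(path) as f:
                files[name] = f.read()
    return files

# Fake result directory standing in for one produced by an ns-3 run.
result_dir = tempfile.mkdtemp()
with open(os.path.join(result_dir, "stdout"), "w") as f:
    f.write("Throughput: 12.1")
with open(os.path.join(result_dir, "flow-monitor.xml"), "w") as f:
    f.write("<FlowMonitor/>")

files = get_result_files(result_dir)
print(sorted(files))  # ['flow-monitor.xml', 'stdout']
```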

Week 7 (June 26 - June 29)

Week 7 was mainly dedicated to fixing various bugs and to improving the usability of the library.

Summary:

  • Various bug fixes;
  • Simulation campaigns can now be loaded without the need to specify a working ns-3 installation;
  • Added an example showcasing parsing of output files.
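
Parsing output files typically means extracting a scalar metric from a simulation's captured stdout, as in the sketch below. The "Throughput:" output format is made up for the example; a real ns-3 script prints whatever its author chose, and sem's bundled example may parse something different.

```python
import re

# Illustrative result-parsing function: pull one scalar metric out of a
# simulation's captured stdout (the output format here is hypothetical).
def parse_throughput(stdout):
    match = re.search(r"Throughput:\s*([\d.]+)", stdout)
    return float(match.group(1)) if match else None

print(parse_throughput("Simulation done.\nThroughput: 12.5 Mbit/s"))  # 12.5
```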

Week 8 (July 2 - July 6)

Week 8 was focused on test coverage and documentation.

Summary:

  • Various bug fixes;
  • Refactored the result dictionary structure;
  • Added missing docstrings;
  • Added codecov integration to the repository;
  • ParallelRunner is now the default for local simulations.

Week 9 (July 9 - July 15)

Week 9 was mostly occupied by a personal commitment. The remaining time was used to increase test coverage and to extend compatibility down to Python 3.4.

Summary:

  • Added test cases;
  • Better handling of the case in which no git repository is found;
  • Extended compatibility from Python 3.6+ down to Python 3.4+.

Week 10 (July 16 - July 20)

Week 10 was dedicated to the development of a command line interface for SEM. This allows users to run simulations, view results and export them to the .mat and .npy formats without writing a single line of Python code.

Summary:

  • Added a command line interface;
  • Added the ability to export to MATLAB (.mat) and numpy (.npy) data files.
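
A command line interface in the spirit of the one described above might look like this argparse sketch. The command and option names (run, export, --results-dir, --format) are hypothetical, chosen for illustration, and are not necessarily the ones sem actually exposes.

```python
import argparse

# Hypothetical CLI sketch: subcommands for running simulations and for
# exporting results; names are illustrative, not sem's actual interface.
def build_parser():
    parser = argparse.ArgumentParser(prog="sem")
    sub = parser.add_subparsers(dest="command", required=True)

    run = sub.add_parser("run", help="run missing simulations")
    run.add_argument("--results-dir", required=True)

    export = sub.add_parser("export", help="export campaign results")
    export.add_argument("--results-dir", required=True)
    export.add_argument("--format", choices=["mat", "npy"], default="npy")
    export.add_argument("output")
    return parser

args = build_parser().parse_args(
    ["export", "--results-dir", "results", "--format", "mat", "out.mat"])
print(args.command, args.format, args.output)
```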

Week 11 (July 23 - July 27)

Week 11 was dedicated to the development of the last features contained in the original proposal, and to the expansion of documentation and testing.

Summary:

  • Updated documentation;
  • Added a new CLI option to read parameters from a file;
  • Increased testing coverage;
  • Added parsing and support for ns-3 global variables, in addition to standard command line arguments;
  • Support for scratch directory scripts (through Pull Request #15);
  • New methods to export results to a folder structure.
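
Exporting results to a folder structure can be sketched as one directory level per parameter, as below. The exact layout (parameter order, directory and file names) is an assumption for this example; the layout sem actually produces may differ.

```python
import os
import tempfile

# Illustrative sketch: save each result under a directory tree with one
# level per parameter, e.g. <base>/distance=5/nWifi=1/run=0/stdout.
def export_to_folders(base, params, run, stdout):
    parts = ["%s=%s" % (k, params[k]) for k in sorted(params)]
    directory = os.path.join(base, *parts, "run=%d" % run)
    os.makedirs(directory, exist_ok=True)
    with open(os.path.join(directory, "stdout"), "w") as f:
        f.write(stdout)
    return directory

base = tempfile.mkdtemp()
path = export_to_folders(base, {"nWifi": 1, "distance": 5}, 0,
                         "Throughput: 12.1")
print(os.path.relpath(path, base))
```

Sorting the parameter names keeps the tree deterministic regardless of the order in which parameters were specified.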

Week 12 (July 30 - August 3)

Since all proposed features were implemented, Week 12 was dedicated to writing documentation.

Summary:

  • Updated the installation instructions;
  • Added contribution guidelines;
  • Added a description of the testing framework;
  • Added a walkthrough of the provided examples;
  • Added a description of advanced features and workflows.