Bugzilla – Bug 2490
Feature request: changing logging timestamp precision
Last modified: 2017-01-22 21:54:59 EST
The default logging timestamp uses the default output stream precision of 6, and is not fixed; e.g. the time 1.044620241 seconds is printed as:

    1.04462s

This default doesn't reveal event timestamps that occur at finer granularity, e.g. in Wi-Fi. I realize that I can substitute the default method with my own by calling

    LogSetTimePrinter (&MyNewTimePrinterFunction);

but that requires me to write a new function and set the callback.

One possibility is to change the default in ns-3-dev for future releases. I would prefer nanosecond precision. I can hack the default TimePrinter implementation to achieve nanosecond resolution, such as:

    static void
    TimePrinter (std::ostream &os)
    {
    -  os << Simulator::Now ().GetSeconds () << "s";
    +  std::streamsize oldPrecision = os.precision ();
    +  os << std::fixed << std::setprecision (9) << Simulator::Now ().GetSeconds () << "s";
    +  os << std::setprecision (oldPrecision);
    +  os.unsetf (std::ios_base::fixed);
    }

A more configurable solution would be to allow users to change it via some public API. Here are two possible user APIs:

1) a new GlobalValue, e.g.

    --LogTimePrecision=[6]
        The precision on the output stream for logging timestamps

2) a new function in log.{cc,h}, to parallel the LogSetTimePrinter() function, such as:

    /**
     * Set the printing resolution on the default implementation for log time printing.
     * N.B. if the user replaces the default time printer implementation, this
     * method will have no effect.
     *
     * \param [in] precision The precision value to use for time printing
     */
    void LogSetTimePrecision (std::streamsize precision);

From my perspective, changing the default to std::fixed and std::setprecision (9) is adequate for my needs, since users still have the ability to set a new callback function to change this, but I'm posting this to gather other inputs/requirements and to gauge whether people are interested in making this easily configurable.
Created attachment 2628 [details] suggested patch
1. My only significant concern is the last line:

    +  os.unsetf (std::ios_base::fixed);

You don't know if fixed was set initially, so just unsetting it might leave the stream state altered. This is a bear to find and fix later, especially as it depends on whether log timestamps have been turned on.

My solution to this, from int64x64.cc, is:

    std::ios_base::fmtflags ff = os.flags ();  // Save stream flags
    ...
    os.flags (ff);                             // Restore stream flags

2. I have a slight preference for

    +  os << std::fixed << std::setprecision (9)
          << Simulator::Now ().As (Time::S);

GetSeconds loses precision; As does not.

3. It would be useful to add a comment to logging.rst and sample-log-time-format.cc pointing out that the maximum useful precision is 20 decimal digits, since Time is signed 64 bits.

4. Easier configuration would be nice, but not strictly necessary if you just want to move on. I have a slight aversion to creating GlobalValues for everything, so I would disfavor --LogTimePrecision.

Previously I wrote you privately:

> [I would] favor LogSetTimePrecision(). An additional useful utility would be
> LogSetTimeUnit(Time::Unit)...

However, there is no end to the number of customizations this leads to. Looking at it now I would prefer one default behavior and the most general LogSetTimePrinter(), with nothing intermediate.
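The flags save/restore idiom from point 1 can be demonstrated self-contained; PrintFixedNine is a hypothetical stand-in for the patched TimePrinter, using a plain double rather than Simulator::Now ():

```cpp
#include <iomanip>
#include <ostream>
#include <sstream>

// The int64x64.cc idiom: save all format flags up front and restore them
// at the end, so std::fixed is cleared afterwards only if the caller had
// not already set it, leaving the caller's stream state untouched.
void PrintFixedNine (std::ostream &os, double seconds)
{
  std::ios_base::fmtflags ff = os.flags ();  // Save stream flags
  std::streamsize oldPrecision = os.precision ();
  os << std::fixed << std::setprecision (9) << seconds << "s";
  os.precision (oldPrecision);
  os.flags (ff);                             // Restore stream flags
}
```

A caller that had already selected std::fixed keeps it after the call; a caller that had not does not end up with it set, which is exactly the leak that a bare os.unsetf (std::ios_base::fixed) cannot guarantee.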
Instead of printing at fixed ns precision, what about printing at the Time::GetResolution()?
Do you think we are close to delivering the fix to unblock bug 2491? Thanks :-)
Created attachment 2730 [details] revised patch (approach 1: fixed NS precision)
Created attachment 2731 [details] approach 2 (precision follows Time::GetResolution())
(In reply to Peter Barnes from comment #3) > Instead of printing at fixed ns precision, what about printing at the > Time::GetResolution()? I added this variant as 'approach 2'; I tend to like it slightly better.
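The core of 'approach 2' is mapping the configured time resolution to a decimal precision for printing in seconds. A sketch of that mapping, with a stand-in TimeUnit enum rather than ns-3's actual Time::Unit (the real patch would query Time::GetResolution ()):

```cpp
#include <ios>

// Stand-in for ns-3's Time::Unit enumerators at and below one second;
// the real enum also has larger units (Y, D, H, MIN) before S.
enum TimeUnit { S = 0, MS, US, NS, PS, FS };

// Each step below seconds is a factor of 1000, i.e. 3 more decimal
// digits: S -> 0, MS -> 3, US -> 6, NS -> 9, PS -> 12, FS -> 15.
std::streamsize DigitsForResolution (TimeUnit unit)
{
  return static_cast<std::streamsize> (3 * unit);
}
```

The appeal over a hard-coded 9 is that output precision then tracks whatever resolution the simulation was actually configured with, showing neither more nor fewer digits than are meaningful.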
(In reply to Peter Barnes from comment #2)
> 1. My only significant concern is the last line:
>
>     +  os.unsetf (std::ios_base::fixed);
>
> You don't know if fixed was set initially, so just unsetting it might leave
> the stream state altered. This is a bear to find and fix later, especially
> as it depends on whether log timestamps have been turned on.
>
> My solution to this, from int64x64.cc, is
>
>     std::ios_base::fmtflags ff = os.flags ();  // Save stream flags
>     ...
>     os.flags (ff);                             // Restore stream flags
>
> 2. I have a slight preference for
>
>     +  os << std::fixed << std::setprecision (9)
>           << Simulator::Now ().As (Time::S);
>
> GetSeconds loses precision; As does not.
>
> 3. It would be useful to add a comment to logging.rst and
> sample-log-time-format.cc pointing out that the maximum useful precision is
> 20 decimal digits, since Time is signed 64 bits.

Thanks for the above comments; they are in the revised patch (approach 1).

> 4. Easier configuration would be nice, but not strictly necessary if you
> just want to move on. I have a slight aversion to creating GlobalValues for
> everything, so I would disfavor --LogTimePrecision.
>
> Previously I wrote you privately:
> > [I would] favor LogSetTimePrecision(). An additional useful utility would be
> > LogSetTimeUnit(Time::Unit)...
>
> However, there is no end to the number of customizations this leads to.
> Looking at it now I would prefer one default behavior and the most general
> LogSetTimePrinter(), with nothing intermediate.

I left it as is.
+1 for approach 2
(In reply to Peter Barnes from comment #9) > +1 for approach 2 Pushed approach 2 in changeset: 12560:9bf327ae8008