System time function for profiling

I have looked right through the Stan manual v2.17 and don’t see any function to get the system time. Does one exist?

It would be really useful to be able to get the system time in milliseconds (as an integer or real) in order to time different parts of a model. For instance, I have a model (discussed in this thread) that loops through a large number of probability samples, and I’d like to know which statements take the longest. If I could read the system time, to the millisecond, at any particular point, I could calculate the cumulative elapsed time for each calculation, work out where the bottlenecks are, and see where to focus my model-optimization effort.

It does not exist in the Stan language, and some would consider it too dangerous to expose. Of course, you can do pretty much anything if you write your own C++ implementation of an undefined Stan function:
https://cran.r-project.org/web/packages/rstan/vignettes/external.html
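For what it’s worth, here is a rough sketch of what such an external implementation might look like. The function name `system_time_ms` and the header file name are my own inventions, and the hook-up follows the general pattern in that vignette (declare the function without a body in the Stan `functions` block and supply the definition via included C++):

```cpp
// system_time.hpp -- hypothetical header supplied to rstan's external C++
// mechanism. Matches a Stan declaration of:  real system_time_ms();
#include <chrono>
#include <ostream>

// rstan-generated code passes a trailing std::ostream* used by print
// statements; an external definition must accept it even if unused.
inline double system_time_ms(std::ostream* pstream__) {
  using namespace std::chrono;
  // Wall-clock milliseconds since the Unix epoch, returned as a double
  // so it fits Stan's real type.
  return static_cast<double>(
      duration_cast<milliseconds>(
          system_clock::now().time_since_epoch()).count());
}
```

On the R side you would then compile with something like `stan_model(file, allow_undefined = TRUE, includes = ...)` pointing at that header, per the vignette.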


Thanks, that’s helpful! I’ll look into that.

That won’t work as a way to time the model. When a Stan program executes, autodiff operates in two phases. In the forward pass, as the code executes, each operation adds an expression to the expression graph through template overloading. Then, after the expression graph is built, a reverse pass over the graph evaluates the gradient. Different functions have different ratios of forward to reverse execution time, so a timer embedded in the Stan program would only measure the forward pass and could badly misrepresent a function’s true cost.
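To make the two phases concrete, here is a minimal sketch against the Stan Math C++ library (assuming its headers and dependencies are available to the compiler). A timer inside a Stan program would only observe something like the `t1 - t0` interval, not the reverse-pass cost:

```cpp
#include <stan/math.hpp>
#include <chrono>
#include <iostream>

int main() {
  using stan::math::var;
  using clock = std::chrono::steady_clock;

  var x = 2.0;

  // Forward pass: evaluating the expression records nodes on the
  // autodiff stack via operator/function overloading on var.
  auto t0 = clock::now();
  var y = stan::math::exp(x) * stan::math::sin(x);
  auto t1 = clock::now();

  // Reverse pass: propagate adjoints back through the recorded graph.
  y.grad();
  auto t2 = clock::now();

  std::cout << "value: " << y.val() << ", dy/dx: " << x.adj() << "\n"
            << "forward: "
            << std::chrono::duration<double, std::micro>(t1 - t0).count()
            << " us, reverse: "
            << std::chrono::duration<double, std::micro>(t2 - t1).count()
            << " us\n";

  stan::math::recover_memory();  // free the autodiff stack
}
```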

And then there are really two aspects of efficiency to worry about: the per-iteration efficiency of the Stan program and the statistical efficiency of the model. I discuss both in the efficiency chapter of the manual.