I have looked right through the Stan manual (v2.17) and don't see any function to get the system time. Does one exist?
It would be really useful to be able to get the system time in milliseconds (as an integer or real) in order to time different parts of a model. For instance, I have a model (discussed in this thread) that loops through a large number of probability samples, and I'd like to know which statements take the longest. If I could read the system time, to the millisecond, at any point in the program, I could accumulate the elapsed time for each calculation, work out where the pressure points are, and so know where to focus my model-optimization effort.
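For illustration, here is a minimal sketch (in Python, since Stan has no such function) of the checkpoint-timing pattern I have in mind: record a timestamp before and after each calculation and accumulate the elapsed time per named section, then report the totals to see where the time goes. The section names and stand-in calculations are hypothetical.

```python
import time

elapsed = {}  # cumulative seconds per section name

def tic():
    # Read a high-resolution monotonic clock
    return time.perf_counter()

def toc(start, name):
    # Add the time since `start` to the running total for `name`
    elapsed[name] = elapsed.get(name, 0.0) + (time.perf_counter() - start)

# Stand-in for the model's loop over probability samples
for _ in range(1000):
    t = tic()
    a = sum(i * i for i in range(100))    # stand-in for "calculation A"
    toc(t, "calculation_A")

    t = tic()
    b = [i ** 0.5 for i in range(100)]    # stand-in for "calculation B"
    toc(t, "calculation_B")

# Report cumulative elapsed time per section, slowest first
for name, secs in sorted(elapsed.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {secs * 1000:.1f} ms")
```

With a millisecond-resolution time function inside Stan, the same pattern could be written directly in the model block.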