Hangouts Link: https://meet.google.com/gzm-wmum-pfm
Instructions: Ask to attend in the Hangouts interface and someone should let you in during the first 10 minutes of the meeting. Email email@example.com if you have problems or want to attend the physical meeting in New York City.
Please add your agenda items in replies.
Can we briefly discuss how we review the parallel reduce_sum work?
I started it, Ben rewrote it, Steve did a refactor… so the roles aren’t cleanly split into code author and reviewer…
(I would love to see this go into the next release, given how good the benchmarks are and how easy the design made it to port complex models over.)
This will be my last Stan meeting for a while, I’m heading back to Australia next Wednesday and the time zone will be too difficult.
I’d like to update the group on the status of Stan user community survey, as we have a draft to show. If anyone is interested in helping to get this over the line and/or have access to the data, please let me know on the call or on this thread.
Could we organize a meeting of stakeholders around this issue (parallel reduce_sum)?
the people in the Stan meeting are not the right audience for this - e.g. @rok_cesnovar not present
It’s about reviewing policy… which is a community thing, no?
if you say so… then let’s put this on the math meeting agenda.
I really want to move forward with reduce_sum - it’s going to make many models a lot faster if you give Stan the CPUs.
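For anyone who hasn’t followed the design discussion, here is a minimal sketch of what a model using reduce_sum would look like, assuming the interface from the design doc (a normal likelihood sliced over the data; the variable names are just illustrative):

```stan
functions {
  // Partial sum over a slice of the data; start/end index the slice
  // within the full array. Shared arguments (mu, sigma) come after.
  real partial_sum(real[] y_slice, int start, int end,
                   real mu, real sigma) {
    return normal_lpdf(y_slice | mu, sigma);
  }
}
data {
  int<lower=0> N;
  real y[N];
}
parameters {
  real mu;
  real<lower=0> sigma;
}
model {
  int grainsize = 1;  // let the scheduler pick slice sizes
  mu ~ normal(0, 1);
  sigma ~ normal(0, 1);
  // Sums partial_sum over slices of y, in parallel across threads
  target += reduce_sum(partial_sum, y, grainsize, mu, sigma);
}
```

The point of the design is that porting an existing model mostly means moving the likelihood into a partial-sum function like this.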
@syclik any chance we have time for a Math meeting this week?
I want to talk about the von_mises cdf pull: https://github.com/stan-dev/math/pull/1753
The pdf of the distribution is defined periodically. That seems like an okay way to do it for the pdf, but it seems really sketchy for the cdf, especially with the centering as a parameter as well.
The numerical approximation is off in the 6th digit: https://github.com/stan-dev/math/pull/1753#issuecomment-603286189. This will cause 6th-digit discontinuities in the likelihood. It’s easy enough to document when this happens (the approximation turns on at kappa = 50, so that is where the switch is). Ironing this out will cost time (@pgree will need to develop some new numerics); it might be hard, might be easy. But I thought we might discuss whether this is worth it, since I’m not sure how high the demand for the von_mises cdf really is. It’s also not clear what standards our current code is developed to.
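To make the discontinuity concrete: this is not the PR’s code, just a toy illustration of the mechanism, where two stand-in branches agree only to about six digits and a piecewise switch at kappa = 50 produces a small jump in the value (and hence in any likelihood built on it):

```python
import math

def series_branch(kappa):
    # Stand-in for the evaluation used below the switch point
    return math.exp(-1.0 / kappa)

def asymptotic_branch(kappa):
    # Stand-in for the large-kappa approximation, off in the 6th digit
    return math.exp(-1.0 / kappa) * (1.0 + 1e-6)

def piecewise_value(kappa, switch=50.0):
    # Switching branches at kappa = 50 creates a small jump
    if kappa < switch:
        return series_branch(kappa)
    return asymptotic_branch(kappa)

below = piecewise_value(50.0 - 1e-9)
above = piecewise_value(50.0 + 1e-9)
jump = abs(above - below) / below  # relative jump, about 1e-6
```

Gradient-based samplers will see this jump as a tiny step in the log density whenever kappa crosses the switch point.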
Sorry about the late reply. Please set up meetings whenever you can / want!
I have more availability in the upcoming week in the afternoons eastern on Mon - Wed if you (or anyone) wants to video chat.
Feature freeze is Monday, so we’ll see what gets through by then (and that’ll probably tell us if we need meetings or whatnot).