Just want to remind people that we targeted Thursday to release Stan 2.18. Let’s use this thread to coordinate that process.
Thanks! Mind putting that notice up when it gets decided next time? (I didn’t know until right now.)
The stuff that needs to be done for math:
- Add milestones for 2.18 and 2.18++.
- Assign issues to the 2.18 milestone. Everything that isn’t getting fixed this milestone will be pushed back to 2.18++.
I’ll start walking through the issues just for assigning to 2.18 and 2.18++.
I usually take care of the milestones during the release process, unless I’m misunderstanding something.
It was during a meeting a couple of weeks ago. Thursday isn’t a hard deadline, but doing it anytime before Thursday didn’t make much sense with RStan’s CRAN release restrictions.
It’s about what gets into a release.
With a soft deadline, now we have to decide if we’re waiting for any pull requests. If not, then great. (I’m all for just having a releasable math library, but we usually have someone wanting a particular feature in the next release.)
I think there was mention of MPI. If we’re waiting for that to go in before releasing, then we have a lot of work to do before the release.
I see, that is a good way to use milestones, haha. If we think we have close to a month’s worth of work to do on MPI, I say we release 2.18 now with the good stuff in there (vectorized RNGs come to mind) and do another release in a month. The actual work of making a release isn’t too bad, though I know @Bob_Carpenter has expressed that he thinks there is a cost to releasing software too often in terms of confusing our users (I may not be repeating that correctly). The only other limit there is for CRAN, which I think will institutionally dislike you if you try to release more than once a month or something like that.
I’m all for releasing 2.18 asap (without holding up for particular features). There are already a lot of good features in there.
And yes, there is a cost to too many releases. That said, I think once a quarter is pretty reasonable. (Last release was Dec. 11, 2017)
What about once a month? What are those costs in your view? Just trying to get a handle on the different ways people think about this.
I’m less sensitive about the release schedule because I’m always up to date with Math, Stan, and CmdStan. As long as we’re careful, stay truly backwards compatible, and don’t change the interface every release, it’s fine.
My RStan installation is always behind because it’s a heavy process (updating locally, making sure that doesn’t conflict with anything else I have installed, waiting around for CRAN). For R, I don’t want to reinstall more than once a quarter (even if it moves faster than that) because it’s an unknown amount of time that I have to sink into the installation, every time.
And you’re saying that a new RStan release forces users to upgrade?
I’ll just speak for myself. I tend to not upgrade RStan unless I have to (new language features I want to use). The reason is that every time I do, there’s always a little hiccup. Often minor (having to uninstall a bunch of packages, restart, then reinstall), but sometimes major (doing that doesn’t work, doing it again doesn’t work, then I try reinstalling R, then installing everything again from source).
I like the roughly quarterly release schedule. Given the trouble people have installing and their eagerness to install new releases, roughly quarterly seems like a good balance.
I haven’t personally had any problems (other than Anaconda deciding it wanted to be the boss of my PATH at one point, including deciding which R gets launched).
But judging from the traffic on the mailing lists and how much people seem to like to upgrade these stats libraries, I’d say more than quarterly would be asking for trouble.
So we think people just enjoy updating Stan, and so we ask for more trouble by releasing more often?
Second that. I run CmdStan from develop, so you can upgrade it and Math all day long. I haven’t tried PyStan enough to have an opinion, and I have the same rstan problems. We could probably script an install that used ‘install.packages()’ but properly cleaned the environment and got BH right (or, if it couldn’t, choked and automatically fell back to Ben’s devtools install_github scripts).
Yes, I think a lot of people just update by default if they see a newer version.
There are no problems updating PyStan; one could do it every day.
The problems we usually have are C++ compilers and PATH issues, mostly with Anaconda: it can ship its own Boost lib (this can probably be fixed) and its own libstdc++ files (a much harder problem).
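To make the PATH-shadowing part concrete, here’s a toy diagnostic sketch (not part of PyStan or Stan; the function name is invented) that lists every g++ on the PATH in search order, which is usually enough to spot an Anaconda toolchain shadowing the system one:

```python
import os

# Toy sketch: list every executable named `name` on PATH, in search
# order. The first hit is what a bare `g++` invocation would run, so
# an Anaconda entry appearing before /usr/bin is the usual red flag.
def compilers_on_path(name="g++", path=None):
    path = path if path is not None else os.environ.get("PATH", "")
    hits = []
    for d in path.split(os.pathsep):
        cand = os.path.join(d, name)
        if os.path.isfile(cand) and os.access(cand, os.X_OK):
            hits.append(cand)
    return hits

if __name__ == "__main__":
    for c in compilers_on_path():
        print(c)
```

If there is more than one hit, the installed package was probably compiled against whichever toolchain came first, which may not match the libstdc++ loaded at run time.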
Any failures due to Boost/libstdc++ issues that don’t affect the rest of a user’s system are our problem (from almost any user’s perspective), so I don’t think you can really say this. This is one of the irritations I have with rstan: the compiler setup will work fine with every other Rcpp program, and it’ll work fine with straight C++ programs, but the specific make magic we use fails.
True, but that is a problem with installation, not updating.
E.g., the Boost thing happens only with some Anaconda versions (or, actually, only if some other package needs it), and libstdc++ was a problem with the gcc version when Ubuntu changed the defaults.
Is this really a fair comparison? What other Rcpp programs involve the runtime compilation of Rcpp source code?
The story is similar with PyStan. There are no other Python packages that involve runtime compilation of new programs. (There’s some JIT stuff, but that involves some sort of embedded compiler, not a system compiler.)
This does raise a really good point, though. Is anyone thinking about a blue-sky solution to this problem? I realize it might involve abandoning C++, but it seems like that’s more or less the path being followed by some people. For example, the Tensor Comprehensions DSL work looked promising.
What if we could generate some sort of “Stan Intermediate Representation” of a program and then JIT that? We wouldn’t have to worry about Boost incompatibilities, that’s for sure.
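As a toy illustration of the idea (this has nothing to do with any actual Stan internals; the IR format and names here are invented), a program could be lowered to a tiny expression IR that an interpreter, or eventually a JIT, consumes directly, with no system C++ toolchain in the loop:

```python
import math

# Toy "intermediate representation" sketch: expressions as nested
# tuples, evaluated by a tree-walking interpreter. A real backend
# could JIT this IR instead of shelling out to a C++ compiler.
OPS = {
    "add": lambda a, b: a + b,
    "mul": lambda a, b: a * b,
    "exp": lambda a: math.exp(a),
}

def eval_ir(node, env):
    """Evaluate an IR node: ("const", v), ("var", name), or (op, *children)."""
    tag = node[0]
    if tag == "const":
        return node[1]
    if tag == "var":
        return env[node[1]]
    args = [eval_ir(child, env) for child in node[1:]]
    return OPS[tag](*args)

# IR for mu + sigma * z (a non-centered reparameterization, say):
program = ("add", ("var", "mu"), ("mul", ("var", "sigma"), ("var", "z")))
print(eval_ir(program, {"mu": 1.0, "sigma": 2.0, "z": 0.5}))  # → 2.0
```

The point isn’t the interpreter itself but the boundary it draws: everything above the IR is portable, and only the (optional) JIT backend needs to know anything about the host machine.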