Building MPI

1 & 3 = Mac & Windows? Or 1 & 3 = CmdStan & PyStan?

I think it would be great to modernize our build system somehow. Maybe we should have a hangout about this with me and @syclik (and whoever else is interested) sometime. I’m also curious about autoconf vs cmake, having never used either.

I meant to say that I don’t know the rstan & pystan build processes, but I can give it a try with autotools. I don’t have production experience with cmake.

My guess is that since Stan is mostly header-only, a modern config/build system has not been a concern, until now. I’ve been thinking about this because I need a better build system to coordinate with a PDE solver that I’d like to connect with Stan to do PDE inverse problems. Leaving setup to users may bypass the problem for the moment, but in the long term it may cause more trouble in issue tracking and maintenance. Ideally a config/build system would also relieve us from carrying all the dependency libraries.

After MPI settles, we can set up a build matrix to design the config/build process.


Minor progress: switching to the clang compiler helped. CmdStan itself built successfully, but both the oral_2cmt_mpi4 and the bernoulli examples promptly failed to build. They did, however, give different errors, attached below.

Someone from my university’s devops team suggested that g++ has some issues with Boost 1.6x.x and that it appeared to be dropping the linker flags. I did notice that the Boost libraries themselves were installed with gcc, so I might see if the different compiler helps there too.
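In case it helps anyone else, this is roughly what switching compilers involves on my setup (treat it as a sketch: $BOOST is just the Boost source tree from my config, and the make/local variable name depends on the CmdStan version):

```bash
# CmdStan: point the build at clang++ instead of g++ (older CmdStan versions
# read CC from make/local, newer ones read CXX -- set whichever yours uses).
echo "CXX=clang++" >> make/local

# Boost: rebuild with the clang toolset so the libraries and the model are
# compiled with the same compiler (assumes Boost.MPI was already enabled via
# "using mpi ;" in project-config.jam).
cd $BOOST && ./b2 toolset=clang stage
```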

Any advice appreciated.

mpi_example_log.txt (1.3 MB)
bernoulli_example_log.txt (26.3 KB)

Hi!

It is no surprise that the bernoulli example does not build, since not all of the functions that are expected to be there are defined (it does not have an mpi_function).

For the oral example it seems that the linker does not find the libboost_mpi and libboost_serialization libs. Can you share how the compiler/linker is invoked, and can you please verify that the dynamic libraries actually exist?
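Something along these lines would show both things (assuming $BOOST points at the Boost source tree you built; the model path is a placeholder, use wherever the example lives in your setup):

```bash
# Verify the shared libraries the linker should pick up are actually there.
ls $BOOST/stage/lib/libboost_mpi* $BOOST/stage/lib/libboost_serialization*

# Capture the exact compile/link command that make issues for the model.
make path/to/oral_2cmt_mpi4 2>&1 | tee build_log.txt
```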

What g++/clang++ version are you using?

Best,
Sebastian

Ah.

I totally forgot to address this: yes, the libraries are successfully built. $(BOOST)/stage/lib contains *.a and *.so files for libboost_mpi, libboost_mpi_python, libboost_python, libboost_serialization and libboost_wserialization, as well as mpi.so.

The compiler is clang++ 3.8.

I had somehow unset the OMPI_CXX variable and it had reverted to g++. After switching back, it appears to find the Boost libraries but still fails to build. It does, however, give a much more interpretable error. Attached.
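For anyone else who trips over this: OpenMPI’s mpicxx wrapper picks its underlying compiler from the OMPI_CXX environment variable, so it is worth checking what the wrapper actually expands to, e.g.:

```bash
# Tell the OpenMPI wrapper to drive clang++ underneath.
export OMPI_CXX=clang++

# Print the full command the wrapper would run (OpenMPI-specific flag);
# the first token should now be clang++ rather than g++.
mpicxx --showme
```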

Thanks for your help.

mpi_example_log.txt (34.2 KB)

Oh… now you are getting an error which is my fault (I think). Let me fix the prototype.

I think that sounds reasonable to me as a plan for building in Math and CmdStan (which I think are our only targets for the initial release of MPI). @syclik, does that sound good to you? I’m approving the PR, but I think it still needs Daniel’s approval (or so github tells me).

Sorry for taking a moment. Could you please pull the latest changes from stan-math for the prototype and then retry?

Thanks.

Looks good to me! Thanks for your help.

Any chance you could put oral4_stan-base.R in the repo to save installing RStan when testing?

Edit: I now see the value of being able to tweak the dataset. Some timings on my cheap VPS server with 2 x86 cores (J = 1):

| Solver method   | Serial (sec) | MPI (sec) |
|-----------------|--------------|-----------|
| Matrix exp.     | 77           | 57        |
| ODE integration | 487          | 276       |

Yeah, I can include that oral4_stan-base.R file, but things are really moving right now, so I need to see how (and if) I can keep this prototype running. map_rect is about to land in Stan (the serial version).

For your timings… If J=1, then there is nothing to parallelize as the program splits the problem onto the cores by patient. So it sounds as if you are seeing the speedups due to using the map_rect version in Stan vs the MPI way of calculating things… although a doubling in speed for the ODE problem sounds like a lot to me. So, was J really equal to 1?

OK. Using J=2:

| Method          | Serial (sec) | MPI (sec) |
|-----------------|--------------|-----------|
| ODE integration | 691          | 596       |

Obviously there’s timing variation between runs, but really I just wanted to check the installation worked at all. I’ll report back when I have time to dive into some of my own models.

Thanks for all your help, can’t wait to get stuck into it.
Andrew

Well, thanks for bearing with me. User feedback right now is very much appreciated; as this thing is rolling into Stan soon, we want to make it as bulletproof as possible in terms of installation and usage.

And sure, I have shown that this stuff speeds up my problems, but it’s always good to have a hold-out sample.

I am starting to pay more attention to this now. Is there a minimal list of Boost files that are not in BH but are necessary for MPI?

I think the minimal set is the Boost MPI and the Boost Serialization libraries. Neither is header-only, and hence they are not included in BH from CRAN.
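For reference, building just those two from a Boost source tree looks roughly like this (exact options can differ between Boost versions, so treat it as a sketch):

```bash
cd $BOOST

# Configure Boost.Build for only the two non-header-only libraries needed here.
./bootstrap.sh --with-libraries=mpi,serialization

# Boost.MPI is only built if MPI support is enabled in the build configuration.
echo "using mpi ;" >> project-config.jam

# Compile and stage the libraries into stage/lib.
./b2 stage
```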

This does not apply to all MPI implementations. For example, Intel MPI ships two sets of wrappers, mpiicc vs. mpicc, for the Intel compiler and the GNU compiler, respectively, and one usually wants to use the Intel wrappers with the Intel compiler.
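As a concrete sketch (wrapper names as documented by Intel MPI; the C++ analogues of mpiicc/mpicc are mpiicpc/mpicxx, so check your local module setup):

```bash
# Intel MPI: mpiicpc drives the Intel C++ compiler (icpc), while mpicxx
# drives g++ by default; -show prints the underlying command for each.
mpiicpc -show
mpicxx -show

# The GNU-style wrapper can also be redirected to another compiler via
# I_MPI_CXX, analogous to OMPI_CXX for OpenMPI.
export I_MPI_CXX=icpc
```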