
Re: MPI debugging workflows



On 2018-08-29 12:15, Alastair McKinstry wrote:
On 28/08/2018 22:20, Drew Parsons wrote:
On 2018-08-03 22:46, Dima Kogan wrote:

2. Is the MPI implementation significant? Would mpich behave potentially
differently here from openmpi?

For what it's worth, two separate upstreams (PETSc and FEniCS) both hold a dim view of openmpi, perceiving it as full of bugs, which has certainly been the case so far with openmpi3. They recommend mpich.

We've discussed transitioning mpi-defaults from openmpi to mpich before. Now that openmpi3 is more or less settled in testing, is it time to reopen that discussion?

I'm in favour of moving the default to mpich. OpenMPI is now at 3.1.2 and
I plan to ship 3.1.x in Buster.

It's also worth testing MPICH 3.3b3. Currently mpich is at 3.3b2, and
3.3b3 has been in experimental for the last few months. I've held off
updating mpich until openmpi is stable (nearly there, I hope).



It's a comedy of errors with openmpi3: I see 3.1.2 has triggered new RC bugs!

If you want a break from the openmpi angst then go ahead and drop mpich 3.3b3 into unstable. It won't make the overall MPI situation any worse... :)

Drew
