[CSC 435] CSC 435 MPI Question
Andrew J. Pounds
pounds_aj at mercer.edu
Tue Apr 21 20:18:00 EDT 2020
I don't think that is going to work -- but you can try it. Cleanup
could be ugly though.
If you want to streamline your testing I don't recommend having a single
executable call MPI_Init and MPI_Finalize repeatedly -- you should
generally only do that once. You can, however, mimic what we did in the
BabyBLAS and OpenBLAS assignments and have the size of the matrix and
the number of OpenMP threads to use per processor be COMMAND LINE
ARGUMENTS. That should be a minimal code addition, and you can copy
the relevant portions from the Fortran codes we used before.
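
Very roughly, the main program would look something like this (just a
sketch -- the argument order, names, and usage message are mine, not
what is actually in your mmm_mpi.c):

#include <stdio.h>
#include <stdlib.h>
#include <mpi.h>
#include <omp.h>

int main(int argc, char *argv[]) {
    MPI_Init(&argc, &argv);

    int rank;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    if (argc < 3) {
        if (rank == 0)
            fprintf(stderr, "Usage: %s <matrix_dim> <omp_threads>\n", argv[0]);
        MPI_Finalize();
        return 1;
    }

    int n        = atoi(argv[1]);   /* matrix dimension             */
    int nthreads = atoi(argv[2]);   /* OpenMP threads per process   */
    omp_set_num_threads(nthreads);

    /* ... set up the matrices, run the multiply, report timings ... */

    MPI_Finalize();                 /* initialize/finalize exactly once */
    return 0;
}

A single run would then just be something like
"mpirun -np 2 ./mmm_mpi 4000 4".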
You could then have a PBS script for executing on 1 processor, another
for 2 processors, another for 3 processors, etc., and inside each
script call the same program over and over again -- but use a
different matrix size and number of OpenMP threads per processor in
each call.
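
One of those scripts might look roughly like this (the PBS directives,
the mpirun line, and the executable name are placeholders -- use
whatever matches our cluster setup):

#!/bin/bash
#PBS -N mmm_mpi_np2
#PBS -l nodes=1:ppn=2
#PBS -j oe

cd $PBS_O_WORKDIR

# Same executable every time; only the matrix size and the number of
# OpenMP threads per process change from call to call.
for N in 2000 4000 6000 8000; do
    for T in 1 2 4; do
        mpirun -np 2 ./mmm_mpi $N $T
    done
done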
I expect you to really hit these processors hard, so I made sure before
I left 218 today that the AC was on its lowest setting!
During our time tomorrow afternoon I hope to show you how I had to
structure a program to "re-use" the processes I had created on each node.
On 4/21/20 7:14 PM, William Carl Baglivio wrote:
> Dr. Pounds,
>
> One of the aspects of MPI that I've been trying to stay mindful of is
> making sure that the MPI stuff is fully closed out (so I don't leave
> any fragments behind). In the main section of mmm_mpi.c, I want to run
> the tests over a range of dimensions as opposed to having to go back
> and define the dimension every time. Am I good to loop the code after
> the MPI_Finalize() call? I.e.
>
> for (int i = 2000; i < 10000; i += 2000)
> {
> setting up code; running mmm; returning results; MPI_Finalize();
> }
>
> ~Will B.
--
Andrew J. Pounds, Ph.D. (pounds_aj at mercer.edu)
Professor of Chemistry and Computer Science
Director of the Computational Science Program
Mercer University, Macon, GA 31207 (478) 301-5627