<html>
<head>
<meta http-equiv="Content-Type" content="text/html;
charset=windows-1252">
</head>
<body>
<div class="moz-cite-prefix"><br>
</div>
<div class="moz-cite-prefix">And remember -- PBS is just executing a
bash script -- there's nothing preventing you from using a
couple of nested loops around your mpirun command to set the
number of threads and the matrix size. It may be kinda late for
that now -- but that's a trick I've used before in this type of
scenario.<br>
</div>
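<div class="moz-cite-prefix">For example, a dry-run sketch of that trick -- the executable name, process count, and loop ranges below are placeholders, not the actual assignment values:</div>

```shell
#!/bin/bash
# Sketch of nested loops a PBS script could run around mpirun.
# ./mmm_mpi, -np 4, and the ranges are illustrative placeholders.
# echo makes this a dry run; drop it to actually launch the runs.
for threads in 1 2 4; do
  for size in 2000 4000 6000 8000; do
    echo mpirun -np 4 ./mmm_mpi "$size" "$threads"
  done
done
```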
<div class="moz-cite-prefix"><br>
</div>
<div class="moz-cite-prefix"><br>
</div>
<div class="moz-cite-prefix">On 4/21/20 8:18 PM, Andrew J. Pounds
wrote:<br>
</div>
<blockquote type="cite"
cite="mid:bbd85c4e-bd9b-fd8b-efe7-e61e1bcc2e1b@mercer.edu">
<div class="moz-cite-prefix">I don't think that is going to work
-- but you can try it. Cleanup could be ugly though.</div>
<div class="moz-cite-prefix"><br>
</div>
<div class="moz-cite-prefix">If you want to streamline your
testing I don't recommend having a single executable call
MPI_Init and MPI_Finalize repeatedly -- you should
generally only do that once. You can, however, mimic what we
did in the BabyBLAS and OpenBLAS assignments and have the size
of the matrix and the number of OpenMP threads used
per processor be COMMAND LINE ARGUMENTS. That should be a
minimal code addition and you can copy the relevant portions
from the Fortran codes we used before.<br>
</div>
<div class="moz-cite-prefix"><br>
</div>
<div class="moz-cite-prefix">You could then have a PBS script for
executing on 1 processor, another for 2 processors, another for
3 processors, etc., and inside each script call the same
program over and over again -- but use a different matrix size
and number of OpenMP threads per processor in each call. <br>
</div>
<div class="moz-cite-prefix"><br>
</div>
<div class="moz-cite-prefix">I expect you to really hit these
processors hard, so I made sure before I left 218 today
that the AC was on its lowest setting!</div>
<div class="moz-cite-prefix"><br>
</div>
<div class="moz-cite-prefix"><br>
</div>
<div class="moz-cite-prefix">During our time tomorrow afternoon I
hope to show you how I had to structure a program to "re-use"
the processes I had created on each node.</div>
<div class="moz-cite-prefix"><br>
</div>
<div class="moz-cite-prefix"><br>
</div>
<div class="moz-cite-prefix">On 4/21/20 7:14 PM, William Carl
Baglivio wrote:<br>
</div>
<blockquote type="cite"
cite="mid:8c870007b84c430c8f30a2200f45a8fc@BN6PR01MB2228.prod.exchangelabs.com">
<style type="text/css" style="display:none;"> P {margin-top:0;margin-bottom:0;} </style>
<div style="font-family: Calibri, Arial, Helvetica, sans-serif;
font-size: 12pt; color: rgb(0, 0, 0);"> Dr. Pounds,</div>
<div style="font-family: Calibri, Arial, Helvetica, sans-serif;
font-size: 12pt; color: rgb(0, 0, 0);"> <br>
</div>
<div style="font-family: Calibri, Arial, Helvetica, sans-serif;
font-size: 12pt; color: rgb(0, 0, 0);"> One of the aspects of
MPI that I've been trying to stay mindful of is making sure
that the MPI stuff is fully closed out (so I don't leave any
fragments behind). In the main section of mmm_mpi.c, I want to
run the tests over a range of dimensions as opposed to having
to go back and redefine the dimension every time. Am I good to
loop the code after the MPI_Finalize() call? I.e.</div>
<div style="font-family: Calibri, Arial, Helvetica, sans-serif;
font-size: 12pt; color: rgb(0, 0, 0);"> <br>
</div>
<div style="font-family: Calibri, Arial, Helvetica, sans-serif;
font-size: 12pt; color: rgb(0, 0, 0);"> for (int i = 2000; i
&lt; 10000; i += 2000)</div>
<div style="font-family: Calibri, Arial, Helvetica, sans-serif;
font-size: 12pt; color: rgb(0, 0, 0);"> {</div>
<div style="font-family: Calibri, Arial, Helvetica, sans-serif;
font-size: 12pt; color: rgb(0, 0, 0);"> setting up code;
running mmm; returning results; MPI_Finalize();</div>
<div style="font-family: Calibri, Arial, Helvetica, sans-serif;
font-size: 12pt; color: rgb(0, 0, 0);"> }</div>
<div style="font-family: Calibri, Arial, Helvetica, sans-serif;
font-size: 12pt; color: rgb(0, 0, 0);"> <br>
</div>
<div style="font-family: Calibri, Arial, Helvetica, sans-serif;
font-size: 12pt; color: rgb(0, 0, 0);"> ~Will B.</div>
</blockquote>
<p><br>
</p>
<pre class="moz-signature" cols="72">--
Andrew J. Pounds, Ph.D. (<a class="moz-txt-link-abbreviated" href="mailto:pounds_aj@mercer.edu" moz-do-not-send="true">pounds_aj@mercer.edu</a>)
Professor of Chemistry and Computer Science
Director of the Computational Science Program
Mercer University, Macon, GA 31207 (478) 301-5627
</pre>
</blockquote>
<p><br>
</p>
<pre class="moz-signature" cols="72">--
Andrew J. Pounds, Ph.D. (<a class="moz-txt-link-abbreviated" href="mailto:pounds_aj@mercer.edu">pounds_aj@mercer.edu</a>)
Professor of Chemistry and Computer Science
Director of the Computational Science Program
Mercer University, Macon, GA 31207 (478) 301-5627
</pre>
</body>
</html>