Subject: [gentoo-soc] Weekly Report: MPI Overlay, Week 7
From: Michael Gilroy
Date: 2017-07-17 10:47 UTC
To: gentoo-soc
Cc: soc-admins


Hi,

*Summary:*

Last week's plan:
1) Successfully test mpi-select against hpl, setting aside for now the goal
of having only MPI-dependent libs installed.
2) Implement a proper DEPEND= helper function based on users' make.conf settings.
3) Clarify the design goals for MPI-dependent installations and start work on
executing them.

mpi-select is much more refined but still lacks proper testing due to
Fortran issues I've been having with my work environment(s). I am working
to resolve this, and I have a much better idea of what my next immediate
actions are once I can test again. Sorting out this blocker should improve
next week's productivity.

Next week's plans:
1) Test various make.conf configurations against hpl builds.
2) Set proper LDFLAGS so the build can detect libmpi.so in its new location
(a rough sketch of the idea follows below).
3) Verify the implementations listed in make.conf and expand upon the
dependency-appending function.
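
To make items 1) and 2) concrete, here is the rough shape of what I'll be
testing. The MPI_TARGETS values and the install prefix below are my
assumptions about the /usr/lib64/mpi/* layout, not a finished interface:

    # Hypothetical make.conf settings used for testing; the values and
    # paths here are illustrative only.
    MPI_TARGETS="openmpi mpich"

    # Let the linker find the relocated libmpi.so, e.g. for an openmpi
    # install rooted under /usr/lib64/mpi/openmpi (exact subpath is a guess):
    LDFLAGS="${LDFLAGS} -L/usr/lib64/mpi/openmpi/usr/lib64"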

*Day-by-day breakdown:*

*Monday's Plans:*

Successfully use inherited multibuild functions for MPI purposes

Incomplete. I will be working on this tomorrow.


Add helper function for dependencies based on MPI_TARGETS

Completed, but much more will be added to this function later. MPI_TARGETS
is used, but there are many more dependencies to be accounted for; a rough
sketch of the helper's shape follows.
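
To give a sense of the shape I have in mind (the function name and the
per-implementation package atoms are placeholders, not the final eclass code):

    # Sketch only: mpi_dependencies() is a placeholder name, and the
    # atoms below are illustrative.
    mpi_dependencies() {
        local impl deps=""
        for impl in ${MPI_TARGETS}; do
            case ${impl} in
                openmpi) deps+=" sys-cluster/openmpi" ;;
                mpich)   deps+=" sys-cluster/mpich" ;;
            esac
        done
        echo "${deps}"
    }

    DEPEND="$(mpi_dependencies)"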


Test against hpl, add modified hpl to overlay
Work in progress. I added hpl to the overlay but more work is needed to
successfully build against hpl.


*Tuesday's Plans:*

Successfully use inherited multibuild functions for MPI purposes.

Implemented but not tested. I did not get to work as many hours on this
today, as I have a lot of homework due tomorrow.


Test against arbitrary MPI_TARGETS input in hpl
Testing in progress. I have noticed some small bugs that I need to squash
tomorrow.


*Wednesday's Plans:*

Test mpi-select multibuild for mpi_src_*

Progress has been made, but this can be expanded upon. My goal is to have
this 100% operational by the end of the week. Again, I was busy today due
to school, but I will be working full shifts in the days to come.
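
For reference, this is roughly how I'm wiring MPI_TARGETS into
multibuild.eclass; mpi_src_compile_variant is a placeholder name for the
per-implementation step, and reading MPI_TARGETS directly is a simplification:

    # Sketch, assuming multibuild.eclass drives one build per MPI
    # implementation listed in MPI_TARGETS.
    inherit multibuild

    MULTIBUILD_VARIANTS=( ${MPI_TARGETS} )

    mpi_src_compile_variant() {
        # Placeholder per-variant compile step.
        emake
    }

    src_compile() {
        multibuild_foreach_variant mpi_src_compile_variant
    }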


Debug issues in hpl installation.

I am still running into gcc/Fortran compiler issues on gcc 5+, even after
following the wiki with regard to updating libs. I will continue testing
tomorrow on my bare-metal system to work things out.



*Thursday's Plans:*

Expand upon src_* using multilib.

Testing needed. I am trying to use more multilib function calls to make
things less cluttered.


Test hpl successfully on a new system with proper dependency installations.

I am still struggling with hpl compilation against an environment that has
things installed to /usr/lib64/mpi/*, as I believe mpicc now lives in a
different place when the build looks for it. I will be testing this more
tomorrow and making modifications to the hpl ebuild as needed. Is there a
simple way to tell an ebuild to point to a different directory for libs?
The build currently fails with an "mpicc: command not found" error, which
is what led me to this conclusion. hpl is currently on EAPI 4 as well, so
perhaps I could also give it an EAPI bump this weekend.
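
One approach I'm considering, rather than a confirmed fix, is to point the
ebuild at the relocated toolchain explicitly; the openmpi prefix below is an
assumption about where the implementation lands under /usr/lib64/mpi:

    # Sketch for hpl.ebuild only; paths are assumptions about the
    # /usr/lib64/mpi/<implementation> layout.
    src_configure() {
        # Make the relocated mpicc visible to hpl's build system.
        export PATH="/usr/lib64/mpi/openmpi/usr/bin:${PATH}"
        # Point the linker at the relocated libmpi.so as well.
        export LDFLAGS="${LDFLAGS} -L/usr/lib64/mpi/openmpi/usr/lib64"
        default
    }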


Test against new MPI_TARGETS. Add the ebuilds to the overlay if they are
not already there.

Untested for reasons above.

I will certainly be working quite a bit this weekend, as these blockers have
prevented me from making as much progress as I wanted this week. Once the
testing issues are sorted out, I have some ideas on how to expand upon the
eclass.


*Friday's Plans:*

Sort out mpicc/lib issues in hpl.ebuild, and potentially add this to the
eclass, as the solution may be applicable to other mpicc/lib-dependent MPI
software.

Completed, as per our meeting. Different roadblocks should be easier to
deal with now. Even though my bare-metal setup's compiler should be fine,
I'll send build info for my VM setup's failures.


Squash existing logical bugs in mpi-select through testing.

Work in progress. Now that I know how to overcome the issues I've been
having, this should go more smoothly tomorrow. I will ask questions as needed.


Thanks
Michael

