I have an existing Fortran code that uses MPI for its parallel work. I would like to add some of the PETSc solvers (KSP specifically), but when I include the relevant .h or .h90 files (petsc, petscsys, petscksp, etc.) I get errors about entities that share the same name as the MPI ones.
i.e.:
error #6405: The same named entity from different modules and/or program units cannot be referenced. [MPI_DOUBLE_PRECISION]
error #6405: The same named entity from different modules and/or program units cannot be referenced. [MPI_SUM]
error #6405: The same named entity from different modules and/or program units cannot be referenced. [MPI_COMM_WORLD]
and so on.
(using ics/composer_xe_2011_sp1.6.233 and ics/impi/4.0.3.008 and petsc 3.6.0, also tried older petsc version 3.5.4)
All of these are defined in both MPI and PETSc. Is there a way to resolve the conflict and use both?
I'll point out that I DO NOT WANT to replace MPI calls with PETSc calls, as the code should have an option to run independent of PETSc.
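For context, the optional-PETSc requirement is handled with preprocessor guards; the following is only a minimal sketch of that pattern, with a hypothetical USE_PETSC macro (the file must be preprocessed, e.g. named .F90 or compiled with -fpp/-cpp):

program solver
  use mpi
  implicit none
  integer :: ierr

  call mpi_init(ierr)
#ifdef USE_PETSC
  ! PETSc-specific setup and KSP solves would go here, compiled only
  ! when building with -DUSE_PETSC; otherwise the code is pure MPI
#endif
  call mpi_finalize(ierr)
end program solver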
As for a minimal example: stripping down the huge code is impractical, so I've written the following simple program that includes the relevant parts:
program mpitest
use mpi
implicit none
! Try any of the following:
!!!#include "petsc.h"
!!!#include "petsc.h90"
!!!#include "petscsys.h"
! etc.
integer :: ierr, error
integer :: ni=256, nj=192, nk=256
integer :: i, j, k
double precision, allocatable :: phi(:,:,:)
integer :: mp_rank, mp_size
double precision :: sum_phi, max_div, max_div_final, sum_div_final, sum_div
call mpi_init(ierr)
call mpi_comm_rank(mpi_comm_world,mp_rank,ierr)
call mpi_comm_size(mpi_comm_world,mp_size,ierr)
allocate(phi(nj,nk,0:ni/mp_size+1))
phi = 1.0 ! give phi defined values so the averaging below is meaningful
sum_phi = 0.0
do i=1,ni/mp_size
do k=1,nk
do j=1,nj
sum_phi = sum_phi + phi(j,k,i)
enddo
enddo
enddo
sum_phi = sum_phi / real(ni/mp_size*nk*nj)
! sum_div and max_div stand in for quantities computed in the full code;
! give them defined values so the reductions have well-defined inputs
sum_div = sum_phi
max_div = sum_phi
call mpi_allreduce(sum_div,sum_div_final,1,mpi_double_precision,mpi_sum, &
                   mpi_comm_world,ierr)
call mpi_allreduce(max_div,max_div_final,1,mpi_double_precision,mpi_max, &
                   mpi_comm_world,ierr)
deallocate(phi)
call mpi_finalize(error)
write(*,*) 'Done'
end program mpitest
The errors appear as soon as any PETSc header is included and vanish when the include is removed.
Alright, so the answer has been found:
PETSc does not treat Fortran the same way as C/C++; its Fortran interface uses a different set of header files and definitions.
For C/C++ one includes the headers in /include/petscXXX.h and everything works; moreover, the hierarchy is self-contained, so dependent .h files come along automatically (e.g. including petscksp.h pulls in petscsys.h, petscvec.h and so on).
NOT SO IN FORTRAN.
First and foremost, for Fortran one needs to include the headers from /include/petsc/finclude/petscXXXdef.h
(or the .h90 variants if PETSc was compiled with that option). Note that these files live in a different include folder and are named petscXXXdef.h rather than petscXXX.h.
After that, 'use petscXXX' works alongside MPI without any conflict.
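To make this concrete, here is a minimal (untested) sketch of how the test program's header would look with the 3.6-era layout described above; the KSP calls are purely illustrative:

program mpitest
! Preprocessed source (.F90) assumed; the include paths follow the
! PETSc 3.6-era layout; newer PETSc releases have renamed these files.
#include "petsc/finclude/petscsysdef.h"
#include "petsc/finclude/petsckspdef.h"
  use petscksp          ! PETSc KSP module
  use mpi               ! plain MPI module, no name clash now
  implicit none

  integer :: ierr
  KSP     :: ksp        ! PETSc type made available by the *def.h include

  call mpi_init(ierr)
  call PetscInitialize(PETSC_NULL_CHARACTER,ierr)

  call KSPCreate(PETSC_COMM_WORLD,ksp,ierr)
  ! ... set the operator and tolerances, then solve, alongside the
  ! existing MPI communication ...
  call KSPDestroy(ksp,ierr)

  call PetscFinalize(ierr)
  call mpi_finalize(ierr)
end program mpitest

PetscInitialize detects that MPI_Init has already been called and does not initialize MPI again, so the existing MPI setup is left untouched.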