When we convert a distributed `StencilVector`, `StencilMatrix` or `BlockLinearOperator` to `PETSc.Vec` or `PETSc.Mat` format with the `topetsc()` method, the corresponding (dense) local arrays cannot be compared directly, because PETSc distributes the vectors and matrices differently from Psydac.
In a distributed Psydac `StencilVector` or `StencilMatrix`, a process may own non-contiguous rows (for instance, when a 2D domain is decomposed among 4 processes). With the distributed PETSc `Vec` or `Mat` object, by contrast, each process always owns a contiguous block of rows. This is possible because the assembly of the PETSc `Vec` or `Mat` object involves global communication, which moves entries to their owning processes.
PETSc does this because it has no information about how the domain is decomposed; it therefore simply splits the vector or matrix into uniform, contiguous blocks of rows.
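For illustration, here is a minimal petsc4py sketch (the global size and communicator are arbitrary placeholders) showing that a parallel `Mat` created without any layout information ends up with one contiguous, roughly uniform block of rows per process, which can be checked with `getOwnershipRange()`:

```python
# Minimal sketch: PETSc's default layout is a contiguous, nearly uniform
# split of the rows, independent of Psydac's domain decomposition.
from mpi4py import MPI
from petsc4py import PETSc

comm = MPI.COMM_WORLD
n = 16  # hypothetical global number of rows/columns

A = PETSc.Mat().createAIJ([n, n], comm=comm)
A.setUp()
A.assemble()

rstart, rend = A.getOwnershipRange()
print(f"rank {comm.rank}: owns contiguous rows [{rstart}, {rend})")
```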
The major consequence of this issue is that the function `petsc_to_array` becomes unnecessarily expensive, since we need to gather the array from all processes and redistribute it according to Psydac's domain partition.
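The expense comes from the fact that each process only holds PETSc's contiguous local block, so reconstructing the array in Psydac's ordering requires collecting all blocks first. A rough sketch of that gather step (not Psydac's actual implementation; the final reordering with Psydac's index maps is omitted):

```python
# Rough sketch of the costly gather step (not the actual petsc_to_array code).
import numpy as np
from mpi4py import MPI
from petsc4py import PETSc

comm = MPI.COMM_WORLD
n = 16  # hypothetical global size

v = PETSc.Vec().createMPI(n, comm=comm)
v.set(1.0)
v.assemble()

# Each process only sees PETSc's contiguous local block ...
local_block = v.getArray()

# ... so recovering the full array, before redistributing it according to
# Psydac's domain partition, requires a global gather.
full_array = np.concatenate(comm.allgather(local_block))
```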
A potential solution would be to create a `PETSc.DM` object that carries the domain decomposition information from Psydac.
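For a 2D tensor-product grid this could be a `DMDA` built from Psydac's Cartesian decomposition. A hedged sketch (the grid size, process grid and ownership ranges below are illustrative placeholders, not values taken from Psydac's API; it assumes 4 MPI ranks):

```python
# Hedged sketch: a DMDA mirroring a 2D Cartesian decomposition. In practice
# the values below would come from Psydac's domain decomposition objects.
from mpi4py import MPI
from petsc4py import PETSc

comm = MPI.COMM_WORLD

npts       = [8, 8]        # global grid points per direction (placeholder)
proc_sizes = [2, 2]        # process grid (placeholder; requires 4 ranks)
own_ranges = [[4, 4],      # points owned along x by each process column
              [4, 4]]      # points owned along y by each process row

dm = PETSc.DMDA().create(dim=2, dof=1,
                         sizes=npts,
                         proc_sizes=proc_sizes,
                         ownership_ranges=own_ranges,
                         stencil_width=1,
                         comm=comm)

# Each process can query the logical sub-box of the grid it owns, and
# vectors created from the DM inherit this decomposition.
(xs, xe), (ys, ye) = dm.getRanges()
x = dm.createGlobalVec()
print(f"rank {comm.rank}: owns grid points [{xs},{xe}) x [{ys},{ye})")
```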
Note: in order to get the global indices of the rows and columns owned by each process in the `PETSc.Mat` object, one can use the method `getOwnershipIS()`.
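A small usage sketch (the matrix here is just a hypothetical assembled `Mat`):

```python
# Sketch: global indices of the rows/columns owned by this process.
from mpi4py import MPI
from petsc4py import PETSc

comm = MPI.COMM_WORLD
A = PETSc.Mat().createAIJ([16, 16], comm=comm)  # hypothetical 16x16 matrix
A.setUp()
A.assemble()

rows_is, cols_is = A.getOwnershipIS()
owned_rows = rows_is.getIndices()   # global row indices owned by this process
owned_cols = cols_is.getIndices()   # global column indices owned by this process
```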