I used this program to train an incremental correlation model and ran into a frustrating problem. Normally it seems reasonable for a subroutine to compute one unit at a time, so when designing the dependent variables we can treat them as belonging to a single unit. For example, if a unit has six equivalent plastic strain components, DEPVAR can be set to 6. However, the pyvumat subroutine passes the strain tensors for many units at once, e.g. with shape [144, 6], so it is no longer obvious how DEPVAR should be set. Is there a way to handle this situation?
I am not clear on what your issue is. You can set DEPVAR in the Abaqus input file to whatever you need. The stateOld/stateNew arrays will then have shape [N, DEPVAR], where N is the number of material points, up to the batch size (which defaults to 144 in Abaqus). This is standard behavior for VUMATs. If you want to work with one material point (is this what you mean by unit?), you can add a loop and access the data for that entry via stateOld[i], which will be a vector of length DEPVAR.
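A minimal sketch of that per-point loop, assuming a NumPy-style model update; the function name, argument names, and the trivial "update" rule are placeholders for illustration, not the actual pyvumat interface:

```python
import numpy as np

def update_state(strain_inc, state_old):
    """Illustrative per-point loop.

    strain_inc : (nblock, 6) strain increments for the whole batch
    state_old  : (nblock, DEPVAR) state variables for the whole batch
    """
    nblock, depvar = state_old.shape          # e.g. (144, 6)
    state_new = np.empty_like(state_old)
    for i in range(nblock):
        # state_old[i] is the DEPVAR-long state vector for one material point
        state_new[i] = state_old[i]
        # placeholder update: accumulate a scalar measure of the strain
        # increment into the first state variable
        state_new[i, 0] += np.linalg.norm(strain_inc[i])
    return state_new

# Usage with the batch size Abaqus passes by default
strain_inc = np.zeros((144, 6))
state_old = np.zeros((144, 6))                # DEPVAR = 6 in the .inp file
state_new = update_state(strain_inc, state_old)
```

In other words, DEPVAR describes one material point; the leading dimension of the arrays is just the batch, and you index into it.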
If this doesn't answer your question, please rephrase or provide additional detail.