While doing the backend data sync research we discussed several possible issues that can occur due to users submitting concurrent mutations, but agreed that none of them seem high-priority enough to address now. There is a fuller write-up here.
This issue is sized as a 3 for time-boxing the "investigate" portion of it. The expected outcome is to massage this issue or create further issues with more accurate sizes, not to implement some or all of the possible action items at the bottom of the issue.
The two biggest issues we thought of:
Because soil depth intervals are validated using Django's `clean` method, it's possible that two concurrent depth interval updates which are individually valid could both get committed, creating overlapping depth intervals.
Because we don't use Django's `update_fields` argument when calling `save` on a mutation, it's possible that two concurrent updates to independent fields could result in only one field getting updated, and the client whose mutation was ignored wouldn't immediately know about it. This is true of all models in the codebase.
Ways to address this:
The first issue can be addressed by removing the non-DB-level constraint around overlapping depth intervals. This could be done by:
figuring out how to make it a DB-level constraint (sketched below)
abandoning the data-level invariant of non-overlapping depth intervals
The second issue can be addressed by using Django's `update_fields` argument (sketched below).
These and similar issues could be addressed by using the "serializable" transaction isolation level, which would require creating retry logic for our transactions. This can be done on a per-transaction or global basis (sketched below).
These and similar issues could be addressed by acquiring row-level locks using Django's `select_for_update` functionality, though care would need to be taken to avoid creating deadlocks (sketched below).
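As an illustration of the DB-level constraint option, here is a minimal sketch using a PostgreSQL exclusion constraint via Django. The `Soil` and `SoilDepthInterval` models, their fields, and the use of a range field for the depth interval are hypothetical stand-ins, not the project's actual schema.

```python
# Sketch only: model and field names are hypothetical, and storing the
# interval as an IntegerRangeField would itself be a schema change.
from django.contrib.postgres.constraints import ExclusionConstraint
from django.contrib.postgres.fields import IntegerRangeField, RangeOperators
from django.db import models


class Soil(models.Model):
    pass


class SoilDepthInterval(models.Model):
    soil = models.ForeignKey(Soil, on_delete=models.CASCADE)
    depth = IntegerRangeField()  # e.g. [0, 10) cm

    class Meta:
        constraints = [
            # Postgres rejects any row whose depth range overlaps an existing
            # interval for the same soil, regardless of what clean() saw.
            # Requires the btree_gist extension for the equality operator
            # (django.contrib.postgres.operations.BtreeGistExtension).
            ExclusionConstraint(
                name="no_overlapping_depth_intervals",
                expressions=[
                    ("soil", RangeOperators.EQUAL),
                    ("depth", RangeOperators.OVERLAPS),
                ],
            ),
        ]
```

With this in place, whichever of two concurrent commits lands second fails with an IntegrityError instead of silently producing overlapping intervals.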
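For the `update_fields` option, a minimal sketch of what a mutation's save call could look like; the `Site` model and its fields are made up for illustration and don't reflect the real schema.

```python
# Sketch only: Site, name, and notes are hypothetical placeholders.
from django.db import models


class Site(models.Model):
    name = models.CharField(max_length=100)
    notes = models.TextField(blank=True)


def update_site_name(site_id: int, new_name: str) -> None:
    site = Site.objects.get(pk=site_id)
    site.name = new_name
    # Only the name column is written, so a concurrent mutation that changed
    # notes on its own stale copy of the row is not silently overwritten.
    site.save(update_fields=["name"])
```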
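For the serializable isolation level, a rough sketch of per-transaction retry logic; the decorator, the retry count, and setting the level with `SET TRANSACTION` are assumptions, not existing code (the global alternative would be the `isolation_level` entry in the database's `OPTIONS` setting).

```python
# Sketch only: assumes PostgreSQL, where serialization failures surface as
# OperationalError and the whole transaction must be retried from the start.
import functools

from django.db import OperationalError, connection, transaction


def retry_serializable(retries=3):
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            for attempt in range(retries):
                try:
                    with transaction.atomic():
                        # Must run before any queries in the transaction.
                        with connection.cursor() as cursor:
                            cursor.execute(
                                "SET TRANSACTION ISOLATION LEVEL SERIALIZABLE"
                            )
                        return func(*args, **kwargs)
                except OperationalError:
                    # atomic() has already rolled back; a real version would
                    # check for SQLSTATE 40001 (serialization_failure) rather
                    # than retrying every OperationalError.
                    if attempt == retries - 1:
                        raise
        return wrapper
    return decorator
```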
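And for the row-level locking option, a sketch using `select_for_update`, reusing the hypothetical `Site` model from the `update_fields` sketch; keeping a consistent lock-acquisition order across mutations is what avoids the deadlocks mentioned above.

```python
# Sketch only: Site is the hypothetical model from the update_fields example.
from django.db import transaction


def update_site_notes(site_id: int, new_notes: str) -> None:
    with transaction.atomic():
        # SELECT ... FOR UPDATE blocks until any other transaction holding
        # this row's lock commits, so concurrent mutations apply in sequence.
        site = Site.objects.select_for_update().get(pk=site_id)
        site.notes = new_notes
        site.save(update_fields=["notes"])
```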