Out-of-memory checks for large data #533
Labels: `area: data` (things related to the data structures underlying the world, and the functions that manipulate them), `kind: feature` (adding user-facing/developer-facing functionality)
Voxel data can easily be very large, even accidentally. Therefore, we should make some amount of effort to handle out-of-memory failures for the large allocations. This cannot completely prevent running out of memory fatally, since a small allocation might fail after a large one took up nearly all available memory, but it should result in a much higher probability of continued functioning.
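As an illustrative sketch only (the helper and error type below are hypothetical names, not existing items in the codebase), the large allocations could go through a fallible path built on `Vec::try_reserve_exact()`, which reports allocation failure as a `Result` instead of aborting the process:

```rust
use std::collections::TryReserveError;

/// Hypothetical error type for reporting a failed large allocation
/// so the caller can degrade gracefully instead of crashing.
#[derive(Debug)]
pub struct OutOfMemoryError {
    /// Number of elements that were requested.
    pub requested: usize,
}

/// Allocate a volume's worth of voxel data fallibly.
/// `Vec::try_reserve_exact()` returns `Err` on allocation failure
/// rather than aborting, so out-of-memory becomes a recoverable error.
pub fn try_alloc_voxels<T: Clone>(volume: usize, fill: T) -> Result<Vec<T>, OutOfMemoryError> {
    let mut buffer: Vec<T> = Vec::new();
    buffer
        .try_reserve_exact(volume)
        .map_err(|_: TryReserveError| OutOfMemoryError { requested: volume })?;
    // The reservation succeeded, so this fill will not reallocate.
    buffer.resize(volume, fill);
    Ok(buffer)
}
```

The allocation sites listed below could then propagate such an error to their callers rather than crashing.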
Potentially large allocations include:
* `Space` creation
* Adding light to a `Space` that did not previously have it
* `Block::evaluate()` producing evaluated voxels

It might also be interesting to keep global counters of the total amount of memory allocated for each of these purposes, for diagnostics. That is an entirely separate implementation effort, except that it touches the same parts of the code.
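A minimal sketch of that counter idea, assuming plain atomics are acceptable for diagnostics; the statics and guard type here are hypothetical names, not existing code:

```rust
use std::sync::atomic::{AtomicUsize, Ordering};

/// Hypothetical global counters of bytes currently allocated per purpose.
pub static SPACE_BYTES: AtomicUsize = AtomicUsize::new(0);
pub static EVALUATED_VOXEL_BYTES: AtomicUsize = AtomicUsize::new(0);

/// RAII guard that adds to a counter on creation and subtracts on drop,
/// so the diagnostic total tracks the lifetime of the owning allocation.
pub struct MemoryCounterGuard {
    counter: &'static AtomicUsize,
    bytes: usize,
}

impl MemoryCounterGuard {
    pub fn new(counter: &'static AtomicUsize, bytes: usize) -> Self {
        counter.fetch_add(bytes, Ordering::Relaxed);
        Self { counter, bytes }
    }
}

impl Drop for MemoryCounterGuard {
    fn drop(&mut self) {
        self.counter.fetch_sub(self.bytes, Ordering::Relaxed);
    }
}
```

The owning structure (e.g. a `Space` or an evaluated-voxel buffer) would hold such a guard next to its storage, and a diagnostics view could read the totals with `load(Ordering::Relaxed)`.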