
Some buffer memory does not get freed after the device is dropped #5648

Closed
cryscan opened this issue May 2, 2024 · 4 comments

Comments

@cryscan

cryscan commented May 2, 2024

Description
Some buffer memory does not get freed after the device is dropped.
[Screenshot: GPU memory usage across the two loads]

In the picture above, I load the same amount of data twice. Between the loads I drop the adapter, the device, and all the buffers. However, it seems that some GPU memory is still not freed.
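
A minimal sketch of this reproduction pattern (not the original code, which is not included here): it assumes wgpu ~0.20 and the pollster crate for blocking on the async setup, and the payload size and the names run_once / PAYLOAD_BYTES are illustrative.

use wgpu::util::DeviceExt;

// Illustrative payload; 128 MiB stays under the default max_buffer_size limit.
const PAYLOAD_BYTES: usize = 128 << 20;

async fn run_once(instance: &wgpu::Instance) {
    let adapter = instance
        .request_adapter(&wgpu::RequestAdapterOptions::default())
        .await
        .expect("no suitable adapter");
    let (device, _queue) = adapter
        .request_device(&wgpu::DeviceDescriptor::default(), None)
        .await
        .expect("failed to create device");

    // Upload the payload into a GPU buffer.
    let data = vec![0u8; PAYLOAD_BYTES];
    let _buffer = device.create_buffer_init(&wgpu::util::BufferInitDescriptor {
        label: Some("payload"),
        contents: data.as_slice(),
        usage: wgpu::BufferUsages::STORAGE,
    });

    // Buffer, device, queue, and adapter are all dropped at the end of this scope.
}

fn main() {
    let instance = wgpu::Instance::default();
    // Each iteration should leave GPU memory where it started,
    // but the observed usage grows with every load.
    for _ in 0..2 {
        pollster::block_on(run_once(&instance));
    }
}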

Expected vs observed behavior

  • Expected: The two loads take the same amount of GPU memory.
  • Observed: The later load takes more memory, and repeating the process eventually results in an out-of-memory error.

Platform

AdapterInfo {
    name: "AMD Radeon 780M Graphics",
    vendor: 4098,
    device: 5567,
    device_type: IntegratedGpu,
    driver: "AMD proprietary driver",
    driver_info: "22.40.05.11 (LLPC)",
    backend: Vulkan,
}
@cryscan
Author

cryscan commented May 2, 2024

Maybe related to #3518 and #3498?

@cryscan
Author

cryscan commented May 2, 2024

Update: I tried adding

queue.submit(None);
device.poll(Maintain::Wait);

before dropping. This mitigates the issue, but each reload still adds about 0.2 GB.
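
For reference, a sketch of how these calls might slot into the full teardown (a reconstruction, assuming wgpu's Queue::submit / Device::poll API; buffers, queue, device, and adapter are illustrative names):

drop(buffers);                  // release all buffer handles first
queue.submit(None);             // submit an empty batch to flush pending work
device.poll(Maintain::Wait);    // block until the GPU finishes outstanding work
drop(queue);
drop(device);
drop(adapter);                  // finally tear down the device and adapter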

@ErichDonGubler ErichDonGubler changed the title Some buffer memory not get freed after the device is dropped Some buffer memory does not get freed after the device is dropped May 13, 2024
@jimblandy
Member

@cryscan Thanks for the bug report and the analysis!

Do the issues you linked (#3518, #3498) exhibit the same behavior on your system? It is extremely valuable for us to have a test case that can reproduce the issue.

@teoxoy
Member

teoxoy commented Jul 31, 2024

This is most likely a duplicate of #3498 or #5529 (which have been fixed).

If you are still seeing leaks like this on the latest release (v22), please reopen the issue or create a new one with a minimal reproducible example.

@teoxoy closed this as not planned on Jul 31, 2024