
[Docs] Batched inference in existing models? #3149

Open
wouterwln opened this issue Nov 8, 2024 · 0 comments

📚 The doc issue

I cannot find anything in the documentation about whether it is possible to run inference on batches of images (i.e. an np.ndarray of rank 4) using the MMPoseInferencer. I can imagine batched inference would benefit users because of the obvious performance gains, but it is not clear whether, or how, this is possible.

Suggest a potential alternative/fix

A clear tutorial on how to do batched inference.
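
For illustration, here is a minimal sketch of what I would expect such usage to look like, assuming MMPoseInferencer follows mmengine's BaseInferencer convention of accepting a `batch_size` keyword in its call (an assumption on my part, not something I could confirm in the docs):

```python
# Untested sketch: assumes MMPoseInferencer's __call__ accepts a
# `batch_size` keyword, as mmengine's BaseInferencer does. Please
# verify against the installed mmpose version.
import numpy as np
from mmpose.apis import MMPoseInferencer

inferencer = MMPoseInferencer(pose2d='human')  # 'human' is a documented model alias

# A rank-4 array (N, H, W, C) split into a list of rank-3 images,
# since list-of-ndarray inputs are documented for the inferencer.
batch = np.random.randint(0, 255, (8, 480, 640, 3), dtype=np.uint8)
images = list(batch)

# If batch_size is supported here, inputs would be grouped into
# mini-batches internally while results are still yielded per image.
result_generator = inferencer(images, batch_size=4)
results = [result['predictions'] for result in result_generator]
```

A tutorial confirming (or correcting) something like the above, and stating whether a rank-4 array can be passed directly, would resolve this.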
