I have thoroughly reviewed the project documentation (installation, training, inference) but couldn't find any relevant information that meets my needs.
I have searched for existing issues, including closed ones.
I confirm that I am using English to submit this report (I have read and agree to the Language Policy).
[FOR CHINESE USERS] Please submit issues in English; otherwise they will be closed. Thank you! :)
Please do not modify this template :) and fill in all the required fields.
1. Is this request related to a challenge you're experiencing? Tell us your story.
Does this project support multi-gpu inference? Can you give me some potential solutions?
2. What is your suggested solution?
NO
3. Additional context or comments
No response
4. Can you help us with this feature?
I am interested in contributing to this feature.
Currently, we don't support multi-GPU inference. However, you can deploy one instance per card and put a load balancer in front of them to achieve parallel inference.
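A minimal sketch of that workaround, assuming you can start one server per card (e.g. by pinning each process with `CUDA_VISIBLE_DEVICES`) on distinct ports — the ports and URLs below are illustrative assumptions, not part of this project's CLI:

```python
# Sketch: round-robin load balancing over one inference instance per GPU.
# Assumed setup (hypothetical ports): each server was launched separately, e.g.
#   CUDA_VISIBLE_DEVICES=0 <server> --port 8000
#   CUDA_VISIBLE_DEVICES=1 <server> --port 8001
from itertools import cycle

# One backend URL per GPU card (assumed addresses).
INSTANCES = ["http://localhost:8000", "http://localhost:8001"]

_pool = cycle(INSTANCES)

def next_instance() -> str:
    """Return the next backend URL in round-robin order."""
    return next(_pool)

# Each incoming request would be forwarded to next_instance();
# here we just show the selection order for four requests.
picked = [next_instance() for _ in range(4)]
print(picked)
```

In practice you would likely use an off-the-shelf reverse proxy (e.g. nginx with an `upstream` block) instead of hand-rolling the dispatcher, but the idea is the same: the GPUs run independent copies of the model, and throughput scales with the number of cards while per-request latency stays single-GPU.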