[Question] How does 'low-latency-mode' affect encoding latency? #500

Closed
MemeTao opened this issue Sep 11, 2024 · 5 comments

@MemeTao

MemeTao commented Sep 11, 2024

In the following two cases, the encoding latency is totally different (both have low-latency mode set):
case 1:

// Continuously submit frames
setProperty(low-latency-mode, true);   // AMF_VIDEO_ENCODER_LOWLATENCY_MODE
setProperty(query-timeout, 50);        // AMF_VIDEO_ENCODER_QUERY_TIMEOUT, in ms
while (true) {
    auto t1 = cur_time();
    submitFrame();
    queryOutput();
    auto t2 = cur_time();
    auto took_ms = t2 - t1;   // 1080p takes ~4 ms on my PC
}

case 2:

// Submit a frame every 16 ms
setProperty(low-latency-mode, true);
setProperty(query-timeout, 50);
while (true) {
    auto t1 = cur_time();
    submitFrame();
    queryOutput();
    auto t2 = cur_time();
    auto took_ms = t2 - t1;   // 1080p takes ~12 ms on my PC
    sleep(16 - took_ms);      // note: this may go negative if encoding takes longer than 16 ms
}

Both of them use low-latency mode, so why is the encoding latency totally different?

It can be reproduced with the 'EncoderLatency' example.

@MikhailAMD
Collaborator

Hard to tell without actual code, but a few random thoughts:

  • Setting the timeout property is static, meaning it must be done before the Init() call or it has no effect.
  • The precision of the Windows Sleep function, or of similar waits on events, is very poor; it can be increased, see the AMF samples.
  • The sleep time may be negative.
  • What is your method of getting the time, from an accuracy perspective?
  • Why don't you measure the time you actually spend in sleep the same way you measure the encode time? (See the sketch after this list.)
  • What do you submit? Do you have any GPU operations other than the encoder? If so, adding the sleep may change clocks due to power management on the GFX or compute queue.
  • If you have PDBs and record a GPUView ETL log, you can see the stacks and why you have the difference.
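
A minimal sketch of that timing hygiene, assuming a Windows/C++ setup similar to the AMF samples: timeBeginPeriod(1) raises the system timer resolution so Sleep-style waits are less coarse, and the actual sleep duration is measured with the same clock as the encode step (the encode calls themselves are stubbed out here):

#include <windows.h>   // timeBeginPeriod/timeEndPeriod; link against winmm.lib
#include <chrono>
#include <cstdio>
#include <thread>

int main() {
    timeBeginPeriod(1);  // raise the system timer resolution to ~1 ms
    using clock = std::chrono::steady_clock;
    for (int i = 0; i < 100; ++i) {
        auto t1 = clock::now();
        // submitFrame(); queryOutput();   // the encode step would go here
        auto t2 = clock::now();
        std::this_thread::sleep_for(std::chrono::milliseconds(16));
        auto t3 = clock::now();
        // Print both durations so encode time and actual sleep time can be compared directly.
        printf("encode %.3f ms, sleep %.3f ms\n",
               std::chrono::duration<double, std::milli>(t2 - t1).count(),
               std::chrono::duration<double, std::milli>(t3 - t2).count());
    }
    timeEndPeriod(1);
    return 0;
}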

@MemeTao
Author

MemeTao commented Sep 12, 2024

> Hard to tell without actual code, but a few random thoughts: […]

Thanks for such a detailed answer. I was wrong. The real problem is that low-latency mode was not set.

case 1 (no low-latency mode, no sleep):

res = encoder->SetProperty(AMF_VIDEO_ENCODER_LOWLATENCY_MODE, false);

[screenshot]

case 2 (no low-latency mode, with sleep):

res = encoder->SetProperty(AMF_VIDEO_ENCODER_LOWLATENCY_MODE, false);
//..
amf_sleep(16);

[screenshot]

This problem can be reproduced from here: #501

I will look at GPUView for more details later (studying it now).

@MikhailAMD
Collaborator

But in this case the situation seems clear: if the low-latency parameter is false and you insert a sleep after encoding, GPU power management sees that job submissions are spaced out in time and reduces the VCN clocks, so encoding takes longer and latency increases. The main reason for adding the low-latency parameter is to force the clocks to stay high and keep latency low.
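
Putting the two hints together, a minimal configuration sketch (error handling elided; the AMFFactory helper comes from the SDK samples, and the D3D11 path, NV12 format, and 1080p resolution are assumptions): the static low-latency and timeout properties are set before Init(), since they have no effect afterwards.

#include "public/common/AMFFactory.h"
#include "public/include/components/VideoEncoderVCE.h"

AMF_RESULT InitLowLatencyEncoder(amf::AMFContextPtr& context, amf::AMFComponentPtr& encoder)
{
    AMF_RESULT res = g_AMFFactory.Init();                  // load the AMF runtime
    if (res != AMF_OK) return res;
    res = g_AMFFactory.GetFactory()->CreateContext(&context);
    res = context->InitDX11(NULL);                         // assumption: D3D11 device
    res = g_AMFFactory.GetFactory()->CreateComponent(context, AMFVideoEncoderVCE_AVC, &encoder);

    // Static properties: set BEFORE Init(), otherwise they have no effect.
    res = encoder->SetProperty(AMF_VIDEO_ENCODER_LOWLATENCY_MODE, true);  // keeps VCN clocks high
    res = encoder->SetProperty(AMF_VIDEO_ENCODER_QUERY_TIMEOUT, 50);      // QueryOutput blocks up to 50 ms

    return encoder->Init(amf::AMF_SURFACE_NV12, 1920, 1080);  // assumption: 1080p NV12 input
}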

@MemeTao
Author

MemeTao commented Sep 12, 2024

> But in this case the situation seems clear: […]

Thanks for the answer. “GPU power management” and "VCN clocks" are really new things for me; I'll learn about them right away. By the way, I found that encoding latency becomes very large when a 3D game is running in the foreground. Is there any D3D API or something similar to increase GPU priority for an app (just for capturing the desktop and encoding) while a 3D game is running (taking almost 100% of the GPU's 3D resources)?

@MikhailAMD
Collaborator

Yes, a game can interfere with some graphics jobs, but not with the encoder. To avoid this in the sample, use the "-prerender" parameter, since in this sample graphics is used only to prepare the inputs. Though, if your GPU/APU doesn't have the ability to accept RGBA directly for encoding, the encoder will use a shader-based color converter.
Also note that in a real game-streaming use case, the streamed frames are paced to a certain framerate. Use the "-framerate" or "-hevcframerate" parameter in the sample to emulate it.
All of this can be investigated in GPUView.
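
For instance, a hypothetical invocation of the EncoderLatency sample along those lines (the parameter values are assumptions; check the sample's usage output for the exact syntax):

EncoderLatency.exe -prerender true -framerate 60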

@MemeTao MemeTao closed this as completed Sep 12, 2024