My solution requires a high-performance timestamp (as a Unix epoch value) with a precision of 10µs.

I recognise that it's probably me approaching this wrong, but Quanta seems to drift steadily away from the system clock.

The code below creates a t0 starting point from Quanta and a t0_sys starting point from SystemTime, then tries to calculate the current system time from those starting points, but the Quanta clock keeps drifting further away from the system time.

Is this a Quanta calibration issue, or is my solution just too naive?
```rust
use std::{thread, time::{Duration, SystemTime}};
use quanta::Clock;

fn main() {
    let clock = Clock::new();
    let t0_sys = SystemTime::now();
    let t0 = clock.now();
    loop {
        let now_sys = SystemTime::now();
        let now = clock.now();
        let calc = t0_sys + now.duration_since(t0);
        println!(" now = {:?}", now_sys);
        let drift = if calc < now_sys {
            now_sys.duration_since(calc).unwrap()
        } else {
            calc.duration_since(now_sys).unwrap()
        };
        println!("calc = {:?} drift = {:?} total elapsed = {:?}\n", calc, drift, t0.elapsed());
        thread::sleep(Duration::from_secs(5));
    }
}
```
Apologies for the delay in response here. I saw this issue when you filed it, immediately looked it over, did some calculations, and then.... forgot to respond. 😭
It does indeed appear that there is a calibration issue here, or that we're simply hitting the limits of what the calibration loop can provide. Calibration cannot cope with changes to system time from mechanisms like NTP, which may slew the clock forwards or backwards at a rate faster or slower than the actual local passage of time.
All of that said, I'd say your test case is reasonable. Running it locally, I can observe the same slow drift over time, with the degree of drift varying between runs. I also observed a few runs where the drift was sinusoidal, likely related to legitimate NTP adjustment/tracking.
Replacing quanta with simple calls to Instant::now all but eliminates the drift, keeping it within a few hundred nanoseconds, plus or minus, over the course of a few minutes. This is not surprising: on Unix-y platforms, Instant::now is based on CLOCK_MONOTONIC, which is subject to NTP rate adjustments (while being guaranteed to only ever move forwards), and so is likely to stay much closer to SystemTime, which is similarly disciplined by NTP.
This definitely has me thinking about re-examining the use of the TSC overall, but for the time being I would suggest just using SystemTime/Instant, since that should be accurate enough given your strict need to present a timestamp that is wall-clock-referenced.
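For reference, here is a minimal sketch of that substitution: the same anchor-and-project approach as your test case, but with std::time::Instant in place of quanta::Clock. The iteration count and sleep interval are arbitrary choices for illustration, not part of the original test case.

```rust
use std::thread;
use std::time::{Duration, Instant, SystemTime};

fn main() {
    // Anchor a monotonic Instant to a wall-clock SystemTime reading.
    let t0_sys = SystemTime::now();
    let t0 = Instant::now();

    for _ in 0..3 {
        thread::sleep(Duration::from_millis(200));

        let now_sys = SystemTime::now();
        // Project the wall clock forward using the monotonic elapsed time.
        let calc = t0_sys + t0.elapsed();

        // Drift between the projected and the freshly read wall clock,
        // regardless of which one is ahead.
        let drift = match calc.duration_since(now_sys) {
            Ok(d) => d,           // projected clock is ahead
            Err(e) => e.duration(), // projected clock is behind
        };
        println!("now = {:?} calc = {:?} drift = {:?}", now_sys, calc, drift);
    }
}
```

Since CLOCK_MONOTONIC is slewed by NTP alongside the wall clock, the drift printed here should stay in the sub-microsecond to low-microsecond range on a typical NTP-disciplined machine, comfortably inside a 10µs precision budget.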