this won't noticeably lose precision until at least 200 days
W/ 52 bits in the significand, it would last much longer than that. At a ~1 microsecond tick interval (accurate enough for virtually all use cases), a 64-bit float should last for 2^52 / 1000 / 1000 / 60 / 60 / 24 ≈ 52k days.
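A quick sketch of that back-of-envelope math (Python here just for illustration). Note a double actually has 53 effective significand bits (52 stored + 1 implicit), so whole-microsecond counts stay exact up to 2^53; the 2^52 figure above is the conservative bound:

```python
# ~How many days a double can count whole microseconds without
# losing the 1 µs increments (using the conservative 2^52 bound):
days_exact = 2**52 / 1_000_000 / 60 / 60 / 24
print(f"{days_exact:,.0f} days")  # ≈ 52,125 days

# Past 2^53, adjacent doubles are 2 apart, so adding 1 µs is silently lost:
t = float(2**53)
assert t + 1.0 == t                      # increment vanishes
assert float(2**53 - 1) + 1.0 == 2**53   # still exact just below the edge
```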
As a small criticism: if you used integers instead of floats, it would never lose precision. You'd just have a much higher hard cap, rather than a lower "soft cap" past which things gradually get buggier.
When you need to accumulate many small amounts into a much larger total, floating-point is simply not a good fit: the advantage of floating-point (huge dynamic range) buys you nothing there, while its absolute precision shrinks as the total grows. That's the reason the system-time functions across different operating systems use integers (usually as micro- or nanoseconds). At an interval of 1 microsecond, a 64-bit unsigned integer timer would run for ~200m days.
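For completeness, the u64 capacity estimate works out like this (Python for illustration; integer division, no rounding loss):

```python
# Capacity of a 64-bit unsigned microsecond counter, in days.
US_PER_DAY = 1_000_000 * 60 * 60 * 24   # microseconds per day
days = 2**64 // US_PER_DAY
print(f"{days:,} days")  # ≈ 213 million days
```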