
feat: reduce apparent latency when re-computing tokens by computing the 'next' token in advance.

One downside of offloading token computation to a worker thread, combined with the need to decrypt token secrets, is an increase in latency. For a few tokens this latency does not matter, but for many tokens it can cause a significant delay when refreshing tokens in the UI.

To hide this latency, whenever an OTP token is computed for the current state of an account, the logical 'next' token is computed as well and cached in the Account object. When the next (re)computation of the OTP token is requested, the cached 'next' value is reused if still valid, while the next pair of tokens is computed in the background. This way the apparent latency of a token update is reduced to a near-immediate property update in the UI, hiding the actual latency of the computation itself.

This 'optimisation' is implemented in the simplest possible fashion that can still work. This means that the code complexity of the change is quite limited, at the cost of roughly doubling the actual work being performed in the worker thread.

Edited by Johan Ouwerkerk
