© kenjiding 2025 All rights Reserved

How to accurately accumulate time on the front end

Views: 869 · Date: 11/4/2020

Requirement

The project's page needs to periodically fetch data, and it should support pausing and resuming data fetching. It also needs to accumulate the fetching time in minutes.

Implementation

In general, front-end developers often use setTimeout or setInterval to increment a counter by 1 every second. Dividing it by 60 then gives the accumulated time in minutes, like this:

  let second = 0;
  let minute = 0;
  setInterval(() => {
    second += 1;
    minute = Math.floor(second / 60); // accumulated minutes
  }, 1000);

Issue

This may seem fine at first glance, but anyone familiar with JavaScript should spot the problem right away: JavaScript is single-threaded, and every callback must wait in the event queue until the call stack is free before it can run.

So if a long-running task sits ahead of the setInterval callback and takes more than 1 second to finish, the callback will wait more than 1 second before it executes.

This leads to errors: when second = 2, the real elapsed time may already be 4 or 5 seconds.

Clearly, the code above accumulates error, and that error grows larger the longer the timer runs.
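This delay is easy to observe. As a minimal sketch (runnable in Node.js or a browser console; the busy-wait loop is a stand-in for any long-running task on the call stack):

```javascript
// Sketch: block the main thread and watch a 1-second timer run late.
const start = Date.now();
let delay; // will hold the callback's actual delay in ms

setTimeout(() => {
  delay = Date.now() - start;
  console.log(`scheduled for 1000 ms, ran after ~${delay} ms`);
}, 1000);

// Simulate a task that occupies the call stack for ~2.5 seconds.
const busyUntil = Date.now() + 2500;
while (Date.now() < busyUntil) { /* busy-wait */ }
// Only once the stack is free can the queued callback run,
// roughly 1.5 seconds later than scheduled.
```

A counter incremented inside such a callback would silently fall behind the wall clock.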

After some thought, can we improve the accuracy by using timestamps?

That is, we record a timestamp when fetching begins (startDate). Then we use setInterval to sample the current timestamp (currentDate) as frequently as possible. The accumulated fetching time in minutes is the difference between the two timestamps divided by 60 * 1000, rounded down with Math.floor. The code is as follows:

let minute = 0;
const startDate = new Date();
const handler = setInterval(function loopTimer() {
  const currentDate = new Date();
  // elapsed milliseconds / (60 * 1000) = elapsed minutes
  minute = Math.floor((currentDate - startDate) / (60 * 1000));
}, 10);

const stopTimer = () => {
  handler && clearInterval(handler);
};

In theory, there is still a small timing error in this approach: as mentioned above, loopTimer may not execute exactly 1/100 of a second after the last tick. However, since we derive minute from timestamps rather than by accumulating second += 1, the error does not compound. As long as the setInterval delay is short enough, minute = Math.floor((currentDate - startDate) / (60 * 1000)) is recomputed frequently, and the result stays accurate to within a single tick.
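The requirement above also asked for pausing and resuming. With the timestamp approach this falls out naturally: accumulate the elapsed time of each running segment instead of measuring from a single fixed start. A minimal sketch, where the names startTimer, pauseTimer, and currentMinutes are illustrative rather than from the original code:

```javascript
// Pause/resume sketch: total active time is the sum of finished
// segments (accumulatedMs) plus the currently running segment.
let accumulatedMs = 0; // total of all completed running segments, in ms
let startDate = null;  // start timestamp of the current segment, or null

function startTimer() {
  if (startDate === null) startDate = Date.now();
}

function pauseTimer() {
  if (startDate !== null) {
    accumulatedMs += Date.now() - startDate; // close out this segment
    startDate = null;
  }
}

function currentMinutes() {
  const running = startDate === null ? 0 : Date.now() - startDate;
  return Math.floor((accumulatedMs + running) / (60 * 1000));
}
```

Because the total is rebuilt from timestamps on every read, neither pauses nor event-loop delays corrupt it; no interval has to stay alive while fetching is paused.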

JavaScript TypeScript React