Recently I encountered an issue where a cron task contained a loop that made many requests to a third-party API endpoint. The documentation for this API, like many others, specified a rate limit of 2 requests per second.
Previously we had a solution that looked something like this, one I have seen in many other places:
foreach ($productVariants as $variant) {
    sleep(2); // pause to stay under the rate limit
    sendRequestToAPI($variant->id);
}
Very simple: pause for an adequate amount of time to avoid hitting the rate limit.
Sometimes such an operation iterates over a dataset containing tens of thousands of entries. The script already takes a long time to run, so every wasted millisecond adds up and makes it longer than it needs to be. A wasted half second over a thousand products is over eight minutes! So one can appreciate how this snowballs.
With a few adjustments, you can have the script wait a much more precise amount of time, still staying under the rate limit but not wasting any more time than necessary.
$timeBetweenEachRequest = 2; // seconds between requests (conservative for a 2-requests-per-second limit)

foreach ($productVariants as $variant) {
    if (isset($timeStart)) {
        $elapsedTime = microtime(true) - $timeStart;
        $remainingTime = $timeBetweenEachRequest - $elapsedTime;
        if ($remainingTime > 0) {
            // usleep() takes microseconds; sleep() only accepts whole seconds
            usleep((int) ($remainingTime * 1000000));
        }
    }
    $timeStart = microtime(true);
    sendRequestToAPI($variant->id);
}
In the example above we time how long each iteration takes and wait only the remaining amount of time. By subtracting the time already spent executing the logic within the loop, we avoid wasted time while still staying safely under our limits. Note the guard against a negative remaining time: if an iteration already took longer than the interval, there is nothing left to wait for.
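The same throttling logic can be factored into a small reusable helper so it isn't rewritten in every loop. Here is a minimal sketch; the class name `RateLimiter` and its requests-per-second constructor parameter are my own illustrative choices, not part of the original code:

```php
<?php

// Minimal sketch of a reusable throttle. The class name and its
// constructor parameter are illustrative, not from the original article.
class RateLimiter
{
    private float $interval;        // minimum seconds between calls
    private ?float $lastCall = null;

    public function __construct(float $requestsPerSecond)
    {
        $this->interval = 1.0 / $requestsPerSecond;
    }

    // Block just long enough to respect the configured rate,
    // subtracting the time already spent since the previous call.
    public function throttle(): void
    {
        if ($this->lastCall !== null) {
            $remaining = $this->interval - (microtime(true) - $this->lastCall);
            if ($remaining > 0) {
                usleep((int) ($remaining * 1000000));
            }
        }
        $this->lastCall = microtime(true);
    }
}

// Usage: at 2 requests per second, the second throttled call waits
// out the remainder of the 0.5 s interval.
$limiter = new RateLimiter(2.0);
$start = microtime(true);
$limiter->throttle(); // first call returns immediately
$limiter->throttle(); // second call sleeps the remaining time
$elapsed = microtime(true) - $start;
echo $elapsed >= 0.45 ? "throttled\n" : "too fast\n";
```

Calling `$limiter->throttle()` at the top of the loop body then replaces all of the timing bookkeeping shown above.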
I hope this little snippet helps someone out there.
The lead image for this article was generated by HackerNoon's AI Image Generator via the prompt "software development".