
A Saner Method for Keeping Under API Rate Limits

by veasey (@v34sey)

June 15th, 2023
Too Long; Didn't Read

A CRON task makes many requests to a third-party API that enforces a rate limit of 2 requests per second. The usual fix is to pause the script for an adequate amount of time between requests. With a few adjustments, you can have the script wait a much more precise amount of time instead.

Recently I encountered an issue with a CRON task that contained a loop making many requests to a third-party API endpoint. The documentation for this API, like many others, specified a rate limit of 2 requests per second.

Previously, we had a solution that looked something like this, which I have seen in a lot of other places:

foreach ($productVariants as $variant) {
    sleep(2); // avoid rate limit
    sendRequestToAPI($variant->id);
}

Very simple: pause for an adequate amount of time to avoid hitting the rate limit.

Sometimes such operations loop over a dataset that contains tens of thousands of entries. The script already takes a long time to run, so every wasted millisecond adds up and makes it run longer than it needs to. A wasted half second over a thousand products is over eight minutes! One can appreciate how this snowballs.
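
To put a rough number on that, here is a back-of-the-envelope calculation with hypothetical figures (not taken from the task above):

$products = 1000;          // entries in the dataset
$wastedPerRequest = 0.5;   // seconds of unnecessary sleep per iteration

$wastedSeconds = $products * $wastedPerRequest;          // 500 seconds
echo round($wastedSeconds / 60, 1) . " minutes wasted";  // ~8.3 minutes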

With a few adjustments, you can have the script wait a much more precise amount of time, still keeping under the rate limit but not wasting any more time than necessary.

$timeBetweenEachRequest = 2;

foreach ($productVariants as $variant) {

    if (isset($timeStart)) {
        $elapsedTime = microtime(true) - $timeStart;
        $remainingTime = $timeBetweenEachRequest - $elapsedTime;

        if ($remainingTime > 0) {
            usleep((int) ($remainingTime * 1000000)); // convert remaining seconds to microseconds
        }
    }

    $timeStart = microtime(true);

    sendRequestToAPI($variant->id);
}

In the example above we simply time how long each iteration of the loop takes to run and wait only the remaining amount of time. By subtracting the time already spent executing the logic within the loop, we avoid wasted time whilst still staying safely under our limits.
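
If the same throttling is needed in more than one CRON task, the timing logic can be pulled out into a small helper. The sketch below is illustrative rather than part of the original script: RequestThrottle is a made-up name, sendRequestToAPI() and $productVariants are assumed to exist as in the examples above, and the syntax needs PHP 8.0+ for constructor property promotion.

class RequestThrottle
{
    private ?float $lastRequestAt = null;

    public function __construct(private float $minSecondsBetweenRequests)
    {
    }

    // Sleep only for whatever is left of the minimum interval, then record the time.
    public function wait(): void
    {
        if ($this->lastRequestAt !== null) {
            $remaining = $this->minSecondsBetweenRequests - (microtime(true) - $this->lastRequestAt);

            if ($remaining > 0) {
                usleep((int) ($remaining * 1000000));
            }
        }

        $this->lastRequestAt = microtime(true);
    }
}

$throttle = new RequestThrottle(2); // same 2-second gap as the example above

foreach ($productVariants as $variant) {
    $throttle->wait();
    sendRequestToAPI($variant->id);
}

The behaviour is the same as the inline version; the only difference is that the bookkeeping lives in one place.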

I hope this little snippet helps someone out there.

The lead image for this article was generated by HackerNoon's AI Image Generator via the prompt "software development".
About Author

veasey (@v34sey): From stacking tents for the airforce to working as a full stack developer.
