
I need to call an API iteratively, but the endpoint limits me to 100 calls per minute. I have a list whose elements supply the data for each API call (I'm using requests to call the API).

Let's say the list has 227 elements. I need to make a separate API call for each element (that's just the way the API endpoint works).

My question is: how do I take the first 100 elements and make 100 individual API calls, then wait 60 seconds, then repeat for the next 100 elements, sleep another 60 seconds, and finally repeat for the last 27 elements?

I'm OK with the API calls themselves, with using sleep() to pause the function, and with processing the response from the API calls.

It's just breaking my iterative loop into chunks that I'm struggling with.

My code looks a bit like this:

list_of_data = [{'key1': 'value1', 'key2': 'value2', ... 'key227': 'value227'}]
total_data = len(list_of_data)
for count in range(total_data):
    data = list_of_data[count]  # pull out the current list element
    url = 'https://api.com/endpoint'
    params = {
        'param1': data['key1'],
        'param2': data['key2'],
        ...
        'param227': data['key227']
    }
    headers = {some headers}

    response = requests.get(url=url, params=params, headers=headers)
    json = response.json()
    <process the json>

I'm not looking for you to debug the above; it's just to illustrate what I'm doing. The actual question is: how do I get this to pause every 100 API calls, without leaving off the remainder (the last 27 elements)?

Many thanks in advance.

  • Please show your existing code to call the endpoint. Commented Jan 30 at 23:06
  • Question edited. Hope it helps. Commented Jan 30 at 23:18
  • You really have 227 data parameters and it only pulls one piece of data? Show a working example of reading one element. Basically you need something like for x in range(0, 227, 100): for y in range(100): fetch(x+y) with a sleep after the 100 loop and break when x+y == 227. Commented Jan 31 at 5:54
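The nested-loop approach sketched in that comment can be made concrete along these lines (a minimal sketch; `fetch` is a hypothetical stand-in for the real `requests.get` call, and the pause is parameterised so the logic can be exercised without actually sleeping):

```python
import time

def fetch(index):
    # hypothetical stand-in for one real API call
    return index

def batched_fetch(n, batch_size=100, pause=60):
    """Fetch indices 0..n-1 in batches, pausing between full batches."""
    results = []
    for x in range(0, n, batch_size):
        for y in range(batch_size):
            if x + y >= n:          # last, partial batch: stop at the remainder
                break
            results.append(fetch(x + y))
        if x + batch_size < n:      # no pause needed after the final batch
            time.sleep(pause)
    return results
```

With `n=227` and the defaults, this makes 100 calls, sleeps 60 seconds, makes 100 more, sleeps again, then makes the final 27 calls.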

2 Answers


Good to hear you are comfortable with the API calls themselves; the question is how to schedule them.

Ability to retry

This is a key point that comes to mind. Yes, we can sleep for 60 s, but not every API call is guaranteed to succeed. One batch may succeed and the next may fail. You don't want to be in a situation where you have to start again from batch #1 because a batch in the middle failed.

Storage of inputs

I would suggest storing the batches, one batch per file. This could be a blob item, a message-queue message, a local file, etc. Think of it as a queue of files: when a file is processed successfully, it is deleted and the response is handled; if the API call fails, the file stays in the queue to be retried.

Processing

Split the inputs into files of 100 items each. Something along these lines is what I can suggest:

import time

def process(file):
    try:
        api_call(file)
    except Exception as e:
        print(f"Error processing file {file}: {e}")
        # move file to a retry folder

for file in files:
    process(file)
    time.sleep(60)

The main point I'd make is that you are limited to a batch size of 100. I'd always suggest ensuring the success of every batch, and the ability to retry.
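That batch-plus-retry idea could be sketched as follows (a minimal sketch; `api_call` is a hypothetical placeholder for the real per-batch work, and the retry policy is an assumption):

```python
import time

def api_call(batch):
    # hypothetical placeholder for the real per-batch API work
    return [f"ok: {item}" for item in batch]

def process_batches(batches, pause=60, max_retries=3):
    """Process each batch with retries, so one failure doesn't force a restart."""
    results, failed = [], []
    for batch in batches:
        for attempt in range(max_retries):
            try:
                results.extend(api_call(batch))
                break
            except Exception as e:
                print(f"Error processing batch (attempt {attempt + 1}): {e}")
        else:
            failed.append(batch)  # exhausted retries; keep for a later pass
        time.sleep(pause)         # respect the per-minute rate limit
    return results, failed
```

Anything left in `failed` can be re-queued for another pass instead of restarting from batch #1.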




Thanks to all who responded, in particular mkreiger, who guided me to this answer:

def list_splitter(seq, block_length):
    return (seq[pos:pos + block_length] for pos in range(0, len(seq), block_length))

for group in list_splitter(my_list, 100):
    print(group, "\n", len(group), "\n")
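To close the loop on the original question, the chunks from `list_splitter` can drive the rate-limited calls directly (a sketch; `call_api` is a hypothetical stand-in for the real `requests.get` call, and the pause is parameterised so the logic can be tested without sleeping):

```python
import time

def list_splitter(seq, block_length):
    return (seq[pos:pos + block_length] for pos in range(0, len(seq), block_length))

def rate_limited_calls(items, call_api, block_length=100, pause=60):
    """Call call_api for every item, sleeping `pause` seconds between blocks."""
    responses = []
    groups = list(list_splitter(items, block_length))
    for i, group in enumerate(groups):
        for element in group:
            responses.append(call_api(element))
        if i < len(groups) - 1:   # no sleep needed after the final block
            time.sleep(pause)
    return responses
```

With a 227-element list and the defaults, this makes 100 calls, sleeps 60 seconds, makes another 100, sleeps again, then makes the last 27, so the remainder is never left off.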

