
I’m writing a PHP script in which I need to fetch all records from a base of about 1,000 records (and store them in one array so I can process them).


As I understand it, the maximum pageSize is 100, but the response includes an offset, and by passing that offset as a parameter in your next request you can fetch the following pages.
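The loop described above can be sketched roughly like this (a minimal sketch, assuming cURL and the Airtable REST API v0; the function names, example URL, and API key placeholder are hypothetical, not from the original code):

```php
<?php

// Hypothetical helper: build the request URL for one page.
// The offset value must be URI-encoded before it goes into the query string.
function build_page_url(string $base_url, ?string $offset): string {
    if ($offset === null) {
        return $base_url;
    }
    return $base_url . '?offset=' . rawurlencode($offset);
}

// Fetch every page by looping until the response no longer contains an offset.
function fetch_all_records(string $base_url, string $api_key): array {
    $all_records = [];
    $offset = null;
    do {
        $ch = curl_init(build_page_url($base_url, $offset));
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_HTTPHEADER, ['Authorization: Bearer ' . $api_key]);
        $response = json_decode(curl_exec($ch), true);
        curl_close($ch);

        foreach ($response['records'] as $record) {
            $all_records[] = $record;
        }
        // The API returns an offset only while more pages remain.
        $offset = $response['offset'] ?? null;
    } while ($offset !== null);

    return $all_records;
}
```

An iterative loop like this avoids the recursion in the snippet below entirely, which also sidesteps the problem of accumulating query strings.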



But this doesn’t seem to work reliably. I can get a few pages (sometimes just one) until it throws the error LIST_RECORDS_ITERATOR_NOT_AVAILABLE at me.


I am simply never able to fetch every page; this error always occurs.



How can I get past this error so I can fetch every record? Am I missing something?



A snippet from my code:



// curl code ...
curl_close($ch);
$airtable_response = json_decode($entries, TRUE);

foreach ($airtable_response["records"] as $key => $record) {
    $all_records[] = $record;
}

// Next page if there is one
if ( $airtable_response['offset'] ) {
    fetch_records($url . "?offset=" . $airtable_response['offset']);
} else {
    return $all_records;
}

I’m not too familiar with PHP, but you seem to be doing things correctly.



One thought: values for query parameters need to be URI-encoded. So, for example, if the offset is itrxxxxxxxxxxxxxx/recxxxxxxxxxxxxxx, your URL will look something like:



https://api.airtable.com/v0/appxxxxxxxxxxxxxx/My%20Table?offset=itrxxxxxxxxxxxxxx%2Frecxxxxxxxxxxxxxx



Again, I’m not a PHP expert, but the rawurlencode function seems like it’ll do what you want. Perhaps something like:



fetch_records($url . "?offset=" . rawurlencode($airtable_response['offset']));



Could you try logging the URL passed to fetch_records to see if that’s what you expect?


You should also make sure that your recursion doesn’t append multiple offset parameters to the URL.



e.g.



https://api.airtable.com/v0/appxxxxxxxxxxxxxx/My%20Table?offset=itrxxxxxxxxxxxxxx%2Frecxxxxxxxxxxxxxx?offset=itrxxxxxxxxxxxxxx%2Frecxxxxxxxxxxxxxx?offset=itrxxxxxxxxxxxxxx%2Frecxxxxxxxxxxxxxx
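One way to guard against that (a hypothetical helper, not from the original snippet) is to strip any existing query string before appending the new offset, so each recursive call starts from a clean base URL:

```php
<?php

// Hypothetical helper: rebuild the URL from its path component, so that
// repeated calls never stack "?offset=..." on top of a previous one.
function with_offset(string $url, string $offset): string {
    $base = strtok($url, '?');  // keep only the part before any '?'
    return $base . '?offset=' . rawurlencode($offset);
}
```

With this, `with_offset($url, $airtable_response['offset'])` yields a single, correctly encoded offset parameter no matter how many pages deep the recursion goes.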


Thanks for the suggestions, guys. I altered my code while debugging yesterday, and somewhere along the way the issue disappeared.



Probably, while optimizing and cleaning up the code, I unknowingly resolved the issue that @Aurtime mentioned.



Which is … yeah, noobish 🙂


Thanks guys!

