Trying to fetch all records in PHP - error occurs

#1

I’m writing a PHP script that needs to fetch all records from a base of about 1000 records and store them in one array for further processing.
As I understand it, the maximum pageSize is 100, but the response includes an offset, and by passing that offset as a parameter in your next request you can fetch the following pages.

But this doesn’t seem to work reliably. I can get a few pages (sometimes just one) before it throws the error LIST_RECORDS_ITERATOR_NOT_AVAILABLE at me.
I’m never able to fetch every page; this error always occurs at some point.

How can I proceed from this error to be able to fetch every record? Am I missing something?

A snippet from my code:

//curl code ...
curl_close($ch);
$airtable_response = json_decode($entries, TRUE);

foreach ($airtable_response["records"] as $key => $record) {
	$all_records[] = $record;
}

// Next page if there is one
if ( isset($airtable_response['offset']) ) {
	return fetch_records($url . "?offset=" . $airtable_response['offset']);
} else {
	return $all_records;
}
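
For reference, here is a minimal sketch of how the same pagination could be done iteratively instead of recursively, which sidesteps the lost-return-value problem entirely. The names `fetch_all_records` and `fetch_page` are mine, not from the thread, and `$fetch_page` stands in for the cURL call so the flow is easy to follow:

```php
<?php
// Iterative pagination sketch (an assumption, not the poster's actual code).
// $fetch_page is a stand-in for the cURL GET that returns the decoded
// Airtable JSON response as an associative array.
function fetch_all_records(string $base_url, callable $fetch_page): array
{
    $all_records = [];
    $offset = null;

    do {
        // Rebuild the URL from the base each time, so the offset
        // parameter is never appended twice.
        $url = $base_url;
        if ($offset !== null) {
            $url .= '?offset=' . rawurlencode($offset);
        }

        $page = $fetch_page($url);

        foreach ($page['records'] as $record) {
            $all_records[] = $record;
        }

        // Airtable omits "offset" on the last page.
        $offset = $page['offset'] ?? null;
    } while ($offset !== null);

    return $all_records;
}
```

Because the page fetcher is injected, the loop can be exercised without network access, and there is no recursion to accidentally stack query strings onto.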
#2

I’m not too familiar with PHP, but you seem to be doing things correctly.

One thought: values for query parameters need to be URI-encoded. So, for example, if the offset is itrxxxxxxxxxxxxxx/recxxxxxxxxxxxxxx, your URL will look something like:

https://api.airtable.com/v0/appxxxxxxxxxxxxxx/My%20Table?offset=itrxxxxxxxxxxxxxx%2Frecxxxxxxxxxxxxxx

Again, I’m not a PHP expert, but the rawurlencode function seems like it’ll do what you want. Perhaps something like:

fetch_records($url . "?offset=" . rawurlencode($airtable_response['offset']));

Could you try logging the URL passed to fetch_records to see if that’s what you expect?
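
To make the encoding point concrete, here is what `rawurlencode` does to an offset containing a slash (the offset value below is just the placeholder from the example above):

```php
<?php
// Airtable offsets can contain a "/", which must not appear raw
// inside a query-string value.
$offset = 'itrxxxxxxxxxxxxxx/recxxxxxxxxxxxxxx';

echo rawurlencode($offset);
// itrxxxxxxxxxxxxxx%2Frecxxxxxxxxxxxxxx
```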

#3

You should make sure that your recursion doesn’t append the offset parameter to the URL more than once.

e.g.

https://api.airtable.com/v0/appxxxxxxxxxxxxxx/My%20Table?offset=itrxxxxxxxxxxxxxx%2Frecxxxxxxxxxxxxxx?offset=itrxxxxxxxxxxxxxx%2Frecxxxxxxxxxxxxxx?offset=itrxxxxxxxxxxxxxx%2Frecxxxxxxxxxxxxxx
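
One simple guard against this (a sketch, using illustrative values, not the poster's code): strip any existing query string from the URL before appending a fresh offset, so each request carries exactly one `offset` parameter:

```php
<?php
// Illustrative values; the offset here is the placeholder from above.
$url    = 'https://api.airtable.com/v0/appxxxxxxxxxxxxxx/My%20Table?offset=old';
$offset = 'itrxxxxxxxxxxxxxx/recxxxxxxxxxxxxxx';

// strtok() returns everything before the first '?', i.e. the bare URL.
$base = strtok($url, '?');

// http_build_query() also URI-encodes the value for us.
$next = $base . '?' . http_build_query(['offset' => $offset]);

echo $next;
```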

#4

Thanks for the suggestions, guys. I altered my code while hunting for the problem yesterday, and somewhere along the way the issue disappeared.

While optimizing and cleaning up the code, I probably unknowingly resolved the issue @Aurtime mentioned.

Which is … yeah, noobish :slight_smile:
Thanks guys!
