Hi,
I’m trying to import a dataset of permits for the city of Austin, TX. My goal is to update this daily, so that I can filter by the permits added on a given day.
Every time I hit “Run” on my script, it only adds about 1,000-1,500 records to the table (a different number each time) and appears to skip the rest; the full dataset is roughly 20,000 records.
Thanks in advance for any help!
Here’s where I’m importing from:
https://dev.socrata.com/foundry/data.austintexas.gov/mavg-96ck
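For the daily-update part, I was hoping I could eventually narrow the request with a SoQL $where filter on the date, something like this (just a sketch; I haven’t confirmed that application_start_date is the right field to filter on, and '2024-01-01' is just a placeholder for the target day):

// hypothetical daily filter -- field name and date are placeholders
let filter = encodeURIComponent("application_start_date >= '2024-01-01'");
let filteredUrl = "https://data.austintexas.gov/resource/mavg-96ck.json?$where=" + filter + "&$limit=50000&$$app_token=[MY API TOKEN]";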
And here’s my script:
// url to be queried
let url = "https://data.austintexas.gov/resource/mavg-96ck.json?$limit=50000&$$app_token=[MY API TOKEN]";
// set the table that stores the records
let table = base.getTable("Table 1");
// make the API call
let response = await fetch(url, {
    method: "GET"
});
// get the API response
let data = await response.json();
output.text("Retrieved " + data.length + " Records");
data.forEach(async element => {
    let newRecord = await table.createRecordAsync({
        "Permit Number": element.permit_number,
        "Case Type": element.case_type,
        "Status": element.status,
        "Application Date": element.application_start_date,
        "Link": element.link,
        "Address": element.street_number + " " + element.street_name + " " + element.street_type + ", " + element.city + ", TX " + element.zip_code,
        "Owner Name": element.owner_fullname,
        "Owner Company": element.owner_organization_name
    });
});
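One thing I’m not sure about: does forEach actually wait for those async callbacks to finish before the script ends? I was wondering whether I need to restructure the loop along these lines instead (untested sketch, reusing the same table and data from above):

// sketch: await each createRecordAsync before moving on to the next record
for (let element of data) {
    await table.createRecordAsync({
        "Permit Number": element.permit_number,
        "Case Type": element.case_type,
        "Status": element.status,
        "Application Date": element.application_start_date,
        "Link": element.link,
        "Address": element.street_number + " " + element.street_name + " " + element.street_type + ", " + element.city + ", TX " + element.zip_code,
        "Owner Name": element.owner_fullname,
        "Owner Company": element.owner_organization_name
    });
}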