I have about 10k records in my Airtable base, and each has a specific link to a webpage that I regularly check for certain data. I would love to be able to scrape those pages for the info needed and automatically put it into AT. Is this possible? I assume I will have to run a script of some sort, but don't know exactly how to go about that. We are currently updating the 10k records manually, and using some sort of AI would save SO much time!
Yes, this should be possible, depending on the website. The most important question is: can we actually extract the data?
I've built a tool that's good at this (Simplescraper), so if you'd like to share an example of what data you're extracting and from which URL, it will be possible to verify what's feasible pretty quickly.
After that it's a case of pulling in the data and mapping it to your fields - the scripting extension should make this step straightforward.
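To make the "mapping it to your fields" step concrete: once you have the scraped values, writing them back can be done through Airtable's REST API (the scripting extension mentioned above is another route, inside Airtable itself). Here's a minimal Python sketch of batching updates the way the API expects, up to 10 records per PATCH request. The token, base ID, table name, and field names are placeholders; substitute your own.

```python
import json
import urllib.request

AIRTABLE_TOKEN = "patXXXXXXXXXXXXXX"  # hypothetical personal access token
BASE_ID = "appXXXXXXXXXXXXXX"         # hypothetical base ID
TABLE_NAME = "Pages"                  # hypothetical table name

def build_update_payload(updates):
    """Build Airtable's PATCH body: a list of records, each with its
    record id and only the fields to overwrite. Max 10 per request."""
    return {
        "records": [
            {"id": rec_id, "fields": fields}
            for rec_id, fields in updates
        ]
    }

def patch_records(updates):
    """Send one batch of updates to the Airtable REST API."""
    req = urllib.request.Request(
        f"https://api.airtable.com/v0/{BASE_ID}/{TABLE_NAME}",
        data=json.dumps(build_update_payload(updates)).encode(),
        headers={
            "Authorization": f"Bearer {AIRTABLE_TOKEN}",
            "Content-Type": "application/json",
        },
        method="PATCH",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

For 10k records you'd loop over the results in chunks of 10 and call `patch_records` per chunk, pausing briefly between calls to stay under Airtable's rate limit (5 requests/second).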
Happy to help you through the steps.
@Jayme_Richardso, @itsmike used Simplescraper to collect every post I've made in this community into Airtable itself, and it found all 3,188, with author links and more. Cool stuff. Having a consultant who knows as much about Airtable as he does about his scraping craft is golden.
Yes, it's possible to automate scraping the webpages linked in your Airtable records. You can write a custom script using web-scraping libraries like Python's Beautiful Soup or Scrapy, or use a dedicated scraping tool.
These libraries extract the data you need from each linked page, and your script then writes it back to your Airtable base through the API. Automating this will save significant time compared to updating 10k records by hand.
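To illustrate the extraction half, here is a small Python sketch using only the standard library's `html.parser` (Beautiful Soup, mentioned above, is more forgiving with real-world HTML, but the idea is the same). The `price`/`stock` class names and the sample HTML are hypothetical; in practice you'd fetch each record's URL first and inspect the page to find which elements hold your data.

```python
from html.parser import HTMLParser

# Hypothetical snippet standing in for a fetched page.
SAMPLE_HTML = """
<html><body>
  <span class="price">$19.99</span>
  <span class="stock">In stock</span>
</body></html>
"""

class FieldScraper(HTMLParser):
    """Collect the text of elements whose class matches a wanted field."""
    WANTED = {"price", "stock"}  # hypothetical class names on the target page

    def __init__(self):
        super().__init__()
        self.fields = {}
        self._current = None

    def handle_starttag(self, tag, attrs):
        cls = dict(attrs).get("class")
        if cls in self.WANTED:
            self._current = cls

    def handle_data(self, data):
        if self._current:
            self.fields[self._current] = data.strip()
            self._current = None

scraper = FieldScraper()
scraper.feed(SAMPLE_HTML)
# scraper.fields now maps class name -> scraped text
```

The resulting `scraper.fields` dict is exactly the `fields` shape Airtable's update API wants (once the keys are renamed to your field names), so the scrape step and the write-back step connect with almost no glue code.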