
Re: How to get more detailed revision history log en masse?

Solved
Alyssa_Buchthal
8 - Airtable Astronomer

We have an Enterprise-level account and are trying to implement a date field that will track the date a status is changed in another field. This is all fine on a move-forward basis, but we would also like to backfill it historically, which is proving to be trickier.

With our Enterprise account we have three years of revision history, but it seems like the only way to access what we need is to open up every single record, scroll through until we find the action we want to track, and then approximate when it happened from Airtable’s generic date ranges. Once a change is more than a month old, those ranges drop to month-by-month and then year-by-year approximations, when what we’re really looking for is exact dates.

Is there a better way to do this? I don’t see an access point in the scripting API for record history. Our Enterprise audit logs only go back six months and don’t offer this level of granularity. We know the data is there but are at a loss as to how to fetch it. This is about 1,000 records, so it isn’t something we could do manually. Has anybody done something similar before?


11 Replies

From what I understand, Enterprise Admins have the ability to download an audit history log, which may contain more detailed information that you can search through. I’ve never seen one of these audit history logs myself yet, so be sure to let us know if it gives you all the information that you are looking for.

Otherwise, if that doesn’t do the trick, there is no other way to access the revision history, because the revision history is not part of Airtable’s API.

Yeah, I’ve downloaded those, and it doesn’t seem as though they have what I need, though I could be wrong. They’re pretty large JSON Lines files, so I haven’t dug too deeply into converting them to a more readable CSV yet. From my initial browsing it seems like the field tracking doesn’t get granular enough; it’s more for verifying user/API activity.
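For anyone else poking at these exports, a minimal PowerShell sketch of that JSON-Lines-to-CSV conversion might look like the following. The file names are placeholders, and Export-Csv only keeps top-level properties, which is part of why the deeper detail is easy to miss in a flat CSV.

```powershell
# Minimal sketch, assuming the export is JSON Lines (one JSON object per line).
# File names are placeholders, not the real export names.
# Note: Export-Csv takes its columns from the first event, and nested objects
# are written as serialized strings rather than expanded.
Get-Content .\audit-log.jsonl |
    ForEach-Object { $_ | ConvertFrom-Json } |
    Export-Csv .\audit-log.csv -NoTypeInformation
```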

Oh okay, that’s good to know. Then your only option would likely be to build your own revision history system, which would probably not be an easy or fun task.

@ScottWorld is on the right track, and I believe the granularity you seek is in that JSON data. The challenge is to harvest this data set in a form you can actually use; it will require some intensive scripting and the development of a framework around the data. I would not recommend attempting to convert it to CSV, as JSON objects have many layers; they are not flat objects. That may be why the granular details are not obvious to you.

This problem is not likely to become easier unless you embrace the Enterprise Events API, which gives you the capacity to build a very comprehensive compliance and reporting framework that integrates with any system you have or may decide to create for this business requirement.

In my experience, the Enterprise audit logs are extremely granular and do contain the details for every change to every cell value. The logs show the user ID of who made the change, the new value, and the exact millisecond of the change. The logs do not show the previous value.

However, it does require digging into the JSON in order to get that level of detail. If you are interested in a specific field in a specific table, you can filter the results to that table’s ID. Even so, digging into the JSON realistically needs to be done with code: there is so much data in the audit logs that you need code to filter out everything you are not interested in.
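As a rough illustration of that “filter with code” point, something like the sketch below keeps only the events for one table before exporting. The property path ($_.payload.tableId) is an assumption about the log structure and needs to be checked against a real log entry.

```powershell
# Rough sketch: keep only the events for one table, then export.
# The property path ($_.payload.tableId) is an assumed field name --
# inspect one parsed line and adjust to the actual structure.
$tableId = 'tblXXXXXXXXXXXXXX'   # the table you care about

Get-Content .\audit-log.jsonl |
    ForEach-Object { $_ | ConvertFrom-Json } |
    Where-Object { $_.payload.tableId -eq $tableId } |
    Export-Csv .\one-table.csv -NoTypeInformation
```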

I actually have been digging, and it looks like a PowerShell conversion and some pivots were all I needed to get at the data! Thanks, all. In this scenario a CSV sheet was perfect, as I was only looking at one action type and one column.

Awesome!

Would you mind sharing the PowerShell conversions and pivots you used? It might help other people in the future.

Alyssa_Buchthal
8 - Airtable Astronomer

Oh, good idea! Didn’t even think of that. This is the article I used to guide my PowerShell conversion; I ended up using his exact command to fully convert, rename, and place the files in the directory I wanted. Then I performed a text-to-columns operation to break out the JSON appropriately. Then I pivoted and filtered by the action type I wanted, which is “updatePrimitiveCell”, and again by the specific column ID and matching single select option ID. Super easy to do.
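For anyone trying to reproduce this, here is a hedged sketch of the same filter done directly in PowerShell rather than with text-to-columns and a pivot. “updatePrimitiveCell” is the action type named above; the payload property names and the example IDs are placeholders that need to be matched against a real log entry.

```powershell
# Sketch of the pivot/filter step done in PowerShell instead of a spreadsheet.
# "updatePrimitiveCell" comes from the thread; the payload property names
# (columnId, newValue, rowId) and the example ids are assumptions.
$columnId = 'fldXXXXXXXXXXXXXX'   # the status field being tracked
$optionId = 'selXXXXXXXXXXXXXX'   # the single select option of interest

Get-Content .\audit-log.jsonl |
    ForEach-Object { $_ | ConvertFrom-Json } |
    Where-Object {
        $_.action -eq 'updatePrimitiveCell' -and
        $_.payload.columnId -eq $columnId -and
        $_.payload.newValue -eq $optionId
    } |
    Select-Object timestamp, action,
        @{ n = 'recordId'; e = { $_.payload.rowId } } |
    Export-Csv .\status-changes.csv -NoTypeInformation
```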

When I’m doing this en masse, I’ll likely write a bash script to perform the conversion in bulk, starting with the CSV of download links for each day’s audit logs. When I do that, I’ll try to remember to post the process here as well!
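Until that script gets posted, a rough version of the bulk idea is sketched below. The post mentions bash; PowerShell is used here only to stay consistent with the commands above, and the “downloadUrl” column name in the links CSV is an assumption.

```powershell
# Bulk sketch: download each day's audit log from a CSV of links, then
# convert each JSON Lines file to CSV. "downloadUrl" is an assumed column
# name; adjust to whatever the export actually calls it.
New-Item -ItemType Directory -Force -Path .\raw | Out-Null
$links = Import-Csv .\audit-log-links.csv

foreach ($link in $links) {
    # Name the local file after the last path segment of the download URL.
    $fileName = ([uri]$link.downloadUrl).Segments[-1]
    $jsonl    = Join-Path '.\raw' $fileName
    Invoke-WebRequest -Uri $link.downloadUrl -OutFile $jsonl

    # Convert each day's JSON Lines file to a CSV alongside it
    # (assumes the file ends in .json or .jsonl).
    Get-Content $jsonl |
        ForEach-Object { $_ | ConvertFrom-Json } |
        Export-Csv ($jsonl -replace '\.jsonl?$', '.csv') -NoTypeInformation
}
```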

(@Bill.French this worked for me because I only needed to parse out the top layer of JSON; from that point on I could get what I needed without entering any of the contained JSON objects. If I did need to, I’d likely write a VBA script to generate a table from the request IDs and the contained JSON object, using a text-to-columns operation sized to the maximum number of options in that JSON so all my objects aligned in the new columns. Then I could just VLOOKUP any relevant internal JSON results back to the parent request object. This is a one-time backfilling project for content only; no reporting or compliance work involved.)
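A rough PowerShell equivalent of that VBA/VLOOKUP idea, for anyone who does need the nested objects: explode each nested payload into a long-format table keyed by request ID, then join it back to the parent rows. “requestId” and “payload” are assumed property names.

```powershell
# Sketch: one output row per property of the nested payload object, keyed by
# the parent request id, so it can be looked up against the parent table.
# "requestId" and "payload" are assumed property names.
Get-Content .\audit-log.jsonl |
    ForEach-Object { $_ | ConvertFrom-Json } |
    ForEach-Object {
        $parent = $_
        foreach ($prop in $parent.payload.PSObject.Properties) {
            [pscustomobject]@{
                requestId = $parent.requestId
                key       = $prop.Name
                value     = $prop.Value
            }
        }
    } |
    Export-Csv .\payload-long.csv -NoTypeInformation
```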

Unless you develop a system that establishes the deltas between each change point. This is possible given all the historical data, or by using the more robust Enterprise Events API. But it’s not a trivial pursuit. 😉