
Archive selected records to new base?

Topic Labels: ImportingExporting
Miles_T

We have a base where we track kids who are up for adoption. Each record contains a LOT of notes, and there are two tables: a “main” table and a “notes” table. Records where a child has been adopted are marked as “closed.” These “closed” records need to get moved to an archive base, with the option of adding them back to our “main” base in the future. Filtering out the records I want to move is simple enough, but I’m not sure how to duplicate data from each table and move that to a new base, all while keeping the linking alive. If you have encountered this and have ideas, I would love to hear them. Thanks!

8 Replies

I’m dealing with similar issues on a couple of projects. Is there a reason you wish to archive to a different base, as opposed to merely a different table? If you must move to a different base, I’d recommend first archiving to a table in the original base, in which you use rollup fields and concatenation to ‘pack’ the data to be archived, and from there export CSV files to be imported into the archive base. (If you don’t need a separate base, you do everything up to the CSV export.)

My attempts at archiving all generally follow the process given in this reply. Copying and pasting the looked-up ‘portmanteau’ field, coupled with an IF() statement that draws from either the original referenced field or its pasted copy, causes the data to be replicated and eventually decoupled from its original representation, allowing deletion of the originals without risk of data corruption. (Damn, that’s wordy. In brief, a recurring issue for Airtable is its lack of a persistent calculation field. Accordingly, no formula field is sacred: at any moment, its value might change, depending on the stability of its components. In such a dynamic environment, archiving records is impossible, for the moment the originals are deleted, every field descending from them disappears as well. What I’ve attempted to do with the copy/paste routine is break the link between living and archived data.)
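
To make that concrete, here is a minimal sketch of the IF() pattern, using hypothetical field names: {Notes Lookup} stands for the looked-up ‘portmanteau’ field, and {Notes Frozen} is a long text field into which its values are pasted before the originals are deleted.

IF(
    {Notes Frozen} != '',
    {Notes Frozen},
    {Notes Lookup}
    )

Until the paste happens, the formula tracks the live lookup; once {Notes Frozen} is populated, it keeps returning the frozen copy even after the linked records are gone.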

Also, why can’t you use views? One for closed records and another for open ones, using the “Closed” field as a filter condition: https://support.airtable.com/hc/en-us/articles/202624989-Guide-to-views#add_filters

I was working from the assumption there were potential privacy issues (which would possibly be an argument for archiving to a different base) where they didn’t want too much historical data in one place. Secondarily, I had the impression the OP wanted to use the archiving process as an opportunity to winnow the records, but on re-reading the post I don’t find evidence for that.

Another possibility, of course, is that two bases (live and archive) of 1,000 records each cost $0 per year, while one base containing all 2,000 records costs $120. Small potatoes, true — especially in comparison with the cost of sufficient middleware to perform the archiving – but I’ve worked with nonprofits where $120 would have been (or, at least, would have been seen as) a show-stopper.

One of the potential clients for whom I’ve been fiddling about with archival has the potential to spawn many thousands of records a year. (20-25,000 individuals, to begin with, linked to the thousands of companies for whom they work and the accounts payable contacts at said companies. At this point, I’ve described only the basic scaffolding: The data and code that have to exist so all the fun, cool, value-added pieces — and the tens of thousands of records they represent – have a place to sit.)

At some point, it’s likely we’ll bump up against Airtable’s maximum record count for tables: probably not the first year, possibly the second, almost definitely the third. (Of course, by then Airtable will likely have a much higher limit; at the same time, though, my client is also expanding frighteningly quickly, so their data needs could easily keep pace with future limit increases.) Since it’s the [People] table that dominates, in terms of both size and rate of growth, it would only make sense to archive off out-of-date records or those of unresponsive individuals; moving them to a separate table would help reduce the pressure on the main [People] table while still keeping the archived data close to hand for those inevitable cases when stale data turns fresh again.

(Given I see the first possible need for such functionality appearing near the end of Year 2, and likely not until mid- to late-Year 3, you can probably guess where on my list of priorities archival of old records ultimately falls.)

So, yes. We have to archive to a new base because we are at our record limit and are basically forced to archive “closed” records, which will open up a bunch of space in our main base.

I want to clarify one assumption, here, and ask a couple of questions/make a couple of comments.

The assumption is that you are currently on a Free plan, with its 1,200-record limit.

My questions/comments are as follows:

  1. Depending on the number of collaborators, you could easily pay more each month for middleware than it would cost to upgrade your Airtable plan from Free to Plus (or from Plus to Pro).

  2. Even at Free plan levels, 1,200 records seems generous, given your market. The exception would be if the bulk of records consist of notes rather than entries in your [Main] table. This raises a couple of points:

    1. If you intend to archive [Main] table records marked ‘closed’ as well as related notes, your middleware cost obviously would increase. (Note some middleware plans place a relatively low ceiling on transactions-per-month, so while you might be able to rely upon a lower-cost plan for ongoing archiving, you may need a larger plan at first to eliminate any backlog.) Doing so also introduces slightly more complexity into the archival process, which may or may not be a concern.
    2. Conceivably, you might be able to consolidate multiple [Notes] records into a single one (or possibly even roll them up into a single long text field) prior to archiving. If so, you might want to consider whether doing the same with active notes might obviate or delay archiving.

Hmmm… On reading what I just wrote, I guess my questions are implicit ones.

We’re on a paid plan, and it’s our “notes” table that has reached the limit. We cannot combine notes; they are confidential, and mixing them would be unlawful. Our “main” table only has 39 records, but each record has many notes, and quite a few of the “closed” records contain thousands of notes. We are a nonprofit and have a tight budget, so upgrading is not a priority, especially since we can archive “closed” records to a new base and save the money. I’m looking at Zapier to possibly save a view to a new base and then simply go back and hand-delete “closed” records after they have been moved (and confirmed) to the new base.

Are you saying combining notes associated with a single [Main] record would break confidentiality rules, or are you referring to commingling of notes from various [Main] records?

Are [Notes] text (either single-line or long) fields, or are they attachments to store, for instance, scanned input?

Thanks!

Here’s why I’m asking.

If the notes are textual and if it’s OK to combine notes related to a given [Main] record, you could try something like this. For this test, I generated 1,000 records of random text and imported them into a dummy base. Using a common linked record, I first simply rolled them up with a divider specified as the separator character for ARRAYJOIN(), which gave me this:

[screenshot: mergedNotes01]

I then went back to my original table and created a formula field that referenced an autonumber field and the record CREATED_TIME() to build a header. After a little tweaking of character choices and number of newlines, I ended up with this:

[screenshot: mergedNotes]

For archival purposes, the rollup field can be changed to a long text field. This decouples it from its component pieces, allowing the original notes to be deleted with no effect on the combined field.

Other than time spent massaging the header format, this literally took less than 10 minutes and reduced the size of my base by 998 records. (Note: There’s some uncertainty as to whether the resulting digest fields were of a legal size; it was recommended I limit such fields to the current maximum size for a long text field, 100,000 characters. Under such constraints, I still would have cut 995 records from the base.)
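
If you need to stay under that 100,000-character ceiling, one option is a small guard formula beside the rollup. A sketch, assuming the combined field is named {Merged Notes}:

IF(
    LEN({Merged Notes} & '') > 100000,
    'Over long text limit',
    'OK'
    )

Filtering on that field flags any digest that would need to be split across two or more archive records before the originals are deleted.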

Admittedly, this approach may not work for you; should that be the case, perhaps it may be of use to others who come across it while searching the forum.


Edit: I guess I should include the formulas I used.

Part of the header is created in the [Notes] table. Given the original notes field {Notes} and an autonumber field {Nbr}, I created a new formula field with the following formula:

REPT(
    '0',
    5-LEN(
        Nbr&''
        )
    )&
Nbr&
' • '&
DATETIME_FORMAT(
    CREATED_TIME(),
    'LLL'
    )&
'\n‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾\n'&
Notes

The horizontal line character forming the rule between the two newline characters ('\n') is the overline character, U+203E — which also explains why it looks funny in your browser, assuming it does.

The remainder of the header was created as part of the rollup process. Rolling up values from my ‘cooked’ notes field — the one with the partial header shown above — I used the following aggregation formula:

ARRAYJOIN(
    values,
    '\n\n______________________________________________________\n'
    )

This time, the horizontal line character is the usual underscore ('_').

Why did I divide the header between formula and rollup? Because that’s how I started; in other words, for no reason. I could just as easily have defined the entire header in the initial formula field and used an aggregation formula of ARRAYJOIN(values,'\n').
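
For anyone who wants that single-formula version, a sketch of what it might look like, reusing the same {Nbr} and {Notes} fields, with the underscore rule folded into the per-record header (meaning it will also appear above the very first note):

'\n______________________________________________________\n'&
REPT(
    '0',
    5-LEN(
        Nbr&''
        )
    )&
Nbr&
' • '&
DATETIME_FORMAT(
    CREATED_TIME(),
    'LLL'
    )&
'\n‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾\n'&
Notes

The rollup’s aggregation formula then shrinks to ARRAYJOIN(values,'\n').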