Copying and reorganizing records into new table

DeAndre_Wright

I’m developing a CRM base and need help with a problem I have.

I have a table that collects information about customers. Each record in this table is created when a customer completes a form during checkout on my website. One column in this table generates a “Member ID” that is unique to each record.

The online merchant that I use generates an order ID for each transaction. I would have liked to keep it simple and use the order ID as the customer’s “Member ID,” but because the products I’m selling are recurring subscriptions, a new order ID is generated at each renewal. I obviously don’t want that, because a single customer would end up with multiple IDs.

Instead, I want to be able to link each order ID to a member ID. The only field in the exported Order .csv file that could be used to match against my Customers table is the email address. None of the other tables in my base use an email address as the primary field.

So, my question is: is there a way to take information from my “Customers” table and rearrange it into a new table where the email address is the primary field? I’d like to do this within Airtable.

The workaround I’ve found is to have Zapier copy information from my “Customers” table into a new “bridge” table where the email address is the primary field. From there, I can link an email address to a customer/member ID, and then on to the imported Order IDs field.
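
(For what it’s worth, outside of Airtable the match I’m after boils down to a simple join by email address. Here’s a rough Python sketch against hypothetical CSV exports – the file and column names are placeholders, not anything Airtable or my merchant actually produces:)

    import csv

    # Build an email -> Member ID index from an export of the Customers table.
    member_by_email = {}
    with open("customers.csv", newline="") as f:
        for row in csv.DictReader(f):
            member_by_email[row["Email"].strip().lower()] = row["Member ID"]

    # Walk the merchant's Order export and attach the matching Member ID.
    with open("orders.csv", newline="") as f:
        for row in csv.DictReader(f):
            member_id = member_by_email.get(row["Email"].strip().lower())
            print(row["Order ID"], "->", member_id or "no matching customer")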

Is there a simpler way to do this, ideally one contained entirely inside Airtable?

Thanks and hopefully I explained everything clearly.

1 Reply

Note: As I neared the bottom of this message, I realized there’s a much simpler method that would likely work for you. I’m leaving the longer version in case the quick-and-dirty method doesn’t pan out, or for anyone reading over your shoulder who might have a use for the more complicated approach.


I ran into a similar problem on a client base a while back. Their main data source consisted of attendee records from various conferences they sponsored, taken from badge scans performed at the door of each conference session. An individual’s badge number remained constant across all sessions for a given conference, but there was no overarching ID associated with an individual that held constant across conferences. (To make matters worse, over a 2- or 3-year period, a significant percentage of their attendees changed email addresses, either as a result of job transitions or as a side effect of mergers and acquisitions in their target industry.)

What I did was create a data layer beneath the main one for the base – that is, beneath the attendee level. With the first block of session data imported, I assigned each previously unknown attendee an arbitrary user ID based on an autonumber mashed with a few additional pertinent fields. Subsequent blocks pulled together all of an individual’s attendance data from that conference, tied together by badge ID.
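
(In case a concrete example helps, here is a purely hypothetical scheme of that sort – the exact ingredients don’t matter much, as long as the result is unique. In Airtable terms it would just be a formula field concatenating an autonumber field with a couple of other values.)

    def make_user_id(autonumber: int, last_name: str, conference_year: int) -> str:
        # Hypothetical mash-up: zero-padded autonumber + name fragment + year,
        # e.g. 00042-WRI-2017. Any collision-free combination works.
        return f"{autonumber:05d}-{last_name[:3].upper()}-{conference_year}"

    print(make_user_id(42, "Wright", 2017))   # -> "00042-WRI-2017"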

I followed the same process with subsequent conferences – except now I activated data deduplication routines similar to the ones I published a few months back. I used a technique similar to the one I describe in the dedupe users guide under Merger of Authentic Duplicates to link together records representing the same individual across multiple conferences. In my case, this was by necessity a manual process, as I had to check near-matches on name, last name + employer, and/or email; with a common key (email), much more of your task could be automated. I then passed attendee data from the most recent conference to the master user record. (That is, the master user record links to multiple conference records for the same individual; each conference record contains information on the individual’s employer, title, mailing address, phone, email, and the like at the time of the conference. On the assumption that the most-recent data have the greatest probability of being accurate, and using the trick @Matt_Bush outlines in this reply, I bubbled up the most-recent data to the master user record.)
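
In code terms, that merge-and-bubble-up step amounts to something like the following – plain Python with made-up field names and sample data, purely to illustrate the logic:

    from collections import defaultdict

    # Sample conference records for two people across different years.
    conference_records = [
        {"email": "ada@example.com", "employer": "Acme",    "title": "Engineer",      "conference_date": "2016-05-01"},
        {"email": "Ada@Example.com", "employer": "Globex",  "title": "Lead Engineer", "conference_date": "2018-09-12"},
        {"email": "bob@example.com", "employer": "Initech", "title": "Analyst",       "conference_date": "2017-03-20"},
    ]

    # Group every conference record by a normalized key (lowercased email).
    by_person = defaultdict(list)
    for rec in conference_records:
        by_person[rec["email"].lower()].append(rec)

    # The master record keeps links to all of a person's conference records but
    # takes its contact fields from the most recent one (ISO dates sort as strings).
    master_users = {}
    for email, recs in by_person.items():
        latest = max(recs, key=lambda r: r["conference_date"])
        master_users[email] = {
            "email": email,
            "employer": latest["employer"],
            "title": latest["title"],
            "conference_records": recs,
        }

    print(master_users["ada@example.com"]["employer"])   # -> "Globex"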


Possibly simpler approach: It just dawned on me that your situation offers a potential shortcut:

  1. Define a [MasterUser] table. The primary field is email address; you can include whatever other fields you deem necessary ({Member ID} seems like a good bet…)
  2. Define a new linked-record field in your main table that links to [MasterUser].
  3. Define a view in your main table that places {Link2MasterUser} adjacent to {Email}. (Actually, to be safe, you should probably create a formula field called {CookedEmail} with the formula LOWER({Email}). Place {CookedEmail} adjacent to {Link2MasterUser}.)
  4. Left-click the header for {CookedEmail}, which will mark all the cells in the column. (A handy message ### cells marked will appear in the lower-left of your screen.)
  5. Press Ctrl-C to copy all of the values in the {CookedEmail} column. (The same, but for ### cells copied.)
  6. Left-click the header for {Link2MasterUser}. (Ditto.)
  7. Press Ctrl-V to paste the copied values into {Link2MasterUser}. (Pasting... floats up in the lower left, followed by Paste complete.)

At this point, you will have a [MasterUser] table containing a single record for each unique email found in your base, with each master user record linked to all applicable orders for that user. You can then continue to batch-load later orders whenever convenient (filter your view by {Link2MasterUser} is empty), manually enter new records one at a time (Ctrl-C, Right-Arrow, Ctrl-V), or set up a Zap to perform the copy/paste on each new record created.
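
(And if you ever outgrow the copy/paste or the Zap, the same find-or-create-and-link step can be scripted against the Airtable REST API. The following is only a rough sketch, not a drop-in solution: the table names [Orders] and [MasterUser], the field names {Email} and {Link2MasterUser}, and the credentials are all placeholders for your own, and it ignores the API’s 5-requests-per-second rate limit.)

    import requests

    API_KEY = "YOUR_API_KEY"           # placeholder: your Airtable API credential
    BASE_ID = "appXXXXXXXXXXXXXX"      # placeholder: your base ID
    API = f"https://api.airtable.com/v0/{BASE_ID}"
    HEADERS = {"Authorization": f"Bearer {API_KEY}", "Content-Type": "application/json"}

    def fetch_all(table):
        """Page through a table and return every record."""
        records, offset = [], None
        while True:
            params = {"offset": offset} if offset else {}
            resp = requests.get(f"{API}/{table}", headers=HEADERS, params=params)
            resp.raise_for_status()
            data = resp.json()
            records += data["records"]
            offset = data.get("offset")
            if not offset:
                return records

    # 1. Map lowercased email -> record id for the MasterUser records that already exist.
    master = {r["fields"].get("Email", "").lower(): r["id"] for r in fetch_all("MasterUser")}

    # 2. For each not-yet-linked order, find or create its MasterUser record and
    #    queue the link (linked-record fields take a list of record ids).
    updates = []
    for order in fetch_all("Orders"):
        if order["fields"].get("Link2MasterUser"):
            continue                                   # already linked
        email = order["fields"].get("Email", "").strip().lower()
        if not email:
            continue                                   # nothing to match on
        if email not in master:
            resp = requests.post(f"{API}/MasterUser", headers=HEADERS,
                                 json={"records": [{"fields": {"Email": email}}]})
            resp.raise_for_status()
            master[email] = resp.json()["records"][0]["id"]
        updates.append({"id": order["id"], "fields": {"Link2MasterUser": [master[email]]}})

    # 3. Apply the updates in batches of 10, the API's per-request maximum.
    for i in range(0, len(updates), 10):
        resp = requests.patch(f"{API}/Orders", headers=HEADERS,
                              json={"records": updates[i:i + 10]})
        resp.raise_for_status()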

Obviously, once you have [MasterUser] created, you can use a lookup or rollup field to pull {MasterUserID} (which you either get from somewhere or generate using some autonumber-containing algorithm) back into your individual order records.

(I hope this is coherent; I’ve been waging a losing battle with my right contact lens for most of this reply, so my proofreading ability isn’t up to par. Let me know if anything needs another pass…)