MY SITUATION
I have a base containing several tables:
- ‘Translations’: a 10-row table (one row per language) that uses the DeepL API to translate the contents of 3 distinct fields (‘header’, ‘subheader’ & ‘cta’) into 10 languages
- ‘Banner Orientation 1’: has 1000 rows and fields ‘lang’, ‘target-header’, ‘target-subheader’ & ‘cta’ (among others)
Table ‘Banner Orientation 1’ has:
- 100 rows where the ‘lang’ field holds the 2-letter code for the 1st language
- 100 rows where the ‘lang’ field holds the 2-letter code for the 2nd language
- etc…
THE PROBLEM
All 100 rows for the 1st language contain the same record value for each of the 3 fields mentioned above (and the other 9 sets of 100 rows each share the same 3 translated phrases).
So… table ‘Translations’ only has to do 3x10=30 translations ONE TIME.
The alternative to these 30 API calls?
If I translate each row’s fields separately, that’s 3 fields x 1000 rows = 3000 API calls to DeepL for only 30 unique phrases… not good!
MY GOAL
I want those 3 translation records copied from table ‘Translations’ to each of the 100 rows for that particular language in table ‘Banner Orientation 1’…
Then again for the remaining languages.
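If it helps make the logic concrete, what I’m after is really just a one-to-many join keyed on ‘lang’. Outside of Airtable it would look something like this Python sketch (field names are from my base, but the sample data is made up):

```python
# Build a lookup from the 10-row 'Translations' table: lang -> translated phrases.
# (Made-up sample data standing in for the real DeepL output.)
translations = [
    {"lang": "de", "header": "Hallo", "subheader": "Willkommen", "cta": "Jetzt kaufen"},
    {"lang": "fr", "header": "Bonjour", "subheader": "Bienvenue", "cta": "Achetez"},
    # ... 8 more languages
]
by_lang = {row["lang"]: row for row in translations}

# The 'Banner Orientation 1' rows only know their 2-letter language code...
banners = [{"lang": "de"}, {"lang": "fr"}, {"lang": "de"}]  # really 1000 rows

# ...so each row simply copies the 3 phrases for its language: zero extra API calls.
for banner in banners:
    t = by_lang[banner["lang"]]
    banner["target-header"] = t["header"]
    banner["target-subheader"] = t["subheader"]
    banner["cta"] = t["cta"]
```

In Airtable terms I gather this would map to a linked-record field from each banner row to its language’s ‘Translations’ row, plus Lookup fields to pull the 3 phrases across, or a small Scripting-extension script that does the same loop.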
CAN YOU ASSIST? IDEAS?
Hopefully I’ve explained well enough…
Thx in advance,
David
P.S. Yes, I know I could simply ‘copy/paste’ the translations myself into 1000 rows. Just hoping there’s a way to do this automagically 😉
P.P.S. ‘Lookup’ fields came the closest but I couldn’t figure it out…