Forum Discussion
I can think of three ways to do this using Pipelines. Note that Pipelines can be set to run on a schedule, for example Fridays at 2:00 am.
Method 1.
Create a saved table-to-table copy from the import/export menu in native Quickbase, then run it with an API call from the Quickbase channel.
The API call would look like this:
https://mycompany.quickbase.com/db/xxxxxxx?a=API_RunImport&id=10
Set Method = POST.
The xxxxxxx is the dbid from the URL you see when you are building the saved T2T import.
Nothing else is required.
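If you want to test that same call outside of Pipelines, here is a minimal Python sketch. The realm, dbid, import id, and user token are placeholders, and passing the token as a usertoken query parameter is an assumption you would confirm against your realm's API settings.

import requests

REALM = "mycompany.quickbase.com"   # your realm hostname
DBID = "xxxxxxx"                    # the dbid from the saved import's URL
IMPORT_ID = "10"                    # the id of the saved table-to-table import

# Run the saved import; MY_USER_TOKEN is a placeholder for a real Quickbase user token
resp = requests.post(
    f"https://{REALM}/db/{DBID}",
    params={"a": "API_RunImport", "id": IMPORT_ID, "usertoken": "MY_USER_TOKEN"},
)
resp.raise_for_status()
print(resp.text)  # XML response; errcode 0 means the import ran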
Method 2.
Have a Pipeline search the source records and then do a For Each loop; in that loop step you create the records. That works fine if the number of records being created is not very large, say under 50. It's a little unkind to the app's performance to create the records one by one like this, but if it's happening in the middle of the night, I suppose who cares.
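To see why this is heavier on the app, here is a rough Python equivalent of what the loop does: one REST API call per record. The table id, field id, token, and sample data are hypothetical placeholders, not your actual Pipeline configuration.

import requests

HEADERS = {
    "QB-Realm-Hostname": "mycompany.quickbase.com",
    "Authorization": "QB-USER-TOKEN MY_USER_TOKEN",   # placeholder token
}

# Stand-in for the records returned by the Pipeline search step
source_records = [{"name": "Record A"}, {"name": "Record B"}]

for rec in source_records:
    # One HTTP round trip and one table write for every single record
    requests.post(
        "https://api.quickbase.com/v1/records",
        headers=HEADERS,
        json={"to": "bq_target_dbid", "data": [{"6": {"value": rec["name"]}}]},
    ).raise_for_status()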
Method 3.
Have the first step in the Pipeline create a Bulk Upsert (think of it as a temporary holder for your records), then do the search step against your source table. The For Each loop adds a row to the Bulk Upsert, and when the loop is completed, the last step, outside the loop, commits the Bulk Upsert. The advantage of a Bulk Upsert is that if you were creating, say, thousands of records, you would only use your somewhat precious Quickbase processing power for the initial search, which is super fast, and for the final upsert, which is also super fast. You would not be hitting your Quickbase app with an individual create-record step for every record and impacting the app's performance for other users.
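For contrast, here is a rough Python sketch of the bulk pattern: rows are collected during the loop and committed in a single call at the end. Again, the table id, field id, token, and sample data are placeholders.

import requests

HEADERS = {
    "QB-Realm-Hostname": "mycompany.quickbase.com",
    "Authorization": "QB-USER-TOKEN MY_USER_TOKEN",   # placeholder token
}

# Stand-in for the records returned by the search step
source_records = [{"name": "Record A"}, {"name": "Record B"}]

# "For Each" loop: just add a row to the batch, no API call yet
rows = [{"6": {"value": rec["name"]}} for rec in source_records]

# Final step outside the loop: commit everything in one upsert call
resp = requests.post(
    "https://api.quickbase.com/v1/records",
    headers=HEADERS,
    json={"to": "bq_target_dbid", "data": rows},
)
resp.raise_for_status()

The difference from Method 2 is simply where the write happens: inside the loop (one call per record) versus once after it (one call total, no matter how many rows).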
------------------------------
Mark Shnier (Your Quickbase Coach)
mark.shnier@gmail.com
------------------------------
Thank you.
I just started working with pipelines and it works.
------------------------------
Costin Angelescu
------------------------------
CostinAngelescu, 2 years ago, Qrew Member
Dear Mark Shnier,
Thank you again for recommending the pipelines. Now I face another issue.
I've set up the pipeline (see attached), but when a user adds 6 records, the entries that are searched and then added to the final table end up being multiplied by 6.
How can I ensure that when a user adds records to the table, the pipeline copies these records to the final table only once? Please see the attachment.
------------------------------
Costin Angelescu
------------------------------
ChayceDuncan, 2 years ago, Qrew Captain
Costin -
Since you're triggering on each record being added, your Pipeline will perform the same action for each one. It seems from your setup that when a new TMS report is added, you want to copy it over to a new table automatically, correct?
If that is the case, you technically do not need the search in Step B, where you are querying for similar records in the TMS table. Instead, you can just take the fields from Step A when the 'New Record' trigger fires and copy them straight into your second table.
Reading back to your original post, you said you wanted this to run on a schedule. If that is your intent, then you probably don't want to trigger on a new record as you are doing in Step A. Instead, you would start the Pipeline with the search in Step B and schedule it to run once a week.
So in summary, you should either 1) remove Step A and just do a weekly search to copy your records, or 2) remove the Step B search and instead copy each record in real time.
------------------------------
Chayce Duncan
------------------------------
CostinAngelescu, 2 years ago, Qrew Member
Good day,
Thank you for your guidance!
It appears that I cannot trigger a single copy when more than one record is added; this is clear in Step A, where the option specifically says "When Record Created."
Later edit:
I just noticed the option below where the trigger is "On New Event".
The outcome looks like what I need.
A. When a user adds multiple records, the pipeline is triggered by the event.
B. With the search option, I will retrieve the records based on my criteria.
C. The records are copied only once to the output table.
Concerns:
- Has anyone used this option? Do you see any risk in using it?
- I just noticed an error while running it; it came up while using the search option. Please see the second picture below. I remember deleting the loop. Any suggestions?
------------------------------
Costin Angelescu
------------------------------