Forum Discussion
- MarkShnier__You (Qrew Legend)
I can think of three ways to do this using Pipelines. Note that Pipelines can be set to run on a schedule, for example Fridays at 2:00 am.
Method 1.
Create a saved table-to-table copy from the import/export menu in native Quickbase. Then run it with an API call from the Quickbase channel.
The API call would look like this:
https://mycompany.quickbase.com/db/xxxxxxx?a=API_RunImport&id=10
Set the Method to POST.
The xxxxxxx is the table DBID from the URL you see when you are building the saved table-to-table import.
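As a minimal sketch, here is how that URL is assembled. The realm name, DBID, and import id below are placeholders, not real values; in the actual Pipeline, the Quickbase channel's request step supplies authentication for you.

```python
# Sketch: building the API_RunImport URL the Pipeline step would POST.
# "mycompany", "bqx7k2abc", and import id 10 are placeholder values --
# substitute your own realm, table DBID, and saved import id.

def run_import_url(realm: str, dbid: str, import_id: int) -> str:
    """Return the URL that runs a saved table-to-table import."""
    return f"https://{realm}.quickbase.com/db/{dbid}?a=API_RunImport&id={import_id}"

url = run_import_url("mycompany", "bqx7k2abc", 10)
print(url)
# POST this URL from the Quickbase channel with Method = POST;
# no request body is needed.
```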
Nothing else is required.

Method 2.
Have a Pipeline search the source records, then use a For Each loop, and in that loop step create the records. That works fine if the number of records being created is small, say under 50. Creating records one by one like this is a little unkind to the app's performance, but if it happens in the middle of the night, I suppose who cares.
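To illustrate why this pattern is heavier, here is a sketch of what the Search + For Each steps amount to, expressed as request payloads for Quickbase's JSON RESTful API. The table ID, field IDs, and field map are hypothetical examples; the real Pipeline steps handle authentication and paging for you.

```python
# Sketch of Method 2: one create-record payload (one API call) per
# source record. Table ID "bqy9target" and field IDs 6/7 are
# placeholders for illustration only.

def record_payloads(source_records, target_table_id, field_map):
    """Build one POST /v1/records payload per source record --
    the one-call-per-record pattern Method 2 describes."""
    payloads = []
    for rec in source_records:  # the "For Each" loop
        data = {str(dst): {"value": rec[src]} for src, dst in field_map.items()}
        payloads.append({"to": target_table_id, "data": [data]})
    return payloads

source = [{"6": "Widget A", "7": 12}, {"6": "Widget B", "7": 3}]
# copy source field 6 -> target field 6, source field 7 -> target field 7
payloads = record_payloads(source, "bqy9target", {"6": 6, "7": 7})
# len(payloads) equals the record count: each record costs its own call,
# which is why this approach suits only small batches.
```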
Method 3.
Have the first step in the pipeline create a Bulk Upsert (think of it as a temporary holder for your records), then run the search step against your source table. The For Each loop adds a row to the Bulk Upsert, and when the loop is complete, a final step outside the loop commits the Bulk Upsert. The advantage of a Bulk Upsert is that if you were creating, say, thousands of records, you would only use your somewhat precious Quickbase processing power for the initial search and the final commit, both of which are fast. You would not be hitting your Quickbase app with each individual record step and impacting the performance of the app for other users.
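The contrast with Method 2 can be sketched the same way: rows accumulate in memory (the Bulk Upsert holder) and the table is touched only once, by the final commit. As before, the table ID and field IDs are hypothetical placeholders.

```python
# Sketch of Method 3: accumulate every row, then commit them in a
# single payload -- the Bulk Upsert pattern. "bqy9target" and field
# IDs 6/7 are placeholders for illustration only.

def bulk_upsert_payload(source_records, target_table_id, field_map):
    """Collect all rows into one POST /v1/records payload,
    so the target table is written exactly once."""
    data = []
    for rec in source_records:  # "add a row to the Bulk Upsert" per record
        data.append({str(dst): {"value": rec[src]} for src, dst in field_map.items()})
    return {"to": target_table_id, "data": data}  # the final commit step

source = [{"6": "Widget A", "7": 12}, {"6": "Widget B", "7": 3}]
payload = bulk_upsert_payload(source, "bqy9target", {"6": 6, "7": 7})
# One payload regardless of record count: thousands of rows still cost
# only the initial search plus this single commit.
```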
------------------------------
Mark Shnier (Your Quickbase Coach)
mark.shnier@gmail.com
------------------------------

- CostinAngelescu (Qrew Member)
Thank you.
I just started working with Pipelines, and it works.
------------------------------
Costin Angelescu
------------------------------

- CostinAngelescu (Qrew Member)
Dear Mark Shnier,
Thank you again for recommending Pipelines. Now I face another issue.
I've set up the pipeline (see attached), but when a user adds 6 records, the entries that are searched and then added to the final table end up being multiplied by 6.
How can I ensure that when a user adds records to the table, the pipeline copies these records to the final table only once? Please see the attachment.
------------------------------
Costin Angelescu
------------------------------