Forum Discussion
The Trigger should be "On Event" as opposed to "Record Created". That will give you the flexibility to also trigger on the Project being edited, in case you ever want to fire the pipeline manually.
The next step will be to Create a Bulk Upsert. That's like a big old temporary cardboard box to throw stuff into. The reason for this method is that the fastest processing and least impactful on your app performance will be to fill the box outside of QuickBase and then "upsert" the box all at once. You will define the Bulk Upsert by specifying the columns to be populated including the important field for Related Project.
The next step will be to search for records in the master Tasks table which meet some criteria filter, or perhaps just get them all. Then, after the Search, there will be a For Each step where you will Add Bulk Upsert Row ... filling the box. Those fields can be filled by dragging fields from the right panel, mostly from Search step C, but at least one, the Record ID#, comes from the Step A trigger record.
Then you drag across the last step which will be to Commit the Upsert, ie upload them to the tasks table.
Remember to turn it on and leave that screen open so you can watch it run. Then flip to another tab, create a project, and quickly flip back to watch the play-by-play unfold as the Pipeline is triggered.
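For anyone curious what those steps amount to under the hood, here is a minimal Python sketch of the "fill the box, then commit once" idea. The field IDs ("6", "7"), the table ID, and the task names are made-up placeholders, not anything from this thread; in practice you build this visually in the Pipelines designer.

```python
# Sketch of the "cardboard box" pattern: query the master Tasks table once,
# build one upsert body in memory, then commit it in a single call.
# Field IDs ("6", "7") and the child table ID are hypothetical placeholders.

def build_bulk_upsert_payload(master_tasks, project_record_id, child_table_id):
    """Turn queried master-task rows into one Quickbase-style upsert body."""
    rows = []
    for task in master_tasks:
        rows.append({
            "6": {"value": task["name"]},       # e.g. Task Name field
            "7": {"value": project_record_id},  # Related Project field
        })
    return {"to": child_table_id, "data": rows}

# All tasks for the new project travel in one payload -> one commit,
# instead of one create call per task.
payload = build_bulk_upsert_payload(
    [{"name": "Kickoff"}, {"name": "Design review"}],
    project_record_id=123,
    child_table_id="bq1234567",
)
```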
------------------------------
Mark Shnier (YQC)
mark.shnier@gmail.com
------------------------------
- AnnettaColeman · 3 years ago · Qrew Cadet
Mark -
A bit more background:
Thanks (as always) for your prompt and informative reply, and for confirming this would be a great use case for Pipelines. I will admit that I have not yet discovered a comprehensive article on Pipelines, so I am struggling to get a good definition of options such as 'upserts'. Could you please answer a few more questions to help me get my head around this concept?
- My 'Master Task' table contains the entire set of tasks that I need to load to each new project.
- Prior to launch I loaded the Master Tasks to each of the first ~100 projects in the Project Tasks Table via an Excel spreadsheet which contained the related Project ID and the 50 tasks that originated from the Master Task table. Now that we have launched, I need to create the pipeline to add these 50 tasks to new projects as they are brought forward.
- I have an event that confirms when the business requestor has completed their 'intake' of the business justification, preliminary business scope, and preliminary prioritization of the project, which would be a good trigger point for this pipeline.
Follow on Questions:
- Do you happen to have a comprehensive article on this topic?
- You mention needing to build a 'temporary' external box that contains the columns and rows and search for the criteria to fill the external box (inclusive of the related project field).
- Could the temporary external box be an Excel table?
- Curious why this has to be external to QB?
- Could the source instead be the internal Task Master Table which was constructed for this purpose?
- I presume the dragging of fields is a one-time set up step. Please confirm.
- I presume the step of opening a new tab to watch the Pipeline at work is a testing step that could be utilized at any point for troubleshooting if needed. Please confirm.
Best Regards,
Annetta Coleman
------------------------------
Annetta Coleman
------------------------------
- MarkShnier__You · 3 years ago · Qrew Legend
Annetta, see my responses below:
- Do you happen to have a comprehensive article on this topic? There is a collection here. But it's pretty scattered. The native help text, imho, still needs a lot of work. https://community.quickbase.com/blogs/bree-mackey1/2020/07/22/so-you-want-to-learn-pipelines
- You mention needing to build a 'temporary' external box that contains the columns and rows and search for the criteria to fill the external box (inclusive of the related project field).
- Could the temporary external box be an Excel table? No. I was perhaps being too "glib". The Quickbase Channel has a twisty to open for Bulk Record Sets; there is an option there to create the Bulk Upsert (i.e. the big old empty cardboard box).
- Curious why this has to be external to QB? I'm just saying that if you need to create 50 records in Quickbase, you could do it by searching your Master table of tasks and then, on the For Each step, creating those 50 child Tasks one by one. That will certainly work, but you are hammering your app with 50 API calls all at once to create the 50 records, and in theory it will slow down your app while those are being created. Now, 50 records is not going to kill you, but it's just much better practice to let the pipeline do more of the work. The pipeline runs on a different engine than your Quickbase app. The Quickbase app runs under Quickbase's proprietary database and can only do one thing at a time (i.e. while your 50 records are being added one by one, someone waiting for a dashboard to populate will keep waiting). Pipelines runs in a cloud service (I forget if it is AWS or a different cloud), but it effectively has unlimited capacity. So it is better to query your master table once, which is very quick, then have the pipeline do the work of filling the cardboard box, and then upload all those records at once. That way you are only hitting your app once to do the query and once again to do the upload, which is a lot less impactful on performance than hitting your app 50 times.
- Could the source instead be the internal Task Master Table which was constructed for this purpose? I am not sure what you mean by this question. The pipeline will query the Master Task table in order to find the records to put in the Bulk Upsert. By the way, the word "Upsert" means that it will either INsert new records or UPdate existing ones. In your case they will all be inserts, but there could be a different use case where you are merging data in on a key field; there, the upsert will update existing records and insert new ones as required. This is very similar to how a native Quickbase table-to-table copy works, or a manual import of an Excel sheet into a Quickbase table with a key field: it will either update existing records or create new ones.
- I presume the dragging of fields is a one-time set up step. Please confirm. Yes, you build the pipeline once.
- I presume the step of opening a new tab to watch the Pipeline at work is a testing step that could be utilized at any point for troubleshooting if needed. Please confirm. Yes, this is just a debug step that you would normally do after building or editing any pipeline: trigger the pipeline and then watch the recap to confirm it runs OK. You don't have to watch it live, as there is a button to look at the activity for a pipeline, but I think it's much less confusing to be watching when it is triggered.
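Mark's cost argument above (one query plus one commit versus one call per record) can be reduced to a bit of arithmetic. This tiny helper is my own illustration, not anything in Pipelines itself; it just counts the app-facing API calls under each approach.

```python
def app_api_calls(n_records, bulk):
    """Count app-facing API calls per the reasoning above.
    Both approaches query the master table once; per-record creation
    then adds one call per record, while the bulk upsert commits once."""
    return 1 + (1 if bulk else n_records)

# 50 tasks: 51 calls one-by-one vs. just 2 calls with the bulk upsert,
# and the bulk cost stays at 2 no matter how many records you add.
```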
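To make the "UPdate or inSERT" meaning of Upsert concrete, here is a toy merge on a key field. This is purely an illustration of the semantics; Quickbase performs this matching internally when the target table has a key field.

```python
def upsert(existing, incoming, key):
    """Toy model of upsert semantics: rows whose key already exists are
    updated in place; rows with a new key are inserted."""
    by_key = {row[key]: dict(row) for row in existing}
    for row in incoming:
        if row[key] in by_key:
            by_key[row[key]].update(row)   # update the existing record
        else:
            by_key[row[key]] = dict(row)   # insert a new record
    return sorted(by_key.values(), key=lambda r: r[key])

# Record 1 already exists, so it is updated; record 2 is new, so inserted.
merged = upsert(
    existing=[{"id": 1, "status": "Open"}],
    incoming=[{"id": 1, "status": "Done"}, {"id": 2, "status": "New"}],
    key="id",
)
```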
------------------------------
Mark Shnier (YQC)
mark.shnier@gmail.com
------------------------------
- MikeTamoush · 3 years ago · Qrew Commander
Mark,
Thanks for the great explanation on Bulk Upsert. I've been trying to understand what it is and when to use it.
So do I understand correctly:
This WOULD work by simply doing a search in the master table and, in the For Each, creating the child record. But that would take system resources, since 50 create APIs will run? However, if I do the Bulk Upsert technique, the whole loop results in only one API call?
------------------------------
Mike Tamoush
------------------------------
- MarkShnier__You · 3 years ago · Qrew Legend
Mike, exactly.
The other thing you will observe, if you watch the pipeline run when adding the records individually, is that the For Each loop fires asynchronously. What that means is that it basically hammers the application with 50 API requests all at the same time. Because Quickbase cannot do 50 things at once, it rejects many of the API requests. Fortunately, the pipeline re-queues those requests automatically and fires the same request again after a short delay.
I think I recall seeing an indication that it was willing to try the same request up to 15 times before it gives up.
But imagine a use case where you were adding, say, 2,000 records. Your app gets hammered with 2,000 instantaneous API calls, and I think there's a risk that some of them could time out, because every time the pipeline goes to retry a rejected request, the app would still be busy processing others of those 2,000 calls.
Also, with 2,000 individual requests, I think your users would definitely feel like the app became unresponsive.
So sure, in practice you could probably get away with 50 individual requests easily and nobody would really notice, but it's just basic good practice, if you are going to add dozens of records, to take the extra time to create that Bulk Upsert and do it "right": one API call to query and one API call to upsert.
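The re-queue behavior described above can be pictured with a small retry loop. This is a toy model of my own, not the actual Pipelines engine; the 15-attempt cap echoes the limit Mark recalls and should be treated as anecdotal.

```python
import time

def call_with_retries(request_fn, max_attempts=15):
    """Toy model of the pipeline re-queuing a rejected request: on a
    'busy' rejection, wait a short (growing) delay and fire the same
    request again, giving up after max_attempts tries."""
    for attempt in range(1, max_attempts + 1):
        try:
            return request_fn()
        except RuntimeError:             # stand-in for a "busy, retry" rejection
            if attempt == max_attempts:
                raise
            time.sleep(0.001 * attempt)  # short delay before re-firing

# A request that the busy app rejects twice before accepting:
attempts = {"n": 0}
def flaky_create():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise RuntimeError("busy")
    return "created"
```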
------------------------------
Mark Shnier (YQC)
mark.shnier@gmail.com
------------------------------