Forum Discussion
Hey Marcelo,
The pipeline itself loops through a number of related tables. It identifies a field in one table that has a one-to-many relationship, and then, based on that relationship, it loops through each found value and updates one field to a new value.
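For clarity, the looping the pipeline does can be sketched in plain Python. The table and field names below are invented for illustration, not the actual app's schema:

```python
# Hypothetical parent/child tables joined by a one-to-many relationship.
parent_rows = [
    {"id": 1, "status": "old"},
    {"id": 2, "status": "old"},
]
child_rows = [
    {"id": 10, "parent_id": 1, "flag": None},
    {"id": 11, "parent_id": 1, "flag": None},
    {"id": 12, "parent_id": 2, "flag": None},
]

# For each parent, follow the relationship and update one field on every
# related child -- this is what the pipeline loops over, record by record.
for parent in parent_rows:
    for child in child_rows:
        if child["parent_id"] == parent["id"]:
            child["flag"] = "updated"

print(sum(1 for c in child_rows if c["flag"] == "updated"))  # 3
```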
------------------------------
Caleb Gayton
------------------------------
Hi, are all the actions in Quickbase, or do you call an external service? And how many records go through?
Thank you
Marcelo Benavides Torres
------------------------------
Marcelo Benavides
------------------------------
CalebGayton · 7 months ago · Qrew Member
It is all Quickbase actions. The volume is fairly high, probably around 15-20K records.
------------------------------
Caleb Gayton
------------------------------
MarceloBenavide · 7 months ago · Qrew Cadet
Ok. After 24 years of experience developing software, I'm not sure the performance will be good for 15K records, especially considering that pipelines run in the background.
Have you tried it with a small group of records? Add a filter to test.
Have you tried running it during hours when the system has no users?
I suggest you use the API and update this data via JSON in blocks of records.
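A minimal sketch of what "update via JSON in blocks of records" could look like, assuming Quickbase's JSON RESTful API (`POST /v1/records`). The table ID, field IDs, realm hostname, and user token below are placeholders, not real values:

```python
def chunk(records, size=1000):
    """Split the full record list into blocks sent one-per-API-call."""
    return [records[i:i + size] for i in range(0, len(records), size)]

def build_payload(table_id, batch, key_fid, value_fid, new_value):
    """Build one upsert body: the key field identifies each record,
    the value field is what gets updated."""
    return {
        "to": table_id,
        "data": [
            {str(key_fid): {"value": rid}, str(value_fid): {"value": new_value}}
            for rid in batch
        ],
    }

# Sending one block (needs the `requests` package and real credentials):
# import requests
# requests.post(
#     "https://api.quickbase.com/v1/records",
#     headers={
#         "QB-Realm-Hostname": "yourrealm.quickbase.com",   # placeholder
#         "Authorization": "QB-USER-TOKEN your_token_here",  # placeholder
#         "Content-Type": "application/json",
#     },
#     json=build_payload("bqxxxxxxx", block, 3, 7, "new value"),
# )

record_ids = list(range(15000))
blocks = chunk(record_ids)
print(len(blocks))  # 15 API calls instead of 15,000
```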
------------------------------
Marcelo Benavides
------------------------------
ChayceDuncan · 7 months ago · Qrew Captain
One observation on the below: you're querying for each record and then doing some kind of update. Across 15K records that is going to mean 30K API calls to QB. If possible, I would suggest 1) doing the search that the pipeline step performs in the record itself somehow, so the pipeline doesn't have to do it, and then 2) using a bulk process to handle the upload. Right now you're putting a lot of stress on QB through Pipelines.
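Back-of-the-envelope math on why batching matters, assuming a block size of 1,000 records per call (an assumed figure for illustration, not a documented Quickbase limit):

```python
import math

records = 15_000
batch_size = 1_000  # assumed block size per bulk call

# Per-record pipeline: one query + one update for every record.
per_record_calls = records * 2

# Bulk approach: one paged query plus one upsert per block.
bulk_calls = math.ceil(records / batch_size) * 2

print(per_record_calls, bulk_calls)  # 30000 vs 30
```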
------------------------------
Chayce Duncan
------------------------------