Forum Discussion
What if you try adding a pipeline step that copies the value of the formula query into a scalar field, and then use that scalar field in the real pipeline? The copy could be done with a bulk upsert: prepare the bulk upsert, add the rows one by one, then commit within the pipeline. That way the formula queries only have to be calculated record by record, which may avoid the timeout.
You may be able to copy your existing pipeline and then use the new pipeline designer interface to insert steps before your main pipeline steps.
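For reference, here is a rough sketch of the same record-by-record copy done outside Pipelines with the Quickbase RESTful API (Python with the requests library), which mirrors the prepare/add rows/commit idea of the Bulk Record Sets steps. The realm, user token, table ID, and field IDs are placeholders, not values from this thread.

```python
import requests

REALM = "yourrealm.quickbase.com"      # placeholder realm hostname
USER_TOKEN = "b1234_xyz_placeholder"   # placeholder user token
TABLE_ID = "bqxxxxxxx"                 # placeholder table ID
FORMULA_FID = 50                       # placeholder: formula-query field to copy from
SCALAR_FID = 51                        # placeholder: plain text field to copy into

HEADERS = {
    "QB-Realm-Hostname": REALM,
    "Authorization": f"QB-USER-TOKEN {USER_TOKEN}",
    "Content-Type": "application/json",
}

# Pull each record's ID (field 3) plus its formula-query value.
query = requests.post(
    "https://api.quickbase.com/v1/records/query",
    headers=HEADERS,
    json={"from": TABLE_ID, "select": [3, FORMULA_FID]},
)
query.raise_for_status()

# Build one upsert row per record, copying the formula value into the scalar field.
rows = [
    {
        "3": {"value": rec["3"]["value"]},
        str(SCALAR_FID): {"value": rec[str(FORMULA_FID)]["value"]},
    }
    for rec in query.json()["data"]
]

# Commit the copy as a single bulk upsert.
upsert = requests.post(
    "https://api.quickbase.com/v1/records",
    headers=HEADERS,
    json={"to": TABLE_ID, "data": rows},
)
upsert.raise_for_status()
print(upsert.json().get("metadata", {}))
```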
------------------------------
Mark Shnier (Your Quickbase Coach)
mark.shnier@gmail.com
------------------------------
At face value you might try a Table to Table Import instead of processing it through a Pipeline, but your actual error points to a formula query execution. Are any of the fields you're moving between field sets formula queries? Have you fully optimized those, or is there a way to avoid formula queries altogether?
------------------------------
Chayce Duncan
------------------------------
AaronB · 2 years ago · Qrew Trainee
Chayce,
This table has a couple of fields with horrendous formulas. However, those fields are not used in this section of the pipeline. All fields involved here are straight text fields that were originally pulled in from a CSV import.
I'm willing to try a table-to-table import, but I'd need it to coincide with the rest of this pipeline, and this event only gets kicked off on demand (when a user hits a button). So I'm not sure exactly how to make that happen.
I'll read up on table-to-table imports to see if I can make it work.
Thanks, Aaron
------------------------------
Aaron B
ab1692@att.com
------------------------------
ChayceDuncan · 2 years ago · Qrew Captain
Sorry, I didn't see this. You can set up a Table to Table Import and test it manually to make sure it works, then have your pipeline make a Quickbase API call that runs API_RunImport to process it within the overall pipeline. That gives you a single point of failure, still runs on demand, and may help isolate which fields are causing the issue. It also means you're making one request to Quickbase instead of a request per record in a loop, which is more efficient overall since Quickbase is only doing one thing. A rough sketch of the API_RunImport call is below.
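If you want to test the call outside Pipelines first, a minimal sketch of API_RunImport (Quickbase's XML API) might look like the following; the realm, table ID, user token, and saved-import ID are placeholders you'd swap for your own. Inside the pipeline, you'd issue the same request from whatever API-call step you're using.

```python
import requests

REALM = "yourrealm.quickbase.com"     # placeholder realm hostname
TABLE_ID = "bqxxxxxxx"                # placeholder: destination table of the saved import
USER_TOKEN = "b1234_xyz_placeholder"  # placeholder user token
IMPORT_ID = "10"                      # placeholder: ID of the saved Table to Table Import

# API_RunImport executes a saved import in one request,
# rather than touching records one at a time in a pipeline loop.
body = f"""<qdbapi>
  <usertoken>{USER_TOKEN}</usertoken>
  <id>{IMPORT_ID}</id>
</qdbapi>"""

resp = requests.post(
    f"https://{REALM}/db/{TABLE_ID}",
    headers={
        "Content-Type": "application/xml",
        "QUICKBASE-ACTION": "API_RunImport",
    },
    data=body,
)
resp.raise_for_status()
print(resp.text)  # XML response with errcode/errtext and import results
```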
------------------------------
Chayce Duncan
------------------------------