Forum Discussion

AaronB's avatar
AaronB
Qrew Trainee
8 months ago

Pipeline error: Formula Query Execution took too long

I found some posts similar to this regarding reports, but none for Pipelines, so here is my situation.  I've also run the Performance Analyzer, Performance Insights, and Performance Optimizer, but none have provided any advice other than "optimize your query".

When running pipelines, I get 3 outcomes: success without issue, success with Heavy Traffic notifications, or failure due to the Formula Query Execution taking too long.

This happens on the exact same table, using the exact same pipeline (no differences between runs).

Success without issue happens on the weekends or middle of the night.

Heavy Traffic notifications seem to be hit or miss during the day on weekdays.

Failure due to Formula Query Execution happens on weekdays during typical office hours.

Based on these scenarios, I am guessing it is a Quickbase "resource availability" issue?

The table is ~ 4000 records with 57 fields. 

Below is the Pipeline activity output for one of the failures.  Sometimes it makes it through nearly all records and then dies.  Other times it dies after the first few hundred.  This particular pipeline exports 20 fields but it also dies on similar pipelines for this table that export only 5 fields. 

The pipeline grabs all records in the table and shifts data among the 20 fields exported as part of this pipeline (data in fields 11-15 move to 16-20, fields 6-10 move to 11-15, fields 1-5 move to 6-10).
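For clarity, the rolling shift described above can be sketched as a pure function. This is just an illustration of the mapping, not Quickbase code; the integer keys stand in for the real field IDs, which are not given in the post:

```python
# Sketch of the rolling field shift described above:
# fields 11-15 -> 16-20, fields 6-10 -> 11-15, fields 1-5 -> 6-10.
# Integer keys are illustrative stand-ins for actual Quickbase field IDs.

def shift_fields(record: dict) -> dict:
    """Return a copy of the record with the three field groups rolled forward."""
    shifted = dict(record)
    for src_start, dst_start in [(11, 16), (6, 11), (1, 6)]:
        for offset in range(5):
            # Always read from the original record, so the order of the
            # three group moves cannot clobber a value before it is copied.
            shifted[dst_start + offset] = record[src_start + offset]
    return shifted

record = {i: f"v{i}" for i in range(1, 21)}
new_record = shift_fields(record)
# new_record[16] now holds what was in field 11, and so on.
```

Because every destination is read from the untouched original, the whole shift is safe to do in a single update per record rather than three.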



------------------------------
Aaron B
ab1692@att.com
------------------------------

12 Replies

  • What if you try a Pipeline step to copy the value of the formula query into a scalar field?  Then use the scalar field in the real Pipeline.  The copy could be done by making a bulk upsert and adding the bulk upsert rows one by one, then committing the upsert.  Maybe that way it only has to calculate the formula queries record by record and it won't time out.

    You may be able to copy your existing pipeline and then use the new pipeline designer interface to insert steps before your main pipeline steps.



    ------------------------------
    Mark Shnier (Your Quickbase Coach)
    mark.shnier@gmail.com
    ------------------------------
    • ChayceDuncan's avatar
      ChayceDuncan
      Qrew Captain

      At face value you might try a Table-to-Table Import instead of processing it through a Pipeline, but your actual error relates to a formula query execution. Are one or several of the fields that you're moving between field sets formula queries? Have you optimized those fully, or is there a way to not use formula queries?



      ------------------------------
      Chayce Duncan
      ------------------------------
      • AaronB's avatar
        AaronB
        Qrew Trainee

        Chayce,

        This table has a couple of fields with horrendous formulas.  However, those fields are not used in this section of the pipeline.  All fields involved here are straight text fields that were originally pulled from the import of a CSV file.

        I'm willing to try a table-to-table import, but I would need that to coincide with the rest of this pipeline, and this event gets kicked off only on demand (when a user hits a button).  So I am not sure exactly how to make that happen.

        I'll read up on table-to-table imports to see if I can make it work.

        Thanks....Aaron



        ------------------------------
        Aaron B
        ab1692@att.com
        ------------------------------
    • AaronB's avatar
      AaronB
      Qrew Trainee

      Mark,

      I am not familiar with 'scalar fields' but will look into that subject and see if I can make it work for this situation.

      Thanks for the suggestion....Aaron



      ------------------------------
      Aaron B
      ab1692@att.com
      ------------------------------
      • MarkShnier__You's avatar
        MarkShnier__You
        Qrew Legend

        Believe me, I'm not a programmer.  But the Quickbase staffers call a data entry field a "scalar" field, meaning a field that you just type into, as opposed to a formula field or lookup field.

        Also, a Pipeline step to run a saved table-to-table import is just a Make Request step on the Quickbase channel, like this:

        Pipeline API_RunImport import example; the URL is just
        https://mycompany.quickbase.com/db/xxxxxxxx?a=API_RunImport&id=11
        Method = POST

        Nothing else is required.
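        If it helps to see the same call assembled outside a Pipeline, here is a minimal sketch that builds the API_RunImport URL from its parts.  The realm (`mycompany`), app DBID (`xxxxxxxx`), and import ID (`11`) are the placeholders from the example above, not real values:

        ```python
        from urllib.parse import urlencode, urlunsplit

        def run_import_url(realm: str, dbid: str, import_id: int) -> str:
            """Build the API_RunImport URL used by the Make Request step."""
            query = urlencode({"a": "API_RunImport", "id": import_id})
            return urlunsplit(("https", f"{realm}.quickbase.com", f"/db/{dbid}", query, ""))

        url = run_import_url("mycompany", "xxxxxxxx", 11)
        # -> https://mycompany.quickbase.com/db/xxxxxxxx?a=API_RunImport&id=11
        ```

        An actual request would also need authentication (e.g. a user token header or ticket), which the Make Request step handles for you via the channel connection.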




        ------------------------------
        Mark Shnier (Your Quickbase Coach)
        mark.shnier@gmail.com
        ------------------------------

  • You are using a for-each to loop through the 4k records and make changes to each of the 20 fields in each iteration?



    ------------------------------
    Marcelo Benavides
    ------------------------------
  • Are you using a do-loop to loop through the 4k records and make changes to each of the 20 fields in each iteration?



    ------------------------------
    Marcelo Benavides
    ------------------------------
    • ChayceDuncan's avatar
      ChayceDuncan
      Qrew Captain

      If possible, instead of doing any kind of loop, you should just make a table-to-table import to move your fields between your different subsets and have a pipeline run the import. It'll be more efficient that way.



      ------------------------------
      Chayce Duncan
      ------------------------------
    • AaronB's avatar
      AaronB
      Qrew Trainee

      Marcelo,

      Basically...yes.

      Step A:  Pipeline triggered

      Step B:  Search for all the records in the table.  Query: {3.GT.0}

      For each record:

        Step C:  Update Record (values in fields 11-15 moved to fields 16-20)

        Step D:  Update Record (values in fields 6-10 moved to fields 11-15)

        Step E:  Update Record (values in fields 1-5 moved to fields 6-10)

      End for each.
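      One way to shrink the per-record load in that loop is to collapse steps C, D, and E into a single update per record, or better, into one bulk upsert for all records (as Mark suggested above).  A minimal sketch of building such an upsert payload, assuming the Quickbase RESTful `POST /v1/records` upsert shape; the field IDs and the `rid` key are illustrative, not the actual schema:

      ```python
      # Sketch: build one bulk-upsert payload covering steps C, D, and E,
      # so each record gets one write instead of three.
      # "3" stands in for Record ID# (the merge key); "1"-"20" are
      # illustrative field IDs for the shifted groups.

      def build_upsert(records: list) -> list:
          rows = []
          for rec in records:
              row = {"3": {"value": rec["rid"]}}  # merge key: Record ID#
              for src, dst in [(11, 16), (6, 11), (1, 6)]:
                  for off in range(5):
                      row[str(dst + off)] = {"value": rec[str(src + off)]}
              rows.append(row)
          return rows

      # The full request body would then be something like
      # {"to": "<table DBID>", "data": build_upsert(records)}.
      ```

      Doing the whole shift as one upsert also sidesteps the per-iteration step runs that rack up during heavy-traffic hours.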



      ------------------------------
      Aaron B
      ab1692@att.com
      ------------------------------
  • Hi Aaron, I apologize, I was not clear. My question is: are you using a do-loop (Pipeline function) to loop through the 4k records and make changes to each of the 20 fields in each iteration?

    Thank you



    ------------------------------
    Marcelo Benavides
    ------------------------------