Please, how can I optimize my Quickbase app performance? What are some common mistakes to avoid when using Quickbase?
- David Mensah
Similar to Mike's comments, my general suggestions would include:
Don't use cross-app relationships or table-to-table imports between apps. Use Pipelines or sync tables instead.
Optimize your main tables by making fields non-reportable and non-searchable to minimize their potential impact.
There is no documented best practice that will always work, but always try to optimize formulas and reports to do the least amount of work. In reports, that means ordering your filters efficiently: eliminate the most data with your simplest filters first, then get more complex. In formulas, use If() statements or other logic so you only evaluate or calculate things when you have to, and nest variables where possible so that QB evaluates each sub-expression only once.
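As an illustration of the variable-nesting point, here is a minimal Quickbase formula sketch (field names like [Status], [Rate], and [Hours] are hypothetical; your own fields will differ):

```
// Declare each sub-expression once as a formula variable, then reuse it,
// with the cheap check up front deciding how much further work is needed.
var bool isClosed = [Status] = "Closed";
var number base   = [Rate] * [Hours];

If($isClosed,
   $base,                                 // cheap path: no extra math
   $base * (1 + [Overtime Multiplier]))   // only the open path adds work
```

The same idea applies to report filters: put the condition that discards the most records first, so later, more expensive conditions run against a smaller set.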
When using automation tools like Pipelines, do bulk uploads wherever you can. Avoid individual edits/changes unless the workflow truly requires them.
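For context on why bulk beats row-by-row: Quickbase's RESTful API accepts many records in one request. Below is a hedged Python sketch that only builds the JSON body for the bulk-records endpoint (`POST https://api.quickbase.com/v1/records`); the table ID, field IDs, and values are hypothetical, and the actual HTTP call (with your realm and token headers) is left out.

```python
import json

def build_bulk_upsert_payload(table_id, records, key_fid=None):
    """Build the JSON body for Quickbase's bulk-records endpoint.

    `records` is a list of {field_id: value} dicts; Quickbase expects each
    value wrapped as {"value": ...}. `key_fid` optionally names the merge
    field so existing records are updated instead of duplicated.
    """
    body = {
        "to": table_id,
        "data": [
            {str(fid): {"value": val} for fid, val in rec.items()}
            for rec in records
        ],
    }
    if key_fid is not None:
        body["mergeFieldId"] = key_fid
    return body

# One request carrying many rows, instead of one Pipelines edit per record.
payload = build_bulk_upsert_payload(
    "bxyz123ab",                                    # hypothetical table ID
    [{6: "Widget A", 7: 12}, {6: "Widget B", 7: 7}],  # hypothetical fields
    key_fid=6,
)
print(json.dumps(payload, indent=2))
```

In Pipelines terms, this is the difference between a single bulk upsert step and a loop that edits records one at a time.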
Don't try to shove everything into one table or app. Putting too much in one application or one table risks over-taxing it and creates potential conflicts and security issues down the line. Always keep everything as simple as possible.
For data contained in formulas or lookups that stays static after a period of time, always try to 'freeze it' if possible. If you have complicated calcs that leverage several tables or summary/lookup data, but the result won't change after a project is closed, for example, then freeze that value into storage field(s) and, instead of doing the calc in your formula, just read the storage field. This helps because once you've stored the value and pointed your formula at it, QB no longer has to do the work for records that will never change.
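One way that freezing pattern might look, as a sketch with hypothetical field names ([Frozen Total] is a plain numeric field that a Pipeline fills in when the project closes):

```
// While the record is open, do the live math; once the Pipeline has
// written [Frozen Total], read the stored copy instead.
If(not IsNull([Frozen Total]),
   [Frozen Total],                        // closed: cheap stored read
   [Labor Summary] + [Materials Lookup])  // open: live calculation
```

Note this assumes the formula engine can skip the expensive branch; if in practice both branches still get evaluated, the safest version of the pattern is to drop the calculation from the formula entirely and point reports straight at the stored field.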
"For data contained in formulas or lookups that stays static after a period of time, always try and 'freeze it' if possible. So if you have complicated calcs that leverage several tables or summary/lookup data - but it won't change after a project is closed for example - then freeze that value into storage field(s) and instead of doing the calc in your formula just read the storage field. This helps that if you've stored the field and point your formula to it - then QB no longer has to do the work for these records that will never change. "
I wonder if I understood this tip correctly and whether it really helps.
I have a field where I query a table with thousands of records. The formula worked fine until the table it queries reached 25 thousand records. When I wanted to make changes to the formula, I wasn't able to save it, and I got this error.
So, I tried to optimize it as in your tip number 6. The query results always stay the same, so I created a new field and used a Pipeline to copy the results of the query field over into that copy field.
Then I changed the formula in the query field to run the query inside an If() statement, like below.
My theory was that since most of the time there would be data in the copy field, it would not need to perform the query for those records. But that didn't fix my error. And when I tested the formula field with 'Formula Checker' on a record that had data in the copy field, the results showed that "[Copy Field] <> null" evaluated to true and it pulled the data from [Copy Field], but after that it also ran the whole long query enclosed in the false branch of the If() statement.
Then after a while my Pipeline stopped working, because it wasn't able to query the table while the table had this heavy field.
Is what I did what you suggested in number 6, or am I misunderstanding your idea?