Forum Discussion

EmberKrumwied's avatar
EmberKrumwied
Qrew Officer
2 days ago

Suggestions for archiving or closing records

I'm building an application to track Assets, their monthly rental rates, the rental charges for each asset per month, and monthly invoices for each client.

The application currently holds about 10 years' worth of data, and some of the reports and forms now have long load times because at least three dozen calculations are being performed across all the tables and forms.

My thought was that when a month, quarter, or year is closed, specific data would be "captured" and moved to another table. The "Archived" table would just hold the results of any calculations previously done, essentially creating a snapshot of that record in another table. The Archived table could then be used for other reports but wouldn't be bogged down with having to redo the original calculations; their results would just be summarized.
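As a rough sketch of that snapshot idea: with Quickbase's RESTful JSON API you could read the calculated values from the live table and write them as plain stored values into the Archived table. Everything below is an assumption for illustration only — the table IDs, field IDs, and period field are made-up placeholders, and this just outlines the payload shapes, not a finished solution.

```python
# Sketch: build the JSON bodies for snapshotting calculated results
# into an "Archived" table via Quickbase's RESTful API.
# All table IDs and field IDs below are hypothetical placeholders.

def build_query_payload(live_table_id, field_ids, period):
    """Body for POST /v1/records/query: pull the calculated values
    for one closed period from the live table. Assumes a hypothetical
    fid 8 holds the period label."""
    return {
        "from": live_table_id,
        "select": field_ids,  # e.g. record ID + calculated fields
        "where": "{" + f"8.EX.'{period}'" + "}",
    }

def build_snapshot_payload(archive_table_id, rows, field_map):
    """Body for POST /v1/records: write each calculated result as a
    plain stored value in the Archived table. field_map maps live-table
    field IDs to the corresponding Archived-table field IDs."""
    data = []
    for row in rows:
        data.append({
            str(field_map[fid]): {"value": row[fid]}
            for fid in field_map
        })
    return {"to": archive_table_id, "data": data}

# Example with fake IDs and one row of already-calculated results:
rows = [{6: "Asset-001", 7: 1250.00}]
payload = build_snapshot_payload("bxyz123", rows, {6: 6, 7: 9})
```

The point is that the Archived table stores the numbers themselves, so reports against it never re-run the formulas.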

Once this data was captured, the original record would either be closed for editing somehow or get an archived indicator, so it could be excluded from the current month's/quarter's/year's entries.
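If the archived-indicator route is taken, the exclusion then becomes just one more filter condition on every query or report. A sketch of what a query body might look like, assuming a hypothetical checkbox field (fid 10) is the archive flag — field IDs and the table ID are placeholders:

```python
# Sketch: query body that excludes archived records, assuming a
# hypothetical checkbox field (fid 10) marks a record as archived.

def build_active_query(table_id, select_fids, archived_fid=10):
    """Body for POST /v1/records/query against Quickbase's JSON API.
    XEX means "is not equal to" in Quickbase's query language, so this
    keeps only records whose archive checkbox is not checked."""
    return {
        "from": table_id,
        "select": select_fids,
        "where": "{" + f"{archived_fid}.XEX.'1'" + "}",
    }

payload = build_active_query("bliv789", [3, 6, 7])
```

The same condition would be the first filter on any "current period" report so archived records drop out immediately.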

I'm just seeking the best way to optimize the application, as the amount of data will only continue to grow and the lag when accessing the system will grow with it.

3 Replies

  • Denin's avatar
    Denin
    Qrew Cadet

    If I were you, I would just make a copy of the app without the data. Then rename the one with the data to include (Archive 2015-2025).


    Then if you want to optimize the archive, there are plenty of ways described in Quickbase University, like removing unused reports, making reports filter out the most records first, and other strategies. There's no single approach, just a lot of options depending on your needs. Something more complex, like creating new snapshot records with Pipelines, could also work, but you may not want to run Pipelines over so many old records (not sure how many records you have or what your specific requirements are).

  • Archiving and summarizing can be a lot of development work for active buildings which have a long history.

    One thought is that an asset which is no longer active is much easier to archive off than an active asset.

    Can I ask for the ballpark record counts in some of your larger tables, and what percent full those tables are?

    I'm asking because if you still have many years left before you are forced to archive due to the 500 MB table limit, then maybe the initial effort should instead go into optimizing the runtime of the reports that are run most frequently.

    Do you have the filters on these reports set in the proper sequence so that you filter out the most records possible with the initial filters? Do you have columns on the report which are not really necessary, especially columns involving summary fields?

    Do you have table home pages which are needlessly causing the system to sort hundreds of thousands of records every time a user goes there?

    Another thought I have is to copy the app (assuming it's all one big app) and arbitrarily delete half the data, to satisfy yourself that if you did all that archiving work, the app actually would run much faster. You wouldn't want to go through a whole lot of archiving development work only to find out a screen loads maybe 25% faster, saving your users maybe 1 second on some reports.

    Another thought is that some of the processing gets done locally on your computer. Sometimes what I have done is go into an Apple Store or a Microsoft store and try the application's speed on the latest and greatest M3 chip. That would help you understand what part of the bottleneck is the local browser on your own computer rendering the data that is fed down the line.

    Another test is whether the app feels slow only during the most heavily used times of the day, for example at 11 AM or 1:30 PM, versus how it runs at 8:30 PM. That's also a clue.

    So I think you get where I'm coming from: until you are forced into archiving by Quickbase's table size limits, I would first try to optimize dashboards and reports so that you are not needlessly wasting the single-threaded server that is allocated to your application.

    I don't have links handy right now, but there are definitely blogs that have been written over the years about how to optimize performance, and QuickBase Support can probably point you at resources of things to review.


  • You can write a Pipeline that duplicates the data into a new table and then deletes the old records. You really need to think through all your reporting scenarios to make sure that you are covered.

    Another answer would be to push all of this data, including the relationships, out to a real SQL database. There you can query it for unique purposes, push it to an AI model, and so on.
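To illustrate why that SQL-offload idea helps with the original lag problem: once the historical rows live in a real database as stored values, a summary is a single query instead of dozens of in-app calculations. A minimal sketch using Python's built-in sqlite3 module, with a made-up schema for archived rental charges:

```python
import sqlite3

# Hypothetical, minimal schema for archived rental charges.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE archived_charges (
        asset_id TEXT,
        period   TEXT,   -- e.g. '2020-07'
        charge   REAL
    )
""")
conn.executemany(
    "INSERT INTO archived_charges VALUES (?, ?, ?)",
    [("A-001", "2020-07", 1250.0),
     ("A-002", "2020-07", 900.0),
     ("A-001", "2020-08", 1250.0)],
)

# One aggregate query replaces what used to be per-record calculations:
total_2020_07 = conn.execute(
    "SELECT SUM(charge) FROM archived_charges WHERE period = '2020-07'"
).fetchone()[0]
print(total_2020_07)  # 2150.0
```

Any real implementation would of course use your actual tables and a production database, but the shape of the win is the same.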