Forum Discussion
MarkShnier__You
Qrew Legend
3 years ago
I do agree with Mike: if there is any way to leverage table-to-table imports, they are super fast.
------------------------------
Mark Shnier (Your Quickbase Coach)
mark.shnier@gmail.com
------------------------------
EdwardHefter
Qrew Cadet
3 years ago
I don't think I can do a table-to-table import because I need the 100 different serial numbers on the 150 components (well, 15,000 components, I suppose). But by using a temp table to hold just the 150 components from the larger Master Data table, the pipeline sped up a lot.
Since the pipeline may get triggered more than once during the long duration of a run, I made sure the temp table stores the PCBA ID as well as the modified timestamp of the record that triggered the whole thing. The pipeline still does a search on the temp table, but it looks for both the PCBA ID and the timestamp. That way, even if there are multiple triggering events during the first event's run, the multiple "instances" of the pipeline will get the right data from the temp table based on the timestamp. When it finishes, the pipeline deletes its rows from the temp table using the same PCBA ID and timestamp.
This is the biggest set of data manipulation I've done in Quickbase and it definitely made me think about multiple users and/or pipeline instances, giving a sense of "ready to do the next thing" to the user, and managing large sets of data!
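The temp-table pattern above can be modeled outside Quickbase. This is a minimal pure-Python sketch (not the Quickbase API; the table, field names, and values are all hypothetical) showing why keying every temp row on the (PCBA ID, trigger timestamp) pair lets overlapping pipeline instances fetch and delete only their own data:

```python
from datetime import datetime, timedelta

# Hypothetical in-memory stand-in for the temp table: each row carries the
# PCBA ID and the trigger timestamp so concurrent pipeline "instances"
# can tell their rows apart.
temp_table = []

def stage_components(pcba_id, trigger_time, components):
    """Copy the relevant components into the temp table, tagged with the
    (pcba_id, trigger_time) pair that identifies this pipeline run."""
    for c in components:
        temp_table.append({"pcba_id": pcba_id,
                           "triggered_at": trigger_time,
                           "component": c})

def fetch_for_run(pcba_id, trigger_time):
    """Search the temp table for exactly this run's rows."""
    return [r for r in temp_table
            if r["pcba_id"] == pcba_id and r["triggered_at"] == trigger_time]

def cleanup_run(pcba_id, trigger_time):
    """Delete only this run's rows, leaving other instances' data intact."""
    temp_table[:] = [r for r in temp_table
                     if not (r["pcba_id"] == pcba_id
                             and r["triggered_at"] == trigger_time)]

# Two overlapping runs for the same PCBA, distinguished only by timestamp:
t1 = datetime(2022, 1, 1, 9, 0)
t2 = t1 + timedelta(minutes=5)
stage_components("PCBA-001", t1, ["R1", "C3"])
stage_components("PCBA-001", t2, ["R1", "C3"])

assert len(fetch_for_run("PCBA-001", t1)) == 2
cleanup_run("PCBA-001", t1)
assert len(fetch_for_run("PCBA-001", t1)) == 0
assert len(fetch_for_run("PCBA-001", t2)) == 2  # second run untouched
```

Without the timestamp in the key, the cleanup step of the first run would delete the second run's staged rows out from under it.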
------------------------------
Edward Hefter
www.Sutubra.com
------------------------------
- EdwardHefter
Qrew Cadet
3 years ago
Does anyone know at what point tables start slowing down? If we put in 15K records at a shot, after only 65 sets (about a month) it is up to a million records, and then after a year it could be over 10 million records.
------------------------------
Edward Hefter
www.Sutubra.com
------------------------------
- MarkShnier__You
Qrew Legend
3 years ago
Well, before you worry about speed, you should worry about the actual size of the table. The maximum size of a table is 500 MB. Look at your record count and megabyte usage now, and then see how much the table can grow before you hit the 500 MB limit.
Go to the settings for the application, then App Management, and then Show App Statistics.
The performance question with large record counts is not black and white. It mainly depends on the degree to which you use summary fields that access these large tables and are used often.
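The growth projection is simple arithmetic once App Statistics gives you the current numbers. A sketch with assumed (hypothetical) figures for the current record count, current table size, and the 15K-records-per-run, 65-runs-per-month rate from earlier in the thread:

```python
# Rough capacity check against the 500 MB table limit.
# current_records and current_size_mb are assumptions here -- read the
# real values from App Statistics in your own app.
TABLE_LIMIT_MB = 500

current_records = 150_000        # assumed
current_size_mb = 12.5           # assumed
records_per_month = 15_000 * 65  # 15K records per run, ~65 runs a month

bytes_per_record = (current_size_mb * 1024 * 1024) / current_records
mb_per_month = records_per_month * bytes_per_record / (1024 * 1024)
months_left = (TABLE_LIMIT_MB - current_size_mb) / mb_per_month

print(f"~{bytes_per_record:.0f} bytes/record, "
      f"~{mb_per_month:.2f} MB/month, "
      f"~{months_left:.1f} months until the 500 MB limit")
```

At these assumed numbers the table would hit the cap in about six months, which is why an archiving or purge strategy matters more than raw query speed.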
------------------------------
Mark Shnier (Your Quickbase Coach)
mark.shnier@gmail.com
------------------------------
- MikeTamoush
Qrew Elite
3 years ago
Agree with Mark, but also note that there is a limit on records in a table. According to the statement below, the limit is on record IDs, so the way I understand it, even if you deleted all your records but your record ID was up to 4.29 billion, that would be it for the table. I'm not 100% sure on that, but that is how I read their limit explanation.
"The limit for maximum record ID in a table is 4,294,967,295. There are no hard limits on the number of rows or columns a table can have. However, a single table cannot exceed 500 MB. The amount of space each record uses depends on several factors, such as the number and types of fields in the table."
------------------------------
Mike Tamoush
------------------------------