Forum Discussion
SystemAdmin4
Qrew Member
Yet again not the answer I was hoping for! But really appreciate the clarity thanks, Evan.
David
------------------------------
dmlaycock2000 dmlaycock2000
------------------------------
SystemAdmin4
4 years ago
Qrew Member
Thanks to both Evan and Mark for your various responses on this thread.
Given what I've now learned, it's clear that I need to transfer the data from my master to my slave app differently, so that the two apps run in different threads, for optimum performance.
I thought I'd share my plan here, in case there are any flaws which Mark, Evan and the community can point out for me, before I replace one problem with another!
To summarise my current situation, I have 4 master tables in the Master App, and 4 slave tables in the slave app.
The master app has lots of complex relationships (normal and reverse), summaries and lookups, and it's where we do our content management, so it's optimised for usability within the native QB UI. Lightning fast speed isn't a priority.
The slave app is currently refreshed from the master on a daily schedule (or upon manual trigger) by import API.
The slave is optimised for speed, has no relationships or formula fields, and holds only the essential data for its sole purpose: serving read-only content to our web app.
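For context, the refresh itself is just an upsert call against the slave tables. A rough sketch of the shape of it, using the Quickbase JSON REST API's records endpoint (the realm, token, table and field IDs below are all placeholders, not our real ones):

import requests

# Placeholders - substitute your own realm, user token and table/field IDs
REALM = "myrealm.quickbase.com"
USER_TOKEN = "b12345_xxxx_0_abcdefg"
SLAVE_TABLE_ID = "bqxxxxxxx"

HEADERS = {
    "QB-Realm-Hostname": REALM,
    "Authorization": f"QB-USER-TOKEN {USER_TOKEN}",
    "Content-Type": "application/json",
}

def refresh_slave(rows):
    """Upsert rows into the slave table, merging on field 6 (our key field)."""
    payload = {
        "to": SLAVE_TABLE_ID,
        # Each row maps field ID -> {"value": ...}
        "data": [{"6": {"value": r["key"]}, "7": {"value": r["content"]}} for r in rows],
        "mergeFieldId": 6,  # rows matching on this field update existing records
    }
    resp = requests.post("https://api.quickbase.com/v1/records",
                         headers=HEADERS, json=payload)
    resp.raise_for_status()
    return resp.json()["metadata"]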
Given that I don't want to mess with the slave tables, and they cannot be reconfigured as connected tables to use the sync functionality, my plan is now to use a two-step process as follows:
1) Create 4 new connected tables in the slave app, to act as an interim store of the data. These interim holding tables would be refreshed using the connected tables sync functionality.
2) Refresh my original real slave tables from the interim tables (both in the same app) using the good old import API.
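For step 2, my understanding is that a saved table-to-table import can be fired programmatically via the classic XML API's API_RunImport call. A sketch, assuming a saved import with ID 1 in the slave app (the dbid and token are placeholders):

import requests

# Placeholders - substitute your realm, the target table's dbid,
# your user token, and the ID of the saved table-to-table import
url = "https://myrealm.quickbase.com/db/bqxxxxxxx"
params = {
    "a": "API_RunImport",
    "id": "1",                        # ID of the saved table-to-table import
    "usertoken": "b12345_xxxx_0_abcdefg",
}
resp = requests.get(url, params=params)
resp.raise_for_status()
print(resp.text)  # XML response reports the import status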
One thing at the back of my mind that I'm not sure how I'll approach yet (beyond careful scheduling) is this:
I'd like to trigger the API Import automatically when the sync has fully completed.
And obviously this could involve anything from one to thousands of records changing, so I need to trigger it in a way that cannot result in a single sync firing the import multiple times.
I also wouldn't want the import to happen until the sync had finished.
Presumably I'll need to resort to Pipelines for this, and I'm thinking there will be no way to know the sync has finished, so it will just be a case of building a pause into the processing that is long enough to be sure it has.
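To make that concrete, my current thinking for the guard is something like the sketch below: poll the interim table's newest Date Modified, wait until it has gone quiet, and keep a high-water mark so one sync cycle can only ever fire one import. Field 2 is Quickbase's built-in Date Modified field; everything else here is a placeholder:

import time
import requests

QUERY_URL = "https://api.quickbase.com/v1/records/query"
# HEADERS and table IDs as in the earlier sketch (placeholders)

def latest_modified(headers, interim_table_id):
    """Return the newest Date Modified (field 2) in the interim table."""
    payload = {
        "from": interim_table_id,
        "select": [2],
        "sortBy": [{"fieldId": 2, "order": "DESC"}],
        "options": {"top": 1},
    }
    resp = requests.post(QUERY_URL, headers=headers, json=payload)
    resp.raise_for_status()
    data = resp.json()["data"]
    return data[0]["2"]["value"] if data else None

def run_import_once(headers, interim_table_id, last_imported_stamp, quiet_secs=300):
    """Fire the saved import only if the sync looks finished and is new."""
    stamp = latest_modified(headers, interim_table_id)
    if stamp is None or stamp == last_imported_stamp:
        return last_imported_stamp          # nothing new: don't re-trigger
    time.sleep(quiet_secs)                  # crude "sync has gone quiet" pause
    if latest_modified(headers, interim_table_id) != stamp:
        return last_imported_stamp          # still churning: try next cycle
    # ... call API_RunImport here (see the earlier sketch) ...
    return stamp                            # new high-water mark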
One final question for anyone who is familiar with Pipelines: presumably they run independently of QB db operations (except when they are running QB db operations themselves), so a pause in a running pipeline doesn't block the processing of normal queries against the db? Is my assumption here correct?
David
------------------------------
dmlaycock2000 dmlaycock2000
------------------------------
MarkShnier__You
4 years ago
Qrew Champion
Good plan. A couple of thoughts.
The refresh cycle is hourly and it will be at the same minute each hour. You don't get to choose the minute but the History shows the time of completion.
I don't know if it is possible that source table records will be deleted, but if so, you will need to consider how you will purge those records after you do the saved table-to-table copy. The saved table-to-table copy will copy across all records but will not delete old ones.
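One way to handle that purge, as a rough sketch with the JSON API (the table IDs and key field are placeholders): pull the keys from both tables, then delete the orphans from the copy.

import requests

QUERY_URL = "https://api.quickbase.com/v1/records/query"
DELETE_URL = "https://api.quickbase.com/v1/records"
# HEADERS as in the earlier sketches (placeholders)

def keys_in(headers, table_id, key_fid):
    """Collect the key field's values for every record in a table."""
    payload = {"from": table_id, "select": [key_fid]}
    resp = requests.post(QUERY_URL, headers=headers, json=payload)
    resp.raise_for_status()
    return {rec[str(key_fid)]["value"] for rec in resp.json()["data"]}

def purge_orphans(headers, interim_id, slave_id, key_fid=6):
    """Delete slave records whose key no longer exists in the interim table."""
    orphans = keys_in(headers, slave_id, key_fid) - keys_in(headers, interim_id, key_fid)
    if not orphans:
        return 0
    # Build an OR'd where clause matching each orphaned key
    where = "OR".join(f"{{{key_fid}.EX.'{k}'}}" for k in orphans)
    resp = requests.delete(DELETE_URL, headers=headers,
                           json={"from": slave_id, "where": where})
    resp.raise_for_status()
    return resp.json()["numberDeleted"]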
Pipelines can be scheduled to run hourly, and you do get to choose your minute, so a pipeline can run, say, every hour at 17 minutes past if you are seeing that the refresh completes at 15 minutes past.
You are correct that a dormant pipeline has absolutely no effect on the performance of an app. There is only load on the app when the pipeline is actually interacting with your app.
------------------------------
Mark Shnier (YQC)
Quick Base Solution Provider
Your Quick Base Coach
http://QuickBaseCoach.com
mark.shnier@gmail.com
------------------------------
SystemAdmin4
4 years ago
Qrew Member
Brill, thanks Mark.
D
------------------------------
dmlaycock2000 dmlaycock2000
------------------------------