Forum Discussion
EvanMartinez
5 years ago
Moderator
Hi Mike,
I just wanted to chime in and share a few resources we have that cover best practices around sharing data across apps. They go into the various options and their pros and cons, minus Pipelines, which came out after these guides were written.
Sharing Data Across Quick Base Apps: Part 1 Cross App Relationships, Table to Table Imports, and Sync
Sharing Data Across Quick Base Apps: Part 2 Quick Base Actions, Webhooks, Automations, and 3rd Party Integrations.
Quick Base Memory Calculations
How we built Quick Base to scale and you can too!
As the other commenters have mentioned, as an app grows in complexity and usage there is an increased chance of it needing more power and therefore running into performance issues. Using methods like Sync, Automations, or Pipelines can help you avoid having apps directly connected while still being able to pass data back and forth, and where possible that is a good best practice. In cases where I want to connect apps using these methods and still make it easy for users to move back and forth, I have added buttons to the app dashboards that direct them to the right app/table to submit a form, and buttons on the form to help them move back if needed. This way you can have a unified-feeling experience but still have ways to move data and users around. I hope those resources are helpful.
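If it helps to see the button idea concretely, those buttons are really just links to URLs in the other app. A rough sketch of the URL pattern in Python (the realm and dbid values here are made-up placeholders):

# Sketch: cross-app navigation URLs of the kind a dashboard or form
# button would point at. Realm and dbids below are made-up placeholders.
REALM = "https://yourcompany.quickbase.com"
FORM_TABLE_DBID = "bqabcdefg"   # table in the other app that holds the form
HOME_APP_DBID = "bqhijklmn"     # app the user came from

# "?a=nwr" opens the add-record form for a table in the Quick Base UI.
add_record_url = f"{REALM}/db/{FORM_TABLE_DBID}?a=nwr"

# Linking to an app dbid lands the user on that app's dashboard.
back_home_url = f"{REALM}/db/{HOME_APP_DBID}"

print(add_record_url)
print(back_home_url)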
------------------------------
Evan Martinez
Community Marketing Manager
Quick Base
------------------------------
SystemAdmin4
5 years ago
Qrew Member
Hi Evan
These resources are excellent - thanks for sharing.
Could I ask you to clarify a point you make in the first document referencing the table-to-table imports, specifically:
"TTIs create dependencies between applications when executed. Executing TTIs result in the share of system resources."
Is this basically saying that the dependency and sharing of resources only occur for the duration of the import (i.e. while it is actually executing)?
So in my use case, I refresh my slave tables overnight when there is almost no usage on either my master or slave app, so the shared resources and any performance issue on either app would not be an issue.
Sorry if it seems like I'm basically asking the same question repeatedly, but I don't want to go rebuilding the way our apps work if I've simply misunderstood the nature of the dependency.
Thanks
David
------------------------------
dmlaycock2000 dmlaycock2000
------------------------------
EvanMartinez
5 years ago
Moderator
Hi David,
Not a problem at all; it isn't always clear how TTIs work. That is correct: when a table-to-table import takes place, the two apps involved get pulled together onto one server in order to exchange that data. They will then stay there until the next time they would move around between servers, which isn't always on a fixed interval. If there is a saved table-to-table import, they aren't able to separate from each other, as the system respects that connection. One way around this that I have used, which is a bit of a workaround, is to have a scheduled Sync that syncs the data into a hidden sync table in the second app. Then I use Automations or a Pipeline, set to go off whenever records in the hidden sync table are added or modified, and have it send all that new data over to my normal manual-entry tables. This way I get the combo of a Sync table, which doesn't cause the apps to be connected, and the added bonus of all the data being put into my usual workflow where users can modify and act on it as needed.
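If anyone would rather script that copy step than use an Automation or Pipeline, a rough sketch against the RESTful JSON API would look something like this (all dbids, field ids, and the user token below are made-up placeholders):

import requests

# Sketch: copy today's new/changed rows from a hidden sync table into a
# manual-entry table via the Quick Base RESTful JSON API.
# All ids and the token below are made-up placeholders.
REALM = "yourcompany.quickbase.com"
USER_TOKEN = "b1234_abcd_xxxxxxxxxxxx"
SYNC_TABLE = "bqsyncxxx"     # hidden sync table dbid
TARGET_TABLE = "bqmainxxx"   # manual-entry table dbid

HEADERS = {
    "QB-Realm-Hostname": REALM,
    "Authorization": f"QB-USER-TOKEN {USER_TOKEN}",
    "Content-Type": "application/json",
}

# 1) Pull records modified on or after today from the sync table.
#    Field 2 is the built-in Date Modified field; 6 and 7 stand in for
#    whatever data fields you actually sync.
query = {
    "from": SYNC_TABLE,
    "select": [6, 7],
    "where": "{2.OAF.'today'}",
}
resp = requests.post("https://api.quickbase.com/v1/records/query",
                     json=query, headers=HEADERS)
resp.raise_for_status()
rows = resp.json()["data"]   # each row: {"6": {"value": ...}, ...}

# 2) Upsert into the target table, mapping sync fields to target fields
#    (8 and 9 are again made-up field ids).
payload = {
    "to": TARGET_TABLE,
    "data": [{"8": {"value": r["6"]["value"]},
              "9": {"value": r["7"]["value"]}} for r in rows],
}
resp = requests.post("https://api.quickbase.com/v1/records",
                     json=payload, headers=HEADERS)
resp.raise_for_status()
print(resp.json()["metadata"])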
------------------------------
Evan Martinez
Community Marketing Manager
Quick Base
------------------------------
SystemAdmin4
5 years ago
Qrew Member
Yet again, not the answer I was hoping for! But I really appreciate the clarity, thanks Evan.
David
------------------------------
dmlaycock2000 dmlaycock2000
------------------------------
SystemAdmin4
5 years ago
Qrew Member
Thanks to both Evan and Mark for your various responses on this thread.
Given what I've now learned, it's clear that I need to transfer the data from my master app to my slave app differently, so the two apps can run on separate servers, for optimum performance.
I thought I'd share my plan here, in case there are any flaws which Mark, Evan and the community can point out for me, before I replace one problem with another!
To summarise my current situation, I have 4 master tables in the Master App, and 4 slave tables in the slave app.
The master app has lots of complex relationships (normal and reverse), summaries and lookups, and it's where we do our content management, so it's optimised for usability within the native QB UI. Lightning fast speed isn't a priority.
The slave app is currently refreshed from the master on a daily schedule (or upon manual trigger) via the import API.
The slave is optimised for speed, has no relationships or formula fields, and only essential data for its sole purpose which is serving read only content to our web app.
Given that I don't want to mess with the slave tables, and they cannot be reconfigured as connected tables to use the sync functionality, my plan is now to use a two-step process as follows:
1) Create 4 new connected tables in the slave app, to act as an interim store of the data. These interim holding tables would be refreshed using the connected tables sync functionality.
2) Refresh my original real slave tables from the interim tables (both in the same app) using the good old import API.
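For step 2, I'm expecting to kick off the saved import programmatically; as I understand it, the legacy XML API's API_RunImport call does this, roughly like so (the dbid, import id, and token are made-up placeholders):

import requests

# Sketch: run a saved (table-to-table) import via the legacy XML API's
# API_RunImport call. The dbid, import id, and token are placeholders.
REALM = "https://yourcompany.quickbase.com"
TABLE_DBID = "bqslavexx"   # table the saved import writes into
USER_TOKEN = "b1234_abcd_xxxxxxxxxxxx"

resp = requests.get(
    f"{REALM}/db/{TABLE_DBID}",
    params={
        "a": "API_RunImport",
        "id": 1,                # the saved import's numeric id
        "usertoken": USER_TOKEN,
    },
)
resp.raise_for_status()
print(resp.text)   # XML response reporting how many records were imported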
One thing at the back of my mind that I'm not sure how I will approach yet (beyond careful scheduling) is...
I'd like to trigger the API Import automatically when the sync has fully completed.
And obviously this could involve anything from one to thousands of records changing, so I need to trigger this in a way which cannot result in a single sync run firing the API import multiple times.
I also wouldn't want the import to happen until the sync had finished.
Presumably I'll need to resort to Pipelines for this, and I'm thinking there will be no way to know the sync has finished, so it will just be a case of building a pause into the processing that is long enough to be sure the sync must have completed.
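One idea I'm toying with, rather than a blind pause: poll the sync table from a script until it stops changing, then fire the import. A rough sketch (ids and token again made up):

import time
import requests

# Sketch: wait until the sync table looks "quiet" (record count stops
# changing between checks), then trigger the saved import.
# Ids and token are made-up placeholders. Note this only detects added
# records; polling the max Date Modified would also catch in-place edits.
REALM = "yourcompany.quickbase.com"
USER_TOKEN = "b1234_abcd_xxxxxxxxxxxx"
SYNC_TABLE = "bqsyncxxx"

HEADERS = {
    "QB-Realm-Hostname": REALM,
    "Authorization": f"QB-USER-TOKEN {USER_TOKEN}",
    "Content-Type": "application/json",
}

def record_count(dbid: str) -> int:
    """Total record count, read from the query response metadata."""
    body = {"from": dbid, "select": [3], "options": {"top": 1}}
    resp = requests.post("https://api.quickbase.com/v1/records/query",
                         json=body, headers=HEADERS)
    resp.raise_for_status()
    return resp.json()["metadata"]["totalRecords"]

previous = -1
while True:
    current = record_count(SYNC_TABLE)
    if current == previous:
        break              # two identical counts in a row: assume finished
    previous = current
    time.sleep(60)         # check once a minute

# ...then run the saved import, e.g. with API_RunImport as sketched above.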
One final question for anyone who is familiar with Pipelines: presumably they run independently of QB database operations (except when they are running QB database operations themselves), so a pause in a running pipeline doesn't block the processing of normal queries against the database? Is my assumption here correct?
David
------------------------------
dmlaycock2000 dmlaycock2000
------------------------------