MartinSuske1
4 years ago · Qrew Member
Questions on data import
Hello Quickbase Community,
I'm looking for some experienced opinions on how to approach a current requirement.
I basically need to upsert 25,000 data rows each day.
I have little to no control over the output format of the source systems, so my main concern is:
Will I be able to perform the necessary transformations within Quickbase, or should I rely on an additional tool sitting between the source systems and Quickbase that transforms the data into formats Quickbase can handle?
I have already worked with connected tables and found them quite smart for the task.
But as far as I can tell, they only process comma-separated files whose headers match the Quickbase table fields.
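Just to make concrete what kind of transformation I mean: something like the sketch below would have to run before a connected table could pick up my source files. It is only a hypothetical Python preprocessing step; the file names, the semicolon delimiter and the column mapping are assumptions about my source data, not anything Quickbase provides.

```python
import csv

# Hypothetical mapping from source-system column names to Quickbase field names.
COLUMN_MAP = {
    "Auftragsnr": "Order Number",
    "Kunde": "Customer",
    "Menge": "Quantity",
}

def convert(source_path: str, target_path: str) -> None:
    """Read a semicolon-delimited export and write a comma-separated CSV
    whose headers match the Quickbase table fields."""
    with open(source_path, newline="", encoding="utf-8") as src, \
         open(target_path, "w", newline="", encoding="utf-8") as dst:
        reader = csv.DictReader(src, delimiter=";")
        writer = csv.DictWriter(dst, fieldnames=list(COLUMN_MAP.values()))
        writer.writeheader()
        for row in reader:
            # Keep only the mapped columns and rename them for Quickbase.
            writer.writerow({COLUMN_MAP[k]: v for k, v in row.items() if k in COLUMN_MAP})

convert("source_export.csv", "for_quickbase.csv")
```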
Then I tested the Pipelines CSV Handler with 25 rows of test data and had to wait almost five minutes for them to load. That doesn't seem to be the way to go with 25,000 rows, although I could flexibly change the separator and map source fields to target fields.
I also had a look at the "Bulk Record Set - Import with CSV" pipeline, which seems suited for large data loads, but I read about a limit of 10,000 rows per pipeline execution, which would mean further preparation of my source data as well.
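To illustrate what I mean by further preparation: each daily file would apparently have to be split into chunks before the pipeline runs. This is just a rough sketch under that assumption; the 10,000-row figure comes from what I read about the limit, and the file names are made up.

```python
import csv

CHUNK_SIZE = 10_000  # rows per pipeline execution, based on the limit I read about

def split_csv(source_path: str, prefix: str) -> None:
    """Split a large CSV into chunk files of at most CHUNK_SIZE data rows,
    repeating the header row in every chunk."""
    with open(source_path, newline="", encoding="utf-8") as src:
        reader = csv.reader(src)
        header = next(reader)
        chunk, part = [], 1
        for row in reader:
            chunk.append(row)
            if len(chunk) == CHUNK_SIZE:
                _write_chunk(f"{prefix}_{part}.csv", header, chunk)
                chunk, part = [], part + 1
        if chunk:
            _write_chunk(f"{prefix}_{part}.csv", header, chunk)

def _write_chunk(path: str, header: list, rows: list) -> None:
    with open(path, "w", newline="", encoding="utf-8") as dst:
        writer = csv.writer(dst)
        writer.writerow(header)
        writer.writerows(rows)

split_csv("daily_upsert.csv", "daily_upsert_part")
```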
My question is:
Is there a way to handle data transformation and mass data imports within Quickbase itself, or should I definitely use an ETL framework upfront?
Any input is highly appreciated.
Thanks and best regards,
Martin
------------------------------
Martin Suske
------------------------------