What is the max number of records or file size for API_ImportFromCSV?

Question · Answered · Updated 1 year ago
I'd like to push 10-15k records into a Quickbase table using API_ImportFromCSV. Is there a limit to how many records we can push at once, or are there best practices for reducing the performance hit by breaking the post into smaller requests?
Karl

Posted 1 year ago
There is a limit based on the total size of the import data (number of records × record size). I wouldn't recommend sending large batches (especially if they're coming from a user's browser rather than a server-side process like Node) for several reasons:
1 - Imports are all or nothing. If there is an error in a single record you are importing (missing required field, duplicate on a unique field, etc.), the whole batch gets rejected, and the error information does not help isolate the problem.
2 - It can take a while to import 10K records. Large batches could affect other users of the db you are importing to; they will have to wait for the operation to complete.
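The batching advice above can be sketched in code. This is a minimal illustration, not an official client: the realm URL, table dbid, ticket, app token, and `clist` are placeholders you would fill in, and you should verify the XML payload shape against the Quickbase HTTP API docs for API_ImportFromCSV. The key idea is splitting the records into small batches so one bad row only rejects its own batch instead of all 10-15k records.

```python
# Hedged sketch: push records to API_ImportFromCSV in small batches.
# All connection details (realm, dbid, ticket, apptoken, clist) are
# placeholders; confirm the request format against Quickbase's API docs.
import csv
import io
import urllib.request


def chunk(records, size):
    """Yield successive batches of at most `size` records."""
    for i in range(0, len(records), size):
        yield records[i:i + size]


def records_to_csv(records):
    """Serialize a list of row tuples into a CSV string."""
    buf = io.StringIO()
    csv.writer(buf).writerows(records)
    return buf.getvalue()


def import_batch(realm, dbid, ticket, apptoken, clist, records):
    """POST one batch via API_ImportFromCSV. An HTTP error here affects
    only this batch, so it can be logged and retried on its own."""
    body = (
        "<qdbapi>"
        f"<ticket>{ticket}</ticket>"
        f"<apptoken>{apptoken}</apptoken>"
        f"<records_csv><![CDATA[{records_to_csv(records)}]]></records_csv>"
        f"<clist>{clist}</clist>"
        "</qdbapi>"
    ).encode("utf-8")
    req = urllib.request.Request(
        f"https://{realm}.quickbase.com/db/{dbid}",
        data=body,
        headers={
            "Content-Type": "application/xml",
            "QUICKBASE-ACTION": "API_ImportFromCSV",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8")


def import_all(realm, dbid, ticket, apptoken, clist, records, batch_size=500):
    """Push all records in batches; a single bad row rejects only its batch."""
    for batch in chunk(records, batch_size):
        import_batch(realm, dbid, ticket, apptoken, clist, batch)
```

A batch size of a few hundred records is a reasonable starting point; tune it after timing a few runs against your own app.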

Neil
Karl
Thanks for the reply. That's good input about a single failure failing them all. The request will come from a job set up on a server to pull and push records, syncing Quickbase with our data source. Do you know what the file size limit is? We currently push about 250 records a minute, so it's taking about an hour to push 10k records; I'm thinking we can improve on that as long as there isn't a large performance hit to the application.
-Karl
Karl,
I don't remember. I had entered a support case and they gave me the answer - I just looked for it and it has fallen off my list. If you enter a support case they can give you an exact number.
It's difficult to know the impact of the updates. It depends on how busy the app is when you do them. If the tables you are querying have a lot of formula fields and lookups, then the pull could have a negative effect as well as the push. Also (I believe this is still true), each request runs to completion (read and write), and other users are locked out of the files until it completes.
And finally - QB sometimes evaluates a query and determines (without attempting the query) that it will take too long to perform and returns a timeout error. You'll see subsecond responses from QB coming back as timeouts and wonder why...
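Since those pre-emptive timeout errors come back quickly, a sync job can simply retry them with backoff. Below is a hedged sketch: `do_import` is a placeholder for your actual API call, and which `errcode` values are worth retrying is something you must determine from your own responses (the set is passed in, not assumed).

```python
# Hedged sketch: retry a Quickbase XML API call when it returns an error
# code the caller has identified as a timeout. `do_import` is a placeholder
# for the real API_ImportFromCSV (or query) call.
import re
import time


def errcode(xml_response):
    """Extract the numeric <errcode> from a Quickbase XML response."""
    m = re.search(r"<errcode>(\d+)</errcode>", xml_response)
    return int(m.group(1)) if m else None


def call_with_retry(do_import, retryable_codes, max_attempts=3, base_delay=2.0):
    """Call `do_import`, retrying with exponential backoff whenever the
    response carries one of the caller-supplied retryable error codes."""
    response = do_import()
    for attempt in range(1, max_attempts):
        if errcode(response) not in retryable_codes:
            return response
        time.sleep(base_delay * (2 ** (attempt - 1)))  # exponential backoff
        response = do_import()
    return response
```

For a quick-but-fragile parse like the regex above, a real job might use `xml.etree.ElementTree` instead; the retry loop itself is the point.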
Hope this helps.
Good luck,
Neil
QuickBaseCoach App Dev./Training, Champion
The import will also depend on the number of relationships and the complexity of your app. The best way is just to try it and see how long it takes. Yes, your app will stop for other users while the import is happening.
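For the "try it and see how long it takes" approach, a small timing wrapper is enough to compare batch sizes. This is a trivial sketch; `fn` stands in for whatever import call you are measuring.

```python
# Minimal sketch: time a trial import run so different batch sizes
# can be compared. `fn` is a placeholder for your import function.
import time


def timed(fn, *args, **kwargs):
    """Run fn and return (result, elapsed_seconds)."""
    start = time.perf_counter()
    result = fn(*args, **kwargs)
    elapsed = time.perf_counter() - start
    return result, elapsed
```

Run it once per candidate batch size during off-hours and pick the largest batch that keeps per-request times (and other users' lockout windows) acceptable.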