Large file importing/CSV syncing error.

  • Question
  • Updated 2 years ago
  • Answered
I have a spreadsheet of 600K records with 21 fields each. There are some special characters (&%#/ etc.) in 2 of the fields.

1. I tried to sync the CSV file (around 110 MB, stored on Google Drive), but it always stopped randomly after around 20 minutes.

2. I also tried to import the original Excel file (around 40 MB), and this error popped up: "We cannot get the status of your upload at this time. Please refresh the page. Error: Internal Error"

I am pretty sure I imported/CSV synced an even bigger file several months ago. I have reached out to the Quickbase support team but haven't received a solution yet.

It seems Quickbase is not working very well with large files. I also got an error message when trying to export a table I imported before.

Anyone please help!

Daniel
Posted 3 years ago

Matthew Neil
We had the same issue around the time you posted this.

QuickBase did a CSV sync update in early November, and it caused all sorts of problems with existing sync tables.

We had to delete the table and recreate it all.

As far as the Excel import goes, in my experience the import option will error out if you have more than 10,000 lines of data.

If it's an option for you, break the 600k lines into smaller sections for the initial sync; after that, an update sync might work.

Just throwing out ideas, as I've had my fair share of trouble with similar issues.
Daniel
Hi Matthew,

Thanks for the reply.

Currently I am using a workaround method:

1. Import the file into MS Access.

2. Set up a macro that runs a couple of queries to break the file into five smaller CSV files.

3. Import them one by one manually.

As you can see above, I am using importing instead of syncing. It takes much longer to do this manually. I did try to sync a small set first and then the whole set, but I still got the same error.

One of the Quickbase guys is working on this; hopefully we will have a solution soon. I will keep you updated.
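If anyone wants to skip the Access step, the splitting part of the workaround above can be scripted directly. A minimal sketch in Python (file names and chunk size are placeholders); the `csv` module's quoting also keeps special characters like &%#/ intact:

```python
import csv

def split_csv(path, rows_per_file, prefix="part"):
    """Split a large CSV into smaller files, repeating the header row in each."""
    with open(path, newline="", encoding="utf-8") as src:
        reader = csv.reader(src)
        header = next(reader)            # keep the header for every chunk
        part, out, writer, count = 0, None, None, 0
        for row in reader:
            if writer is None or count >= rows_per_file:
                if out:
                    out.close()          # finish the previous chunk
                part += 1
                out = open(f"{prefix}{part}.csv", "w", newline="", encoding="utf-8")
                writer = csv.writer(out)
                writer.writerow(header)
                count = 0
            writer.writerow(row)
            count += 1
        if out:
            out.close()
    return part  # number of chunk files written
```

For 600K rows, something like `split_csv("export.csv", 120000)` would produce five files you can import one at a time.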



Avinash Sikenpore
There is an easier solution if you have MS Access. Using the Qunect ODBC connector, you can create an append query that uploads to the table, plus a macro that uploads all five of your CSV files. Then use the Windows Task Scheduler with a simple batch file to run the upload on a regular basis. If the old data needs to be refreshed, you can create a batch file that purges all records using a URL; the syntax can be found in the API guide. Once the purge is complete, you can push the data via the ODBC connector using Access. Reach out to me if you need any help. It was a big issue for my company, since we wanted to upload over 2M lines of data with 12 fields, and it now works seamlessly.
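For the purge step, a minimal sketch of building that URL in Python, assuming the classic Quickbase HTTP API's API_PurgeRecords call; the realm, table dbid, and ticket below are placeholders, so check the API guide for the exact authentication parameters your realm requires:

```python
from urllib.parse import urlencode

def build_purge_url(realm, dbid, ticket, apptoken=None):
    """Build an API_PurgeRecords URL (deletes ALL records in the given table)."""
    params = {"a": "API_PurgeRecords", "ticket": ticket}
    if apptoken:
        params["apptoken"] = apptoken  # only if the app requires a token
    return f"https://{realm}.quickbase.com/db/{dbid}?{urlencode(params)}"

# Placeholder values -- substitute your own realm, table dbid, and ticket.
url = build_purge_url("myrealm", "bq1234567", "TICKET123")
```

The resulting URL can be hit from the scheduled batch file (e.g. with curl) before the ODBC append queries push the fresh data.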