I need to update 100,000 records every week -- just serial number and product name. About 99,500 will be exactly the same, then there will be a few additions and a dozen or so deletes each week. I cannot get a 'changes' file. Unless there is a better way, I'm planning to just delete all records in QuickBase and upload the new file. Is that how you would do it? Is there a way to tell QuickBase to automatically delete all records and load this file every week if it's on a server somewhere?
As you know, QuickBase has announced Sync, and at EMPOWER they said that while it does not yet sync to CSV files, that is the next interface they are working on and it will be available soon. I don't know when "soon" is, but I would guess by the end of the calendar year at the latest. That is just my assumption, though, because they typically do not discuss features that are years away.
So, until then, it's probably best to just purge and re-import.
I don't know if you have tried a full import yet or how smoothly it went. You are lucky that your data is only two columns "wide". I have found that when importing a dataset with many columns, I had to break it up into 25,000 records at a time or I would get an error. The specific error message is misleading, but it's just telling you that the import is unhappy.
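If you end up scripting the weekly reload, the batching is easy to automate. Here is a minimal sketch that splits the records into 25,000-row batches and builds the XML body for an API_ImportFromCSV call for each batch. The realm URL, user token, and field IDs (clist) are placeholders you would replace with your own; the request is only built here, not sent.

```python
import csv
import io

# Hypothetical values -- substitute your own realm, table DBID, token, and field IDs.
USERTOKEN = "b1234_xyz_notarealtoken"
CHUNK_SIZE = 25000  # batch size that avoided import errors in my experience

def chunk_records(records, size=CHUNK_SIZE):
    """Yield successive batches of at most `size` records."""
    for i in range(0, len(records), size):
        yield records[i:i + size]

def build_import_request(batch):
    """Build the XML body for one API_ImportFromCSV call (constructed, not sent)."""
    csv_buf = io.StringIO()
    csv.writer(csv_buf).writerows(batch)
    return (
        "<qdbapi>"
        f"<usertoken>{USERTOKEN}</usertoken>"
        f"<records_csv><![CDATA[{csv_buf.getvalue()}]]></records_csv>"
        "<clist>6.7</clist>"  # field IDs for serial number and product name (assumed)
        "</qdbapi>"
    )
```

You would POST each body to `https://yourrealm.quickbase.com/db/yourdbid` with the `QUICKBASE-ACTION: API_ImportFromCSV` header, pausing between batches if you see errors.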
To make the purge quicker, you could use API_PurgeRecords in a URL formula button or a dashboard button and hope that it does not choke on too many records. It can time out if there are too many, in which case you need to fall back on a List All report, choose "More ... Delete these records", and watch the progress bar creep across.
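For the button itself, a URL formula along these lines would do it (the apptoken value is a placeholder; leave out the `apptoken` parameter if your app does not require tokens):

```
URLRoot() & "db/" & Dbid() & "?a=API_PurgeRecords&apptoken=myapptoken"
```

With no query criteria supplied, API_PurgeRecords deletes every record in the table, so put this button somewhere it cannot be clicked by accident.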