Forum Discussion
- KirkKirk1 (Qrew Trainee)
This is just about the most ridiculous product limitation I have heard of. If there is some arbitrary and/or undocumented limit on exporting a table to a file, then the export function ought to permit chunking the output into multiple files, zipping it, or auto-building a table report that includes the same fields (all of them) that are typically exported using the tool...
- MarkLind1 (Qrew Trainee)
100% agreed. If you max out a table's limit and then attempt to export records to bring its total size back down, you're effectively stuck in an endless cycle of dealing with rather vague restrictions that the export function QuickBase offers should handle automatically...
------------------------------
Mark Lind
Applications Specialist
CCI Systems, Inc.
------------------------------
- QuickBaseCoachD (Qrew Captain)
Filter on a field such as Record ID# and break the report in two. Or three.
- ArchiveUser (Qrew Captain)
As above, but create a formula numeric field that uses something like the Record ID# value to group your data into download batches, e.g.:
If([Record ID#] < 10000, 1,
   [Record ID#] < 20000, 2,
   [Record ID#] < 30000, 3,
   99999)
Then set this field as a predefined filter in your report and create buttons to download each batch. You could possibly incorporate the calls to download the reports into one button, but you would still need to merge the CSV files.
Multiple CSV files can be easily merged at the DOS prompt in Windows (e.g., copy *.csv merged.csv); it takes me less than a minute to merge hundreds of CSVs. You could also write a script to do the merge and strip the repeated header rows from the other CSVs (see the sketch after the next reply).
- ArchiveUser (Qrew Captain)
The other suggestions are correct. You need to either break the report into smaller segments if you want to export those details manually, or use the API to access more data. The difficulty with the API approach is that it returns XML, HTML, or JavaScript objects depending on the API call and options used, which then need to be converted to a usable format within your code.
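A minimal Python sketch of that merge-and-strip-headers step, assuming every exported chunk shares the same header row; the file names below are placeholders, not anything QuickBase produces:

import csv
import glob

# Merge every exported chunk into one CSV, keeping only the first
# file's header row. The file names here are assumptions.
with open("merged.csv", "w", newline="") as out_file:
    writer = csv.writer(out_file)
    header_written = False
    for path in sorted(glob.glob("export_chunk_*.csv")):
        with open(path, newline="") as in_file:
            reader = csv.reader(in_file)
            header = next(reader, None)   # first row of each chunk is its header
            if header is None:
                continue                  # skip empty files
            if not header_written:
                writer.writerow(header)
                header_written = True
            writer.writerows(reader)      # append the remaining data rows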
- JohnHubert (Qrew Member)
You can use Qunect Backup: http://qunect.com/products.html
- NathanCrum (Qrew Member)
I just ran into this problem as well, with a report that has only 4 fields but over 120K records. If you are using the API to download it, you can write a loop that tracks how many records you have downloaded so far, using API_DoQueryCount to get the total and the API_DoQuery options "num-n.skip-n" to paginate your way through the records, like the pseudocode below:
maxRecords = {API_DoQueryCount}
retrievedRecords = 0
recordChunk = 1000
myCSV = path\to\csv
tempRecords = ""

while (retrievedRecords < maxRecords)
    tempRecords = API_DoQuery&query="yourQuery"&options=num-{recordChunk}.skip-{retrievedRecords}
    myCSV += tempRecords
    retrievedRecords += recordChunk

While the example above is oversimplified and leaves out the conversion of the returned XML into CSV, I think it gives you a starting idea of how it can be done.
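For a more concrete version of that loop, here is a minimal Python sketch against the QuickBase XML HTTP API. The realm URL, table dbid, user token, query, and clist of field IDs are all placeholders you would substitute with your own values:

import csv
import urllib.parse
import urllib.request
import xml.etree.ElementTree as ET

# Placeholders: substitute your own realm, table dbid, user token,
# query, and column list (clist) of field IDs.
BASE = "https://yourrealm.quickbase.com/db/bxxxxxxxx"
TOKEN = "your_user_token_here"
QUERY = "{3.GT.0}"   # field 3 is the built-in Record ID#
CLIST = "3.6.7"
CHUNK = 1000

def api_call(action, extra):
    """Issue one QuickBase XML API call via HTTP GET and parse the reply."""
    params = {"a": action, "usertoken": TOKEN, **extra}
    url = BASE + "?" + urllib.parse.urlencode(params)
    with urllib.request.urlopen(url) as resp:
        return ET.fromstring(resp.read())

# Ask for the total first so the loop knows when to stop.
total = int(api_call("API_DoQueryCount", {"query": QUERY}).findtext("numMatches"))

with open("export.csv", "w", newline="") as f:
    writer = csv.writer(f)
    skipped = 0
    while skipped < total:
        # num-n.skip-n pages through the result set CHUNK records at a time.
        page = api_call("API_DoQuery", {
            "query": QUERY,
            "clist": CLIST,
            "fmt": "structured",                 # cells come back as <f id="..."> elements
            "options": f"num-{CHUNK}.skip-{skipped}",
        })
        for record in page.iter("record"):
            writer.writerow([(cell.text or "") for cell in record.findall("f")])
        skipped += CHUNK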
- GiuseppeMacri (Qrew Captain)
Stupid question: this is webhook code, right?
- HankHalverson (Qrew Cadet)
I recently encountered this error, and after further investigation I noticed I had a lot of extra fields with no values. After removing those empty fields from my default report, the file was small enough to export.
------------------------------
Hank Halverson
------------------------------