Forum Discussion

LoanLoan
Qrew Member
9 years ago

Suggestion to work around the "Report too large. Maximum number of bytes in report exceeded" error?

I want to query all the data from a table/report and save it into a local database. I tried exporting it to CSV, but got the error shown in the attached image. Please suggest a way to overcome it with:

  • no filter applied
  • one extracted data file only (like an svn dump file)
I've read about the QB HTTP API and wonder if it can help overcome the "too large" issue?

9 Replies

  • I recently encountered this error and after further investigation I noticed I had a lot of extra fields that had no values. After removing those empty fields from my default report, the file was small enough to be exported.

    ------------------------------
    Hank Halverson
    ------------------------------
  • I just ran into this problem as well, with a report that has only 4 fields but over 120K records. If you are using the API to download it, you can create a loop that keeps track of how many records you have downloaded so far, using API_DoQueryCount together with the API_DoQuery options "num-n.skip-n" to paginate your way through the records, like the pseudocode below:

    maxRecords = {API_DoQueryCount}
    retrievedRecords = 0
    recordChunk = 1000
    myCSV = path\to\csv
    tempRecords = ""

    while (retrievedRecords < maxRecords)
        tempRecords = API_DoQuery&query="yourQuery"&options=num-{recordChunk}.skip-{retrievedRecords}
        myCSV += tempRecords
        retrievedRecords += recordChunk

    While the example above is oversimplified and leaves out any process for converting the XML into CSV, I think it gives you a starting idea of how it can be done.
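    The pagination loop above can be sketched more concretely. This is a minimal sketch only: the `fetch_chunk` callable is a placeholder for whatever wraps your actual API_DoQuery HTTP call (with your realm, dbid, query, and auth), and the chunk math just mirrors the num-n.skip-n options.

    ```python
    # Sketch of walking a large report in num-n.skip-n sized chunks.
    # fetch_chunk(num, skip) is a placeholder for the real API_DoQuery call.

    def chunk_offsets(total_records, chunk_size):
        """Yield (num, skip) option pairs that together cover every record."""
        skip = 0
        while skip < total_records:
            yield (min(chunk_size, total_records - skip), skip)
            skip += chunk_size

    def download_all(fetch_chunk, total_records, chunk_size=1000):
        """Concatenate the rows returned by fetch_chunk(num, skip) per chunk."""
        rows = []
        for num, skip in chunk_offsets(total_records, chunk_size):
            rows.extend(fetch_chunk(num, skip))
        return rows
    ```

    In practice `total_records` would come from API_DoQueryCount, and `fetch_chunk` would issue the request with `options=num-{num}.skip-{skip}` and parse the response before returning rows.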

  • This is just about the most ridiculous product limitation I have heard of. If there is some arbitrary and/or undocumented limit on exporting a table to a file, then the export function ought to permit chunking it into multiple files, zipping the output, or auto-building a table report that includes the same fields (all of them) that are typically exported using the tool...
    • MarkLind1
      Qrew Trainee

      100% agreed. If you max out a table's limit and then attempt to export records to try to reduce its total size, you're effectively stuck in an endless cycle of dealing with rather vague imposed restrictions that should be handled automatically by the export function QuickBase offers...



      ------------------------------
      Mark Lind
      Applications Specialist
      CCI Systems, Inc.
      ------------------------------
  • The other suggestions are correct. You need to either break the report into smaller segments if you want to export those details manually, or use the API to access the data programmatically. The difficulty with the API approach is that it returns XML, HTML, or JavaScript objects, depending on the API call and options used, which would then need to be converted to a usable format within your code.
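    For the XML case, the conversion step might look like the sketch below. It assumes the structured response shape where each `<record>` element holds `<f id="...">` value elements; the sample document in the usage is illustrative, not an actual Quickbase response.

    ```python
    # Sketch: flatten <record><f id="...">...</f></record> XML into CSV text.
    import csv
    import io
    import xml.etree.ElementTree as ET

    def xml_records_to_csv(xml_text):
        """Return CSV text with one column per field id, one row per record."""
        root = ET.fromstring(xml_text)
        rows = []
        for record in root.iter("record"):
            rows.append({f.get("id"): (f.text or "") for f in record.findall("f")})
        # Collect every field id seen, ordered numerically, as the header row.
        field_ids = sorted({fid for row in rows for fid in row}, key=int)
        out = io.StringIO()
        writer = csv.DictWriter(out, fieldnames=field_ids)
        writer.writeheader()
        writer.writerows(rows)
        return out.getvalue()
    ```

    Each paginated response could be run through this and appended to one output file, which addresses the conversion step the pseudocode reply left out.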
  • As above, but create a formula numeric field and use something like the Record ID value to group your data into downloads,
    eg
    If([Record ID]<10000,1,
    [Record ID]<20000,2,
    [Record ID]<30000,3,
    99999)

    Then set this as the predefined filter in your report and create buttons to download each segment. You could possibly incorporate the calls to download the reports into one button, but you would still need to merge the CSV files.

    Multiple CSV files can easily be merged from the command prompt in Windows; it takes me less than a minute to merge hundreds of CSVs. You could probably write some kind of script to do that and remove the duplicate header rows from the other CSVs.
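    If you'd rather script the merge than use the command prompt, a minimal sketch (file paths here are placeholders) could write the header row from the first file only and append the data rows from the rest:

    ```python
    # Sketch: merge several CSV exports into one file, keeping a single header.
    import csv
    import glob

    def merge_csvs(pattern, out_path):
        """Concatenate all files matching pattern; write the header once."""
        header_written = False
        with open(out_path, "w", newline="") as out:
            writer = csv.writer(out)
            for path in sorted(glob.glob(pattern)):
                with open(path, newline="") as f:
                    reader = csv.reader(f)
                    header = next(reader, None)  # first row = column names
                    if not header_written and header is not None:
                        writer.writerow(header)
                        header_written = True
                    writer.writerows(reader)  # data rows only
    ```

    This assumes every export has the same columns in the same order, which holds if they all come from the same report.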