Suggestions to work around the "Report too large. Maximum number of bytes in report exceeded" error?

  • Question
  • Updated 2 months ago
  • Answered
I want to query all the data from a table/report and save it into a local database. I've tried exporting it to CSV format, but got the problem shown in the attached image. Please suggest ways to overcome it with:

  • no filter applied
  • one extracted data file only (like an svn dump file)
I've read about the QB HTTP API and wonder if it can help to overcome the "too large" issue?

Loan

Posted 4 years ago

Find a filter, such as Record ID, and break the report in two. Or three.
Jack, Champion
As above, but create a formula numeric field, using something like the Record ID value to group your data into downloads:

If([Record ID#] < 10000, 1,
   [Record ID#] < 20000, 2,
   [Record ID#] < 30000, 3,
   4)

Then set this as your predefined filter in your report and create buttons to download each chunk. You could possibly incorporate the calls to download the reports into one button, but you would still need to merge the CSV files.

Multiple CSV files can be easily merged using the DOS prompt in Windows; it takes me less than a minute to merge hundreds of CSVs. You could probably write some kind of script to do that and strip the duplicate header rows from the other CSVs.
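As a cross-platform alternative to merging at the DOS prompt, here is a minimal Python sketch that concatenates exported CSV chunks while keeping only the first file's header row. The `part*.csv` and `merged.csv` names are placeholders, not anything the export actually produces:

```python
import csv
import glob

def merge_csvs(pattern, out_path):
    """Merge all CSVs matching pattern into out_path, keeping one header row."""
    header_written = False
    with open(out_path, "w", newline="") as out:
        writer = csv.writer(out)
        for path in sorted(glob.glob(pattern)):
            with open(path, newline="") as f:
                rows = csv.reader(f)
                header = next(rows, None)  # first row of each chunk is its header
                if header is not None and not header_written:
                    writer.writerow(header)  # keep the header only once
                    header_written = True
                writer.writerows(rows)

# Usage (placeholder filenames):
# merge_csvs("part*.csv", "merged.csv")
```

Sorting the glob results keeps the chunks in filename order, which matters if your downstream import cares about record order.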
Eric
The other suggestions are correct. You need to either break up the report into smaller segments if you want to manually export those details, or use the API, which allows you to access more data. The difficulty with the API process is that it will return either XML, HTML, or JavaScript objects depending on the API call and options used, and the result would then need to be converted to a usable format within your code.
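To illustrate that conversion step, here is a rough Python sketch that turns an API_DoQuery-style XML response into CSV rows. The sample XML below is a simplified stand-in for a real response; actual field IDs and element layout will vary with your table and options:

```python
import csv
import io
import xml.etree.ElementTree as ET

# Simplified stand-in for an API_DoQuery XML response (not a real payload).
SAMPLE_XML = """<qdbapi>
  <table><records>
    <record><f id="6">Alice</f><f id="7">100</f></record>
    <record><f id="6">Bob</f><f id="7">200</f></record>
  </records></table>
</qdbapi>"""

def xml_to_rows(xml_text):
    """Yield one list of field values per <record> element."""
    root = ET.fromstring(xml_text)
    for record in root.iter("record"):
        yield [f.text or "" for f in record.findall("f")]

# Write the extracted rows as CSV into an in-memory buffer.
buf = io.StringIO()
csv.writer(buf).writerows(xml_to_rows(SAMPLE_XML))
```

In a real script you would write to a file instead of a `StringIO` buffer, and emit a header row mapping field IDs to field names.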
John Hubert
You can use Qunect Backup.
Kirk
This is just about the most ridiculous product limitation I have heard of. If there is some arbitrary and/or undocumented limitation on exporting a table to a file, then the export function ought to permit chunking it into multiple files, zipping the output, or auto-building a table report that includes the same fields (all of them) that are typically exported using the tool...
Nathan Crum
I just ran into this problem as well with a report that only has 4 fields but over 120K records. If you are using the API to download it, you can create a loop that keeps track of how many records you are downloading at a time, and use API_DoQueryCount along with the API_DoQuery "num-n.skip-n" options to paginate your way through the records, like the pseudocode below:

maxRecords = {API_DoQueryCount}
retrievedRecords = 0
recordChunk = 1000
myCSV = path\to\csv
tempRecords = ""
while (retrievedRecords < maxRecords):
    tempRecords = API_DoQuery&query="yourQuery"&options=num-{recordChunk}.skip-{retrievedRecords}
    myCSV += tempRecords
    retrievedRecords += recordChunk

While the example above is oversimplified and leaves out any process for converting the XML into CSV, I think it gives you a starting idea of how it can be done.
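Expanding on that pseudocode, here is a hedged Python sketch of the pagination logic. It only builds the per-page request URLs from a record count rather than actually calling the API; the realm, dbid, and query values are placeholders, and authentication, response parsing, and error handling are omitted:

```python
import urllib.parse

def page_options(total_records, chunk=1000):
    """Yield the num-n.skip-n options string for each API_DoQuery page."""
    for skip in range(0, total_records, chunk):
        yield f"num-{chunk}.skip-{skip}"

def doquery_url(realm, dbid, query, options):
    """Build the legacy HTTP API URL for one page (sketch only)."""
    params = urllib.parse.urlencode({
        "a": "API_DoQuery",
        "query": query,
        "options": options,
    })
    return f"https://{realm}.quickbase.com/db/{dbid}?{params}"

# Example: 120,000 records in 1,000-record pages -> 120 request URLs.
# Realm, dbid, and query are placeholders, not real values.
urls = [doquery_url("myrealm", "abcdef123", "{3.GT.0}", o)
        for o in page_options(120_000)]
```

You would fetch each URL in turn, convert each XML page to CSV rows, and append them to one output file, which keeps memory use flat no matter how large the table is.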

G.Macri
Stupid question: is this webhook code?