Forum Discussion

JBJB
Qrew Trainee
6 years ago

QuickBase API - QID + range of records

Hi there

So I utilized Alteryx to download/upload/delete data from QuickBase. I'm working on building a tool for coworkers that will allow them to download QuickBase data via the API by specifying the QID. I have one report that I'm testing that I can't get to download due to the 200mb rule.

Is there any way to keep the QID but also send an API query along the lines of

&query={'3'.GTE.'[record_range_start]'}AND{'3'.LTE.'[record_range_end]'}

So then let's say it figures out it needs to do 5 batches (0-3000, 3001-6000, etc.); it would run the above query with the QID 5 times so that everything gets unioned together into the total record set.
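For what it's worth, here is a minimal Python sketch of that record-ID batching idea using the XML API's query parameter directly (I'm not sure a qid and a query can be combined in one call, so this queries on field 3 alone; the realm URL, token, and batch size are all hypothetical placeholders):

    import math
    import requests  # assumption: requests is available in your environment

    BASE_URL = "https://yourrealm.quickbase.com/db/your_dbid"  # hypothetical
    USER_TOKEN = "your_usertoken"                              # hypothetical
    BATCH_SIZE = 3000

    def download_in_batches(total_records):
        """Pull a table in record-ID slices and return the XML pages."""
        batches = math.ceil(total_records / BATCH_SIZE)
        pages = []
        for i in range(batches):
            start = i * BATCH_SIZE
            end = start + BATCH_SIZE - 1
            # Field 3 is QuickBase's built-in Record ID# field
            query = f"{{'3'.GTE.'{start}'}}AND{{'3'.LTE.'{end}'}}"
            resp = requests.get(BASE_URL, params={
                "a": "API_DoQuery",
                "query": query,
                "usertoken": USER_TOKEN,
            })
            resp.raise_for_status()
            pages.append(resp.text)  # one XML slice per batch
        return pages  # union these downstream, e.g. in Alteryx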

5 Replies

  • When I have had issues with downloading data that is too large, I create a Role and restrict access to the records based on the criteria I need. Then I can assign a user to that Role for the download. This might be a temporary solution to meet your needs, but it will still require manual effort to keep changing the Role's access-permission criteria between downloads. I usually only deploy this when I am backing up an entire table that is over 500k records. If you do this, make sure the Role you create has administrative features so you can switch back to your default Role after the download process.
  • Hi JB,

    I think we may work at the same company in Dallas... would you mind if I asked for a copy of your Alteryx workflow that downloads and updates QuickBase data?  I'm working on an issue where I need to do that and am starting from scratch.  I'm decent with Alteryx, but I hardly ever use QuickBase.  I can follow up with you via email if that would be easier.  Thank you!
    • JBJB
      Qrew Trainee
      Hi. I'm going to send you an email with some info regarding this tool I've been working on.
  • A common method I've employed with client side scripting (I'm not familiar with Alteryx, so ignore this if it's not relevant or doable) is to do it in batches like you describe, combining API_DoQueryCount with API_DoQuery and using the num-n and skp-n components of API_DoQuery's 'options' parameter.

    So in your table - let's say you have 100,000 records - API_DoQueryCount returns that number. Pick a batch size based on whatever feels right - say 15,000 - and take the Ceiling() of the number of records divided by your chosen batch size: 7 batches in this example. A simple for loop over your number of batches means you end up doing 7 API_DoQuery calls. You can call the same qid over and over, and each actual API call looks like

    API_DoQuery&qid=1234&options=num-15000.skp-n // you skip 15,000 * wherever you are in the loop
    So all in all it looks like
    API_DoQuery&qid=1234&options=num-15000.skp-0
    API_DoQuery&qid=1234&options=num-15000.skp-15000
    API_DoQuery&qid=1234&options=num-15000.skp-30000
    API_DoQuery&qid=1234&options=num-15000.skp-45000
    API_DoQuery&qid=1234&options=num-15000.skp-60000
    API_DoQuery&qid=1234&options=num-15000.skp-75000
    API_DoQuery&qid=1234&options=num-15000.skp-90000

    Just string all the responses together to build the complete result.
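    A rough Python sketch of that loop, for anyone working outside client side scripting (the realm URL, token, and qid are placeholders, and I'm hedging on whether API_DoQueryCount accepts a qid; if it doesn't, pass the equivalent query string to the count call):

    import math
    import requests
    import xml.etree.ElementTree as ET

    BASE_URL = "https://yourrealm.quickbase.com/db/your_dbid"  # hypothetical
    USER_TOKEN = "your_usertoken"                              # hypothetical
    BATCH_SIZE = 15000

    def download_by_qid(qid=1234):
        # Ask how many records the report matches
        count = requests.get(BASE_URL, params={
            "a": "API_DoQueryCount",
            "qid": qid,  # assumption: the count call takes the same qid
            "usertoken": USER_TOKEN,
        })
        count.raise_for_status()
        total = int(ET.fromstring(count.text).findtext("numMatches"))

        pages = []
        for i in range(math.ceil(total / BATCH_SIZE)):
            resp = requests.get(BASE_URL, params={
                "a": "API_DoQuery",
                "qid": qid,
                "options": f"num-{BATCH_SIZE}.skp-{i * BATCH_SIZE}",
                "usertoken": USER_TOKEN,
            })
            resp.raise_for_status()
            pages.append(resp.text)
        return pages  # string these together for the full result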

    Chayce Duncan | Technical Lead
    (720) 739-1406 | chayceduncan@quandarycg.com
    Quandary Knowledge Base
    • JBJB
      Qrew Trainee
      I think I did the same thing, but did it off a query of column 3 (Record ID#).

      It turns out I could call API_GetSchema to get what is mostly the extended URL, and I parsed it apart.
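      For reference, a minimal sketch of pulling the schema that way; the URL and token are placeholders, and which elements you parse out depends on what you need:

      import requests
      import xml.etree.ElementTree as ET

      BASE_URL = "https://yourrealm.quickbase.com/db/your_dbid"  # hypothetical
      USER_TOKEN = "your_usertoken"                              # hypothetical

      resp = requests.get(BASE_URL, params={
          "a": "API_GetSchema",
          "usertoken": USER_TOKEN,
      })
      resp.raise_for_status()
      root = ET.fromstring(resp.text)
      # Each <field> element carries an id attribute and a <label> child
      for field in root.iter("field"):
          print(field.get("id"), field.findtext("label"))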