Forum Discussion

ShalomEguale
Qrew Trainee
12 hours ago
Solved

Bulk upsert field limit in pipeline

Hi,

I have a pipeline that uses the bulk upsert steps. Essentially, I'm just copying data from one table to another, and I need to do this bi-weekly. It's my first time using bulk upsert, and I now see there is a 250-field limit. Is there any way around it? I checked online and the suggested fix seems to be to use a helper table, but I'm not sure how to do that, and since I only have about 400 fields I'd rather not if it's unnecessary. I also saw something about a loop, but I don't know how to do that either (I've included a rough sketch of what I found below my steps). What's the best way to go about this?

I have the following steps in my pipeline:

  1. Prepare bulk record upsert 
  2. Search Records
  3. In Loop: Add a bulk upsert row - End loop
  4. Commit upsert
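
For context, the "loop" idea I found seems to boil down to splitting the fields into batches and merging each batch onto the same rows via a key field. Below is a rough sketch of that idea using the RESTful JSON API rather than the pipeline steps above. Every realm, token, table id, and field id in it is a placeholder, and it assumes the two tables share field ids and a unique merge field, which may not hold; I haven't tried it.

    # Rough sketch only: copy records between tables in field batches via the
    # Quickbase RESTful JSON API. All ids and tokens below are placeholders.
    import requests

    REALM = "mycompany.quickbase.com"          # placeholder realm hostname
    TOKEN = "QB-USER-TOKEN b1234_placeholder"  # placeholder user token
    SOURCE_TABLE = "bxxxxxxx1"                 # placeholder source table DBID
    DEST_TABLE = "bxxxxxxx2"                   # placeholder destination table DBID
    MERGE_FIELD = 6                            # placeholder unique field present in both tables
    FIELD_IDS = [6, 7, 8, 9]                   # placeholder list of the ~400 field ids to copy

    HEADERS = {
        "QB-Realm-Hostname": REALM,
        "Authorization": TOKEN,
        "Content-Type": "application/json",
    }

    # 1. Pull the source records (ignores pagination for brevity).
    records = requests.post(
        "https://api.quickbase.com/v1/records/query",
        headers=HEADERS,
        json={"from": SOURCE_TABLE, "select": FIELD_IDS},
    ).json()["data"]

    # 2. Upsert into the destination in batches of fields, merging on the key
    #    field so every batch lands on the same destination rows.
    BATCH = 249  # leave room for the merge field itself
    other_fields = [f for f in FIELD_IDS if f != MERGE_FIELD]
    for i in range(0, len(other_fields), BATCH):
        batch_fields = [MERGE_FIELD] + other_fields[i:i + BATCH]
        data = [
            {str(fid): rec[str(fid)] for fid in batch_fields if str(fid) in rec}
            for rec in records
        ]
        requests.post(
            "https://api.quickbase.com/v1/records",
            headers=HEADERS,
            json={"to": DEST_TABLE, "data": data, "mergeFieldId": MERGE_FIELD},
        )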

6 Replies

  • I think I have a much easier way. The only downside I can see is that if the two apps are already completely separate, this will cause them to be in the same Quickbase server instance, which can affect performance if you have a lot of concurrent users.


    You can create a saved Table to Table import. 

    Then simply add a Quickbase Make Request step to your pipeline with a URL like this:

    https://mycompany.quickbase.com/db/xxxxxxxx?a=API_RunImport&id=10

    Set the Method to POST

    Replace the 10 with the number of your saved import from the URL.

     

    If you get lucky and the field names match, the saved T2T import will guess the mapping correctly for all 400 fields!
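
    If you want to sanity-check the call outside of Pipelines first, something roughly like this hits the same endpoint. It's only a sketch: the realm, table DBID, import id, and user token are placeholders, and passing a usertoken parameter is just one way to authenticate a manual call (the Make Request step handles authentication for you).

        # Sketch: fire the saved table-to-table import by hand with the same
        # API_RunImport call the pipeline's Make Request step will make.
        import requests

        url = "https://mycompany.quickbase.com/db/xxxxxxxx"  # xxxxxxxx = destination table DBID
        params = {
            "a": "API_RunImport",
            "id": "10",  # the saved import id# from the URL
            "usertoken": "b1234_placeholder_user_token",  # placeholder user token
        }

        resp = requests.post(url, params=params)
        print(resp.status_code)
        print(resp.text)  # XML response; <errcode>0</errcode> means the import ran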

  • ... also, feel free to post back if you don't know what a saved table-to-table import is.

    • ShalomEguale
      Qrew Trainee

      Hi Mark,

      Both tables are actually within the same app. 

      And yes, I don't know what a saved table-to-table import is. I'm also not sure about the Make Request step. Could you give me instructions on how to do this?

  • np,

    In regular Quickbase, go to any table home page or field list and look for Import/Export, then select "Import from another table."

    Go to select the app, but a caveat: your browser will probably block the pop-up, so acknowledge that in the browser address bar and choose to always allow pop-ups from Quickbase.

    Then select your app and the source table.

    Then do the field mapping and filtering.

    Save and note the id# from the URL; it will be #10, as it will be the first one for that table.

    Then compose the URL like my example, including the xxxxxxxx for the table id.

    Then in the pipeline you will have just one step, since this will be a scheduled pipeline. Just look in the Quickbase channel steps for the Make Request step, put in the URL, and select Method = POST.

    Feel free to post back if you get stuck anywhere.

     

    • MarkShnier__You
      Qrew Legend

      ... also, when you build the saved T2T import, you can run it manually to test.
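
      If you want to double-check the copy after a manual run, a quick record-count comparison with the JSON API works too. This is just a sketch; the realm, token, and table DBIDs are placeholders.

        # Sketch: compare source and destination record counts after a manual run.
        # Realm, token, and table DBIDs below are placeholders.
        import requests

        HEADERS = {
            "QB-Realm-Hostname": "mycompany.quickbase.com",
            "Authorization": "QB-USER-TOKEN b1234_placeholder",
            "Content-Type": "application/json",
        }

        def total_records(dbid):
            # top=1 keeps the payload tiny; we only need metadata.totalRecords
            body = {"from": dbid, "select": [3], "options": {"top": 1}}
            resp = requests.post("https://api.quickbase.com/v1/records/query",
                                 headers=HEADERS, json=body)
            return resp.json()["metadata"]["totalRecords"]

        print("source:", total_records("bxxxxxxx1"))       # source table DBID
        print("destination:", total_records("bxxxxxxx2"))  # destination table DBID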

      • ShalomEguale
        Qrew Trainee

        Wow, that worked perfectly. Thank you for this quick and simple solution!