Forum Discussion

Qrew Member
11 months ago

How to iterate through the fields of a single record in a Quickbase pipeline?


I'm currently attempting to create a pipeline which converts a list of construction parts and their quantities that are contained in a single row of a build "Activities" table into a set of records in a different "Build Parts" table. By this I mean that a single row of this Activities table has a series of fields labeled "Part 1", "Quantity 1", "Part 2", "Quantity 2", "Part 3", "Quantity 3", etc. There are 50 pairs of these fields, so I would like to find out a way to iterate through the fields of this single record and create up to 50 distinct records in a different table.

My current thought is to convert this record into JSON and use Jinja to extract the field data, though I am not entirely sure how to do this.

I would like to avoid creating 50 individual Add Record steps in the pipeline if possible, as I believe I would need to call multiple pipelines to do this (due to the maximum step count for a single pipeline), which is not optimal for my application.

Please note that I am new to Quickbase and Jinja, so there may be some gaps in my understanding.

Thank you,

Zach Lancaster


1 Reply

  • It'll take a little bit of tweaking on your part to get it right, but I would suggest making a formula text field in Quickbase that builds a JSON string you have direct control over, in an easier-to-use format. Basically you can have a series of if() statements: if Part 45 is present and its quantity > 0, then build that piece of the JSON. Your JSON object should look something like:

    "data": [



              "fid of part field": {"value": "part value"},

              "fid of qty field": {"value": "qty value"},

               "fid of related parent record field": {"value": "record ID of something"}     



    You would do that for each part/quantity pair, first checking whether it has a value; if it doesn't, skip that pair entirely.
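    The same skip-empty logic can be sketched outside the formula field. Here is a rough Python equivalent of what the formula builds (the "Part N" / "Quantity N" field names come from the question; the field IDs 6, 7, 8 and the dict layout are hypothetical placeholders):

```python
# Sketch: build the "data" array for Quickbase's insert-records call from one
# Activities record, skipping any Part N / Quantity N pair that has no value.
# Field IDs "6", "7", "8" are placeholders -- substitute your real fids.

PART_FID = "6"     # fid of the part field in Build Parts (hypothetical)
QTY_FID = "7"      # fid of the quantity field (hypothetical)
PARENT_FID = "8"   # fid of the related-parent field (hypothetical)

def build_data(activity: dict, parent_record_id: int, pairs: int = 50) -> list:
    data = []
    for n in range(1, pairs + 1):
        part = activity.get(f"Part {n}")
        qty = activity.get(f"Quantity {n}")
        if not part or not qty or qty <= 0:  # no value -> ignore entirely
            continue
        data.append({
            PART_FID: {"value": part},
            QTY_FID: {"value": qty},
            PARENT_FID: {"value": parent_record_id},
        })
    return data

record = {"Part 1": "Bolt", "Quantity 1": 4, "Part 2": "", "Quantity 2": 0}
print(build_data(record, parent_record_id=123))
```

    Only the populated pairs produce entries, so a record with three filled slots yields a three-element "data" array rather than fifty.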

    You'd then want to build a pipeline that makes a request against the REST API, specifically the insert/update records call.
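    That request step boils down to a single POST to the insert/update records endpoint (POST https://api.quickbase.com/v1/records). A rough Python sketch of what the step sends, with the realm hostname, user token, and table ID below as placeholders:

```python
# Sketch of the request the pipeline step would make. Realm, token, and
# table ID are placeholder values -- substitute your own.
import json

def insert_records_request(realm: str, token: str, table_id: str, data: list) -> dict:
    """Assemble the pieces of Quickbase's insert/update records call."""
    return {
        "url": "https://api.quickbase.com/v1/records",
        "headers": {
            "QB-Realm-Hostname": realm,
            "Authorization": f"QB-USER-TOKEN {token}",
            "Content-Type": "application/json",
        },
        "body": json.dumps({
            "to": table_id,  # destination table (Build Parts)
            "data": data,    # the array built by the formula field
        }),
    }

req = insert_records_request("myrealm.quickbase.com", "b12345_xxxx", "bqx7abcde",
                             [{"6": {"value": "Bolt"}, "7": {"value": 4}}])
print(req["url"])
```

    Because the formula field already produced the "data" array, the pipeline only has to drop that string into the request body and send it.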

    The reason for the above is that if you have upwards of 50 parts and quantities and you try to do an Add Record step for each, the pipeline will max out at the 26 steps available right now. You could try to do all of the above purely in the pipeline, but it would be almost impossible to read and hard to manage. Building the payload in native Quickbase first lets you actually see it and confirm it's right, and then the pipeline is basically just a passthrough to load it.

    Chayce Duncan