Forum Discussion

NSH_F_WT
Qrew Member
2 days ago

Records, Pipelines, and Callable Pipelines clarification

Hello! I am a new user, and I'm looking for clarification on some of the specifics of how records, pipelines, and the callable pipelines action relate.

I have a complex pipeline triggered by the creation of a user record. The pipeline runs through a substantial decision tree and, depending on the path the record takes through that tree, updates records and sends notifications.

However, I have run out of actions and need to set up a callable pipeline to do some record validation (instead of adding those steps into the main pipeline).

So, I am working on setting up the callable pipeline, but I want to make sure that the pipelines only update the given record, and not ALL records.

-

Update: before you read the next part of the post, consider that I may be overthinking this whole thing and that I just need to pass through the record ID(s) for the records I want my called pipeline to validate.

-

  1.  I assume that when a pipeline runs on the "record created" trigger, it runs only in relation to the record that was created; i.e., is the context of the pipeline run a single record?
    Is that correct, or does it evaluate the whole table of records?
    Or do I need to set a record limit on the trigger?
  2. When I use a callable pipeline, is the calling pipeline's context maintained across the call, i.e., the single record (or batch of records)?
    Depending on that answer, does the called pipeline evaluate each of those records individually, or does it evaluate the entire table with respect to the called field(s)?
  3. For example, here are my two actions. I thought I had them set up correctly, but I can't reference the calling pipeline's fields in the called pipeline (third screenshot). Update: this may be because neither pipeline is turned on, and because I may be using Jinja expressions where I should instead be using aliases, which I can set to the calling pipeline's fields (see this video). I've also put a rough text sketch of what I mean after this list.
    Do I need to bother with passing the record IDs ("a.id"), or are those "implicit"?
    Call action: [screenshot]

    Called trigger: [screenshot]

    Not finding called fields: [screenshot]

  4. BUT let's assume that I get these calls to work. Will the changes made to the records in the calling pipeline already have been written to the table when the called pipeline is called? That pipeline will need to do evaluations based on the records' data, which depends on the actions in the calling pipeline!
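
For reference, here is a rough text sketch of how I think the pieces are supposed to fit together (the input name "record_id" is just a placeholder I made up, and I'm not sure the query syntax is exactly right, so please correct me if not):

    Calling pipeline (Call Pipeline step):
        input record_id = {{a.id}}            ("a" being my Record Created trigger step)

    Called pipeline (callable trigger, step a):
        defines one input, record_id

    Called pipeline (Search Records step, scoped to just that record):
        query: {3.EX.'{{a.record_id}}'}       (field 3 being the built-in Record ID#)

My working assumption is that nothing is implicit here: the called pipeline only knows about the record(s) whose IDs I explicitly pass in and then filter on.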

 

Thanks for all of your help!

1 Reply

  • Maria
    Community Manager
     
    1. The answer to this question depends on the context of your app and what the pipeline is being asked to do. Generally I don't care to control the number of records updated unless I'm testing the pipeline and want to limit the time and processing bandwidth used. If there should never be more than one, two, or 15 records updated, then you can apply the limit as a stopgap, just in case, but again, it depends on what you expect the Pipeline to do. I hope others chime in on how they use and get the most value out of this feature.
     
    2. See #1: it depends on what the trigger is and what directives the called Pipeline is given. Your video basically follows the Callable Pipelines FAQ tutorial. If you don't apply conditions to your function, it will apply to all the records from the caller Pipeline. Notice the blue "add conditions" button that appears after you define your function inputs.
     
    3. This is where you start to lose me. I think you're still asking the same question, but I'm not 100% sure of that. As of my last use of Pipelines, you did want to pass in the key identifier for the records you want to transform, which is often {{a.record_id#}}; I believe this is still the case. (There is a rough sketch of what I mean at the end of this reply.)
     
    4. Yes, the 1st Pipeline will complete the preceding steps before it moves on to the subsequent step of calling the callable Pipeline. While requests made to Quickbase can execute out of order, a Pipeline follows the logic in the order it was defined. It could be that the request sent from the calling Pipeline has not finished processing all of its records yet, but the calling Pipeline did execute that step before moving on to the next one.
     
    Hopefully, some of this made sense. I'll look forward to hearing how others here clarify what you're looking for. 
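     
    To make #3 a bit more concrete, here is a rough sketch of the pattern I have in mind (the input name is just an example, and your step letters and field names will differ):

        Calling Pipeline, Call Pipeline step:   pass an input such as record_id = {{a.id}}
        Called Pipeline, Search/Update step:    add a condition like Record ID# equals {{a.record_id}}

    If you leave the called Pipeline's step unconditioned, that is when it can end up running against every record in the table instead of just the one you passed in.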
     
     
    Quickbase University resources:
    Log into https://university.quickbase.com then go to

