Forum Discussion
MarkShnier__You
Qrew Legend
5 years ago... The plot thickens. Support says that the steps in a For Each loop run asynchronously and thus the sequence cannot be controlled.
I have asked for an Escalation to Pipeline HQ :) as there are valid use cases when For Each loops need to have a controlled sequence.
... stay tuned.
------------------------------
Mark Shnier (YQC)
Quick Base Solution Provider
Your Quick Base Coach
http://QuickBaseCoach.com
mark.shnier@gmail.com
------------------------------
QuickBaseJunkie
Qrew Legend
5 years ago
@Mark Shnier (YQC) I don't see how that's possible... each step has access to information from a prior step. That means it must 'run' the prior step for that info to be used in the current step...
I suppose, however, that if a step doesn't USE any info from a prior step, pipelines could determine there is no reason to wait to run it...
Which is likely the case for your webhooks that don't use any data from prior steps.
------------------------------
Sharon Faust (QuickBaseJunkie.com)
Founder, Quick Base Junkie
https://quickbasejunkie.com
------------------------------
MarkShnier__You
Qrew Legend
5 years ago
I did notice that in the Pipelines dialogue it did say it was starting the For Each loop "in parallel". So I totally get why it would want to essentially fire off a multitude of webhooks (or however it actually does it under the covers) all at once and have them all race to completion so that the Pipeline runs fast.
But I'm hoping that there will be a trick to get them to run in series.
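The difference between the two modes can be sketched in plain Python. The names below are illustrative only, not Quickbase's actual implementation: firing every webhook at once means completion order is a race, while a simple loop that waits on each call preserves series order.

```python
import random
import time
from concurrent.futures import ThreadPoolExecutor

def fake_webhook(record_id, results):
    # Simulate variable network latency per call.
    time.sleep(random.uniform(0.01, 0.05))
    results.append(record_id)

records = list(range(5))

# Parallel (what the For Each loop does): all calls race to completion,
# so the finish order is not guaranteed.
parallel_results = []
with ThreadPoolExecutor() as pool:
    for r in records:
        pool.submit(fake_webhook, r, parallel_results)

# Serial: each call finishes before the next starts, so order is preserved.
serial_results = []
for r in records:
    fake_webhook(r, serial_results)

print(serial_results)  # always [0, 1, 2, 3, 4]
```

The parallel version usually finishes faster in total, which is presumably why Pipelines chose it as the default.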
------------------------------
Mark Shnier (YQC)
Quick Base Solution Provider
Your Quick Base Coach
http://QuickBaseCoach.com
mark.shnier@gmail.com
------------------------------
MarkShnier__You
Qrew Legend
5 years ago
Right, so maybe the escalation expert will come back with a way to trick it into waiting for the completion of the previous step.
------------------------------
Mark Shnier (YQC)
Quick Base Solution Provider
Your Quick Base Coach
http://QuickBaseCoach.com
mark.shnier@gmail.com
------------------------------
BlakeHarrison
Qrew Captain
5 years ago
This is one of the "benefits" of Pipelines. Running them in parallel is supposed to increase performance, but I totally understand your use case.
------------------------------
Blake Harrison
bharrison@datablender.io
DataBlender - Quickbase Solution Provider
Atlanta GA
404.800.1702 / http://datablender.io/
------------------------------
QuickBaseJunkie
Qrew Legend
5 years ago
@Mark Shnier (YQC) It ain't pretty, but using conditions (without any actual conditions) may do the trick. My test seemed to put them in the proper order.
------------------------------
Sharon Faust (QuickBaseJunkie.com)
Founder, Quick Base Junkie
https://quickbasejunkie.com
------------------------------
MarkShnier__You
Qrew Legend
5 years ago
Thx for trying Sharon, I gave your method a try and it still did not work for me.
I'm thinking that I would need to get the Pipeline to do some kind of query on the records it just created to get it to wait for the previous step to complete. But in my use case, it's running a webhook and would have no idea that the webhook was creating records.
------------------------------
Mark Shnier (YQC)
Quick Base Solution Provider
Your Quick Base Coach
http://QuickBaseCoach.com
mark.shnier@gmail.com
------------------------------
MikeTamoush
Qrew Elite
4 years ago
@Mark Shnier (YQC)
Did you ever come up with a workaround or fix for this? I am running into this exact scenario. I have a helper table which takes information and uses it in a table-to-table import to mass create records. I'm seeing the same thing: the helper table is being updated again before my T2T import runs, thus missing some instances.
How did you solve?
------------------------------
Mike Tamoush
------------------------------
MarkShnier__You
Qrew Legend
4 years ago
Mike,
Usually the best-case scenario is if you can use the helper table without depending on any of its lookup values, and instead create a bulk import set with some kind of For Each loop that populates it. That way the pipeline can run asynchronously.
But if you absolutely need those lookup values once the focus is set for that helper table, then it has to be a looping pipeline which calls itself. That means the pipeline itself needs to be a callable pipeline whose first step is Pipeline Called. Then there would have to be another pipeline which gets triggered to search a table for records but return only one record from the search. When you create a search step you can limit the results to X number of records, so you could limit it to the first result.
That pipeline would set the focus and call the callable pipeline, which would use that helper table's value. Then the last step in the callable pipeline needs to do something to trigger the search again, which would still be limited to returning just one record.
Of course, you need to be doing something so that the search results are different the second time, and so that eventually no results are returned from the search. There is a pipeline step you can introduce that detects when a search returns no results.
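The self-calling pattern above boils down to a loop: search for unprocessed records but cap the result at one, process it, change its state, and repeat until the search comes back empty. The sketch below uses hypothetical stand-ins for the Pipeline steps; it is not a Quickbase API.

```python
# Hypothetical stand-in for a helper table of records awaiting processing.
queue = [{"id": i, "done": False} for i in range(3)]

def search_one_unprocessed():
    """Mimics a Search Records step limited to 1 result."""
    matches = [rec for rec in queue if not rec["done"]]
    return matches[:1]  # cap the search at a single record

processed = []
while True:
    results = search_one_unprocessed()
    if not results:              # the "no results" check that ends the loop
        break
    rec = results[0]
    processed.append(rec["id"])  # stand-in for the work done on that record
    rec["done"] = True           # change state so the next search differs

print(processed)  # → [0, 1, 2]
```

Because each pass waits for the previous record's work to finish before searching again, the records are guaranteed to be handled in series.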
------------------------------
Mark Shnier (YQC)
mark.shnier@gmail.com
------------------------------
MikeTamoush
Qrew Elite
4 years ago
Yes, I need the lookup values, so I suppose I'll have to loop my pipeline. I think the max loops would be about 24, which isn't terrible, but it's too bad each iteration of the For Each loop does not complete before the next one starts. I guess in most instances it will speed things up, though.
------------------------------
Mike Tamoush
------------------------------
MikeTamoush
Qrew Elite
4 years ago
Actually, I realized my issue is different. My trigger is on a new record added. Right now, I allow my users to add records via an editable report link. This means that as soon as they save, all the records are created virtually simultaneously. The pipeline fires a dozen times almost instantly, my helper table gets updated, and a T2T import runs, but since everything is in a 'race', it's a roll of the dice which ones make it.
I think in my case I need to remove the editable report link so they are forced to add records individually. At least that's all I can think of for now, unless I come up with a different solution than a helper table; but that solution allows for a T2T import, which is so fast for creating a lot of records.
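That race can be reproduced in miniature: several near-simultaneous runs each write their data to the same shared helper record, and by the time each run reads it back, a later write has usually clobbered it, so records go missing from the import. This is a toy illustration in Python, not how Quickbase works internally.

```python
import threading
import time

helper = {"value": None}  # single shared helper record
imported = []             # what the T2T import actually picks up

def pipeline_run(record_id):
    # Each triggered run writes its data to the helper table...
    helper["value"] = record_id
    time.sleep(0.01)
    # ...then reads it back for the import, possibly after another
    # run has already overwritten it.
    imported.append(helper["value"])

threads = [threading.Thread(target=pipeline_run, args=(i,)) for i in range(5)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# Five imports happen, but several usually carry the same (last-written)
# value, so some record_ids never make it through.
print(sorted(set(imported)))
```

Forcing the runs to happen one at a time, as removing the editable report link does, removes the overlap entirely.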
------------------------------
Mike Tamoush
------------------------------