Looping through field values in Pipeline
I have a record that contains a text field with multiple values separated by commas (which I could also turn into a multi-select field if that makes things easier). I want to loop through the values in the field and perform a QB record update on each pass. In Pipelines I understand how to loop over a list of records, but in this case the loop needs to be driven by the list of values in a single field. Is there any way to do this? Thanks.

------------------------------ Jennason Quick Base Admin ------------------------------
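One pattern that may help (a minimal sketch, not a confirmed solution): Pipelines steps accept Jinja, and Jinja can split a comma-separated text value into a list. The field name a.tags below is a hypothetical stand-in for the trigger record's text field; turning each piece into its own record update would still require a loop step or repeated steps.

    {# hypothetical field a.tags holding "Red, Green, Blue" #}
    {{ a.tags.split(',') | count }}                  {# number of values: 3 #}
    {{ a.tags.split(',')[0] | trim }}                {# first value: "Red" #}
    {% for v in a.tags.split(',') %}{{ v | trim }};{% endfor %}   {# walk every value #}

If a multi-select field comes through to the pipeline as delimited text, the same split approach should apply to it as well.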
Pipeline to remove file attachments

I am trying to find out whether a pipeline can remove file attachments. I am automating a process that brings data into our system via CSV uploads. I would like to keep some of the information from the upload (e.g. the Requestor, Date Uploaded, File Name), but I don't want to archive the file itself in Quickbase because of space limitations. Is there a way to delete just the file without deleting the entire record? You can of course click into the file and delete it manually, but I want this automated since I won't be doing the uploads myself. The field type is "file attachment" and it appears to be stored as a URL/link. It does not behave like other fields in the pipeline, and I have not found a way to override it yet. Thanks!

------------------------------ Kevin Gardner ------------------------------
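One possible direction (a sketch only, not a verified solution): the Quickbase REST API includes a delete-file endpoint that a Make Request step could call once the metadata has been copied off the record. The table DBID, field ID (6), and version number (1) below are placeholders, and each stored file version may need its own call; confirm the details against the REST API's delete-file documentation.

    Method:  DELETE
    URL:     https://api.quickbase.com/v1/files/bxxxxxxxx/{{a.id}}/6/1
             (table DBID / record ID / file-attachment field ID / file version)
    Headers: QB-Realm-Hostname: yourrealm.quickbase.com
             Authorization: QB-USER-TOKEN xxxxxx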
Pipeline Iterate over JSON nested data

I was pulling JSON data in from Ground Control and had some trouble reading nested data while creating a record. Here is how I solved it; I hope it saves some people time. I used the method posted in the "JSON Handler details" thread to get started. The top-level data was not a problem. Nested inside my data is "customFieldValues". Sample:

    "customFieldValues": [
        { "name": "SomeFieldA", "value": "Abcdefg" },
        { "name": "SomeFieldB", "value": "1234567" },
        { "name": "SomeFieldC", "value": "CwhatIDidThere?" }
    ]

To pull in this value I needed to use a raw_record Jinja expression and state the location in the array:

    {{ b.raw_record['customFieldValues'][0]['value'] }}

The index 0 is used because this is the first position in the array. (This assumes b. is the step reference used for the other fields, such as {{b.status}}.) This expression goes in the field mapping of the Create Record step for SomeFieldA.

------------------------------ James Carlos ------------------------------
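A small extension of the same idea, using the same b.raw_record reference: if the order of customFieldValues is not guaranteed, the value can be selected by name instead of by position.

    {# pick the value whose name is "SomeFieldB", wherever it sits in the array #}
    {% for f in b.raw_record['customFieldValues'] %}{% if f['name'] == 'SomeFieldB' %}{{ f['value'] }}{% endif %}{% endfor %}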
Pipeline Make Request - Return Make Request JSON Output into fields

** Update: I managed to retrieve the ticket ID by using {{c.json.ticket.id}}. The only remaining piece is to do the same for the users, but that is not reading the value even when I format the Jinja the same way as the ticket request, and the structure of the JSON is the same.

Hi Everyone, I have a Make Request step whose endpoint points to our Zendesk instance:

    https://instance.zendesk.com/api/v2/users/search.json?query=email:{{a.email}}

As you can see, I am using the email address stored in the record field a.email; this checks whether the end user exists in Zendesk. After running the pipeline, it returns the following in the output (extraneous data redacted):

    {
      "users": [
        {
          "id": 13617301225116,
          "url": "https://instance.zendesk.com/api/v2/users/13617301225116.json",
          "name": "Nick Green",
          "email": "redacted@gmail.com"
        }
      ]
    }

The users array contains the "id": 13617301225116 value, which I would like to send back to the Quickbase record to populate a text field via an Update Record step in the pipeline. I use Jinja to try to extract the specific value, {{c.content.users.id}}, but it returns a null value. When I send the entire output to the field with {{c.content}} I get the full JSON structure, so for some reason Jinja is not parsing the returned output to extract the "id" value. Using {{c.content.users.id | tojson}} doesn't work either and returns an error:

    Validation error: Incorrect template "{{c.content.users.id|tojson}}". TypeError: Undefined is not JSON serializable

I also checked with ChatGPT and it recommends {{c.content.users.id}}. Has anyone been able to do this successfully? Cheers, Nick

------------------------------ Nick ------------------------------
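For what it's worth, building only on the accessors the post already shows: users is an array, so the id has to be read from a specific element rather than from the array itself. Since {{c.json.ticket.id}} worked for the ticket call, the list form of the same accessor would look like the sketch below; verify the element index against the actual response.

    {# first element of the users array, using the same c.json accessor that worked for tickets #}
    {{ c.json.users[0].id }}

    {# guard against an empty result when no Zendesk user matches the email #}
    {% if c.json.users %}{{ c.json.users[0].id }}{% endif %}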
Pipelines Advanced Query

The goal is to build an Advanced Query in a Pipeline that compares two user fields on the record: I am looking for the Record Owner to match the user in FID 9. The query that does not work is

    {'4'.EX.'9'}

What this query actually says is "Record Owner equals the literal 9," not "equals the value of the user in FID 9." Searching for the syntax to get this right has been fruitless. Does anyone know a solution?

------------------------------ Don Larson, Paasporter, Westlake OH ------------------------------
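The Quickbase query syntax compares a field against a literal value, so a field-to-field comparison can't be written directly as {'4'.EX.'9'}. One hedged workaround sketch: if the pipeline has already read the record (for example as the trigger step), the FID 9 user can be injected into the query as a value with Jinja. The a.fid9_user name below is a hypothetical stand-in for however that field is exposed on the step, and depending on how the user field renders you may need the user's email or ID rather than a display name.

    {# compare Record Owner (FID 4) against the user value pulled from the trigger record #}
    {'4'.EX.'{{ a.fid9_user }}'}

Another commonly used route is a formula checkbox field on the record that performs the comparison natively, with the query testing that checkbox instead.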
(Pipelines) Get Users in Role / Process XML from HTTP API

Hello, any help or advice would be much appreciated. I'm trying to send a reminder email to users in a specific Role, in a specific App, using Pipelines. As far as I can tell, there is no JSON RESTful API call that does this (Get Users only returns all users for an app, with no role information). However, API_UserRoles returns each user in an app along with their roles. In theory, I could loop over this and send the email only to those users with a specific role. I can successfully use the Quickbase channel -> 'Quickbase APIs' -> 'Make Request' step to call API_UserRoles and get this data. Here's where I run into trouble: how do I process this XML into a form that another step can use (e.g. loop over it and send emails)? I found the question "To capture an XML response from an API in Pipelines," but I can't figure out how to get {{a.json.qdbapi.value}}; when I email its contents to myself, it is blank. There isn't any "Result" field from the request step available in subsequent steps, only URL, Method, Headers, Content Type, and Body. For instance, if I try to get the JSON out of the XML (using {{a.json.qdbapi.value}}) with the JSON Handler channel -> 'Iterate over JSON Records', the 'JSON Source' field states 'No json sources in the pipeline'. Thank you for any help you can offer, ~Daniel

------------------------------ Daniel ------------------------------
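A rough sketch of how the loop could look once the XML is available as structured data, with large assumptions: it presumes the API_UserRoles response has already been converted to a dict (per the technique in the linked thread) and that the parsed structure mirrors the response's qdbapi > users > user > roles nesting. None of the accessor names below are verified against a real parsed response.

    {# hypothetical: list the names of users holding a given role, assuming a.json holds the parsed XML #}
    {% for u in a.json.qdbapi.users.user %}
      {% if 'Project Manager' in (u.roles | string) %}{{ u.name }}; {% endif %}
    {% endfor %}
    {# note: a user with a single role (or an app with a single user) may parse as a dict rather than a list #}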
Pipelines and Filters

Hello, I'm attempting to set up a Pipeline that checks whether a record's two numeric summary fields equal each other and, if they do, updates a multi-text status field from "Assigned" to "Delivered." I've been struggling to get it to run. When setting the filter to check whether Field A equals Field B, it seems I can't compare Field A against another field, only against a value I enter myself. I thought about getting around this with a formula checkbox on the record that is true when Field A = Field B, and then setting the Pipeline filter to run when the checkbox is true. However, the Pipeline never triggered off that checkbox, even when it changed from unchecked to checked. Is the issue that I'm not actually updating the record, only summary fields that are being recalculated? If so, is there any way around that? Thank you in advance.
Pipeline Jinja template expression

Hello, I am looking for a Jinja expression/syntax that can validate sum(Qty issued) > NN in a step using an if condition. Below are the pipeline steps.

Table-B structure:

    Record ID#, Country, Qty issued
    1, UK, 20.5
    2, UK, 30
    3, UK, 30.8

Pipeline:

    Step A - When a record is created in Table-A
    Step B - Search records in Table-B where b.country = "UK"
      If sum(qty) >= 100
        Step C - Create record in Table-C
      Else
        Step D - Create record in Table-D

I want the Jinja expression to run in Step B, after the search returns, to total the qty and check whether it equals or exceeds 100.

------------------------------ MC ------------------------------
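If the search step's results are available to Jinja as a list (this is the big assumption: the b.records name below is hypothetical, and how a search step exposes its full result set varies by channel and step), the aggregation itself is standard Jinja:

    {# sum the Qty issued values across the matched Table-B records and branch on the total #}
    {% if b.records | map(attribute='qty_issued') | sum >= 100 %}yes{% else %}no{% endif %}

If the results are only exposed one record at a time inside a loop, an alternative is to let Quickbase do the math with a summary field on a parent record and have the condition test that single value instead.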
Pipeline issue: "invalid literal for Decimal" ???

What the heck does this mean? I can't seem to figure it out. ("1170 Martingale" is the name of the record being modified to trigger the pipeline.) I have no idea what the export fields refer to either. I can't even find a similar discussion on here. Any ideas out there? The pipeline fails to perform the action because of this error.

    Validation error: Invalid literal for Decimal: u'1170 Martingale'

    Input (Options)
      table: "bkcuzgpzg"
      export_fields: ["112", "7", "159", "148", "157", "177"]

    Input (Mapping)
      related_opportunity: "{{a.opportunity}}"
      related_customer: "{{a.opportunity_contact_full_name}}"
      scheduled_for: "amanda@getnewclosets.com"
      schedule_status: "Scheduled"
      scheduled_date_time: "{{a.updated_at}}"
      related_company: "{{a.opportunity_company_name}}"
      activity_type: "Schedule LE"

------------------------------ Thanks, Chris Newsome ------------------------------
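For context on the message itself (a hedged reading, not a confirmed diagnosis of this particular pipeline): the error is raised when a value mapped into a numeric field cannot be converted to a number. Here several text values are mapped into fields whose names suggest numeric record references, so if, say, related_opportunity is a numeric Related Opportunity field, it would need the record ID rather than the record's name. The a.opportunity_id name below is hypothetical.

    {# hypothetical: map the related record's ID into a numeric reference field, not its display name #}
    related_opportunity: "{{ a.opportunity_id }}"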
How do I use Today() in a Pipeline Query line?

I have a pipeline that needs to run every morning and, based on several conditions, update a set of data. One of the data points is the Effective Date. I don't have the option to drag the value from a field, and I need the condition to look at all records from the previous day (the pipeline is scheduled to run at 1:00 AM each morning). The only option I see in the highlighted field is the calendar picker; there is no option for "today."

------------------------------ Paul Peterson ------------------------------
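One direction that may help, with the helper names treated as assumptions to verify against the Pipelines Jinja documentation: if the date box accepts typed input rather than only the calendar picker, Pipelines' Jinja time helpers can express "yesterday" relative to the run time.

    {# hypothetical sketch: yesterday's date, formatted for a date comparison #}
    {{ (time.now - time.delta(days=1)).strftime('%Y-%m-%d') }}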