When the Expiration Date is On or Before Today, Change the Contract Status to Expired
Hello: I have a contract application.

Goal: When the Contract Expiration Date field is on or before today, and the Exception to the Expiration Date Dynamic Form Rule field is unchecked, I want the Contract Status field to automatically change to Expired.

Question: What is the best method to accomplish the above goal: a form rule, a text formula, or a pipeline?

1. I tried the below form rule. However, it works intermittently; I still see expired contracts that show as active.
2. I created a Formula - Text field named Contract Status Formula, but I am unsure whether my formula is correct: If(Today() <= [Contract Expiration Date], "Expired")
3. I tried to create a pipeline. My first attempt was Search Records / Update Record. My second attempt was On New Event / Search Records / Update Record. However, I know I am missing a step.

Any step-by-step guidance would be greatly appreciated. Thank you, Roula

(Pipelines) Get Users in Role / Process XML from HTTP API
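An editor's aside on the formula in the expiration-date thread above: as written, If(Today() <= [Contract Expiration Date], "Expired") returns "Expired" while the expiration date is still today or in the future; the comparison needs to go the other way ([Contract Expiration Date] <= Today()), and the exception checkbox should gate it. A minimal Python sketch of the intended condition (function and status names are illustrative, not Quickbase syntax):

```python
from datetime import date

def contract_status(expiration: date, exception_flag: bool, today: date) -> str:
    """Return "Expired" when the expiration date is on or before today
    and the exception checkbox is unchecked; otherwise "Active"."""
    if expiration <= today and not exception_flag:
        return "Expired"
    return "Active"

# Expiration date has passed, no exception: expired.
print(contract_status(date(2023, 1, 31), False, date(2023, 6, 1)))  # Expired
# Expiration date is in the future: still active.
print(contract_status(date(2024, 1, 31), False, date(2023, 6, 1)))  # Active
# Exception checked: stays active even though the date has passed.
print(contract_status(date(2023, 1, 31), True, date(2023, 6, 1)))   # Active
```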
Hello, any help/advice would be much appreciated. I'm trying to send a reminder email to users in a specific Role, in a specific App, using Pipelines. As far as I can tell, there is no JSON RESTful API call that does this (Get Users only returns all users for an app, with no info on roles). However, API_UserRoles returns each user from an app along with the roles they have. In theory, I could loop over this and send the email to only those users with a specific role.

I can successfully use the Quickbase channel > Quickbase APIs > Make Request step to call API_UserRoles and get this data. Here's where I run into trouble: how do I process this XML into a form that another step can use (e.g. loop over it and send emails)? I found the question "To capture an XML response from an API in Pipelines", but I can't figure out how to get {{a.json.qdbapi.value}}. When I try to view its contents (emailed to myself), it is blank. There isn't any "Result" field or something like that from this request step available in subsequent steps, only URL, Method, Headers, Content Type, and Body. For instance, if I want to get the JSON out of the XML (using {{a.json.qdbapi.value}}) with the JSON Handler channel > Iterate over JSON Records step, the JSON Source field states "No json sources in the pipeline".

Thank you for any help you can offer, ~Daniel

Pipeline Make Request - Return Make Request JSON Output into fields
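An editor's aside on the XML thread above: outside of Pipelines, the API_UserRoles response is plain XML, and filtering users by role takes only a few lines of standard-library code. The element names below are an assumption about the response shape for illustration, not taken from the thread:

```python
import xml.etree.ElementTree as ET

# Hypothetical API_UserRoles-style response (structure assumed).
xml_response = """
<qdbapi>
  <users>
    <user id="112149.bhsv">
      <name>Jon Doe</name>
      <roles><role id="11"><name>Participant</name></role></roles>
    </user>
    <user id="112150.bhsw">
      <name>Jane Roe</name>
      <roles><role id="12"><name>Administrator</name></role></roles>
    </user>
  </users>
</qdbapi>
"""

def users_in_role(xml_text: str, role_name: str) -> list[str]:
    """Return the ids of users holding the named role."""
    root = ET.fromstring(xml_text)
    matches = []
    for user in root.iter("user"):
        role_names = [r.findtext("name") for r in user.iter("role")]
        if role_name in role_names:
            matches.append(user.get("id"))
    return matches

print(users_in_role(xml_response, "Administrator"))  # ['112150.bhsw']
```

A code-capable step (or an external script hit via webhook) could run logic like this and hand Pipelines a plain list to iterate over.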
** Update: I managed to retrieve the ticket ID by using {{c.json.ticket.id}}. The only one that remains is to do the same for the users, though that is not reading the value when formatting the Jinja the same way as the ticket request, even though the structure of the JSON is the same.

Hi Everyone, I have a Make Request step whose endpoint points to our Zendesk instance: https://instance.zendesk.com/api/v2/users/search.json?query=email:{{a.email}} As you can see above, I am using the email address stored in the record field a.email; this checks whether this end user exists on Zendesk. After running the pipeline, it returns the following in the output (I've redacted extraneous data):

{
  "users": [
    {
      "id": 13617301225116,
      "url": "https://instance.zendesk.com/api/v2/users/13617301225116.json",
      "name": "Nick Green",
      "email": "redacted@gmail.com"
    }
  ]
}

In the above users array it has the "id": 13617301225116 value, which I would like to send back to the Quickbase record to populate a text field via an Update Record action in the pipeline. I use Jinja in an attempt to extract the specific value: {{c.content.users.id}}; however, this returns a null value. When sending the entire output to the field using {{c.content}}, I get the proper JSON structure, though for some reason Jinja is not parsing the returned output to extract the "id" value. Using {{c.content.users.id | tojson}} doesn't work and returns an error: Validation error: Incorrect template "{{c.content.users.id|tojson}}". TypeError: Undefined is not JSON serializable. I also checked with ChatGPT and it recommends using {{c.content.users.id}}. Has anyone been able to do the above successfully? Cheers, Nick

Jinja Dynamic Variable Name using Loop Index
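An editor's aside on the Zendesk thread above: in the returned payload, users is a list, so the id has to be read from users[0], not from users directly; that is consistent with {{c.content.users.id}} coming back undefined. Here is the equivalent lookup in plain Python, using the payload shape quoted in the post:

```python
import json

# Response body as quoted in the thread (extraneous data redacted).
response_body = """
{
  "users": [
    {
      "id": 13617301225116,
      "url": "https://instance.zendesk.com/api/v2/users/13617301225116.json",
      "name": "Nick Green",
      "email": "redacted@gmail.com"
    }
  ]
}
"""

payload = json.loads(response_body)
# Index into the list first, then read the key.
user_id = payload["users"][0]["id"]
print(user_id)  # 13617301225116
```

Given that {{c.json.ticket.id}} worked for the ticket request, the analogous path here would index the list the same way, something like {{c.json.users[0].id}} (hedged: exactly which of c.json or c.content the step exposes as parsed data is worth verifying in the step's output panel).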
Assuming I have two fields in my table, son1 and son2, I'm trying to create two records in another table using a Jinja loop (which is happening), but the dynamic field names are out of my reach:

{
  "to": "bsp8bd9qp",
  "data": [
    {% for i in range(2) %}
    { "17": { "value": "{{a.son~loop.index}}" } }{% if loop.last == false %},{% endif %}
    {% endfor %}
  ]
}

The below works but predictably gives me the value for only son1 and not son2:

{ "17": { "value": "{{a.son1~loop.index}}" } }

Prashant Maheshwari

Updating predecessors in a pipeline
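An editor's aside on the loop-index thread above: {{a.son~loop.index}} looks up an attribute literally named son and then concatenates the index to its value, so it never resolves son1/son2. The usual Jinja fix is a subscript lookup with a concatenated key, a["son" ~ loop.index]. The same idea in plain Python, with a dict standing in for the trigger record (field names taken from the post):

```python
# Stand-in for the trigger record with fields son1 and son2.
record = {"son1": "Alpha", "son2": "Beta"}

# Build one child-record entry per field by composing the key dynamically,
# the way a["son" ~ loop.index] would in a Jinja loop.
data = [{"17": {"value": record[f"son{i}"]}} for i in range(1, 3)]
print(data)
# [{'17': {'value': 'Alpha'}}, {'17': {'value': 'Beta'}}]
```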
Hi all, I am creating templates for our projects that contain a set number of tasks done in a specific order. I want to use a pipeline to create and schedule tasks when a new project is created. I'm running into a roadblock: I can't figure out how to keep the predecessors of the tasks, so that we can set a start date when the project is created and it will populate all future task dates. I hope that makes sense. The pipeline is working fine to create the tasks, but I can't figure out how to set the predecessors. Here is my table structure (simplified):

[TABLE] Template Projects
- Project Name

[TABLE] Template Tasks
- Task Name
- Duration

[TABLE] Template Project/Tasks Associations
- Related Project
- Related Task
- Predecessors

[TABLE] Project
- Select Template (Related Template Project)
- Start Date

[TABLE] Tasks
- Related Project
- Task Name
- Duration
- Start Date
- Project Finish
- Predecessors

(Please be kind, I am new to this.) Thanks in advance!

Last Triggered Timestamp
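An editor's aside on the template-tasks thread above: the usual pattern is two passes — first create the new tasks while recording which new task came from which template task, then rewrite each predecessor reference through that mapping. A sketch in Python with dicts standing in for records (all names and IDs are illustrative, not Quickbase API calls):

```python
# Template tasks keyed by template record ID, with predecessors
# expressed as template record IDs.
template_tasks = {
    1: {"name": "Kickoff", "predecessors": []},
    2: {"name": "Design", "predecessors": [1]},
    3: {"name": "Build", "predecessors": [2]},
}

# Pass 1: create the new tasks and remember template-ID -> new-ID.
new_tasks = {}
id_map = {}
next_id = 100  # stand-in for record IDs the Tasks table would assign
for tmpl_id, tmpl in template_tasks.items():
    new_tasks[next_id] = {"name": tmpl["name"], "predecessors": []}
    id_map[tmpl_id] = next_id
    next_id += 1

# Pass 2: rewrite predecessors through the mapping.
for tmpl_id, tmpl in template_tasks.items():
    new_tasks[id_map[tmpl_id]]["predecessors"] = [
        id_map[p] for p in tmpl["predecessors"]
    ]

print(new_tasks[101]["predecessors"])  # [100]
```

In pipeline terms, pass 1 is the loop that creates tasks, and pass 2 is a second loop that updates each created task once every new record ID is known.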
Hi, I started migrating automations to pipelines. Yesterday, I could easily see the timestamp indicating the last triggering of a pipeline directly on the table. However, as of today, I am not able to see it on the table anymore. The only way to view the last triggered time is by clicking the three dots and navigating to "View Activity." Is this a potential problem for our pipelines?

Time of Day field values are not in 24-hour format in Pipelines
Hi all. I think I found a bug with Pipelines where a Time of Day field isn't output in 24-hour format even when you set the field settings to 24-hour. Can anyone think of a possible workaround? Most APIs use 24-hour format, so having Pipelines only output in North American format makes it impossible to pass along the correct time format.

Steps to reproduce:
1. Create a Time of Day field.
2. In field settings > Value display, enable the 24-hour clock.
3. Create a pipeline that sends a value from this field.

Pipelines outputs the field in North American time format (for example, 5:00 PM) instead of 24-hour format (e.g. 17:00). I tried using Jinja to reformat to 24-hour, but Pipelines doesn't support the Jinja filters needed. Here's the Jinja expression I tried:

{{ time-field|replace(" AM", "")|replace(" PM", "")|stringptime("%I:%M")|strftime("%H:%M") }}

Date value changes when copied to new table with pipeline.
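An editor's aside on the time-format thread above: if the value can be routed through a step that runs real code (or post-processed outside Pipelines), the 12-hour to 24-hour conversion is a one-liner with standard strptime/strftime. Note also that the attempted filter name stringptime does not match Jinja's usual filters, which may be part of the failure. A Python sketch:

```python
from datetime import datetime

def to_24h(value: str) -> str:
    """Convert a North American clock string like '5:00 PM' to '17:00'."""
    return datetime.strptime(value, "%I:%M %p").strftime("%H:%M")

print(to_24h("5:00 PM"))   # 17:00
print(to_24h("12:15 AM"))  # 00:15
```

Stripping the AM/PM suffix before parsing, as the quoted expression does, loses the very information needed to decide whether to add 12 hours; parsing with %p keeps it.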
I am updating an app with a [Fines] table that has 10 pairs of Pmt-Dt-* and Pmt-Amt-* fields. The date fields are Date (not Datetime).

1. I created a new [Payments] table with Fine-ID, Pmt-Dt, and Pmt-Amt fields. The new table will support online payments, so we want the new Pmt-Dt field to be Datetime to differentiate between multiple payment attempts on the same day.
2. I created a pipeline to read each Pmt-Amt-* of each record. If Pmt-Amt-* > 0, the pipeline inserts the payment into the [Payments] table.

ISSUE: Although both tables are in the same application, the new [Payments] table dates are created with a timezone adjustment. The application is UTC-05:00 (Eastern US & Canada).
• 2012-02-02 (in the original Fines table) is inserted as 2012-02-01 19:00 in the new table (5 hours earlier).
• 2018-08-08 (in the original Fines table) is inserted as 2018-08-07 20:00 in the new table (4 hours earlier, during DST).

I can probably use time.delta(hours=5) and time.delta(hours=4) to update the original values, but this will require identifying the exact spring-forward and fall-back dates for each year with payments. Is there a better, more automated solution?

Pipelines and Filters
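An editor's aside on the date-shift thread above: rather than hand-picking 4- versus 5-hour offsets around DST boundaries, one automated approach is to interpret the shifted values as Eastern local times and convert them back to UTC with a timezone database; the spring-forward/fall-back dates then come for free. A Python sketch with zoneinfo, using the two example values from the post:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

EASTERN = ZoneInfo("America/New_York")

def original_date(shifted: datetime):
    """Treat the shifted value as Eastern local time and recover the
    original calendar date by converting back to UTC."""
    return shifted.replace(tzinfo=EASTERN).astimezone(timezone.utc).date()

# Winter payment: 2012-02-01 19:00 EST (UTC-5) is 2012-02-02 00:00 UTC.
print(original_date(datetime(2012, 2, 1, 19, 0)))  # 2012-02-02
# Summer payment: 2018-08-07 20:00 EDT (UTC-4) is 2018-08-08 00:00 UTC.
print(original_date(datetime(2018, 8, 7, 20, 0)))  # 2018-08-08
```

This assumes the shifted values really are the app's local rendering of midnight UTC, which matches the two offsets reported; worth spot-checking a few records near DST transitions before bulk-updating.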
Hello, I'm attempting to set up a pipeline which checks whether a record's two numeric Summary fields equal each other, and if they do, updates a multi-text status field from "Assigned" to "Delivered." I've been struggling to get it to run. When setting the filter so that it checks whether Field A equals Field B, it seems I can't check Field A against another field, only enter a value of my own. I thought about getting around this by using a checkbox formula on the record that is true when Field A = Field B, and then setting up the pipeline filter to run when the checkbox field is true. However, the pipeline never seemed to trigger off that checkbox, even when it changed from unchecked to checked. Is the issue that I'm not actually updating the record, but instead the Summary fields are what's being updated? If so, is there any way around that? Thank you in advance.

Jinja Max Value of a Field
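An editor's aside on the filters thread above: the decision being modeled is simply "when the two summary fields are equal and the status is Assigned, move it to Delivered." A small Python sketch of that gate (field and status names are from the post; how the trigger fires on summary-field changes is a separate question):

```python
def next_status(summary_a: float, summary_b: float, status: str) -> str:
    """Promote Assigned records to Delivered once the two summaries match."""
    if status == "Assigned" and summary_a == summary_b:
        return "Delivered"
    return status

print(next_status(10, 10, "Assigned"))   # Delivered
print(next_status(10, 7, "Assigned"))    # Assigned
print(next_status(10, 10, "Delivered"))  # Delivered
```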
I have a Search step that looks at all the child records for a parent. In that Search I am collecting a single date timestamp field. Then I evaluate the field to get the max value using this Jinja expression:

{{a|max(attribute='child_table_date')}}

It does work, picking the maximum value from all the records in the child table. However, I am getting much more than just a timestamp:

---
child_table_date: 2022-11-04 00:00:00+00:00
created_at: 2022-07-26 02:40:56.063000+00:00
id: 3
last_modified_by:
  email: DLarson@MCFIndustries.com
  first_name: Don
  id: 56472559.bjvz
  last_name: Larson
  screen_name: DLarson_MCF
record_owner:
  email: DLarson@MCFIndustries.com
  first_name: Don
  id: 56472559.bjvz
  last_name: Larson
  screen_name: DLarson_MCF
updated_at: 2022-11-17 04:10:31.507000+00:00
...

I hoped to get only 2022-11-04 00:00:00+00:00, which I would use in another step. Surely something is missing from the initial expression; it should not be returning everything else about the record instead of the value of the field. Anyone got a suggestion?

Don Larson
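An editor's aside on the max-value thread above: max(attribute=...) selects the record with the largest value but returns the whole record, so the attribute still has to be read off the result; in Jinja that would be something like {{ (a|max(attribute='child_table_date')).child_table_date }}. The same pattern in plain Python:

```python
# Stand-ins for the child records returned by the Search step.
records = [
    {"id": 1, "child_table_date": "2022-09-01 00:00:00+00:00"},
    {"id": 3, "child_table_date": "2022-11-04 00:00:00+00:00"},
    {"id": 2, "child_table_date": "2022-10-15 00:00:00+00:00"},
]

# max() picks the whole winning record; read the one field off it.
latest = max(records, key=lambda r: r["child_table_date"])["child_table_date"]
print(latest)  # 2022-11-04 00:00:00+00:00
```

Comparing the ISO-formatted strings lexicographically is safe here because year-month-day ordering matches string ordering.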