Re: API Connection from SAS to Quickbase - Connection failed

Did you ever solve this? I'm having a similar connection issue.

------------------------------ Daniel Johnson ------------------------------

Re: Pipelines Advanced Query

Hey Don, I can't recall what the exact use case was here, but I'm now sure the issue was that I was trying to evaluate a logical expression where a query expression was expected. If I had used {% if a.$prev.status == a.status %}TRUE{% endif %}, the condition would have returned true if the values were actually equivalent, and false if they weren't. A query expression has to look like this instead: {'fid'.operator.'matching_value'}, for example {'7'.EX.'Complete'}. Source

------------------------------ Daniel Johnson ------------------------------

Re: Clean up Searchable fields - best practices

Actually, you might start by using the JSON handler and calling https://api.quickbase.com/v1/fields?tableId={tableId} first, because the JSON handler will then set up the loop for each field ID, so you won't have to do that part yourself.

------------------------------ Daniel Johnson ------------------------------

Re: Clean up Searchable fields - best practices

https://developer.quickbase.com/operation/updateField

------------------------------ Daniel Johnson ------------------------------

Re: Clean up Searchable fields - best practices

I'm late to the conversation here (I was looking for a discussion on whether making lookup fields searchable has an impact on performance), but you can do this with the Quickbase API and Pipelines. I am once again shouting out Scott Galloway and his Empower session from last year (https://youtu.be/6gAHfV0PDWk) in this solution. You would have one pipeline call a second pipeline, and that second pipeline would loop on itself: first a GET to https://api.quickbase.com/v1/fields/{fieldId}?tableId={tableId} to see if the field exists, and if it does, a POST to that same endpoint (the updateField call) with "findEnabled": false in the payload. Whether or not the field exists, you would loop the pipeline back on itself, adding 1 to the field ID each time. Scott's video describes how to limit the number of runs to the number of fields you actually have, based on the field ID numbers.

------------------------------ Daniel Johnson ------------------------------

Re: Epoch & Unix Timestamp Conversion

For anyone looking at this in the post-JavaScript era:

ToTimestamp(Date(1970,1,1)) + Seconds([UNIX Time])

ToTimestamp(Date(1970,1,1)) is the Unix epoch as a Quickbase timestamp, and adding Seconds([UNIX Time]) offsets it by the stored number of seconds.

------------------------------ Daniel Johnson ------------------------------

Re: Fetch JSON from an array

Got a solution to my question from the care team. Since I didn't need multiple records, just one piece of the JSON, this worked for the value in the field in question:

{% for item in b.deltas.object_data.metadata.recents %}
{% if loop.last==true %}
{{item.user_agent}}
{% endif %}
{% endfor %}

I confirmed it by grabbing the ID field in that object, because it acts as the index: swapping loop.last==true for loop.first==true and asking for item.id returned 0, which is the id of the first object in the array, and loop.last==true in the last test gave me 12.

------------------------------ Daniel Johnson ------------------------------

Re: Fetch JSON from an array

I'm working on a similar issue. This is email tracking data, and the provider webhook fires every time an email is opened. The payload includes an array called "recents" that has the recipient's email client in it, but that array sends back all recent opens, and the table in the QB app is a log table that's only concerned with the most recent event. I tried to follow Jeremy's suggestion above about just using the array node for the data we're after, but I got an error that said:

Invalid JSON records location: '/deltas/object_data/metadata/recents'

Here's the sample schema for the first Iterate step:

{
  "deltas": [{
    "date": 0,
    "object": "metadata",
    "type": "track.type",
    "object_data": {
      "namespace_id": "text",
      "account_id": "text",
      "object": "metadata",
      "attributes": null,
      "id": "text",
      "metadata": {
        "sender_app_id": 0,
        "thread_id": "text",
        "reply_to_message_id": "text",
        "timestamp": 0,
        "from_self": true,
        "link_data": [{
          "url": "text",
          "count": 0
        }],
        "recents": [{
          "ip": "text",
          "user_agent": "text",
          "timestamp": 0,
          "link_index": 0,
          "id": 0
        }],
        "message_id": "text",
        "payload": "text"
      }
    }
  }]
}

For that step I didn't specify a JSON Records Path, and it's working fine. For the next one, the source is still the incoming JSON request in the trigger, but the sample schema is:

{
  "ip": "text",
  "user_agent": "text",
  "timestamp": 0,
  "link_index": 0,
  "id": 0
}

and I put /deltas/object_data/metadata/recents as the JSON Records Path. The most recent email open is the last object in that recents array, so I was hoping to use this as a conditional in the For Each loop to capture just the last object:

{% for item in d.recents %}
{% if loop.last %}
TRUE
{% endif %}
{% endfor %}

I haven't gotten far enough to see if it will work. Thoughts on the "Invalid JSON records location: '/deltas/object_data/metadata/recents'" error?

------------------------------ Daniel Johnson ------------------------------

Re: How do you loop through an array on a query JSON object item in a Pipeline

Hey Roger, I'm trying the following but getting an error:

{
  "to": "12345abcd",
  "data": [
    {% for item in a.body_json.deltas %}
    { "14":{"value":{{item.date}} }
    {% endfor %}
  ]
}

In Pipelines the error just says, "Validation error: Body is not a valid JSON object", and when I try it on JSONLint.com it says, "Expecting 'STRING', '}', got 'undefined'" after the curly brackets before the for loop starts. Any ideas?
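My current guess, which I haven't been able to test yet: the record object that opens inside the loop is never closed before {% endfor %}, and every record after the first needs a comma separator. A sketch of what I think the body should look like once those are fixed (the "to" value and field 14 are from the template above; the loop.last comma guard is the new part, and it's untested):

{
  "to": "12345abcd",
  "data": [
    {% for item in a.body_json.deltas %}
    {
      "14": { "value": {{item.date}} }
    }{% if not loop.last %},{% endif %}
    {% endfor %}
  ]
}

JSONLint will still reject the raw template, since the {% %} tags aren't valid JSON until Pipelines renders them, so the real test is whether Pipelines accepts the rendered output.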
------------------------------ Daniel Johnson ------------------------------

Re: URL Button to send API call, open email, then reload

I also wrapped the last redirect line in URLEncode() and I'm getting the same result: a blank page with the URL in the top bar.
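For context, this is the general shape of the button formula I'm working from. It's a sketch rather than my exact field: field 6 and the a=dr destination are placeholders, and it assumes the usual pattern of an API_EditRecord call whose rdr parameter carries the URL-encoded page to load afterward:

URLRoot() & "db/" & Dbid() & "?a=API_EditRecord&rid=" & [Record ID#]
  & "&_fid_6=1"
  & "&rdr=" & URLEncode(URLRoot() & "db/" & Dbid() & "?a=dr&rid=" & [Record ID#])

------------------------------ Daniel Johnson ------------------------------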