Transforming data with the new Quickbase API and Pipelines

Recently I was helping a customer with a use case that called for data transformation. The requirement was to take each customer's accounts receivable data daily, aggregate it by country, and then snapshot it. If you are someone who can code and knows how to use RESTful web services, you're probably thinking "no problem." Otherwise, if you are someone who knows Quickbase (a real creative problem solver), you're probably thinking "I know I could figure out a way."

The challenge for the developer type is where to host the script; you'll need to consider chunking the data because of the volume, and since you're chunking, you'll need to consider throttling, among other things. For the Quickbase creative problem solver, the volume of data would have you pulling your hair out trying to get something working, and the result would likely involve a number of daily manual steps. As for me, I haven't been a true developer for over 10 years, so everything that goes into the script - error handling, chunking, throttling, and so on - not to mention standing up an environment to host it, feels like a big effort.

This is where Quickbase Pipelines and the new Quickbase API come into play. Using these two new Quickbase features, I was able to quickly develop a strategy and work with the customer to implement it in a couple of days. For the trending analysis, the data needed to be aggregated by Country, which took the number of records down to ~60 per day. The daily data, once aggregated, could be wiped out the next day.

So how did we snapshot and aggregate the daily data?

1. Create a summary report grouped by Country.
2. Use the new API within Pipelines to execute the report and return the summarized records as a single record set.
3. Write them to a snapshot table.

With the new API, you can execute reports - more importantly, summary reports - and get the entire data set back in JSON form. This is one of the things I really like about the new API; I've used this strategy numerous times since it was released. Then, with Pipelines, I can easily iterate over the summarized record set with the Pipelines JSON handler and create the snapshot records. With the XML-based API, you were not able to return summarized data with API_DoQuery, and with API_GenResultsTable the summarized columns were not returned as part of the record set.

Additionally, with the new API, calls are very much in line with modern APIs:

POST https://api.quickbase.com/v1/reports/{reportId}/run?tableId={tableId}

With Pipelines, I don't need to worry about setting up infrastructure; I just create a new Pipeline. I don't need to worry about coding syntax either; the Pipeline Channels and Actions hide that aspect of coding, so all I need to worry about is the logic - even throttling and retries are built in.

One additional plug here for the new Quickbase API: the developer portal. If you frequently work with the XML Quickbase API, I highly recommend you go check it out. It provides a playground to quickly test out each of the new API calls. In fact, before I built the Pipeline discussed in this post, I used the developer portal to test out and verify my expected results.
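To make that report-run call concrete, here is a minimal sketch in Python of executing the summary report directly (outside of Pipelines). The realm hostname, user token, table ID, report ID, and field IDs below are placeholders for illustration, not values from the customer's app.

    import requests

    # All IDs and credentials below are placeholders for illustration.
    QB_REALM = "yourrealm.quickbase.com"           # your realm hostname
    QB_USER_TOKEN = "b1234_xxxx_0123456789abcdef"  # a Quickbase user token
    TABLE_ID = "bqxxxxxxx"                         # table holding the daily A/R records
    REPORT_ID = "7"                                # the summary report grouped by Country

    headers = {
        "QB-Realm-Hostname": QB_REALM,
        "Authorization": f"QB-USER-TOKEN {QB_USER_TOKEN}",
    }

    # Run the summary report; the summarized rows come back as JSON.
    resp = requests.post(
        f"https://api.quickbase.com/v1/reports/{REPORT_ID}/run",
        params={"tableId": TABLE_ID},
        headers=headers,
    )
    resp.raise_for_status()
    report = resp.json()

    # Each entry in report["data"] is one summarized row (one per Country),
    # keyed by field ID, e.g. {"6": {"value": "Canada"}, "7": {"value": 52340.25}}
    print(len(report["data"]), "summarized rows returned")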
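And here is a sketch of the snapshot step itself - conceptually what the Pipelines JSON handler iteration accomplishes - written as a small function that writes one snapshot record per summarized row using the new API's records endpoint. Again, the snapshot table ID and field IDs are assumptions, and in practice Pipelines does this for you.

    import requests

    # Continuing the sketch above: "report" is the parsed report JSON and "headers"
    # carries the same realm hostname and user token. All IDs are placeholders.
    def write_snapshot(report: dict, headers: dict) -> None:
        SNAPSHOT_TABLE_ID = "bqyyyyyyy"  # the snapshot table
        COUNTRY_FID = "6"                # Country column in the summary report
        AR_TOTAL_FID = "7"               # summarized A/R total in the summary report
        SNAP_COUNTRY_FID = "8"           # Country field in the snapshot table
        SNAP_TOTAL_FID = "9"             # A/R total field in the snapshot table

        # Build one snapshot record per summarized row (one row per Country).
        records = [
            {
                SNAP_COUNTRY_FID: {"value": row[COUNTRY_FID]["value"]},
                SNAP_TOTAL_FID: {"value": row[AR_TOTAL_FID]["value"]},
            }
            for row in report["data"]
        ]

        # Insert the snapshot records with the new API's records endpoint.
        resp = requests.post(
            "https://api.quickbase.com/v1/records",
            headers=headers,
            json={"to": SNAPSHOT_TABLE_ID, "data": records},
        )
        resp.raise_for_status()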
Pipelines and Object Linking

As a Quickbase Solution Architect, I was more than thrilled when I learned Quickbase was acquiring an integration service platform.

Historically, when I've encountered customers with a need to integrate or extend Quickbase, I would recommend writing code. Unfortunately, that option was not always easily achieved, for a number of reasons. In some cases there was a skill gap, or if the skill set was there, time was not. In other cases, I would recommend leveraging a Quickbase Solution Partner - paid professionals who are highly skilled Quickbase developers - but sometimes it was hard to find the budget.

More recently, Quickbase Automations have made building workflows between Quickbase apps much more achievable, and Quickbase Webhooks have done the same for simple integrations. Now, with Quickbase Pipelines, we have the ability to create complex workflows not only between Quickbase apps, but with any cloud service that supports RESTful web services. In some cases, Pipelines makes it even easier by providing Channels to other cloud services such as Salesforce, Slack, Jira, and Workday, to name a few. Channels are simply UI (user interface) overlays to some of the more technical aspects of working with a cloud service's API (application programming interface). This means citizen integrators don't need to know how to code or understand the anatomy of a web request; all they need to know is the logic required to achieve a desired workflow.

For example, consider an Asset Tracking use case where Issues are tracked against Assets. When an issue is reported, you want to be able to upload pictures to a folder in Box for that specific issue. In this case, I can easily build a Pipeline to handle the creation of the folder in Box whenever an issue is created in Quickbase. But that is not all: one of the coolest features of Pipelines, in my humble opinion, is object linking. What that means is Pipelines will handle the mapping of the two objects (i.e. the issue record created in Quickbase and the associated folder created in Box). Now, whenever a file is uploaded to Box under an Issue's folder, Pipelines can add a URL to the associated Issue in Quickbase.

PIPELINE 1: CREATE A FOLDER IN BOX AND ESTABLISH LINKING TO THE ISSUE

PIPELINE 2: ADD A URL LINK TO THE ISSUE WHEN A FILE IS UPLOADED TO THE ISSUE'S FOLDER IN BOX

This is extremely powerful and might even seem somewhat magical. The power of it is that once you have linked the two objects in a Pipeline, the linking persists for use in subsequent Pipelines, and you don't need to worry about file IDs, folder IDs, record IDs, and so on; Pipelines handles it all. (For the curious, a rough sketch of what that bookkeeping amounts to appears at the end of this post.) To illustrate this further, I've added a video of the above use case; if you still need further assistance, you can always reach out to your Quickbase Account Executive or Customer Success Manager.

Find more Pipelines How-Tos below:

How to use Pipelines to email a report at a specific time of day
Importing Pipeline YAML
How to Use the Bucket Channel to Schedule an Archive to Box
Build better trend reports with Pipelines
Generate template records with Pipelines
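For readers who want to peek behind the curtain of object linking, here is a rough, purely conceptual sketch in Python of the bookkeeping Pipelines handles for you. Nothing below is real Pipelines, Box, or Quickbase code; the helper functions and IDs are hypothetical stand-ins meant only to show the record-to-folder mapping that the two Pipelines above rely on.

    # Purely conceptual: Pipelines persists and looks up this mapping for you.
    # create_box_folder and add_link_to_issue are hypothetical stand-ins,
    # not real Pipelines, Box, or Quickbase SDK calls.
    from itertools import count

    _fake_folder_ids = count(1000)

    def create_box_folder(name: str) -> str:
        # Stand-in for the Box "create folder" step of Pipeline 1.
        folder_id = str(next(_fake_folder_ids))
        print(f"[Box] created folder {folder_id} named {name!r}")
        return folder_id

    def add_link_to_issue(record_id: str, file_url: str) -> None:
        # Stand-in for the Quickbase "update record" step of Pipeline 2.
        print(f"[Quickbase] added {file_url} to Issue record {record_id}")

    # The "object link": Quickbase Issue record ID -> Box folder ID.
    issue_to_folder: dict[str, str] = {}

    def on_issue_created(record_id: str, title: str) -> None:
        # Pipeline 1: create the folder and remember which Issue it belongs to.
        issue_to_folder[record_id] = create_box_folder(f"Issue {record_id} - {title}")

    def on_file_uploaded(folder_id: str, file_url: str) -> None:
        # Pipeline 2: follow the link back to the Issue and attach the file URL.
        for record_id, linked_folder in issue_to_folder.items():
            if linked_folder == folder_id:
                add_link_to_issue(record_id, file_url)
                return

    # Example run-through of the Asset Tracking use case (placeholder values).
    on_issue_created("42", "Cracked housing")
    on_file_uploaded("1000", "https://app.box.com/file/123456789")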